@farazirfan/costar-server-executor 1.7.28 → 1.7.29

package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@farazirfan/costar-server-executor",
- "version": "1.7.28",
+ "version": "1.7.29",
  "description": "CoStar Server Executor - 24/7 autonomous agent in TypeScript (cloned from OpenClaw)",
  "type": "module",
  "main": "dist/index.js",
@@ -19,12 +19,12 @@ Before ANY trading decision, read [LEARNING.md](LEARNING.md) first. It contains
  - Confluence over conviction — require 2-3 confirming signals, never trade on one
  - Process over outcome — good process matters more than any single result
  - Build, don't just log — every trade should leave behind infrastructure, not just notes
- - The internet is your edge — GitHub, expert blogs, papers, news, communities. Not just the 250 data APIs
+ - The internet is your edge — GitHub, expert blogs, papers, news, communities. Not just the data APIs
  - Every request gets expert treatment — "buy BTC" gets the same depth of thinking as "grow my portfolio"

  ## Your Data Arsenal

- You have extensive financial APIs via `discover_data_api` and `fetch_api_data` covering real-time quotes, historical prices, technical indicators, fundamentals, news, institutional data, economic data, and screening.
+ You have extensive financial APIs via `discover_data_api` and `fetch_api_data` covering real-time quotes, historical prices, technical indicators, fundamentals, news, institutional data, economic data, and screening. Outside of agent tools, use `python3 scripts/data-api.py` to call the same APIs from CLI, cron jobs, or other scripts (see below).

  But the APIs are just one corner. You also have `web_search`, `web_fetch`, and `browser` to access everything:
  - GitHub — open-source trading tools, indicators, backtesting frameworks, sentiment repos
@@ -123,7 +123,7 @@ Route execution to the appropriate platform:

  ### 10. Monitor

- Trail stop to breakeven after 1R profit. Never move stop away from entry. Watch for invalidation. If regime changes mid-trade, reassess.
+ Create a live dashboard for this trade (see Trade Dashboards section below). Trail stop to breakeven after 1R profit. Never move stop away from entry. Watch for invalidation. If regime changes mid-trade, reassess. Update the dashboard with every significant change.

  ### 11. Post-Trade Evolution

@@ -192,6 +192,86 @@ See [references/risk-management.md](references/risk-management.md) for the compl

  Track your performance metrics and improve them over time. Document your risk approach in LEARNING.md. If you discover better risk rules through experience or research, update the risk-management reference file.

+ ## Trade Dashboards
+
+ Every trade gets its own live dashboard. You build each one from scratch — the layout, the data, the visualizations, the update logic. No two dashboards should look the same because no two trades are the same.
+
+ ### You Own Everything
+
+ `scripts/trade-dashboard-template.html` gives you a design system starter (Tailwind, dark theme, fonts, glass-card style) and an empty `<div id="app">`. That's it. The rest is yours:
+
+ - **Build the entire HTML, CSS, and JS for each trade.** Add sections, remove sections, restructure everything. The template imposes zero constraints on layout, data format, or functionality.
+ - **Add any CDN library** — Chart.js, D3, ApexCharts, Plotly, Three.js, Lightweight Charts, or anything else. Swap libraries between dashboards if one works better for a specific trade type.
+ - **Fetch live data** — Use `fetch()` to hit external APIs, WebSocket for real-time streaming, or embed data directly. Pull from your data APIs, news sources, on-chain analytics, whatever the trade needs.
+ - **Invent new visualizations** — Correlation heatmaps, multi-timeframe charts, order flow waterfalls, sentiment gauges, regime indicators, funding rate trackers, economic calendar overlays. If it helps monitor the trade, build it.
+ - **Build custom interactivity** — Toggles, tabs, zoom controls, timeframe selectors, what-if scenarios. Make the dashboard a tool, not just a display.
+ - **Update the template itself** — When you discover patterns that should apply to all future dashboards, improve the template. You own it.
+
+ ### Integrating Live Data
+
+ Your dashboards should show real-time data. You have multiple ways to feed data into them:
+
+ **Using your data APIs via `scripts/data-api.py`**
+
+ `scripts/data-api.py` is a standalone Python client for the same backend that powers `discover_data_api` and `fetch_api_data`. Use it anywhere — CLI, cron jobs, dashboard data pipelines, other scripts:
+
+ ```bash
+ # Discover APIs for a query
+ python3 scripts/data-api.py search "Bitcoin price and volume"
+
+ # Fetch data from a discovered API
+ python3 scripts/data-api.py fetch FMP_QUOTE '{"symbol": "AAPL"}'
+ python3 scripts/data-api.py fetch FMP_CRYPTO_QUOTE '{"symbol": "BTCUSD"}'
+
+ # Discover + fetch in one shot
+ python3 scripts/data-api.py get "current gold price"
+
+ # Pipe raw data to a JSON file for dashboards
+ python3 scripts/data-api.py fetch FMP_QUOTE '{"symbol": "AAPL"}' --raw > dashboards/data/aapl.json
+ ```
+
+ It's also importable in other Python scripts:
+ ```python
+ from data_api import search_apis, fetch_api
+ apis = search_apis("Bitcoin technical indicators")
+ data = fetch_api("FMP_CRYPTO_QUOTE", {"symbol": "BTCUSD"})
+ ```
+
+ Use this to build data pipelines: cron runs `data-api.py` → writes JSON → dashboard reads it on reload. Or use `discover_data_api` in the agent to find vendor codes, then hardcode them into scripts that run independently.
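
The write step of that cron pipeline can be sketched in a few lines. This is an illustrative sketch only: `write_snapshot`, the `dashboards/data/` layout, and the commented `fetch_api` call are not part of the shipped scripts.

```python
#!/usr/bin/env python3
"""Illustrative write step for a cron -> JSON -> dashboard pipeline."""
import json
import time
from pathlib import Path


def write_snapshot(data, path):
    """Wrap fetched data with a timestamp and write it atomically, so a
    dashboard polling the file never reads a half-written JSON document."""
    path = Path(path)
    path.parent.mkdir(parents=True, exist_ok=True)
    snapshot = {"fetched_at": time.time(), "data": data}
    tmp = path.with_suffix(".tmp")
    tmp.write_text(json.dumps(snapshot, indent=2), encoding="utf-8")
    tmp.replace(path)  # atomic rename: readers see old or new file, never partial
    return snapshot

# In the cron script itself you would pair this with the client above, e.g.:
#   from data_api import fetch_api
#   write_snapshot(fetch_api("FMP_QUOTE", {"symbol": "AAPL"}),
#                  "dashboards/data/aapl.json")
```

The dashboard then polls the written file with `fetch()`, and `fetched_at` lets it display data staleness.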
+
+ **Using any internet source**
+
+ You are not limited to your data APIs. Find and integrate anything:
+ - Free public APIs (CoinGecko, Yahoo Finance, Alpha Vantage, Binance, etc.) — call them directly via `fetch()` in the dashboard JS if they support CORS, or build a proxy script.
+ - WebSocket feeds for real-time streaming (Binance WS, Coinbase WS, etc.) — connect directly from dashboard JS.
+ - RSS feeds, news APIs, social sentiment APIs — fetch and render in the dashboard.
+ - On-chain data providers (Glassnode, Dune, DefiLlama) — for crypto trades.
+ - Build Python scripts that scrape, aggregate, or transform data from any source and output JSON for the dashboard to consume.
+
+ **Data update patterns**
+
+ Pick whatever pattern fits the trade:
+ - **Embedded data** — You write the data directly into the HTML. Simplest. Update the file whenever data changes.
+ - **Polling** — Dashboard JS uses `fetch()` on an interval to pull from an API or a local JSON file that a script keeps updated.
+ - **WebSocket** — Dashboard JS connects to a streaming feed for real-time ticks. Best for active crypto/forex trades.
+ - **Script pipeline** — A cron job runs a Python script that fetches data → writes a JSON file → dashboard reads the JSON file on reload.
+
+ No restrictions. If a data source exists and would help monitor the trade, integrate it.
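
One concrete instance of the polling and script-pipeline patterns against a free public API. A hedged sketch: CoinGecko's `simple/price` route is a real keyless endpoint, but the helper names and defaults here are illustrative, not part of the shipped scripts.

```python
#!/usr/bin/env python3
"""Illustrative poll of a free public price API from a Python pipeline script."""
import json
import urllib.request


def fetch_spot(ids="bitcoin", vs="usd"):
    """Hit CoinGecko's keyless simple-price endpoint and return the parsed JSON."""
    url = f"https://api.coingecko.com/api/v3/simple/price?ids={ids}&vs_currencies={vs}"
    with urllib.request.urlopen(url, timeout=15) as resp:
        return json.loads(resp.read().decode("utf-8"))


def extract_price(payload, coin="bitcoin", vs="usd"):
    """Pull one number out of the {"bitcoin": {"usd": 12345.0}} response shape."""
    return payload.get(coin, {}).get(vs)

# A cron entry (or a loop with time.sleep) calls fetch_spot(), writes the result
# to a JSON file under dashboards/, and the dashboard polls that file with fetch().
```

Swap in whatever source the trade needs; the same fetch-parse-write shape works for any JSON API.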
+
+ ### Workflow
+
+ 1. When opening a trade, create `dashboards/<asset>-<direction>-<date>.html` in your workspace. Start from the template or from scratch — your choice.
+ 2. Build the dashboard for this specific trade. Think about what information matters most for monitoring THIS trade.
+ 3. Serve dashboards with `python3 scripts/dashboard-server.py --dir dashboards/ --port 8500`. Use `cron` to keep it running.
+ 4. Update the dashboard as the trade evolves — new data, moved stops, new events, changed regime, anything. Rewrite entire sections if the trade context changes.
+ 5. When the trade closes, update the dashboard with final results. It stays as a historical record.
+
+ ### Scripts
+
+ - `scripts/data-api.py` — Standalone client for hundreds of data APIs. Search, fetch, or auto-fetch from CLI. Importable in other Python scripts. Powers dashboard data pipelines.
+ - `scripts/trade-dashboard-template.html` — Minimal design system skeleton. Tailwind + dark theme + fonts. Empty body. You build everything else.
+ - `scripts/dashboard-server.py` — HTTP server for the dashboards directory with auto-generated index page.
+
  ## Key Principles

  - No FOMO — there is always another trade
package/scripts/dashboard-server.py ADDED
@@ -0,0 +1,191 @@
+ #!/usr/bin/env python3
+ """
+ Trade Dashboard Server — Serves per-trade HTML dashboards.
+
+ Usage:
+     # Serve all dashboards in a directory
+     python3 dashboard-server.py --dir ./dashboards --port 8500
+
+     # Serve a single dashboard file
+     python3 dashboard-server.py --file ./dashboards/btc-long-2024-01-15.html --port 8500
+
+ The server serves static HTML files. Each dashboard is a self-contained
+ HTML file that the agent creates from the template and updates over time.
+
+ Dashboard directory structure:
+     dashboards/
+         btc-long-2024-01-15.html
+         eurusd-short-2024-01-16.html
+         index.html (auto-generated listing page)
+ """
+
+ import argparse
+ import http.server
+ import json
+ import os
+ import sys
+ from pathlib import Path
+ from datetime import datetime
+
+
+ def generate_index(dashboard_dir):
+     """Generate an index.html listing all trade dashboards."""
+     dashboards = []
+     for f in sorted(Path(dashboard_dir).glob("*.html")):
+         if f.name == "index.html":
+             continue
+         stat = f.stat()
+         # Try to extract trade data from the file
+         content = f.read_text(encoding="utf-8")
+         trade_info = {"file": f.name, "modified": datetime.fromtimestamp(stat.st_mtime).isoformat()}
+
+         # Extract trade data JSON if present
+         start = content.find('id="trade-data"')
+         if start != -1:
+             json_start = content.find("{", start)
+             json_end = content.find("</script>", json_start)
+             if json_start != -1 and json_end != -1:
+                 try:
+                     data = json.loads(content[json_start:json_end])
+                     trade_info.update({
+                         "asset": data.get("asset", ""),
+                         "direction": data.get("direction", ""),
+                         "status": data.get("status", ""),
+                         "pnl": data.get("pnl_dollars", 0),
+                         "pnl_pct": data.get("pnl_percent", 0),
+                     })
+                 except json.JSONDecodeError:
+                     pass
+
+         dashboards.append(trade_info)
+
+     html = f"""<!DOCTYPE html>
+ <html lang="en" class="dark">
+ <head>
+ <meta charset="UTF-8">
+ <meta name="viewport" content="width=device-width, initial-scale=1.0">
+ <title>Trade Dashboards</title>
+ <script src="https://cdn.tailwindcss.com"></script>
+ <script>
+ tailwind.config = {{ darkMode: 'class' }}
+ </script>
+ <style>body {{ background: #0f1117; }}</style>
+ </head>
+ <body class="min-h-screen text-gray-100 p-8">
+ <div class="max-w-4xl mx-auto">
+ <h1 class="text-3xl font-bold mb-2">Trade Dashboards</h1>
+ <p class="text-gray-400 mb-8">Active and historical trade dashboards</p>
+ <div class="space-y-3">
+ """
+     for d in dashboards:
+         status_colors = {
+             "open": "bg-green-500",
+             "pending": "bg-amber-500",
+             "closed": "bg-gray-500",
+             "stopped": "bg-red-500",
+         }
+         dot_color = status_colors.get(d.get("status", ""), "bg-gray-500")
+         pnl = d.get("pnl", 0)
+         pnl_color = "text-green-400" if pnl >= 0 else "text-red-400"
+         pnl_str = f"+${pnl:,.2f}" if pnl >= 0 else f"-${abs(pnl):,.2f}"
+
+         direction = d.get("direction", "").upper()
+         dir_class = "bg-green-500/20 text-green-400" if direction == "LONG" else "bg-red-500/20 text-red-400" if direction == "SHORT" else ""
+
+         html += f"""
+ <a href="{d['file']}" class="block bg-gray-900/60 border border-gray-800 rounded-xl p-4 hover:border-gray-600 transition-colors">
+ <div class="flex items-center justify-between">
+ <div class="flex items-center gap-3">
+ <div class="w-2.5 h-2.5 rounded-full {dot_color}"></div>
+ <span class="font-semibold">{d.get('asset', d['file'])}</span>
+ {f'<span class="text-xs px-2 py-0.5 rounded {dir_class}">{direction}</span>' if direction else ''}
+ </div>
+ <div class="text-right">
+ <div class="{pnl_color} font-mono font-semibold">{pnl_str}</div>
+ <div class="text-xs text-gray-500">{d.get('status', 'unknown')}</div>
+ </div>
+ </div>
+ </a>
+ """
+
+     if not dashboards:
+         html += """
+ <div class="text-center text-gray-500 py-12">
+ <div class="text-4xl mb-3">📊</div>
+ <div>No trade dashboards yet. Dashboards appear here as trades are opened.</div>
+ </div>
+ """
+
+     html += f"""
+ </div>
+ <div class="text-center text-xs text-gray-600 mt-8">
+ {len(dashboards)} dashboard{'s' if len(dashboards) != 1 else ''} &mdash; Auto-refreshes every 30s
+ </div>
+ </div>
+ <script>setTimeout(() => location.reload(), 30000);</script>
+ </body>
+ </html>"""
+
+     index_path = Path(dashboard_dir) / "index.html"
+     index_path.write_text(html, encoding="utf-8")
+     return index_path
+
+
+ class DashboardHandler(http.server.SimpleHTTPRequestHandler):
+     """Custom handler that auto-generates index and suppresses logs."""
+
+     def __init__(self, *args, directory=None, **kwargs):
+         super().__init__(*args, directory=directory, **kwargs)
+
+     def do_GET(self):
+         # Regenerate index on each request to root
+         if self.path in ("/", "/index.html"):
+             generate_index(self.directory)
+         super().do_GET()
+
+     def log_message(self, format, *args):
+         # Quieter logging
+         sys.stderr.write(f"[dashboard] {args[0]}\n")
+
+
+ def main():
+     parser = argparse.ArgumentParser(description="Trade Dashboard Server")
+     parser.add_argument("--dir", default="./dashboards", help="Directory containing dashboard HTML files")
+     parser.add_argument("--file", help="Serve a single dashboard file")
+     parser.add_argument("--port", type=int, default=8500, help="Port to serve on (default: 8500)")
+
+     args = parser.parse_args()
+
+     if args.file:
+         # Serve single file's parent directory
+         file_path = Path(args.file).resolve()
+         serve_dir = str(file_path.parent)
+     else:
+         serve_dir = str(Path(args.dir).resolve())
+
+     # Ensure directory exists
+     Path(serve_dir).mkdir(parents=True, exist_ok=True)
+
+     # Generate initial index
+     generate_index(serve_dir)
+
+     handler = lambda *a, **kw: DashboardHandler(*a, directory=serve_dir, **kw)
+
+     with http.server.HTTPServer(("0.0.0.0", args.port), handler) as httpd:
+         print(json.dumps({
+             "status": "running",
+             "port": args.port,
+             "directory": serve_dir,
+             "url": f"http://localhost:{args.port}",
+             "message": f"Dashboard server running on port {args.port}",
+         }, indent=2))
+         sys.stdout.flush()
+
+         try:
+             httpd.serve_forever()
+         except KeyboardInterrupt:
+             print("\nDashboard server stopped.")
+
+
+ if __name__ == "__main__":
+     main()
package/scripts/data-api.py ADDED
@@ -0,0 +1,211 @@
+ #!/usr/bin/env python3
+ """
+ Data API Client — Search and fetch data from hundreds of APIs.
+
+ Standalone CLI for the same backend that powers discover_data_api and
+ fetch_api_data agent tools. Use this in scripts, cron jobs, dashboard
+ data pipelines, or anywhere you need financial/market data outside
+ the agent runtime.
+
+ Usage:
+     # Discover APIs for a query
+     python3 data-api.py search "Bitcoin price and volume"
+     python3 data-api.py search "Apple stock technical indicators" --limit 10
+
+     # Fetch data from a discovered API
+     python3 data-api.py fetch FMP_QUOTE '{"symbol": "AAPL"}'
+     python3 data-api.py fetch FMP_CRYPTO_QUOTE '{"symbol": "BTCUSD"}'
+     python3 data-api.py fetch FMP_TECHNICAL_INDICATOR '{"symbol": "AAPL", "type": "rsi", "period": 14}'
+
+     # Discover + fetch in one shot (finds best API, calls it)
+     python3 data-api.py get "current Bitcoin price"
+     python3 data-api.py get "EUR/USD exchange rate"
+     python3 data-api.py get "S&P 500 fear and greed index"
+
+     # Output just the data (for piping to files or other scripts)
+     python3 data-api.py fetch FMP_QUOTE '{"symbol": "AAPL"}' --raw
+
+ Importable:
+     from data_api import search_apis, fetch_api, auto_fetch
+     results = search_apis("Bitcoin price")
+     data = fetch_api("FMP_CRYPTO_QUOTE", {"symbol": "BTCUSD"})
+     data = auto_fetch("current gold price")
+
+ Environment:
+     DATA_API_BASE_URL — Override backend URL (default: CoStar production backend)
+ """
+
+ import argparse
+ import json
+ import os
+ import sys
+ import urllib.request
+ import urllib.error
+
+ DEFAULT_BASE_URL = "https://costar-backend-hetzner-production.up.railway.app"
+ SEARCH_TIMEOUT = 30  # seconds
+ FETCH_TIMEOUT = 60  # seconds
+
+
+ def _get_base_url():
+     return os.environ.get("DATA_API_BASE_URL", DEFAULT_BASE_URL)
+
+
+ def _post(url, payload, timeout):
+     """POST JSON and return parsed response."""
+     data = json.dumps(payload).encode("utf-8")
+     req = urllib.request.Request(
+         url,
+         data=data,
+         headers={"Content-Type": "application/json"},
+         method="POST",
+     )
+     try:
+         with urllib.request.urlopen(req, timeout=timeout) as resp:
+             return json.loads(resp.read().decode("utf-8"))
+     except urllib.error.HTTPError as e:
+         body = e.read().decode("utf-8", errors="replace")
+         return {"success": False, "error": f"HTTP {e.code}", "details": body}
+     except urllib.error.URLError as e:
+         return {"success": False, "error": f"Connection error: {e.reason}"}
+     except Exception as e:
+         return {"success": False, "error": str(e)}
+
+
+ # ─── Core Functions (importable) ─────────────────────────────────
+
+ def search_apis(query, limit=5):
+     """
+     Search for APIs matching a natural language query.
+
+     Returns dict with 'apis' list. Each API has:
+         vendor_code, use_for, params, returns
+     """
+     base = _get_base_url()
+     result = _post(f"{base}/api/search", {"query": query, "limit": limit}, SEARCH_TIMEOUT)
+     return result
+
+
+ def fetch_api(vendor_code, params=None):
+     """
+     Fetch data from an API by vendor_code.
+
+     Use search_apis() first to discover vendor_codes and required params.
+     """
+     base = _get_base_url()
+     result = _post(
+         f"{base}/api/proxy",
+         {"vendor_code": vendor_code, "params": params or {}},
+         FETCH_TIMEOUT,
+     )
+     # Unwrap: backend may nest data under 'data' key
+     if isinstance(result, dict) and "data" in result and result.get("success") is not False:
+         return result["data"]
+     return result
+
+
+ def auto_fetch(query):
+     """
+     Discover + fetch in one call. Finds the best API for the query,
+     extracts likely params from the query context, and fetches data.
+
+     Returns the raw data from the best matching API.
+     """
+     # Step 1: Discover
+     discovery = search_apis(query, limit=3)
+     apis = discovery.get("apis", [])
+
+     if not apis:
+         return {"success": False, "error": "No APIs found for query", "query": query}
+
+     # Step 2: Pick the first (best) match
+     best = apis[0]
+     vendor_code = best["vendor_code"]
+     required_params = {}
+
+     # Try to extract params from the discovered API's param spec
+     param_spec = best.get("params", {})
+     if isinstance(param_spec, dict):
+         for key, info in param_spec.items():
+             if isinstance(info, dict) and info.get("required"):
+                 # For required params, we include them with None
+                 # The agent should fill these in — this is a best-effort helper
+                 required_params[key] = info.get("default")
+
+     # Step 3: Fetch
+     data = fetch_api(vendor_code, required_params)
+
+     return {
+         "vendor_code": vendor_code,
+         "use_for": best.get("use_for", ""),
+         "params_used": required_params,
+         "data": data,
+     }
+
+
+ # ─── CLI Commands ────────────────────────────────────────────────
+
+ def cmd_search(args):
+     """Search for available APIs."""
+     result = search_apis(args.query, args.limit)
+     print(json.dumps(result, indent=2, default=str))
+
+
+ def cmd_fetch(args):
+     """Fetch data from a specific API."""
+     params = json.loads(args.params) if args.params else {}
+     result = fetch_api(args.vendor_code, params)
+
+     if args.raw:
+         # Output just the data for piping
+         if isinstance(result, dict) and "data" in result:
+             print(json.dumps(result["data"], indent=2, default=str))
+         else:
+             print(json.dumps(result, indent=2, default=str))
+     else:
+         print(json.dumps(result, indent=2, default=str))
+
+
+ def cmd_get(args):
+     """Discover + fetch in one shot."""
+     result = auto_fetch(args.query)
+     print(json.dumps(result, indent=2, default=str))
+
+
+ def main():
+     parser = argparse.ArgumentParser(
+         description="Data API Client — Search and fetch from hundreds of APIs"
+     )
+     subparsers = parser.add_subparsers(dest="command", help="Command to run")
+
+     # search
+     search_p = subparsers.add_parser("search", help="Discover APIs for a query")
+     search_p.add_argument("query", help="Natural language description of data needed")
+     search_p.add_argument("--limit", type=int, default=5, help="Max APIs to return (1-10)")
+
+     # fetch
+     fetch_p = subparsers.add_parser("fetch", help="Fetch data from a specific API")
+     fetch_p.add_argument("vendor_code", help="API vendor code (e.g., FMP_QUOTE)")
+     fetch_p.add_argument("params", nargs="?", default="{}", help='JSON params (e.g., \'{"symbol": "AAPL"}\')')
+     fetch_p.add_argument("--raw", action="store_true", help="Output raw data only (for piping)")
+
+     # get (discover + fetch combined)
+     get_p = subparsers.add_parser("get", help="Discover and fetch in one shot")
+     get_p.add_argument("query", help="Natural language query (e.g., 'current Bitcoin price')")
+
+     args = parser.parse_args()
+
+     commands = {
+         "search": cmd_search,
+         "fetch": cmd_fetch,
+         "get": cmd_get,
+     }
+
+     if args.command in commands:
+         commands[args.command](args)
+     else:
+         parser.print_help()
+
+
+ if __name__ == "__main__":
+     main()
package/scripts/trade-dashboard-template.html ADDED
@@ -0,0 +1,81 @@
+ <!DOCTYPE html>
+ <html lang="en" class="dark">
+ <head>
+   <meta charset="UTF-8">
+   <meta name="viewport" content="width=device-width, initial-scale=1.0">
+   <title>Trade Dashboard</title>
+
+   <!-- ═══════════════════════════════════════════════════════════════
+        DESIGN SYSTEM — CDN libraries and base styles.
+        Agent: add/remove/swap any library. These are just suggestions.
+        Examples: Chart.js, D3.js, ApexCharts, Plotly, Three.js, etc.
+        ═══════════════════════════════════════════════════════════════ -->
+   <script src="https://cdn.tailwindcss.com"></script>
+   <script src="https://cdn.jsdelivr.net/npm/lightweight-charts@4.1.0/dist/lightweight-charts.standalone.production.js"></script>
+   <script>
+     tailwind.config = {
+       darkMode: 'class',
+       theme: {
+         extend: {
+           colors: {
+             surface: { 1: '#0f1117', 2: '#161822', 3: '#1e2030' },
+             accent: { green: '#22c55e', red: '#ef4444', blue: '#3b82f6', amber: '#f59e0b' },
+           }
+         }
+       }
+     }
+   </script>
+   <style>
+     @import url('https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&family=JetBrains+Mono:wght@400;500&display=swap');
+     body { background: #0f1117; font-family: 'Inter', system-ui, sans-serif; }
+     .mono { font-family: 'JetBrains Mono', monospace; }
+     .glass { background: rgba(30, 32, 48, 0.6); backdrop-filter: blur(12px); border: 1px solid rgba(255,255,255,0.06); }
+   </style>
+ </head>
+ <body class="min-h-screen text-gray-100">
+
+   <!-- ═══════════════════════════════════════════════════════════════
+        THIS IS A MINIMAL SKELETON. NOT A FIXED STRUCTURE.
+
+        Agent: you own this file completely. For every trade:
+        - Build the HTML layout from scratch or reuse parts of this
+        - Add any sections, charts, data sources, widgets you want
+        - Add any CDN libraries you need in <head>
+        - Add inline <style> or Tailwind classes for custom styling
+        - Add <script> blocks for data fetching, rendering, interactivity
+        - Structure the data however makes sense for this trade
+        - Every dashboard should be unique to its trade
+
+        There are NO required sections, NO required data format,
+        NO required layout. Build what the trade needs.
+
+        DATA SOURCES — use any combination:
+        - Use discover_data_api to find vendor_codes, then build
+          Python scripts that call the backend proxy and output JSON
+          for the dashboard to consume (via fetch or file read).
+        - Call any public API directly via fetch() — CoinGecko,
+          Binance, Yahoo Finance, Alpha Vantage, DefiLlama, etc.
+        - Connect WebSocket feeds for real-time streaming —
+          Binance WS, Coinbase WS, forex feeds, etc.
+        - Build Python script pipelines (cron → script → JSON file
+          → dashboard reads on reload).
+        - Embed data directly in the HTML when you update the file.
+        - Find ANY data source from the internet and integrate it.
+          No restrictions on data sources or update patterns.
+
+        IDEAS (not requirements):
+        - Price charts (candlestick, line, area, heatmap)
+        - P&L tracking, risk gauges, progress bars
+        - News feeds, sentiment widgets, correlation matrices
+        - Order flow, volume profile, funding rates
+        - Economic calendar overlays, regime indicators
+        - Multiple timeframe views, comparison charts
+        - Any visualization that helps monitor this specific trade
+        ═══════════════════════════════════════════════════════════════ -->
+
+   <div id="app" class="max-w-7xl mx-auto px-4 py-6">
+     <!-- Agent: build your dashboard here -->
+   </div>
+
+ </body>
+ </html>