opentradex 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (70)
  1. package/.env.example +8 -0
  2. package/CLAUDE.md +98 -0
  3. package/README.md +246 -0
  4. package/SOUL.md +79 -0
  5. package/SPEC.md +317 -0
  6. package/SUBMISSION.md +30 -0
  7. package/architecture.excalidraw +170 -0
  8. package/architecture.png +0 -0
  9. package/bin/opentradex.mjs +4 -0
  10. package/data/.gitkeep +0 -0
  11. package/data/strategy_notes.md +158 -0
  12. package/gossip/__init__.py +0 -0
  13. package/gossip/dashboard.py +150 -0
  14. package/gossip/db.py +358 -0
  15. package/gossip/kalshi.py +492 -0
  16. package/gossip/news.py +235 -0
  17. package/gossip/trader.py +646 -0
  18. package/main.py +287 -0
  19. package/package.json +47 -0
  20. package/requirements.txt +7 -0
  21. package/src/cli.mjs +124 -0
  22. package/src/index.mjs +420 -0
  23. package/web/AGENTS.md +5 -0
  24. package/web/CLAUDE.md +1 -0
  25. package/web/README.md +36 -0
  26. package/web/components.json +25 -0
  27. package/web/eslint.config.mjs +18 -0
  28. package/web/next.config.ts +7 -0
  29. package/web/package-lock.json +11626 -0
  30. package/web/package.json +37 -0
  31. package/web/postcss.config.mjs +7 -0
  32. package/web/public/file.svg +1 -0
  33. package/web/public/globe.svg +1 -0
  34. package/web/public/next.svg +1 -0
  35. package/web/public/vercel.svg +1 -0
  36. package/web/public/window.svg +1 -0
  37. package/web/src/app/api/agent/route.ts +77 -0
  38. package/web/src/app/api/agent/stream/route.ts +87 -0
  39. package/web/src/app/api/markets/route.ts +15 -0
  40. package/web/src/app/api/news/live/route.ts +77 -0
  41. package/web/src/app/api/news/reddit/route.ts +118 -0
  42. package/web/src/app/api/news/route.ts +10 -0
  43. package/web/src/app/api/news/tiktok/route.ts +115 -0
  44. package/web/src/app/api/news/truthsocial/route.ts +116 -0
  45. package/web/src/app/api/news/twitter/route.ts +186 -0
  46. package/web/src/app/api/portfolio/route.ts +50 -0
  47. package/web/src/app/api/prices/route.ts +18 -0
  48. package/web/src/app/api/trades/route.ts +10 -0
  49. package/web/src/app/favicon.ico +0 -0
  50. package/web/src/app/globals.css +170 -0
  51. package/web/src/app/layout.tsx +36 -0
  52. package/web/src/app/page.tsx +366 -0
  53. package/web/src/components/AgentLog.tsx +71 -0
  54. package/web/src/components/LiveStream.tsx +394 -0
  55. package/web/src/components/MarketScanner.tsx +111 -0
  56. package/web/src/components/NewsFeed.tsx +561 -0
  57. package/web/src/components/PortfolioStrip.tsx +139 -0
  58. package/web/src/components/PositionsPanel.tsx +219 -0
  59. package/web/src/components/TopBar.tsx +127 -0
  60. package/web/src/components/ui/badge.tsx +52 -0
  61. package/web/src/components/ui/button.tsx +60 -0
  62. package/web/src/components/ui/card.tsx +103 -0
  63. package/web/src/components/ui/scroll-area.tsx +55 -0
  64. package/web/src/components/ui/separator.tsx +25 -0
  65. package/web/src/components/ui/tabs.tsx +82 -0
  66. package/web/src/components/ui/tooltip.tsx +66 -0
  67. package/web/src/lib/db.ts +81 -0
  68. package/web/src/lib/types.ts +130 -0
  69. package/web/src/lib/utils.ts +6 -0
  70. package/web/tsconfig.json +34 -0
package/SPEC.md ADDED
@@ -0,0 +1,317 @@
# Open Trademaxxxing - EF Hackathon Spec

## What It Is

An autonomous trading agent for Kalshi prediction markets that uses real-time news intelligence to find and exploit mispriced markets. The signal is the news - the agent listens to fast-moving information, figures out what it means for prediction markets, and trades before the crowd catches up.

**Key design decision:** Claude Code IS the agent. No Anthropic API calls. We spawn the `claude` CLI as a subprocess (Paperclip pattern), which means zero API cost if you have a Claude Max subscription. The Python modules are CLI tools that Claude Code invokes — not an orchestrator that calls an LLM.

## Core Thesis

Kalshi's retail crowd is slow to react to breaking news and public data. An agent that:

1. Scrapes news continuously (Apify — Google News, Twitter, RSS)
2. Matches news to active Kalshi markets
3. Estimates how the news shifts the true probability
4. Trades when market price is stale vs the news

...should consistently find edge on event markets that are news-driven (politics, economics, companies, world events).

## Architecture

```
┌──────────────────────────────────────────────────────────┐
│                    OPEN TRADEMAXXXING                    │
│                                                          │
│  ┌────────────────────────────────────────────────────┐  │
│  │              CLAUDE CODE (the brain)               │  │
│  │  Spawned via: claude --print --output-format       │  │
│  │    stream-json --resume <sessionId>                │  │
│  │                                                    │  │
│  │  • Reads news (shells out to gossip/news.py)       │  │
│  │  • Scans markets (shells out to gossip/kalshi.py)  │  │
│  │  • Reasons about probability (native LLM thinking) │  │
│  │  • Decides trades (shells out to gossip/trader.py) │  │
│  │  • Maintains context across cycles via --resume    │  │
│  └────────────────────────────────────────────────────┘  │
│        │              │               │                  │
│        ▼              ▼               ▼                  │
│   ┌─────────┐    ┌──────────┐    ┌─────────┐             │
│   │  NEWS   │    │ MARKETS  │    │ TRADER  │             │
│   │ (Apify) │    │ (Kalshi) │    │ (Paper) │             │
│   └─────────┘    └──────────┘    └─────────┘             │
│        │              │               │                  │
│        ▼              ▼               ▼                  │
│   news articles   market data     trades.json            │
│   + summaries     orderbooks      positions              │
│                   prices          P&L                    │
│                                                          │
│   ┌────────────────────────────────────────┐             │
│   │         DASHBOARD (Streamlit)          │             │
│   │  Live markets · News feed · Positions  │             │
│   │  Trade log · P&L · Agent reasoning     │             │
│   └────────────────────────────────────────┘             │
└──────────────────────────────────────────────────────────┘
```

## Claude Code as Agent (Paperclip Pattern)

Instead of calling the Anthropic API, we spawn the `claude` CLI as a subprocess. This is the same pattern used by Paperclip (github.com/paperclipai/paperclip).

### How it works

**`main.py`** is a thin Python loop that:
1. Spawns `claude --print - --output-format stream-json --resume <session_id>` as a subprocess
2. Pipes a prompt via stdin: "Scan markets, scrape news, find mispriced opportunities, trade."
3. Parses JSON output from stdout
4. Saves the session ID for context persistence across cycles
5. Sleeps, then repeats

**Claude Code** receives the prompt and:
1. Runs `python3 gossip/news.py` to scrape news via Apify
2. Runs `python3 gossip/kalshi.py scan` to get active markets
3. Thinks about which markets are mispriced given the news (this is the LLM reasoning — no API call, it's native)
4. Runs `python3 gossip/trader.py trade TICKER --side yes --contracts 3 --estimate 0.72 --reasoning "..."` to execute
5. Returns a summary of what it did

**Cost model:**
- Claude Max/Pro subscription → $0 per cycle (rides on subscription auth)
- ANTHROPIC_API_KEY set → API billing (fallback)

### Session persistence

`--resume <sessionId>` keeps the full conversation context across cycles. The agent remembers:
- What it researched last cycle
- What positions it holds and why
- What news it already processed (avoids duplicate analysis)
- Its evolving thesis on each market

### Spawning Claude Code (main.py)

```python
import json
import os
import subprocess

def run_agent_cycle(session_id: str | None, prompt: str) -> dict:
    cmd = ["claude", "--print", "-", "--output-format", "stream-json", "--verbose"]
    if session_id:
        cmd.extend(["--resume", session_id])

    result = subprocess.run(
        cmd,
        input=prompt,
        capture_output=True,
        text=True,
        timeout=300,
        env={**os.environ},  # inherits Claude auth
    )

    # Parse stream-json output for the session_id and final result
    new_session_id = session_id  # fall back to the old session if none is reported
    for line in result.stdout.strip().split("\n"):
        if not line:
            continue
        msg = json.loads(line)
        if msg.get("type") == "system" and "session_id" in msg:
            new_session_id = msg["session_id"]
        if msg.get("type") == "result":
            return {"session_id": new_session_id, "result": msg}

    return {"session_id": session_id, "result": None}
```

## The Loop (main.py)

Runs continuously on a configurable interval (default: 15 min):

```python
import asyncio

CYCLE_INTERVAL = 15 * 60  # seconds between cycles (configurable)

AGENT_PROMPT = """
You are Open Trademaxxxing, an autonomous prediction market trader.

Your tools (run these as shell commands):
- python3 gossip/news.py [--keywords "k1,k2"] [--hours 4]  → scrape recent news
- python3 gossip/kalshi.py scan [--categories "Economics,Politics"] [--days 14]  → active markets
- python3 gossip/kalshi.py market TICKER  → market details + orderbook
- python3 gossip/trader.py portfolio  → current positions + P&L
- python3 gossip/trader.py trade TICKER --side yes/no --contracts N --estimate 0.XX --reasoning "..."
- python3 gossip/trader.py exit TICKER --reasoning "..."

Your job each cycle:
1. Scrape news for topics relevant to active Kalshi markets
2. Scan active markets and compare prices to what the news implies
3. For any market where you see 10pp+ edge, place a trade
4. Check existing positions — exit if thesis invalidated by new news
5. Log your reasoning for every decision

Think step by step. Be specific about probability estimates.
Don't trade on noise — only trade when you can articulate WHY the market is wrong.
"""

async def main():
    session_id = load_session_id()
    while True:
        result = run_agent_cycle(session_id, AGENT_PROMPT)
        session_id = result["session_id"]
        save_session_id(session_id)
        await asyncio.sleep(CYCLE_INTERVAL)
```

## Module Breakdown

### `gossip/news.py` — News Intelligence Layer (CLI tool)

Invoked by Claude Code as: `python3 gossip/news.py --keywords "bitcoin,tariff" --hours 4`

**Apify actors:**
- `apify/google-search-scraper` — Google News results by keyword
- Twitter/X scraper (community actor) — trending topics, keyword monitoring
- RSS reader — for specific sources (Reuters, AP, Bloomberg, Fed releases)

**CLI interface:**
```
python3 gossip/news.py                           # scrape default keywords
python3 gossip/news.py --keywords "bitcoin,cpi"  # specific keywords
python3 gossip/news.py --hours 2                 # last 2 hours only
python3 gossip/news.py --trending                # just trending topics
```

**Output:** JSON to stdout — list of articles with title, url, source, published_at, snippet.

**Keyword generation:**
- Base keywords: ["inflation", "CPI", "GDP", "bitcoin", "trump", "tariff", "fed rate", "unemployment", ...]
- Claude Code dynamically picks keywords based on what markets it sees

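For illustration, a hypothetical article record in that JSON output (field names from the spec, values invented):

```python
import json

# One record from `python3 gossip/news.py` stdout; values are invented
# for illustration, only the field names come from the spec.
raw = '''[{"title": "CPI comes in at 0.2% m/m",
           "url": "https://example.com/cpi",
           "source": "Reuters",
           "published_at": "2025-06-11T12:31:00Z",
           "snippet": "Inflation cooled more than expected..."}]'''

articles = json.loads(raw)
for a in articles:
    print(a["published_at"], a["source"], "-", a["title"])
```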
### `gossip/kalshi.py` — Kalshi API Client (CLI tool)

Invoked by Claude Code as: `python3 gossip/kalshi.py scan` or `python3 gossip/kalshi.py market TICKER`

Port from Casket Trader's scanner.py + trade.py, plus patterns from kalshi-trading-bot-cli.

**CLI interface:**
```
python3 gossip/kalshi.py scan                          # all active markets
python3 gossip/kalshi.py scan --categories "Economics" # filtered
python3 gossip/kalshi.py scan --days 7                 # closing within 7 days
python3 gossip/kalshi.py market KXCPI-26MAY-T0.5       # single market details
python3 gossip/kalshi.py orderbook KXCPI-26MAY-T0.5    # orderbook depth
python3 gossip/kalshi.py search "bitcoin"              # search by keyword
```

**Output:** JSON to stdout.

**Market dataclass:**
- ticker, title, category, rules, close_time, yes_bid, yes_ask, volume, open_interest, spread

**Filtering:**
- Skip 15-min crypto (not news-tradeable)
- Skip sports/esports
- Focus on: Economics, Politics, Companies, World, Science/Tech, Climate
- Minimum volume/OI thresholds

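A minimal sketch of that filter; the function name, ticker prefix, and thresholds are ours for illustration, not kalshi.py's actual API:

```python
# Category/volume filter sketch. FOCUS comes from the spec; the ticker
# prefix and the threshold defaults are invented placeholders.
FOCUS = {"Economics", "Politics", "Companies", "World", "Science/Tech", "Climate"}
SKIP_PREFIXES = ("KXBTC15M",)  # hypothetical 15-min crypto ticker prefix

def is_tradeable(m: dict, min_volume: int = 100, min_oi: int = 50) -> bool:
    """True if a market dict passes the category and liquidity gates."""
    if m["category"] not in FOCUS:          # drops sports/esports etc.
        return False
    if m["ticker"].startswith(SKIP_PREFIXES):  # drops 15-min crypto
        return False
    return m["volume"] >= min_volume and m["open_interest"] >= min_oi
```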
**Auth:**
- Unauthenticated for market data (public API)
- RSA key-based auth for real trading (from kalshi-trading-bot-cli pattern)
- Demo API support: `KALSHI_USE_DEMO=true` → `https://demo-api.kalshi.co/trade-api/v2`

### `gossip/trader.py` — Execution Engine (CLI tool)

Invoked by Claude Code as: `python3 gossip/trader.py trade TICKER --side yes --contracts 3 --estimate 0.72 --reasoning "..."`

**CLI interface:**
```
python3 gossip/trader.py portfolio                    # show positions + P&L
python3 gossip/trader.py trade TICKER --side yes --contracts 3 --estimate 0.72 --confidence high --reasoning "..."
python3 gossip/trader.py exit TICKER --reasoning "..."
python3 gossip/trader.py settle TICKER --outcome yes  # settle resolved market
python3 gossip/trader.py history                      # trade log
```

**Sizing (Kelly criterion):**
```
edge           = estimated_prob - market_price       (for YES side)
kelly_fraction = edge / (1 - market_price)
bet_size       = bankroll * kelly_fraction * 0.5     (half-Kelly for safety)
contracts      = floor(bet_size / market_price)
```
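The sizing formula as a runnable sketch (the function name and example numbers are ours):

```python
import math

def kelly_contracts(estimated_prob: float, market_price: float, bankroll: float) -> int:
    """Half-Kelly position size for a YES position, per the formula above."""
    edge = estimated_prob - market_price
    if edge <= 0:
        return 0  # no edge, no trade
    kelly_fraction = edge / (1 - market_price)
    bet_size = bankroll * kelly_fraction * 0.5  # half-Kelly for safety
    return math.floor(bet_size / market_price)

# e.g. a 0.75 estimate vs a 50c market with a $1,000 paper bankroll:
print(kelly_contracts(0.75, 0.50, 1000))  # → 500
```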

**Risk rules:**
- Max 30% of bankroll on one position
- Max 5 concurrent positions
- Min edge: 10pp (configurable)
- Min confidence: medium
- Paper mode by default — no real money without explicit `--live` flag

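A sketch of how these gates might compose into a single pre-trade check; the function itself is illustrative, not trader.py's actual interface:

```python
# Thresholds mirror the risk rules above; the function is a sketch.
MAX_POSITION_FRACTION = 0.30
MAX_POSITIONS = 5
MIN_EDGE = 0.10  # 10 percentage points
CONFIDENCE_RANK = {"low": 0, "medium": 1, "high": 2}

def passes_risk_gates(bet_size: float, bankroll: float,
                      open_positions: int, edge: float, confidence: str) -> bool:
    """All gates must pass before a trade is placed."""
    if bet_size > MAX_POSITION_FRACTION * bankroll:
        return False
    if open_positions >= MAX_POSITIONS:
        return False
    if edge < MIN_EDGE:
        return False
    return CONFIDENCE_RANK.get(confidence, 0) >= CONFIDENCE_RANK["medium"]
```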
**Trade logging:**
- Every trade saved to `data/trades.json` with full context: news trigger, LLM reasoning, probability estimate, entry price, timestamp
- Portfolio state in `data/portfolio.json`

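One plausible shape for a `data/trades.json` record (field list from the spec; the exact schema in trader.py may differ):

```python
import json

# Illustrative record; every field name below is from the spec's
# "full context" list, but the values are invented.
record = {
    "ticker": "KXCPI-26MAY-T0.5",
    "side": "yes",
    "contracts": 3,
    "entry_price": 0.60,
    "estimate": 0.72,
    "news_trigger": "CPI print cooler than consensus",
    "reasoning": "Market has not repriced after the 8:30am release",
    "timestamp": "2025-06-11T12:45:00Z",
}
print(json.dumps(record, indent=2))
```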
### `gossip/dashboard.py` — Streamlit Demo Dashboard

For the hackathon live demo. Shows:

1. **Active Markets** — Kalshi markets the agent is watching, with current prices
2. **News Feed** — Latest scraped articles, color-coded by relevance to markets
3. **Trade Signals** — Agent's analysis: market, news trigger, estimated prob vs market price, edge
4. **Positions** — Current paper portfolio with P&L
5. **Agent Log** — Running log of agent reasoning (what it researched, why it traded/passed)

Run with: `streamlit run gossip/dashboard.py`

## Demo Strategy (Hackathon Day)

For the live demo, focus on markets that resolve quickly and have active news:

1. **Crypto markets** (daily/weekly BTC/ETH price targets) — fast-moving, news-driven
2. **Politics/tariffs** — Trump administration moves, tariff announcements
3. **Economic data** — CPI, jobs numbers, GDP if timing aligns
4. **Company earnings** — if any earnings calls happen during the hackathon

Pre-seed the agent with a few positions before the demo so there's a portfolio to show.

## Tech Stack

- Python 3.11+, asyncio
- Claude Code CLI as the LLM brain (zero API cost on Max subscription)
- Apify (news scraping) — free tier, Google News + Twitter + web search + article extraction
- Kalshi REST API (markets + trading) with RSA auth
- SQLite (data/gossip.db) — trades, news, market snapshots, agent logs. Single file, zero config.
- Streamlit (real-time dashboard)
- JSON files as secondary persistence (trader.py dual-writes)

## File Map

```
open-trademaxxxing/
├── SPEC.md            ← this file
├── main.py            ← thin loop: spawn Claude Code, sleep, repeat
├── requirements.txt
├── .env.example
├── .gitignore
├── gossip/
│   ├── __init__.py
│   ├── db.py          ← SQLite database layer (trades, news, snapshots, logs)
│   ├── news.py        ← CLI tool: Apify news scraping
│   ├── kalshi.py      ← CLI tool: Kalshi API client
│   ├── trader.py      ← CLI tool: trade execution + sizing
│   └── dashboard.py   ← Streamlit real-time dashboard
├── data/
│   ├── gossip.db      ← SQLite database (source of truth for dashboard)
│   ├── trades.json    ← trade log with reasoning (secondary)
│   └── session_id.txt ← Claude Code session persistence
└── references/        ← (gitignored) cloned repos for reference
    ├── prediction-market-assistant/
    └── kalshi-trading-bot-cli/
```

## Build Order (Hackathon Timeline)

1. **kalshi.py** — Get markets loading, filtering, search working. Port from Casket Trader + kalshi-trading-bot-cli patterns.
2. **news.py** — Apify integration, scrape Google News for market keywords. CLI tool outputting JSON.
3. **trader.py** — Paper trading, Kelly sizing, trade logging. CLI tool.
4. **main.py** — Wire the Claude Code subprocess loop. One end-to-end cycle.
5. **dashboard.py** — Streamlit UI showing everything. Polish for demo.

Steps 1-3 can be parallelized across team members. Step 4 is where it all comes together — Claude Code as the brain tying news to markets to trades.

## References

- **Paperclip** (github.com/paperclipai/paperclip) — Claude Code subprocess pattern, session persistence, skills injection
- **prediction-market-assistant** (github.com/hackingthemarkets/prediction-market-assistant) — Kalshi API pagination, Perplexity-for-research pattern
- **kalshi-trading-bot-cli** (github.com/OctagonAI/kalshi-trading-bot-cli) — RSA auth, Kelly sizing, risk gates, market search, demo mode
- **Casket Trader** (wicktastic/raghav/agent/) — Our own research agent patterns, market scanner, paper trading
package/SUBMISSION.md ADDED
@@ -0,0 +1,30 @@
## You've seen the posts. Someone on X made $1M on Polymarket betting on the election. Another turned $50K into $500K calling the Fed rate decision. We built the agent that does it for you while you sleep.

Open Trademaxxxing is a fully autonomous AI agent that scrapes news, reasons about real-world events, and executes real trades on prediction markets - no human in the loop.

Every 15 minutes, Claude Code is spawned as a subprocess with access to the internet, market data, and a trading account. Its job: figure out what's happening in the world, find where markets are wrong, and trade.

### How it works

1. **Scrape** — Pulls real-time signals from Reddit, Twitter/X, TikTok, Truth Social, and Google News via Apify
2. **Reason** — Claude reads primary sources, cross-references headlines, and estimates true probabilities
3. **Trade** — When it finds edge, it sizes positions using Kelly criterion and executes real trades on Kalshi
4. **Learn** — Writes strategy notes for its future self, building memory across cycles

No rules engine. No hardcoded strategies. Pure reasoning — the same kind those Twitter traders use, just faster and tireless.

### Where the edge comes from

The agent is a news agent at its core. It operationalizes information that retail traders skim. It focuses where the edge is richest: legislative markets (retail doesn't read bill text), confusion premiums (headlines create more uncertainty than the details warrant), and resolution lag (events happen before markets update). During our demo, it spotted a market still open on Pam Bondi's AG departure — days after Trump already fired her — and traded on it live.

### Thesis mode

Have a lead? Type a hypothesis — *"I think tariffs will escalate"* — and the agent researches it, finds relevant markets, and trades on your behalf.

### Built with

- **Claude Code** as the autonomous brain (subprocess orchestration via `--print`)
- **Apify** for real-time news and social media scraping
- **Kalshi API** for live market data and order execution
- **Next.js dashboard** with live agent streaming, portfolio tracking, and multi-source news feeds
- **SQLite** for all state — trades, news, market snapshots, agent logs