seo-intel 1.1.3 → 1.1.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/package.json +10 -2
  2. package/setup/ROADMAP.md +0 -109
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "seo-intel",
- "version": "1.1.3",
+ "version": "1.1.4",
  "description": "Local Ahrefs-style SEO competitor intelligence. Crawl → SQLite → cloud analysis.",
  "type": "module",
  "license": "SEE LICENSE IN LICENSE",
@@ -31,7 +31,15 @@
  "scheduler.js",
  "seo-audit.js",
  "lib/",
- "setup/",
+ "setup/checks.js",
+ "setup/config-builder.js",
+ "setup/engine.js",
+ "setup/installers.js",
+ "setup/models.js",
+ "setup/openclaw-bridge.js",
+ "setup/validator.js",
+ "setup/web-routes.js",
+ "setup/wizard.html",
  "config/setup-wizard.js",
  "config/example.json",
  "crawler/",
package/setup/ROADMAP.md DELETED
@@ -1,109 +0,0 @@
- # SEO Intel Setup — Roadmap
-
- > From open-source CLI tool → standalone product
-
- ## Current State (v0.2)
- [x] System detection (Node, npm, Ollama, Playwright, VRAM)
- [x] Model recommendations (VRAM-based extraction + analysis tiers)
- [x] Project configuration (target domain, competitors, crawl mode)
- [x] API key setup (Gemini, Claude, OpenAI, DeepSeek)
- [x] Pipeline validation (Ollama → API → crawl → extraction)
- [x] CLI wizard + Web wizard at /setup
- [x] GSC setup step (CSV upload + export guide + auto-detection)
- [x] License system (lib/license.js + lib/gate.js)
- [x] Free/Pro tier gating on all 23 CLI commands
- [x] Page limit enforcement (500/domain on free tier)
- [x] License status in `status` command
-
- ## Priority 1 — GSC Setup Guide
- **Status: ✅ Done (CSV upload, export guide, auto-detection)**
-
- Google Search Console is the #1 data source users need but can't figure out alone.
- Currently: manual CSV export, no API, no guidance in wizard.
-
- [ ] Add Step 3.5: "Connect Google Search Console" in web wizard
- [ ] Visual walkthrough: how to export CSVs from GSC UI (screenshots/steps)
- [ ] Auto-detect existing GSC data in `gsc/` folder
- [ ] GSC API integration (service account JSON key upload)
- [ ] Auto-fetch GSC data on schedule (replaces manual CSV)
-
- ## Priority 2 — Ollama Auto-Install
- **Status: 📋 Planned**
-
- If Ollama isn't found, offer to install it instead of just warning.
-
- [ ] macOS: `brew install ollama` or direct download
- [ ] Linux: `curl -fsSL https://ollama.com/install.sh | sh`
- [ ] Windows: direct user to installer URL
- [ ] Auto-start Ollama after install
- [ ] Auto-pull recommended model after install
-
- ## Priority 3 — Scheduling / Automation
- **Status: 📋 Planned**
-
- After setup, users need recurring crawls. "Set and forget."
-
- [ ] "Schedule weekly crawl?" step in wizard
- [ ] Cron job generator (macOS launchd / Linux cron / Windows Task Scheduler)
- [ ] Built-in scheduler (node-cron or setTimeout loop in server.js)
- [ ] Crawl → Extract → Analyze → Regenerate dashboard pipeline
- [ ] "Last run" / "Next run" display on dashboard
-
- ## Priority 4 — First Run Experience
- **Status: 📋 Planned**
-
- Don't just show CLI commands — offer to run the first crawl right there.
-
- [ ] "Run your first crawl now?" button on Step 5
- [ ] SSE progress stream showing crawl progress in real-time
- [ ] Auto-trigger extraction + analysis after crawl
- [ ] Redirect to dashboard when done
- [ ] Estimated time based on competitor count × pages per domain
-
- ## Priority 5 — Proxy & Rate Limiting
- **Status: 📋 Planned**
-
- Stealth mode users need proxy config to avoid blocks.
-
- [ ] Proxy URL input (HTTP/SOCKS5)
- [ ] Proxy rotation list upload
- [ ] Rate limit slider (requests/minute)
- [ ] Per-domain delay configuration
- [ ] "Test proxy" validation step
-
- ## Priority 6 — Notifications
- **Status: 📋 Planned**
-
- Know when things happen without checking manually.
-
- [ ] Email notifications (SMTP setup in wizard)
- [ ] Slack webhook integration
- [ ] Discord webhook integration
- [ ] Configurable triggers: crawl complete, ranking drop, new competitor page
- [ ] Weekly digest email with key metrics
-
- ## Priority 7 — Data & Backup
- **Status: 📋 Planned**
-
- Where data lives, how big it gets, how to manage it.
-
- [ ] Show data directory + size in dashboard footer
- [ ] One-click export (SQLite → JSON/CSV)
- [ ] Auto-backup before major operations
- [ ] Data retention settings (keep last N crawls)
- [ ] Cloud backup option (S3/GCS)
-
- ---
-
- ## Open Source → Product Progression
-
- | Feature | Open Source (froggo.pro) | Standalone SaaS |
- |---------|------------------------|-----------------|
- | Setup | CLI wizard | Web wizard + onboarding email |
- | Auth | None (local) | User accounts + API keys |
- | GSC | Manual CSV or API key | OAuth "Connect GSC" button |
- | Scheduling | Cron jobs | Built-in + hosted workers |
- | Notifications | Webhook only | Email + Slack + in-app |
- | Data | Local SQLite | Cloud DB + CDN dashboards |
- | Multi-user | Single | Teams + permissions |
- | Billing | Free / one-time | Subscription tiers |
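
The two changes above are connected: npm's `files` field in package.json is an include whitelist for the published tarball, and a directory entry such as `"setup/"` ships everything inside that directory, including internal documents like setup/ROADMAP.md. Swapping the directory glob for explicit file entries is what keeps the roadmap out of the 1.1.4 package. A minimal, abbreviated sketch of the pattern (file names here are just the ones visible in the diff above, not the full list):

```json
{
  "files": [
    "lib/",
    "setup/checks.js",
    "setup/engine.js",
    "setup/wizard.html"
  ]
}
```

Running `npm pack --dry-run` prints the exact file list the tarball would contain, which is a quick way to verify a whitelist like this before publishing.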