prizmkit 1.0.141 → 1.0.142
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/bundled/VERSION.json CHANGED

@@ -103,14 +103,37 @@ Detect user intent from their message, then follow the corresponding workflow:
      --action status 2>/dev/null
    ```
 
-4. **
+4. **Run environment preflight checks** (database connectivity, migrations, dev server):
+
+   Run the preflight script to auto-detect the database type, verify env vars, test connectivity, and check migration status:
+   ```bash
+   python3 ${SKILL_DIR}/scripts/preflight-check.py feature-list.json
+   ```
+
+   The script:
+   - Reads `global_context.database` from `feature-list.json` and `.prizmkit/config.json`
+   - Scans `.env.local` / `.env` for connection variables (supports Supabase, PostgreSQL, MySQL, MongoDB, Firebase, and generic `DATABASE_URL`)
+   - Tests connectivity using the appropriate method per database type
+   - Checks migration status (Prisma, Drizzle, Supabase raw SQL, or generic migration directories)
+   - Checks if the dev server is running (from `browser_interaction` URLs)
+   - Outputs `PREFLIGHT ✓` (pass), `PREFLIGHT ⚠` (warning), or `PREFLIGHT ℹ` (info) lines
+   - Exits 0 (all clear), 1 (warnings found), or 2 (error — feature list not found)
+
+   If the script reports `⚠` warnings, present them to the user and ask:
+   > "Environment preflight found issues (listed above). The pipeline can still run, but database-related features may produce code that passes mock tests without real database verification. Continue anyway?"
+
+   Wait for user confirmation. If they want to fix issues first, suggest remediation based on the warnings (apply migrations, configure env vars, check database service status).
+
+   If `global_context.database` is absent and no features mention database keywords, the script skips DB checks automatically.
+
+5. **Ask execution mode** (first user decision):
 
    Present the three modes and ask the user to choose:
    - **(1) Foreground** (recommended) — pipeline runs in the current session via `run.sh run`. Visible output and direct error feedback.
   - **(2) Background daemon** — pipeline runs fully detached via `launch-daemon.sh`. Survives AI CLI session closure.
   - **(3) Manual** — display the final assembled commands only. Do not execute anything. User runs them on their own.
 
-
+6. **Present configuration options**: After execution mode is chosen, show the remaining options with defaults. Ask the user to confirm or override.
 
    **Configuration Options:**
 
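The `PREFLIGHT ✓ / ⚠ / ℹ` line format introduced in the new step 4 is easy to consume programmatically. A minimal sketch of a caller tallying the levels from captured output; the sample lines are hypothetical, but follow the documented `PREFLIGHT <symbol> <message>` shape:

```python
import re

# Hypothetical captured stdout from preflight-check.py.
sample = """\
PREFLIGHT ℹ Database: supabase
PREFLIGHT ✓ NEXT_PUBLIC_SUPABASE_URL configured
PREFLIGHT ⚠ Database API unreachable (Supabase HTTP 000)
"""

def summarize(output: str) -> dict:
    """Count pass/warn/info lines in preflight output."""
    counts = {"✓": 0, "⚠": 0, "ℹ": 0}
    for line in output.splitlines():
        m = re.match(r"PREFLIGHT (✓|⚠|ℹ) (.*)", line)
        if m:
            counts[m.group(1)] += 1
    return counts

counts = summarize(sample)
print(counts)  # {'✓': 1, '⚠': 1, 'ℹ': 1}
```

Any nonzero `⚠` count corresponds to the script's exit code 1, which is when the workflow above pauses to ask the user whether to continue.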
@@ -149,7 +172,7 @@ Detect user intent from their message, then follow the corresponding workflow:
    Want to change any options, or proceed with these defaults?
    ```
 
-
+7. **Show final command**: Assemble the complete command from execution mode + confirmed configuration, and present it to the user.
 
    **Foreground command:**
    ```bash
@@ -171,7 +194,7 @@ Detect user intent from their message, then follow the corresponding workflow:
      --env "VERBOSE=1 ENABLE_CRITIC=true MAX_RETRIES=5"
    ```
 
-   **Manual mode**: Print the assembled command(s) and **stop here**. Do not execute anything. Do not proceed to step
+   **Manual mode**: Print the assembled command(s) and **stop here**. Do not execute anything. Do not proceed to step 8.
    ```
    # To run in foreground:
    VERBOSE=1 dev-pipeline/run.sh run feature-list.json
@@ -183,13 +206,13 @@ Detect user intent from their message, then follow the corresponding workflow:
    dev-pipeline/run.sh status feature-list.json
    ```
 
-
+8. **Confirm and launch** (Foreground and Background only — Manual mode ends at step 7):
 
    Ask: "Ready to launch the pipeline with the above command?"
 
-   After user confirms, execute the command from step
+   After user confirms, execute the command from step 7.
 
-
+9. **Post-launch** (depends on execution mode):
 
    **If foreground**: Pipeline runs to completion in the terminal. After it finishes:
    - Summarize results: total features, succeeded, failed, skipped
@@ -354,6 +377,9 @@ After pipeline completion, if features have `browser_interaction` fields and `pl
 | All features blocked/failed | Show status, suggest daemon-safe recovery: `dev-pipeline/reset-feature.sh <F-XXX> --clean --run feature-list.json` |
 | `playwright-cli` not installed | Browser verification skipped (non-blocking). Suggest: `npm install -g @playwright/cli@latest && playwright-cli install --skills` |
 | Permission denied on script | Run `chmod +x dev-pipeline/launch-daemon.sh dev-pipeline/run.sh` |
+| `.env.local` missing or incomplete | Warn: database connection variables not found. Suggest creating env file with required connection variables for the project's database |
+| Database unreachable | Warn: database features will produce mock-only tests. Suggest checking database service status and connection credentials |
+| Migrations not applied | Warn: tables or schema referenced in migration files not found in database. Suggest applying pending migrations |
 
 ### Integration Notes
 
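The foreground command shown in the diff is just an environment-variable prefix in front of `dev-pipeline/run.sh run <feature-list>`. A hedged sketch of the assembly that step 7 describes; only `VERBOSE`, `ENABLE_CRITIC`, and `MAX_RETRIES` appear in the diff, so treat any other option names as placeholders:

```python
def assemble_foreground(options: dict, feature_list: str = "feature-list.json") -> str:
    """Join confirmed options into an env-var prefix, then append the run command."""
    prefix = " ".join(f"{k}={v}" for k, v in options.items())
    return f"{prefix} dev-pipeline/run.sh run {feature_list}".strip()

print(assemble_foreground({"VERBOSE": 1}))
# VERBOSE=1 dev-pipeline/run.sh run feature-list.json
```

In Manual mode this string is only printed; in Foreground mode it is what gets executed after the step-8 confirmation.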
@@ -0,0 +1,462 @@
+#!/usr/bin/env python3
+"""
+dev-pipeline environment preflight checker.
+
+Detects database type from feature-list.json / .prizmkit/config.json,
+verifies env vars, tests connectivity, and checks migration status.
+
+Usage:
+    python3 preflight-check.py [feature-list.json]
+
+Output: PREFLIGHT lines to stdout (✓ / ⚠ / ℹ), JSON summary to stderr.
+Exit code: 0 = all clear, 1 = warnings found, 2 = error.
+"""
+
+import json
+import glob
+import os
+import re
+import subprocess
+import sys
+
+# ── Config ──────────────────────────────────────────────────────
+
+ENV_FILES = [".env.local", ".env", ".env.development.local", ".env.development"]
+
+# (group_label, regex_pattern) — matched against env var names.
+# IMPORTANT: use non-capturing groups (?:...) inside patterns so that
+# group(1) in the scan regex captures the full variable name.
+DB_VAR_PATTERNS = [
+    ("SUPABASE_URL", r"(?:NEXT_PUBLIC_)?SUPABASE_URL"),
+    ("SUPABASE_KEY", r"(?:NEXT_PUBLIC_)?SUPABASE_(?:ANON_KEY|SERVICE_ROLE_KEY)"),
+    ("DATABASE_URL", r"DATABASE_URL"),
+    ("DB_CONNECTION", r"DB_(?:HOST|CONNECTION|URL|PORT|NAME|USER|PASSWORD)"),
+    ("FIREBASE", r"(?:NEXT_PUBLIC_)?FIREBASE_(?:API_KEY|PROJECT_ID|AUTH_DOMAIN)"),
+    ("MONGODB", r"MONGO(?:DB)?_(?:URI|URL|CONNECTION)"),
+    ("REDIS", r"REDIS_(?:URL|HOST|PORT)"),
+    ("MYSQL", r"(?:MYSQL|PLANETSCALE)_(?:URL|HOST|DATABASE)"),
+    ("POSTGRES", r"(?:PG|POSTGRES(?:QL)?)_(?:URL|HOST|CONNECTION)"),
+]
+
+DB_KEYWORDS = [
+    "migration", "database", "create table", "table", "rls",
+    "storage bucket", "schema", "model", "prisma", "drizzle",
+    "sequelize", "typeorm", "supabase", "firebase", "mongodb",
+    "postgres", "mysql", "sqlite", "redis",
+]
+
+warnings = []
+passes = []
+infos = []
+
+
+def out(level, msg):
+    """Print a preflight line and collect it."""
+    print(f"PREFLIGHT {level} {msg}")
+    if level == "⚠":
+        warnings.append(msg)
+    elif level == "✓":
+        passes.append(msg)
+    else:
+        infos.append(msg)
+
+
+# ── 1. Detect DB type and DB-related features ──────────────────
+
+def detect_db(feature_list_path):
+    """Return (db_type_str, list_of_feature_ids_with_db)."""
+    db_str = ""
+    db_features = []
+
+    try:
+        with open(feature_list_path) as f:
+            data = json.load(f)
+    except Exception:
+        return "", []
+
+    db_str = data.get("global_context", {}).get("database", "")
+
+    if not db_str:
+        try:
+            with open(".prizmkit/config.json") as f:
+                cfg = json.load(f)
+                db_str = cfg.get("tech_stack", {}).get("database", "")
+        except Exception:
+            pass
+
+    if not db_str:
+        return "", []
+
+    for feat in data.get("features", []):
+        desc = (feat.get("description", "") + " " + feat.get("title", "")).lower()
+        if any(k in desc for k in DB_KEYWORDS):
+            db_features.append(feat["id"])
+
+    return db_str, db_features
+
+
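The keyword scan in `detect_db` above is a plain substring match over each feature's title plus description. A standalone sketch with a shortened keyword list and hypothetical feature records:

```python
# Subset of the script's DB_KEYWORDS list, for illustration.
DB_KEYWORDS = ["migration", "database", "create table", "table", "rls"]

def is_db_feature(feat: dict) -> bool:
    """Mirror detect_db's check: any keyword appears in description + title."""
    desc = (feat.get("description", "") + " " + feat.get("title", "")).lower()
    return any(k in desc for k in DB_KEYWORDS)

features = [
    {"id": "F-001", "title": "Landing page", "description": "Static hero section"},
    {"id": "F-002", "title": "Profiles", "description": "Create table with RLS policies"},
]
print([f["id"] for f in features if is_db_feature(f)])  # ['F-002']
```

Because it is substring-based, broad words like "table" will also catch features about, say, HTML tables; the script tolerates false positives since they only trigger extra checks.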
+# ── 2. Scan env files for DB connection vars ────────────────────
+
+def scan_env_vars():
+    """Return (env_file_used, {var_name: value})."""
+    for env_file in ENV_FILES:
+        if not os.path.isfile(env_file):
+            continue
+        found = {}
+        with open(env_file) as f:
+            content = f.read()
+        for _group, pattern in DB_VAR_PATTERNS:
+            for m in re.finditer(
+                r"^(?!#)(" + pattern + r")\s*=\s*(.+)",
+                content,
+                re.MULTILINE | re.IGNORECASE,
+            ):
+                var_name = m.group(1)
+                var_val = m.group(2).strip().strip('"').strip("'")
+                if var_val:
+                    found[var_name] = var_val
+        if found:
+            return env_file, found
+    return None, {}
+
+
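The `(?:...)` requirement called out in the `DB_VAR_PATTERNS` comment matters because `scan_env_vars` wraps each pattern in its own capturing group, so `group(1)` must cover the whole variable name. A standalone sketch of the scan against a hypothetical `.env` snippet (commented-out lines are skipped by the `(?!#)` lookahead):

```python
import re

pattern = r"(?:NEXT_PUBLIC_)?SUPABASE_URL"  # one entry from DB_VAR_PATTERNS
content = (
    "# SUPABASE_URL=commented-out\n"
    "NEXT_PUBLIC_SUPABASE_URL=https://abc123.supabase.co\n"  # hypothetical value
)

found = {}
for m in re.finditer(r"^(?!#)(" + pattern + r")\s*=\s*(.+)",
                     content, re.MULTILINE | re.IGNORECASE):
    # group(1) is the full variable name thanks to the non-capturing (?:...) prefix.
    found[m.group(1)] = m.group(2).strip()

print(found)  # {'NEXT_PUBLIC_SUPABASE_URL': 'https://abc123.supabase.co'}
```

Had the pattern used a capturing `(NEXT_PUBLIC_)?` prefix, `group(1)` would capture only `NEXT_PUBLIC_` and the variable name would be lost.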
+# ── 3. Connectivity checks (per DB type) ───────────────────────
+
+def _curl_code(url, headers=None, timeout=10):
+    """Run curl and return HTTP status code string."""
+    cmd = ["curl", "-s", "-o", "/dev/null", "-w", "%{http_code}",
+           "--max-time", str(timeout), url]
+    for h in (headers or []):
+        cmd += ["-H", h]
+    try:
+        r = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout + 5)
+        return r.stdout.strip()
+    except Exception:
+        return "000"
+
+
+def _get_var(env_vars, *patterns):
+    """Find first env var matching any pattern."""
+    for pat in patterns:
+        for k, v in env_vars.items():
+            if re.match(pat + "$", k, re.IGNORECASE) and v:
+                return k, v
+    return None, None
+
+
+def check_connectivity(db_type, env_vars):
+    """Test database connectivity. Returns True if connected."""
+    dt = db_type.lower()
+
+    # ── Supabase ──
+    if "supabase" in dt:
+        _, url = _get_var(env_vars, r"(?:NEXT_PUBLIC_)?SUPABASE_URL")
+        _, key = _get_var(env_vars, r"(?:NEXT_PUBLIC_)?SUPABASE_ANON_KEY")
+        if not key:
+            _, key = _get_var(env_vars, r"SUPABASE_SERVICE_ROLE_KEY")
+        if not (url and key):
+            out("⚠", "Supabase URL or anon key not found — cannot test connectivity")
+            return False
+        # Find a table from first migration to test against
+        test_table = "profiles"
+        mig_files = sorted(glob.glob("supabase/migrations/*.sql"))
+        if mig_files:
+            with open(mig_files[0]) as f:
+                for line in f:
+                    m = re.search(r"CREATE TABLE\s+(?:public\.)?(\w+)", line, re.I)
+                    if m:
+                        test_table = m.group(1)
+                        break
+        code = _curl_code(
+            f"{url}/rest/v1/{test_table}?limit=0",
+            [f"apikey: {key}", f"Authorization: Bearer {key}"],
+        )
+        if code == "200":
+            out("✓", "Database API reachable (Supabase)")
+            return True
+        else:
+            out("⚠", f"Database API unreachable (Supabase HTTP {code})")
+            return False
+
+    # ── PostgreSQL / Neon ──
+    if any(k in dt for k in ("postgres", "pg", "neon")):
+        _, db_url = _get_var(env_vars, r"DATABASE_URL", r"POSTGRES(QL)?_URL", r"PG_URL")
+        if not db_url:
+            out("⚠", "No DATABASE_URL found — cannot test PostgreSQL connectivity")
+            return False
+        try:
+            r = subprocess.run(
+                ["pg_isready", "-d", db_url],
+                capture_output=True, text=True, timeout=10,
+            )
+            if r.returncode == 0:
+                out("✓", "PostgreSQL reachable")
+                return True
+            out("⚠", f"PostgreSQL unreachable: {r.stderr.strip()}")
+            return False
+        except FileNotFoundError:
+            out("ℹ", "pg_isready not installed — cannot verify PostgreSQL connectivity")
+            return False
+        except Exception as e:
+            out("⚠", f"PostgreSQL connectivity test failed: {e}")
+            return False
+
+    # ── MySQL / PlanetScale / MariaDB ──
+    if any(k in dt for k in ("mysql", "planetscale", "mariadb")):
+        _, db_url = _get_var(env_vars, r"DATABASE_URL", r"MYSQL_(URL|HOST)")
+        if not db_url:
+            out("⚠", "No DATABASE_URL found — cannot test MySQL connectivity")
+            return False
+        try:
+            hostport = db_url.split("@")[-1].split("/")[0] if "@" in db_url else db_url
+            host = hostport.split(":")[0]
+            port_args = ["-P", hostport.split(":")[1]] if ":" in hostport else []
+            r = subprocess.run(
+                ["mysqladmin", "ping", "-h", host] + port_args,
+                capture_output=True, text=True, timeout=10,
+            )
+            if "alive" in r.stdout.lower():
+                out("✓", "MySQL reachable")
+                return True
+            out("⚠", "MySQL unreachable")
+            return False
+        except FileNotFoundError:
+            out("ℹ", "mysqladmin not installed — cannot verify MySQL connectivity")
+            return False
+        except Exception as e:
+            out("⚠", f"MySQL connectivity test failed: {e}")
+            return False
+
+    # ── MongoDB ──
+    if "mongo" in dt:
+        _, db_url = _get_var(env_vars, r"MONGO(DB)?_(URI|URL|CONNECTION)", r"DATABASE_URL")
+        if not db_url:
+            out("⚠", "No MongoDB URI found — cannot test connectivity")
+            return False
+        try:
+            r = subprocess.run(
+                ["mongosh", "--eval", "db.runCommand({ping:1})", db_url, "--quiet"],
+                capture_output=True, text=True, timeout=10,
+            )
+            if r.returncode == 0:
+                out("✓", "MongoDB reachable")
+                return True
+            out("⚠", "MongoDB unreachable")
+            return False
+        except FileNotFoundError:
+            out("ℹ", "mongosh not installed — cannot verify MongoDB connectivity")
+            return False
+        except Exception as e:
+            out("⚠", f"MongoDB connectivity test failed: {e}")
+            return False
+
+    # ── Firebase ──
+    if "firebase" in dt:
+        _, project_id = _get_var(env_vars, r"(NEXT_PUBLIC_)?FIREBASE_PROJECT_ID")
+        if not project_id:
+            out("⚠", "No Firebase project ID found — cannot test connectivity")
+            return False
+        code = _curl_code(
+            f"https://firestore.googleapis.com/v1/projects/{project_id}/databases/(default)/documents?pageSize=0"
+        )
+        if code in ("200", "401", "403"):
+            out("✓", "Firebase project reachable")
+            return True
+        out("⚠", f"Firebase unreachable (HTTP {code})")
+        return False
+
+    # ── Generic DATABASE_URL fallback ──
+    _, db_url = _get_var(env_vars, r"DATABASE_URL")
+    if db_url and "://" in db_url:
+        proto = db_url.split("://")[0]
+        if proto in ("postgres", "postgresql"):
+            try:
+                r = subprocess.run(
+                    ["pg_isready", "-d", db_url],
+                    capture_output=True, text=True, timeout=10,
+                )
+                ok = r.returncode == 0
+                out("✓" if ok else "⚠", f"PostgreSQL {'reachable' if ok else 'unreachable'}")
+                return ok
+            except FileNotFoundError:
+                out("ℹ", "pg_isready not installed — cannot verify connectivity")
+        elif proto in ("mysql", "mariadb"):
+            out("ℹ", "MySQL DATABASE_URL found — install mysqladmin to verify connectivity")
+        elif proto in ("mongodb", "mongodb+srv"):
+            out("ℹ", "MongoDB DATABASE_URL found — install mongosh to verify connectivity")
+        else:
+            out("ℹ", f"DATABASE_URL found (protocol: {proto}) — cannot auto-verify")
+        return False
+
+    out("ℹ", f"Database type \"{db_type}\" detected but no connection variables found")
+    return False
+
+
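The MySQL branch above derives the `mysqladmin` host and port by string-splitting the URL rather than using a URL parser. The same logic extracted as a standalone helper (the URL is hypothetical; note this naive split assumes the password contains no `/`):

```python
def split_host_port(db_url: str):
    """Mirror the MySQL branch: drop credentials and path, keep host[:port]."""
    hostport = db_url.split("@")[-1].split("/")[0] if "@" in db_url else db_url
    host = hostport.split(":")[0]
    port = hostport.split(":")[1] if ":" in hostport else None
    return host, port

print(split_host_port("mysql://user:pass@db.example.com:3306/app"))
# ('db.example.com', '3306')
```

For a preflight heuristic this is adequate; a stricter implementation could use `urllib.parse.urlsplit` to handle percent-encoded credentials.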
+# ── 4. Migration status ────────────────────────────────────────
+
+def check_migrations(db_type, env_vars, connected):
+    """Check whether migrations have been applied."""
+    dt = db_type.lower()
+    checked = False
+
+    # ── Prisma ──
+    if os.path.isdir("prisma/migrations") or os.path.isfile("prisma/schema.prisma"):
+        checked = True
+        try:
+            env = os.environ.copy()
+            env.update(env_vars)
+            r = subprocess.run(
+                ["npx", "prisma", "migrate", "status"],
+                capture_output=True, text=True, timeout=30, env=env,
+            )
+            if "not yet been applied" in r.stdout.lower():
+                out("⚠", "Prisma: unapplied migrations detected")
+                for line in r.stdout.splitlines():
+                    if "not yet been applied" in line.lower() or line.strip().startswith("- "):
+                        out("⚠", f" {line.strip()}")
+            elif r.returncode == 0:
+                out("✓", "Prisma: all migrations applied")
+            else:
+                snippet = (r.stderr or r.stdout or "").strip()[:200]
+                snippet = re.sub(r"://[^\s]+@", "://[REDACTED]@", snippet)
+                out("⚠", f"Prisma migrate status failed: {snippet}")
+        except FileNotFoundError:
+            out("ℹ", "npx not found — cannot check Prisma migration status")
+        except Exception as e:
+            out("⚠", f"Prisma check failed: {e}")
+
+    # ── Drizzle ──
+    if os.path.isdir("drizzle") and glob.glob("drizzle/*.sql"):
+        checked = True
+        out("ℹ", "Drizzle migrations found — verify with `npx drizzle-kit push` or `npx drizzle-kit migrate`")
+
+    # ── Supabase raw SQL ──
+    if os.path.isdir("supabase/migrations") and "supabase" in dt:
+        checked = True
+        url = env_vars.get("NEXT_PUBLIC_SUPABASE_URL", env_vars.get("SUPABASE_URL", ""))
+        key = env_vars.get("NEXT_PUBLIC_SUPABASE_ANON_KEY", env_vars.get("SUPABASE_ANON_KEY", ""))
+
+        if url and key and connected:
+            for mig in sorted(glob.glob("supabase/migrations/*.sql")):
+                with open(mig) as f:
+                    content = f.read()
+                bn = os.path.basename(mig)
+                for match in re.finditer(
+                    r"CREATE TABLE\s+(?:(public|auth|storage)\.)?(\w+)",
+                    content, re.I,
+                ):
+                    schema = (match.group(1) or "public").lower()
+                    tbl = match.group(2)
+                    if schema != "public":
+                        continue  # auth/storage tables not accessible via REST API
+                    code = _curl_code(
+                        f"{url}/rest/v1/{tbl}?limit=0",
+                        [f"apikey: {key}", f"Authorization: Bearer {key}"],
+                        timeout=5,
+                    )
+                    if code == "200":
+                        out("✓", f"{bn}: table '{tbl}' exists")
+                    else:
+                        out("⚠", f"{bn}: table '{tbl}' NOT FOUND — migration may not be applied")
+        else:
+            n = len(glob.glob("supabase/migrations/*.sql"))
+            out("ℹ", f"{n} Supabase migration file(s) found — cannot verify without API connection")
+
+    # ── Knex / Rails / generic ──
+    for mig_dir in ("migrations", "db/migrate", "db/migrations"):
+        if os.path.isdir(mig_dir) and not checked:
+            checked = True
+            n = len(os.listdir(mig_dir))
+            out("ℹ", f"{n} migration file(s) in {mig_dir}/ — verify manually that all are applied")
+
+    if not checked:
+        out("ℹ", "No migration directory detected — skipping migration check")
+
+
+# ── 5. Dev server ──────────────────────────────────────────────
+
+def check_dev_server(feature_list_path):
+    """Check if dev server is running (from browser_interaction URLs)."""
+    try:
+        with open(feature_list_path) as f:
+            data = json.load(f)
+    except Exception:
+        return
+    checked_bases = set()
+    for feat in data.get("features", []):
+        bi = feat.get("browser_interaction")
+        if bi and isinstance(bi, dict) and bi.get("url"):
+            m = re.match(r"(https?://[^/]+)", bi["url"])
+            if m:
+                base = m.group(1)
+                if base in checked_bases:
+                    continue
+                checked_bases.add(base)
+                code = _curl_code(base, timeout=5)
+                if code in ("200", "302"):
+                    out("✓", f"Dev server reachable at {base}")
+                else:
+                    out("ℹ", f"Dev server not running at {base} (AI sessions can start it)")
+
+
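`check_dev_server` reduces every `browser_interaction` URL to its `scheme://host[:port]` base and probes each base only once. The extraction-and-dedupe step in isolation, with hypothetical URLs:

```python
import re

urls = [
    "http://localhost:3000/login",
    "http://localhost:3000/dashboard",
    "http://localhost:5173/",
]

bases = []
for u in urls:
    # Same pattern as check_dev_server: scheme plus authority, path dropped.
    m = re.match(r"(https?://[^/]+)", u)
    if m and m.group(1) not in bases:
        bases.append(m.group(1))

print(bases)  # ['http://localhost:3000', 'http://localhost:5173']
```

This keeps the number of probe requests to one per distinct dev server rather than one per feature.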
+# ── Main ────────────────────────────────────────────────────────
+
+def main():
+    feature_list = sys.argv[1] if len(sys.argv) > 1 else "feature-list.json"
+
+    if not os.path.isfile(feature_list):
+        print(f"PREFLIGHT ⚠ Feature list not found: {feature_list}")
+        sys.exit(2)
+
+    # 1. Detect database
+    db_type, db_features = detect_db(feature_list)
+    if not db_type:
+        print("PREFLIGHT ℹ No database configured in global_context — skipping DB checks")
+        check_dev_server(feature_list)
+        _print_summary()
+        return
+    if not db_features:
+        print(f"PREFLIGHT ℹ Database: {db_type} (no features reference DB — skipping detailed checks)")
+        check_dev_server(feature_list)
+        _print_summary()
+        return
+
+    print(f"PREFLIGHT ℹ Database: {db_type}")
+    print(f"PREFLIGHT ℹ DB-related features: {', '.join(db_features)}")
+
+    # 2. Env vars
+    env_file, env_vars = scan_env_vars()
+    if not env_file:
+        out("⚠", "No env file found (.env.local, .env, etc.) — database connection will likely fail")
+    elif not env_vars:
+        out("⚠", f"{env_file} exists but no database connection variables detected")
+    else:
+        for var_name in sorted(env_vars.keys()):
+            out("✓", f"{var_name} configured")
+
+    # 3. Connectivity
+    connected = check_connectivity(db_type, env_vars)
+
+    # 4. Migrations
+    check_migrations(db_type, env_vars, connected)
+
+    # 5. Dev server
+    check_dev_server(feature_list)
+
+    _print_summary()
+
+
+def _print_summary():
+    """Print JSON summary to stderr and set exit code."""
+    summary = {
+        "pass_count": len(passes),
+        "warn_count": len(warnings),
+        "info_count": len(infos),
+        "warnings": warnings,
+    }
+    print(json.dumps(summary), file=sys.stderr)
+    sys.exit(1 if warnings else 0)
+
+
+if __name__ == "__main__":
+    main()
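A sketch of the summary contract `_print_summary` emits on stderr, reworked as a pure function so the exit-code rule (1 only when warnings exist, otherwise 0) is visible without calling `sys.exit`:

```python
import json

def build_summary(passes: list, warnings: list, infos: list):
    """Mirror _print_summary's JSON shape and exit-code rule."""
    summary = {
        "pass_count": len(passes),
        "warn_count": len(warnings),
        "info_count": len(infos),
        "warnings": warnings,
    }
    return summary, (1 if warnings else 0)

summary, code = build_summary(["DB reachable"], [], ["Dev server not running"])
print(json.dumps(summary), code)
# {"pass_count": 1, "warn_count": 0, "info_count": 1, "warnings": []} 0
```

Note the third documented exit code, 2 (feature list not found), is raised directly in `main` before any summary is produced.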