@xubylele/schema-forge 1.6.0 → 1.7.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -13,6 +13,7 @@ A modern CLI tool for database schema management with a clean DSL and automatic
  - **Default Change Detection** - Detects added/removed/modified column defaults and generates ALTER COLUMN SET/DROP DEFAULT
  - **Postgres/Supabase** - Currently supports PostgreSQL and Supabase
  - **Constraint Diffing** - Detects UNIQUE and PRIMARY KEY changes with deterministic constraint names
+ - **Live PostgreSQL Introspection** - Extract normalized schema directly from `information_schema`
 
  ## Installation
 
@@ -56,6 +57,17 @@ Run tests:
  npm test
  ```
 
+ Run real-db drift integration tests:
+
+ ```bash
+ npm run test:integration:drift
+ ```
+
+ Notes:
+
+ - Local explicit run: set `SF_RUN_REAL_DB_TESTS=true` (uses Testcontainers `postgres:16-alpine`, Docker required).
+ - CI/service mode: set `SF_USE_CI_POSTGRES=true` and `DATABASE_URL` to reuse an existing Postgres service.
+
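The two run modes in these notes can be sketched as a tiny resolver. This is a hypothetical helper, not part of the package, and the precedence between the two flags is an assumption:

```javascript
// Hypothetical sketch: decide which backend the drift integration tests
// would use, based on the env vars documented in the notes above.
// Assumption: the CI/service mode takes precedence when both are set.
function resolveDriftTestBackend(env) {
  if (env.SF_USE_CI_POSTGRES === "true" && env.DATABASE_URL) {
    // CI/service mode: reuse the existing Postgres service.
    return { mode: "ci-postgres", url: env.DATABASE_URL };
  }
  if (env.SF_RUN_REAL_DB_TESTS === "true") {
    // Local explicit run: Testcontainers spins up postgres:16-alpine.
    return { mode: "testcontainers", image: "postgres:16-alpine" };
  }
  // Neither flag set: real-db tests are skipped.
  return { mode: "skip" };
}

console.log(resolveDriftTestBackend({
  SF_USE_CI_POSTGRES: "true",
  DATABASE_URL: "postgres://ci-host/db"
}).mode); // → "ci-postgres"
```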
  ## Getting Started
 
  Here's a quick walkthrough to get started with SchemaForge:
@@ -205,9 +217,13 @@ schema-forge diff [--safe] [--force]
 
  - `--safe` - Block execution if destructive operations are detected (exits with error)
  - `--force` - Bypass safety checks and proceed with displaying destructive SQL (shows warning)
+ - `--url` - PostgreSQL connection URL for live database diff (fallback: `DATABASE_URL`)
+ - `--schema` - Comma-separated schema names to introspect (default: `public`)
 
  Shows what SQL would be generated if you ran `generate`. Useful for previewing changes. Safety behavior is the same as `generate` command. In CI environments, exits with code 3 if destructive operations are detected unless `--force` is used. See [CI Behavior](#ci-behavior) for more details.
 
+ When `--url` (or `DATABASE_URL`) is provided, `diff` compares your target DSL schema against the live PostgreSQL schema introspected from `information_schema`.
+
  ### `schema-forge import`
 
  Reconstruct `schemaforge/schema.sf` from existing PostgreSQL/Supabase SQL migrations.
@@ -234,6 +250,36 @@ Detect destructive or risky schema changes before generating/applying migrations
  schema-forge validate
  ```
 
+ Live drift validation:
+
+ ```bash
+ schema-forge validate --url "$DATABASE_URL" --json
+ ```
+
+ Live `--json` output returns a structured `DriftReport`:
+
+ ```json
+ {
+ "missingTables": ["users_archive"],
+ "extraTables": ["audit_log"],
+ "columnDifferences": [
+ {
+ "tableName": "users",
+ "missingInLive": ["nickname"],
+ "extraInLive": ["last_login"]
+ }
+ ],
+ "typeMismatches": [
+ {
+ "tableName": "users",
+ "columnName": "email",
+ "expectedType": "varchar",
+ "actualType": "text"
+ }
+ ]
+ }
+ ```
+
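A CI script consuming the `DriftReport` shape above might reduce it to the documented pass/fail exit codes. The `hasDrift` helper below is hypothetical; the field names match the example:

```javascript
// Hypothetical helper: true when a DriftReport (shape shown above)
// contains any drift at all.
function hasDrift(report) {
  return (
    report.missingTables.length > 0 ||
    report.extraTables.length > 0 ||
    report.columnDifferences.length > 0 ||
    report.typeMismatches.length > 0
  );
}

// The DriftReport example from the README, as a JS object.
const report = {
  missingTables: ["users_archive"],
  extraTables: ["audit_log"],
  columnDifferences: [
    { tableName: "users", missingInLive: ["nickname"], extraInLive: ["last_login"] }
  ],
  typeMismatches: [
    { tableName: "users", columnName: "email", expectedType: "varchar", actualType: "text" }
  ]
};

// Mirror the documented convention: exit 2 on drift, 0 when clean.
const exitCode = hasDrift(report) ? 2 : 0;
console.log(exitCode); // → 2
```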
  Validation checks include:
 
  - Dropped tables (`DROP_TABLE`, error)
@@ -247,6 +293,13 @@ Use JSON mode for CI and automation:
  schema-forge validate --json
  ```
 
+ Live mode options:
+
+ - `--url` - PostgreSQL connection URL for live drift validation (fallback: `DATABASE_URL`)
+ - `--schema` - Comma-separated schema names to introspect (default: `public`)
+
+ In live mode, exit code `2` is used when drift is detected between `state.json` and the live database.
+
  Exit codes (also see [CI Behavior](#ci-behavior)):
 
  - `3` in CI environment if destructive findings detected
@@ -258,6 +311,46 @@ Exit codes:
  - `1` when one or more `error` findings are detected
  - `0` when no `error` findings are detected (warnings alone do not fail)
 
+ ### `schema-forge doctor`
+
+ Check live database drift against your tracked `state.json`.
+
+ ```bash
+ schema-forge doctor --url "$DATABASE_URL"
+ ```
+
+ Use JSON mode for CI and automation:
+
+ ```bash
+ schema-forge doctor --url "$DATABASE_URL" --json
+ ```
+
+ Options:
+
+ - `--url` - PostgreSQL connection URL (fallback: `DATABASE_URL`)
+ - `--schema` - Comma-separated schema names to introspect (default: `public`)
+ - `--json` - Output structured drift report JSON
+
+ Exit codes:
+
+ - `0` when no drift is detected (healthy)
+ - `2` when drift is detected between `state.json` and live database schema
+
+ ### `schema-forge introspect`
+
+ Extract normalized schema directly from a live PostgreSQL database.
+
+ ```bash
+ schema-forge introspect --url "$DATABASE_URL" --json
+ ```
+
+ **Options:**
+
+ - `--url` - PostgreSQL connection URL (fallback: `DATABASE_URL`)
+ - `--schema` - Comma-separated schema names to introspect (default: `public`)
+ - `--json` - Output normalized schema as JSON
+ - `--out <path>` - Write normalized schema JSON to a file
+
  ## CI Behavior
 
  SchemaForge ensures deterministic behavior in Continuous Integration (CI) environments to prevent accidental destructive operations.
@@ -276,8 +369,8 @@ SchemaForge uses specific exit codes for different scenarios:
  | Exit Code | Meaning |
  | --------- | ------- |
  | `0` | Success - no changes or no destructive operations detected |
- | `1` | General error - validation failed, operation declined, missing files, etc. |
- | `2` | Schema validation error - invalid DSL syntax or structure |
+ | `1` | Validation/general error - invalid DSL, operation declined, missing files, etc. |
+ | `2` | Drift detected between expected state and live database schema |
  | `3` | **CI Destructive** - destructive operations detected in CI environment without `--force` |
 
  ### Destructive Operations in CI
@@ -312,6 +405,19 @@ When `CI=true`, SchemaForge will:
  - ✅ Allow explicit override with `--force` flag
  - ❌ Not accept user input for confirmation
 
+ ### Drift Integration Tests in CI
+
+ For drift reliability checks against a real database, run:
+
+ ```bash
+ npm run test:integration:drift
+ ```
+
+ The integration harness supports two deterministic paths:
+
+ - `SF_USE_CI_POSTGRES=true` + `DATABASE_URL`: uses the CI Postgres service directly.
+ - No CI Postgres env: spins up an isolated Testcontainers Postgres instance.
+
  ### Using `--safe` in CI
 
  The `--safe` flag is compatible with CI and blocks execution of destructive operations:
package/dist/cli.d.ts CHANGED
@@ -1,2 +1 @@
-
- export { }
+ #!/usr/bin/env node
package/dist/cli.js CHANGED
@@ -1,3 +1,4 @@
+ #!/usr/bin/env node
  "use strict";
  var __create = Object.create;
  var __defProp = Object.defineProperty;
@@ -45,11 +46,11 @@ function parseSchema(source) {
  "date"
  ]);
  const validIdentifierPattern = /^[A-Za-z_][A-Za-z0-9_]*$/;
- function normalizeColumnType4(type) {
+ function normalizeColumnType6(type) {
  return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
  }
  function isValidColumnType2(type) {
- const normalizedType = normalizeColumnType4(type);
+ const normalizedType = normalizeColumnType6(type);
  if (validBaseColumnTypes.has(normalizedType)) {
  return true;
  }
@@ -173,7 +174,7 @@ function parseSchema(source) {
  throw new Error(`Line ${lineNum}: Invalid column definition. Expected: <name> <type> [modifiers...]`);
  }
  const colName = tokens[0];
- const colType = normalizeColumnType4(tokens[1]);
+ const colType = normalizeColumnType6(tokens[1]);
  validateIdentifier(colName, lineNum, "column");
  if (!isValidColumnType2(colType)) {
  throw new Error(`Line ${lineNum}: Invalid column type '${tokens[1]}'. Valid types: ${Array.from(validBaseColumnTypes).join(", ")}, varchar(n), numeric(p,s)`);
@@ -690,6 +691,75 @@ var init_diff = __esm({
  }
  });
 
+ // node_modules/@xubylele/schema-forge-core/dist/core/drift-analyzer.js
+ function normalizeColumnType2(type) {
+ return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
+ }
+ function getSortedNames2(values) {
+ return Array.from(values).sort((left, right) => left.localeCompare(right));
+ }
+ function analyzeSchemaDrift(state, liveSchema) {
+ const stateTableNames = getSortedNames2(Object.keys(state.tables));
+ const liveTableNames = getSortedNames2(Object.keys(liveSchema.tables));
+ const liveTableNameSet = new Set(liveTableNames);
+ const stateTableNameSet = new Set(stateTableNames);
+ const missingTables = stateTableNames.filter((tableName) => !liveTableNameSet.has(tableName));
+ const extraTables = liveTableNames.filter((tableName) => !stateTableNameSet.has(tableName));
+ const commonTableNames = liveTableNames.filter((tableName) => stateTableNameSet.has(tableName));
+ const columnDifferences = [];
+ const typeMismatches = [];
+ for (const tableName of commonTableNames) {
+ const stateTable = state.tables[tableName];
+ const liveTable = liveSchema.tables[tableName];
+ if (!stateTable || !liveTable) {
+ continue;
+ }
+ const stateColumnNames = getSortedNames2(Object.keys(stateTable.columns));
+ const liveColumnsByName = new Map(liveTable.columns.map((column) => [column.name, column]));
+ const liveColumnNames = getSortedNames2(liveColumnsByName.keys());
+ const stateColumnNameSet = new Set(stateColumnNames);
+ const liveColumnNameSet = new Set(liveColumnNames);
+ const missingInLive = stateColumnNames.filter((columnName) => !liveColumnNameSet.has(columnName));
+ const extraInLive = liveColumnNames.filter((columnName) => !stateColumnNameSet.has(columnName));
+ if (missingInLive.length > 0 || extraInLive.length > 0) {
+ columnDifferences.push({
+ tableName,
+ missingInLive,
+ extraInLive
+ });
+ }
+ const commonColumns = stateColumnNames.filter((columnName) => liveColumnNameSet.has(columnName));
+ for (const columnName of commonColumns) {
+ const stateColumn = stateTable.columns[columnName];
+ const liveColumn = liveColumnsByName.get(columnName);
+ if (!stateColumn || !liveColumn) {
+ continue;
+ }
+ const expectedType = stateColumn.type;
+ const actualType = liveColumn.type;
+ if (normalizeColumnType2(expectedType) !== normalizeColumnType2(actualType)) {
+ typeMismatches.push({
+ tableName,
+ columnName,
+ expectedType,
+ actualType
+ });
+ }
+ }
+ }
+ return {
+ missingTables,
+ extraTables,
+ columnDifferences,
+ typeMismatches
+ };
+ }
+ var init_drift_analyzer = __esm({
+ "node_modules/@xubylele/schema-forge-core/dist/core/drift-analyzer.js"() {
+ "use strict";
+ }
+ });
+
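Stripped of the bundler wrapper, the table-level portion of `analyzeSchemaDrift` above is a pair of set differences over sorted names. A standalone sketch of just that step:

```javascript
// Standalone sketch of the table-level drift check from analyzeSchemaDrift:
// compare tracked state tables against live introspected tables.
function diffTableNames(state, liveSchema) {
  const sort = (names) => [...names].sort((a, b) => a.localeCompare(b));
  const stateNames = sort(Object.keys(state.tables));
  const liveNames = sort(Object.keys(liveSchema.tables));
  const liveSet = new Set(liveNames);
  const stateSet = new Set(stateNames);
  return {
    missingTables: stateNames.filter((name) => !liveSet.has(name)), // in state, not live
    extraTables: liveNames.filter((name) => !stateSet.has(name))    // in live, not state
  };
}

const result = diffTableNames(
  { tables: { users: {}, posts: {} } },
  { tables: { users: {}, audit_log: {} } }
);
console.log(result); // → missingTables: ["posts"], extraTables: ["audit_log"]
```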
  // node_modules/@xubylele/schema-forge-core/dist/core/validator.js
  function isValidColumnType(type) {
  const normalizedType = type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
@@ -777,15 +847,15 @@ var init_validator = __esm({
  });
 
  // node_modules/@xubylele/schema-forge-core/dist/core/safety/operation-classifier.js
- function normalizeColumnType2(type) {
+ function normalizeColumnType3(type) {
  return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
  }
  function parseVarcharLength(type) {
- const match = normalizeColumnType2(type).match(/^varchar\((\d+)\)$/);
+ const match = normalizeColumnType3(type).match(/^varchar\((\d+)\)$/);
  return match ? Number(match[1]) : null;
  }
  function parseNumericType(type) {
- const match = normalizeColumnType2(type).match(/^numeric\((\d+),(\d+)\)$/);
+ const match = normalizeColumnType3(type).match(/^numeric\((\d+),(\d+)\)$/);
  if (!match) {
  return null;
  }
@@ -795,8 +865,8 @@ function parseNumericType(type) {
  };
  }
  function classifyTypeChange(from, to) {
- const fromType = normalizeColumnType2(from);
- const toType = normalizeColumnType2(to);
+ const fromType = normalizeColumnType3(from);
+ const toType = normalizeColumnType3(to);
  const uuidInvolved = fromType === "uuid" || toType === "uuid";
  if (uuidInvolved && fromType !== toType) {
  return "DESTRUCTIVE";
@@ -868,15 +938,15 @@ var init_operation_classifier = __esm({
  });
 
  // node_modules/@xubylele/schema-forge-core/dist/core/safety/safety-checker.js
- function normalizeColumnType3(type) {
+ function normalizeColumnType4(type) {
  return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
  }
  function parseVarcharLength2(type) {
- const match = normalizeColumnType3(type).match(/^varchar\((\d+)\)$/);
+ const match = normalizeColumnType4(type).match(/^varchar\((\d+)\)$/);
  return match ? Number(match[1]) : null;
  }
  function parseNumericType2(type) {
- const match = normalizeColumnType3(type).match(/^numeric\((\d+),(\d+)\)$/);
+ const match = normalizeColumnType4(type).match(/^numeric\((\d+),(\d+)\)$/);
  if (!match) {
  return null;
  }
@@ -886,8 +956,8 @@ function parseNumericType2(type) {
  };
  }
  function generateTypeChangeMessage(from, to) {
- const fromType = normalizeColumnType3(from);
- const toType = normalizeColumnType3(to);
+ const fromType = normalizeColumnType4(from);
+ const toType = normalizeColumnType4(to);
  const uuidInvolved = fromType === "uuid" || toType === "uuid";
  if (uuidInvolved && fromType !== toType) {
  return `Type changed from ${fromType} to ${toType} (likely incompatible cast)`;
@@ -952,8 +1022,8 @@ function checkOperationSafety(operation) {
  code: "ALTER_COLUMN_TYPE",
  table: operation.tableName,
  column: operation.columnName,
- from: normalizeColumnType3(operation.fromType),
- to: normalizeColumnType3(operation.toType),
+ from: normalizeColumnType4(operation.fromType),
+ to: normalizeColumnType4(operation.toType),
  message: generateTypeChangeMessage(operation.fromType, operation.toType),
  operationKind: operation.kind
  };
@@ -2156,6 +2226,359 @@ var init_load_migrations = __esm({
  }
  });
 
+ // node_modules/@xubylele/schema-forge-core/dist/core/sql/introspect-postgres.js
+ function toTableKey(schema, table) {
+ if (schema === DEFAULT_SCHEMA) {
+ return table;
+ }
+ return `${schema}.${table}`;
+ }
+ function normalizeSchemas(schemas) {
+ const values = schemas ?? [DEFAULT_SCHEMA];
+ const deduped = /* @__PURE__ */ new Set();
+ for (const schema of values) {
+ const trimmed = schema.trim();
+ if (trimmed.length > 0) {
+ deduped.add(trimmed);
+ }
+ }
+ if (deduped.size === 0) {
+ deduped.add(DEFAULT_SCHEMA);
+ }
+ return Array.from(deduped).sort((a, b) => a.localeCompare(b));
+ }
+ function normalizeColumnType5(row) {
+ const dataType = row.data_type.toLowerCase();
+ const udtName = row.udt_name.toLowerCase();
+ if (dataType === "character varying") {
+ if (row.character_maximum_length !== null) {
+ return `varchar(${row.character_maximum_length})`;
+ }
+ return "varchar";
+ }
+ if (dataType === "timestamp with time zone") {
+ return "timestamptz";
+ }
+ if (dataType === "integer" || udtName === "int4") {
+ return "int";
+ }
+ if (dataType === "bigint" || udtName === "int8") {
+ return "bigint";
+ }
+ if (dataType === "numeric") {
+ if (row.numeric_precision !== null && row.numeric_scale !== null) {
+ return `numeric(${row.numeric_precision},${row.numeric_scale})`;
+ }
+ return "numeric";
+ }
+ if (dataType === "boolean" || udtName === "bool") {
+ return "boolean";
+ }
+ if (dataType === "uuid") {
+ return "uuid";
+ }
+ if (dataType === "text") {
+ return "text";
+ }
+ if (dataType === "date") {
+ return "date";
+ }
+ return dataType;
+ }
+ function normalizeConstraints(constraintRows, foreignKeyRows) {
+ const constraints = /* @__PURE__ */ new Map();
+ for (const row of constraintRows) {
+ const key = `${row.table_schema}.${row.table_name}.${row.constraint_name}.${row.constraint_type}`;
+ const existing = constraints.get(key);
+ if (existing) {
+ if (row.column_name !== null) {
+ existing.columns.push({
+ name: row.column_name,
+ position: row.ordinal_position ?? Number.MAX_SAFE_INTEGER
+ });
+ }
+ if (!existing.checkClause && row.check_clause) {
+ existing.checkClause = row.check_clause;
+ }
+ continue;
+ }
+ constraints.set(key, {
+ tableSchema: row.table_schema,
+ tableName: row.table_name,
+ name: row.constraint_name,
+ type: row.constraint_type,
+ columns: row.column_name === null ? [] : [{
+ name: row.column_name,
+ position: row.ordinal_position ?? Number.MAX_SAFE_INTEGER
+ }],
+ ...row.check_clause ? { checkClause: row.check_clause } : {}
+ });
+ }
+ const normalized = [];
+ for (const value of constraints.values()) {
+ value.columns.sort((left, right) => {
+ if (left.position !== right.position) {
+ return left.position - right.position;
+ }
+ return left.name.localeCompare(right.name);
+ });
+ normalized.push({
+ tableSchema: value.tableSchema,
+ tableName: value.tableName,
+ name: value.name,
+ type: value.type,
+ columns: value.columns.map((item) => item.name),
+ ...value.checkClause ? { checkClause: value.checkClause } : {}
+ });
+ }
+ const foreignKeys = /* @__PURE__ */ new Map();
+ for (const row of foreignKeyRows) {
+ const key = `${row.table_schema}.${row.table_name}.${row.constraint_name}`;
+ const existing = foreignKeys.get(key);
+ if (existing) {
+ existing.local.push({ name: row.local_column_name, position: row.position });
+ existing.referenced.push({ name: row.referenced_column_name, position: row.position });
+ continue;
+ }
+ foreignKeys.set(key, {
+ tableSchema: row.table_schema,
+ tableName: row.table_name,
+ name: row.constraint_name,
+ local: [{ name: row.local_column_name, position: row.position }],
+ referencedTableSchema: row.referenced_table_schema,
+ referencedTableName: row.referenced_table_name,
+ referenced: [{ name: row.referenced_column_name, position: row.position }]
+ });
+ }
+ for (const value of foreignKeys.values()) {
+ value.local.sort((left, right) => left.position - right.position || left.name.localeCompare(right.name));
+ value.referenced.sort((left, right) => left.position - right.position || left.name.localeCompare(right.name));
+ normalized.push({
+ tableSchema: value.tableSchema,
+ tableName: value.tableName,
+ name: value.name,
+ type: "FOREIGN KEY",
+ columns: value.local.map((item) => item.name),
+ referencedTableSchema: value.referencedTableSchema,
+ referencedTableName: value.referencedTableName,
+ referencedColumns: value.referenced.map((item) => item.name)
+ });
+ }
+ normalized.sort((left, right) => {
+ if (left.tableSchema !== right.tableSchema) {
+ return left.tableSchema.localeCompare(right.tableSchema);
+ }
+ if (left.tableName !== right.tableName) {
+ return left.tableName.localeCompare(right.tableName);
+ }
+ const typeOrderDiff = CONSTRAINT_TYPE_ORDER[left.type] - CONSTRAINT_TYPE_ORDER[right.type];
+ if (typeOrderDiff !== 0) {
+ return typeOrderDiff;
+ }
+ if (left.name !== right.name) {
+ return left.name.localeCompare(right.name);
+ }
+ return left.columns.join(",").localeCompare(right.columns.join(","));
+ });
+ return normalized;
+ }
+ async function introspectPostgresSchema(options) {
+ const schemaFilter = normalizeSchemas(options.schemas);
+ const [tableRows, columnRows, constraintRows, foreignKeyRows] = await Promise.all([
+ options.query(TABLES_QUERY, [schemaFilter]),
+ options.query(COLUMNS_QUERY, [schemaFilter]),
+ options.query(CONSTRAINTS_QUERY, [schemaFilter]),
+ options.query(FOREIGN_KEYS_QUERY, [schemaFilter])
+ ]);
+ const sortedTables = [...tableRows].sort((left, right) => {
+ if (left.table_schema !== right.table_schema) {
+ return left.table_schema.localeCompare(right.table_schema);
+ }
+ return left.table_name.localeCompare(right.table_name);
+ });
+ const sortedColumns = [...columnRows].sort((left, right) => {
+ if (left.table_schema !== right.table_schema) {
+ return left.table_schema.localeCompare(right.table_schema);
+ }
+ if (left.table_name !== right.table_name) {
+ return left.table_name.localeCompare(right.table_name);
+ }
+ if (left.ordinal_position !== right.ordinal_position) {
+ return left.ordinal_position - right.ordinal_position;
+ }
+ return left.column_name.localeCompare(right.column_name);
+ });
+ const normalizedConstraints = normalizeConstraints(constraintRows, foreignKeyRows);
+ const tableMap = /* @__PURE__ */ new Map();
+ const columnMap = /* @__PURE__ */ new Map();
+ for (const row of sortedTables) {
+ const key = toTableKey(row.table_schema, row.table_name);
+ const table = { name: key, columns: [], primaryKey: null };
+ tableMap.set(key, table);
+ columnMap.set(key, /* @__PURE__ */ new Map());
+ }
+ for (const row of sortedColumns) {
+ const key = toTableKey(row.table_schema, row.table_name);
+ const table = tableMap.get(key);
+ const columnsByName = columnMap.get(key);
+ if (!table || !columnsByName) {
+ continue;
+ }
+ const column = {
+ name: row.column_name,
+ type: normalizeColumnType5(row),
+ nullable: row.is_nullable === "YES"
+ };
+ const normalizedDefault = normalizeDefault(row.column_default);
+ if (normalizedDefault !== null) {
+ column.default = normalizedDefault;
+ }
+ table.columns.push(column);
+ columnsByName.set(column.name, column);
+ }
+ for (const constraint of normalizedConstraints) {
+ const tableKey = toTableKey(constraint.tableSchema, constraint.tableName);
+ const table = tableMap.get(tableKey);
+ const columnsByName = columnMap.get(tableKey);
+ if (!table || !columnsByName) {
+ continue;
+ }
+ if (constraint.type === "PRIMARY KEY") {
+ if (constraint.columns.length === 1) {
+ const column = columnsByName.get(constraint.columns[0]);
+ if (column) {
+ column.primaryKey = true;
+ column.nullable = false;
+ table.primaryKey = column.name;
+ }
+ }
+ continue;
+ }
+ if (constraint.type === "UNIQUE") {
+ if (constraint.columns.length === 1) {
+ const column = columnsByName.get(constraint.columns[0]);
+ if (column) {
+ column.unique = true;
+ }
+ }
+ continue;
+ }
+ if (constraint.type === "FOREIGN KEY") {
+ if (constraint.columns.length === 1 && constraint.referencedColumns && constraint.referencedColumns.length === 1 && constraint.referencedTableName) {
+ const column = columnsByName.get(constraint.columns[0]);
+ if (column) {
+ column.foreignKey = {
+ table: toTableKey(constraint.referencedTableSchema ?? DEFAULT_SCHEMA, constraint.referencedTableName),
+ column: constraint.referencedColumns[0]
+ };
+ }
+ }
+ }
+ }
+ const orderedTableNames = Array.from(tableMap.keys()).sort((left, right) => left.localeCompare(right));
+ const tables = {};
+ for (const tableName of orderedTableNames) {
+ const table = tableMap.get(tableName);
+ if (table) {
+ tables[tableName] = table;
+ }
+ }
+ return { tables };
+ }
+ var DEFAULT_SCHEMA, CONSTRAINT_TYPE_ORDER, TABLES_QUERY, COLUMNS_QUERY, CONSTRAINTS_QUERY, FOREIGN_KEYS_QUERY;
+ var init_introspect_postgres = __esm({
+ "node_modules/@xubylele/schema-forge-core/dist/core/sql/introspect-postgres.js"() {
+ "use strict";
+ init_normalize();
+ DEFAULT_SCHEMA = "public";
+ CONSTRAINT_TYPE_ORDER = {
+ "PRIMARY KEY": 0,
+ UNIQUE: 1,
+ "FOREIGN KEY": 2,
+ CHECK: 3
+ };
+ TABLES_QUERY = `
+ SELECT
+ table_schema,
+ table_name
+ FROM information_schema.tables
+ WHERE table_type = 'BASE TABLE'
+ AND table_schema = ANY($1::text[])
+ `;
+ COLUMNS_QUERY = `
+ SELECT
+ table_schema,
+ table_name,
+ column_name,
+ ordinal_position,
+ is_nullable,
+ data_type,
+ udt_name,
+ character_maximum_length,
+ numeric_precision,
+ numeric_scale,
+ column_default
+ FROM information_schema.columns
+ WHERE table_schema = ANY($1::text[])
+ `;
+ CONSTRAINTS_QUERY = `
+ SELECT
+ tc.table_schema,
+ tc.table_name,
+ tc.constraint_name,
+ tc.constraint_type,
+ kcu.column_name,
+ kcu.ordinal_position,
+ cc.check_clause
+ FROM information_schema.table_constraints tc
+ LEFT JOIN information_schema.key_column_usage kcu
+ ON tc.constraint_catalog = kcu.constraint_catalog
+ AND tc.constraint_schema = kcu.constraint_schema
+ AND tc.constraint_name = kcu.constraint_name
+ AND tc.table_schema = kcu.table_schema
+ AND tc.table_name = kcu.table_name
+ LEFT JOIN information_schema.check_constraints cc
+ ON tc.constraint_catalog = cc.constraint_catalog
+ AND tc.constraint_schema = cc.constraint_schema
+ AND tc.constraint_name = cc.constraint_name
+ WHERE tc.table_schema = ANY($1::text[])
+ AND tc.constraint_type IN ('PRIMARY KEY', 'UNIQUE', 'CHECK')
+ `;
+ FOREIGN_KEYS_QUERY = `
+ SELECT
+ src_ns.nspname AS table_schema,
+ src.relname AS table_name,
+ con.conname AS constraint_name,
+ src_attr.attname AS local_column_name,
+ ref_ns.nspname AS referenced_table_schema,
+ ref.relname AS referenced_table_name,
+ ref_attr.attname AS referenced_column_name,
+ src_key.ord AS position
+ FROM pg_constraint con
+ JOIN pg_class src
+ ON src.oid = con.conrelid
+ JOIN pg_namespace src_ns
+ ON src_ns.oid = src.relnamespace
+ JOIN pg_class ref
+ ON ref.oid = con.confrelid
+ JOIN pg_namespace ref_ns
+ ON ref_ns.oid = ref.relnamespace
+ JOIN LATERAL unnest(con.conkey) WITH ORDINALITY AS src_key(attnum, ord)
+ ON TRUE
+ JOIN LATERAL unnest(con.confkey) WITH ORDINALITY AS ref_key(attnum, ord)
+ ON ref_key.ord = src_key.ord
+ JOIN pg_attribute src_attr
+ ON src_attr.attrelid = con.conrelid
+ AND src_attr.attnum = src_key.attnum
+ JOIN pg_attribute ref_attr
+ ON ref_attr.attrelid = con.confrelid
+ AND ref_attr.attnum = ref_key.attnum
+ WHERE con.contype = 'f'
+ AND src_ns.nspname = ANY($1::text[])
+ `;
+ }
+ });
+
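The `normalizeColumnType5` helper in this module maps `information_schema.columns` rows onto SchemaForge's type spellings. Extracted and simplified for illustration (a sketch covering only the parameterized cases, without the `udt_name` fallbacks):

```javascript
// Extracted sketch of the introspection type normalizer above: maps an
// information_schema.columns row to a normalized type string.
function normalizeIntrospectedType(row) {
  const dataType = row.data_type.toLowerCase();
  if (dataType === "character varying") {
    // varchar carries its length when one is declared.
    return row.character_maximum_length !== null
      ? `varchar(${row.character_maximum_length})`
      : "varchar";
  }
  if (dataType === "timestamp with time zone") return "timestamptz";
  if (dataType === "integer") return "int";
  if (dataType === "numeric" && row.numeric_precision !== null && row.numeric_scale !== null) {
    return `numeric(${row.numeric_precision},${row.numeric_scale})`;
  }
  return dataType; // passthrough for text, uuid, date, boolean, ...
}

console.log(normalizeIntrospectedType({
  data_type: "character varying",
  character_maximum_length: 255
})); // → "varchar(255)"
```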
  // node_modules/@xubylele/schema-forge-core/dist/core/paths.js
  function getProjectRoot2(cwd = process.cwd()) {
  return cwd;
@@ -2218,6 +2641,7 @@ var init_errors = __esm({
  var dist_exports = {};
  __export(dist_exports, {
  SchemaValidationError: () => SchemaValidationError,
+ analyzeSchemaDrift: () => analyzeSchemaDrift,
  applySqlOps: () => applySqlOps,
  checkOperationSafety: () => checkOperationSafety,
  checkSchemaSafety: () => checkSchemaSafety,
@@ -2236,6 +2660,7 @@ __export(dist_exports, {
  getStatePath: () => getStatePath2,
  getTableNamesFromSchema: () => getTableNamesFromSchema,
  getTableNamesFromState: () => getTableNamesFromState,
+ introspectPostgresSchema: () => introspectPostgresSchema,
  legacyPkName: () => legacyPkName,
  legacyUqName: () => legacyUqName,
  loadMigrationSqlInput: () => loadMigrationSqlInput,
@@ -2273,6 +2698,7 @@ var init_dist = __esm({
  "use strict";
  init_parser();
  init_diff();
+ init_drift_analyzer();
  init_validator();
  init_validate();
  init_safety();
@@ -2283,6 +2709,7 @@ var init_dist = __esm({
  init_schema_to_dsl();
  init_load_migrations();
  init_split_statements();
+ init_introspect_postgres();
  init_fs();
  init_normalize();
  init_paths();
@@ -2292,12 +2719,12 @@ var init_dist = __esm({
  });
 
  // src/cli.ts
- var import_commander6 = require("commander");
+ var import_commander8 = require("commander");
 
  // package.json
  var package_default = {
  name: "@xubylele/schema-forge",
- version: "1.6.0",
+ version: "1.7.0",
  description: "Universal migration generator from schema DSL",
  main: "dist/cli.js",
  type: "commonjs",
@@ -2308,6 +2735,7 @@ var package_default = {
  build: "tsup src/cli.ts --format cjs --dts",
  dev: "ts-node src/cli.ts",
  test: "vitest",
+ "test:integration:drift": "vitest run test/drift-realdb.integration.test.ts",
  prepublishOnly: "npm run build",
  "publish:public": "npm publish --access public",
  changeset: "changeset",
@@ -2339,12 +2767,15 @@ var package_default = {
  dependencies: {
  boxen: "^8.0.1",
  chalk: "^5.6.2",
- commander: "^14.0.3"
+ commander: "^14.0.3",
+ pg: "^8.19.0"
  },
  devDependencies: {
- "@changesets/cli": "^2.29.8",
+ "@changesets/cli": "^2.30.0",
  "@types/node": "^25.2.3",
- "@xubylele/schema-forge-core": "^1.2.0",
+ "@types/pg": "^8.18.0",
+ "@xubylele/schema-forge-core": "^1.3.0",
+ testcontainers: "^11.8.1",
  "ts-node": "^10.9.2",
  tsup: "^8.5.1",
  typescript: "^5.9.3",
@@ -2491,6 +2922,14 @@ async function parseMigrationSql2(sql) {
  const core = await loadCore();
  return core.parseMigrationSql(sql);
  }
+ async function introspectPostgresSchema2(options) {
+ const core = await loadCore();
+ return core.introspectPostgresSchema(options);
+ }
+ async function analyzeSchemaDrift2(state, liveSchema) {
+ const core = await loadCore();
+ return core.analyzeSchemaDrift(state, liveSchema);
+ }
  async function applySqlOps2(ops) {
  const core = await loadCore();
  return core.applySqlOps(ops);
@@ -2512,6 +2951,40 @@ async function isSchemaValidationError(error2) {
2512
2951
  return error2 instanceof core.SchemaValidationError;
2513
2952
  }
2514
2953
 
2954
+ // src/core/postgres.ts
2955
+ var import_pg = require("pg");
2956
+ function resolvePostgresConnectionString(options = {}) {
2957
+ const explicitUrl = options.url?.trim();
2958
+ if (explicitUrl) {
2959
+ return explicitUrl;
2960
+ }
2961
+ const envUrl = process.env.DATABASE_URL?.trim();
2962
+ if (envUrl) {
2963
+ return envUrl;
2964
+ }
2965
+ throw new Error("PostgreSQL connection URL is required. Pass --url or set DATABASE_URL.");
2966
+ }
2967
+ function parseSchemaList(value) {
2968
+ if (!value) {
2969
+ return void 0;
2970
+ }
2971
+ const schemas = value.split(",").map((item) => item.trim()).filter(Boolean);
2972
+ return schemas.length > 0 ? schemas : void 0;
2973
+ }
2974
+ async function withPostgresQueryExecutor(connectionString, run) {
2975
+ const client = new import_pg.Client({ connectionString });
2976
+ await client.connect();
2977
+ const query = async (sql, params) => {
2978
+ const result = await client.query(sql, params ? [...params] : void 0);
2979
+ return result.rows;
2980
+ };
2981
+ try {
2982
+ return await run(query);
2983
+ } finally {
2984
+ await client.end();
2985
+ }
2986
+ }
2987
+
2515
2988
  // src/utils/exitCodes.ts
2516
2989
  var EXIT_CODES = {
2517
2990
  /** Successful operation */
@@ -2646,7 +3119,6 @@ function hasDestructiveFindings(findings) {
  }
 
  // src/commands/diff.ts
- var REQUIRED_CONFIG_FIELDS = ["schemaFile", "stateFile"];
  function resolveConfigPath(root, targetPath) {
  return import_path7.default.isAbsolute(targetPath) ? targetPath : import_path7.default.join(root, targetPath);
  }
@@ -2660,14 +3132,16 @@ async function runDiff(options = {}) {
  throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
  }
  const config = await readJsonFile(configPath, {});
- for (const field of REQUIRED_CONFIG_FIELDS) {
+ const useLiveDatabase = Boolean(options.url || process.env.DATABASE_URL);
+ const requiredFields = useLiveDatabase ? ["schemaFile"] : ["schemaFile", "stateFile"];
+ for (const field of requiredFields) {
  const value = config[field];
  if (!value || typeof value !== "string") {
  throw new Error(`Invalid config: '${field}' is required`);
  }
  }
  const schemaPath = resolveConfigPath(root, config.schemaFile);
- const statePath = resolveConfigPath(root, config.stateFile);
+ const statePath = config.stateFile ? resolveConfigPath(root, config.stateFile) : null;
  const { provider } = resolveProvider(config.provider);
  const schemaSource = await readTextFile(schemaPath);
  const schema = await parseSchema2(schemaSource);
@@ -2679,7 +3153,17 @@ async function runDiff(options = {}) {
  }
  throw error2;
  }
- const previousState = await loadState2(statePath);
+ const previousState = useLiveDatabase ? await withPostgresQueryExecutor(
+ resolvePostgresConnectionString({ url: options.url }),
+ async (query) => {
+ const schemaFilters = parseSchemaList(options.schema);
+ const liveSchema = await introspectPostgresSchema2({
+ query,
+ ...schemaFilters ? { schemas: schemaFilters } : {}
+ });
+ return schemaToState2(liveSchema);
+ }
+ ) : await loadState2(statePath ?? "");
  const diff = await diffSchemas2(previousState, schema);
  if (options.force) {
  forceWarning("Are you sure to use --force? This option will bypass safety checks for destructive operations.");
@@ -2724,9 +3208,72 @@ Remove --safe flag or modify schema to avoid destructive changes.`
  process.exitCode = EXIT_CODES.SUCCESS;
  }
 
- // src/commands/generate.ts
+ // src/commands/doctor.ts
  var import_commander2 = require("commander");
  var import_path8 = __toESM(require("path"));
+ function resolveConfigPath2(root, targetPath) {
+ return import_path8.default.isAbsolute(targetPath) ? targetPath : import_path8.default.join(root, targetPath);
+ }
+ function hasDrift(report) {
+ return report.missingTables.length > 0 || report.extraTables.length > 0 || report.columnDifferences.length > 0 || report.typeMismatches.length > 0;
+ }
+ function printDriftReport(report) {
+ if (report.missingTables.length > 0) {
+ console.log(`Missing tables in live DB: ${report.missingTables.join(", ")}`);
+ }
+ if (report.extraTables.length > 0) {
+ console.log(`Extra tables in live DB: ${report.extraTables.join(", ")}`);
+ }
+ for (const difference of report.columnDifferences) {
+ if (difference.missingInLive.length > 0) {
+ console.log(`Missing columns in ${difference.tableName}: ${difference.missingInLive.join(", ")}`);
+ }
+ if (difference.extraInLive.length > 0) {
+ console.log(`Extra columns in ${difference.tableName}: ${difference.extraInLive.join(", ")}`);
+ }
+ }
+ for (const mismatch of report.typeMismatches) {
+ console.log(`Type mismatch ${mismatch.tableName}.${mismatch.columnName}: ${mismatch.expectedType} -> ${mismatch.actualType}`);
+ }
+ }
+ async function runDoctor(options = {}) {
+ const root = getProjectRoot();
+ const configPath = getConfigPath(root);
+ if (!await fileExists(configPath)) {
+ throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
+ }
+ const config = await readJsonFile(configPath, {});
+ if (!config.stateFile || typeof config.stateFile !== "string") {
+ throw new Error("Invalid config: 'stateFile' is required");
+ }
+ const statePath = resolveConfigPath2(root, config.stateFile);
+ const previousState = await loadState2(statePath);
+ const schemaFilters = parseSchemaList(options.schema);
+ const liveSchema = await withPostgresQueryExecutor(
+ resolvePostgresConnectionString({ url: options.url }),
+ (query) => introspectPostgresSchema2({
+ query,
+ ...schemaFilters ? { schemas: schemaFilters } : {}
+ })
+ );
+ const driftReport = await analyzeSchemaDrift2(previousState, liveSchema);
+ const detected = hasDrift(driftReport);
+ process.exitCode = detected ? EXIT_CODES.DRIFT_DETECTED : EXIT_CODES.SUCCESS;
+ if (options.json) {
+ console.log(JSON.stringify(driftReport, null, 2));
+ return;
+ }
+ if (!detected) {
+ success("No schema drift detected");
+ return;
+ }
+ console.log("Schema drift detected");
+ printDriftReport(driftReport);
+ }
+
+ // src/commands/generate.ts
+ var import_commander3 = require("commander");
+ var import_path9 = __toESM(require("path"));
 
  // src/core/utils.ts
  function nowTimestamp2() {
@@ -2739,13 +3286,13 @@ function slugifyName2(name) {
  }
 
  // src/commands/generate.ts
- var REQUIRED_CONFIG_FIELDS2 = [
+ var REQUIRED_CONFIG_FIELDS = [
  "schemaFile",
  "stateFile",
  "outputDir"
  ];
- function resolveConfigPath2(root, targetPath) {
- return import_path8.default.isAbsolute(targetPath) ? targetPath : import_path8.default.join(root, targetPath);
+ function resolveConfigPath3(root, targetPath) {
+ return import_path9.default.isAbsolute(targetPath) ? targetPath : import_path9.default.join(root, targetPath);
  }
  async function runGenerate(options) {
  if (options.safe && options.force) {
@@ -2757,15 +3304,15 @@ async function runGenerate(options) {
  throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
  }
  const config = await readJsonFile(configPath, {});
- for (const field of REQUIRED_CONFIG_FIELDS2) {
+ for (const field of REQUIRED_CONFIG_FIELDS) {
  const value = config[field];
  if (!value || typeof value !== "string") {
  throw new Error(`Invalid config: '${field}' is required`);
  }
  }
- const schemaPath = resolveConfigPath2(root, config.schemaFile);
- const statePath = resolveConfigPath2(root, config.stateFile);
- const outputDir = resolveConfigPath2(root, config.outputDir);
+ const schemaPath = resolveConfigPath3(root, config.schemaFile);
+ const statePath = resolveConfigPath3(root, config.stateFile);
+ const outputDir = resolveConfigPath3(root, config.outputDir);
  const { provider, usedDefault } = resolveProvider(config.provider);
  if (usedDefault) {
  info("Provider not set; defaulting to postgres.");
@@ -2826,7 +3373,7 @@ Remove --safe flag or modify schema to avoid destructive changes.`
  const slug = slugifyName2(options.name ?? "migration");
  const fileName = `${timestamp}-${slug}.sql`;
  await ensureDir(outputDir);
- const migrationPath = import_path8.default.join(outputDir, fileName);
+ const migrationPath = import_path9.default.join(outputDir, fileName);
  await writeTextFile(migrationPath, sql + "\n");
  const nextState = await schemaToState2(schema);
  await saveState2(statePath, nextState);
@@ -2835,14 +3382,14 @@ Remove --safe flag or modify schema to avoid destructive changes.`
  }
 
  // src/commands/import.ts
- var import_commander3 = require("commander");
- var import_path9 = __toESM(require("path"));
- function resolveConfigPath3(root, targetPath) {
- return import_path9.default.isAbsolute(targetPath) ? targetPath : import_path9.default.join(root, targetPath);
+ var import_commander4 = require("commander");
+ var import_path10 = __toESM(require("path"));
+ function resolveConfigPath4(root, targetPath) {
+ return import_path10.default.isAbsolute(targetPath) ? targetPath : import_path10.default.join(root, targetPath);
  }
  async function runImport(inputPath, options = {}) {
  const root = getProjectRoot();
- const absoluteInputPath = resolveConfigPath3(root, inputPath);
+ const absoluteInputPath = resolveConfigPath4(root, inputPath);
  const inputs = await loadMigrationSqlInput2(absoluteInputPath);
  if (inputs.length === 0) {
  throw new Error(`No .sql migration files found in: ${absoluteInputPath}`);
@@ -2853,7 +3400,7 @@ async function runImport(inputPath, options = {}) {
  const result = await parseMigrationSql2(input.sql);
  allOps.push(...result.ops);
  parseWarnings.push(...result.warnings.map((item) => ({
- statement: `[${import_path9.default.basename(input.filePath)}] ${item.statement}`,
+ statement: `[${import_path10.default.basename(input.filePath)}] ${item.statement}`,
  reason: item.reason
  })));
  }
@@ -2869,7 +3416,7 @@ async function runImport(inputPath, options = {}) {
  }
  }
  }
- const schemaPath = targetPath ? resolveConfigPath3(root, targetPath) : getSchemaFilePath(root);
+ const schemaPath = targetPath ? resolveConfigPath4(root, targetPath) : getSchemaFilePath(root);
  await writeTextFile(schemaPath, dsl);
  success(`Imported ${inputs.length} migration file(s) into ${schemaPath}`);
  info(`Parsed ${allOps.length} supported DDL operation(s)`);
@@ -2887,7 +3434,7 @@ async function runImport(inputPath, options = {}) {
  }
 
  // src/commands/init.ts
- var import_commander4 = require("commander");
+ var import_commander5 = require("commander");
  async function runInit() {
  const root = getProjectRoot();
  const schemaForgeDir = getSchemaForgeDir(root);
@@ -2946,28 +3493,61 @@ table users {
  process.exitCode = EXIT_CODES.SUCCESS;
  }
 
+ // src/commands/introspect.ts
+ var import_commander6 = require("commander");
+ var import_path11 = __toESM(require("path"));
+ function resolveOutputPath(root, outputPath) {
+ return import_path11.default.isAbsolute(outputPath) ? outputPath : import_path11.default.join(root, outputPath);
+ }
+ async function runIntrospect(options = {}) {
+ const connectionString = resolvePostgresConnectionString({ url: options.url });
+ const schemas = parseSchemaList(options.schema);
+ const root = getProjectRoot();
+ const schema = await withPostgresQueryExecutor(connectionString, (query) => introspectPostgresSchema2({
+ query,
+ ...schemas ? { schemas } : {}
+ }));
+ const output = JSON.stringify(schema, null, 2);
+ if (!options.json && !options.out) {
+ info(`Introspected ${Object.keys(schema.tables).length} table(s) from PostgreSQL.`);
+ }
+ if (options.out) {
+ const outputPath = resolveOutputPath(root, options.out);
+ await writeTextFile(outputPath, `${output}
+ `);
+ success(`Live schema written to ${outputPath}`);
+ }
+ if (options.json || !options.out) {
+ console.log(output);
+ }
+ }
+
  // src/commands/validate.ts
- var import_commander5 = require("commander");
- var import_path10 = __toESM(require("path"));
- var REQUIRED_CONFIG_FIELDS3 = ["schemaFile", "stateFile"];
- function resolveConfigPath4(root, targetPath) {
- return import_path10.default.isAbsolute(targetPath) ? targetPath : import_path10.default.join(root, targetPath);
+ var import_commander7 = require("commander");
+ var import_path12 = __toESM(require("path"));
+ function resolveConfigPath5(root, targetPath) {
+ return import_path12.default.isAbsolute(targetPath) ? targetPath : import_path12.default.join(root, targetPath);
  }
  async function runValidate(options = {}) {
  const root = getProjectRoot();
  const configPath = getConfigPath(root);
+ const useLiveDatabase = Boolean(options.url || process.env.DATABASE_URL);
  if (!await fileExists(configPath)) {
  throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
  }
  const config = await readJsonFile(configPath, {});
- for (const field of REQUIRED_CONFIG_FIELDS3) {
+ const requiredFields = ["schemaFile", "stateFile"];
+ for (const field of requiredFields) {
  const value = config[field];
  if (!value || typeof value !== "string") {
  throw new Error(`Invalid config: '${field}' is required`);
  }
  }
- const schemaPath = resolveConfigPath4(root, config.schemaFile);
- const statePath = resolveConfigPath4(root, config.stateFile);
+ const schemaPath = resolveConfigPath5(root, config.schemaFile);
+ if (!config.stateFile) {
+ throw new Error("Invalid config: 'stateFile' is required");
+ }
+ const statePath = resolveConfigPath5(root, config.stateFile);
  const schemaSource = await readTextFile(schemaPath);
  const schema = await parseSchema2(schemaSource);
  try {
@@ -2979,6 +3559,45 @@ async function runValidate(options = {}) {
  throw error2;
  }
  const previousState = await loadState2(statePath);
+ if (useLiveDatabase) {
+ const schemaFilters = parseSchemaList(options.schema);
+ const liveSchema = await withPostgresQueryExecutor(
+ resolvePostgresConnectionString({ url: options.url }),
+ (query) => introspectPostgresSchema2({
+ query,
+ ...schemaFilters ? { schemas: schemaFilters } : {}
+ })
+ );
+ const driftReport = await analyzeSchemaDrift2(previousState, liveSchema);
+ const hasDrift2 = driftReport.missingTables.length > 0 || driftReport.extraTables.length > 0 || driftReport.columnDifferences.length > 0 || driftReport.typeMismatches.length > 0;
+ process.exitCode = hasDrift2 ? EXIT_CODES.DRIFT_DETECTED : EXIT_CODES.SUCCESS;
+ if (options.json) {
+ console.log(JSON.stringify(driftReport, null, 2));
+ return;
+ }
+ if (!hasDrift2) {
+ success("No schema drift detected");
+ return;
+ }
+ if (driftReport.missingTables.length > 0) {
+ console.log(`Missing tables in live DB: ${driftReport.missingTables.join(", ")}`);
+ }
+ if (driftReport.extraTables.length > 0) {
+ console.log(`Extra tables in live DB: ${driftReport.extraTables.join(", ")}`);
+ }
+ for (const difference of driftReport.columnDifferences) {
+ if (difference.missingInLive.length > 0) {
+ console.log(`Missing columns in ${difference.tableName}: ${difference.missingInLive.join(", ")}`);
+ }
+ if (difference.extraInLive.length > 0) {
+ console.log(`Extra columns in ${difference.tableName}: ${difference.extraInLive.join(", ")}`);
+ }
+ }
+ for (const mismatch of driftReport.typeMismatches) {
+ console.log(`Type mismatch ${mismatch.tableName}.${mismatch.columnName}: ${mismatch.expectedType} -> ${mismatch.actualType}`);
+ }
+ return;
+ }
  const findings = await validateSchemaChanges2(previousState, schema);
  const report = await toValidationReport2(findings);
  if (isCI() && hasDestructiveFindings(findings)) {
@@ -3052,7 +3671,7 @@ async function seedLastSeenVersion(version) {
  }
 
  // src/cli.ts
- var program = new import_commander6.Command();
+ var program = new import_commander8.Command();
  program.name("schema-forge").description("CLI tool for schema management and SQL generation").version(package_default.version).option("--safe", "Prevent execution of destructive operations").option("--force", "Force execution by bypassing safety checks and CI detection");
  function validateFlagExclusivity(options) {
  if (options.safe && options.force) {
@@ -3088,11 +3707,25 @@ program.command("generate").description("Generate SQL from schema files. In CI e
  await handleError(error2);
  }
  });
- program.command("diff").description("Compare two schema versions and generate migration SQL. In CI environments (CI=true), exits with code 3 if destructive operations are detected unless --force is used.").action(async () => {
+ program.command("diff").description("Compare two schema versions and generate migration SQL. In CI environments (CI=true), exits with code 3 if destructive operations are detected unless --force is used.").option("--url <string>", "PostgreSQL connection URL for live diff (defaults to DATABASE_URL)").option("--schema <list>", "Comma-separated schema names to introspect (default: public)").action(async (options) => {
  try {
  const globalOptions = program.opts();
  validateFlagExclusivity(globalOptions);
- await runDiff(globalOptions);
+ await runDiff({ ...options, ...globalOptions });
+ } catch (error2) {
+ await handleError(error2);
+ }
+ });
+ program.command("doctor").description("Check live database drift against state. Exits with code 2 when drift is detected.").option("--json", "Output structured JSON").option("--url <string>", "PostgreSQL connection URL (defaults to DATABASE_URL)").option("--schema <list>", "Comma-separated schema names to introspect (default: public)").action(async (options) => {
+ try {
+ await runDoctor(options);
+ } catch (error2) {
+ await handleError(error2);
+ }
+ });
+ program.command("introspect").description("Extract normalized live schema from PostgreSQL").option("--url <string>", "PostgreSQL connection URL (defaults to DATABASE_URL)").option("--schema <list>", "Comma-separated schema names (default: public)").option("--json", "Output normalized schema JSON to stdout").option("--out <path>", "Write normalized schema JSON to a file").action(async (options) => {
+ try {
+ await runIntrospect(options);
  } catch (error2) {
  await handleError(error2);
  }
@@ -3104,7 +3737,7 @@ program.command("import").description("Import schema from SQL migrations").argum
  await handleError(error2);
  }
  });
- program.command("validate").description("Detect destructive or risky schema changes. In CI environments (CI=true), exits with code 3 if destructive operations are detected.").option("--json", "Output structured JSON").action(async (options) => {
+ program.command("validate").description("Detect destructive or risky schema changes. In CI environments (CI=true), exits with code 3 if destructive operations are detected.").option("--json", "Output structured JSON").option("--url <string>", "PostgreSQL connection URL for live drift validation (defaults to DATABASE_URL)").option("--schema <list>", "Comma-separated schema names to introspect (default: public)").action(async (options) => {
  try {
  await runValidate(options);
  } catch (error2) {
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@xubylele/schema-forge",
- "version": "1.6.0",
+ "version": "1.7.0",
  "description": "Universal migration generator from schema DSL",
  "main": "dist/cli.js",
  "type": "commonjs",
@@ -11,6 +11,7 @@
  "build": "tsup src/cli.ts --format cjs --dts",
  "dev": "ts-node src/cli.ts",
  "test": "vitest",
+ "test:integration:drift": "vitest run test/drift-realdb.integration.test.ts",
  "prepublishOnly": "npm run build",
  "publish:public": "npm publish --access public",
  "changeset": "changeset",
@@ -42,12 +43,15 @@
  "dependencies": {
  "boxen": "^8.0.1",
  "chalk": "^5.6.2",
- "commander": "^14.0.3"
+ "commander": "^14.0.3",
+ "pg": "^8.19.0"
  },
  "devDependencies": {
- "@changesets/cli": "^2.29.8",
+ "@changesets/cli": "^2.30.0",
  "@types/node": "^25.2.3",
- "@xubylele/schema-forge-core": "^1.2.0",
+ "@types/pg": "^8.18.0",
+ "@xubylele/schema-forge-core": "^1.3.0",
+ "testcontainers": "^11.8.1",
  "ts-node": "^10.9.2",
  "tsup": "^8.5.1",
  "typescript": "^5.9.3",