@xubylele/schema-forge 1.6.1 → 1.7.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +108 -2
  2. package/dist/cli.js +685 -53
  3. package/package.json +8 -4
package/README.md CHANGED
@@ -13,6 +13,7 @@ A modern CLI tool for database schema management with a clean DSL and automatic
  - **Default Change Detection** - Detects added/removed/modified column defaults and generates ALTER COLUMN SET/DROP DEFAULT
  - **Postgres/Supabase** - Currently supports PostgreSQL and Supabase
  - **Constraint Diffing** - Detects UNIQUE and PRIMARY KEY changes with deterministic constraint names
+ - **Live PostgreSQL Introspection** - Extract normalized schema directly from `information_schema`

  ## Installation

@@ -56,6 +57,17 @@ Run tests:
  npm test
  ```

+ Run real-db drift integration tests:
+
+ ```bash
+ npm run test:integration:drift
+ ```
+
+ Notes:
+
+ - Local explicit run: set `SF_RUN_REAL_DB_TESTS=true` (uses Testcontainers `postgres:16-alpine`, Docker required).
+ - CI/service mode: set `SF_USE_CI_POSTGRES=true` and `DATABASE_URL` to reuse an existing Postgres service.
+
  ## Getting Started

  Here's a quick walkthrough to get started with SchemaForge:
@@ -205,9 +217,13 @@ schema-forge diff [--safe] [--force]

  - `--safe` - Block execution if destructive operations are detected (exits with error)
  - `--force` - Bypass safety checks and proceed with displaying destructive SQL (shows warning)
+ - `--url` - PostgreSQL connection URL for live database diff (fallback: `DATABASE_URL`)
+ - `--schema` - Comma-separated schema names to introspect (default: `public`)

  Shows what SQL would be generated if you ran `generate`. Useful for previewing changes. Safety behavior is the same as `generate` command. In CI environments, exits with code 3 if destructive operations are detected unless `--force` is used. See [CI Behavior](#ci-behavior) for more details.

+ When `--url` (or `DATABASE_URL`) is provided, `diff` compares your target DSL schema against the live PostgreSQL schema introspected from `information_schema`.
+
  ### `schema-forge import`

  Reconstruct `schemaforge/schema.sf` from existing PostgreSQL/Supabase SQL migrations.
@@ -234,6 +250,36 @@ Detect destructive or risky schema changes before generating/applying migrations
  schema-forge validate
  ```

+ Live drift validation:
+
+ ```bash
+ schema-forge validate --url "$DATABASE_URL" --json
+ ```
+
+ Live `--json` output returns a structured `DriftReport`:
+
+ ```json
+ {
+ "missingTables": ["users_archive"],
+ "extraTables": ["audit_log"],
+ "columnDifferences": [
+ {
+ "tableName": "users",
+ "missingInLive": ["nickname"],
+ "extraInLive": ["last_login"]
+ }
+ ],
+ "typeMismatches": [
+ {
+ "tableName": "users",
+ "columnName": "email",
+ "expectedType": "varchar",
+ "actualType": "text"
+ }
+ ]
+ }
+ ```
+
  Validation checks include:

  - Dropped tables (`DROP_TABLE`, error)
@@ -247,6 +293,13 @@ Use JSON mode for CI and automation:
  schema-forge validate --json
  ```

+ Live mode options:
+
+ - `--url` - PostgreSQL connection URL for live drift validation (fallback: `DATABASE_URL`)
+ - `--schema` - Comma-separated schema names to introspect (default: `public`)
+
+ In live mode, exit code `2` is used when drift is detected between `state.json` and the live database.
+
  Exit codes (also see [CI Behavior](#ci-behavior)):

  - `3` in CI environment if destructive findings detected
@@ -258,6 +311,46 @@ Exit codes:
  - `1` when one or more `error` findings are detected
  - `0` when no `error` findings are detected (warnings alone do not fail)

+ ### `schema-forge doctor`
+
+ Check live database drift against your tracked `state.json`.
+
+ ```bash
+ schema-forge doctor --url "$DATABASE_URL"
+ ```
+
+ Use JSON mode for CI and automation:
+
+ ```bash
+ schema-forge doctor --url "$DATABASE_URL" --json
+ ```
+
+ Options:
+
+ - `--url` - PostgreSQL connection URL (fallback: `DATABASE_URL`)
+ - `--schema` - Comma-separated schema names to introspect (default: `public`)
+ - `--json` - Output structured drift report JSON
+
+ Exit codes:
+
+ - `0` when no drift is detected (healthy)
+ - `2` when drift is detected between `state.json` and live database schema
+
+ ### `schema-forge introspect`
+
+ Extract normalized schema directly from a live PostgreSQL database.
+
+ ```bash
+ schema-forge introspect --url "$DATABASE_URL" --json
+ ```
+
+ **Options:**
+
+ - `--url` - PostgreSQL connection URL (fallback: `DATABASE_URL`)
+ - `--schema` - Comma-separated schema names to introspect (default: `public`)
+ - `--json` - Output normalized schema as JSON
+ - `--out <path>` - Write normalized schema JSON to a file
+
  ## CI Behavior

  SchemaForge ensures deterministic behavior in Continuous Integration (CI) environments to prevent accidental destructive operations.
@@ -276,8 +369,8 @@ SchemaForge uses specific exit codes for different scenarios:
  | Exit Code | Meaning |
  | --------- | ------- |
  | `0` | Success - no changes or no destructive operations detected |
- | `1` | General error - validation failed, operation declined, missing files, etc. |
- | `2` | Schema validation error - invalid DSL syntax or structure |
+ | `1` | Validation/general error - invalid DSL, operation declined, missing files, etc. |
+ | `2` | Drift detected between expected state and live database schema |
  | `3` | **CI Destructive** - destructive operations detected in CI environment without `--force` |

  ### Destructive Operations in CI
@@ -312,6 +405,19 @@ When `CI=true`, SchemaForge will:
  - ✅ Allow explicit override with `--force` flag
  - ❌ Not accept user input for confirmation

+ ### Drift Integration Tests in CI
+
+ For drift reliability checks against a real database, run:
+
+ ```bash
+ npm run test:integration:drift
+ ```
+
+ The integration harness supports two deterministic paths:
+
+ - `SF_USE_CI_POSTGRES=true` + `DATABASE_URL`: uses the CI Postgres service directly.
+ - No CI Postgres env: spins up an isolated Testcontainers Postgres instance.
+
  ### Using `--safe` in CI

  The `--safe` flag is compatible with CI and blocks execution of destructive operations:
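The `DriftReport` shape and the `0`/`2` exit-code contract documented in the README changes above can be combined programmatically. A minimal sketch — `driftExitCode` is a hypothetical helper for illustration, not part of the published API:

```javascript
// Map a DriftReport (shape shown in the README's validate/doctor docs)
// to the documented exit codes: 0 = healthy, 2 = drift detected.
// Hypothetical helper for illustration only.
function driftExitCode(report) {
  const hasDrift =
    report.missingTables.length > 0 ||
    report.extraTables.length > 0 ||
    report.columnDifferences.length > 0 ||
    report.typeMismatches.length > 0;
  return hasDrift ? 2 : 0;
}

const healthy = { missingTables: [], extraTables: [], columnDifferences: [], typeMismatches: [] };
const drifted = { ...healthy, extraTables: ["audit_log"] };

console.log(driftExitCode(healthy)); // 0
console.log(driftExitCode(drifted)); // 2
```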
package/dist/cli.js CHANGED
@@ -46,11 +46,11 @@ function parseSchema(source) {
  "date"
  ]);
  const validIdentifierPattern = /^[A-Za-z_][A-Za-z0-9_]*$/;
- function normalizeColumnType4(type) {
+ function normalizeColumnType6(type) {
  return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
  }
  function isValidColumnType2(type) {
- const normalizedType = normalizeColumnType4(type);
+ const normalizedType = normalizeColumnType6(type);
  if (validBaseColumnTypes.has(normalizedType)) {
  return true;
  }
@@ -174,7 +174,7 @@ function parseSchema(source) {
  throw new Error(`Line ${lineNum}: Invalid column definition. Expected: <name> <type> [modifiers...]`);
  }
  const colName = tokens[0];
- const colType = normalizeColumnType4(tokens[1]);
+ const colType = normalizeColumnType6(tokens[1]);
  validateIdentifier(colName, lineNum, "column");
  if (!isValidColumnType2(colType)) {
  throw new Error(`Line ${lineNum}: Invalid column type '${tokens[1]}'. Valid types: ${Array.from(validBaseColumnTypes).join(", ")}, varchar(n), numeric(p,s)`);
@@ -691,6 +691,75 @@ var init_diff = __esm({
  }
  });

+ // node_modules/@xubylele/schema-forge-core/dist/core/drift-analyzer.js
+ function normalizeColumnType2(type) {
+ return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
+ }
+ function getSortedNames2(values) {
+ return Array.from(values).sort((left, right) => left.localeCompare(right));
+ }
+ function analyzeSchemaDrift(state, liveSchema) {
+ const stateTableNames = getSortedNames2(Object.keys(state.tables));
+ const liveTableNames = getSortedNames2(Object.keys(liveSchema.tables));
+ const liveTableNameSet = new Set(liveTableNames);
+ const stateTableNameSet = new Set(stateTableNames);
+ const missingTables = stateTableNames.filter((tableName) => !liveTableNameSet.has(tableName));
+ const extraTables = liveTableNames.filter((tableName) => !stateTableNameSet.has(tableName));
+ const commonTableNames = liveTableNames.filter((tableName) => stateTableNameSet.has(tableName));
+ const columnDifferences = [];
+ const typeMismatches = [];
+ for (const tableName of commonTableNames) {
+ const stateTable = state.tables[tableName];
+ const liveTable = liveSchema.tables[tableName];
+ if (!stateTable || !liveTable) {
+ continue;
+ }
+ const stateColumnNames = getSortedNames2(Object.keys(stateTable.columns));
+ const liveColumnsByName = new Map(liveTable.columns.map((column) => [column.name, column]));
+ const liveColumnNames = getSortedNames2(liveColumnsByName.keys());
+ const stateColumnNameSet = new Set(stateColumnNames);
+ const liveColumnNameSet = new Set(liveColumnNames);
+ const missingInLive = stateColumnNames.filter((columnName) => !liveColumnNameSet.has(columnName));
+ const extraInLive = liveColumnNames.filter((columnName) => !stateColumnNameSet.has(columnName));
+ if (missingInLive.length > 0 || extraInLive.length > 0) {
+ columnDifferences.push({
+ tableName,
+ missingInLive,
+ extraInLive
+ });
+ }
+ const commonColumns = stateColumnNames.filter((columnName) => liveColumnNameSet.has(columnName));
+ for (const columnName of commonColumns) {
+ const stateColumn = stateTable.columns[columnName];
+ const liveColumn = liveColumnsByName.get(columnName);
+ if (!stateColumn || !liveColumn) {
+ continue;
+ }
+ const expectedType = stateColumn.type;
+ const actualType = liveColumn.type;
+ if (normalizeColumnType2(expectedType) !== normalizeColumnType2(actualType)) {
+ typeMismatches.push({
+ tableName,
+ columnName,
+ expectedType,
+ actualType
+ });
+ }
+ }
+ }
+ return {
+ missingTables,
+ extraTables,
+ columnDifferences,
+ typeMismatches
+ };
+ }
+ var init_drift_analyzer = __esm({
+ "node_modules/@xubylele/schema-forge-core/dist/core/drift-analyzer.js"() {
+ "use strict";
+ }
+ });
+
  // node_modules/@xubylele/schema-forge-core/dist/core/validator.js
  function isValidColumnType(type) {
  const normalizedType = type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
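Stripped of column and type handling, the new `analyzeSchemaDrift` above reduces to sorted name-set comparisons. A condensed, standalone sketch of its table-level half (simplified from the bundled code, not the full implementation):

```javascript
// Table-level drift: both sides expose { tables: { name: definition } }.
// Names are sorted so the resulting report is deterministic.
function diffTableNames(state, liveSchema) {
  const stateNames = Object.keys(state.tables).sort((a, b) => a.localeCompare(b));
  const liveNames = Object.keys(liveSchema.tables).sort((a, b) => a.localeCompare(b));
  const liveSet = new Set(liveNames);
  const stateSet = new Set(stateNames);
  return {
    missingTables: stateNames.filter((name) => !liveSet.has(name)),
    extraTables: liveNames.filter((name) => !stateSet.has(name))
  };
}

const report = diffTableNames(
  { tables: { users: {}, users_archive: {} } },
  { tables: { users: {}, audit_log: {} } }
);
console.log(report); // { missingTables: [ 'users_archive' ], extraTables: [ 'audit_log' ] }
```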
@@ -778,15 +847,15 @@ var init_validator = __esm({
  });

  // node_modules/@xubylele/schema-forge-core/dist/core/safety/operation-classifier.js
- function normalizeColumnType2(type) {
+ function normalizeColumnType3(type) {
  return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
  }
  function parseVarcharLength(type) {
- const match = normalizeColumnType2(type).match(/^varchar\((\d+)\)$/);
+ const match = normalizeColumnType3(type).match(/^varchar\((\d+)\)$/);
  return match ? Number(match[1]) : null;
  }
  function parseNumericType(type) {
- const match = normalizeColumnType2(type).match(/^numeric\((\d+),(\d+)\)$/);
+ const match = normalizeColumnType3(type).match(/^numeric\((\d+),(\d+)\)$/);
  if (!match) {
  return null;
  }
@@ -796,8 +865,8 @@ function parseNumericType(type) {
  };
  }
  function classifyTypeChange(from, to) {
- const fromType = normalizeColumnType2(from);
- const toType = normalizeColumnType2(to);
+ const fromType = normalizeColumnType3(from);
+ const toType = normalizeColumnType3(to);
  const uuidInvolved = fromType === "uuid" || toType === "uuid";
  if (uuidInvolved && fromType !== toType) {
  return "DESTRUCTIVE";
@@ -869,15 +938,15 @@ var init_operation_classifier = __esm({
  });

  // node_modules/@xubylele/schema-forge-core/dist/core/safety/safety-checker.js
- function normalizeColumnType3(type) {
+ function normalizeColumnType4(type) {
  return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
  }
  function parseVarcharLength2(type) {
- const match = normalizeColumnType3(type).match(/^varchar\((\d+)\)$/);
+ const match = normalizeColumnType4(type).match(/^varchar\((\d+)\)$/);
  return match ? Number(match[1]) : null;
  }
  function parseNumericType2(type) {
- const match = normalizeColumnType3(type).match(/^numeric\((\d+),(\d+)\)$/);
+ const match = normalizeColumnType4(type).match(/^numeric\((\d+),(\d+)\)$/);
  if (!match) {
  return null;
  }
@@ -887,8 +956,8 @@ function parseNumericType2(type) {
  };
  }
  function generateTypeChangeMessage(from, to) {
- const fromType = normalizeColumnType3(from);
- const toType = normalizeColumnType3(to);
+ const fromType = normalizeColumnType4(from);
+ const toType = normalizeColumnType4(to);
  const uuidInvolved = fromType === "uuid" || toType === "uuid";
  if (uuidInvolved && fromType !== toType) {
  return `Type changed from ${fromType} to ${toType} (likely incompatible cast)`;
@@ -953,8 +1022,8 @@ function checkOperationSafety(operation) {
  code: "ALTER_COLUMN_TYPE",
  table: operation.tableName,
  column: operation.columnName,
- from: normalizeColumnType3(operation.fromType),
- to: normalizeColumnType3(operation.toType),
+ from: normalizeColumnType4(operation.fromType),
+ to: normalizeColumnType4(operation.toType),
  message: generateTypeChangeMessage(operation.fromType, operation.toType),
  operationKind: operation.kind
  };
@@ -2157,6 +2226,359 @@ var init_load_migrations = __esm({
  }
  });

+ // node_modules/@xubylele/schema-forge-core/dist/core/sql/introspect-postgres.js
+ function toTableKey(schema, table) {
+ if (schema === DEFAULT_SCHEMA) {
+ return table;
+ }
+ return `${schema}.${table}`;
+ }
+ function normalizeSchemas(schemas) {
+ const values = schemas ?? [DEFAULT_SCHEMA];
+ const deduped = /* @__PURE__ */ new Set();
+ for (const schema of values) {
+ const trimmed = schema.trim();
+ if (trimmed.length > 0) {
+ deduped.add(trimmed);
+ }
+ }
+ if (deduped.size === 0) {
+ deduped.add(DEFAULT_SCHEMA);
+ }
+ return Array.from(deduped).sort((a, b) => a.localeCompare(b));
+ }
+ function normalizeColumnType5(row) {
+ const dataType = row.data_type.toLowerCase();
+ const udtName = row.udt_name.toLowerCase();
+ if (dataType === "character varying") {
+ if (row.character_maximum_length !== null) {
+ return `varchar(${row.character_maximum_length})`;
+ }
+ return "varchar";
+ }
+ if (dataType === "timestamp with time zone") {
+ return "timestamptz";
+ }
+ if (dataType === "integer" || udtName === "int4") {
+ return "int";
+ }
+ if (dataType === "bigint" || udtName === "int8") {
+ return "bigint";
+ }
+ if (dataType === "numeric") {
+ if (row.numeric_precision !== null && row.numeric_scale !== null) {
+ return `numeric(${row.numeric_precision},${row.numeric_scale})`;
+ }
+ return "numeric";
+ }
+ if (dataType === "boolean" || udtName === "bool") {
+ return "boolean";
+ }
+ if (dataType === "uuid") {
+ return "uuid";
+ }
+ if (dataType === "text") {
+ return "text";
+ }
+ if (dataType === "date") {
+ return "date";
+ }
+ return dataType;
+ }
+ function normalizeConstraints(constraintRows, foreignKeyRows) {
+ const constraints = /* @__PURE__ */ new Map();
+ for (const row of constraintRows) {
+ const key = `${row.table_schema}.${row.table_name}.${row.constraint_name}.${row.constraint_type}`;
+ const existing = constraints.get(key);
+ if (existing) {
+ if (row.column_name !== null) {
+ existing.columns.push({
+ name: row.column_name,
+ position: row.ordinal_position ?? Number.MAX_SAFE_INTEGER
+ });
+ }
+ if (!existing.checkClause && row.check_clause) {
+ existing.checkClause = row.check_clause;
+ }
+ continue;
+ }
+ constraints.set(key, {
+ tableSchema: row.table_schema,
+ tableName: row.table_name,
+ name: row.constraint_name,
+ type: row.constraint_type,
+ columns: row.column_name === null ? [] : [{
+ name: row.column_name,
+ position: row.ordinal_position ?? Number.MAX_SAFE_INTEGER
+ }],
+ ...row.check_clause ? { checkClause: row.check_clause } : {}
+ });
+ }
+ const normalized = [];
+ for (const value of constraints.values()) {
+ value.columns.sort((left, right) => {
+ if (left.position !== right.position) {
+ return left.position - right.position;
+ }
+ return left.name.localeCompare(right.name);
+ });
+ normalized.push({
+ tableSchema: value.tableSchema,
+ tableName: value.tableName,
+ name: value.name,
+ type: value.type,
+ columns: value.columns.map((item) => item.name),
+ ...value.checkClause ? { checkClause: value.checkClause } : {}
+ });
+ }
+ const foreignKeys = /* @__PURE__ */ new Map();
+ for (const row of foreignKeyRows) {
+ const key = `${row.table_schema}.${row.table_name}.${row.constraint_name}`;
+ const existing = foreignKeys.get(key);
+ if (existing) {
+ existing.local.push({ name: row.local_column_name, position: row.position });
+ existing.referenced.push({ name: row.referenced_column_name, position: row.position });
+ continue;
+ }
+ foreignKeys.set(key, {
+ tableSchema: row.table_schema,
+ tableName: row.table_name,
+ name: row.constraint_name,
+ local: [{ name: row.local_column_name, position: row.position }],
+ referencedTableSchema: row.referenced_table_schema,
+ referencedTableName: row.referenced_table_name,
+ referenced: [{ name: row.referenced_column_name, position: row.position }]
+ });
+ }
+ for (const value of foreignKeys.values()) {
+ value.local.sort((left, right) => left.position - right.position || left.name.localeCompare(right.name));
+ value.referenced.sort((left, right) => left.position - right.position || left.name.localeCompare(right.name));
+ normalized.push({
+ tableSchema: value.tableSchema,
+ tableName: value.tableName,
+ name: value.name,
+ type: "FOREIGN KEY",
+ columns: value.local.map((item) => item.name),
+ referencedTableSchema: value.referencedTableSchema,
+ referencedTableName: value.referencedTableName,
+ referencedColumns: value.referenced.map((item) => item.name)
+ });
+ }
+ normalized.sort((left, right) => {
+ if (left.tableSchema !== right.tableSchema) {
+ return left.tableSchema.localeCompare(right.tableSchema);
+ }
+ if (left.tableName !== right.tableName) {
+ return left.tableName.localeCompare(right.tableName);
+ }
+ const typeOrderDiff = CONSTRAINT_TYPE_ORDER[left.type] - CONSTRAINT_TYPE_ORDER[right.type];
+ if (typeOrderDiff !== 0) {
+ return typeOrderDiff;
+ }
+ if (left.name !== right.name) {
+ return left.name.localeCompare(right.name);
+ }
+ return left.columns.join(",").localeCompare(right.columns.join(","));
+ });
+ return normalized;
+ }
+ async function introspectPostgresSchema(options) {
+ const schemaFilter = normalizeSchemas(options.schemas);
+ const [tableRows, columnRows, constraintRows, foreignKeyRows] = await Promise.all([
+ options.query(TABLES_QUERY, [schemaFilter]),
+ options.query(COLUMNS_QUERY, [schemaFilter]),
+ options.query(CONSTRAINTS_QUERY, [schemaFilter]),
+ options.query(FOREIGN_KEYS_QUERY, [schemaFilter])
+ ]);
+ const sortedTables = [...tableRows].sort((left, right) => {
+ if (left.table_schema !== right.table_schema) {
+ return left.table_schema.localeCompare(right.table_schema);
+ }
+ return left.table_name.localeCompare(right.table_name);
+ });
+ const sortedColumns = [...columnRows].sort((left, right) => {
+ if (left.table_schema !== right.table_schema) {
+ return left.table_schema.localeCompare(right.table_schema);
+ }
+ if (left.table_name !== right.table_name) {
+ return left.table_name.localeCompare(right.table_name);
+ }
+ if (left.ordinal_position !== right.ordinal_position) {
+ return left.ordinal_position - right.ordinal_position;
+ }
+ return left.column_name.localeCompare(right.column_name);
+ });
+ const normalizedConstraints = normalizeConstraints(constraintRows, foreignKeyRows);
+ const tableMap = /* @__PURE__ */ new Map();
+ const columnMap = /* @__PURE__ */ new Map();
+ for (const row of sortedTables) {
+ const key = toTableKey(row.table_schema, row.table_name);
+ const table = { name: key, columns: [], primaryKey: null };
+ tableMap.set(key, table);
+ columnMap.set(key, /* @__PURE__ */ new Map());
+ }
+ for (const row of sortedColumns) {
+ const key = toTableKey(row.table_schema, row.table_name);
+ const table = tableMap.get(key);
+ const columnsByName = columnMap.get(key);
+ if (!table || !columnsByName) {
+ continue;
+ }
+ const column = {
+ name: row.column_name,
+ type: normalizeColumnType5(row),
+ nullable: row.is_nullable === "YES"
+ };
+ const normalizedDefault = normalizeDefault(row.column_default);
+ if (normalizedDefault !== null) {
+ column.default = normalizedDefault;
+ }
+ table.columns.push(column);
+ columnsByName.set(column.name, column);
+ }
+ for (const constraint of normalizedConstraints) {
+ const tableKey = toTableKey(constraint.tableSchema, constraint.tableName);
+ const table = tableMap.get(tableKey);
+ const columnsByName = columnMap.get(tableKey);
+ if (!table || !columnsByName) {
+ continue;
+ }
+ if (constraint.type === "PRIMARY KEY") {
+ if (constraint.columns.length === 1) {
+ const column = columnsByName.get(constraint.columns[0]);
+ if (column) {
+ column.primaryKey = true;
+ column.nullable = false;
+ table.primaryKey = column.name;
+ }
+ }
+ continue;
+ }
+ if (constraint.type === "UNIQUE") {
+ if (constraint.columns.length === 1) {
+ const column = columnsByName.get(constraint.columns[0]);
+ if (column) {
+ column.unique = true;
+ }
+ }
+ continue;
+ }
+ if (constraint.type === "FOREIGN KEY") {
+ if (constraint.columns.length === 1 && constraint.referencedColumns && constraint.referencedColumns.length === 1 && constraint.referencedTableName) {
+ const column = columnsByName.get(constraint.columns[0]);
+ if (column) {
+ column.foreignKey = {
+ table: toTableKey(constraint.referencedTableSchema ?? DEFAULT_SCHEMA, constraint.referencedTableName),
+ column: constraint.referencedColumns[0]
+ };
+ }
+ }
+ }
+ }
+ const orderedTableNames = Array.from(tableMap.keys()).sort((left, right) => left.localeCompare(right));
+ const tables = {};
+ for (const tableName of orderedTableNames) {
+ const table = tableMap.get(tableName);
+ if (table) {
+ tables[tableName] = table;
+ }
+ }
+ return { tables };
+ }
+ var DEFAULT_SCHEMA, CONSTRAINT_TYPE_ORDER, TABLES_QUERY, COLUMNS_QUERY, CONSTRAINTS_QUERY, FOREIGN_KEYS_QUERY;
+ var init_introspect_postgres = __esm({
+ "node_modules/@xubylele/schema-forge-core/dist/core/sql/introspect-postgres.js"() {
+ "use strict";
+ init_normalize();
+ DEFAULT_SCHEMA = "public";
+ CONSTRAINT_TYPE_ORDER = {
+ "PRIMARY KEY": 0,
+ UNIQUE: 1,
+ "FOREIGN KEY": 2,
+ CHECK: 3
+ };
+ TABLES_QUERY = `
+ SELECT
+ table_schema,
+ table_name
+ FROM information_schema.tables
+ WHERE table_type = 'BASE TABLE'
+ AND table_schema = ANY($1::text[])
+ `;
+ COLUMNS_QUERY = `
+ SELECT
+ table_schema,
+ table_name,
+ column_name,
+ ordinal_position,
+ is_nullable,
+ data_type,
+ udt_name,
+ character_maximum_length,
+ numeric_precision,
+ numeric_scale,
+ column_default
+ FROM information_schema.columns
+ WHERE table_schema = ANY($1::text[])
+ `;
+ CONSTRAINTS_QUERY = `
+ SELECT
+ tc.table_schema,
+ tc.table_name,
+ tc.constraint_name,
+ tc.constraint_type,
+ kcu.column_name,
+ kcu.ordinal_position,
+ cc.check_clause
+ FROM information_schema.table_constraints tc
+ LEFT JOIN information_schema.key_column_usage kcu
+ ON tc.constraint_catalog = kcu.constraint_catalog
+ AND tc.constraint_schema = kcu.constraint_schema
+ AND tc.constraint_name = kcu.constraint_name
+ AND tc.table_schema = kcu.table_schema
+ AND tc.table_name = kcu.table_name
+ LEFT JOIN information_schema.check_constraints cc
+ ON tc.constraint_catalog = cc.constraint_catalog
+ AND tc.constraint_schema = cc.constraint_schema
+ AND tc.constraint_name = cc.constraint_name
+ WHERE tc.table_schema = ANY($1::text[])
+ AND tc.constraint_type IN ('PRIMARY KEY', 'UNIQUE', 'CHECK')
+ `;
+ FOREIGN_KEYS_QUERY = `
+ SELECT
+ src_ns.nspname AS table_schema,
+ src.relname AS table_name,
+ con.conname AS constraint_name,
+ src_attr.attname AS local_column_name,
+ ref_ns.nspname AS referenced_table_schema,
+ ref.relname AS referenced_table_name,
+ ref_attr.attname AS referenced_column_name,
+ src_key.ord AS position
+ FROM pg_constraint con
+ JOIN pg_class src
+ ON src.oid = con.conrelid
+ JOIN pg_namespace src_ns
+ ON src_ns.oid = src.relnamespace
+ JOIN pg_class ref
+ ON ref.oid = con.confrelid
+ JOIN pg_namespace ref_ns
+ ON ref_ns.oid = ref.relnamespace
+ JOIN LATERAL unnest(con.conkey) WITH ORDINALITY AS src_key(attnum, ord)
+ ON TRUE
+ JOIN LATERAL unnest(con.confkey) WITH ORDINALITY AS ref_key(attnum, ord)
+ ON ref_key.ord = src_key.ord
+ JOIN pg_attribute src_attr
+ ON src_attr.attrelid = con.conrelid
+ AND src_attr.attnum = src_key.attnum
+ JOIN pg_attribute ref_attr
+ ON ref_attr.attrelid = con.confrelid
+ AND ref_attr.attnum = ref_key.attnum
+ WHERE con.contype = 'f'
+ AND src_ns.nspname = ANY($1::text[])
+ `;
+ }
+ });
+
  // node_modules/@xubylele/schema-forge-core/dist/core/paths.js
  function getProjectRoot2(cwd = process.cwd()) {
  return cwd;
@@ -2219,6 +2641,7 @@ var init_errors = __esm({
  var dist_exports = {};
  __export(dist_exports, {
  SchemaValidationError: () => SchemaValidationError,
+ analyzeSchemaDrift: () => analyzeSchemaDrift,
  applySqlOps: () => applySqlOps,
  checkOperationSafety: () => checkOperationSafety,
  checkSchemaSafety: () => checkSchemaSafety,
@@ -2237,6 +2660,7 @@ __export(dist_exports, {
  getStatePath: () => getStatePath2,
  getTableNamesFromSchema: () => getTableNamesFromSchema,
  getTableNamesFromState: () => getTableNamesFromState,
+ introspectPostgresSchema: () => introspectPostgresSchema,
  legacyPkName: () => legacyPkName,
  legacyUqName: () => legacyUqName,
  loadMigrationSqlInput: () => loadMigrationSqlInput,
@@ -2274,6 +2698,7 @@ var init_dist = __esm({
  "use strict";
  init_parser();
  init_diff();
+ init_drift_analyzer();
  init_validator();
  init_validate();
  init_safety();
@@ -2284,6 +2709,7 @@ var init_dist = __esm({
  init_schema_to_dsl();
  init_load_migrations();
  init_split_statements();
+ init_introspect_postgres();
  init_fs();
  init_normalize();
  init_paths();
@@ -2293,12 +2719,12 @@ var init_dist = __esm({
  });

  // src/cli.ts
- var import_commander6 = require("commander");
+ var import_commander8 = require("commander");

  // package.json
  var package_default = {
  name: "@xubylele/schema-forge",
- version: "1.6.1",
+ version: "1.7.0",
  description: "Universal migration generator from schema DSL",
  main: "dist/cli.js",
  type: "commonjs",
@@ -2309,6 +2735,7 @@ var package_default = {
  build: "tsup src/cli.ts --format cjs --dts",
  dev: "ts-node src/cli.ts",
  test: "vitest",
+ "test:integration:drift": "vitest run test/drift-realdb.integration.test.ts",
  prepublishOnly: "npm run build",
  "publish:public": "npm publish --access public",
  changeset: "changeset",
@@ -2340,12 +2767,15 @@ var package_default = {
  dependencies: {
  boxen: "^8.0.1",
  chalk: "^5.6.2",
- commander: "^14.0.3"
+ commander: "^14.0.3",
+ pg: "^8.19.0"
  },
  devDependencies: {
- "@changesets/cli": "^2.29.8",
+ "@changesets/cli": "^2.30.0",
  "@types/node": "^25.2.3",
- "@xubylele/schema-forge-core": "^1.2.0",
+ "@types/pg": "^8.18.0",
+ "@xubylele/schema-forge-core": "^1.3.0",
+ testcontainers: "^11.8.1",
  "ts-node": "^10.9.2",
  tsup: "^8.5.1",
  typescript: "^5.9.3",
@@ -2492,6 +2922,14 @@ async function parseMigrationSql2(sql) {
  const core = await loadCore();
  return core.parseMigrationSql(sql);
  }
+ async function introspectPostgresSchema2(options) {
+ const core = await loadCore();
+ return core.introspectPostgresSchema(options);
+ }
+ async function analyzeSchemaDrift2(state, liveSchema) {
+ const core = await loadCore();
+ return core.analyzeSchemaDrift(state, liveSchema);
+ }
  async function applySqlOps2(ops) {
  const core = await loadCore();
  return core.applySqlOps(ops);
@@ -2513,6 +2951,40 @@ async function isSchemaValidationError(error2) {
   return error2 instanceof core.SchemaValidationError;
 }
 
+// src/core/postgres.ts
+var import_pg = require("pg");
+function resolvePostgresConnectionString(options = {}) {
+  const explicitUrl = options.url?.trim();
+  if (explicitUrl) {
+    return explicitUrl;
+  }
+  const envUrl = process.env.DATABASE_URL?.trim();
+  if (envUrl) {
+    return envUrl;
+  }
+  throw new Error("PostgreSQL connection URL is required. Pass --url or set DATABASE_URL.");
+}
+function parseSchemaList(value) {
+  if (!value) {
+    return void 0;
+  }
+  const schemas = value.split(",").map((item) => item.trim()).filter(Boolean);
+  return schemas.length > 0 ? schemas : void 0;
+}
+async function withPostgresQueryExecutor(connectionString, run) {
+  const client = new import_pg.Client({ connectionString });
+  await client.connect();
+  const query = async (sql, params) => {
+    const result = await client.query(sql, params ? [...params] : void 0);
+    return result.rows;
+  };
+  try {
+    return await run(query);
+  } finally {
+    await client.end();
+  }
+}
+
 // src/utils/exitCodes.ts
 var EXIT_CODES = {
   /** Successful operation */
@@ -2647,7 +3119,6 @@ function hasDestructiveFindings(findings) {
 }
 
 // src/commands/diff.ts
-var REQUIRED_CONFIG_FIELDS = ["schemaFile", "stateFile"];
 function resolveConfigPath(root, targetPath) {
   return import_path7.default.isAbsolute(targetPath) ? targetPath : import_path7.default.join(root, targetPath);
 }
@@ -2661,14 +3132,16 @@ async function runDiff(options = {}) {
     throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
   }
   const config = await readJsonFile(configPath, {});
-  for (const field of REQUIRED_CONFIG_FIELDS) {
+  const useLiveDatabase = Boolean(options.url || process.env.DATABASE_URL);
+  const requiredFields = useLiveDatabase ? ["schemaFile"] : ["schemaFile", "stateFile"];
+  for (const field of requiredFields) {
     const value = config[field];
     if (!value || typeof value !== "string") {
       throw new Error(`Invalid config: '${field}' is required`);
     }
   }
   const schemaPath = resolveConfigPath(root, config.schemaFile);
-  const statePath = resolveConfigPath(root, config.stateFile);
+  const statePath = config.stateFile ? resolveConfigPath(root, config.stateFile) : null;
   const { provider } = resolveProvider(config.provider);
   const schemaSource = await readTextFile(schemaPath);
   const schema = await parseSchema2(schemaSource);
@@ -2680,7 +3153,17 @@ async function runDiff(options = {}) {
     }
     throw error2;
   }
-  const previousState = await loadState2(statePath);
+  const previousState = useLiveDatabase ? await withPostgresQueryExecutor(
+    resolvePostgresConnectionString({ url: options.url }),
+    async (query) => {
+      const schemaFilters = parseSchemaList(options.schema);
+      const liveSchema = await introspectPostgresSchema2({
+        query,
+        ...schemaFilters ? { schemas: schemaFilters } : {}
+      });
+      return schemaToState2(liveSchema);
+    }
+  ) : await loadState2(statePath ?? "");
   const diff = await diffSchemas2(previousState, schema);
   if (options.force) {
     forceWarning("Are you sure to use --force? This option will bypass safety checks for destructive operations.");
@@ -2725,9 +3208,72 @@ Remove --safe flag or modify schema to avoid destructive changes.`
   process.exitCode = EXIT_CODES.SUCCESS;
 }
 
-// src/commands/generate.ts
+// src/commands/doctor.ts
 var import_commander2 = require("commander");
 var import_path8 = __toESM(require("path"));
+function resolveConfigPath2(root, targetPath) {
+  return import_path8.default.isAbsolute(targetPath) ? targetPath : import_path8.default.join(root, targetPath);
+}
+function hasDrift(report) {
+  return report.missingTables.length > 0 || report.extraTables.length > 0 || report.columnDifferences.length > 0 || report.typeMismatches.length > 0;
+}
+function printDriftReport(report) {
+  if (report.missingTables.length > 0) {
+    console.log(`Missing tables in live DB: ${report.missingTables.join(", ")}`);
+  }
+  if (report.extraTables.length > 0) {
+    console.log(`Extra tables in live DB: ${report.extraTables.join(", ")}`);
+  }
+  for (const difference of report.columnDifferences) {
+    if (difference.missingInLive.length > 0) {
+      console.log(`Missing columns in ${difference.tableName}: ${difference.missingInLive.join(", ")}`);
+    }
+    if (difference.extraInLive.length > 0) {
+      console.log(`Extra columns in ${difference.tableName}: ${difference.extraInLive.join(", ")}`);
+    }
+  }
+  for (const mismatch of report.typeMismatches) {
+    console.log(`Type mismatch ${mismatch.tableName}.${mismatch.columnName}: ${mismatch.expectedType} -> ${mismatch.actualType}`);
+  }
+}
+async function runDoctor(options = {}) {
+  const root = getProjectRoot();
+  const configPath = getConfigPath(root);
+  if (!await fileExists(configPath)) {
+    throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
+  }
+  const config = await readJsonFile(configPath, {});
+  if (!config.stateFile || typeof config.stateFile !== "string") {
+    throw new Error("Invalid config: 'stateFile' is required");
+  }
+  const statePath = resolveConfigPath2(root, config.stateFile);
+  const previousState = await loadState2(statePath);
+  const schemaFilters = parseSchemaList(options.schema);
+  const liveSchema = await withPostgresQueryExecutor(
+    resolvePostgresConnectionString({ url: options.url }),
+    (query) => introspectPostgresSchema2({
+      query,
+      ...schemaFilters ? { schemas: schemaFilters } : {}
+    })
+  );
+  const driftReport = await analyzeSchemaDrift2(previousState, liveSchema);
+  const detected = hasDrift(driftReport);
+  process.exitCode = detected ? EXIT_CODES.DRIFT_DETECTED : EXIT_CODES.SUCCESS;
+  if (options.json) {
+    console.log(JSON.stringify(driftReport, null, 2));
+    return;
+  }
+  if (!detected) {
+    success("No schema drift detected");
+    return;
+  }
+  console.log("Schema drift detected");
+  printDriftReport(driftReport);
+}
+
+// src/commands/generate.ts
+var import_commander3 = require("commander");
+var import_path9 = __toESM(require("path"));
 
 // src/core/utils.ts
 function nowTimestamp2() {
@@ -2740,13 +3286,13 @@ function slugifyName2(name) {
 }
 
 // src/commands/generate.ts
-var REQUIRED_CONFIG_FIELDS2 = [
+var REQUIRED_CONFIG_FIELDS = [
   "schemaFile",
   "stateFile",
   "outputDir"
 ];
-function resolveConfigPath2(root, targetPath) {
-  return import_path8.default.isAbsolute(targetPath) ? targetPath : import_path8.default.join(root, targetPath);
+function resolveConfigPath3(root, targetPath) {
+  return import_path9.default.isAbsolute(targetPath) ? targetPath : import_path9.default.join(root, targetPath);
 }
 async function runGenerate(options) {
   if (options.safe && options.force) {
@@ -2758,15 +3304,15 @@ async function runGenerate(options) {
     throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
   }
   const config = await readJsonFile(configPath, {});
-  for (const field of REQUIRED_CONFIG_FIELDS2) {
+  for (const field of REQUIRED_CONFIG_FIELDS) {
     const value = config[field];
     if (!value || typeof value !== "string") {
       throw new Error(`Invalid config: '${field}' is required`);
     }
   }
-  const schemaPath = resolveConfigPath2(root, config.schemaFile);
-  const statePath = resolveConfigPath2(root, config.stateFile);
-  const outputDir = resolveConfigPath2(root, config.outputDir);
+  const schemaPath = resolveConfigPath3(root, config.schemaFile);
+  const statePath = resolveConfigPath3(root, config.stateFile);
+  const outputDir = resolveConfigPath3(root, config.outputDir);
   const { provider, usedDefault } = resolveProvider(config.provider);
   if (usedDefault) {
     info("Provider not set; defaulting to postgres.");
@@ -2827,7 +3373,7 @@ Remove --safe flag or modify schema to avoid destructive changes.`
   const slug = slugifyName2(options.name ?? "migration");
   const fileName = `${timestamp}-${slug}.sql`;
   await ensureDir(outputDir);
-  const migrationPath = import_path8.default.join(outputDir, fileName);
+  const migrationPath = import_path9.default.join(outputDir, fileName);
   await writeTextFile(migrationPath, sql + "\n");
   const nextState = await schemaToState2(schema);
   await saveState2(statePath, nextState);
@@ -2836,14 +3382,14 @@ Remove --safe flag or modify schema to avoid destructive changes.`
 }
 
 // src/commands/import.ts
-var import_commander3 = require("commander");
-var import_path9 = __toESM(require("path"));
-function resolveConfigPath3(root, targetPath) {
-  return import_path9.default.isAbsolute(targetPath) ? targetPath : import_path9.default.join(root, targetPath);
+var import_commander4 = require("commander");
+var import_path10 = __toESM(require("path"));
+function resolveConfigPath4(root, targetPath) {
+  return import_path10.default.isAbsolute(targetPath) ? targetPath : import_path10.default.join(root, targetPath);
 }
 async function runImport(inputPath, options = {}) {
   const root = getProjectRoot();
-  const absoluteInputPath = resolveConfigPath3(root, inputPath);
+  const absoluteInputPath = resolveConfigPath4(root, inputPath);
   const inputs = await loadMigrationSqlInput2(absoluteInputPath);
   if (inputs.length === 0) {
     throw new Error(`No .sql migration files found in: ${absoluteInputPath}`);
@@ -2854,7 +3400,7 @@ async function runImport(inputPath, options = {}) {
     const result = await parseMigrationSql2(input.sql);
     allOps.push(...result.ops);
     parseWarnings.push(...result.warnings.map((item) => ({
-      statement: `[${import_path9.default.basename(input.filePath)}] ${item.statement}`,
+      statement: `[${import_path10.default.basename(input.filePath)}] ${item.statement}`,
       reason: item.reason
     })));
   }
@@ -2870,7 +3416,7 @@ async function runImport(inputPath, options = {}) {
       }
     }
   }
-  const schemaPath = targetPath ? resolveConfigPath3(root, targetPath) : getSchemaFilePath(root);
+  const schemaPath = targetPath ? resolveConfigPath4(root, targetPath) : getSchemaFilePath(root);
   await writeTextFile(schemaPath, dsl);
   success(`Imported ${inputs.length} migration file(s) into ${schemaPath}`);
   info(`Parsed ${allOps.length} supported DDL operation(s)`);
@@ -2888,7 +3434,7 @@ async function runImport(inputPath, options = {}) {
 }
 
 // src/commands/init.ts
-var import_commander4 = require("commander");
+var import_commander5 = require("commander");
 async function runInit() {
   const root = getProjectRoot();
   const schemaForgeDir = getSchemaForgeDir(root);
@@ -2947,28 +3493,61 @@ table users {
   process.exitCode = EXIT_CODES.SUCCESS;
 }
 
+// src/commands/introspect.ts
+var import_commander6 = require("commander");
+var import_path11 = __toESM(require("path"));
+function resolveOutputPath(root, outputPath) {
+  return import_path11.default.isAbsolute(outputPath) ? outputPath : import_path11.default.join(root, outputPath);
+}
+async function runIntrospect(options = {}) {
+  const connectionString = resolvePostgresConnectionString({ url: options.url });
+  const schemas = parseSchemaList(options.schema);
+  const root = getProjectRoot();
+  const schema = await withPostgresQueryExecutor(connectionString, (query) => introspectPostgresSchema2({
+    query,
+    ...schemas ? { schemas } : {}
+  }));
+  const output = JSON.stringify(schema, null, 2);
+  if (!options.json && !options.out) {
+    info(`Introspected ${Object.keys(schema.tables).length} table(s) from PostgreSQL.`);
+  }
+  if (options.out) {
+    const outputPath = resolveOutputPath(root, options.out);
+    await writeTextFile(outputPath, `${output}
+`);
+    success(`Live schema written to ${outputPath}`);
+  }
+  if (options.json || !options.out) {
+    console.log(output);
+  }
+}
+
 // src/commands/validate.ts
-var import_commander5 = require("commander");
-var import_path10 = __toESM(require("path"));
-var REQUIRED_CONFIG_FIELDS3 = ["schemaFile", "stateFile"];
-function resolveConfigPath4(root, targetPath) {
-  return import_path10.default.isAbsolute(targetPath) ? targetPath : import_path10.default.join(root, targetPath);
+var import_commander7 = require("commander");
+var import_path12 = __toESM(require("path"));
+function resolveConfigPath5(root, targetPath) {
+  return import_path12.default.isAbsolute(targetPath) ? targetPath : import_path12.default.join(root, targetPath);
 }
 async function runValidate(options = {}) {
   const root = getProjectRoot();
   const configPath = getConfigPath(root);
+  const useLiveDatabase = Boolean(options.url || process.env.DATABASE_URL);
   if (!await fileExists(configPath)) {
     throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
   }
   const config = await readJsonFile(configPath, {});
-  for (const field of REQUIRED_CONFIG_FIELDS3) {
+  const requiredFields = ["schemaFile", "stateFile"];
+  for (const field of requiredFields) {
     const value = config[field];
     if (!value || typeof value !== "string") {
       throw new Error(`Invalid config: '${field}' is required`);
     }
   }
-  const schemaPath = resolveConfigPath4(root, config.schemaFile);
-  const statePath = resolveConfigPath4(root, config.stateFile);
+  const schemaPath = resolveConfigPath5(root, config.schemaFile);
+  if (!config.stateFile) {
+    throw new Error("Invalid config: 'stateFile' is required");
+  }
+  const statePath = resolveConfigPath5(root, config.stateFile);
   const schemaSource = await readTextFile(schemaPath);
   const schema = await parseSchema2(schemaSource);
   try {
@@ -2980,6 +3559,45 @@ async function runValidate(options = {}) {
     throw error2;
   }
   const previousState = await loadState2(statePath);
+  if (useLiveDatabase) {
+    const schemaFilters = parseSchemaList(options.schema);
+    const liveSchema = await withPostgresQueryExecutor(
+      resolvePostgresConnectionString({ url: options.url }),
+      (query) => introspectPostgresSchema2({
+        query,
+        ...schemaFilters ? { schemas: schemaFilters } : {}
+      })
+    );
+    const driftReport = await analyzeSchemaDrift2(previousState, liveSchema);
+    const hasDrift2 = driftReport.missingTables.length > 0 || driftReport.extraTables.length > 0 || driftReport.columnDifferences.length > 0 || driftReport.typeMismatches.length > 0;
+    process.exitCode = hasDrift2 ? EXIT_CODES.DRIFT_DETECTED : EXIT_CODES.SUCCESS;
+    if (options.json) {
+      console.log(JSON.stringify(driftReport, null, 2));
+      return;
+    }
+    if (!hasDrift2) {
+      success("No schema drift detected");
+      return;
+    }
+    if (driftReport.missingTables.length > 0) {
+      console.log(`Missing tables in live DB: ${driftReport.missingTables.join(", ")}`);
+    }
+    if (driftReport.extraTables.length > 0) {
+      console.log(`Extra tables in live DB: ${driftReport.extraTables.join(", ")}`);
+    }
+    for (const difference of driftReport.columnDifferences) {
+      if (difference.missingInLive.length > 0) {
+        console.log(`Missing columns in ${difference.tableName}: ${difference.missingInLive.join(", ")}`);
+      }
+      if (difference.extraInLive.length > 0) {
+        console.log(`Extra columns in ${difference.tableName}: ${difference.extraInLive.join(", ")}`);
+      }
+    }
+    for (const mismatch of driftReport.typeMismatches) {
+      console.log(`Type mismatch ${mismatch.tableName}.${mismatch.columnName}: ${mismatch.expectedType} -> ${mismatch.actualType}`);
+    }
+    return;
+  }
   const findings = await validateSchemaChanges2(previousState, schema);
   const report = await toValidationReport2(findings);
   if (isCI() && hasDestructiveFindings(findings)) {
@@ -3053,7 +3671,7 @@ async function seedLastSeenVersion(version) {
 }
 
 // src/cli.ts
-var program = new import_commander6.Command();
+var program = new import_commander8.Command();
 program.name("schema-forge").description("CLI tool for schema management and SQL generation").version(package_default.version).option("--safe", "Prevent execution of destructive operations").option("--force", "Force execution by bypassing safety checks and CI detection");
 function validateFlagExclusivity(options) {
   if (options.safe && options.force) {
@@ -3089,11 +3707,25 @@ program.command("generate").description("Generate SQL from schema files. In CI e
     await handleError(error2);
   }
 });
-program.command("diff").description("Compare two schema versions and generate migration SQL. In CI environments (CI=true), exits with code 3 if destructive operations are detected unless --force is used.").action(async () => {
+program.command("diff").description("Compare two schema versions and generate migration SQL. In CI environments (CI=true), exits with code 3 if destructive operations are detected unless --force is used.").option("--url <string>", "PostgreSQL connection URL for live diff (defaults to DATABASE_URL)").option("--schema <list>", "Comma-separated schema names to introspect (default: public)").action(async (options) => {
   try {
     const globalOptions = program.opts();
     validateFlagExclusivity(globalOptions);
-    await runDiff(globalOptions);
+    await runDiff({ ...options, ...globalOptions });
+  } catch (error2) {
+    await handleError(error2);
+  }
+});
+program.command("doctor").description("Check live database drift against state. Exits with code 2 when drift is detected.").option("--json", "Output structured JSON").option("--url <string>", "PostgreSQL connection URL (defaults to DATABASE_URL)").option("--schema <list>", "Comma-separated schema names to introspect (default: public)").action(async (options) => {
+  try {
+    await runDoctor(options);
+  } catch (error2) {
+    await handleError(error2);
+  }
+});
+program.command("introspect").description("Extract normalized live schema from PostgreSQL").option("--url <string>", "PostgreSQL connection URL (defaults to DATABASE_URL)").option("--schema <list>", "Comma-separated schema names (default: public)").option("--json", "Output normalized schema JSON to stdout").option("--out <path>", "Write normalized schema JSON to a file").action(async (options) => {
+  try {
+    await runIntrospect(options);
   } catch (error2) {
     await handleError(error2);
   }
@@ -3105,7 +3737,7 @@ program.command("import").description("Import schema from SQL migrations").argum
     await handleError(error2);
   }
 });
-program.command("validate").description("Detect destructive or risky schema changes. In CI environments (CI=true), exits with code 3 if destructive operations are detected.").option("--json", "Output structured JSON").action(async (options) => {
+program.command("validate").description("Detect destructive or risky schema changes. In CI environments (CI=true), exits with code 3 if destructive operations are detected.").option("--json", "Output structured JSON").option("--url <string>", "PostgreSQL connection URL for live drift validation (defaults to DATABASE_URL)").option("--schema <list>", "Comma-separated schema names to introspect (default: public)").action(async (options) => {
   try {
     await runValidate(options);
   } catch (error2) {
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@xubylele/schema-forge",
-  "version": "1.6.1",
+  "version": "1.7.0",
   "description": "Universal migration generator from schema DSL",
   "main": "dist/cli.js",
   "type": "commonjs",
@@ -11,6 +11,7 @@
     "build": "tsup src/cli.ts --format cjs --dts",
     "dev": "ts-node src/cli.ts",
     "test": "vitest",
+    "test:integration:drift": "vitest run test/drift-realdb.integration.test.ts",
    "prepublishOnly": "npm run build",
    "publish:public": "npm publish --access public",
    "changeset": "changeset",
@@ -42,12 +43,15 @@
   "dependencies": {
     "boxen": "^8.0.1",
     "chalk": "^5.6.2",
-    "commander": "^14.0.3"
+    "commander": "^14.0.3",
+    "pg": "^8.19.0"
   },
   "devDependencies": {
-    "@changesets/cli": "^2.29.8",
+    "@changesets/cli": "^2.30.0",
     "@types/node": "^25.2.3",
-    "@xubylele/schema-forge-core": "^1.2.0",
+    "@types/pg": "^8.18.0",
+    "@xubylele/schema-forge-core": "^1.3.0",
+    "testcontainers": "^11.8.1",
     "ts-node": "^10.9.2",
     "tsup": "^8.5.1",
     "typescript": "^5.9.3",