@xubylele/schema-forge 1.5.2 → 1.6.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +92 -3
- package/dist/cli.js +469 -115
- package/package.json +3 -3
package/README.md
CHANGED

@@ -171,12 +171,25 @@ Creates the necessary directory structure and configuration files.
 Generate SQL migration from schema changes.
 
 ```bash
-schema-forge generate [--name "migration description"]
+schema-forge generate [--name "migration description"] [--safe] [--force]
 ```
 
 **Options:**
 
 - `--name` - Optional name for the migration (default: "migration")
+- `--safe` - Block execution if destructive operations are detected (exits with error)
+- `--force` - Bypass safety checks and proceed with destructive operations (shows warning)
+
+**Safety Behavior:**
+
+When destructive or risky operations are detected (like dropping columns or tables), SchemaForge will:
+
+1. **Without flags** - Display an interactive prompt showing the risky operations and ask for confirmation (yes/no)
+2. **With `--safe`** - Block execution immediately and exit with error code 1, listing all destructive operations
+3. **With `--force`** - Bypass safety checks, show a warning message, and proceed with generating the migration
+4. **In CI environment** (`CI=true`) - Skip the interactive prompt, fail with exit code 3 for destructive operations unless `--force` is used
+
+See [CI Behavior](#ci-behavior) for more details.
 
 Compares your current schema with the tracked state, generates SQL for any changes, and updates the state file.
 
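The four numbered cases in the hunk above reduce to a small decision table. A sketch with illustrative names (this function is not part of the package; it only mirrors the documented behavior):

```javascript
// Hypothetical condensation of the documented safety behavior.
// safe/force come from CLI flags, ci from CI detection, risky from the diff findings.
function resolveSafetyAction({ safe = false, force = false, ci = false, risky = false }) {
  if (!risky) return "proceed";             // nothing dangerous: just generate
  if (safe) return "block";                 // --safe: refuse and list operations
  if (force) return "proceed-with-warning"; // --force: warn, then generate anyway
  if (ci) return "fail-ci";                 // CI=true without --force: exit code 3
  return "prompt";                          // interactive yes/no confirmation
}

console.log(resolveSafetyAction({ risky: true, ci: true }));    // "fail-ci"
console.log(resolveSafetyAction({ risky: true, force: true })); // "proceed-with-warning"
```

The same precedence shows why the CLI rejects `--safe` together with `--force`: the two flags would select contradictory branches.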
@@ -185,10 +198,15 @@ Compares your current schema with the tracked state, generates SQL for any chang
 Compare your schema with the current state without generating files.
 
 ```bash
-schema-forge diff
+schema-forge diff [--safe] [--force]
 ```
 
-
+**Options:**
+
+- `--safe` - Block execution if destructive operations are detected (exits with error)
+- `--force` - Bypass safety checks and proceed with displaying destructive SQL (shows warning)
+
+Shows what SQL would be generated if you ran `generate`. Useful for previewing changes. Safety behavior is the same as `generate` command. In CI environments, exits with code 3 if destructive operations are detected unless `--force` is used. See [CI Behavior](#ci-behavior) for more details.
 
 ### `schema-forge import`
 
@@ -229,11 +247,82 @@ Use JSON mode for CI and automation:
 schema-forge validate --json
 ```
 
+Exit codes (also see [CI Behavior](#ci-behavior)):
+
+- `3` in CI environment if destructive findings detected
+- `1` if one or more `error` findings are detected
+- `0` if no `error` findings are detected (warnings alone do not fail)
+
 Exit codes:
 
 - `1` when one or more `error` findings are detected
 - `0` when no `error` findings are detected (warnings alone do not fail)
 
+## CI Behavior
+
+SchemaForge ensures deterministic behavior in Continuous Integration (CI) environments to prevent accidental destructive operations.
+
+### Detecting CI Environment
+
+CI mode is automatically activated when either environment variable is set:
+
+- `CI=true`
+- `CONTINUOUS_INTEGRATION=true`
+
+### Exit Codes
+
+SchemaForge uses specific exit codes for different scenarios:
+
+| Exit Code | Meaning |
+| --------- | ------- |
+| `0` | Success - no changes or no destructive operations detected |
+| `1` | General error - validation failed, operation declined, missing files, etc. |
+| `2` | Schema validation error - invalid DSL syntax or structure |
+| `3` | **CI Destructive** - destructive operations detected in CI environment without `--force` |
+
+### Destructive Operations in CI
+
+When running in a CI environment, destructive operations (those flagged as `error` or `warning` level findings) trigger exit code 3:
+
+**Operations classified as destructive:**
+
+- Dropping tables (`DROP_TABLE`)
+- Dropping columns (`DROP_COLUMN`)
+- Changing column types in incompatible ways
+- Making columns NOT NULL when they allow NULL
+
+### Overriding in CI
+
+To proceed with destructive operations in CI, use the `--force` flag:
+
+```bash
+# This will fail with exit code 3 if destructive changes detected
+schema-forge generate
+
+# This will proceed despite destructive changes (requires explicit acknowledgment)
+schema-forge generate --force
+```
+
+### No Interactive Prompts in CI
+
+When `CI=true`, SchemaForge will:
+
+- ✅ Never show interactive prompts, preventing script hangs
+- ✅ Fail deterministically (exit code 3) for destructive operations
+- ✅ Allow explicit override with `--force` flag
+- ❌ Not accept user input for confirmation
+
+### Using `--safe` in CI
+
+The `--safe` flag is compatible with CI and blocks execution of destructive operations:
+
+```bash
+# Blocks execution if destructive operations detected, exits with code 1
+schema-forge generate --safe
+```
+
+This is useful for strict CI pipelines where all destructive changes must be reviewed and merged separately.
+
 ## Constraint Change Detection
 
 SchemaForge detects and generates migrations for:
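The exit-code table added in the CI Behavior section above is the contract a pipeline scripts against. A sketch of how a CI step might interpret the reported code (the object and function here are illustrative, not exported by the package):

```javascript
// Illustrative mirror of the documented exit codes (not the package's API).
const EXIT_CODE_MEANINGS = {
  0: "success (no changes or no destructive operations)",
  1: "general error (validation failed, operation declined, missing files)",
  2: "schema validation error (invalid DSL syntax or structure)",
  3: "destructive operations detected in CI without --force",
};

// A pipeline would branch on the code reported by `schema-forge generate`;
// code 3 specifically means "add --force or split out destructive changes".
function ciShouldFail(exitCode) {
  return exitCode !== 0;
}

console.log(EXIT_CODE_MEANINGS[3]);
console.log(ciShouldFail(3)); // true
```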
package/dist/cli.js
CHANGED

@@ -46,11 +46,11 @@ function parseSchema(source) {
 "date"
 ]);
 const validIdentifierPattern = /^[A-Za-z_][A-Za-z0-9_]*$/;
-function
+function normalizeColumnType4(type) {
 return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
 }
 function isValidColumnType2(type) {
-const normalizedType =
+const normalizedType = normalizeColumnType4(type);
 if (validBaseColumnTypes.has(normalizedType)) {
 return true;
 }
@@ -174,7 +174,7 @@ function parseSchema(source) {
 throw new Error(`Line ${lineNum}: Invalid column definition. Expected: <name> <type> [modifiers...]`);
 }
 const colName = tokens[0];
-const colType =
+const colType = normalizeColumnType4(tokens[1]);
 validateIdentifier(colName, lineNum, "column");
 if (!isValidColumnType2(colType)) {
 throw new Error(`Line ${lineNum}: Invalid column type '${tokens[1]}'. Valid types: ${Array.from(validBaseColumnTypes).join(", ")}, varchar(n), numeric(p,s)`);
@@ -777,7 +777,7 @@ var init_validator = __esm({
 }
 });
 
-// node_modules/@xubylele/schema-forge-core/dist/core/
+// node_modules/@xubylele/schema-forge-core/dist/core/safety/operation-classifier.js
 function normalizeColumnType2(type) {
 return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
 }
@@ -800,119 +800,246 @@ function classifyTypeChange(from, to) {
 const toType = normalizeColumnType2(to);
 const uuidInvolved = fromType === "uuid" || toType === "uuid";
 if (uuidInvolved && fromType !== toType) {
-return
-severity: "error",
-message: `Type changed from ${fromType} to ${toType} (likely incompatible cast)`
-};
+return "DESTRUCTIVE";
 }
 if (fromType === "int" && toType === "bigint") {
-return
-severity: "warning",
-message: "Type widened from int to bigint"
-};
+return "WARNING";
 }
 if (fromType === "bigint" && toType === "int") {
-return
-severity: "error",
-message: "Type narrowed from bigint to int (likely incompatible cast)"
-};
+return "DESTRUCTIVE";
 }
 if (fromType === "text" && parseVarcharLength(toType) !== null) {
-return
-severity: "error",
-message: `Type changed from text to ${toType} (may truncate existing values)`
-};
+return "DESTRUCTIVE";
 }
 if (parseVarcharLength(fromType) !== null && toType === "text") {
-return
-severity: "warning",
-message: "Type widened from varchar(n) to text"
-};
+return "WARNING";
 }
 const fromVarcharLength = parseVarcharLength(fromType);
 const toVarcharLength = parseVarcharLength(toType);
 if (fromVarcharLength !== null && toVarcharLength !== null) {
 if (toVarcharLength >= fromVarcharLength) {
-return
-severity: "warning",
-message: `Type widened from varchar(${fromVarcharLength}) to varchar(${toVarcharLength})`
-};
+return "WARNING";
 }
-return
-severity: "error",
-message: `Type narrowed from varchar(${fromVarcharLength}) to varchar(${toVarcharLength})`
-};
+return "DESTRUCTIVE";
 }
 const fromNumeric = parseNumericType(fromType);
 const toNumeric = parseNumericType(toType);
 if (fromNumeric && toNumeric && fromNumeric.scale === toNumeric.scale) {
 if (toNumeric.precision >= fromNumeric.precision) {
-return
-severity: "warning",
-message: `Type widened from numeric(${fromNumeric.precision},${fromNumeric.scale}) to numeric(${toNumeric.precision},${toNumeric.scale})`
-};
+return "WARNING";
 }
-return
-
-
-
+return "DESTRUCTIVE";
+}
+return "WARNING";
+}
+function classifyOperation(operation) {
+switch (operation.kind) {
+case "create_table":
+return "SAFE";
+case "add_column":
+return "SAFE";
+case "drop_table":
+return "DESTRUCTIVE";
+case "drop_column":
+return "DESTRUCTIVE";
+case "column_type_changed":
+return classifyTypeChange(operation.fromType, operation.toType);
+case "column_nullability_changed":
+if (operation.from && !operation.to) {
+return "WARNING";
+}
+return "SAFE";
+case "column_default_changed":
+return "SAFE";
+case "column_unique_changed":
+return "SAFE";
+case "drop_primary_key_constraint":
+return "DESTRUCTIVE";
+case "add_primary_key_constraint":
+return "SAFE";
+default:
+const _exhaustive = operation;
+return _exhaustive;
+}
+}
+var init_operation_classifier = __esm({
+"node_modules/@xubylele/schema-forge-core/dist/core/safety/operation-classifier.js"() {
+"use strict";
+}
+});
+
+// node_modules/@xubylele/schema-forge-core/dist/core/safety/safety-checker.js
+function normalizeColumnType3(type) {
+return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
+}
+function parseVarcharLength2(type) {
+const match = normalizeColumnType3(type).match(/^varchar\((\d+)\)$/);
+return match ? Number(match[1]) : null;
+}
+function parseNumericType2(type) {
+const match = normalizeColumnType3(type).match(/^numeric\((\d+),(\d+)\)$/);
+if (!match) {
+return null;
 }
 return {
-
-
+precision: Number(match[1]),
+scale: Number(match[2])
 };
 }
-function
-const
-const
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+function generateTypeChangeMessage(from, to) {
+const fromType = normalizeColumnType3(from);
+const toType = normalizeColumnType3(to);
+const uuidInvolved = fromType === "uuid" || toType === "uuid";
+if (uuidInvolved && fromType !== toType) {
+return `Type changed from ${fromType} to ${toType} (likely incompatible cast)`;
+}
+if (fromType === "int" && toType === "bigint") {
+return "Type widened from int to bigint";
+}
+if (fromType === "bigint" && toType === "int") {
+return "Type narrowed from bigint to int (likely incompatible cast)";
+}
+if (fromType === "text" && parseVarcharLength2(toType) !== null) {
+return `Type changed from text to ${toType} (may truncate existing values)`;
+}
+if (parseVarcharLength2(fromType) !== null && toType === "text") {
+const length = parseVarcharLength2(fromType);
+return `Type widened from varchar(${length}) to text`;
+}
+const fromVarcharLength = parseVarcharLength2(fromType);
+const toVarcharLength = parseVarcharLength2(toType);
+if (fromVarcharLength !== null && toVarcharLength !== null) {
+if (toVarcharLength >= fromVarcharLength) {
+return `Type widened from varchar(${fromVarcharLength}) to varchar(${toVarcharLength})`;
+}
+return `Type narrowed from varchar(${fromVarcharLength}) to varchar(${toVarcharLength})`;
+}
+const fromNumeric = parseNumericType2(fromType);
+const toNumeric = parseNumericType2(toType);
+if (fromNumeric && toNumeric && fromNumeric.scale === toNumeric.scale) {
+if (toNumeric.precision >= fromNumeric.precision) {
+return `Type widened from numeric(${fromNumeric.precision},${fromNumeric.scale}) to numeric(${toNumeric.precision},${toNumeric.scale})`;
+}
+return `Type narrowed from numeric(${fromNumeric.precision},${fromNumeric.scale}) to numeric(${toNumeric.precision},${toNumeric.scale})`;
+}
+return `Type changed from ${fromType} to ${toType} (compatibility unknown)`;
+}
+function checkOperationSafety(operation) {
+const safetyLevel = classifyOperation(operation);
+if (safetyLevel === "SAFE") {
+return null;
+}
+switch (operation.kind) {
+case "drop_table":
+return {
+safetyLevel,
+code: "DROP_TABLE",
+table: operation.tableName,
+message: "Table removed",
+operationKind: operation.kind
+};
+case "drop_column":
+return {
+safetyLevel,
+code: "DROP_COLUMN",
+table: operation.tableName,
+column: operation.columnName,
+message: "Column removed",
+operationKind: operation.kind
+};
+case "column_type_changed":
+return {
+safetyLevel,
+code: "ALTER_COLUMN_TYPE",
+table: operation.tableName,
+column: operation.columnName,
+from: normalizeColumnType3(operation.fromType),
+to: normalizeColumnType3(operation.toType),
+message: generateTypeChangeMessage(operation.fromType, operation.toType),
+operationKind: operation.kind
+};
+case "column_nullability_changed":
+if (operation.from && !operation.to) {
+return {
+safetyLevel,
+code: "SET_NOT_NULL",
 table: operation.tableName,
 column: operation.columnName,
-
-
-
-});
-break;
+message: "Column changed to NOT NULL (may fail if data contains NULLs)",
+operationKind: operation.kind
+};
 }
-
-
-
-
-
-
-
-
-
-
-
-
-
+return null;
+case "drop_primary_key_constraint":
+return {
+safetyLevel,
+code: "DROP_TABLE",
+// Reuse code for primary key drops
+table: operation.tableName,
+message: "Primary key constraint removed",
+operationKind: operation.kind
+};
+default:
+return null;
+}
+}
+function checkSchemaSafety(previousState, currentSchema) {
+const findings = [];
+const diff = diffSchemas(previousState, currentSchema);
+for (const operation of diff.operations) {
+const finding = checkOperationSafety(operation);
+if (finding) {
+findings.push(finding);
 }
 }
-
+const hasWarnings = findings.some((f) => f.safetyLevel === "WARNING");
+const hasDestructiveOps = findings.some((f) => f.safetyLevel === "DESTRUCTIVE");
+return {
+findings,
+hasSafeIssues: false,
+// All findings are non-safe by definition
+hasWarnings,
+hasDestructiveOps
+};
+}
+var init_safety_checker = __esm({
+"node_modules/@xubylele/schema-forge-core/dist/core/safety/safety-checker.js"() {
+"use strict";
+init_diff();
+init_operation_classifier();
+}
+});
+
+// node_modules/@xubylele/schema-forge-core/dist/core/safety/index.js
+var init_safety = __esm({
+"node_modules/@xubylele/schema-forge-core/dist/core/safety/index.js"() {
+"use strict";
+init_operation_classifier();
+init_safety_checker();
+}
+});
+
+// node_modules/@xubylele/schema-forge-core/dist/core/validate.js
+function safetyLevelToSeverity(level) {
+if (level === "DESTRUCTIVE") {
+return "error";
+}
+return "warning";
+}
+function adaptSafetyFinding(finding) {
+return {
+severity: safetyLevelToSeverity(finding.safetyLevel),
+code: finding.code,
+table: finding.table,
+column: finding.column,
+from: finding.from,
+to: finding.to,
+message: finding.message
+};
+}
+function validateSchemaChanges(previousState, currentSchema) {
+const safetyReport = checkSchemaSafety(previousState, currentSchema);
+return safetyReport.findings.map(adaptSafetyFinding);
 }
 function toValidationReport(findings) {
 const errors = findings.filter((finding) => finding.severity === "error");
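The rewritten classifier above replaces the old `{ severity, message }` objects with plain safety levels. A condensed, runnable sketch of the same type-change rules (the `numeric` branch is omitted for brevity; helper names follow the hunk but this is an extract, not the bundled code):

```javascript
// Normalize whitespace around parens/commas, as the bundle does.
const normalize = (t) => t.toLowerCase().trim().replace(/\s+/g, " ")
  .replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
const varcharLen = (t) => {
  const m = normalize(t).match(/^varchar\((\d+)\)$/);
  return m ? Number(m[1]) : null;
};

// Same decision order as the diff: uuid casts and narrowing are DESTRUCTIVE,
// widening is WARNING, and anything unrecognized falls through to WARNING.
function classifyTypeChange(from, to) {
  const f = normalize(from), t = normalize(to);
  if ((f === "uuid" || t === "uuid") && f !== t) return "DESTRUCTIVE";
  if (f === "int" && t === "bigint") return "WARNING";
  if (f === "bigint" && t === "int") return "DESTRUCTIVE";
  if (f === "text" && varcharLen(t) !== null) return "DESTRUCTIVE";
  if (varcharLen(f) !== null && t === "text") return "WARNING";
  const fl = varcharLen(f), tl = varcharLen(t);
  if (fl !== null && tl !== null) return tl >= fl ? "WARNING" : "DESTRUCTIVE";
  return "WARNING";
}

console.log(classifyTypeChange("varchar(100)", "varchar(50)")); // "DESTRUCTIVE"
console.log(classifyTypeChange("int", "bigint"));               // "WARNING"
```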
@@ -927,7 +1054,7 @@ function toValidationReport(findings) {
 var init_validate = __esm({
 "node_modules/@xubylele/schema-forge-core/dist/core/validate.js"() {
 "use strict";
-
+init_safety();
 }
 });
 
@@ -2093,6 +2220,9 @@ var dist_exports = {};
 __export(dist_exports, {
 SchemaValidationError: () => SchemaValidationError,
 applySqlOps: () => applySqlOps,
+checkOperationSafety: () => checkOperationSafety,
+checkSchemaSafety: () => checkSchemaSafety,
+classifyOperation: () => classifyOperation,
 diffSchemas: () => diffSchemas,
 ensureDir: () => ensureDir2,
 fileExists: () => fileExists2,
@@ -2146,6 +2276,7 @@ var init_dist = __esm({
 init_diff();
 init_validator();
 init_validate();
+init_safety();
 init_state_manager();
 init_sql_generator();
 init_parse_migration();
@@ -2167,7 +2298,7 @@ var import_commander6 = require("commander");
 // package.json
 var package_default = {
 name: "@xubylele/schema-forge",
-version: "1.
+version: "1.6.1",
 description: "Universal migration generator from schema DSL",
 main: "dist/cli.js",
 type: "commonjs",
@@ -2214,7 +2345,7 @@ var package_default = {
 devDependencies: {
 "@changesets/cli": "^2.29.8",
 "@types/node": "^25.2.3",
-"@xubylele/schema-forge-core": "^1.
+"@xubylele/schema-forge-core": "^1.2.0",
 "ts-node": "^10.9.2",
 tsup: "^8.5.1",
 typescript: "^5.9.3",
@@ -2382,6 +2513,18 @@ async function isSchemaValidationError(error2) {
 return error2 instanceof core.SchemaValidationError;
 }
 
+// src/utils/exitCodes.ts
+var EXIT_CODES = {
+/** Successful operation */
+SUCCESS: 0,
+/** Validation error (invalid DSL syntax, config errors, missing files, etc.) */
+VALIDATION_ERROR: 1,
+/** Drift detected - Reserved for future use when comparing actual DB state vs schema */
+DRIFT_DETECTED: 2,
+/** Destructive operation detected in CI environment without --force */
+CI_DESTRUCTIVE: 3
+};
+
 // src/utils/output.ts
 var import_boxen = __toESM(require("boxen"));
 var import_chalk = require("chalk");
@@ -2422,13 +2565,96 @@ function warning(message) {
 function error(message) {
 console.error(theme.error(`[ERROR] ${message}`));
 }
+function forceWarning(message) {
+console.error(theme.warning(`[FORCE] ${message}`));
+}
+
+// src/utils/prompt.ts
+var import_node_readline = __toESM(require("readline"));
+function isCI() {
+return process.env.CI === "true" || process.env.CONTINUOUS_INTEGRATION === "true";
+}
+function formatFindingsSummary(findings) {
+const errors = findings.filter((f) => f.severity === "error");
+const warnings = findings.filter((f) => f.severity === "warning");
+const lines = [];
+if (errors.length > 0) {
+lines.push(theme.error("DESTRUCTIVE OPERATIONS:"));
+for (const finding of errors) {
+const columnPart = finding.column ? `.${finding.column}` : "";
+const fromTo = finding.from && finding.to ? ` (${finding.from} \u2192 ${finding.to})` : "";
+lines.push(theme.error(` \u2022 ${finding.code}: ${finding.table}${columnPart}${fromTo}`));
+}
+}
+if (warnings.length > 0) {
+if (lines.length > 0) lines.push("");
+lines.push(theme.warning("WARNING OPERATIONS:"));
+for (const finding of warnings) {
+const columnPart = finding.column ? `.${finding.column}` : "";
+const fromTo = finding.from && finding.to ? ` (${finding.from} \u2192 ${finding.to})` : "";
+lines.push(theme.warning(` \u2022 ${finding.code}: ${finding.table}${columnPart}${fromTo}`));
+}
+}
+return lines.join("\n");
+}
+async function readConfirmation(input = process.stdin, output = process.stdout) {
+const rl = import_node_readline.default.createInterface({
+input,
+output
+});
+return new Promise((resolve) => {
+const askQuestion = () => {
+rl.question(theme.primary("Proceed with these changes? (yes/no): "), (answer) => {
+const normalized = answer.trim().toLowerCase();
+if (normalized === "yes" || normalized === "y") {
+rl.close();
+resolve(true);
+} else if (normalized === "no" || normalized === "n") {
+rl.close();
+resolve(false);
+} else {
+console.log(theme.warning('Please answer "yes" or "no".'));
+askQuestion();
+}
+});
+};
+askQuestion();
+});
+}
+async function confirmDestructiveOps(findings, input, output) {
+const riskyFindings = findings.filter(
+(f) => f.severity === "error" || f.severity === "warning"
+);
+if (riskyFindings.length === 0) {
+return true;
+}
+if (isCI()) {
+error("Cannot run interactive prompts in CI environment. Use --force flag to bypass safety checks.");
+process.exitCode = EXIT_CODES.CI_DESTRUCTIVE;
+return false;
+}
+console.log("");
+console.log(formatFindingsSummary(riskyFindings));
+console.log("");
+const confirmed = await readConfirmation(input, output);
+if (!confirmed) {
+warning("Operation cancelled by user.");
+}
+return confirmed;
+}
+function hasDestructiveFindings(findings) {
+return findings.some((f) => f.severity === "error" || f.severity === "warning");
+}
 
 // src/commands/diff.ts
 var REQUIRED_CONFIG_FIELDS = ["schemaFile", "stateFile"];
 function resolveConfigPath(root, targetPath) {
 return import_path7.default.isAbsolute(targetPath) ? targetPath : import_path7.default.join(root, targetPath);
 }
-async function runDiff() {
+async function runDiff(options = {}) {
+if (options.safe && options.force) {
+throw new Error("Cannot use --safe and --force flags together. Choose one:\n --safe: Block destructive operations\n --force: Bypass safety checks");
+}
 const root = getProjectRoot();
 const configPath = getConfigPath(root);
 if (!await fileExists(configPath)) {
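Stripped of the chalk theming and readline wiring, the new summary formatter in the hunk above is plain string building. A minimal color-free sketch of the same shape (illustrative, not the bundled function):

```javascript
// Minimal, color-free version of the summary the prompt prints.
function formatFindingsSummary(findings) {
  const lines = [];
  const section = (severity, header) => {
    const subset = findings.filter((f) => f.severity === severity);
    if (subset.length === 0) return;
    if (lines.length > 0) lines.push(""); // blank line between sections
    lines.push(header);
    for (const f of subset) {
      const column = f.column ? `.${f.column}` : "";
      const fromTo = f.from && f.to ? ` (${f.from} -> ${f.to})` : "";
      lines.push(`  - ${f.code}: ${f.table}${column}${fromTo}`);
    }
  };
  section("error", "DESTRUCTIVE OPERATIONS:");
  section("warning", "WARNING OPERATIONS:");
  return lines.join("\n");
}

console.log(formatFindingsSummary([
  { severity: "error", code: "DROP_COLUMN", table: "users", column: "email" },
  { severity: "warning", code: "ALTER_COLUMN_TYPE", table: "users", column: "id", from: "int", to: "bigint" },
]));
```

Keeping the formatter pure like this (findings in, string out) is what lets the real `confirmDestructiveOps` inject test streams for `readConfirmation` separately.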
@@ -2456,12 +2682,47 @@ async function runDiff() {
 }
 const previousState = await loadState2(statePath);
 const diff = await diffSchemas2(previousState, schema);
+if (options.force) {
+forceWarning("Are you sure to use --force? This option will bypass safety checks for destructive operations.");
+}
+if (options.safe && !options.force && diff.operations.length > 0) {
+const findings = await validateSchemaChanges2(previousState, schema);
+const destructiveFindings = findings.filter((f) => f.severity === "error");
+if (destructiveFindings.length > 0) {
+const errorMessages = destructiveFindings.map((f) => {
+const target = f.column ? `${f.table}.${f.column}` : f.table;
+const typeRange = f.from && f.to ? ` (${f.from} -> ${f.to})` : "";
+return ` - ${f.code}: ${target}${typeRange}`;
+}).join("\n");
+throw await createSchemaValidationError(
+`Cannot proceed with --safe flag: Found ${destructiveFindings.length} destructive operation(s):
+${errorMessages}
+
+Remove --safe flag or modify schema to avoid destructive changes.`
+);
+}
+}
+if (!options.safe && !options.force && diff.operations.length > 0) {
+const findings = await validateSchemaChanges2(previousState, schema);
+const riskyFindings = findings.filter((f) => f.severity === "error" || f.severity === "warning");
+if (riskyFindings.length > 0) {
+const confirmed = await confirmDestructiveOps(findings);
+if (!confirmed) {
+if (process.exitCode !== EXIT_CODES.CI_DESTRUCTIVE) {
+process.exitCode = EXIT_CODES.VALIDATION_ERROR;
+}
+return;
+}
+}
+}
 if (diff.operations.length === 0) {
 success("No changes detected");
+process.exitCode = EXIT_CODES.SUCCESS;
 return;
 }
 const sql = await generateSql2(diff, provider, config.sql);
 console.log(sql);
+process.exitCode = EXIT_CODES.SUCCESS;
 }
 
 // src/commands/generate.ts
@@ -2488,6 +2749,9 @@ function resolveConfigPath2(root, targetPath) {
 return import_path8.default.isAbsolute(targetPath) ? targetPath : import_path8.default.join(root, targetPath);
 }
 async function runGenerate(options) {
+if (options.safe && options.force) {
+throw new Error("Cannot use --safe and --force flags together. Choose one:\n --safe: Block destructive operations\n --force: Bypass safety checks");
+}
 const root = getProjectRoot();
 const configPath = getConfigPath(root);
 if (!await fileExists(configPath)) {
@@ -2520,8 +2784,42 @@ async function runGenerate(options) {
 }
 const previousState = await loadState2(statePath);
 const diff = await diffSchemas2(previousState, schema);
+if (options.force) {
+forceWarning("Are you sure to use --force? This option will bypass safety checks for destructive operations.");
+}
+if (options.safe && !options.force && diff.operations.length > 0) {
+const findings = await validateSchemaChanges2(previousState, schema);
+const destructiveFindings = findings.filter((f) => f.severity === "error");
+if (destructiveFindings.length > 0) {
+const errorMessages = destructiveFindings.map((f) => {
+const target = f.column ? `${f.table}.${f.column}` : f.table;
+const typeRange = f.from && f.to ? ` (${f.from} -> ${f.to})` : "";
+return ` - ${f.code}: ${target}${typeRange}`;
+}).join("\n");
+throw await createSchemaValidationError(
+`Cannot proceed with --safe flag: Found ${destructiveFindings.length} destructive operation(s):
+${errorMessages}
+
+Remove --safe flag or modify schema to avoid destructive changes.`
+);
+}
+}
+if (!options.safe && !options.force && diff.operations.length > 0) {
+const findings = await validateSchemaChanges2(previousState, schema);
+const riskyFindings = findings.filter((f) => f.severity === "error" || f.severity === "warning");
+if (riskyFindings.length > 0) {
+const confirmed = await confirmDestructiveOps(findings);
+if (!confirmed) {
+if (process.exitCode !== EXIT_CODES.CI_DESTRUCTIVE) {
+process.exitCode = EXIT_CODES.VALIDATION_ERROR;
+}
+return;
+}
+}
+}
 if (diff.operations.length === 0) {
 info("No changes detected");
+process.exitCode = EXIT_CODES.SUCCESS;
 return;
 }
 const sql = await generateSql2(diff, provider, config.sql);
@@ -2534,6 +2832,7 @@ async function runGenerate(options) {
|
|
|
2534
2832
|
const nextState = await schemaToState2(schema);
|
|
2535
2833
|
await saveState2(statePath, nextState);
|
|
2536
2834
|
success(`SQL generated successfully: ${migrationPath}`);
|
|
2835
|
+
process.exitCode = EXIT_CODES.SUCCESS;
|
|
2537
2836
|
}
|
|
2538
2837
|
|
|
2539
2838
|
// src/commands/import.ts
|
|
@@ -2585,6 +2884,7 @@ async function runImport(inputPath, options = {}) {
2585 2884 |     warning(`...and ${warnings.length - 10} more`);
2586 2885 |   }
2587 2886 | }
     2887 | + process.exitCode = EXIT_CODES.SUCCESS;
2588 2888 | }
2589 2889 |
2590 2890 | // src/commands/init.ts
@@ -2593,24 +2893,19 @@ async function runInit() {
2593 2893 |   const root = getProjectRoot();
2594 2894 |   const schemaForgeDir = getSchemaForgeDir(root);
2595 2895 |   if (await fileExists(schemaForgeDir)) {
2596      | -
2597      | -   error("Please remove it or run init in a different directory");
2598      | -   process.exit(1);
     2896 | +   throw new Error("schemaforge/ directory already exists. Please remove it or run init in a different directory.");
2599 2897 |   }
2600 2898 |   const schemaFilePath = getSchemaFilePath(root);
2601 2899 |   const configPath = getConfigPath(root);
2602 2900 |   const statePath = getStatePath(root);
2603 2901 |   if (await fileExists(schemaFilePath)) {
2604      | -
2605      | -   process.exit(1);
     2902 | +   throw new Error(`${schemaFilePath} already exists`);
2606 2903 |   }
2607 2904 |   if (await fileExists(configPath)) {
2608      | -
2609      | -   process.exit(1);
     2905 | +   throw new Error(`${configPath} already exists`);
2610 2906 |   }
2611 2907 |   if (await fileExists(statePath)) {
2612      | -
2613      | -   process.exit(1);
     2908 | +   throw new Error(`${statePath} already exists`);
2614 2909 |   }
2615 2910 |   info("Initializing schema project...");
2616 2911 |   await ensureDir(schemaForgeDir);
@@ -2649,6 +2944,7 @@ table users {
2649 2944 |   info("Next steps:");
2650 2945 |   info(" 1. Edit schemaforge/schema.sf to define your schema");
2651 2946 |   info(" 2. Run: schema-forge generate");
     2947 | + process.exitCode = EXIT_CODES.SUCCESS;
2652 2948 | }
2653 2949 |
2654 2950 | // src/commands/validate.ts
@@ -2686,14 +2982,17 @@ async function runValidate(options = {}) {
2686 2982 |   const previousState = await loadState2(statePath);
2687 2983 |   const findings = await validateSchemaChanges2(previousState, schema);
2688 2984 |   const report = await toValidationReport2(findings);
     2985 | + if (isCI() && hasDestructiveFindings(findings)) {
     2986 | +   process.exitCode = EXIT_CODES.CI_DESTRUCTIVE;
     2987 | + } else {
     2988 | +   process.exitCode = report.hasErrors ? EXIT_CODES.VALIDATION_ERROR : EXIT_CODES.SUCCESS;
     2989 | + }
2689 2990 |   if (options.json) {
2690 2991 |     console.log(JSON.stringify(report, null, 2));
2691      | -   process.exitCode = report.hasErrors ? 1 : 0;
2692 2992 |     return;
2693 2993 |   }
2694 2994 |   if (findings.length === 0) {
2695 2995 |     success("No destructive changes detected");
2696      | -   process.exitCode = 0;
2697 2996 |     return;
2698 2997 |   }
2699 2998 |   console.log(
@@ -2710,16 +3009,61 @@ async function runValidate(options = {}) {
2710 3009 |     );
2711 3010 |   }
2712 3011 | }
2713      | -
     3012 | + }
     3013 | +
     3014 | + // src/utils/whatsNew.ts
     3015 | + var import_node_os = __toESM(require("os"));
     3016 | + var import_node_path = __toESM(require("path"));
     3017 | + function getCliMetaPath() {
     3018 | +   return import_node_path.default.join(import_node_os.default.homedir(), ".schema-forge", "cli-meta.json");
     3019 | + }
     3020 | + function getReleaseUrl(version) {
     3021 | +   return `https://github.com/xubylele/schema-forge/releases/tag/v${version}`;
     3022 | + }
     3023 | + function shouldShowWhatsNew(argv) {
     3024 | +   if (argv.length === 0) {
     3025 | +     return false;
     3026 | +   }
     3027 | +   if (argv.includes("--help") || argv.includes("-h") || argv.includes("--version") || argv.includes("-V")) {
     3028 | +     return false;
     3029 | +   }
     3030 | +   return true;
     3031 | + }
     3032 | + async function showWhatsNewIfUpdated(currentVersion, argv) {
     3033 | +   if (!shouldShowWhatsNew(argv)) {
     3034 | +     return;
     3035 | +   }
     3036 | +   try {
     3037 | +     const metaPath = getCliMetaPath();
     3038 | +     const meta = await readJsonFile(metaPath, {});
     3039 | +     if (meta.lastSeenVersion === currentVersion) {
     3040 | +       return;
     3041 | +     }
     3042 | +     info(`What's new in schema-forge v${currentVersion}: ${getReleaseUrl(currentVersion)}`);
     3043 | +     await writeJsonFile(metaPath, { lastSeenVersion: currentVersion });
     3044 | +   } catch {
     3045 | +   }
     3046 | + }
     3047 | + async function seedLastSeenVersion(version) {
     3048 | +   const metaPath = getCliMetaPath();
     3049 | +   const exists = await fileExists(metaPath);
     3050 | +   if (!exists) {
     3051 | +     await writeJsonFile(metaPath, { lastSeenVersion: version });
     3052 | +   }
2714 3053 | }
2715 3054 |
3055
|
// src/cli.ts
|
|
2717
3056
|
var program = new import_commander6.Command();
|
|
2718
|
-
program.name("schema-forge").description("CLI tool for schema management and SQL generation").version(package_default.version);
|
|
3057
|
+
program.name("schema-forge").description("CLI tool for schema management and SQL generation").version(package_default.version).option("--safe", "Prevent execution of destructive operations").option("--force", "Force execution by bypassing safety checks and CI detection");
|
|
3058
|
+
function validateFlagExclusivity(options) {
|
|
3059
|
+
if (options.safe && options.force) {
|
|
3060
|
+
throw new Error("Cannot use --safe and --force flags together. Choose one:\n --safe: Block destructive operations\n --force: Bypass safety checks");
|
|
3061
|
+
}
|
|
3062
|
+
}
|
|
2719
3063
|
async function handleError(error2) {
|
|
2720
3064
|
if (await isSchemaValidationError(error2) && error2 instanceof Error) {
|
|
2721
3065
|
error(error2.message);
|
|
2722
|
-
process.exitCode =
|
|
3066
|
+
process.exitCode = EXIT_CODES.VALIDATION_ERROR;
|
|
2723
3067
|
return;
|
|
2724
3068
|
}
|
|
2725
3069
|
if (error2 instanceof Error) {
|
|
@@ -2727,7 +3071,7 @@ async function handleError(error2) {
|
|
|
2727
3071
|
} else {
|
|
2728
3072
|
error("Unexpected error");
|
|
2729
3073
|
}
|
|
2730
|
-
process.exitCode =
|
|
3074
|
+
process.exitCode = EXIT_CODES.VALIDATION_ERROR;
|
|
2731
3075
|
}
|
|
2732
3076
|
program.command("init").description("Initialize a new schema project").action(async () => {
|
|
2733
3077
|
try {
|
|
@@ -2736,16 +3080,20 @@ program.command("init").description("Initialize a new schema project").action(as
|
|
|
2736
3080
|
await handleError(error2);
|
|
2737
3081
|
}
|
|
2738
3082
|
});
|
|
2739
|
-
program.command("generate").description("Generate SQL from schema files").option("--name <string>", "Schema name to generate").action(async (options) => {
|
|
3083
|
+
program.command("generate").description("Generate SQL from schema files. In CI environments (CI=true), exits with code 3 if destructive operations are detected unless --force is used.").option("--name <string>", "Schema name to generate").action(async (options) => {
|
|
2740
3084
|
try {
|
|
2741
|
-
|
|
3085
|
+
const globalOptions = program.opts();
|
|
3086
|
+
validateFlagExclusivity(globalOptions);
|
|
3087
|
+
await runGenerate({ ...options, ...globalOptions });
|
|
2742
3088
|
} catch (error2) {
|
|
2743
3089
|
await handleError(error2);
|
|
2744
3090
|
}
|
|
2745
3091
|
});
|
|
2746
|
-
program.command("diff").description("Compare two schema versions and generate migration SQL").action(async () => {
|
|
3092
|
+
program.command("diff").description("Compare two schema versions and generate migration SQL. In CI environments (CI=true), exits with code 3 if destructive operations are detected unless --force is used.").action(async () => {
|
|
2747
3093
|
try {
|
|
2748
|
-
|
|
3094
|
+
const globalOptions = program.opts();
|
|
3095
|
+
validateFlagExclusivity(globalOptions);
|
|
3096
|
+
await runDiff(globalOptions);
|
|
2749
3097
|
} catch (error2) {
|
|
2750
3098
|
await handleError(error2);
|
|
2751
3099
|
}
|
|
@@ -2757,14 +3105,20 @@ program.command("import").description("Import schema from SQL migrations").argum
|
|
|
2757
3105
|
await handleError(error2);
|
|
2758
3106
|
}
|
|
2759
3107
|
});
|
|
2760
|
-
program.command("validate").description("Detect destructive or risky schema changes").option("--json", "Output structured JSON").action(async (options) => {
|
|
3108
|
+
program.command("validate").description("Detect destructive or risky schema changes. In CI environments (CI=true), exits with code 3 if destructive operations are detected.").option("--json", "Output structured JSON").action(async (options) => {
|
|
2761
3109
|
try {
|
|
2762
3110
|
await runValidate(options);
|
|
2763
3111
|
} catch (error2) {
|
|
2764
3112
|
await handleError(error2);
|
|
2765
3113
|
}
|
|
2766
3114
|
});
|
|
2767
|
-
|
|
2768
|
-
|
|
2769
|
-
|
|
3115
|
+
async function main() {
|
|
3116
|
+
const argv = process.argv.slice(2);
|
|
3117
|
+
await seedLastSeenVersion(package_default.version);
|
|
3118
|
+
await showWhatsNewIfUpdated(package_default.version, argv);
|
|
3119
|
+
program.parse(process.argv);
|
|
3120
|
+
if (!argv.length) {
|
|
3121
|
+
program.outputHelp();
|
|
3122
|
+
}
|
|
2770
3123
|
}
|
|
3124
|
+
void main();
|
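The cli.js changes above repeatedly assign from a shared `EXIT_CODES` map. As a minimal, self-contained sketch of that convention: the constant names (`SUCCESS`, `VALIDATION_ERROR`, `CI_DESTRUCTIVE`) appear in the diff itself, while the numeric values here are inferred from the documented behavior (exit code 1 for validation/safety failures, exit code 3 for destructive operations detected in CI) and are an assumption, not read from the bundle:

```javascript
// Sketch of the exit-code convention introduced in 1.6.1.
// Names are from the diff; values 0/1/3 are inferred from the docs.
const EXIT_CODES = {
  SUCCESS: 0,
  VALIDATION_ERROR: 1,
  CI_DESTRUCTIVE: 3,
};

// Mirrors the guard added in runValidate: a destructive finding in a
// CI environment takes precedence over the ordinary pass/fail result.
function resolveExitCode({ isCI, hasDestructive, hasErrors }) {
  if (isCI && hasDestructive) return EXIT_CODES.CI_DESTRUCTIVE;
  return hasErrors ? EXIT_CODES.VALIDATION_ERROR : EXIT_CODES.SUCCESS;
}

console.log(resolveExitCode({ isCI: true, hasDestructive: true, hasErrors: true }));
console.log(resolveExitCode({ isCI: false, hasDestructive: true, hasErrors: true }));
console.log(resolveExitCode({ isCI: false, hasDestructive: false, hasErrors: false }));
```

This also explains the `process.exitCode !== EXIT_CODES.CI_DESTRUCTIVE` checks in the diff: once the CI code is set, later failure paths avoid downgrading it to a plain validation error.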
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 1  1 | {
 2  2 |   "name": "@xubylele/schema-forge",
 3    | -   "version": "1.5.2",
    3 | +   "version": "1.6.1",
 4  4 |   "description": "Universal migration generator from schema DSL",
 5  5 |   "main": "dist/cli.js",
 6  6 |   "type": "commonjs",
@@ -47,10 +47,10 @@
47 47 |   "devDependencies": {
48 48 |     "@changesets/cli": "^2.29.8",
49 49 |     "@types/node": "^25.2.3",
50    | -     "@xubylele/schema-forge-core": "^1.
   50 | +     "@xubylele/schema-forge-core": "^1.2.0",
51 51 |     "ts-node": "^10.9.2",
52 52 |     "tsup": "^8.5.1",
53 53 |     "typescript": "^5.9.3",
54 54 |     "vitest": "^4.0.18"
55 55 |   }
56    | - }
   56 | + }