@xubylele/schema-forge 0.3.1 → 1.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)

  1. package/README.md +35 -23
  2. package/dist/cli.js +197 -88
  3. package/package.json +7 -5
package/README.md CHANGED
@@ -24,8 +24,6 @@ Or use directly with npx (no installation required):
  npx @xubylele/schema-forge init
  ```

- > **Note for contributors:** This is a scoped package. To publish, use `npm publish --access public` or run `npm run publish:public`.
-
  ## Development

  Clone the repository and install dependencies:
@@ -61,7 +59,7 @@ Here's a quick walkthrough to get started with SchemaForge:
  ### 1. Initialize a new project

  ```bash
- schemaforge init
+ schema-forge init
  ```

  This creates:
@@ -98,7 +96,7 @@ table posts {
  ### 3. Generate your first migration

  ```bash
- schemaforge generate
+ schema-forge generate
  ```

  This generates a timestamped SQL migration file with CREATE TABLE statements and updates the state file.
@@ -120,7 +118,7 @@ table users {
  ### 5. Generate a migration for the changes

  ```bash
- schemaforge generate --name "add user avatar"
+ schema-forge generate --name "add user avatar"
  ```

  This generates a new migration file with ALTER TABLE statements.
@@ -128,29 +126,29 @@ This generates a new migration file with ALTER TABLE statements.
  ### 6. Check for pending changes

  ```bash
- schemaforge diff
+ schema-forge diff
  ```

  If your schema matches the state file, you'll see "No changes detected". If there are changes, it will display the SQL that would be generated.

  ## Commands

- ### `schemaforge init`
+ ### `schema-forge init`

  Initialize a new SchemaForge project in the current directory.

  ```bash
- schemaforge init
+ schema-forge init
  ```

  Creates the necessary directory structure and configuration files.

- ### `schemaforge generate`
+ ### `schema-forge generate`

  Generate SQL migration from schema changes.

  ```bash
- schemaforge generate [--name "migration description"]
+ schema-forge generate [--name "migration description"]
  ```

  **Options:**
@@ -159,12 +157,12 @@ schemaforge generate [--name "migration description"]

  Compares your current schema with the tracked state, generates SQL for any changes, and updates the state file.

- ### `schemaforge diff`
+ ### `schema-forge diff`

  Compare your schema with the current state without generating files.

  ```bash
- schemaforge diff
+ schema-forge diff
  ```

  Shows what SQL would be generated if you ran `generate`. Useful for previewing changes.
@@ -244,14 +242,14 @@ table profiles {

  ```bash
  your-project/
- ├── schemaforge/
- ├── schema.sf # Your schema definition (edit this!)
- ├── config.json # Project configuration
- └── state.json # State tracking (auto-generated)
- └── supabase/
- └── migrations/ # Generated SQL migrations
- ├── 20240101120000-initial.sql
- └── 20240101120100-add-user-avatar.sql
+ +-- schemaforge/
+ | +-- schema.sf # Your schema definition (edit this!)
+ | +-- config.json # Project configuration
+ | \-- state.json # State tracking (auto-generated)
+ \-- supabase/
+ \-- migrations/ # Generated SQL migrations
+ +-- 20240101120000-initial.sql
+ \-- 20240101120100-add-user-avatar.sql
  ```

  ## Configuration
@@ -282,10 +280,10 @@ Currently supports:

  A typical development workflow looks like this:

- 1. **Initialize** - `schemaforge init` (one time)
+ 1. **Initialize** - `schema-forge init` (one time)
  2. **Edit schema** - Modify `schemaforge/schema.sf`
- 3. **Preview changes** - `schemaforge diff` (optional)
- 4. **Generate migration** - `schemaforge generate --name "description"`
+ 3. **Preview changes** - `schema-forge diff` (optional)
+ 4. **Generate migration** - `schema-forge generate --name "description"`
  5. **Apply migration** - Run the generated SQL against your database
  6. **Repeat** - Continue editing and generating migrations as needed

@@ -315,6 +313,20 @@ Once your PR is merged to `main`, the release workflow automatically:

  No manual steps required! See [docs/releasing.md](docs/releasing.md) for detailed documentation.

+ ### Publishing Manually
+
+ To publish a scoped package to npm:
+
+ ```bash
+ npm publish --access public
+ ```
+
+ Or use the convenience script:
+
+ ```bash
+ npm run publish:public
+ ```
+
  For detailed guidelines on contributing and automated releases, see [CONTRIBUTING.md](CONTRIBUTING.md) and [docs/releasing.md](docs/releasing.md).

  ## License
package/dist/cli.js CHANGED
@@ -29,22 +29,22 @@ var import_commander4 = require("commander");
  // package.json
  var package_default = {
  name: "@xubylele/schema-forge",
- version: "0.3.1",
+ version: "1.2.0",
  description: "Universal migration generator from schema DSL",
  main: "dist/cli.js",
  type: "commonjs",
  bin: {
- schemaforge: "dist/cli.js"
+ "schema-forge": "dist/cli.js"
  },
  scripts: {
  build: "tsup src/cli.ts --format cjs --dts",
  dev: "ts-node src/cli.ts",
  test: "vitest",
  prepublishOnly: "npm run build",
+ "publish:public": "npm publish --access public",
  changeset: "changeset",
  "version-packages": "changeset version",
- release: "changeset publish",
- "publish:public": "npm publish --access public"
+ release: "changeset publish"
  },
  keywords: [
  "cli",
@@ -69,6 +69,8 @@ var package_default = {
  node: ">=18.0.0"
  },
  dependencies: {
+ boxen: "^8.0.1",
+ chalk: "^5.6.2",
  commander: "^14.0.3"
  },
  devDependencies: {
@@ -85,14 +87,6 @@ var package_default = {
  var import_commander = require("commander");
  var import_path4 = __toESM(require("path"));

- // src/core/errors.ts
- var SchemaValidationError = class extends Error {
- constructor(message) {
- super(message);
- this.name = "SchemaValidationError";
- }
- };
-
  // src/core/diff.ts
  function getTableNamesFromState(state) {
  return new Set(Object.keys(state.tables));
@@ -109,6 +103,9 @@ function getColumnNamesFromSchema(dbColumns) {
  function getSortedNames(names) {
  return Array.from(names).sort((a, b) => a.localeCompare(b));
  }
+ function normalizeColumnType(type) {
+ return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
+ }
  function diffSchemas(oldState, newSchema) {
  const operations = [];
  const oldTableNames = getTableNamesFromState(oldState);
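The `normalizeColumnType` helper added above canonicalizes a type string so that spelling variants compare equal. A minimal standalone sketch (the function body copied verbatim from the hunk above):

```javascript
// Canonicalize a column type: lowercase, collapse whitespace, and strip
// spaces around parentheses and commas, so "VARCHAR ( 255 )" and
// "varchar(255)" normalize to the same string.
function normalizeColumnType(type) {
  return type
    .toLowerCase()
    .trim()
    .replace(/\s+/g, " ")
    .replace(/\s*\(\s*/g, "(")
    .replace(/\s*,\s*/g, ",")
    .replace(/\s*\)\s*/g, ")");
}

console.log(normalizeColumnType("VARCHAR ( 255 )")); // "varchar(255)"
console.log(normalizeColumnType("Numeric( 10 , 2 )")); // "numeric(10,2)"
```

This is why a pure formatting edit in the schema file (say, adding a space inside `varchar(255)`) no longer registers as a type change in the diff.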
@@ -126,6 +123,30 @@ function diffSchemas(oldState, newSchema) {
  const commonTableNames = sortedNewTableNames.filter(
  (tableName) => oldTableNames.has(tableName)
  );
+ for (const tableName of commonTableNames) {
+ const newTable = newSchema.tables[tableName];
+ const oldTable = oldState.tables[tableName];
+ if (!newTable || !oldTable) {
+ continue;
+ }
+ for (const column of newTable.columns) {
+ const previousColumn = oldTable.columns[column.name];
+ if (!previousColumn) {
+ continue;
+ }
+ const previousType = normalizeColumnType(previousColumn.type);
+ const currentType = normalizeColumnType(column.type);
+ if (previousType !== currentType) {
+ operations.push({
+ kind: "column_type_changed",
+ tableName,
+ columnName: column.name,
+ fromType: previousColumn.type,
+ toType: column.type
+ });
+ }
+ }
+ }
  for (const tableName of commonTableNames) {
  const newTable = newSchema.tables[tableName];
  const oldTable = oldState.tables[tableName];
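The new loop above turns a type edit on an existing column into a `column_type_changed` operation. A condensed sketch of the same logic, using the shapes the hunk implies (state columns keyed by name, schema columns as an array); the table and column names here are illustrative only:

```javascript
// Canonicalizer as added in the diff above.
function normalizeColumnType(type) {
  return type.toLowerCase().trim().replace(/\s+/g, " ")
    .replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
}

// Hypothetical before/after: `users.age` widened from int to bigint.
const oldState = { tables: { users: { columns: { age: { type: "int" } } } } };
const newSchema = { tables: { users: { columns: [{ name: "age", type: "bigint" }] } } };

const operations = [];
for (const tableName of Object.keys(newSchema.tables)) {
  const oldTable = oldState.tables[tableName];
  if (!oldTable) continue; // new tables are handled by create_table, not here
  for (const column of newSchema.tables[tableName].columns) {
    const previousColumn = oldTable.columns[column.name];
    if (!previousColumn) continue; // new columns are handled by add_column
    if (normalizeColumnType(previousColumn.type) !== normalizeColumnType(column.type)) {
      operations.push({
        kind: "column_type_changed",
        tableName,
        columnName: column.name,
        fromType: previousColumn.type, // original (un-normalized) spellings
        toType: column.type
      });
    }
  }
}

console.log(operations);
// [{ kind: "column_type_changed", tableName: "users", columnName: "age",
//    fromType: "int", toType: "bigint" }]
```

Note the operation records the original spellings (`fromType`/`toType`), while the comparison itself uses the normalized forms.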
@@ -171,14 +192,22 @@ function diffSchemas(oldState, newSchema) {
  return { operations };
  }

+ // src/core/errors.ts
+ var SchemaValidationError = class extends Error {
+ constructor(message) {
+ super(message);
+ this.name = "SchemaValidationError";
+ }
+ };
+
  // src/core/fs.ts
  var import_fs = require("fs");
  var import_path = __toESM(require("path"));
  async function ensureDir(dirPath) {
  try {
  await import_fs.promises.mkdir(dirPath, { recursive: true });
- } catch (error) {
- throw new Error(`Failed to create directory ${dirPath}: ${error}`);
+ } catch (error2) {
+ throw new Error(`Failed to create directory ${dirPath}: ${error2}`);
  }
  }
  async function fileExists(filePath) {
@@ -192,8 +221,8 @@ async function fileExists(filePath) {
  async function readTextFile(filePath) {
  try {
  return await import_fs.promises.readFile(filePath, "utf-8");
- } catch (error) {
- throw new Error(`Failed to read file ${filePath}: ${error}`);
+ } catch (error2) {
+ throw new Error(`Failed to read file ${filePath}: ${error2}`);
  }
  }
  async function writeTextFile(filePath, content) {
@@ -201,8 +230,8 @@ async function writeTextFile(filePath, content) {
  const dir = import_path.default.dirname(filePath);
  await ensureDir(dir);
  await import_fs.promises.writeFile(filePath, content, "utf-8");
- } catch (error) {
- throw new Error(`Failed to write file ${filePath}: ${error}`);
+ } catch (error2) {
+ throw new Error(`Failed to write file ${filePath}: ${error2}`);
  }
  }
  async function readJsonFile(filePath, fallback) {
@@ -213,16 +242,16 @@ async function readJsonFile(filePath, fallback) {
  }
  const content = await readTextFile(filePath);
  return JSON.parse(content);
- } catch (error) {
- throw new Error(`Failed to read JSON file ${filePath}: ${error}`);
+ } catch (error2) {
+ throw new Error(`Failed to read JSON file ${filePath}: ${error2}`);
  }
  }
  async function writeJsonFile(filePath, data) {
  try {
  const content = JSON.stringify(data, null, 2);
  await writeTextFile(filePath, content);
- } catch (error) {
- throw new Error(`Failed to write JSON file ${filePath}: ${error}`);
+ } catch (error2) {
+ throw new Error(`Failed to write JSON file ${filePath}: ${error2}`);
  }
  }
  async function findFiles(dirPath, pattern) {
@@ -238,12 +267,53 @@ async function findFiles(dirPath, pattern) {
  results.push(fullPath);
  }
  }
- } catch (error) {
- throw new Error(`Failed to find files in ${dirPath}: ${error}`);
+ } catch (error2) {
+ throw new Error(`Failed to find files in ${dirPath}: ${error2}`);
  }
  return results;
  }

+ // src/utils/output.ts
+ var import_boxen = __toESM(require("boxen"));
+ var import_chalk = require("chalk");
+ var isInteractive = Boolean(process.stdout?.isTTY);
+ var colorsEnabled = isInteractive && process.env.FORCE_COLOR !== "0" && !("NO_COLOR" in process.env);
+ var color = new import_chalk.Chalk({ level: colorsEnabled ? 3 : 0 });
+ var theme = {
+ primary: color.cyanBright,
+ success: color.hex("#00FF88"),
+ warning: color.hex("#FFD166"),
+ error: color.hex("#EF476F"),
+ accent: color.magentaBright
+ };
+ function success(message) {
+ const text = theme.success(`[OK] ${message}`);
+ if (!isInteractive) {
+ console.log(text);
+ return;
+ }
+ try {
+ console.log(
+ (0, import_boxen.default)(text, {
+ padding: 1,
+ borderColor: "cyan",
+ borderStyle: "round"
+ })
+ );
+ } catch {
+ console.log(text);
+ }
+ }
+ function info(message) {
+ console.log(theme.primary(message));
+ }
+ function warning(message) {
+ console.warn(theme.warning(`[WARN] ${message}`));
+ }
+ function error(message) {
+ console.error(theme.error(`[ERROR] ${message}`));
+ }
+
  // src/core/parser.ts
  var SchemaParser = class {
  /**
@@ -253,8 +323,8 @@ var SchemaParser = class {
  try {
  const schema = await readJsonFile(filePath, {});
  return this.normalizeSchema(schema);
- } catch (error) {
- throw new Error(`Failed to parse schema file ${filePath}: ${error}`);
+ } catch (error2) {
+ throw new Error(`Failed to parse schema file ${filePath}: ${error2}`);
  }
  }
  /**
@@ -267,8 +337,9 @@ var SchemaParser = class {
  try {
  const schema = await this.parseSchemaFile(file);
  schemas.push(schema);
- } catch (error) {
- console.warn(`Warning: Could not parse ${file}:`, error);
+ } catch (error2) {
+ const reason = error2 instanceof Error ? error2.message : String(error2);
+ warning(`Could not parse ${file}: ${reason}`);
  }
  }
  return schemas;
@@ -286,7 +357,7 @@ var SchemaParser = class {
  for (const table of schema.tables) {
  const existingIndex = mergedTables.findIndex((t) => t.name === table.name);
  if (existingIndex >= 0) {
- console.warn(`Warning: Duplicate table '${table.name}' found, using first occurrence`);
+ warning(`Duplicate table '${table.name}' found, using first occurrence`);
  } else {
  mergedTables.push(table);
  }
@@ -330,8 +401,8 @@ var SchemaParser = class {
  try {
  const schema = JSON.parse(jsonString);
  return this.normalizeSchema(schema);
- } catch (error) {
- throw new Error(`Failed to parse schema JSON: ${error}`);
+ } catch (error2) {
+ throw new Error(`Failed to parse schema JSON: ${error2}`);
  }
  }
  };
@@ -340,15 +411,26 @@ function parseSchema(source) {
  const lines = source.split("\n");
  const tables = {};
  let currentLine = 0;
- const validColumnTypes = /* @__PURE__ */ new Set([
+ const validBaseColumnTypes = /* @__PURE__ */ new Set([
  "uuid",
  "varchar",
  "text",
  "int",
+ "bigint",
  "boolean",
  "timestamptz",
  "date"
  ]);
+ function normalizeColumnType2(type) {
+ return type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
+ }
+ function isValidColumnType2(type) {
+ const normalizedType = normalizeColumnType2(type);
+ if (validBaseColumnTypes.has(normalizedType)) {
+ return true;
+ }
+ return /^varchar\(\d+\)$/.test(normalizedType) || /^numeric\(\d+,\d+\)$/.test(normalizedType);
+ }
  function cleanLine(line) {
  const commentIndex = line.search(/(?:\/\/|#)/);
  if (commentIndex !== -1) {
@@ -372,9 +454,11 @@ function parseSchema(source) {
  throw new Error(`Line ${lineNum}: Invalid column definition. Expected: <name> <type> [modifiers...]`);
  }
  const colName = tokens[0];
- const colType = tokens[1];
- if (!validColumnTypes.has(colType)) {
- throw new Error(`Line ${lineNum}: Invalid column type '${colType}'. Valid types: ${Array.from(validColumnTypes).join(", ")}`);
+ const colType = normalizeColumnType2(tokens[1]);
+ if (!isValidColumnType2(colType)) {
+ throw new Error(
+ `Line ${lineNum}: Invalid column type '${tokens[1]}'. Valid types: ${Array.from(validBaseColumnTypes).join(", ")}, varchar(n), numeric(p,s)`
+ );
  }
  const column = {
  name: colName,
@@ -444,8 +528,8 @@ function parseSchema(source) {
  try {
  const column = parseColumn(cleaned, lineIdx + 1);
  columns.push(column);
- } catch (error) {
- throw error;
+ } catch (error2) {
+ throw error2;
  }
  lineIdx++;
  }
@@ -828,7 +912,23 @@ var SchemaValidator = class {
  }
  };
  var defaultValidator = new SchemaValidator();
- var VALID_COLUMN_TYPES = ["uuid", "varchar", "text", "int", "boolean", "timestamptz", "date"];
+ var VALID_BASE_COLUMN_TYPES = [
+ "uuid",
+ "varchar",
+ "text",
+ "int",
+ "bigint",
+ "boolean",
+ "timestamptz",
+ "date"
+ ];
+ function isValidColumnType(type) {
+ const normalizedType = type.toLowerCase().trim().replace(/\s+/g, " ").replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
+ if (VALID_BASE_COLUMN_TYPES.includes(normalizedType)) {
+ return true;
+ }
+ return /^varchar\(\d+\)$/.test(normalizedType) || /^numeric\(\d+,\d+\)$/.test(normalizedType);
+ }
  function validateSchema(schema) {
  validateDuplicateTables(schema);
  for (const tableName in schema.tables) {
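The validator's new `isValidColumnType` accepts the fixed base types plus parameterized `varchar(n)` and `numeric(p,s)` after normalization. A standalone sketch (logic copied from the hunk above):

```javascript
// Base types plus two parameterized forms, matched after normalization.
const VALID_BASE_COLUMN_TYPES = ["uuid", "varchar", "text", "int", "bigint", "boolean", "timestamptz", "date"];

function isValidColumnType(type) {
  const normalizedType = type.toLowerCase().trim().replace(/\s+/g, " ")
    .replace(/\s*\(\s*/g, "(").replace(/\s*,\s*/g, ",").replace(/\s*\)\s*/g, ")");
  if (VALID_BASE_COLUMN_TYPES.includes(normalizedType)) {
    return true;
  }
  // Parameterized types: varchar(n) and numeric(p,s) with integer arguments.
  return /^varchar\(\d+\)$/.test(normalizedType) || /^numeric\(\d+,\d+\)$/.test(normalizedType);
}

console.log(isValidColumnType("VARCHAR ( 255 )")); // true
console.log(isValidColumnType("numeric(10,2)")); // true
console.log(isValidColumnType("jsonb")); // false
```

The anchored regexes mean arguments must be plain integers; `varchar()` or `numeric(10)` are rejected.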
@@ -841,7 +941,7 @@ function validateDuplicateTables(schema) {
  const seen = /* @__PURE__ */ new Set();
  for (const tableName of tableNames) {
  if (seen.has(tableName)) {
- throw new Error(`Tabla duplicada: '${tableName}'`);
+ throw new Error(`Duplicate table: '${tableName}'`);
  }
  seen.add(tableName);
  }
@@ -851,15 +951,15 @@ function validateTableColumns(tableName, table, allTables) {
  let primaryKeyCount = 0;
  for (const column of table.columns) {
  if (columnNames.has(column.name)) {
- throw new Error(`Tabla '${tableName}': columna duplicada '${column.name}'`);
+ throw new Error(`Table '${tableName}': duplicate column '${column.name}'`);
  }
  columnNames.add(column.name);
  if (column.primaryKey) {
  primaryKeyCount++;
  }
- if (!VALID_COLUMN_TYPES.includes(column.type)) {
+ if (!isValidColumnType(column.type)) {
  throw new Error(
- `Tabla '${tableName}', columna '${column.name}': tipo '${column.type}' no es v\xE1lido. Tipos soportados: ${VALID_COLUMN_TYPES.join(", ")}`
+ `Table '${tableName}', column '${column.name}': type '${column.type}' is not valid. Supported types: ${VALID_BASE_COLUMN_TYPES.join(", ")}, varchar(n), numeric(p,s)`
  );
  }
  if (column.foreignKey) {
@@ -867,20 +967,20 @@ function validateTableColumns(tableName, table, allTables) {
  const fkColumn = column.foreignKey.column;
  if (!allTables[fkTable]) {
  throw new Error(
- `Tabla '${tableName}', columna '${column.name}': tabla referenciada '${fkTable}' no existe`
+ `Table '${tableName}', column '${column.name}': referenced table '${fkTable}' does not exist`
  );
  }
  const referencedTable = allTables[fkTable];
  const columnExists = referencedTable.columns.some((col) => col.name === fkColumn);
  if (!columnExists) {
  throw new Error(
- `Tabla '${tableName}', columna '${column.name}': tabla '${fkTable}' no tiene columna '${fkColumn}'`
+ `Table '${tableName}', column '${column.name}': table '${fkTable}' does not have column '${fkColumn}'`
  );
  }
  }
  }
  if (primaryKeyCount > 1) {
- throw new Error(`Tabla '${tableName}': solo puede tener una primary key (encontradas ${primaryKeyCount})`);
+ throw new Error(`Table '${tableName}': can only have one primary key (found ${primaryKeyCount})`);
  }
  }

@@ -901,6 +1001,12 @@ function generateOperation(operation, provider, sqlConfig) {
  return generateCreateTable(operation.table, provider, sqlConfig);
  case "drop_table":
  return generateDropTable(operation.tableName);
+ case "column_type_changed":
+ return generateAlterColumnType(
+ operation.tableName,
+ operation.columnName,
+ operation.toType
+ );
  case "add_column":
  return generateAddColumn(operation.tableName, operation.column, provider, sqlConfig);
  case "drop_column":
@@ -952,6 +1058,9 @@ function generateAddColumn(tableName, column, provider, sqlConfig) {
  function generateDropColumn(tableName, columnName) {
  return `ALTER TABLE ${tableName} DROP COLUMN ${columnName};`;
  }
+ function generateAlterColumnType(tableName, columnName, newType) {
+ return `ALTER TABLE ${tableName} ALTER COLUMN ${columnName} TYPE ${newType} USING ${columnName}::${newType};`;
+ }

  // src/commands/diff.ts
  var REQUIRED_CONFIG_FIELDS = ["schemaFile", "stateFile"];
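The new `generateAlterColumnType` above emits an `ALTER COLUMN ... TYPE ... USING` cast (Postgres syntax), so existing rows are converted rather than requiring an empty column. A standalone sketch of the emitted SQL (function body copied from the hunk, with an illustrative table):

```javascript
// Emit a type change as an ALTER with an explicit USING cast, so Postgres
// can convert existing values (e.g. int -> bigint) in place.
function generateAlterColumnType(tableName, columnName, newType) {
  return `ALTER TABLE ${tableName} ALTER COLUMN ${columnName} TYPE ${newType} USING ${columnName}::${newType};`;
}

console.log(generateAlterColumnType("users", "age", "bigint"));
// ALTER TABLE users ALTER COLUMN age TYPE bigint USING age::bigint;
```

Note the cast target is the raw `toType` from the operation, so whatever spelling appears in the schema file is what lands in the migration.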
@@ -962,7 +1071,7 @@ async function runDiff() {
  const root = getProjectRoot();
  const configPath = getConfigPath(root);
  if (!await fileExists(configPath)) {
- throw new Error('SchemaForge project not initialized. Run "schemaforge init" first.');
+ throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
  }
  const config = await readJsonFile(configPath, {});
  for (const field of REQUIRED_CONFIG_FIELDS) {
@@ -981,16 +1090,16 @@ async function runDiff() {
  const schema = parseSchema(schemaSource);
  try {
  validateSchema(schema);
- } catch (error) {
- if (error instanceof Error) {
- throw new SchemaValidationError(error.message);
+ } catch (error2) {
+ if (error2 instanceof Error) {
+ throw new SchemaValidationError(error2.message);
  }
- throw error;
+ throw error2;
  }
  const previousState = await loadState(statePath);
  const diff = diffSchemas(previousState, schema);
  if (diff.operations.length === 0) {
- console.log("No changes detected");
+ success("No changes detected");
  return;
  }
  const sql = generateSql(diff, provider, config.sql);
@@ -1024,7 +1133,7 @@ async function runGenerate(options) {
  const root = getProjectRoot();
  const configPath = getConfigPath(root);
  if (!await fileExists(configPath)) {
- throw new Error('SchemaForge project not initialized. Run "schemaforge init" first.');
+ throw new Error('SchemaForge project not initialized. Run "schema-forge init" first.');
  }
  const config = await readJsonFile(configPath, {});
  for (const field of REQUIRED_CONFIG_FIELDS2) {
@@ -1041,23 +1150,23 @@ async function runGenerate(options) {
  }
  const provider = config.provider ?? "postgres";
  if (!config.provider) {
- console.log("Provider not set; defaulting to postgres.");
+ info("Provider not set; defaulting to postgres.");
  }
- console.log("Generating SQL...");
+ info("Generating SQL...");
  const schemaSource = await readTextFile(schemaPath);
  const schema = parseSchema(schemaSource);
  try {
  validateSchema(schema);
- } catch (error) {
- if (error instanceof Error) {
- throw new SchemaValidationError(error.message);
+ } catch (error2) {
+ if (error2 instanceof Error) {
+ throw new SchemaValidationError(error2.message);
  }
- throw error;
+ throw error2;
  }
  const previousState = await loadState(statePath);
  const diff = diffSchemas(previousState, schema);
  if (diff.operations.length === 0) {
- console.log("No changes detected");
+ info("No changes detected");
  return;
  }
  const sql = generateSql(diff, provider, config.sql);
@@ -1069,7 +1178,7 @@ async function runGenerate(options) {
  await writeTextFile(migrationPath, sql + "\n");
  const nextState = await schemaToState(schema);
  await saveState(statePath, nextState);
- console.log(`\u2713 SQL generated successfully: ${migrationPath}`);
+ success(`SQL generated successfully: ${migrationPath}`);
  }

  // src/commands/init.ts
@@ -1078,29 +1187,29 @@ async function runInit() {
  const root = getProjectRoot();
  const schemaForgeDir = getSchemaForgeDir(root);
  if (await fileExists(schemaForgeDir)) {
- console.error("Error: schemaforge/ directory already exists");
- console.error("Please remove it or run init in a different directory");
+ error("schemaforge/ directory already exists");
+ error("Please remove it or run init in a different directory");
  process.exit(1);
  }
  const schemaFilePath = getSchemaFilePath(root);
  const configPath = getConfigPath(root);
  const statePath = getStatePath(root);
  if (await fileExists(schemaFilePath)) {
- console.error(`Error: ${schemaFilePath} already exists`);
+ error(`${schemaFilePath} already exists`);
  process.exit(1);
  }
  if (await fileExists(configPath)) {
- console.error(`Error: ${configPath} already exists`);
+ error(`${configPath} already exists`);
  process.exit(1);
  }
  if (await fileExists(statePath)) {
- console.error(`Error: ${statePath} already exists`);
+ error(`${statePath} already exists`);
  process.exit(1);
  }
- console.log("Initializing schema project...");
+ info("Initializing schema project...");
  await ensureDir(schemaForgeDir);
  const schemaContent = `# SchemaForge schema definition
- # Run: schemaforge generate
+ # Run: schema-forge generate

  table users {
  id uuid pk
@@ -1108,7 +1217,7 @@ table users {
  }
  `;
  await writeTextFile(schemaFilePath, schemaContent);
- console.log(`\u2713 Created ${schemaFilePath}`);
+ success(`Created ${schemaFilePath}`);
  const config = {
  provider: "supabase",
  outputDir: "supabase/migrations",
@@ -1120,57 +1229,57 @@ table users {
  }
  };
  await writeJsonFile(configPath, config);
- console.log(`\u2713 Created ${configPath}`);
+ success(`Created ${configPath}`);
  const state = {
  version: 1,
  tables: {}
  };
  await writeJsonFile(statePath, state);
- console.log(`\u2713 Created ${statePath}`);
+ success(`Created ${statePath}`);
  const outputDir = "supabase/migrations";
  await ensureDir(outputDir);
- console.log(`\u2713 Created ${outputDir}`);
- console.log("\n\u2713 Project initialized successfully");
- console.log("Next steps:");
- console.log(" 1. Edit schemaforge/schema.sf to define your schema");
- console.log(" 2. Run: schemaforge generate");
+ success(`Created ${outputDir}`);
+ success("Project initialized successfully");
+ info("Next steps:");
+ info(" 1. Edit schemaforge/schema.sf to define your schema");
+ info(" 2. Run: schema-forge generate");
  }

  // src/cli.ts
  var program = new import_commander4.Command();
- program.name("schemaforge").description("CLI tool for schema management and SQL generation").version(package_default.version);
- function handleError(error) {
- if (error instanceof SchemaValidationError) {
- console.error(error.message);
+ program.name("schema-forge").description("CLI tool for schema management and SQL generation").version(package_default.version);
+ function handleError(error2) {
+ if (error2 instanceof SchemaValidationError) {
+ error(error2.message);
  process.exitCode = 2;
  return;
  }
- if (error instanceof Error) {
- console.error(error.message);
+ if (error2 instanceof Error) {
+ error(error2.message);
  } else {
- console.error("Unexpected error");
+ error("Unexpected error");
  }
  process.exitCode = 1;
  }
  program.command("init").description("Initialize a new schema project").action(async () => {
  try {
  await runInit();
- } catch (error) {
- handleError(error);
+ } catch (error2) {
+ handleError(error2);
  }
  });
  program.command("generate").description("Generate SQL from schema files").option("--name <string>", "Schema name to generate").action(async (options) => {
  try {
  await runGenerate(options);
- } catch (error) {
- handleError(error);
+ } catch (error2) {
+ handleError(error2);
  }
  });
  program.command("diff").description("Compare two schema versions and generate migration SQL").action(async () => {
  try {
  await runDiff();
- } catch (error) {
- handleError(error);
+ } catch (error2) {
+ handleError(error2);
  }
  });
  program.parse(process.argv);
package/package.json CHANGED
@@ -1,21 +1,21 @@
  {
  "name": "@xubylele/schema-forge",
- "version": "0.3.1",
+ "version": "1.2.0",
  "description": "Universal migration generator from schema DSL",
  "main": "dist/cli.js",
  "type": "commonjs",
  "bin": {
- "schemaforge": "dist/cli.js"
+ "schema-forge": "dist/cli.js"
  },
  "scripts": {
  "build": "tsup src/cli.ts --format cjs --dts",
  "dev": "ts-node src/cli.ts",
  "test": "vitest",
  "prepublishOnly": "npm run build",
+ "publish:public": "npm publish --access public",
  "changeset": "changeset",
  "version-packages": "changeset version",
- "release": "changeset publish",
- "publish:public": "npm publish --access public"
+ "release": "changeset publish"
  },
  "keywords": [
  "cli",
@@ -40,6 +40,8 @@
  "node": ">=18.0.0"
  },
  "dependencies": {
+ "boxen": "^8.0.1",
+ "chalk": "^5.6.2",
  "commander": "^14.0.3"
  },
  "devDependencies": {
@@ -50,4 +52,4 @@
  "typescript": "^5.9.3",
  "vitest": "^4.0.18"
  }
- }
+ }