supatool 0.4.3 → 0.6.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,6 +1,6 @@
  # Supatool

- **The AI-Native Schema Management CLI for Supabase.** Extract database schemas into LLM-friendly structures, generate `llms.txt` catalogs, and manage seeds without drowning your AI's context.
+ **Schema Management CLI for PostgreSQL.** Works with Cloud SQL, Supabase, and any PostgreSQL database. Extract schemas into LLM-friendly structures, deploy diffs, apply migrations, and export seeds.

  [![npm version](https://img.shields.io/npm/v/supatool.svg)](https://www.npmjs.com/package/supatool)
  [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
@@ -9,10 +9,23 @@

  Modern AI coding tools (Cursor, Claude, MCP) often struggle with large database schemas. Typical issues include:
  - **Token Waste:** Reading the entire schema at once consumes 10k+ tokens.
- - **Lost Context:** Frequent API calls to fetch table details via MCP lead to fragmented reasoning.
+ - **Lost Context:** Frequent API calls to fetch table details lead to fragmented reasoning.
  - **Inaccuracy:** AI misses RLS policies or complex FK relations split across multiple files.

- **Supatool solves this** by reorganizing your Supabase schema into a highly searchable, indexed, and modular structure that helps AI "understand" your DB with minimal tokens.
+ **Supatool solves this** by reorganizing your schema into a highly searchable, indexed, and modular structure that helps AI "understand" your DB with minimal tokens.
+
+ ---
+
+ ## Supported Databases
+
+ Any **PostgreSQL** database:
+
+ - Google Cloud SQL (PostgreSQL)
+ - Supabase
+ - Amazon RDS (PostgreSQL)
+ - Self-hosted PostgreSQL
+
+ Connection strings in both `postgresql://` and `postgres://` formats are accepted.

  ---
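The two URL schemes accepted above speak the same PostgreSQL wire protocol, so tools can treat them interchangeably. A minimal sketch of that idea, using a hypothetical helper that is not part of supatool itself:

```javascript
// Hypothetical helper (assumption, not supatool's actual code): a client
// that only parses the postgresql:// scheme can normalize postgres:// first.
function normalizeConnectionString(raw) {
  return raw.startsWith('postgres://')
    ? 'postgresql://' + raw.slice('postgres://'.length)
    : raw;
}

console.log(normalizeConnectionString('postgres://user:password@host:5432/dbname'));
// logs postgresql://user:password@host:5432/dbname
```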
 
@@ -21,9 +34,9 @@ Modern AI coding tools (Cursor, Claude, MCP) often struggle with large database
  - **Extract (AI-Optimized)** – DDL, RLS, and Triggers are bundled into **one file per table**. AI gets the full picture of a table by opening just one file.
  - **llms.txt Catalog** – Automatically generates a standard `llms.txt` listing all OBJECTS, RELATIONS (FKs), and RPC dependencies. This serves as the "Map" for AI agents.
  - **Multi-Schema Support** – Group objects by schema (e.g., `public`, `agent`, `auth`) with proper schema-qualification in SQL.
+ - **Migrate** – Apply pending `db/migrations/*.sql` files to remote, with tracking and transaction safety.
  - **Seed for AI** – Export table data as JSON. Includes a dedicated `llms.txt` for seeds so AI can see real data structures.
  - **Safe Deploy** – Push local schema changes with `--dry-run` to preview DDL before execution.
- - **CRUD (Deprecated)** – Legacy code generation is still available but discouraged in favor of LLM-native development.

  ---

@@ -31,18 +44,21 @@ Modern AI coding tools (Cursor, Claude, MCP) often struggle with large database

  ```bash
  npm install -g supatool
- # Set your connection string
- export SUPABASE_CONNECTION_STRING="postgresql://postgres:[password]@db.[ref].supabase.co:5432/postgres"

- # Extract schema and generate AI-ready docs
- supatool extract --schema public,auth -o supabase/schemas
+ # Set connection string in .env.local
+ echo 'DB_CONNECTION_STRING=postgresql://user:password@host:5432/dbname' > .env.local

+ # Generate config
+ supatool config:init
+
+ # Extract schema and generate AI-ready docs
+ supatool extract --all -o db/schemas
  ```

  ### Output Structure

  ```text
- supabase/schemas/
+ db/schemas/
  ├── llms.txt # 🗺️ THE ENTRY POINT: Read this first to understand the DB map
  ├── schema_index.json # 🤖 For JSON-parsing agents
  ├── schema_summary.md # 📄 Single-file overview for quick human/AI scanning
@@ -51,18 +67,15 @@ supabase/schemas/
  ├── tables/ # table_name.sql (DDL + RLS + Triggers)
  ├── views/
  └── rpc/
-
  ```

  ---

  ## Best Practices for AI Agents (Cursor / Claude / MCP)

- To get the best results from your AI coding assistant, follow these steps:
-
- 1. **Start with the Map:** Always ask the AI to read `supabase/schemas/llms.txt` first.
- 2. **Targeted Reading:** Once the AI identifies the relevant tables from the catalog, instruct it to open only those specific `.sql` files.
- 3. **Understand Relations:** Use the `RELATIONS` section in `llms.txt` to help the AI write accurate JOINs without reading every file.
+ 1. **Start with the Map:** Always ask the AI to read `db/schemas/llms.txt` first.
+ 2. **Targeted Reading:** Once the AI identifies the relevant tables, instruct it to open only those specific `.sql` files.
+ 3. **Understand Relations:** Use the `RELATIONS` section in `llms.txt` to help the AI write accurate JOINs.
  4. **RPC Context:** If using functions, refer to `RPC_TABLES` in `llms.txt` to know which tables are affected.

  ---
@@ -71,32 +84,83 @@ To get the best results from your AI coding assistant, follow these steps:

  ### Extract

+ Pull schema from remote DB into local files:
+
  ```bash
- supatool extract --all -o supabase/schemas
+ supatool extract --all -o db/schemas
  # Options:
  # --schema public,agent Specify schemas
  # -t "user_*" Filter tables by pattern
- # --force Clear output dir before writing (prevents orphan files)
+ # --force Clear output dir before writing
+ ```
+
+ ### Deploy
+
+ Push local schema changes to remote (diff → migration → apply):
+
+ ```bash
+ supatool deploy --table users --dry-run # preview
+ supatool deploy --table all --dry-run # all tables
+ supatool deploy --table users # confirm before apply
+ ```
+
+ ### Migrate

+ Apply pending SQL files from `db/migrations/` to remote:
+
+ ```bash
+ supatool migrate # apply pending migrations
+ supatool migrate --dry-run # preview only
+ supatool migrate -d path/to/dir # custom directory
  ```

+ Migration files are applied in alphabetical order. Applied files are tracked in a `_supatool_migrations` table (auto-created).
+
  ### Seed

- Export specific tables for AI reference or testing:
+ Export table data as JSON for AI reference or testing:

  ```bash
  supatool seed --tables tables.yaml

+ # tables.yaml format:
+ # public:
+ # - users
+ # - posts
  ```

- *Outputs JSON files and a `llms.txt` index in `supabase/seeds/`.*
+ *Outputs JSON files and a `llms.txt` index in `db/seeds/`.*

- ### Deploy
+ ### Config

  ```bash
- supatool deploy --dry-run
+ supatool config:init # generate supatool.config.json + .env.local template
+ ```
+
+ ---
+
+ ## Configuration
+
+ `supatool.config.json`:
+
+ ```json
+ {
+ "schemaDir": "./db/schemas",
+ "tablePattern": "*",
+ "migration": {
+ "naming": "timestamp",
+ "dir": "db/migrations"
+ }
+ }
+ ```
+
+ `.env.local` (never commit):

  ```
+ DB_CONNECTION_STRING=postgresql://user:password@host:5432/dbname
+ ```
+
+ Legacy env vars are also accepted: `SUPABASE_CONNECTION_STRING`, `DATABASE_URL`.

  ---
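The migrate behavior documented in this hunk (alphabetical ordering, applied files recorded in a `_supatool_migrations` table) boils down to a pending-file selection step. A sketch of that selection under stated assumptions; this is illustrative, not supatool's actual implementation:

```javascript
// Illustrative sketch (assumption): given every filename in the migrations
// directory and the names already recorded in the tracking table, pick what
// still needs to run. Alphabetical order means timestamp-prefixed
// filenames apply oldest-first.
function pendingMigrations(allFiles, appliedNames) {
  const applied = new Set(appliedNames);
  return allFiles
    .filter((name) => name.endsWith('.sql') && !applied.has(name))
    .sort();
}

console.log(pendingMigrations(
  ['20240102_add_posts.sql', '20240101_init.sql', 'notes.txt'],
  ['20240101_init.sql']
));
// logs [ '20240102_add_posts.sql' ]
```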
 
@@ -106,4 +170,4 @@ supatool deploy --dry-run

  ---

- *Developed with ❤️ for the Supabase community. Use at your own risk. Always backup your DB before deployment.*
+ *Works with any PostgreSQL database. Always backup your DB before deployment.*
@@ -4,49 +4,62 @@ exports.modelSchemaHelp = exports.helpText = void 0;
  // See: [src/bin/helptext.ts](./src/bin/helptext.ts) from project root
  // Help text (command section from README, English only)
  exports.helpText = `
- Supatool CLI - Supabase schema extraction and TypeScript CRUD generation
+ Supatool CLI - PostgreSQL schema management (Cloud SQL, Supabase, and any PostgreSQL)

  Usage:
  supatool <command> [options]

  Commands:
- extract Extract database objects from Supabase
+ extract Extract database objects from remote DB into local files
+ deploy Deploy local schema changes to remote (diff → migration → apply)
+ migrate Apply pending db/migrations/*.sql files to remote DB
+ seed Export table data as AI-friendly seed JSON
+ config:init Generate supatool.config.json and .env.local template
  gen:types Generate TypeScript types from model YAML
- gen:crud Generate CRUD TypeScript code from model YAML [deprecated - prefer writing code with LLM]
  gen:docs Generate Markdown documentation from model YAML
- gen:sql Generate SQL (tables, relations, RLS/security) from model YAML
- gen:rls Generate RLS/security SQL from model YAML
- gen:all Generate all outputs from model YAML
- create Generate a template model YAML
- crud Generate CRUD code from Supabase type definitions [deprecated - prefer writing code with LLM]
- deploy Deploy local schema changes to remote (recommended)
- sync Sync local and remote schemas [deprecated - use deploy]
- seed Export selected table data as AI-friendly seed JSON
- config:init Generate configuration template
- help Show help
+ gen:sql Generate SQL (tables, relations, RLS) from model YAML
+ gen:rls Generate RLS policy SQL from model YAML
+ help Show this help

  Common Options:
- -c, --connection <string> Supabase connection string
+ -c, --connection <string> Connection string (postgresql:// or postgres://)
  -o, --output-dir <path> Output directory
- -t, --tables <pattern|path> Table pattern or YAML path
- --schema <schemas> Target schemas (comma-separated)
+ --schema <schemas> Target schemas (comma-separated, default: public)
  --config <path> Configuration file path
  -f, --force Force overwrite

- seed command:
- supatool seed -c <connection> [-t tables.yaml] [-o supabase/seeds]
+ Examples:

- tables.yaml format (schema-grouped):
- public:
- - users
- - posts
- admin:
- - platforms
+ # Extract full schema
+ supatool extract --all -o db/schemas

- Output: supabase/seeds/<timestamp>/<schema>/<table>_seed.json
- supabase/seeds/llms.txt (index for AI)
+ # Multiple schemas and table filter
+ supatool extract --schema public,agent -t "user_*" -o db/schemas

- For details, see the documentation.
+ # Deploy (preview first)
+ supatool deploy --table users --dry-run
+ supatool deploy --table all --dry-run
+
+ # Apply migrations
+ supatool migrate --dry-run
+ supatool migrate
+
+ # Seed export
+ supatool seed --tables tables.yaml -o db/seeds
+
+ # tables.yaml format:
+ # public:
+ # - users
+ # - posts
+
+ Connection string is read from (in priority order):
+ 1. --connection option
+ 2. DB_CONNECTION_STRING (.env.local)
+ 3. SUPABASE_CONNECTION_STRING (legacy)
+ 4. DATABASE_URL (legacy)
+ 5. supatool.config.json
+
+ For details, see https://github.com/idea-garage/supatool
  `;
  // Model Schema Usage
  exports.modelSchemaHelp = `
@@ -65,8 +78,4 @@ Model Schema Usage (schemas/supatool-data.schema.ts):
  } else {
  console.log('Valid!');
  }
-
- - Use with AI:
- const schemaJson = JSON.stringify(SUPATOOL_MODEL_SCHEMA, null, 2);
- // Pass schemaJson to your AI prompt or API
  `;
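The five-step connection-string priority documented in the new help text is a plain fallback chain. A simplified sketch of that order; the real logic lives in the CLI's resolveConfig (in ../sync) and may differ in details:

```javascript
// Simplified sketch of the documented lookup order (assumption: resolveConfig
// behaves like a first-non-empty fallback chain).
function resolveConnectionString(cliOption, env, fileConfig) {
  return (
    cliOption ||
    env.DB_CONNECTION_STRING ||
    env.SUPABASE_CONNECTION_STRING || // legacy
    env.DATABASE_URL ||               // legacy
    (fileConfig && fileConfig.connectionString) ||
    undefined
  );
}

console.log(resolveConnectionString(
  undefined,
  { DATABASE_URL: 'postgresql://u:p@h:5432/db' },
  null
));
// logs postgresql://u:p@h:5432/db
```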
@@ -37,57 +37,62 @@ Object.defineProperty(exports, "__esModule", { value: true });
  // CLI entry point
  // Subcommand support with commander
  const commander_1 = require("commander");
- const index_1 = require("../index");
- const helptext_1 = require("./helptext"); // Import help text from external file
+ const helptext_1 = require("./helptext");
  const package_json_1 = require("../../package.json");
  const modelParser_1 = require("../parser/modelParser");
  const docGenerator_1 = require("../generator/docGenerator");
  const typeGenerator_1 = require("../generator/typeGenerator");
- const crudGenerator_1 = require("../generator/crudGenerator");
  const sqlGenerator_1 = require("../generator/sqlGenerator");
  const rlsGenerator_1 = require("../generator/rlsGenerator");
  const sync_1 = require("../sync");
  const definitionExtractor_1 = require("../sync/definitionExtractor");
  const seedGenerator_1 = require("../sync/seedGenerator");
+ const migrateRemote_1 = require("../sync/migrateRemote");
  const fs = __importStar(require("fs"));
- const path = __importStar(require("path"));
  const program = new commander_1.Command();
  program
  .name('supatool')
  .description('Supatool CLI')
  .version(package_json_1.version);
+ function connectionRequiredError() {
+ console.error('Connection string is required. Set it using one of:');
+ console.error('1. --connection option');
+ console.error('2. DB_CONNECTION_STRING environment variable');
+ console.error('3. SUPABASE_CONNECTION_STRING environment variable (legacy)');
+ console.error('4. DATABASE_URL environment variable (legacy)');
+ console.error('5. supatool.config.json configuration file');
+ process.exit(1);
+ }
  // extract command
  program
  .command('extract')
- .description('Extract and categorize database objects from Supabase')
- .option('-c, --connection <string>', 'Supabase connection string')
- .option('-o, --output-dir <path>', 'Output directory', './supabase/schemas')
+ .description('Extract and categorize database objects (tables, views, RLS, functions, triggers, types)')
+ .option('-c, --connection <string>', 'Connection string (postgresql:// or postgres://)')
+ .option('-o, --output-dir <path>', 'Output directory', './db/schemas')
  .option('-t, --tables <pattern>', 'Table pattern with wildcards', '*')
  .option('--tables-only', 'Extract only table definitions')
  .option('--views-only', 'Extract only view definitions')
- .option('--all', 'Extract all DB objects (tables, views, RLS, functions, triggers, cron, types)')
+ .option('--all', 'Extract all DB objects')
  .option('--no-separate', 'Output all objects in same directory')
  .option('--schema <schemas>', 'Target schemas, comma-separated (default: public)')
+ .option('--all-schemas', 'Target all schemas in the DB (use with -e to exclude some)')
+ .option('-e, --exclude-schema <schemas>', 'Schemas to exclude, comma-separated (use with --all-schemas)')
  .option('--config <path>', 'Configuration file path')
  .option('-f, --force', 'Force overwrite without confirmation')
  .action(async (options) => {
  const config = (0, sync_1.resolveConfig)({
  connectionString: options.connection
  }, options.config);
- if (!config.connectionString) {
- console.error('Connection string is required. Set it using one of:');
- console.error('1. --connection option');
- console.error('2. SUPABASE_CONNECTION_STRING environment variable');
- console.error('3. DATABASE_URL environment variable');
- console.error('4. supatool.config.json configuration file');
- process.exit(1);
- }
+ if (!config.connectionString)
+ connectionRequiredError();
  try {
- // Handle --schema option
- let schemas = ['public']; // default
+ let schemas = ['public'];
  if (options.schema) {
  schemas = options.schema.split(',').map((s) => s.trim());
  }
+ const excludeSchemas = options.excludeSchema
+ ? options.excludeSchema.split(',').map((s) => s.trim())
+ : [];
  await (0, definitionExtractor_1.extractDefinitions)({
  connectionString: config.connectionString,
  outputDir: options.outputDir,
@@ -98,6 +103,8 @@ program
  tablePattern: options.tables,
  force: options.force,
  schemas: schemas,
+ allSchemas: options.allSchemas || false,
+ excludeSchemas,
  version: package_json_1.version
  });
  }
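The new `--all-schemas` / `--exclude-schema` options passed through in this hunk imply a target-schema computation along these lines. A hypothetical sketch; the actual filtering lives inside extractDefinitions, and `dbSchemas` here stands in for the schema list queried from the database:

```javascript
// Hypothetical sketch (assumption): with --all-schemas, start from every
// schema found in the database, then drop anything listed via
// -e/--exclude-schema; otherwise use the explicit --schema list.
function targetSchemas({ schemas, allSchemas, excludeSchemas }, dbSchemas) {
  const base = allSchemas ? dbSchemas : schemas;
  const excluded = new Set(excludeSchemas);
  return base.filter((name) => !excluded.has(name));
}

console.log(targetSchemas(
  { schemas: ['public'], allSchemas: true, excludeSchemas: ['pg_catalog'] },
  ['public', 'agent', 'pg_catalog']
));
// logs [ 'public', 'agent' ]
```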
@@ -124,17 +131,6 @@ program
  (0, typeGenerator_1.generateTypesFromModel)(model, options.out);
  console.log('TypeScript types output:', options.out);
  });
- // gen:crud subcommand
- program
- .command('gen:crud <modelPath>')
- .description('Generate CRUD TypeScript code from model YAML [deprecated - prefer writing code with LLM]')
- .option('-o, --out <dir>', 'Output directory', 'docs/generated/crud')
- .action((modelPath, options) => {
- console.warn('⚠️ gen:crud is deprecated. With LLM development, writing code as needed is often more efficient.');
- const model = (0, modelParser_1.parseModelYaml)(modelPath);
- (0, crudGenerator_1.generateCrudFromModel)(model, options.out);
- console.log('Generated CRUD TypeScript code:', options.out);
- });
  // gen:docs subcommand
  program
  .command('gen:docs <modelPath>')
@@ -154,12 +150,10 @@ program
  .option('-o, --out <path>', 'Output path', 'docs/generated/schema.sql')
  .action((modelPath, options) => {
  const model = (0, modelParser_1.parseModelYaml)(modelPath);
- // Write to temp files first
  const tmpSchema = 'docs/generated/.tmp_schema.sql';
  const tmpRls = 'docs/generated/.tmp_rls.sql';
  (0, sqlGenerator_1.generateSqlFromModel)(model, tmpSchema);
  (0, rlsGenerator_1.generateRlsSqlFromModel)(model, tmpRls);
- // Merge into single file
  const schema = fs.readFileSync(tmpSchema, 'utf-8');
  const rls = fs.readFileSync(tmpRls, 'utf-8');
  fs.writeFileSync(options.out, schema + '\n' + rls);
@@ -177,131 +171,34 @@ program
  (0, rlsGenerator_1.generateRlsSqlFromModel)(model, options.out);
  console.log('RLS/security policy SQL output:', options.out);
  });
- // gen:all subcommand
- program
- .command('gen:all <modelPath>')
- .description('Generate all from model YAML')
- .action((modelPath) => {
- const model = (0, modelParser_1.parseModelYaml)(modelPath);
- (0, typeGenerator_1.generateTypesFromModel)(model, 'docs/generated/types.ts');
- (0, crudGenerator_1.generateCrudFromModel)(model, 'docs/generated/crud');
- (0, docGenerator_1.generateTableDocMarkdown)(model, 'docs/generated/table-doc.md');
- (0, docGenerator_1.generateRelationsMarkdown)(model, 'docs/generated/relations.md');
- console.log('TypeScript types output: docs/generated/types.ts');
- console.log('CRUD code output: docs/generated/crud/');
- console.log('Table doc output: docs/generated/table-doc.md');
- console.log('Relations list output: docs/generated/relations.md');
- });
- // create subcommand
- program
- .command('create <template>')
- .description('Generate template YAML')
- .option('-o, --out <path>', 'Output path', 'docs/model-schema-example.yaml')
- .action((template, options) => {
- const srcPath = path.join(__dirname, '../templates/yaml', `${template}.yaml`);
- const destPath = options.out;
- if (!fs.existsSync(srcPath)) {
- console.error(`Template not found: ${srcPath}`);
- process.exit(1);
- }
- fs.copyFileSync(srcPath, destPath);
- console.log(`Template generated: ${destPath}`);
- });
- // crud command (Supabase types -> CRUD generation)
- program
- .command('crud')
- .description('Generate CRUD from Supabase type definitions [deprecated - prefer writing code with LLM]')
- .option('-i, --input <path>', 'Type definition input path', 'shared/')
- .option('-o, --output <path>', 'CRUD code output path', 'src/integrations/supabase/')
- .option('-t, --tables <names>', 'Target tables (comma-separated)')
- .option('-f, --force', 'Force overwrite output')
- .action((options) => {
- console.warn('⚠️ crud is deprecated. With LLM development, writing code as needed is often more efficient.');
- // Pass argv to main() for CLI args
- (0, index_1.main)();
- });
- // help subcommand
- program
- .command('help')
- .description('Show help')
- .action(() => {
- console.log(helptext_1.helpText);
- });
- // sync command (deprecated)
- program
- .command('sync')
- .description('Synchronize local and remote schemas [deprecated]')
- .option('-c, --connection <string>', 'Supabase connection string')
- .option('-s, --schema-dir <path>', 'Local schema directory', './supabase/schemas')
- .option('-t, --tables <pattern>', 'Table pattern (wildcards supported)', '*')
- .option('-f, --force', 'Force overwrite (no confirmation)')
- .option('--config <path>', 'Configuration file path')
- .action(async (options) => {
- console.warn('⚠️ WARNING: sync command is deprecated.');
- console.warn(' Please use `supatool deploy` command instead.');
- console.warn(' Example: supatool deploy --table users --dry-run');
- console.warn(' Example: supatool deploy --table all --dry-run # all tables');
- console.warn('');
- const config = (0, sync_1.resolveConfig)({
- connectionString: options.connection
- }, options.config);
- if (!config.connectionString) {
- console.error('Connection string is required. Set it using one of:');
- console.error('1. --connection option');
- console.error('2. SUPABASE_CONNECTION_STRING environment variable');
- console.error('3. DATABASE_URL environment variable');
- console.error('4. supatool.config.json configuration file');
- process.exit(1);
- }
- try {
- await (0, sync_1.syncAllTables)({
- connectionString: config.connectionString,
- schemaDir: options.schemaDir,
- tablePattern: options.tables,
- force: options.force
- });
- }
- catch (error) {
- console.error('⚠️ Sync error:', error);
- process.exit(1);
- }
- });
- // deploy command (recommended)
+ // deploy command
  program
  .command('deploy')
  .description('Deploy local schema to remote (diff detection, migration generation, confirm before apply)')
- .option('-c, --connection <string>', 'Supabase connection string')
- .option('-s, --schema-dir <path>', 'Local schema directory', './supabase/schemas')
+ .option('-c, --connection <string>', 'Connection string (postgresql:// or postgres://)')
+ .option('-s, --schema-dir <path>', 'Local schema directory', './db/schemas')
  .option('-t, --table <name>', 'Target table name (specify "all" for all tables)')
  .option('--auto-apply', 'Auto-apply to remote (no confirmation)')
  .option('--dry-run', 'Preview changes only (recommended)')
  .option('--generate-only', 'Generate migration files only (no apply)')
+ .option('--rls <mode>', 'RLS migration mode: rewrite = DROP+CREATE policies (default: skip)', 'skip')
  .option('--config <path>', 'Configuration file path')
  .action(async (options) => {
  const config = (0, sync_1.resolveConfig)({
  connectionString: options.connection
  }, options.config);
- if (!config.connectionString) {
- console.error('Connection string is required. Set it using one of:');
- console.error('1. --connection option');
- console.error('2. SUPABASE_CONNECTION_STRING environment variable');
- console.error('3. DATABASE_URL environment variable');
- console.error('4. supatool.config.json configuration file');
- process.exit(1);
- }
- // Validate table specification
+ if (!config.connectionString)
+ connectionRequiredError();
  if (!options.table) {
  console.error('❌ Table name is required. Use --table <table-name>');
  console.error(' Example: supatool deploy --table users --dry-run');
- console.error(' Example: supatool deploy --table all --dry-run # all tables');
+ console.error(' Example: supatool deploy --table all --dry-run');
  process.exit(1);
  }
  const tablePattern = options.table === 'all' ? '*' : options.table;
- // Option processing
  const isDryRun = options.dryRun || false;
  const isAutoApply = options.autoApply || false;
  const isGenerateOnly = options.generateOnly || false;
- // Conflict check
  const activeOptions = [isDryRun, isAutoApply, isGenerateOnly].filter(Boolean).length;
  if (activeOptions > 1) {
  console.error('❌ --dry-run, --auto-apply, --generate-only cannot be specified simultaneously');
@@ -328,7 +225,9 @@ program
  force: isAutoApply,
  dryRun: isDryRun,
  generateOnly: isGenerateOnly,
- requireConfirmation: !isDryRun && !isAutoApply && !isGenerateOnly
+ requireConfirmation: !isDryRun && !isAutoApply && !isGenerateOnly,
+ migrationConfig: config.migration,
+ rlsMode: options.rls
  });
  }
  catch (error) {
@@ -336,27 +235,46 @@ program
  process.exit(1);
  }
  });
+ // migrate command — apply pending SQL migration files to remote DB
+ program
+ .command('migrate')
+ .description('Apply pending migration files from db/migrations/ to remote DB')
+ .option('-c, --connection <string>', 'Connection string (postgresql:// or postgres://)')
+ .option('-d, --dir <path>', 'Migrations directory', 'db/migrations')
+ .option('--dry-run', 'Preview pending migrations without applying')
+ .option('--config <path>', 'Configuration file path')
+ .action(async (options) => {
+ const config = (0, sync_1.resolveConfig)({
+ connectionString: options.connection
+ }, options.config);
+ if (!config.connectionString)
+ connectionRequiredError();
+ try {
+ await (0, migrateRemote_1.migrateRemote)({
+ connectionString: config.connectionString,
+ migrationsDir: options.dir,
+ dryRun: options.dryRun || false
+ });
+ }
+ catch (error) {
+ console.error('⚠️ Migration error:', error);
+ process.exit(1);
+ }
+ });
  // seed command
  program
  .command('seed')
- .description('Fetch table data from remote DB and generate AI seed JSON')
- .option('-c, --connection <string>', 'Supabase connection string')
+ .description('Fetch table data from remote DB and generate seed JSON')
+ .option('-c, --connection <string>', 'Connection string (postgresql:// or postgres://)')
  .option('-t, --tables <path>', 'Tables list YAML', 'tables.yaml')
- .option('-o, --out <dir>', 'Output directory', 'supabase/seeds')
+ .option('-o, --out <dir>', 'Output directory', 'db/seeds')
  .option('--config <path>', 'Configuration file path')
  .action(async (options) => {
- // Resolve connection
  const config = (0, sync_1.resolveConfig)({
  connectionString: options.connection
  }, options.config);
- if (!config.connectionString) {
- console.error('Connection string is required. Set it using one of:');
- console.error('1. --connection option');
- console.error('2. SUPABASE_CONNECTION_STRING environment variable');
- console.error('3. DATABASE_URL environment variable');
- console.error('4. supatool.config.json configuration file');
- process.exit(1);
- }
+ if (!config.connectionString)
+ connectionRequiredError();
  try {
  await (0, seedGenerator_1.generateSeedsFromRemote)({
  connectionString: config.connectionString,
@@ -369,7 +287,14 @@ program
  process.exit(1);
  }
  });
- // If no subcommand is specified, show helpText only (do not call main)
+ // help subcommand
+ program
+ .command('help')
+ .description('Show help')
+ .action(() => {
+ console.log(helptext_1.helpText);
+ });
+ // If no subcommand is specified, show helpText only
  if (!process.argv.slice(2).length) {
  console.log(helptext_1.helpText);
  process.exit(0);