supatool 0.4.0 → 0.4.2

package/README.md CHANGED
@@ -1,97 +1,109 @@
  # Supatool
 
- CLI for Supabase: **extract** schema to files with **llms.txt** for LLMs, and **seed** export as AI-friendly JSON. Deploy and CRUD (deprecated) also available.
+ **The AI-Native Schema Management CLI for Supabase.** Extract database schemas into LLM-friendly structures, generate `llms.txt` catalogs, and manage seeds without drowning your AI's context.
 
- ## Features
+ [![npm version](https://img.shields.io/npm/v/supatool.svg)](https://www.npmjs.com/package/supatool)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
 
- - **Extract** – Tables, views, RLS, functions, triggers from DB into files. One file per table (DDL + RLS + triggers). Multi-schema: `--schema public,agent` → `schemas/public/`, `schemas/agent/`.
- - **llms.txt** – Catalog with OBJECTS, RELATIONS, RPC_TABLES, ALL_SCHEMAS. README in output links to it.
- - **Seed** – Export table data to JSON; llms.txt index in `supabase/seeds/`.
- - **Deploy** – Push local schema to remote (`supatool deploy --dry-run`).
- - CRUD code gen is **deprecated** (`supatool crud`, `gen:crud`); prefer writing code with an LLM.
+ ## Why Supatool?
 
- See [CHANGELOG.md](./CHANGELOG.md) for version history.
+ Modern AI coding tools (Cursor, Claude, MCP) often struggle with large database schemas. Typical issues include:
+ - **Token Waste:** Reading the entire schema at once consumes 10k+ tokens.
+ - **Lost Context:** Frequent API calls to fetch table details via MCP lead to fragmented reasoning.
+ - **Inaccuracy:** AI misses RLS policies or complex FK relations split across multiple files.
 
- ## Install
+ **Supatool solves this** by reorganizing your Supabase schema into a highly searchable, indexed, and modular structure that helps AI "understand" your DB with minimal tokens.
 
- ```bash
- npm install -g supatool
- # or: yarn global add supatool | pnpm add -g supatool
- ```
+ ---
 
- ## Extract
+ ## Key Features
 
- Set connection (e.g. in `.env.local`):
+ - **Extract (AI-Optimized)** – DDL, RLS, and triggers are bundled into **one file per table**, so AI gets the full picture of a table by opening just one file.
+ - **llms.txt Catalog** – Automatically generates a standard `llms.txt` listing all OBJECTS, RELATIONS (FKs), and RPC dependencies. This serves as the "map" for AI agents.
+ - **Multi-Schema Support** – Group objects by schema (e.g., `public`, `agent`, `auth`) with proper schema qualification in SQL.
+ - **Seed for AI** – Export table data as JSON. Includes a dedicated `llms.txt` for seeds so AI can see real data structures.
+ - **Safe Deploy** – Push local schema changes with `--dry-run` to preview DDL before execution.
+ - **CRUD (Deprecated)** – Legacy code generation is still available but discouraged in favor of LLM-native development.
 
- ```bash
- echo "SUPABASE_CONNECTION_STRING=postgresql://..." >> .env.local
- ```
+ ---
 
- Extract all objects:
+ ## Quick Start
 
  ```bash
- supatool extract --all -o supabase/schemas
- # Optional: --schema public,agent | -t "user_*" | -c "postgresql://..."
- # With --force: clears output dir then writes (no orphan files).
- ```
+ npm install -g supatool
+ # Set your connection string
+ export SUPABASE_CONNECTION_STRING="postgresql://postgres:[password]@db.[ref].supabase.co:5432/postgres"
 
- **Output:**
+ # Extract schema and generate AI-ready docs
+ supatool extract --schema public,auth -o supabase/schemas
 
  ```
+
+ ### Output Structure
+
+ ```text
  supabase/schemas/
- ├── README.md           # Points to llms.txt
- ├── llms.txt            # OBJECTS, RELATIONS, RPC_TABLES, ALL_SCHEMAS (read this first for AI)
- ├── schema_index.json   # Same as llms.txt, for agents that parse JSON
- ├── schema_summary.md   # One-file overview: tables, relations, RPCs
- ├── tables/
- ├── views/
- ├── rpc/
- ├── cron/
- └── types/
+ ├── llms.txt            # 🗺️ THE ENTRY POINT: Read this first to understand the DB map
+ ├── schema_index.json   # 🤖 For JSON-parsing agents
+ ├── schema_summary.md   # 📄 Single-file overview for quick human/AI scanning
+ ├── README.md           # Navigation guide
+ └── [schema_name]/
+     ├── tables/         # table_name.sql (DDL + RLS + Triggers)
+     ├── views/
+     └── rpc/
+
  ```
 
- Multi-schema: `schemas/public/`, `schemas/agent/`, etc., each with tables/, views/, rpc/.
+ ---
 
- **Env:** `SUPABASE_CONNECTION_STRING` or `DATABASE_URL`; optional `SUPATOOL_MAX_CONCURRENT` (default 20).
+ ## Best Practices for AI Agents (Cursor / Claude / MCP)
 
- ## Seed
+ To get the best results from your AI coding assistant, follow these steps:
 
- Export selected tables as JSON for AI/reference:
+ 1. **Start with the Map:** Always ask the AI to read `supabase/schemas/llms.txt` first.
+ 2. **Targeted Reading:** Once the AI identifies the relevant tables from the catalog, instruct it to open only those specific `.sql` files.
+ 3. **Understand Relations:** Use the `RELATIONS` section in `llms.txt` to help the AI write accurate JOINs without reading every file.
+ 4. **RPC Context:** If using functions, refer to `RPC_TABLES` in `llms.txt` to see which tables are affected.
+
+ ---
+
+ ## Commands
+
+ ### Extract
 
  ```bash
- supatool seed --tables tables.yaml --connection "postgresql://..."
+ supatool extract --all -o supabase/schemas
+ # Options:
+ #   --schema public,agent   Specify schemas
+ #   -t "user_*"             Filter tables by pattern
+ #   --force                 Clear output dir before writing (prevents orphan files)
+
  ```
 
- `tables.yaml`:
+ ### Seed
 
- ```yaml
- tables:
-   - users
-   - public.orders
- ```
+ Export specific tables for AI reference or testing:
 
- - Output: `supabase/seeds/<timestamp>_supatool/{table}_seed.json`
- - `llms.txt` is written in `supabase/seeds/` listing files and row counts.
+ ```bash
+ supatool seed --tables tables.yaml
+
+ ```
 
- Do not include sensitive data in seed files.
+ *Outputs JSON files and an `llms.txt` index in `supabase/seeds/`.*
 
- ## For AI / coding agents (Claude CLI, Cursor, etc.)
+ ### Deploy
 
- - **Entry point:** Read `supabase/schemas/llms.txt` first. It lists all objects, relations, RPC→tables, and schemas.
- - **Then** open only the files you need (paths under OBJECTS). Use RELATIONS for joins; RPC_TABLES to see which RPCs touch which tables.
- - **Optional:** `schema_index.json` (same data as llms.txt) and `schema_summary.md` (one-file overview) are written when you run `supatool extract`.
- - **Seeds:** `supabase/seeds/llms.txt` lists seed JSON files; it references the schema catalog at `../schemas/llms.txt` when present.
+ ```bash
+ supatool deploy --dry-run
 
- ## Other commands
+ ```
 
- - `supatool help` – List commands
- - `supatool deploy --dry-run` – Preview deploy
- - CRUD: `supatool crud`, `gen:crud` (deprecated)
+ ---
 
  ## Repository
 
- [https://github.com/idea-garage/supatool](https://github.com/idea-garage/supatool) · [npm](https://www.npmjs.com/package/supatool)
+ [GitHub](https://github.com/idea-garage/supatool) · [npm](https://www.npmjs.com/package/supatool)
 
  ---
 
- Acknowledgements: Development would not be this convenient without [Supabase](https://supabase.com/). Thanks to the team and the platform.
+ *Developed with ❤️ for the Supabase community. Use at your own risk. Always back up your DB before deployment.*
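As a reader aid (not part of the shipped README): the `llms.txt` catalog lines the README refers to are emitted by the index generator later in this diff in a compact `type:schema.name:path`, with an optional `# comment` and, as of this release, an RLS note for tables. An illustrative excerpt with hypothetical table names:

```text
OBJECTS
table:public.users:public/tables/users.sql # User accounts (RLS: enabled, policies: 2)
table:public.logs:public/tables/logs.sql **⚠️ RLS disabled**
view:public.active_users:public/views/active_users.sql
```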
@@ -33,6 +33,19 @@ Common Options:
  --config <path>        Configuration file path
  -f, --force            Force overwrite
 
+ seed command:
+   supatool seed -c <connection> [-t tables.yaml] [-o supabase/seeds]
+
+ tables.yaml format (schema-grouped):
+   public:
+     - users
+     - posts
+   admin:
+     - platforms
+
+ Output: supabase/seeds/<timestamp>/<schema>/<table>_seed.json
+         supabase/seeds/llms.txt (index for AI)
+
  For details, see the documentation.
  `;
  // Model Schema Usage
@@ -189,6 +189,44 @@ async function fetchRlsPolicies(client, spinner, progress, schemas = ['public'])
          return [];
      }
  }
+ /**
+  * Fetch RLS enabled flag and policy count for all tables (pg_class.relrowsecurity + pg_policies)
+  */
+ async function fetchTableRlsStatus(client, schemas = ['public']) {
+     if (schemas.length === 0)
+         return [];
+     const schemaPlaceholders = schemas.map((_, i) => `$${i + 1}`).join(', ');
+     const result = await client.query(`
+         SELECT
+             n.nspname AS schema_name,
+             c.relname AS table_name,
+             COALESCE(c.relrowsecurity, false) AS rls_enabled
+         FROM pg_class c
+         JOIN pg_namespace n ON n.oid = c.relnamespace
+         WHERE c.relkind = 'r'
+             AND n.nspname IN (${schemaPlaceholders})
+         ORDER BY n.nspname, c.relname
+     `, schemas);
+     const policyCountMap = new Map();
+     const policyResult = await client.query(`
+         SELECT schemaname, tablename, COUNT(*) AS cnt
+         FROM pg_policies
+         WHERE schemaname IN (${schemaPlaceholders})
+         GROUP BY schemaname, tablename
+     `, schemas);
+     for (const row of policyResult.rows) {
+         policyCountMap.set(`${row.schemaname}.${row.tablename}`, parseInt(row.cnt, 10));
+     }
+     return result.rows.map((r) => {
+         const key = `${r.schema_name}.${r.table_name}`;
+         return {
+             schema: r.schema_name,
+             table: r.table_name,
+             rlsEnabled: !!r.rls_enabled,
+             policyCount: policyCountMap.get(key) ?? 0
+         };
+     });
+ }
  /**
   * Fetch FK relations list (for llms.txt RELATIONS)
   */
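The merge performed by `fetchTableRlsStatus` (per-table policy counts joined onto the `pg_class` table list, defaulting to 0) can be exercised without a live database. A minimal, DB-free sketch with stubbed query rows (table names hypothetical):

```javascript
// Stubbed rows standing in for the pg_class and pg_policies query results above.
const tableRows = [
    { schema_name: 'public', table_name: 'users', rls_enabled: true },
    { schema_name: 'public', table_name: 'logs', rls_enabled: false }
];
const policyRows = [
    { schemaname: 'public', tablename: 'users', cnt: '2' }
];
// Same merge as fetchTableRlsStatus: count policies per "schema.table", default 0.
const policyCountMap = new Map();
for (const row of policyRows) {
    policyCountMap.set(`${row.schemaname}.${row.tablename}`, parseInt(row.cnt, 10));
}
const status = tableRows.map((r) => ({
    schema: r.schema_name,
    table: r.table_name,
    rlsEnabled: !!r.rls_enabled,
    policyCount: policyCountMap.get(`${r.schema_name}.${r.table_name}`) ?? 0
}));
console.log(JSON.stringify(status, null, 2));
```

A table with no matching `pg_policies` rows (here `logs`) falls through the `?? 0` default, which is what later feeds the "enabled, policies: 0" notes.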
@@ -973,7 +1011,7 @@ async function generateCreateTableDDL(client, tableName, schemaName = 'public')
  /**
   * Save definitions to files (merge RLS/triggers into table/view; schema folders when multi-schema)
   */
- async function saveDefinitionsByType(definitions, outputDir, separateDirectories = true, schemas = ['public'], relations = [], rpcTables = [], allSchemas = [], version) {
+ async function saveDefinitionsByType(definitions, outputDir, separateDirectories = true, schemas = ['public'], relations = [], rpcTables = [], allSchemas = [], version, tableRlsStatus = []) {
      const fs = await Promise.resolve().then(() => __importStar(require('fs')));
      const path = await Promise.resolve().then(() => __importStar(require('path')));
      const outputDate = new Date().toLocaleDateString('en-CA', { year: 'numeric', month: '2-digit', day: '2-digit' });
@@ -1003,7 +1041,12 @@ async function saveDefinitionsByType(definitions, outputDir, separateDirectories
          list.push(t.ddl);
          triggersByCategory.set(t.category, list);
      }
-     // Build merged DDL (table/view + RLS + triggers)
+     // schema.table -> RLS status (for appending comment/DDL when no policies)
+     const rlsStatusByCategory = new Map();
+     for (const s of tableRlsStatus) {
+         rlsStatusByCategory.set(`${s.schema}.${s.table}`, s);
+     }
+     // Build merged DDL (table/view + RLS + triggers). Tables with RLS disabled or 0 policies get a comment block in the file.
      const mergeRlsAndTriggers = (def) => {
          const cat = def.schema && def.name ? `${def.schema}.${def.name}` : def.category ?? '';
          let ddl = def.ddl.trimEnd();
@@ -1011,6 +1054,19 @@ async function saveDefinitionsByType(definitions, outputDir, separateDirectories
          if (rlsDdl) {
              ddl += '\n\n' + rlsDdl.trim();
          }
+         else if (def.type === 'table' && def.schema && def.name) {
+             const rlsStatus = rlsStatusByCategory.get(cat);
+             if (rlsStatus) {
+                 if (!rlsStatus.rlsEnabled) {
+                     ddl += '\n\n-- RLS: disabled. Consider enabling for production.';
+                     ddl += '\n-- ALTER TABLE ' + def.schema + '.' + def.name + ' ENABLE ROW LEVEL SECURITY;';
+                 }
+                 else if (rlsStatus.policyCount === 0) {
+                     ddl += '\n\n-- RLS: enabled, no policies defined';
+                     ddl += '\nALTER TABLE ' + def.schema + '.' + def.name + ' ENABLE ROW LEVEL SECURITY;';
+                 }
+             }
+         }
          const trgList = triggersByCategory.get(cat);
          if (trgList && trgList.length > 0) {
              ddl += '\n\n' + trgList.map(t => t.trim()).join('\n\n');
@@ -1064,12 +1120,12 @@ async function saveDefinitionsByType(definitions, outputDir, separateDirectories
          const ddlWithNewline = def.ddl.endsWith('\n') ? def.ddl : def.ddl + '\n';
          await fsPromises.writeFile(filePath, headerComment + ddlWithNewline);
      }
-     await generateIndexFile(toWrite, outputDir, separateDirectories, multiSchema, relations, rpcTables, allSchemas, schemas, version);
+     await generateIndexFile(toWrite, outputDir, separateDirectories, multiSchema, relations, rpcTables, allSchemas, schemas, version, tableRlsStatus);
  }
  /**
   * Generate index file for DB objects (RLS/triggers already merged into table/view)
   */
- async function generateIndexFile(definitions, outputDir, separateDirectories = true, multiSchema = false, relations = [], rpcTables = [], allSchemas = [], extractedSchemas = [], version) {
+ async function generateIndexFile(definitions, outputDir, separateDirectories = true, multiSchema = false, relations = [], rpcTables = [], allSchemas = [], extractedSchemas = [], version, tableRlsStatus = []) {
      const fs = await Promise.resolve().then(() => __importStar(require('fs')));
      const path = await Promise.resolve().then(() => __importStar(require('path')));
      const outputDate = new Date().toLocaleDateString('en-CA', { year: 'numeric', month: '2-digit', day: '2-digit' });
@@ -1101,6 +1157,21 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
          cron: definitions.filter(def => def.type === 'cron'),
          type: definitions.filter(def => def.type === 'type')
      };
+     // schema.table -> RLS status (for Tables docs and warnings)
+     const rlsMap = new Map();
+     for (const s of tableRlsStatus) {
+         rlsMap.set(`${s.schema}.${s.table}`, s);
+     }
+     const formatRlsNote = (schema, name) => {
+         const s = rlsMap.get(`${schema}.${name}`);
+         if (!s)
+             return '';
+         if (!s.rlsEnabled)
+             return ' **⚠️ RLS disabled**';
+         if (s.policyCount === 0)
+             return ' (RLS: enabled, policies: 0)';
+         return ` (RLS: enabled, policies: ${s.policyCount})`;
+     };
      // Build relative path per file (schema/type/file when multiSchema)
      const getRelPath = (def) => {
          const typeDir = separateDirectories ? (typeDirNames[def.type] ?? def.type) : '.';
@@ -1123,6 +1194,9 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
          readmeContent += 'When multiple schemas are extracted, each schema has its own subfolder (e.g. `public/tables/`, `agent/views/`).\n\n';
      }
      readmeContent += 'Full catalog and relations: [llms.txt](llms.txt)\n';
+     if (tableRlsStatus.some(s => !s.rlsEnabled)) {
+         readmeContent += '\n⚠️ Tables with RLS disabled: [rls_warnings.md](rls_warnings.md)\n';
+     }
      // === llms.txt ===
      let llmsContent = headerLine;
      llmsContent += 'Database Schema - Complete Objects Catalog\n';
@@ -1138,7 +1212,8 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
          const filePath = getRelPath(def);
          const commentSuffix = def.comment ? ` # ${def.comment}` : '';
          const displayName = def.schema ? `${def.schema}.${def.name}` : def.name;
-         llmsContent += `${def.type}:${displayName}:${filePath}${commentSuffix}\n`;
+         const rlsSuffix = def.type === 'table' && def.schema ? formatRlsNote(def.schema, def.name) : '';
+         llmsContent += `${def.type}:${displayName}:${filePath}${commentSuffix}${rlsSuffix}\n`;
      });
      if (relations.length > 0) {
          llmsContent += '\nRELATIONS\n';
@@ -1172,12 +1247,21 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
      fs.writeFileSync(llmsPath, llmsContent);
      // schema_index.json (same data for agents that parse JSON)
      const schemaIndex = {
-         objects: definitions.map(def => ({
-             type: def.type,
-             name: def.schema ? `${def.schema}.${def.name}` : def.name,
-             path: getRelPath(def),
-             ...(def.comment && { comment: def.comment })
-         })),
+         objects: definitions.map(def => {
+             const base = {
+                 type: def.type,
+                 name: def.schema ? `${def.schema}.${def.name}` : def.name,
+                 path: getRelPath(def),
+                 ...(def.comment && { comment: def.comment })
+             };
+             if (def.type === 'table' && def.schema) {
+                 const s = rlsMap.get(`${def.schema}.${def.name}`);
+                 if (s) {
+                     base.rls = s.rlsEnabled ? (s.policyCount === 0 ? 'enabled_no_policies' : `enabled_${s.policyCount}_policies`) : 'disabled';
+                 }
+             }
+             return base;
+         }),
          relations: relations.map(r => ({ from: r.from, to: r.to })),
          rpc_tables: rpcTables.map(rt => ({ rpc: rt.rpc, tables: rt.tables })),
          all_schemas: allSchemas.length > 0
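The new `rls` field written into `schema_index.json` collapses the two-part status into a single string. The mapping in isolation (same ternary as above, extracted into a helper for illustration):

```javascript
// Mirror of the ternary used for schema_index.json's `rls` field.
function rlsLabel(rlsEnabled, policyCount) {
    return rlsEnabled
        ? (policyCount === 0 ? 'enabled_no_policies' : `enabled_${policyCount}_policies`)
        : 'disabled';
}
console.log(rlsLabel(false, 0)); // disabled
console.log(rlsLabel(true, 0));  // enabled_no_policies
console.log(rlsLabel(true, 3));  // enabled_3_policies
```

Note that `disabled` wins regardless of `policyCount`, matching Postgres semantics: policies have no effect while `relrowsecurity` is off.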
@@ -1188,14 +1272,15 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
              : undefined
      };
      fs.writeFileSync(path.join(outputDir, 'schema_index.json'), JSON.stringify(schemaIndex, null, 2), 'utf8');
-     // schema_summary.md (one-file overview for AI)
+     // schema_summary.md (one-file overview for AI) — include RLS status per table
      let summaryMd = '# Schema summary\n\n';
      const tableDefs = definitions.filter(d => d.type === 'table' || d.type === 'view');
      if (tableDefs.length > 0) {
          summaryMd += '## Tables / Views\n';
          tableDefs.forEach(d => {
              const name = d.schema ? `${d.schema}.${d.name}` : d.name;
-             summaryMd += d.comment ? `- ${name} (# ${d.comment})\n` : `- ${name}\n`;
+             const rlsNote = d.type === 'table' && d.schema ? formatRlsNote(d.schema, d.name) : '';
+             summaryMd += d.comment ? `- ${name}${rlsNote} (# ${d.comment})\n` : `- ${name}${rlsNote}\n`;
          });
          summaryMd += '\n';
      }
@@ -1220,6 +1305,18 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
          summaryMd += `- Not extracted: ${allSchemas.filter(s => !extractedSet.has(s)).join(', ') || '(none)'}\n`;
      }
      fs.writeFileSync(path.join(outputDir, 'schema_summary.md'), summaryMd, 'utf8');
+     // RLS disabled tables warning doc (tables only; RLS enabled with 0 policies is not warned)
+     const rlsNotEnabled = tableRlsStatus.filter(s => !s.rlsEnabled);
+     if (rlsNotEnabled.length > 0) {
+         let warnMd = '# Tables with RLS disabled (warning)\n\n';
+         warnMd += 'The following tables do not have Row Level Security enabled.\n';
+         warnMd += 'Enabling RLS is recommended for production security.\n\n';
+         warnMd += '| Schema | Table |\n|--------|-------|\n';
+         rlsNotEnabled.forEach(s => {
+             warnMd += `| ${s.schema} | ${s.table} |\n`;
+         });
+         fs.writeFileSync(path.join(outputDir, 'rls_warnings.md'), warnMd, 'utf8');
+     }
  }
  /**
   * Classify and output definitions
@@ -1479,13 +1576,34 @@ async function extractDefinitions(options) {
              console.warn('RELATIONS/RPC_TABLES extraction skipped:', err);
          }
      }
+     // RLS status (for Tables docs, rls_warnings.md, and extract-time warning)
+     let tableRlsStatus = [];
+     try {
+         const tableDefs = allDefinitions.filter(d => d.type === 'table');
+         if (tableDefs.length > 0) {
+             tableRlsStatus = await fetchTableRlsStatus(client, schemas);
+         }
+     }
+     catch (err) {
+         if (process.env.SUPATOOL_DEBUG) {
+             console.warn('RLS status fetch skipped:', err);
+         }
+     }
      // When force: remove output dir then write (so removed tables don't leave files)
      if (force && fs.existsSync(outputDir)) {
          fs.rmSync(outputDir, { recursive: true });
      }
      // Save definitions (table+RLS+triggers merged, schema folders)
      spinner.text = 'Saving definitions to files...';
-     await saveDefinitionsByType(allDefinitions, outputDir, separateDirectories, schemas, relations, rpcTables, allSchemas, version);
+     await saveDefinitionsByType(allDefinitions, outputDir, separateDirectories, schemas, relations, rpcTables, allSchemas, version, tableRlsStatus);
+     // Warn at extract time when any table has RLS disabled
+     const rlsNotEnabled = tableRlsStatus.filter(s => !s.rlsEnabled);
+     if (rlsNotEnabled.length > 0) {
+         console.warn('');
+         console.warn('⚠️ Tables with RLS disabled: ' + rlsNotEnabled.map(s => `${s.schema}.${s.table}`).join(', '));
+         console.warn('   Details: ' + outputDir + '/rls_warnings.md');
+         console.warn('');
+     }
      // Show stats
      const counts = {
          table: allDefinitions.filter(def => def.type === 'table').length,
@@ -8,17 +8,55 @@ const pg_1 = require("pg");
  const fs_1 = __importDefault(require("fs"));
  const path_1 = __importDefault(require("path"));
  const js_yaml_1 = __importDefault(require("js-yaml"));
+ /**
+  * Parse tables.yaml into { schema -> table[] } map.
+  * Format:
+  *   public:
+  *     - users
+  *     - posts
+  *   admin:
+  *     - platforms
+  */
+ function parseTablesYaml(yamlPath) {
+     const yamlObj = js_yaml_1.default.load(fs_1.default.readFileSync(yamlPath, 'utf8'));
+     if (!yamlObj || typeof yamlObj !== 'object' || Array.isArray(yamlObj)) {
+         throw new Error('Invalid tables.yaml format. Use schema-grouped format:\n  public:\n    - users\n  admin:\n    - platforms');
+     }
+     // Detect old format: top-level "tables:" key with array value
+     if ('tables' in yamlObj && Array.isArray(yamlObj.tables)) {
+         throw new Error('Outdated tables.yaml format detected.\n\n' +
+             'Migrate to schema-grouped format:\n\n' +
+             '  Before:        After:\n' +
+             '  tables:        public:\n' +
+             '    - users  →     - users\n' +
+             '    - posts        - posts\n' +
+             '    - admin.x    admin:\n' +
+             '                   - x\n');
+     }
+     const entries = [];
+     for (const [schema, tables] of Object.entries(yamlObj)) {
+         if (!Array.isArray(tables)) {
+             throw new Error(`tables.yaml: value of "${schema}" must be a list of table names`);
+         }
+         for (const table of tables) {
+             // Detect dot notation inside a schema group
+             if (table.includes('.')) {
+                 throw new Error(`tables.yaml: "${table}" under "${schema}" uses dot notation.\n` +
+                     `Use schema-grouped format instead:\n` +
+                     `  ${table.split('.')[0]}:\n` +
+                     `    - ${table.split('.')[1]}`);
+             }
+             entries.push({ schema, table });
+         }
+     }
+     return entries;
+ }
  /**
   * Fetch table data from remote DB and generate AI seed JSON
   * @param options SeedGenOptions
   */
  async function generateSeedsFromRemote(options) {
-     // Load tables.yaml
-     const yamlObj = js_yaml_1.default.load(fs_1.default.readFileSync(options.tablesYamlPath, 'utf8'));
-     if (!yamlObj || !Array.isArray(yamlObj.tables)) {
-         throw new Error('Invalid tables.yaml format. Specify as tables: [ ... ]');
-     }
-     const tables = yamlObj.tables;
+     const tables = parseTablesYaml(options.tablesYamlPath);
      // Generate datetime subdir name (e.g. 20250705_1116_supatool)
      const now = new Date();
      const y = now.getFullYear();
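The validation rules of the new `parseTablesYaml` can be illustrated against an already-parsed object, skipping file I/O and js-yaml. The `flattenTables` helper below is illustrative, not part of the package:

```javascript
// Same checks as parseTablesYaml, applied to an already-parsed YAML object.
function flattenTables(yamlObj) {
    if (!yamlObj || typeof yamlObj !== 'object' || Array.isArray(yamlObj)) {
        throw new Error('Invalid tables.yaml format');
    }
    // Old flat format ("tables:" with an array) is rejected with a migration hint.
    if ('tables' in yamlObj && Array.isArray(yamlObj.tables)) {
        throw new Error('Outdated tables.yaml format detected');
    }
    const entries = [];
    for (const [schema, tables] of Object.entries(yamlObj)) {
        if (!Array.isArray(tables)) throw new Error(`"${schema}" must be a list`);
        for (const table of tables) {
            // Dot notation is rejected; the schema comes from the group key instead.
            if (table.includes('.')) throw new Error(`"${table}" uses dot notation`);
            entries.push({ schema, table });
        }
    }
    return entries;
}
console.log(flattenTables({ public: ['users', 'posts'], admin: ['platforms'] }));
```

The flattened `{ schema, table }` pairs are what drives the new per-schema output layout (`supabase/seeds/<timestamp>/<schema>/<table>_seed.json`).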
@@ -28,67 +66,54 @@ async function generateSeedsFromRemote(options) {
      const mm = String(now.getMinutes()).padStart(2, '0');
      const folderName = `${y}${m}${d}_${hh}${mm}_supatool`;
      const outDir = path_1.default.join(options.outputDir, folderName);
-     // Create output directory
-     if (!fs_1.default.existsSync(outDir)) {
-         fs_1.default.mkdirSync(outDir, { recursive: true });
-     }
      // DB connection
      const client = new pg_1.Client({ connectionString: options.connectionString });
      await client.connect();
-     let processedCount = 0;
-     for (const tableFullName of tables) {
-         // No schema specified -> public
-         let schema = 'public';
-         let table = tableFullName;
-         if (tableFullName.includes('.')) {
-             [schema, table] = tableFullName.split('.');
+     const processedFiles = [];
+     for (const { schema, table } of tables) {
+         // Create schema subdir
+         const schemaDir = path_1.default.join(outDir, schema);
+         if (!fs_1.default.existsSync(schemaDir)) {
+             fs_1.default.mkdirSync(schemaDir, { recursive: true });
          }
          // Fetch data
          const res = await client.query(`SELECT * FROM "${schema}"."${table}"`);
          const rows = res.rows;
-         // File name
-         const fileName = `${table}_seed.json`;
-         const filePath = path_1.default.join(outDir, fileName);
          // Output JSON
+         const fileName = `${table}_seed.json`;
+         const filePath = path_1.default.join(schemaDir, fileName);
          const json = {
              table: `${schema}.${table}`,
              fetched_at: now.toISOString(),
-             fetched_by: 'supatool v0.3.5',
+             fetched_by: 'supatool',
              note: 'This data is a snapshot of the remote DB at the above time. For AI coding reference. You can update it by running the update command again.',
              rows
          };
          fs_1.default.writeFileSync(filePath, JSON.stringify(json, null, 2), 'utf8');
-         processedCount++;
+         processedFiles.push({ schema, table, fileName, rowCount: rows.length });
      }
      await client.end();
-     // llms.txt index output (overwrite under supabase/seeds each run)
-     const files = fs_1.default.readdirSync(outDir);
-     const seedFiles = files.filter(f => f.endsWith('_seed.json'));
+     // llms.txt index (schema/file paths, overwrite each run)
      let llmsTxt = `# AI seed data index (generated by supatool)\n`;
      llmsTxt += `# fetched_at: ${now.toISOString()}\n`;
      llmsTxt += `# folder: ${folderName}\n`;
      llmsTxt += `# Schema catalog: ../schemas/llms.txt\n`;
-     for (const basename of seedFiles) {
-         const file = path_1.default.join(outDir, basename);
-         const content = JSON.parse(fs_1.default.readFileSync(file, 'utf8'));
-         // Table comment (empty if none)
+     for (const { schema, table, fileName, rowCount } of processedFiles) {
          let tableComment = '';
          try {
-             const [schema, table] = content.table.split('.');
              const commentRes = await getTableComment(options.connectionString, schema, table);
              if (commentRes)
                  tableComment = commentRes;
          }
          catch { }
-         llmsTxt += `${content.table}: ${basename} (${Array.isArray(content.rows) ? content.rows.length : 0} rows)`;
+         llmsTxt += `${schema}.${table}: ${schema}/${fileName} (${rowCount} rows)`;
          if (tableComment)
              llmsTxt += ` # ${tableComment}`;
          llmsTxt += `\n`;
      }
      const llmsPath = path_1.default.join(options.outputDir, 'llms.txt');
      fs_1.default.writeFileSync(llmsPath, llmsTxt, 'utf8');
-     // Output summary in English
-     console.log(`Seed export completed. Processed tables: ${processedCount}`);
+     console.log(`Seed export completed. Processed tables: ${processedFiles.length}`);
      console.log(`llms.txt index written to: ${llmsPath}`);
  }
  /** Utility to get table comment */
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
      "name": "supatool",
-     "version": "0.4.0",
+     "version": "0.4.2",
      "description": "CLI for Supabase: extract schema (tables, views, RLS, RPC) to files + llms.txt for LLM, deploy local schema, seed export. CRUD code gen deprecated.",
      "main": "dist/index.js",
      "types": "dist/index.d.ts",