supatool 0.4.0 → 0.4.1

package/README.md CHANGED
@@ -1,97 +1,109 @@
  # Supatool
 
- CLI for Supabase: **extract** schema to files with **llms.txt** for LLMs, and **seed** export as AI-friendly JSON. Deploy and CRUD (deprecated) also available.
+ **The AI-Native Schema Management CLI for Supabase.** Extract database schemas into LLM-friendly structures, generate `llms.txt` catalogs, and manage seeds without drowning your AI's context.
 
- ## Features
+ [![npm version](https://img.shields.io/npm/v/supatool.svg)](https://www.npmjs.com/package/supatool)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
 
- - **Extract** – Tables, views, RLS, functions, triggers from DB into files. One file per table (DDL + RLS + triggers). Multi-schema: `--schema public,agent` → `schemas/public/`, `schemas/agent/`.
- - **llms.txt** – Catalog with OBJECTS, RELATIONS, RPC_TABLES, ALL_SCHEMAS. README in output links to it.
- - **Seed** – Export table data to JSON; llms.txt index in `supabase/seeds/`.
- - **Deploy** – Push local schema to remote (`supatool deploy --dry-run`).
- - CRUD code gen is **deprecated** (`supatool crud`, `gen:crud`); prefer writing code with an LLM.
+ ## Why Supatool?
 
- See [CHANGELOG.md](./CHANGELOG.md) for version history.
+ Modern AI coding tools (Cursor, Claude, MCP) often struggle with large database schemas. Typical issues include:
+ - **Token Waste:** Reading the entire schema at once consumes 10k+ tokens.
+ - **Lost Context:** Frequent API calls to fetch table details via MCP lead to fragmented reasoning.
+ - **Inaccuracy:** AI misses RLS policies or complex FK relations split across multiple files.
 
- ## Install
+ **Supatool solves this** by reorganizing your Supabase schema into a highly searchable, indexed, and modular structure that helps AI "understand" your DB with minimal tokens.
 
- ```bash
- npm install -g supatool
- # or: yarn global add supatool | pnpm add -g supatool
- ```
+ ---
 
- ## Extract
+ ## Key Features
 
- Set connection (e.g. in `.env.local`):
+ - **Extract (AI-Optimized)** – DDL, RLS, and triggers are bundled into **one file per table**; AI gets the full picture of a table by opening just one file.
+ - **llms.txt Catalog** – Automatically generates a standard `llms.txt` listing all OBJECTS, RELATIONS (FKs), and RPC dependencies. This serves as the "map" for AI agents.
+ - **Multi-Schema Support** – Group objects by schema (e.g., `public`, `agent`, `auth`) with proper schema qualification in SQL.
+ - **Seed for AI** – Export table data as JSON. Includes a dedicated `llms.txt` for seeds so AI can see real data structures.
+ - **Safe Deploy** – Push local schema changes with `--dry-run` to preview DDL before execution.
+ - **CRUD (Deprecated)** – Legacy code generation is still available but discouraged in favor of LLM-native development.
 
- ```bash
- echo "SUPABASE_CONNECTION_STRING=postgresql://..." >> .env.local
- ```
+ ---
 
- Extract all objects:
+ ## Quick Start
 
  ```bash
- supatool extract --all -o supabase/schemas
- # Optional: --schema public,agent | -t "user_*" | -c "postgresql://..."
- # With --force: clears output dir then writes (no orphan files).
- ```
+ npm install -g supatool
+ # Set your connection string
+ export SUPABASE_CONNECTION_STRING="postgresql://postgres:[password]@db.[ref].supabase.co:5432/postgres"
 
- **Output:**
+ # Extract schema and generate AI-ready docs
+ supatool extract --schema public,auth -o supabase/schemas
 
  ```
+
+ ### Output Structure
+
+ ```text
  supabase/schemas/
- ├── README.md # Points to llms.txt
- ├── llms.txt # OBJECTS, RELATIONS, RPC_TABLES, ALL_SCHEMAS (read this first for AI)
- ├── schema_index.json # Same as llms.txt, for agents that parse JSON
- ├── schema_summary.md # One-file overview: tables, relations, RPCs
- ├── tables/
- ├── views/
- ├── rpc/
- ├── cron/
- └── types/
+ ├── llms.txt # 🗺️ THE ENTRY POINT: Read this first to understand the DB map
+ ├── schema_index.json # 🤖 For JSON-parsing agents
+ ├── schema_summary.md # 📄 Single-file overview for quick human/AI scanning
+ ├── README.md # Navigation guide
+ └── [schema_name]/
+ ├── tables/ # table_name.sql (DDL + RLS + Triggers)
+ ├── views/
+ └── rpc/
+
  ```
 
- Multi-schema: `schemas/public/`, `schemas/agent/`, etc., each with tables/, views/, rpc/.
+ ---
 
- **Env:** `SUPABASE_CONNECTION_STRING` or `DATABASE_URL`; optional `SUPATOOL_MAX_CONCURRENT` (default 20).
+ ## Best Practices for AI Agents (Cursor / Claude / MCP)
 
- ## Seed
+ To get the best results from your AI coding assistant, follow these steps:
 
- Export selected tables as JSON for AI/reference:
+ 1. **Start with the Map:** Always ask the AI to read `supabase/schemas/llms.txt` first.
+ 2. **Targeted Reading:** Once the AI identifies the relevant tables from the catalog, instruct it to open only those specific `.sql` files.
+ 3. **Understand Relations:** Use the `RELATIONS` section in `llms.txt` to help the AI write accurate JOINs without reading every file.
+ 4. **RPC Context:** If using functions, refer to `RPC_TABLES` in `llms.txt` to know which tables are affected.
+
+ ---
+
+ ## Commands
+
+ ### Extract
 
  ```bash
- supatool seed --tables tables.yaml --connection "postgresql://..."
+ supatool extract --all -o supabase/schemas
+ # Options:
+ # --schema public,agent Specify schemas
+ # -t "user_*" Filter tables by pattern
+ # --force Clear output dir before writing (prevents orphan files)
+
  ```
 
- `tables.yaml`:
+ ### Seed
 
- ```yaml
- tables:
- - users
- - public.orders
- ```
+ Export specific tables for AI reference or testing:
 
- - Output: `supabase/seeds/<timestamp>_supatool/{table}_seed.json`
- - `llms.txt` is written in `supabase/seeds/` listing files and row counts.
+ ```bash
+ supatool seed --tables tables.yaml
+
+ ```
 
- Do not include sensitive data in seed files.
+ *Outputs JSON files and an `llms.txt` index in `supabase/seeds/`.*
 
- ## For AI / coding agents (Claude CLI, Cursor, etc.)
+ ### Deploy
 
- - **Entry point:** Read `supabase/schemas/llms.txt` first. It lists all objects, relations, RPC→tables, and schemas.
- - **Then** open only the files you need (paths under OBJECTS). Use RELATIONS for joins; RPC_TABLES to see which RPCs touch which tables.
- - **Optional:** `schema_index.json` (same data as llms.txt) and `schema_summary.md` (one-file overview) are written when you run `supatool extract`.
- - **Seeds:** `supabase/seeds/llms.txt` lists seed JSON files; it references the schema catalog at `../schemas/llms.txt` when present.
+ ```bash
+ supatool deploy --dry-run
 
- ## Other commands
+ ```
 
- - `supatool help` – List commands
- - `supatool deploy --dry-run` – Preview deploy
- - CRUD: `supatool crud`, `gen:crud` (deprecated)
+ ---
 
  ## Repository
 
- [https://github.com/idea-garage/supatool](https://github.com/idea-garage/supatool) · [npm](https://www.npmjs.com/package/supatool)
+ [GitHub](https://github.com/idea-garage/supatool) · [npm](https://www.npmjs.com/package/supatool)
 
  ---
 
- Acknowledgements: Development would not be this convenient without [Supabase](https://supabase.com/). Thanks to the team and the platform.
+ *Developed with ❤️ for the Supabase community. Use at your own risk. Always back up your DB before deployment.*
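The rewritten README centers on the `llms.txt` catalog, whose OBJECTS section lists each object as a single `type:schema.name:path # comment` line. As a rough illustration of why that format is cheap for agents to consume, here is a minimal, hypothetical parser (not part of supatool; it ignores the RLS annotations that 0.4.1 can append after the comment):

```javascript
// Hypothetical helper, not shipped with supatool: split one OBJECTS line
// of a generated llms.txt into its parts.
function parseObjectLine(line) {
  const [meta, comment] = line.split(' # '); // comment part is optional
  const [type, name, path] = meta.split(':');
  return { type, name, path, comment: comment ?? null };
}

const obj = parseObjectLine('table:public.users:public/tables/users.sql # App users');
// obj.type === 'table', obj.name === 'public.users'
```

One line per object keeps the whole catalog scannable in a single read, which is the "map first, then targeted files" workflow the Best Practices section recommends.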
@@ -189,6 +189,44 @@ async function fetchRlsPolicies(client, spinner, progress, schemas = ['public'])
  return [];
  }
  }
+ /**
+ * Fetch RLS enabled flag and policy count for all tables (pg_class.relrowsecurity + pg_policies)
+ */
+ async function fetchTableRlsStatus(client, schemas = ['public']) {
+ if (schemas.length === 0)
+ return [];
+ const schemaPlaceholders = schemas.map((_, i) => `$${i + 1}`).join(', ');
+ const result = await client.query(`
+ SELECT
+ n.nspname AS schema_name,
+ c.relname AS table_name,
+ COALESCE(c.relrowsecurity, false) AS rls_enabled
+ FROM pg_class c
+ JOIN pg_namespace n ON n.oid = c.relnamespace
+ WHERE c.relkind = 'r'
+ AND n.nspname IN (${schemaPlaceholders})
+ ORDER BY n.nspname, c.relname
+ `, schemas);
+ const policyCountMap = new Map();
+ const policyResult = await client.query(`
+ SELECT schemaname, tablename, COUNT(*) AS cnt
+ FROM pg_policies
+ WHERE schemaname IN (${schemaPlaceholders})
+ GROUP BY schemaname, tablename
+ `, schemas);
+ for (const row of policyResult.rows) {
+ policyCountMap.set(`${row.schemaname}.${row.tablename}`, parseInt(row.cnt, 10));
+ }
+ return result.rows.map((r) => {
+ const key = `${r.schema_name}.${r.table_name}`;
+ return {
+ schema: r.schema_name,
+ table: r.table_name,
+ rlsEnabled: !!r.rls_enabled,
+ policyCount: policyCountMap.get(key) ?? 0
+ };
+ });
+ }
  /**
  * Fetch FK relations list (for llms.txt RELATIONS)
  */
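The new `fetchTableRlsStatus` runs two queries (one against `pg_class`, one against `pg_policies`) and joins the results in memory. A runnable sketch of the non-database parts — the `$1, $2, …` placeholder construction and the policy-count merge — with both query results stubbed out as plain arrays:

```javascript
// Sketch of the merge step in fetchTableRlsStatus; the two query result
// sets are stubbed so this runs without a database connection.
const schemas = ['public', 'agent'];
// Placeholders for the parameterized IN (...) clause: '$1, $2'
const schemaPlaceholders = schemas.map((_, i) => `$${i + 1}`).join(', ');

const tableRows = [
  { schema_name: 'public', table_name: 'users', rls_enabled: true },
  { schema_name: 'public', table_name: 'logs', rls_enabled: false },
];
const policyRows = [{ schemaname: 'public', tablename: 'users', cnt: '2' }];

// Index policy counts by "schema.table", then attach them to each table row.
const policyCountMap = new Map();
for (const row of policyRows) {
  policyCountMap.set(`${row.schemaname}.${row.tablename}`, parseInt(row.cnt, 10));
}
const status = tableRows.map((r) => ({
  schema: r.schema_name,
  table: r.table_name,
  rlsEnabled: !!r.rls_enabled,
  policyCount: policyCountMap.get(`${r.schema_name}.${r.table_name}`) ?? 0,
}));
```

Tables with no `pg_policies` rows simply fall through to `?? 0`, which is how a table can end up reported as "RLS enabled, policies: 0".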
@@ -973,7 +1011,7 @@ async function generateCreateTableDDL(client, tableName, schemaName = 'public')
  /**
  * Save definitions to files (merge RLS/triggers into table/view; schema folders when multi-schema)
  */
- async function saveDefinitionsByType(definitions, outputDir, separateDirectories = true, schemas = ['public'], relations = [], rpcTables = [], allSchemas = [], version) {
+ async function saveDefinitionsByType(definitions, outputDir, separateDirectories = true, schemas = ['public'], relations = [], rpcTables = [], allSchemas = [], version, tableRlsStatus = []) {
  const fs = await Promise.resolve().then(() => __importStar(require('fs')));
  const path = await Promise.resolve().then(() => __importStar(require('path')));
  const outputDate = new Date().toLocaleDateString('en-CA', { year: 'numeric', month: '2-digit', day: '2-digit' });
@@ -1064,12 +1102,12 @@ async function saveDefinitionsByType(definitions, outputDir, separateDirectories
  const ddlWithNewline = def.ddl.endsWith('\n') ? def.ddl : def.ddl + '\n';
  await fsPromises.writeFile(filePath, headerComment + ddlWithNewline);
  }
- await generateIndexFile(toWrite, outputDir, separateDirectories, multiSchema, relations, rpcTables, allSchemas, schemas, version);
+ await generateIndexFile(toWrite, outputDir, separateDirectories, multiSchema, relations, rpcTables, allSchemas, schemas, version, tableRlsStatus);
  }
  /**
  * Generate index file for DB objects (RLS/triggers already merged into table/view)
  */
- async function generateIndexFile(definitions, outputDir, separateDirectories = true, multiSchema = false, relations = [], rpcTables = [], allSchemas = [], extractedSchemas = [], version) {
+ async function generateIndexFile(definitions, outputDir, separateDirectories = true, multiSchema = false, relations = [], rpcTables = [], allSchemas = [], extractedSchemas = [], version, tableRlsStatus = []) {
  const fs = await Promise.resolve().then(() => __importStar(require('fs')));
  const path = await Promise.resolve().then(() => __importStar(require('path')));
  const outputDate = new Date().toLocaleDateString('en-CA', { year: 'numeric', month: '2-digit', day: '2-digit' });
@@ -1101,6 +1139,21 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
  cron: definitions.filter(def => def.type === 'cron'),
  type: definitions.filter(def => def.type === 'type')
  };
+ // schema.table -> RLS status (for Tables docs and warnings)
+ const rlsMap = new Map();
+ for (const s of tableRlsStatus) {
+ rlsMap.set(`${s.schema}.${s.table}`, s);
+ }
+ const formatRlsNote = (schema, name) => {
+ const s = rlsMap.get(`${schema}.${name}`);
+ if (!s)
+ return '';
+ if (!s.rlsEnabled)
+ return ' **⚠️ RLS disabled**';
+ if (s.policyCount === 0)
+ return ' (RLS: enabled, policies: 0)';
+ return ` (RLS: enabled, policies: ${s.policyCount})`;
+ };
  // Build relative path per file (schema/type/file when multiSchema)
  const getRelPath = (def) => {
  const typeDir = separateDirectories ? (typeDirNames[def.type] ?? def.type) : '.';
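The `formatRlsNote` helper added in this hunk drives the RLS annotations in both `llms.txt` and `schema_summary.md`. Extracted standalone with sample data, its three outcomes look like this:

```javascript
// Same annotation logic as the 0.4.1 diff, reproduced with stub data.
const rlsMap = new Map([
  ['public.users', { rlsEnabled: true, policyCount: 3 }],   // RLS on, has policies
  ['public.logs', { rlsEnabled: false, policyCount: 0 }],   // RLS off -> warning
  ['public.drafts', { rlsEnabled: true, policyCount: 0 }],  // RLS on, no policies
]);
const formatRlsNote = (schema, name) => {
  const s = rlsMap.get(`${schema}.${name}`);
  if (!s) return '';
  if (!s.rlsEnabled) return ' **⚠️ RLS disabled**';
  if (s.policyCount === 0) return ' (RLS: enabled, policies: 0)';
  return ` (RLS: enabled, policies: ${s.policyCount})`;
};
```

Note the asymmetry: only "RLS disabled" gets the bold warning marker, while "enabled with zero policies" is merely annotated, matching the comment later in the diff that zero-policy tables are not treated as warnings.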
@@ -1123,6 +1176,9 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
  readmeContent += 'When multiple schemas are extracted, each schema has its own subfolder (e.g. `public/tables/`, `agent/views/`).\n\n';
  }
  readmeContent += 'Full catalog and relations: [llms.txt](llms.txt)\n';
+ if (tableRlsStatus.some(s => !s.rlsEnabled)) {
+ readmeContent += '\n⚠️ Tables with RLS disabled: [rls_warnings.md](rls_warnings.md)\n';
+ }
  // === llms.txt ===
  let llmsContent = headerLine;
  llmsContent += 'Database Schema - Complete Objects Catalog\n';
@@ -1138,7 +1194,8 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
  const filePath = getRelPath(def);
  const commentSuffix = def.comment ? ` # ${def.comment}` : '';
  const displayName = def.schema ? `${def.schema}.${def.name}` : def.name;
- llmsContent += `${def.type}:${displayName}:${filePath}${commentSuffix}\n`;
+ const rlsSuffix = def.type === 'table' && def.schema ? formatRlsNote(def.schema, def.name) : '';
+ llmsContent += `${def.type}:${displayName}:${filePath}${commentSuffix}${rlsSuffix}\n`;
  });
  if (relations.length > 0) {
  llmsContent += '\nRELATIONS\n';
@@ -1172,12 +1229,21 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
  fs.writeFileSync(llmsPath, llmsContent);
  // schema_index.json (same data for agents that parse JSON)
  const schemaIndex = {
- objects: definitions.map(def => ({
- type: def.type,
- name: def.schema ? `${def.schema}.${def.name}` : def.name,
- path: getRelPath(def),
- ...(def.comment && { comment: def.comment })
- })),
+ objects: definitions.map(def => {
+ const base = {
+ type: def.type,
+ name: def.schema ? `${def.schema}.${def.name}` : def.name,
+ path: getRelPath(def),
+ ...(def.comment && { comment: def.comment })
+ };
+ if (def.type === 'table' && def.schema) {
+ const s = rlsMap.get(`${def.schema}.${def.name}`);
+ if (s) {
+ base.rls = s.rlsEnabled ? (s.policyCount === 0 ? 'enabled_no_policies' : `enabled_${s.policyCount}_policies`) : 'disabled';
+ }
+ }
+ return base;
+ }),
  relations: relations.map(r => ({ from: r.from, to: r.to })),
  rpc_tables: rpcTables.map(rt => ({ rpc: rt.rpc, tables: rt.tables })),
  all_schemas: allSchemas.length > 0
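The nested ternary that fills the new `rls` field of each `schema_index.json` object collapses RLS state into a single string. Pulled out as a named helper for clarity (the function name is ours, not supatool's):

```javascript
// Same derivation as in the diff, expressed as a standalone helper.
const deriveRls = ({ rlsEnabled, policyCount }) =>
  rlsEnabled
    ? (policyCount === 0 ? 'enabled_no_policies' : `enabled_${policyCount}_policies`)
    : 'disabled';
```

So a table with RLS on and two policies serializes as `enabled_2_policies`, a zero-policy table as `enabled_no_policies`, and a table with RLS off as `disabled` — the same three cases `formatRlsNote` distinguishes for the text outputs.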
@@ -1188,14 +1254,15 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
  : undefined
  };
  fs.writeFileSync(path.join(outputDir, 'schema_index.json'), JSON.stringify(schemaIndex, null, 2), 'utf8');
- // schema_summary.md (one-file overview for AI)
+ // schema_summary.md (one-file overview for AI) — include RLS status per table
  let summaryMd = '# Schema summary\n\n';
  const tableDefs = definitions.filter(d => d.type === 'table' || d.type === 'view');
  if (tableDefs.length > 0) {
  summaryMd += '## Tables / Views\n';
  tableDefs.forEach(d => {
  const name = d.schema ? `${d.schema}.${d.name}` : d.name;
- summaryMd += d.comment ? `- ${name} (# ${d.comment})\n` : `- ${name}\n`;
+ const rlsNote = d.type === 'table' && d.schema ? formatRlsNote(d.schema, d.name) : '';
+ summaryMd += d.comment ? `- ${name}${rlsNote} (# ${d.comment})\n` : `- ${name}${rlsNote}\n`;
  });
  summaryMd += '\n';
  }
@@ -1220,6 +1287,18 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
  summaryMd += `- Not extracted: ${allSchemas.filter(s => !extractedSet.has(s)).join(', ') || '(none)'}\n`;
  }
  fs.writeFileSync(path.join(outputDir, 'schema_summary.md'), summaryMd, 'utf8');
+ // RLS disabled tables warning doc (tables only; RLS enabled with 0 policies is not warned)
+ const rlsNotEnabled = tableRlsStatus.filter(s => !s.rlsEnabled);
+ if (rlsNotEnabled.length > 0) {
+ let warnMd = '# Tables with RLS disabled (warning)\n\n';
+ warnMd += 'The following tables do not have Row Level Security enabled.\n';
+ warnMd += 'Enabling RLS is recommended for production security.\n\n';
+ warnMd += '| Schema | Table |\n|--------|-------|\n';
+ rlsNotEnabled.forEach(s => {
+ warnMd += `| ${s.schema} | ${s.table} |\n`;
+ });
+ fs.writeFileSync(path.join(outputDir, 'rls_warnings.md'), warnMd, 'utf8');
+ }
  }
  /**
  * Classify and output definitions
@@ -1479,13 +1558,34 @@ async function extractDefinitions(options) {
  console.warn('RELATIONS/RPC_TABLES extraction skipped:', err);
  }
  }
+ // RLS status (for Tables docs, rls_warnings.md, and extract-time warning)
+ let tableRlsStatus = [];
+ try {
+ const tableDefs = allDefinitions.filter(d => d.type === 'table');
+ if (tableDefs.length > 0) {
+ tableRlsStatus = await fetchTableRlsStatus(client, schemas);
+ }
+ }
+ catch (err) {
+ if (process.env.SUPATOOL_DEBUG) {
+ console.warn('RLS status fetch skipped:', err);
+ }
+ }
  // When force: remove output dir then write (so removed tables don't leave files)
  if (force && fs.existsSync(outputDir)) {
  fs.rmSync(outputDir, { recursive: true });
  }
  // Save definitions (table+RLS+triggers merged, schema folders)
  spinner.text = 'Saving definitions to files...';
- await saveDefinitionsByType(allDefinitions, outputDir, separateDirectories, schemas, relations, rpcTables, allSchemas, version);
+ await saveDefinitionsByType(allDefinitions, outputDir, separateDirectories, schemas, relations, rpcTables, allSchemas, version, tableRlsStatus);
+ // Warn at extract time when any table has RLS disabled
+ const rlsNotEnabled = tableRlsStatus.filter(s => !s.rlsEnabled);
+ if (rlsNotEnabled.length > 0) {
+ console.warn('');
+ console.warn('⚠️ Tables with RLS disabled: ' + rlsNotEnabled.map(s => `${s.schema}.${s.table}`).join(', '));
+ console.warn(' Details: ' + outputDir + '/rls_warnings.md');
+ console.warn('');
+ }
  // Show stats
  const counts = {
  table: allDefinitions.filter(def => def.type === 'table').length,
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "supatool",
- "version": "0.4.0",
+ "version": "0.4.1",
  "description": "CLI for Supabase: extract schema (tables, views, RLS, RPC) to files + llms.txt for LLM, deploy local schema, seed export. CRUD code gen deprecated.",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",