supatool 0.3.4 → 0.3.6

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -2,6 +2,14 @@
 
 A CLI tool that automatically generates TypeScript CRUD code from Supabase type definitions.
 
+ ## Features
+ - Extract and categorize all database objects (tables, views, RLS, functions, triggers) from Supabase
+ - Generate TypeScript CRUD functions from Supabase types or model YAML
+ - Output human-readable and AI-friendly schema/index files
+ - Flexible environment/configuration and batch processing
+ - Simple CLI with help and documentation
+
+ > For all new features and version history, see [CHANGELOG.md](./CHANGELOG.md).
 
 ## Install
 
@@ -15,7 +23,7 @@ pnpm add -g supatool
 
 ## Usage
 
- ### 1. Extract Database Schema (NEW in v0.3.0)
+ ### Extract Database Schema
 
 Extract and categorize all database objects from your Supabase project:
 
@@ -50,7 +58,7 @@ supabase/schemas/
 └── rpc/ # Functions & triggers
 ```
 
- ### 2. Generate CRUD Code
+ ### Generate CRUD Code
 
 Generate TypeScript CRUD functions from Supabase types:
 
@@ -67,7 +75,7 @@ supatool gen:crud model.yaml
 
 **Output:** `src/integrations/supabase/crud-autogen/`
 
- ### 3. Environment Configuration
+ ### Environment Configuration
 
 For security and convenience, set your connection string in environment variables:
 
@@ -88,7 +96,7 @@ supatool extract --all -o supabase/schemas
 - `DATABASE_URL` (fallback)
 - `SUPATOOL_MAX_CONCURRENT` (max concurrent table processing, default: 20, max: 50)
 
- ### 4. Additional Commands
+ ### Additional Commands
 
 ```bash
 # Show help for all commands
@@ -98,16 +106,12 @@ supatool help
 supatool crud -i path/to/types -o path/to/output
 ```
 
- ## Note: Supabase Client Requirement
+ ## Database Comments
 
- The generated CRUD code assumes that a Supabase client is defined in ../client.ts (relative to the export folder).
- Example:
+ Supatool automatically extracts and includes PostgreSQL comments in all generated files. Comments enhance documentation and AI understanding of your schema.
 
- ```ts
- // src/integrations/supabase/client.ts
- import { createClient } from '@supabase/supabase-js'
- export const supabase = createClient('YOUR_SUPABASE_URL', 'YOUR_SUPABASE_ANON_KEY')
- ```
+ - Table, view, function, and type comments are included in generated SQL and documentation.
+ - AI-friendly index files (llms.txt) and Markdown index (index.md) include comments for better context.
 
 ## VSCode/Cursor Integration
 
@@ -271,99 +275,56 @@ supatool extract --all --schema public,auth,extensions -o supabase/schemas
 supatool extract --all -c "postgresql://..." -o supabase/schemas
 ```
 
- ## New Features (v0.3.0)
-
- - **🔍 Schema Extraction**: Extract and categorize all database objects (tables, views, RLS, functions, triggers)
- - **📋 Supabase Declarative Schema**: Fully compliant with [Supabase's declarative database schemas](https://supabase.com/docs/guides/local-development/declarative-database-schemas) workflow
- - **🤖 AI-Friendly Index**: Auto-generated index.md and llms.txt files for better AI understanding of schema structure
- - **💬 Comment Support**: Automatically extracts and includes database comments in generated files
- - **📁 Organized Output**: Separate directories for different object types with flexible organization options
- - **🎯 Pattern Matching**: Extract specific tables/views using wildcard patterns
- - **👁️ View Support**: Enhanced CRUD generation with SELECT-only operations for database views
- - **⚛️ React Query Integration**: Generate modern React hooks for data fetching
- - **🔧 Flexible Workflows**: Support both database-first and model-first development approaches
-
- ## Changelog
-
- ### v0.3.4
-
- - **FIXED**: Corrected RLS policy to proper format
- - **FIXED**: Ensured semicolon (;) is properly appended to function definitions
- - **FIXED**: Removed trailing whitespace from RLS template files
-
- ### v0.3.3
-
- - **ENHANCED**: Improved SQL comment placement (moved to end of each SQL statement)
- - **ENHANCED**: Unified comment format for tables, views, functions, and custom types
- - **FIXED**: Preserved view `security_invoker` settings
-
- ### v0.3.2
-
- - **ENHANCED**: Adjust for extensions (vector, geometry, etc.)
- - **FIXED**: USER-DEFINED column types are now rendered with full type definitions (e.g. `vector(1536)`, `geometry(Point,4326)`).
- - **ADDED**: `FOREIGN KEY` constraints are now included as `CONSTRAINT ... FOREIGN KEY ... REFERENCES ...` inside generated `CREATE TABLE` statements.
-
- ### v0.3.0
-
- **NEW Features:**
- - **NEW**: `extract` command for database schema extraction
- - **NEW**: Full compliance with Supabase declarative database schemas workflow
- - **NEW**: AI-friendly index.md and llms.txt generation for better schema understanding
- - **NEW**: Database comment extraction and integration
- - **NEW**: Organized directory structure (tables/, views/, rls/, rpc/)
- - **NEW**: Pattern matching for selective extraction
- - **ENHANCED**: Support for all database object types (RLS, functions, triggers, cron jobs, custom types)
- - **ENHANCED**: Flexible output options with --no-separate compatibility
-
- **Enhanced Error Handling:**
- - Comprehensive try-catch blocks for all CRUD operations
- - Enhanced null/undefined checks with proper fallbacks
- - Detailed error messages with contextual information
- - Special handling for PGRST116 errors (record not found)
- - Parameter validation for required fields
- - Proper error logging and debugging support
-
- **Breaking Changes:**
- - **Function Parameter Format**: All CRUD functions now use destructuring assignment
- - Before: `selectTableRowById(id: string)`
- - After: `selectTableRowById({ id }: { id: string })`
- - **Type Safety**: Enhanced TypeScript type annotations for all functions
-
- ### v0.2.0
- - Added `gen:` commands for code and schema generation
- - Enhanced `create` command
- - Introduced model schema support (`schemas/supatool-data.schema.ts`)
+ ## Seed Command (v0.3.5+)
 
- ## Database Comments
+ Export selected table data from your remote Supabase DB as AI-friendly seed JSON files.
 
- Supatool automatically extracts and includes PostgreSQL comments in all generated files. Comments enhance documentation and AI understanding of your schema.
+ ### Usage
 
- ### Adding Comments to Your Database
+ ```bash
+ supatool seed --tables tables.yaml --connection <CONNECTION_STRING>
+ ```
 
- ```sql
- -- Table comments
- COMMENT ON TABLE users IS 'User account information and authentication data';
+ - `tables.yaml` example:
+ ```yaml
+ tables:
+ - users
+ - public.orders
+ ```
+ - Output: `supabase/seeds/<timestamp>_supatool/{table}_seed.json`
+ - Each file contains a snapshot of the remote DB table at the time of export.
 
- -- View comments
- COMMENT ON VIEW user_profiles IS 'Combined user data with profile information';
+ ### Example output (users_seed.json)
+ ```json
+ {
+ "table": "public.users",
+ "fetched_at": "2024-07-05T11:16:00Z",
+ "fetched_by": "supatool v0.3.5",
+ "note": "This data is a snapshot of the remote DB at the above time. For AI coding reference. You can update it by running the update command again.",
+ "rows": [
+ { "id": 1, "name": "Taro Yamada", "email": "taro@example.com" },
+ { "id": 2, "name": "Hanako Suzuki", "email": "hanako@example.com" }
+ ]
+ }
+ ```
 
- -- Function comments
- COMMENT ON FUNCTION update_timestamp() IS 'Automatically updates the updated_at column';
+ > **Warning:** Do not include sensitive or personal data in seed files. Handle all exported data with care.
 
- -- Custom type comments
- COMMENT ON TYPE user_status IS 'Enumeration of possible user account statuses';
- ```
+ ### llms.txt (AI seed data index)
 
- ### Comment Integration
+ After exporting, a file named `llms.txt` is automatically generated (and overwritten) in the `supabase/seeds/` directory. This file lists all seed JSON files in the latest timestamped folder, with table name, fetch time, and row count for AI reference.
+
+ - Note: `llms.txt` is not generated inside each timestamped subfolder, only in `supabase/seeds/`.
+
+ #### Example llms.txt
+ ```
+ # AI seed data index (generated by supatool)
+ # fetched_at: 2024-07-05T11:16:00Z
+ # folder: 20240705_1116_supatool
+ public.users: users_seed.json (2 rows) # User account table
+ public.orders: orders_seed.json (5 rows)
+ ```
 
- Comments appear in:
- - **index.md**: Human-readable file listings with descriptions (tables/views only)
- - **llms.txt**: AI-friendly format (`type:name:path:comment`)
- - **Generated SQL**: As `COMMENT ON` statements for full schema recreation
+ ## More Information
 
- **Example output:**
- ```markdown
- ## Tables
- - [users](tables/users.sql) - User account information and authentication data
- - [posts](tables/posts.sql) - User-generated content and blog posts
- ```
+ For full version history and detailed changes, see [CHANGELOG.md](./CHANGELOG.md).
@@ -4,15 +4,13 @@ exports.modelSchemaHelp = exports.helpText = void 0;
 // See: [src/bin/helptext.ts](./src/bin/helptext.ts) from project root
 // Help text (command section from README, English only)
 exports.helpText = `
- Supatool CLI - Supabase database schema extraction and TypeScript CRUD generation
+ Supatool CLI - Supabase schema extraction and TypeScript CRUD generation
 
 Usage:
 supatool <command> [options]
 
 Commands:
- extract Extract and categorize database objects from Supabase
- gen:schema-crud Generate CRUD code from supabase/schemas SQL files
- crud Generate CRUD code from Supabase type definitions
+ extract Extract database objects from Supabase
 gen:types Generate TypeScript types from model YAML
 gen:crud Generate CRUD TypeScript code from model YAML
 gen:docs Generate Markdown documentation from model YAML
@@ -20,83 +18,21 @@ Commands:
 gen:rls Generate RLS/security SQL from model YAML
 gen:all Generate all outputs from model YAML
 create Generate a template model YAML
+ crud Generate CRUD code from Supabase type definitions
+ sync Sync local and remote schemas
+ seed Export selected table data as AI-friendly seed JSON
 config:init Generate configuration template
 help Show help
 
- Extract Options:
+ Common Options:
 -c, --connection <string> Supabase connection string
- -o, --output-dir <path> Output directory (default: ./supabase/schemas)
- -t, --tables <pattern> Table pattern with wildcards (default: *)
- --tables-only Extract only table definitions
- --views-only Extract only view definitions
- --all Extract all DB objects (tables, views, RLS, functions, triggers, cron, types)
- --no-separate Output all objects in same directory
- --schema <schemas> Target schemas, comma-separated (default: public)
-
- Examples:
- # Set connection in .env.local (recommended)
- echo "SUPABASE_CONNECTION_STRING=postgresql://..." >> .env.local
-
- # Extract all database objects with AI-friendly index
- supatool extract --all -o supabase/schemas
- # Output:
- # supabase/schemas/index.md (Human-readable index with table/view comments)
- # supabase/schemas/llms.txt (AI-friendly structured data with comments)
- # supabase/schemas/tables/*.sql (Tables with comments)
- # supabase/schemas/views/*.sql (Views with comments)
- # supabase/schemas/rls/*.sql (RLS policies)
- # supabase/schemas/rpc/*.sql (Functions & triggers)
- # supabase/schemas/cron/*.sql (Cron jobs)
- # supabase/schemas/types/*.sql (Custom types)
-
- # Extract only tables and views (default)
- supatool extract -o supabase/schemas
-
- # Extract to single directory (legacy mode)
- supatool extract --no-separate -o supabase/schemas
-
- # Extract specific pattern
- supatool extract -t "user_*" -o ./user-tables
-
- # Extract from specific schemas (default: public)
- supatool extract --all --schema public,auth,extensions -o supabase/schemas
-
- # Alternative: specify connection directly
- supatool extract --all -c "postgresql://..." -o supabase/schemas
-
- # Complete database-first workflow
- echo "SUPABASE_CONNECTION_STRING=postgresql://..." >> .env.local
- supatool extract --all -o supabase/schemas
- supatool gen:schema-crud --include-views --react-query
-
- # Model-first workflow
- supatool create model.yaml
- supatool gen:all model.yaml
-
- # Legacy CRUD generation
- supatool crud
+ -o, --output-dir <path> Output directory
+ -t, --tables <pattern|path> Table pattern or YAML path
+ --schema <schemas> Target schemas (comma-separated)
+ --config <path> Configuration file path
+ -f, --force Force overwrite
 
- Database Comments:
- Supatool automatically extracts and includes database comments in generated files.
-
- To add comments to your database objects:
-
- # Table comments
- COMMENT ON TABLE users IS 'User account information';
-
- # View comments
- COMMENT ON VIEW user_profiles IS 'Combined user data with profile information';
-
- # Function comments
- COMMENT ON FUNCTION update_timestamp() IS 'Automatically updates the updated_at column';
-
- # Custom type comments
- COMMENT ON TYPE user_status IS 'Enumeration of possible user account statuses';
-
- Comments will appear in:
- - index.md: Human-readable list with descriptions (tables/views only)
- - llms.txt: AI-friendly format (type:name:path:comment)
- - Generated SQL files: As COMMENT statements
+ For details, see the documentation.
 `;
 // Model Schema Usage
 exports.modelSchemaHelp = `
@@ -18,6 +18,7 @@ const sqlGenerator_1 = require("../generator/sqlGenerator");
 const rlsGenerator_1 = require("../generator/rlsGenerator");
 const sync_1 = require("../sync");
 const definitionExtractor_1 = require("../sync/definitionExtractor");
+ const seedGenerator_1 = require("../sync/seedGenerator");
 const fs_1 = __importDefault(require("fs"));
 const path_1 = __importDefault(require("path"));
 const program = new commander_1.Command();
@@ -193,6 +194,73 @@ program
 .action(() => {
 console.log(helptext_1.helpText);
 });
+ // sync command
+ program
+ .command('sync')
+ .description('Sync local and remote schemas')
+ .option('-c, --connection <string>', 'Supabase connection string')
+ .option('-s, --schema-dir <path>', 'Local schema directory', './supabase/schemas')
+ .option('-t, --tables <pattern>', 'Table pattern (wildcards supported)', '*')
+ .option('-f, --force', 'Force overwrite (no confirmation)')
+ .option('--config <path>', 'Configuration file path')
+ .action(async (options) => {
+ const config = (0, sync_1.resolveConfig)({
+ connectionString: options.connection
+ }, options.config);
+ if (!config.connectionString) {
+ console.error('Connection string is required. Set it using one of:');
+ console.error('1. --connection option');
+ console.error('2. SUPABASE_CONNECTION_STRING environment variable');
+ console.error('3. DATABASE_URL environment variable');
+ console.error('4. supatool.config.json configuration file');
+ process.exit(1);
+ }
+ try {
+ await (0, sync_1.syncAllTables)({
+ connectionString: config.connectionString,
+ schemaDir: options.schemaDir,
+ tablePattern: options.tables,
+ force: options.force
+ });
+ }
+ catch (error) {
+ console.error('⚠️ Sync error:', error);
+ process.exit(1);
+ }
+ });
+ // seed command
+ program
+ .command('seed')
+ .description('Fetch data for the specified tables from the remote DB and generate AI-friendly seed JSON')
+ .option('-c, --connection <string>', 'Supabase connection string')
+ .option('-t, --tables <path>', 'YAML list of tables to fetch', 'tables.yaml')
+ .option('-o, --out <dir>', 'Output directory', 'supabase/seeds')
+ .option('--config <path>', 'Configuration file path')
+ .action(async (options) => {
+ // Resolve connection settings
+ const config = (0, sync_1.resolveConfig)({
+ connectionString: options.connection
+ }, options.config);
+ if (!config.connectionString) {
+ console.error('Connection string is required. Set it using one of:');
+ console.error('1. --connection option');
+ console.error('2. SUPABASE_CONNECTION_STRING environment variable');
+ console.error('3. DATABASE_URL environment variable');
+ console.error('4. supatool.config.json configuration file');
+ process.exit(1);
+ }
+ try {
+ await (0, seedGenerator_1.generateSeedsFromRemote)({
+ connectionString: config.connectionString,
+ tablesYamlPath: options.tables,
+ outputDir: options.out
+ });
+ }
+ catch (error) {
+ console.error('⚠️ Seed fetch error:', error);
+ process.exit(1);
+ }
+ });
 // If no subcommand is specified, show helpText only (do not call main)
 if (!process.argv.slice(2).length) {
 console.log(helptext_1.helpText);
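Both new commands resolve the connection string with the same precedence their error message lists. A minimal sketch of that precedence (the real logic lives in `sync_1.resolveConfig`; `resolveConnectionString` here is illustrative):

```javascript
// Sketch of the connection-string precedence the CLI documents:
// --connection option, then SUPABASE_CONNECTION_STRING, then DATABASE_URL,
// then a value from supatool.config.json. Hypothetical helper, not the real
// resolveConfig implementation.
function resolveConnectionString(cliOption, env, fileConfig) {
  return (
    cliOption ||
    env.SUPABASE_CONNECTION_STRING ||
    env.DATABASE_URL ||
    (fileConfig && fileConfig.connectionString) ||
    undefined
  );
}

// The CLI flag wins even when environment variables are set:
console.log(
  resolveConnectionString(
    "postgresql://cli",
    { DATABASE_URL: "postgresql://env" },
    null
  )
); // → postgresql://cli
```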
@@ -85,6 +85,8 @@ function generateCrudFromModel(model, outDir) {
 // Search function with filters
 code += `/** Fetch multiple rows with filters */\n`;
 code += `export async function select${capitalizedName}RowsWithFilters({ filters }: { filters: Filters }): Promise<${tableName}[]> {\n`;
+ code += ` // Guard against invalid filters\n`;
+ code += ` if (!filters || typeof filters !== 'object') return [];\n`;
 code += ` try {\n`;
 code += ` let query = supabase.from('${tableName}').select('*');\n`;
 code += ` \n`;
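The guard added above makes the generated function short-circuit to an empty result when `filters` is missing or not an object. A sketch of the generated function's behavior, with a hypothetical mock standing in for the Supabase query builder:

```javascript
// Sketch of the guard the generator now emits at the top of
// select<Name>RowsWithFilters. The query builder here is a mock for
// illustration, not the supabase-js client.
async function selectRowsWithFilters({ filters }, query) {
  // Guard emitted by the generator: bail out before touching the query.
  if (!filters || typeof filters !== "object") return [];
  for (const [key, value] of Object.entries(filters)) {
    query = query.eq(key, value);
  }
  const result = await query;
  return result.data || [];
}

// Minimal thenable stand-in for the query builder:
const mockQuery = {
  eq() { return this; },
  then(resolve) { resolve({ data: [{ id: 1 }] }); }
};

selectRowsWithFilters({ filters: null }, mockQuery).then(rows =>
  console.log(rows.length) // → 0
);
```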
package/dist/index.js CHANGED
@@ -82,36 +82,50 @@ function main() {
 const typeName = typeAliasDecl.name.text;
 const typeNode = typeAliasDecl.type;
 if (typeNode.kind === typescript_1.SyntaxKind.TypeLiteral && typeName === 'Database') {
- const schemaName = getSchemaName(typeNode);
- const schemaType = typeNode.members.find((member) => member.name && member.name.text === schemaName);
- if (schemaType && schemaType.type.kind === typescript_1.SyntaxKind.TypeLiteral) {
- const tablesAndViewsType = schemaType.type.members.filter((member) => member.name && (member.name.text === 'Tables' || member.name.text === 'Views'));
- return tablesAndViewsType.flatMap((tablesOrViewsType) => {
- if (tablesOrViewsType.type.kind === typescript_1.SyntaxKind.TypeLiteral) {
- return tablesOrViewsType.type.members.map((tableOrViewMember) => {
- const tableName = tableOrViewMember.name.text;
- const isView = tablesOrViewsType.name.text === 'Views';
- const rowType = tableOrViewMember.type.members.find((member) => member.name && member.name.text === 'Row');
- if (rowType && rowType.type.kind === typescript_1.SyntaxKind.TypeLiteral) {
- const fields = rowType.type.members.map((member) => {
- if (member.name && member.name.kind === typescript_1.SyntaxKind.Identifier) {
- const name = member.name.getText(sourceFile);
- const type = member.type ? member.type.getText(sourceFile) : 'unknown';
- return { name, type };
- }
- return { name: 'unknown', type: 'unknown' };
- });
- return { typeName: tableName, fields, isView };
- }
- return null;
- }).filter((type) => type !== null);
- }
- return [];
- });
- }
+ console.log('Found Database type, processing schemas...');
+ // Process all schemas in the Database type
+ return typeNode.members.flatMap((schemaMember) => {
+ if (schemaMember.name && schemaMember.type && schemaMember.type.kind === typescript_1.SyntaxKind.TypeLiteral) {
+ const schemaName = schemaMember.name.text;
+ console.log(`Processing schema: ${schemaName}`);
+ const schemaType = schemaMember.type;
+ // Process Tables and Views in the schema
+ const tablesAndViewsType = schemaType.members.filter((member) => member.name && (member.name.text === 'Tables' || member.name.text === 'Views'));
+ return tablesAndViewsType.flatMap((tablesOrViewsType) => {
+ if (tablesOrViewsType.type.kind === typescript_1.SyntaxKind.TypeLiteral) {
+ const tableCount = tablesOrViewsType.type.members.length;
+ console.log(`Found ${tableCount} ${tablesOrViewsType.name.text} in schema ${schemaName}`);
+ return tablesOrViewsType.type.members.map((tableOrViewMember) => {
+ const tableName = tableOrViewMember.name.text;
+ const isView = tablesOrViewsType.name.text === 'Views';
+ console.log(`Processing ${isView ? 'view' : 'table'}: ${tableName}`);
+ const rowType = tableOrViewMember.type.members.find((member) => member.name && member.name.text === 'Row');
+ if (rowType && rowType.type.kind === typescript_1.SyntaxKind.TypeLiteral) {
+ const fields = rowType.type.members.map((member) => {
+ if (member.name && member.name.kind === typescript_1.SyntaxKind.Identifier) {
+ const name = member.name.getText(sourceFile);
+ const type = member.type ? member.type.getText(sourceFile) : 'unknown';
+ return { name, type };
+ }
+ return { name: 'unknown', type: 'unknown' };
+ });
+ return { typeName: tableName, fields, isView, schema: schemaName };
+ }
+ return null;
+ }).filter((type) => type !== null);
+ }
+ return [];
+ });
+ }
+ return [];
+ });
 }
 return [];
 });
+ console.log(`Total types found: ${types.length}`);
+ types.forEach(type => {
+ console.log(`- ${type.schema}.${type.typeName} (${type.isView ? 'view' : 'table'}) with ${type.fields.length} fields`);
+ });
 // Show start of generation process
 console.log(`Import path: ${importPath}`);
 console.log(`Export path: ${exportPath}`);
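The nested walk above flattens every schema's `Tables` and `Views` into one tagged list. The same shape can be illustrated with plain objects standing in for the TypeScript AST (a sketch; `collectTypes` and the sample `db` are hypothetical):

```javascript
// Sketch of the schema walk above using plain objects instead of AST nodes:
// each schema's Tables and Views become entries tagged with schema and isView,
// mirroring the { typeName, fields, isView, schema } records the real code builds.
function collectTypes(database) {
  return Object.entries(database).flatMap(([schemaName, schema]) =>
    ["Tables", "Views"].flatMap(kind =>
      Object.entries(schema[kind] || {}).map(([name, def]) => ({
        typeName: name,
        fields: Object.keys(def.Row || {}),
        isView: kind === "Views",
        schema: schemaName
      }))
    )
  );
}

const db = {
  public: {
    Tables: { users: { Row: { id: "string", name: "string" } } },
    Views: { user_profiles: { Row: { id: "string" } } }
  }
};
console.log(collectTypes(db).length); // → 2
```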
@@ -129,23 +143,25 @@ function main() {
 return true;
 })
 .forEach(type => {
+ // Split output into folders per schema
+ const schemaFolder = path_1.default.join(crudFolderPath, type.schema);
+ if (!(0, fs_1.existsSync)(schemaFolder)) {
+ (0, fs_1.mkdirSync)(schemaFolder, { recursive: true });
+ }
 const fileName = toLowerCamelCase(type.typeName);
- const crudCode = crudTemplate(type.typeName, type.fields, type.isView);
- const filePath = crudFolderPath + `${fileName}.ts`;
+ // Adjust the import path when schema folders are used
+ const hasSchemaFolders = types.some(t => t.schema !== type.schema);
+ const crudCode = crudTemplate(type.typeName, type.fields, type.isView, type.schema, hasSchemaFolders);
+ const filePath = path_1.default.join(schemaFolder, `${fileName}.ts`);
 // Show in console
 if (type.isView) {
- console.log(`Generating select operations only for view: ${fileName}`);
+ console.log(`Generating select operations only for view: ${type.schema}/${fileName}`);
 }
 else {
- console.log(`Generating full CRUD operations for table: ${fileName}`);
- }
- // Create directory if it does not exist
- const dirPath = filePath.substring(0, filePath.lastIndexOf('/'));
- if (!(0, fs_1.existsSync)(dirPath)) {
- (0, fs_1.mkdirSync)(dirPath, { recursive: true });
+ console.log(`Generating full CRUD operations for table: ${type.schema}/${fileName}`);
 }
 (0, fs_1.writeFileSync)(filePath, crudCode);
- console.log(`Generated ${fileName}.ts`);
+ console.log(`Generated ${type.schema}/${fileName}.ts`);
 });
 console.log("CRUD operations have been generated.");
 }
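The `hasSchemaFolders` flag computed above drives how deep the generated file's relative import of the Supabase client must reach. A sketch of that logic in isolation (`clientImportPath` is a hypothetical helper extracted for illustration):

```javascript
// Sketch of the import-path adjustment: when extracted types span more than
// one schema, files land one directory deeper (crud-autogen/<schema>/<file>.ts),
// so the client import gains one extra "../". Hypothetical helper mirroring
// the inline logic above.
function clientImportPath(types, current) {
  const hasSchemaFolders = types.some(t => t.schema !== current.schema);
  return hasSchemaFolders ? "../../client" : "../client";
}

const types = [
  { typeName: "users", schema: "public" },
  { typeName: "jobs", schema: "auth" }
];
console.log(clientImportPath(types, types[0])); // → ../../client
```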
@@ -158,8 +174,8 @@ const toLowerCamelCase = (str) => {
 const toUpperCamelCase = (str) => {
 return str.split('_').map(word => word.charAt(0).toUpperCase() + word.slice(1)).join('');
 };
- // CRUD template
- const crudTemplate = (typeName, fields, isView) => {
+ // CRUD template body - elegant string generation
+ const crudTemplate = (typeName, fields, isView, schema, hasSchemaFolders) => {
 const upperCamelTypeName = toUpperCamelCase(typeName);
 const getByIdFunctionName = 'select' + upperCamelTypeName + 'RowById';
 const getByFiltersFunctionName = 'select' + upperCamelTypeName + 'RowsWithFilters';
@@ -168,233 +184,198 @@ const crudTemplate = (typeName, fields, isView) => {
168
184
  const updateFunctionName = 'update' + upperCamelTypeName + 'Row';
169
185
  const deleteFunctionName = 'delete' + upperCamelTypeName + 'Row';
170
186
  const idType = fields.find((field) => field.name === 'id')?.type || 'string';
171
- const hasIdColumn = idType !== undefined; // Check if 'id' column exists
172
- const exportHeaders = `// Supabase CRUD operations for ${typeName}
173
- // This file is automatically generated. Do not edit it directly.
174
- import { supabase } from "../client";
175
- import { Tables, TablesInsert, TablesUpdate } from "@shared/types";
176
-
177
- type ${typeName} = Tables<'${typeName}'>;
178
- type FilterTypesValue = string | number | boolean | null | Record<string, any>;
179
- type Filters = Record<string, FilterTypesValue | FilterTypesValue[]>;
180
- `;
181
- const exportSelectById = `
182
- // Read single row using id
183
- export async function ${getByIdFunctionName}({ id }: { id: ${idType} }): Promise<${typeName} | null> {
184
- if (!id) {
185
- throw new Error('ID is required');
186
- }
187
- try {
188
- const result = await supabase
189
- .from('${typeName.toLowerCase()}')
190
- .select('*')
191
- .eq('id', id)
192
- .single();
193
-
194
- if (result.error) {
195
- if (result.error.code === 'PGRST116') {
196
- return null;
197
- }
198
- throw new Error(\`Failed to fetch ${typeName}: \${result.error.message}\`);
199
- }
200
-
201
- return result.data as ${typeName};
202
- } catch (error) {
203
- console.error('Error in ${getByIdFunctionName}:', error);
204
- throw error;
205
- }
206
- }
207
- `;
208
- const exportSelectQueries = `
209
- // Function to apply filters to a query
210
- function applyFilters(query: any, filters: Filters): any {
211
- for (const [key, value] of Object.entries(filters)) {
212
- if (Array.isArray(value)) {
213
- query = query.in(key, value); // Use 'in' for array values
214
- } else if (typeof value === 'object' && value !== null) {
215
- for (const [operator, val] of Object.entries(value)) {
216
- switch (operator) {
217
- case 'eq':
218
- query = query.eq(key, val);
219
- break;
220
- case 'neq':
221
- query = query.neq(key, val);
222
- break;
223
- case 'like':
224
- query = query.like(key, val);
225
- break;
226
- case 'ilike':
227
- query = query.ilike(key, val);
228
- break;
229
- case 'lt':
230
- query = query.lt(key, val);
231
- break;
232
- case 'lte':
233
- query = query.lte(key, val);
234
- break;
235
- case 'gte':
236
- query = query.gte(key, val);
237
- break;
238
- case 'gt':
239
- query = query.gt(key, val);
240
- break;
241
- case 'contains':
242
- query = query.contains(key, val);
243
- break;
244
- case 'contains_any':
245
- query = query.contains_any(key, val);
246
- break;
247
- case 'contains_all':
248
- query = query.contains_all(key, val);
249
- break;
250
- // Add more operators as needed
251
- default:
252
- throw new Error('Unsupported operator: ' + operator);
253
- }
254
- }
255
- } else {
256
- query = query.eq(key, value); // Default to 'eq' for simple values
257
- }
258
- }
259
- return query;
260
- }
261
-
262
- // Read multiple rows with dynamic filters
263
- export async function ${getByFiltersFunctionName}({ filters }: { filters: Filters }): Promise<${typeName}[]> {
264
- try {
265
- let query = supabase.from('${typeName.toLowerCase()}').select('*');
266
- query = applyFilters(query, filters);
267
-
268
- const result = await query;
269
-
270
- if (result.error) {
271
- throw new Error(\`Failed to fetch ${typeName}: \${result.error.message}\`);
272
- }
273
-
274
- return (result.data as unknown as ${typeName}[]) || [];
275
- } catch (error) {
276
- console.error('Error in ${getByFiltersFunctionName}:', error);
277
- throw error;
278
- }
279
- }
280
-
281
- // Read a single row with dynamic filters
282
- export async function ${getSingleByFiltersFunctionName}({ filters }: { filters: Filters }): Promise<${typeName} | null> {
283
- try {
284
- let query = supabase.from('${typeName.toLowerCase()}').select('*');
285
- query = applyFilters(query, filters).single();
286
-
287
- const result = await query;
288
-
289
- if (result.error) {
290
- if (result.error.code === 'PGRST116') {
291
- return null;
292
- }
293
- throw new Error(\`Failed to fetch ${typeName}: \${result.error.message}\`);
294
- }
295
-
296
- return result.data as unknown as ${typeName};
297
- } catch (error) {
298
- console.error('Error in ${getSingleByFiltersFunctionName}:', error);
299
- throw error;
300
- }
301
- }
302
- `;
303
- const exportInsertOperation = isView ? '' :
304
- `
305
- // Create Function
306
- export async function ${createFunctionName}({ data }: { data: TablesInsert<'${typeName}'> }): Promise<${typeName}> {
307
- if (!data) {
308
- throw new Error('Data is required for creation');
- }
- try {
- const result = await supabase
- .from('${typeName}')
- .insert([data])
- .select()
- .single();
-
- if (result.error) {
- throw new Error(\`Failed to create ${typeName}: \${result.error.message}\`);
- }
-
- if (!result.data) {
- throw new Error('No data returned after creation');
- }
-
- return result.data as ${typeName};
- } catch (error) {
- console.error('Error in ${createFunctionName}:', error);
- throw error;
- }
- }
- `;
- const exportUpdateOperation = isView ? '' :
- `
- // Update Function
- export async function ${updateFunctionName}({ id, data }: { id: ${idType}; data: TablesUpdate<'${typeName}'> }): Promise<${typeName}> {
- if (!id) {
- throw new Error('ID is required for update');
- }
- if (!data || Object.keys(data).length === 0) {
- throw new Error('Update data is required');
- }
- try {
- const result = await supabase
- .from('${typeName.toLowerCase()}')
- .update(data)
- .eq('id', id)
- .select()
- .single();
-
- if (result.error) {
- if (result.error.code === 'PGRST116') {
- throw new Error(\`${typeName} with ID \${id} not found\`);
- }
- throw new Error(\`Failed to update ${typeName}: \${result.error.message}\`);
- }
-
- if (!result.data) {
- throw new Error(\`${typeName} with ID \${id} not found\`);
- }
-
- return result.data as ${typeName};
- } catch (error) {
- console.error('Error in ${updateFunctionName}:', error);
- throw error;
- }
- }
- `;
- const exportDeleteOperation = isView ? '' :
- `
- // Delete Function
- export async function ${deleteFunctionName}({ id }: { id: ${idType} }): Promise<boolean> {
- if (!id) {
- throw new Error('ID is required for deletion');
- }
- try {
- const result = await supabase
- .from('${typeName.toLowerCase()}')
- .delete()
- .eq('id', id);
-
- if (result.error) {
- throw new Error(\`Failed to delete ${typeName}: \${result.error.message}\`);
- }
-
- return true;
- } catch (error) {
- console.error('Error in ${deleteFunctionName}:', error);
- throw error;
- }
- }
- `;
- // Export all functions
- const exportAll = `
- // All functions are exported individually above
- `;
- // Return all the code
- return exportHeaders + exportSelectQueries + (hasIdColumn ? exportSelectById : '') + exportInsertOperation + exportUpdateOperation + exportDeleteOperation + exportAll;
+ // Adjust the import path dynamically
+ const importPath = hasSchemaFolders ? '../../client' : '../client';
+ // Header section
+ const header = [
+ `// Supabase CRUD operations for ${typeName} (${schema} schema)`,
+ '// Auto-generated file',
+ `import { supabase } from "${importPath}";`,
+ 'import { Tables, TablesInsert, TablesUpdate } from "@shared/types";',
+ '',
+ `type ${typeName} = Tables<'${typeName}'>;`,
+ 'type FilterTypesValue = string | number | boolean | null | Record<string, any>;',
+ 'type Filters = Record<string, FilterTypesValue | FilterTypesValue[]>;',
+ ''
+ ].join('\n');
+ // Filter application function
+ const filterFunction = [
+ '/**',
+ ' * Apply filters to a query',
+ ' */',
+ 'function applyFilters(query: any, filters: Filters): any {',
+ ' for (const [key, value] of Object.entries(filters)) {',
+ ' if (Array.isArray(value)) {',
+ ' query = query.in(key, value);',
+ ' } else if (typeof value === "object" && value !== null) {',
+ ' for (const [operator, val] of Object.entries(value)) {',
+ ' switch (operator) {',
+ ' case "eq": query = query.eq(key, val); break;',
+ ' case "neq": query = query.neq(key, val); break;',
+ ' case "like": query = query.like(key, val); break;',
+ ' case "ilike": query = query.ilike(key, val); break;',
+ ' case "lt": query = query.lt(key, val); break;',
+ ' case "lte": query = query.lte(key, val); break;',
+ ' case "gte": query = query.gte(key, val); break;',
+ ' case "gt": query = query.gt(key, val); break;',
+ ' case "contains": query = query.contains(key, val); break;',
+ ' case "contains_any": query = query.contains_any(key, val); break;',
+ ' case "contains_all": query = query.contains_all(key, val); break;',
+ ' default: throw new Error("Unsupported operator: " + operator);',
+ ' }',
+ ' }',
+ ' } else {',
+ ' query = query.eq(key, value);',
+ ' }',
+ ' }',
+ ' return query;',
+ '}',
+ ''
+ ].join('\n');
+ // Fetch a single row by ID
+ const selectById = [
+ '/**',
+ ' * Fetch a single row by ID',
+ ' */',
+ `export async function ${getByIdFunctionName}({ id }: { id: ${idType} }): Promise<${typeName} | null> {`,
+ ' if (!id) throw new Error("ID is required");',
+ ' try {',
+ ' const result = await supabase',
+ ` .schema("${schema}")`,
+ ` .from("${typeName.toLowerCase()}")`,
+ ' .select("*")',
+ ' .eq("id", id)',
+ ' .single();',
+ ' if (result.error) {',
+ ' if (result.error.code === "PGRST116") return null;',
+ ` throw new Error(\`Failed to fetch ${typeName}: \${result.error.message}\`);`,
+ ' }',
+ ` return result.data as ${typeName};`,
+ ' } catch (error) {',
+ ` console.error("Error in ${getByIdFunctionName}:", error);`,
+ ' throw error;',
+ ' }',
+ '}',
+ ''
+ ].join('\n');
+ // Fetch multiple rows by filters
+ const selectMultiple = [
+ '/**',
+ ' * Fetch multiple rows by filters',
+ ' */',
+ `export async function ${getByFiltersFunctionName}({ filters }: { filters: Filters }): Promise<${typeName}[]> {`,
+ ' if (!filters || typeof filters !== "object") return [];',
+ ' try {',
+ ` let query = supabase.schema("${schema}").from("${typeName.toLowerCase()}").select("*");`,
+ ' query = applyFilters(query, filters);',
+ ' const result = await query;',
+ ` if (result.error) throw new Error(\`Failed to fetch ${typeName}: \${result.error.message}\`);`,
+ ` return (result.data as unknown as ${typeName}[]) || [];`,
+ ' } catch (error) {',
+ ` console.error("Error in ${getByFiltersFunctionName}:", error);`,
+ ' throw error;',
+ ' }',
+ '}',
+ ''
+ ].join('\n');
+ // Fetch a single row by filters
+ const selectSingle = [
+ '/**',
+ ' * Fetch a single row by filters',
+ ' */',
+ `export async function ${getSingleByFiltersFunctionName}({ filters }: { filters: Filters }): Promise<${typeName} | null> {`,
+ ' if (!filters || typeof filters !== "object") return null;',
+ ' try {',
+ ` let query = supabase.schema("${schema}").from("${typeName.toLowerCase()}").select("*");`,
+ ' query = applyFilters(query, filters).single();',
+ ' const result = await query;',
+ ' if (result.error) {',
+ ' if (result.error.code === "PGRST116") return null;',
+ ` throw new Error(\`Failed to fetch ${typeName}: \${result.error.message}\`);`,
+ ' }',
+ ` return result.data as unknown as ${typeName};`,
+ ' } catch (error) {',
+ ` console.error("Error in ${getSingleByFiltersFunctionName}:", error);`,
+ ' throw error;',
+ ' }',
+ '}',
+ ''
+ ].join('\n');
+ // Insert (skipped for views)
+ const insertOperation = isView ? '' : [
+ '/**',
+ ' * Insert',
+ ' */',
+ `export async function ${createFunctionName}({ data }: { data: TablesInsert<"${typeName}"> }): Promise<${typeName}> {`,
+ ' if (!data) throw new Error("Data is required for creation");',
+ ' try {',
+ ' const result = await supabase',
+ ` .schema("${schema}")`,
+ ` .from("${typeName.toLowerCase()}")`,
+ ' .insert([data])',
+ ' .select()',
+ ' .single();',
+ ` if (result.error) throw new Error(\`Failed to create ${typeName}: \${result.error.message}\`);`,
+ ' if (!result.data) throw new Error("No data returned after creation");',
+ ` return result.data as ${typeName};`,
+ ' } catch (error) {',
+ ` console.error("Error in ${createFunctionName}:", error);`,
+ ' throw error;',
+ ' }',
+ '}',
+ ''
+ ].join('\n');
+ // Update (skipped for views)
+ const updateOperation = isView ? '' : [
+ '/**',
+ ' * Update',
+ ' */',
+ `export async function ${updateFunctionName}({ id, data }: { id: ${idType}; data: TablesUpdate<"${typeName}"> }): Promise<${typeName}> {`,
+ ' if (!id) throw new Error("ID is required for update");',
+ ' if (!data || Object.keys(data).length === 0) throw new Error("Update data is required");',
+ ' try {',
+ ' const result = await supabase',
+ ` .schema("${schema}")`,
+ ` .from("${typeName.toLowerCase()}")`,
+ ' .update(data)',
+ ' .eq("id", id)',
+ ' .select()',
+ ' .single();',
+ ' if (result.error) {',
+ ` if (result.error.code === "PGRST116") throw new Error(\`${typeName} with ID \${id} not found\`);`,
+ ` throw new Error(\`Failed to update ${typeName}: \${result.error.message}\`);`,
+ ' }',
+ ` if (!result.data) throw new Error(\`${typeName} with ID \${id} not found\`);`,
+ ` return result.data as ${typeName};`,
+ ' } catch (error) {',
+ ` console.error("Error in ${updateFunctionName}:", error);`,
+ ' throw error;',
+ ' }',
+ '}',
+ ''
+ ].join('\n');
+ // Delete (skipped for views)
+ const deleteOperation = isView ? '' : [
+ '/**',
+ ' * Delete',
+ ' */',
+ `export async function ${deleteFunctionName}({ id }: { id: ${idType} }): Promise<boolean> {`,
+ ' if (!id) throw new Error("ID is required for deletion");',
+ ' try {',
+ ' const result = await supabase',
+ ` .schema("${schema}")`,
+ ` .from("${typeName.toLowerCase()}")`,
+ ' .delete()',
+ ' .eq("id", id);',
+ ` if (result.error) throw new Error(\`Failed to delete ${typeName}: \${result.error.message}\`);`,
+ ' return true;',
+ ' } catch (error) {',
+ ` console.error("Error in ${deleteFunctionName}:", error);`,
+ ' throw error;',
+ ' }',
+ '}',
+ ''
+ ].join('\n');
+ // Concatenate all sections
+ return header + filterFunction + selectById + selectMultiple + selectSingle + insertOperation + updateOperation + deleteOperation;
  };
- // console.log(crudFolderPath);
- // console.log(types);
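The generated `applyFilters` above maps a plain filter object onto chained query-builder calls: arrays become `.in()`, operator objects dispatch to `.eq()`/`.gte()`/etc., and scalars default to `.eq()`. A minimal runnable sketch of that dispatch logic, using a hypothetical mock query builder that merely records calls (not the real Supabase client, which chains the same method names):

```javascript
// Simplified version of the generated applyFilters: dispatch each filter
// entry to the matching query-builder method.
function applyFilters(query, filters) {
  for (const [key, value] of Object.entries(filters)) {
    if (Array.isArray(value)) {
      query = query.in(key, value);                 // array -> IN (...)
    } else if (typeof value === 'object' && value !== null) {
      for (const [operator, val] of Object.entries(value)) {
        query = query[operator](key, val);          // { gte: x } -> .gte(key, x)
      }
    } else {
      query = query.eq(key, value);                 // scalar -> equality
    }
  }
  return query;
}

// Hypothetical mock builder: records every (method, key, value) call.
function mockQuery() {
  const target = { calls: [] };
  const q = new Proxy(target, {
    get(t, prop) {
      if (prop === 'calls') return t.calls;
      return (key, val) => { t.calls.push([prop, key, val]); return q; };
    }
  });
  return q;
}

const q = applyFilters(mockQuery(), {
  status: 'active',
  id: [1, 2, 3],
  created_at: { gte: '2024-01-01' }
});
console.log(q.calls);
// → [['eq','status','active'], ['in','id',[1,2,3]], ['gte','created_at','2024-01-01']]
```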
@@ -1068,6 +1068,87 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
  */
  async function extractDefinitions(options) {
  const { connectionString, outputDir, separateDirectories = true, tablesOnly = false, viewsOnly = false, all = false, tablePattern = '*', force = false, schemas = ['public'] } = options;
+ // Disable Node.js SSL certificate verification
+ process.env.NODE_TLS_REJECT_UNAUTHORIZED = '0';
+ // Validate the connection string
+ if (!connectionString) {
+ throw new Error('No connection string configured. Set one via one of the following:\n' +
+ '1. the --connection option\n' +
+ '2. the SUPABASE_CONNECTION_STRING environment variable\n' +
+ '3. the DATABASE_URL environment variable\n' +
+ '4. a supatool.config.json configuration file');
+ }
+ // Validate the connection string format
+ if (!connectionString.startsWith('postgresql://') && !connectionString.startsWith('postgres://')) {
+ throw new Error(`Invalid connection string format: ${connectionString}\n` +
+ 'Expected format: postgresql://username:password@host:port/database');
+ }
+ // URL-encode the password portion
+ let encodedConnectionString = connectionString;
+ console.log('🔍 Original connection string:', connectionString);
+ try {
+ // Special handling when the password itself contains "@"
+ if (connectionString.includes('@') && connectionString.split('@').length > 2) {
+ console.log('⚠️ Password contains "@", applying special handling');
+ // Use the last "@" as the separator
+ const parts = connectionString.split('@');
+ const lastPart = parts.pop(); // trailing part (host:port/database)
+ const firstParts = parts.join('@'); // leading part (postgresql://user:password)
+ console.log(' Split result:');
+ console.log(' Leading part:', firstParts);
+ console.log(' Trailing part:', lastPart);
+ // Encode the password portion
+ const colonIndex = firstParts.lastIndexOf(':');
+ if (colonIndex > 0) {
+ const protocolAndUser = firstParts.substring(0, colonIndex);
+ const password = firstParts.substring(colonIndex + 1);
+ const encodedPassword = encodeURIComponent(password);
+ encodedConnectionString = `${protocolAndUser}:${encodedPassword}@${lastPart}`;
+ console.log(' Encoding result:');
+ console.log(' Protocol + user:', protocolAndUser);
+ console.log(' Original password:', password);
+ console.log(' Encoded password:', encodedPassword);
+ console.log(' Final connection string:', encodedConnectionString);
+ }
+ }
+ else {
+ console.log('✅ Performing standard URL parsing');
+ // Standard URL parsing
+ const url = new URL(connectionString);
+ // Handle usernames that contain dots
+ if (url.username && url.username.includes('.')) {
+ console.log(`Username (contains dots): ${url.username}`);
+ }
+ if (url.password) {
+ // Encode only the password portion
+ const encodedPassword = encodeURIComponent(url.password);
+ url.password = encodedPassword;
+ encodedConnectionString = url.toString();
+ console.log(' Encoded password:', encodedPassword);
+ }
+ }
+ // Add SSL settings for the Supabase connection
+ if (!encodedConnectionString.includes('sslmode=')) {
+ const separator = encodedConnectionString.includes('?') ? '&' : '?';
+ encodedConnectionString += `${separator}sslmode=require`;
+ console.log(' Added SSL setting:', encodedConnectionString);
+ }
+ // Show debug info (with the password masked)
+ const debugUrl = new URL(encodedConnectionString);
+ const maskedPassword = debugUrl.password ? '*'.repeat(debugUrl.password.length) : '';
+ debugUrl.password = maskedPassword;
+ console.log('🔍 Connection info:');
+ console.log(` Host: ${debugUrl.hostname}`);
+ console.log(` Port: ${debugUrl.port}`);
+ console.log(` Database: ${debugUrl.pathname.slice(1)}`);
+ console.log(` User: ${debugUrl.username}`);
+ console.log(` SSL: ${debugUrl.searchParams.get('sslmode') || 'require'}`);
+ }
+ catch (error) {
+ // Fall back to the original string when URL parsing fails
+ console.warn('Failed to parse the connection string as a URL. It may contain special characters.');
+ console.warn('Error details:', error instanceof Error ? error.message : String(error));
+ }
  const fs = await Promise.resolve().then(() => __importStar(require('fs')));
  const readline = await Promise.resolve().then(() => __importStar(require('readline')));
  // Confirm before overwriting
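The "@ in password" branch above splits on the *last* `@` so the host segment survives, then percent-encodes only the password. A standalone sketch of that logic, with hypothetical sample credentials (note it assumes the password contains no `:`, matching the `lastIndexOf(':')` approach in the code):

```javascript
// Percent-encode the password in a Postgres connection string whose
// password contains "@". Split on the LAST "@" to keep the host intact.
function encodePasswordInConnectionString(connectionString) {
  const parts = connectionString.split('@');
  const hostPart = parts.pop();                 // host:port/database
  const credPart = parts.join('@');             // postgresql://user:p@ssw0rd
  const colonIndex = credPart.lastIndexOf(':'); // colon before the password
  const prefix = credPart.substring(0, colonIndex);
  const password = credPart.substring(colonIndex + 1);
  return `${prefix}:${encodeURIComponent(password)}@${hostPart}`;
}

const encoded = encodePasswordInConnectionString(
  'postgresql://user:p@ssw0rd@db.example.com:5432/postgres'
);
console.log(encoded);
// → postgresql://user:p%40ssw0rd@db.example.com:5432/postgres
```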
@@ -1091,8 +1172,18 @@ async function extractDefinitions(options) {
  // Dynamically import the spinner
  const { default: ora } = await Promise.resolve().then(() => __importStar(require('ora')));
  const spinner = ora('Connecting to database...').start();
- const client = new pg_1.Client({ connectionString });
+ const client = new pg_1.Client({
+ connectionString: encodedConnectionString,
+ ssl: {
+ rejectUnauthorized: false,
+ ca: undefined
+ }
+ });
  try {
+ // Debug info before connecting
+ console.log('🔧 Connection settings:');
+ console.log(` SSL: rejectUnauthorized=false`);
+ console.log(` Connection string length: ${encodedConnectionString.length}`);
  await client.connect();
  spinner.text = 'Connected to database';
  let allDefinitions = [];
@@ -0,0 +1,107 @@
+ "use strict";
+ var __importDefault = (this && this.__importDefault) || function (mod) {
+ return (mod && mod.__esModule) ? mod : { "default": mod };
+ };
+ Object.defineProperty(exports, "__esModule", { value: true });
+ exports.generateSeedsFromRemote = generateSeedsFromRemote;
+ const pg_1 = require("pg");
+ const fs_1 = __importDefault(require("fs"));
+ const path_1 = __importDefault(require("path"));
+ const js_yaml_1 = __importDefault(require("js-yaml"));
+ /**
+ * Fetch data for the specified tables from the remote DB and generate seed JSON for AI use
+ * @param options SeedGenOptions
+ */
+ async function generateSeedsFromRemote(options) {
+ // Load tables.yaml
+ const yamlObj = js_yaml_1.default.load(fs_1.default.readFileSync(options.tablesYamlPath, 'utf8'));
+ if (!yamlObj || !Array.isArray(yamlObj.tables)) {
+ throw new Error('Invalid tables.yaml format. Specify tables as tables: [ ... ]');
+ }
+ const tables = yamlObj.tables;
+ // Build a timestamped subdirectory name (e.g. 20250705_1116_supatool)
+ const now = new Date();
+ const y = now.getFullYear();
+ const m = String(now.getMonth() + 1).padStart(2, '0');
+ const d = String(now.getDate()).padStart(2, '0');
+ const hh = String(now.getHours()).padStart(2, '0');
+ const mm = String(now.getMinutes()).padStart(2, '0');
+ const folderName = `${y}${m}${d}_${hh}${mm}_supatool`;
+ const outDir = path_1.default.join(options.outputDir, folderName);
+ // Create the output directory
+ if (!fs_1.default.existsSync(outDir)) {
+ fs_1.default.mkdirSync(outDir, { recursive: true });
+ }
+ // Connect to the database
+ const client = new pg_1.Client({ connectionString: options.connectionString });
+ await client.connect();
+ let processedCount = 0;
+ for (const tableFullName of tables) {
+ // No schema specified -> default to public
+ let schema = 'public';
+ let table = tableFullName;
+ if (tableFullName.includes('.')) {
+ [schema, table] = tableFullName.split('.');
+ }
+ // Fetch the data
+ const res = await client.query(`SELECT * FROM "${schema}"."${table}"`);
+ const rows = res.rows;
+ // File name
+ const fileName = `${table}_seed.json`;
+ const filePath = path_1.default.join(outDir, fileName);
+ // Output JSON
+ const json = {
+ table: `${schema}.${table}`,
+ fetched_at: now.toISOString(),
+ fetched_by: 'supatool v0.3.5',
+ note: 'This data is a snapshot of the remote DB at the above time. For AI coding reference. You can update it by running the update command again.',
+ rows
+ };
+ fs_1.default.writeFileSync(filePath, JSON.stringify(json, null, 2), 'utf8');
+ processedCount++;
+ }
+ await client.end();
+ // Write the llms.txt index (overwritten directly under supabase/seeds on every run)
+ const files = fs_1.default.readdirSync(outDir);
+ const seedFiles = files.filter(f => f.endsWith('_seed.json'));
+ let llmsTxt = `# AI seed data index (generated by supatool)\n`;
+ llmsTxt += `# fetched_at: ${now.toISOString()}\n`;
+ llmsTxt += `# folder: ${folderName}\n`;
+ for (const basename of seedFiles) {
+ const file = path_1.default.join(outDir, basename);
+ const content = JSON.parse(fs_1.default.readFileSync(file, 'utf8'));
+ // Table comment (empty when none exists)
+ let tableComment = '';
+ try {
+ const [schema, table] = content.table.split('.');
+ const commentRes = await getTableComment(options.connectionString, schema, table);
+ if (commentRes)
+ tableComment = commentRes;
+ }
+ catch { }
+ llmsTxt += `${content.table}: ${basename} (${Array.isArray(content.rows) ? content.rows.length : 0} rows)`;
+ if (tableComment)
+ llmsTxt += ` # ${tableComment}`;
+ llmsTxt += `\n`;
+ }
+ const llmsPath = path_1.default.join(options.outputDir, 'llms.txt');
+ fs_1.default.writeFileSync(llmsPath, llmsTxt, 'utf8');
+ // Print a summary
+ console.log(`Seed export completed. Processed tables: ${processedCount}`);
+ console.log(`llms.txt index written to: ${llmsPath}`);
+ }
+ // Utility to fetch a table comment
+ async function getTableComment(connectionString, schema, table) {
+ const client = new pg_1.Client({ connectionString });
+ await client.connect();
+ try {
+ const res = await client.query(`SELECT obj_description(c.oid) as comment FROM pg_class c JOIN pg_namespace n ON c.relnamespace = n.oid WHERE c.relname = $1 AND n.nspname = $2 AND c.relkind = 'r'`, [table, schema]);
+ if (res.rows.length > 0 && res.rows[0].comment) {
+ return res.rows[0].comment;
+ }
+ return null;
+ }
+ finally {
+ await client.end();
+ }
+ }
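The seed generator writes each run into a timestamped subfolder such as `20250705_1116_supatool`. A minimal sketch of that naming scheme, extracted into a helper (the function name is illustrative, not part of the package):

```javascript
// Build the timestamped seed folder name: zero-padded local date and
// time (YYYYMMDD_HHMM) plus a fixed suffix.
function seedFolderName(now, suffix = 'supatool') {
  const pad = (n) => String(n).padStart(2, '0');
  return `${now.getFullYear()}${pad(now.getMonth() + 1)}${pad(now.getDate())}` +
    `_${pad(now.getHours())}${pad(now.getMinutes())}_${suffix}`;
}

// Month is 0-indexed in the Date constructor, so 6 means July.
console.log(seedFolderName(new Date(2025, 6, 5, 11, 16)));
// → 20250705_1116_supatool
```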
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "supatool",
- "version": "0.3.4",
+ "version": "0.3.6",
  "description": "A CLI tool for Supabase schema extraction and TypeScript CRUD generation with declarative database schema support.",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",