supatool 0.3.3 → 0.3.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -2,6 +2,14 @@
2
2
 
3
3
  A CLI tool that automatically generates TypeScript CRUD code from Supabase type definitions.
4
4
 
5
+ ## Features
6
+ - Extract and categorize all database objects (tables, views, RLS, functions, triggers) from Supabase
7
+ - Generate TypeScript CRUD functions from Supabase types or model YAML
8
+ - Output human-readable and AI-friendly schema/index files
9
+ - Flexible environment configuration and batch processing
10
+ - Simple CLI with help and documentation
11
+
12
+ > For all new features and version history, see [CHANGELOG.md](./CHANGELOG.md).
5
13
 
6
14
  ## Install
7
15
 
@@ -15,7 +23,7 @@ pnpm add -g supatool
15
23
 
16
24
  ## Usage
17
25
 
18
- ### 1. Extract Database Schema (NEW in v0.3.0)
26
+ ### Extract Database Schema
19
27
 
20
28
  Extract and categorize all database objects from your Supabase project:
21
29
 
@@ -50,7 +58,7 @@ supabase/schemas/
50
58
  └── rpc/ # Functions & triggers
51
59
  ```
52
60
 
53
- ### 2. Generate CRUD Code
61
+ ### Generate CRUD Code
54
62
 
55
63
  Generate TypeScript CRUD functions from Supabase types:
56
64
 
@@ -67,7 +75,7 @@ supatool gen:crud model.yaml
67
75
 
68
76
  **Output:** `src/integrations/supabase/crud-autogen/`
69
77
 
70
- ### 3. Environment Configuration
78
+ ### Environment Configuration
71
79
 
72
80
  For security and convenience, set your connection string in environment variables:
73
81
 
@@ -88,7 +96,7 @@ supatool extract --all -o supabase/schemas
88
96
  - `DATABASE_URL` (fallback)
89
97
  - `SUPATOOL_MAX_CONCURRENT` (max concurrent table processing, default: 20, max: 50)
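The documented precedence (connection string first from `SUPABASE_CONNECTION_STRING`, then `DATABASE_URL` as fallback) can be sketched in a few lines. This is an illustration of the lookup order described above, not the actual supatool source:

```typescript
// Illustrative only: variable names come from the README above.
// SUPABASE_CONNECTION_STRING takes precedence; DATABASE_URL is the fallback.
function resolveConnectionString(env: Record<string, string | undefined>): string | undefined {
  return env.SUPABASE_CONNECTION_STRING ?? env.DATABASE_URL;
}
```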
90
98
 
91
- ### 4. Additional Commands
99
+ ### Additional Commands
92
100
 
93
101
  ```bash
94
102
  # Show help for all commands
@@ -98,16 +106,12 @@ supatool help
98
106
  supatool crud -i path/to/types -o path/to/output
99
107
  ```
100
108
 
101
- ## Note: Supabase Client Requirement
109
+ ## Database Comments
102
110
 
103
- The generated CRUD code assumes that a Supabase client is defined in ../client.ts (relative to the export folder).
104
- Example:
111
+ Supatool automatically extracts and includes PostgreSQL comments in all generated files. Comments enhance documentation and AI understanding of your schema.
105
112
 
106
- ```ts
107
- // src/integrations/supabase/client.ts
108
- import { createClient } from '@supabase/supabase-js'
109
- export const supabase = createClient('YOUR_SUPABASE_URL', 'YOUR_SUPABASE_ANON_KEY')
110
- ```
113
+ - Table, view, function, and type comments are included in generated SQL and documentation.
114
+ - AI-friendly index files (llms.txt) and Markdown index (index.md) include comments for better context.
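The llms.txt entries use a simple `type:name:path:comment` layout. A minimal sketch of such a formatter (illustrative; not the actual generator code):

```typescript
// Builds one llms.txt entry in the "type:name:path:comment" layout.
// An empty comment still leaves the trailing colon in place.
function llmsEntry(type: string, name: string, path: string, comment = ''): string {
  return `${type}:${name}:${path}:${comment}`;
}
```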
111
115
 
112
116
  ## VSCode/Cursor Integration
113
117
 
@@ -271,93 +275,56 @@ supatool extract --all --schema public,auth,extensions -o supabase/schemas
271
275
  supatool extract --all -c "postgresql://..." -o supabase/schemas
272
276
  ```
273
277
 
274
- ## New Features (v0.3.0)
275
-
276
- - **🔍 Schema Extraction**: Extract and categorize all database objects (tables, views, RLS, functions, triggers)
277
- - **📋 Supabase Declarative Schema**: Fully compliant with [Supabase's declarative database schemas](https://supabase.com/docs/guides/local-development/declarative-database-schemas) workflow
278
- - **🤖 AI-Friendly Index**: Auto-generated index.md and llms.txt files for better AI understanding of schema structure
279
- - **💬 Comment Support**: Automatically extracts and includes database comments in generated files
280
- - **📁 Organized Output**: Separate directories for different object types with flexible organization options
281
- - **🎯 Pattern Matching**: Extract specific tables/views using wildcard patterns
282
- - **👁️ View Support**: Enhanced CRUD generation with SELECT-only operations for database views
283
- - **⚛️ React Query Integration**: Generate modern React hooks for data fetching
284
- - **🔧 Flexible Workflows**: Support both database-first and model-first development approaches
285
-
286
- ## Changelog
287
-
288
- ### v0.3.3
289
-
290
- - **ENHANCED**: Improved SQL comment placement (moved to end of each SQL statement)
291
- - **ENHANCED**: Unified comment format for tables, views, functions, and custom types
292
- - **FIXED**: Preserved view `security_invoker` settings
293
-
294
- ### v0.3.2
295
-
296
- - **ENHANCED**: Adjust for extensions(vector, geometry etc.)
297
- - **FIXED**: USER-DEFINED column types are now rendered with full type definitions (e.g. `vector(1536)`, `geometry(Point,4326)`).
298
- - **ADDED**: `FOREIGN KEY` constraints are now included as `CONSTRAINT ... FOREIGN KEY ... REFERENCES ...` inside generated `CREATE TABLE` statements.
299
-
300
- ### v0.3.0
301
-
302
- **NEW Features:**
303
- - **NEW**: `extract` command for database schema extraction
304
- - **NEW**: Full compliance with Supabase declarative database schemas workflow
305
- - **NEW**: AI-friendly index.md and llms.txt generation for better schema understanding
306
- - **NEW**: Database comment extraction and integration
307
- - **NEW**: Organized directory structure (tables/, views/, rls/, rpc/)
308
- - **NEW**: Pattern matching for selective extraction
309
- - **ENHANCED**: Support for all database object types (RLS, functions, triggers, cron jobs, custom types)
310
- - **ENHANCED**: Flexible output options with --no-separate compatibility
311
-
312
- **Enhanced Error Handling:**
313
- - Comprehensive try-catch blocks for all CRUD operations
314
- - Enhanced null/undefined checks with proper fallbacks
315
- - Detailed error messages with contextual information
316
- - Special handling for PGRST116 errors (record not found)
317
- - Parameter validation for required fields
318
- - Proper error logging and debugging support
319
-
320
- **Breaking Changes:**
321
- - **Function Parameter Format**: All CRUD functions now use destructuring assignment
322
- - Before: `selectTableRowById(id: string)`
323
- - After: `selectTableRowById({ id }: { id: string })`
324
- - **Type Safety**: Enhanced TypeScript type annotations for all functions
325
-
326
- ### v0.2.0
327
- - Added `gen:` commands for code and schema generation
328
- - Enhanced `create` command
329
- - Introduced model schema support (`schemas/supatool-data.schema.ts`)
278
+ ## Seed Command (v0.3.5+)
330
279
 
331
- ## Database Comments
280
+ Export selected table data from your remote Supabase DB as AI-friendly seed JSON files.
332
281
 
333
- Supatool automatically extracts and includes PostgreSQL comments in all generated files. Comments enhance documentation and AI understanding of your schema.
282
+ ### Usage
334
283
 
335
- ### Adding Comments to Your Database
284
+ ```
285
+ supatool seed --tables tables.yaml --connection <CONNECTION_STRING>
286
+ ```
336
287
 
337
- ```sql
338
- -- Table comments
339
- COMMENT ON TABLE users IS 'User account information and authentication data';
288
+ - `tables.yaml` example:
289
+ ```yaml
290
+ tables:
291
+ - users
292
+ - public.orders
293
+ ```
294
+ - Output: `supabase/seeds/<timestamp>_supatool/{table}_seed.json`
295
+ - Each file contains a snapshot of the remote DB table at the time of export.
340
296
 
341
- -- View comments
342
- COMMENT ON VIEW user_profiles IS 'Combined user data with profile information';
297
+ ### Example output (users_seed.json)
298
+ ```json
299
+ {
300
+ "table": "public.users",
301
+ "fetched_at": "2024-07-05T11:16:00Z",
302
+ "fetched_by": "supatool v0.3.5",
303
+ "note": "This data is a snapshot of the remote DB at the above time. For AI coding reference. You can update it by running the update command again.",
304
+ "rows": [
305
+ { "id": 1, "name": "Taro Yamada", "email": "taro@example.com" },
306
+ { "id": 2, "name": "Hanako Suzuki", "email": "hanako@example.com" }
307
+ ]
308
+ }
309
+ ```
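A minimal TypeScript sketch of consuming this seed file shape (field names are taken from the example above; the reader itself is not part of supatool):

```typescript
// Shape of a seed file as shown in the example output above.
interface SeedFile {
  table: string;
  fetched_at: string;
  fetched_by: string;
  note: string;
  rows: Array<Record<string, unknown>>;
}

// Parse a seed JSON string and return its row count (0 if rows is missing).
function seedRowCount(jsonText: string): number {
  const seed = JSON.parse(jsonText) as SeedFile;
  return Array.isArray(seed.rows) ? seed.rows.length : 0;
}
```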
343
310
 
344
- -- Function comments
345
- COMMENT ON FUNCTION update_timestamp() IS 'Automatically updates the updated_at column';
311
+ > **Warning:** Do not include sensitive or personal data in seed files. Handle all exported data with care.
346
312
 
347
- -- Custom type comments
348
- COMMENT ON TYPE user_status IS 'Enumeration of possible user account statuses';
349
- ```
313
+ ### llms.txt (AI seed data index)
350
314
 
351
- ### Comment Integration
315
+ After exporting, a file named `llms.txt` is automatically generated (and overwritten) in the `supabase/seeds/` directory. This file lists all seed JSON files in the latest timestamped folder, with table name, fetch time, and row count for AI reference.
316
+
317
+ - Note: `llms.txt` is not generated inside each timestamped subfolder, only in `supabase/seeds/`.
318
+
319
+ #### Example llms.txt
320
+ ```
321
+ # AI seed data index (generated by supatool)
322
+ # fetched_at: 2024-07-05T11:16:00Z
323
+ # folder: 20240705_1116_supatool
324
+ public.users: users_seed.json (2 rows) # User account table
325
+ public.orders: orders_seed.json (5 rows)
326
+ ```
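Each index line pairs a table with its seed file and row count, plus an optional trailing comment. A sketch of that formatting, inferred from the example above (not the actual supatool source):

```typescript
// Formats one llms.txt seed-index line, e.g. "public.users: users_seed.json (2 rows) # User account table".
function seedIndexLine(table: string, file: string, rowCount: number, comment?: string): string {
  const base = `${table}: ${file} (${rowCount} rows)`;
  return comment ? `${base} # ${comment}` : base;
}
```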
352
327
 
353
- Comments appear in:
354
- - **index.md**: Human-readable file listings with descriptions (tables/views only)
355
- - **llms.txt**: AI-friendly format (`type:name:path:comment`)
356
- - **Generated SQL**: As `COMMENT ON` statements for full schema recreation
328
+ ## More Information
357
329
 
358
- **Example output:**
359
- ```markdown
360
- ## Tables
361
- - [users](tables/users.sql) - User account information and authentication data
362
- - [posts](tables/posts.sql) - User-generated content and blog posts
363
- ```
330
+ For full version history and detailed changes, see [CHANGELOG.md](./CHANGELOG.md).
@@ -4,15 +4,13 @@ exports.modelSchemaHelp = exports.helpText = void 0;
4
4
  // See: [src/bin/helptext.ts](./src/bin/helptext.ts) from project root
5
5
  // Help text (command section from README, English only)
6
6
  exports.helpText = `
7
- Supatool CLI - Supabase database schema extraction and TypeScript CRUD generation
7
+ Supatool CLI - Supabase schema extraction and TypeScript CRUD generation
8
8
 
9
9
  Usage:
10
10
  supatool <command> [options]
11
11
 
12
12
  Commands:
13
- extract Extract and categorize database objects from Supabase
14
- gen:schema-crud Generate CRUD code from supabase/schemas SQL files
15
- crud Generate CRUD code from Supabase type definitions
13
+ extract Extract database objects from Supabase
16
14
  gen:types Generate TypeScript types from model YAML
17
15
  gen:crud Generate CRUD TypeScript code from model YAML
18
16
  gen:docs Generate Markdown documentation from model YAML
@@ -20,83 +18,21 @@ Commands:
20
18
  gen:rls Generate RLS/security SQL from model YAML
21
19
  gen:all Generate all outputs from model YAML
22
20
  create Generate a template model YAML
21
+ crud Generate CRUD code from Supabase type definitions
22
+ sync Sync local and remote schemas
23
+ seed Export selected table data as AI-friendly seed JSON
23
24
  config:init Generate configuration template
24
25
  help Show help
25
26
 
26
- Extract Options:
27
+ Common Options:
27
28
  -c, --connection <string> Supabase connection string
28
- -o, --output-dir <path> Output directory (default: ./supabase/schemas)
29
- -t, --tables <pattern> Table pattern with wildcards (default: *)
30
- --tables-only Extract only table definitions
31
- --views-only Extract only view definitions
32
- --all Extract all DB objects (tables, views, RLS, functions, triggers, cron, types)
33
- --no-separate Output all objects in same directory
34
- --schema <schemas> Target schemas, comma-separated (default: public)
35
-
36
- Examples:
37
- # Set connection in .env.local (recommended)
38
- echo "SUPABASE_CONNECTION_STRING=postgresql://..." >> .env.local
39
-
40
- # Extract all database objects with AI-friendly index
41
- supatool extract --all -o supabase/schemas
42
- # Output:
43
- # supabase/schemas/index.md (Human-readable index with table/view comments)
44
- # supabase/schemas/llms.txt (AI-friendly structured data with comments)
45
- # supabase/schemas/tables/*.sql (Tables with comments)
46
- # supabase/schemas/views/*.sql (Views with comments)
47
- # supabase/schemas/rls/*.sql (RLS policies)
48
- # supabase/schemas/rpc/*.sql (Functions & triggers)
49
- # supabase/schemas/cron/*.sql (Cron jobs)
50
- # supabase/schemas/types/*.sql (Custom types)
51
-
52
- # Extract only tables and views (default)
53
- supatool extract -o supabase/schemas
54
-
55
- # Extract to single directory (legacy mode)
56
- supatool extract --no-separate -o supabase/schemas
57
-
58
- # Extract specific pattern
59
- supatool extract -t "user_*" -o ./user-tables
60
-
61
- # Extract from specific schemas (default: public)
62
- supatool extract --all --schema public,auth,extensions -o supabase/schemas
63
-
64
- # Alternative: specify connection directly
65
- supatool extract --all -c "postgresql://..." -o supabase/schemas
66
-
67
- # Complete database-first workflow
68
- echo "SUPABASE_CONNECTION_STRING=postgresql://..." >> .env.local
69
- supatool extract --all -o supabase/schemas
70
- supatool gen:schema-crud --include-views --react-query
71
-
72
- # Model-first workflow
73
- supatool create model.yaml
74
- supatool gen:all model.yaml
75
-
76
- # Legacy CRUD generation
77
- supatool crud
29
+ -o, --output-dir <path> Output directory
30
+ -t, --tables <pattern|path> Table pattern or YAML path
31
+ --schema <schemas> Target schemas (comma-separated)
32
+ --config <path> Configuration file path
33
+ -f, --force Force overwrite
78
34
 
79
- Database Comments:
80
- Supatool automatically extracts and includes database comments in generated files.
81
-
82
- To add comments to your database objects:
83
-
84
- # Table comments
85
- COMMENT ON TABLE users IS 'User account information';
86
-
87
- # View comments
88
- COMMENT ON VIEW user_profiles IS 'Combined user data with profile information';
89
-
90
- # Function comments
91
- COMMENT ON FUNCTION update_timestamp() IS 'Automatically updates the updated_at column';
92
-
93
- # Custom type comments
94
- COMMENT ON TYPE user_status IS 'Enumeration of possible user account statuses';
95
-
96
- Comments will appear in:
97
- - index.md: Human-readable list with descriptions (tables/views only)
98
- - llms.txt: AI-friendly format (type:name:path:comment)
99
- - Generated SQL files: As COMMENT statements
35
+ For details, see the documentation.
100
36
  `;
101
37
  // Model Schema Usage
102
38
  exports.modelSchemaHelp = `
@@ -18,6 +18,7 @@ const sqlGenerator_1 = require("../generator/sqlGenerator");
18
18
  const rlsGenerator_1 = require("../generator/rlsGenerator");
19
19
  const sync_1 = require("../sync");
20
20
  const definitionExtractor_1 = require("../sync/definitionExtractor");
21
+ const seedGenerator_1 = require("../sync/seedGenerator");
21
22
  const fs_1 = __importDefault(require("fs"));
22
23
  const path_1 = __importDefault(require("path"));
23
24
  const program = new commander_1.Command();
@@ -193,6 +194,73 @@ program
193
194
  .action(() => {
194
195
  console.log(helptext_1.helpText);
195
196
  });
197
+ // sync command
198
+ program
199
+ .command('sync')
200
+ .description('Sync local and remote schemas')
201
+ .option('-c, --connection <string>', 'Supabase connection string')
202
+ .option('-s, --schema-dir <path>', 'Local schema directory', './supabase/schemas')
203
+ .option('-t, --tables <pattern>', 'Table pattern (wildcards supported)', '*')
204
+ .option('-f, --force', 'Force overwrite (no confirmation)')
205
+ .option('--config <path>', 'Configuration file path')
206
+ .action(async (options) => {
207
+ const config = (0, sync_1.resolveConfig)({
208
+ connectionString: options.connection
209
+ }, options.config);
210
+ if (!config.connectionString) {
211
+ console.error('Connection string is required. Set it using one of:');
212
+ console.error('1. --connection option');
213
+ console.error('2. SUPABASE_CONNECTION_STRING environment variable');
214
+ console.error('3. DATABASE_URL environment variable');
215
+ console.error('4. supatool.config.json configuration file');
216
+ process.exit(1);
217
+ }
218
+ try {
219
+ await (0, sync_1.syncAllTables)({
220
+ connectionString: config.connectionString,
221
+ schemaDir: options.schemaDir,
222
+ tablePattern: options.tables,
223
+ force: options.force
224
+ });
225
+ }
226
+ catch (error) {
227
+ console.error('⚠️ Sync error:', error);
228
+ process.exit(1);
229
+ }
230
+ });
231
+ // seed command
232
+ program
233
+ .command('seed')
234
+ .description('Fetch data for the specified tables from the remote DB and generate AI-friendly seed JSON')
235
+ .option('-c, --connection <string>', 'Supabase connection string')
236
+ .option('-t, --tables <path>', 'YAML listing tables to fetch', 'tables.yaml')
237
+ .option('-o, --out <dir>', 'Output directory', 'supabase/seeds')
238
+ .option('--config <path>', 'Configuration file path')
239
+ .action(async (options) => {
240
+ // Resolve connection settings
241
+ const config = (0, sync_1.resolveConfig)({
242
+ connectionString: options.connection
243
+ }, options.config);
244
+ if (!config.connectionString) {
245
+ console.error('Connection string is required. Set it using one of:');
246
+ console.error('1. --connection option');
247
+ console.error('2. SUPABASE_CONNECTION_STRING environment variable');
248
+ console.error('3. DATABASE_URL environment variable');
249
+ console.error('4. supatool.config.json configuration file');
250
+ process.exit(1);
251
+ }
252
+ try {
253
+ await (0, seedGenerator_1.generateSeedsFromRemote)({
254
+ connectionString: config.connectionString,
255
+ tablesYamlPath: options.tables,
256
+ outputDir: options.out
257
+ });
258
+ }
259
+ catch (error) {
260
+ console.error('⚠️ Seed export error:', error);
261
+ process.exit(1);
262
+ }
263
+ });
196
264
  // If no subcommand is specified, show helpText only (do not call main)
197
265
  if (!process.argv.slice(2).length) {
198
266
  console.log(helptext_1.helpText);
@@ -85,6 +85,8 @@ function generateCrudFromModel(model, outDir) {
85
85
  // Search functions with filters
86
86
  code += `/** Fetch multiple rows with filters */\n`;
87
87
  code += `export async function select${capitalizedName}RowsWithFilters({ filters }: { filters: Filters }): Promise<${tableName}[]> {\n`;
88
+ code += ` // Guard for filters\n`;
89
+ code += ` if (!filters || typeof filters !== 'object') return [];\n`;
88
90
  code += ` try {\n`;
89
91
  code += ` let query = supabase.from('${tableName}').select('*');\n`;
90
92
  code += ` \n`;
package/dist/index.js CHANGED
@@ -261,6 +261,7 @@ function applyFilters(query: any, filters: Filters): any {
261
261
 
262
262
  // Read multiple rows with dynamic filters
263
263
  export async function ${getByFiltersFunctionName}({ filters }: { filters: Filters }): Promise<${typeName}[]> {
264
+ if (!filters || typeof filters !== 'object') return [];
264
265
  try {
265
266
  let query = supabase.from('${typeName.toLowerCase()}').select('*');
266
267
  query = applyFilters(query, filters);
@@ -280,6 +281,7 @@ export async function ${getByFiltersFunctionName}({ filters }: { filters: Filter
280
281
 
281
282
  // Read a single row with dynamic filters
282
283
  export async function ${getSingleByFiltersFunctionName}({ filters }: { filters: Filters }): Promise<${typeName} | null> {
284
+ if (!filters || typeof filters !== 'object') return null;
283
285
  try {
284
286
  let query = supabase.from('${typeName.toLowerCase()}').select('*');
285
287
  query = applyFilters(query, filters).single();
@@ -157,11 +157,8 @@ async function fetchRlsPolicies(client, spinner, progress, schemas = ['public'])
157
157
  if (Array.isArray(policy.roles)) {
158
158
  roles = policy.roles.join(', ');
159
159
  }
160
- else if (typeof policy.roles === 'string') {
161
- roles = policy.roles;
162
- }
163
160
  else {
164
- // PostgreSQLの配列リテラル形式 "{role1,role2}" の場合
161
+ // Handle PostgreSQL array-literal form "{role1,role2}" or a plain string
165
162
  roles = String(policy.roles)
166
163
  .replace(/[{}]/g, '') // strip curly braces
167
164
  .replace(/"/g, ''); // strip double quotes
@@ -234,8 +231,12 @@ async function fetchFunctions(client, spinner, progress, schemas = ['public']) {
234
231
  else {
235
232
  ddl += `-- ${row.comment}\n`;
236
233
  }
237
- // 関数定義を追加
238
- ddl += row.definition + '\n\n';
234
+ // Append the function definition (ensure a trailing semicolon)
235
+ let functionDef = row.definition;
236
+ if (!functionDef.trim().endsWith(';')) {
237
+ functionDef += ';';
238
+ }
239
+ ddl += functionDef + '\n\n';
239
240
  // Append the COMMENT ON statement
240
241
  if (!row.comment) {
241
242
  ddl += `-- COMMENT ON FUNCTION ${functionSignature} IS '_your_comment_here_';\n\n`;
@@ -1067,6 +1068,87 @@ async function generateIndexFile(definitions, outputDir, separateDirectories = t
1067
1068
  */
1068
1069
  async function extractDefinitions(options) {
1069
1070
  const { connectionString, outputDir, separateDirectories = true, tablesOnly = false, viewsOnly = false, all = false, tablePattern = '*', force = false, schemas = ['public'] } = options;
1071
+ // Disable Node.js SSL certificate verification
1072
+ process.env.NODE_TLS_REJECT_UNAUTHORIZED = '0';
1073
+ // Validate the connection string
1074
+ if (!connectionString) {
1075
+ throw new Error('Connection string is not set. Set it using one of:\n' +
1076
+ '1. --connection option\n' +
1077
+ '2. SUPABASE_CONNECTION_STRING environment variable\n' +
1078
+ '3. DATABASE_URL environment variable\n' +
1079
+ '4. supatool.config.json configuration file');
1080
+ }
1081
+ // Validate the connection string format
1082
+ if (!connectionString.startsWith('postgresql://') && !connectionString.startsWith('postgres://')) {
1083
+ throw new Error(`Invalid connection string format: ${connectionString}\n` +
1084
+ 'Expected format: postgresql://username:password@host:port/database');
1085
+ }
1086
+ // URL-encode the password portion
1087
+ let encodedConnectionString = connectionString;
1088
+ console.log('🔍 Original connection string:', connectionString);
1089
+ try {
1090
+ // Special handling when the password contains @
1091
+ if (connectionString.includes('@') && connectionString.split('@').length > 2) {
1092
+ console.log('⚠️ Password contains @, applying special handling');
1093
+ // Use the last @ as the separator
1094
+ const parts = connectionString.split('@');
1095
+ const lastPart = parts.pop(); // last part (host:port/database)
1096
+ const firstParts = parts.join('@'); // first part (postgresql://user:password)
1097
+ console.log(' Split result:');
1098
+ console.log(' First part:', firstParts);
1099
+ console.log(' Last part:', lastPart);
1100
+ // Encode the password portion
1101
+ const colonIndex = firstParts.lastIndexOf(':');
1102
+ if (colonIndex > 0) {
1103
+ const protocolAndUser = firstParts.substring(0, colonIndex);
1104
+ const password = firstParts.substring(colonIndex + 1);
1105
+ const encodedPassword = encodeURIComponent(password);
1106
+ encodedConnectionString = `${protocolAndUser}:${encodedPassword}@${lastPart}`;
1107
+ console.log(' Encoding result:');
1108
+ console.log(' Protocol + user:', protocolAndUser);
1109
+ console.log(' Original password:', password);
1110
+ console.log(' Encoded password:', encodedPassword);
1111
+ console.log(' Final connection string:', encodedConnectionString);
1112
+ }
1113
+ }
1114
+ else {
1115
+ console.log('✅ Performing standard URL parsing');
1116
+ // Standard URL parsing
1117
+ const url = new URL(connectionString);
1118
+ // Handle usernames containing a dot
1119
+ if (url.username && url.username.includes('.')) {
1120
+ console.log(`Username (contains dot): ${url.username}`);
1121
+ }
1122
+ if (url.password) {
1123
+ // Encode only the password portion
1124
+ const encodedPassword = encodeURIComponent(url.password);
1125
+ url.password = encodedPassword;
1126
+ encodedConnectionString = url.toString();
1127
+ console.log(' Password encoded:', encodedPassword);
1128
+ }
1129
+ }
1130
+ // Add SSL setting for Supabase connections
1131
+ if (!encodedConnectionString.includes('sslmode=')) {
1132
+ const separator = encodedConnectionString.includes('?') ? '&' : '?';
1133
+ encodedConnectionString += `${separator}sslmode=require`;
1134
+ console.log(' Added SSL setting:', encodedConnectionString);
1135
+ }
1136
+ // Show debug info (mask the password)
1137
+ const debugUrl = new URL(encodedConnectionString);
1138
+ const maskedPassword = debugUrl.password ? '*'.repeat(debugUrl.password.length) : '';
1139
+ debugUrl.password = maskedPassword;
1140
+ console.log('🔍 Connection info:');
1141
+ console.log(` Host: ${debugUrl.hostname}`);
1142
+ console.log(` Port: ${debugUrl.port}`);
1143
+ console.log(` Database: ${debugUrl.pathname.slice(1)}`);
1144
+ console.log(` User: ${debugUrl.username}`);
1145
+ console.log(` SSL: ${debugUrl.searchParams.get('sslmode') || 'require'}`);
1146
+ }
1147
+ catch (error) {
1148
+ // Fall back to the original string if URL parsing fails
1149
+ console.warn('Failed to parse the connection string as a URL. It may contain special characters.');
1150
+ console.warn('Error details:', error instanceof Error ? error.message : String(error));
1151
+ }
1070
1152
  const fs = await Promise.resolve().then(() => __importStar(require('fs')));
1071
1153
  const readline = await Promise.resolve().then(() => __importStar(require('readline')));
1072
1154
  // 上書き確認
@@ -1090,8 +1172,18 @@ async function extractDefinitions(options) {
1090
1172
  // Dynamically import the spinner
1091
1173
  const { default: ora } = await Promise.resolve().then(() => __importStar(require('ora')));
1092
1174
  const spinner = ora('Connecting to database...').start();
1093
- const client = new pg_1.Client({ connectionString });
1175
+ const client = new pg_1.Client({
1176
+ connectionString: encodedConnectionString,
1177
+ ssl: {
1178
+ rejectUnauthorized: false,
1179
+ ca: undefined
1180
+ }
1181
+ });
1094
1182
  try {
1183
+ // Pre-connection debug info
1184
+ console.log('🔧 Connection settings:');
1185
+ console.log(` SSL: rejectUnauthorized=false`);
1186
+ console.log(` Connection string length: ${encodedConnectionString.length}`);
1095
1187
  await client.connect();
1096
1188
  spinner.text = 'Connected to database';
1097
1189
  let allDefinitions = [];
@@ -0,0 +1,107 @@
1
+ "use strict";
2
+ var __importDefault = (this && this.__importDefault) || function (mod) {
3
+ return (mod && mod.__esModule) ? mod : { "default": mod };
4
+ };
5
+ Object.defineProperty(exports, "__esModule", { value: true });
6
+ exports.generateSeedsFromRemote = generateSeedsFromRemote;
7
+ const pg_1 = require("pg");
8
+ const fs_1 = __importDefault(require("fs"));
9
+ const path_1 = __importDefault(require("path"));
10
+ const js_yaml_1 = __importDefault(require("js-yaml"));
11
+ /**
12
+ * Fetch data for the specified tables from the remote DB and generate AI-friendly seed JSON
13
+ * @param options SeedGenOptions
14
+ */
15
+ async function generateSeedsFromRemote(options) {
16
+ // Load tables.yaml
17
+ const yamlObj = js_yaml_1.default.load(fs_1.default.readFileSync(options.tablesYamlPath, 'utf8'));
18
+ if (!yamlObj || !Array.isArray(yamlObj.tables)) {
19
+ throw new Error('Invalid tables.yaml format. Specify tables as: tables: [ ... ]');
20
+ }
21
+ const tables = yamlObj.tables;
22
+ // Build a timestamped subdirectory name (e.g. 20250705_1116_supatool)
23
+ const now = new Date();
24
+ const y = now.getFullYear();
25
+ const m = String(now.getMonth() + 1).padStart(2, '0');
26
+ const d = String(now.getDate()).padStart(2, '0');
27
+ const hh = String(now.getHours()).padStart(2, '0');
28
+ const mm = String(now.getMinutes()).padStart(2, '0');
29
+ const folderName = `${y}${m}${d}_${hh}${mm}_supatool`;
30
+ const outDir = path_1.default.join(options.outputDir, folderName);
31
+ // Create the output directory
32
+ if (!fs_1.default.existsSync(outDir)) {
33
+ fs_1.default.mkdirSync(outDir, { recursive: true });
34
+ }
35
+ // Connect to the database
36
+ const client = new pg_1.Client({ connectionString: options.connectionString });
37
+ await client.connect();
38
+ let processedCount = 0;
39
+ for (const tableFullName of tables) {
40
+ // No schema prefix: default to public
41
+ let schema = 'public';
42
+ let table = tableFullName;
43
+ if (tableFullName.includes('.')) {
44
+ [schema, table] = tableFullName.split('.');
45
+ }
46
+ // Fetch the table data
47
+ const res = await client.query(`SELECT * FROM "${schema}"."${table}"`);
48
+ const rows = res.rows;
49
+ // File name
50
+ const fileName = `${table}_seed.json`;
51
+ const filePath = path_1.default.join(outDir, fileName);
52
+ // Output JSON
53
+ const json = {
54
+ table: `${schema}.${table}`,
55
+ fetched_at: now.toISOString(),
56
+ fetched_by: 'supatool v0.3.5',
57
+ note: 'This data is a snapshot of the remote DB at the above time. For AI coding reference. You can update it by running the update command again.',
58
+ rows
59
+ };
60
+ fs_1.default.writeFileSync(filePath, JSON.stringify(json, null, 2), 'utf8');
61
+ processedCount++;
62
+ }
63
+ await client.end();
64
+ // Write the llms.txt index (overwritten under supabase/seeds on every run)
65
+ const files = fs_1.default.readdirSync(outDir);
66
+ const seedFiles = files.filter(f => f.endsWith('_seed.json'));
67
+ let llmsTxt = `# AI seed data index (generated by supatool)\n`;
68
+ llmsTxt += `# fetched_at: ${now.toISOString()}\n`;
69
+ llmsTxt += `# folder: ${folderName}\n`;
70
+ for (const basename of seedFiles) {
71
+ const file = path_1.default.join(outDir, basename);
72
+ const content = JSON.parse(fs_1.default.readFileSync(file, 'utf8'));
73
+ // Table comment (empty if none)
74
+ let tableComment = '';
75
+ try {
76
+ const [schema, table] = content.table.split('.');
77
+ const commentRes = await getTableComment(options.connectionString, schema, table);
78
+ if (commentRes)
79
+ tableComment = commentRes;
80
+ }
81
+ catch { }
82
+ llmsTxt += `${content.table}: ${basename} (${Array.isArray(content.rows) ? content.rows.length : 0} rows)`;
83
+ if (tableComment)
84
+ llmsTxt += ` # ${tableComment}`;
85
+ llmsTxt += `\n`;
86
+ }
87
+ const llmsPath = path_1.default.join(options.outputDir, 'llms.txt');
88
+ fs_1.default.writeFileSync(llmsPath, llmsTxt, 'utf8');
89
+ // Print an English summary
90
+ console.log(`Seed export completed. Processed tables: ${processedCount}`);
91
+ console.log(`llms.txt index written to: ${llmsPath}`);
92
+ }
93
+ // Utility to fetch a table comment
94
+ async function getTableComment(connectionString, schema, table) {
95
+ const client = new pg_1.Client({ connectionString });
96
+ await client.connect();
97
+ try {
98
+ const res = await client.query(`SELECT obj_description(c.oid) as comment FROM pg_class c JOIN pg_namespace n ON c.relnamespace = n.oid WHERE c.relname = $1 AND n.nspname = $2 AND c.relkind = 'r'`, [table, schema]);
99
+ if (res.rows.length > 0 && res.rows[0].comment) {
100
+ return res.rows[0].comment;
101
+ }
102
+ return null;
103
+ }
104
+ finally {
105
+ await client.end();
106
+ }
107
+ }
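The timestamped folder name built inside `generateSeedsFromRemote` above (`YYYYMMDD_HHMM_supatool`) can be reproduced standalone; a sketch mirroring that logic:

```typescript
// Mirrors the YYYYMMDD_HHMM_supatool naming used by the seed command above.
function seedFolderName(now: Date): string {
  const pad = (n: number) => String(n).padStart(2, '0');
  return `${now.getFullYear()}${pad(now.getMonth() + 1)}${pad(now.getDate())}_` +
    `${pad(now.getHours())}${pad(now.getMinutes())}_supatool`;
}
```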
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "supatool",
3
- "version": "0.3.3",
3
+ "version": "0.3.5",
4
4
  "description": "A CLI tool for Supabase schema extraction and TypeScript CRUD generation with declarative database schema support.",
5
5
  "main": "dist/index.js",
6
6
  "types": "dist/index.d.ts",