@miniidealab/openlogos 0.9.5 → 0.9.6

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (43)
  1. package/README.md +1 -0
  2. package/claude-plugin-template/.claude-plugin/plugin.json +13 -0
  3. package/claude-plugin-template/agents/change-reviewer.md +46 -0
  4. package/claude-plugin-template/bin/openlogos-phase +428 -0
  5. package/claude-plugin-template/commands/archive.md +22 -0
  6. package/claude-plugin-template/commands/change.md +19 -0
  7. package/claude-plugin-template/commands/index.md +13 -0
  8. package/claude-plugin-template/commands/init.md +30 -0
  9. package/claude-plugin-template/commands/launch.md +16 -0
  10. package/claude-plugin-template/commands/merge.md +20 -0
  11. package/claude-plugin-template/commands/next.md +18 -0
  12. package/claude-plugin-template/commands/status.md +10 -0
  13. package/claude-plugin-template/commands/sync.md +12 -0
  14. package/claude-plugin-template/commands/verify.md +17 -0
  15. package/claude-plugin-template/hooks/hooks.json +14 -0
  16. package/claude-plugin-template/skills/api-designer/SKILL.md +230 -0
  17. package/claude-plugin-template/skills/architecture-designer/SKILL.md +186 -0
  18. package/claude-plugin-template/skills/change-writer/SKILL.md +160 -0
  19. package/claude-plugin-template/skills/code-implementor/SKILL.md +116 -0
  20. package/claude-plugin-template/skills/code-reviewer/SKILL.md +214 -0
  21. package/claude-plugin-template/skills/db-designer/SKILL.md +259 -0
  22. package/claude-plugin-template/skills/merge-executor/SKILL.md +118 -0
  23. package/claude-plugin-template/skills/prd-writer/SKILL.md +203 -0
  24. package/claude-plugin-template/skills/product-designer/SKILL.md +235 -0
  25. package/claude-plugin-template/skills/project-init/SKILL.md +168 -0
  26. package/claude-plugin-template/skills/scenario-architect/SKILL.md +229 -0
  27. package/claude-plugin-template/skills/test-orchestrator/SKILL.md +147 -0
  28. package/claude-plugin-template/skills/test-writer/SKILL.md +252 -0
  29. package/dist/commands/init.d.ts +7 -0
  30. package/dist/commands/init.d.ts.map +1 -1
  31. package/dist/commands/init.js +145 -10
  32. package/dist/commands/init.js.map +1 -1
  33. package/dist/commands/module.d.ts.map +1 -1
  34. package/dist/commands/module.js +14 -9
  35. package/dist/commands/module.js.map +1 -1
  36. package/dist/commands/sync.d.ts.map +1 -1
  37. package/dist/commands/sync.js +15 -1
  38. package/dist/commands/sync.js.map +1 -1
  39. package/dist/i18n.d.ts.map +1 -1
  40. package/dist/i18n.js +8 -0
  41. package/dist/i18n.js.map +1 -1
  42. package/dist/index.js +0 -0
  43. package/package.json +5 -4
@@ -0,0 +1,116 @@
1
+ ---
2
+ name: code-implementor
3
+ description: "Generate business code and test code based on the full specification chain (sequence diagrams, API YAML, DB DDL, test case specs). Use when test cases exist in logos/resources/test/ but logos/resources/implementation/ is empty."
4
+ ---
5
+
6
+ # Skill: Code Implementor
7
+
8
+ > Generate business code and test code based on the full specification chain (sequence diagrams, API YAML, DB DDL, test case specs). Ensure strict spec fidelity, embed OpenLogos reporter, and deliver closed-loop batches per scenario.
9
+
10
+ ## Trigger Conditions
11
+
12
+ - User requests code implementation or code generation
13
+ - User mentions "Phase 3 Step 4", "code generation", "implement S01"
14
+ - Test case design is complete (`logos/resources/test/` is non-empty) and coding should begin
15
+ - User specifies a scenario number (e.g., S01) to implement
16
+
17
+ ## Prerequisites
18
+
19
+ - `logos/resources/prd/3-technical-plan/2-scenario-implementation/` contains sequence diagrams (**required**)
20
+ - `logos/resources/api/` contains API specifications (read if present)
21
+ - `logos/resources/database/` contains DB DDL (read if present)
22
+ - `logos/resources/test/` contains test case specifications (**required**)
23
+ - `logos/logos-project.yaml` contains `tech_stack` (**required**)
24
+
25
+ If sequence diagrams or test case directories are empty, prompt the user to complete Phase 3 Step 1 (scenario-architect) and Step 3a (test-writer) first.
26
+
27
+ ## Core Capabilities
28
+
29
+ 1. Load full specification context to establish an implementation baseline
30
+ 2. Plan batch execution strategy, splitting large tasks by scenario or module
31
+ 3. Generate business code strictly consistent with API YAML (routes, status codes, error codes, fields)
32
+ 4. Generate data access code strictly aligned with DB DDL (table names, column names, types, constraints)
33
+ 5. Generate test code with IDs exactly matching test-cases.md
34
+ 6. Embed OpenLogos reporter in test code (outputting to `test-results.jsonl`)
35
+ 7. Self-check after each batch to ensure spec fidelity
36
+
37
+ ## Relationship with test-writer and code-reviewer
38
+
39
+ This Skill sits in the middle of a three-Skill chain:
40
+
41
+ - **test-writer** (Step 3a): Designs test case **specification documents** (Markdown), defines UT/ST IDs — the "exam designer"
42
+ - **code-implementor** (Step 4, this Skill): Transforms all specs into **runnable business code and test code** — the "exam taker"
43
+ - **code-reviewer** (after Step 4): Audits generated business code against specs, outputs a review report — the "exam grader"
44
+
45
+ test-writer writes no code; code-implementor designs no test cases; code-reviewer modifies no code. Together they form a **Design → Execute → Review** closed loop.
46
+
47
+ ## Execution Steps
48
+
49
+ ### Step 1: Load Specification Context
50
+
51
+ **Before writing any line of code**, read the following documents to establish full context:
52
+
53
+ | Document | Path | Purpose |
54
+ |----------|------|---------|
55
+ | Architecture | `prd/3-technical-plan/1-architecture/` | Overall structure, framework, design patterns |
56
+ | Sequence diagrams | `prd/3-technical-plan/2-scenario-implementation/` | Implementation blueprint — Step sequence = code call chain |
57
+ | API specs | `logos/resources/api/*.yaml` | Endpoint contracts — routes, methods, status codes, fields |
58
+ | DB DDL | `logos/resources/database/*.sql` | Data layer contracts — table structures, constraints, indexes |
59
+ | Test case specs | `logos/resources/test/*-test-cases.md` | Verification targets — UT/ST IDs, expected inputs/outputs |
60
+ | Orchestration tests | `logos/resources/scenario/*.json` | End-to-end verification targets (API projects) |
61
+ | Project config | `logos/logos-project.yaml` | `tech_stack`, `external_dependencies` |
62
+ | Source roots | `logos/logos.config.json` → `sourceRoots` | Output root directories for business code and test code |
63
+
64
+ After loading, confirm:
65
+ - Which scenarios are in scope (S01, S02...)
66
+ - Which API endpoints and DB tables are involved
67
+ - Total UT/ST case count
68
+ - The technology stack (language, framework, test framework)
69
+ - Source output directories (read `sourceRoots.src` and `sourceRoots.test`; default to `src/` and `test/` if absent)
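The `sourceRoots` lookup with fallback defaults could be sketched like this (the config shape is an assumption inferred from this Skill's description of `logos/logos.config.json`):

```typescript
// Resolve business-code and test-code output roots from the parsed
// logos.config.json content. The interface below is an illustrative
// guess at the relevant slice of the config; `src/` and `test/` are
// the documented fallbacks.
interface LogosConfig {
  sourceRoots?: { src?: string; test?: string };
}

function resolveSourceRoots(config: LogosConfig): { src: string; test: string } {
  return {
    src: config.sourceRoots?.src ?? "src/",
    test: config.sourceRoots?.test ?? "test/",
  };
}
```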
70
+
71
+ ### Step 2: Plan Batch Strategy
72
+
73
+ **Large tasks must be batched, but each batch must be closed-loop.**
74
+
75
+ 1. **Split dimension**: By scenario (S01, S02) or by module (auth, projects)
76
+ 2. **Pre-batch declaration**: Before each batch, list the UT/ST case IDs covered, ensuring traceability to `logos/resources/test/*.md`
77
+ 3. **Closed-loop requirement**: Each batch must deliver all three elements — business code + test code + reporter
78
+ 4. **No deferred testing**: "Write all business code first, add tests later" is not allowed
79
+
80
+ ### Step 3: Generate Business Code
81
+
82
+ Implement business logic following the sequence diagram Step sequence, strictly adhering to API YAML and DB DDL fidelity rules.
83
+
84
+ ### Step 4: Generate Test Code
85
+
86
+ Every test file must embed the OpenLogos reporter per `logos/spec/test-results.md`:
87
+
88
+ - Output path: `logos/resources/verify/test-results.jsonl`
89
+ - Format: JSONL (one JSON object per line)
90
+ - Each case outputs: `{ "id": "UT-S01-01", "status": "pass"|"fail"|"skip", ... }`
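A minimal reporter satisfying the three points above might look like the following sketch. The `durationMs`-style extra fields and helper names are illustrative; `logos/spec/test-results.md` remains the authoritative format definition.

```typescript
import { appendFileSync, mkdirSync } from "node:fs";
import { dirname } from "node:path";

type CaseStatus = "pass" | "fail" | "skip";

// Serialize one test result as a single JSONL line (one JSON object per line).
function toJsonlLine(id: string, status: CaseStatus, extra: Record<string, unknown> = {}): string {
  return JSON.stringify({ id, status, ...extra }) + "\n";
}

// Append a result to the documented output path, creating the directory if needed.
function report(id: string, status: CaseStatus): void {
  const out = "logos/resources/verify/test-results.jsonl";
  mkdirSync(dirname(out), { recursive: true });
  appendFileSync(out, toJsonlLine(id, status));
}
```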
91
+
92
+ ### Step 5: Self-Check
93
+
94
+ After each batch, verify:
95
+ - [ ] API route paths and HTTP methods match YAML
96
+ - [ ] HTTP status codes match YAML
97
+ - [ ] DB operations use correct table/column names from DDL
98
+ - [ ] All pre-declared UT/ST IDs exist in test code
99
+ - [ ] Reporter is embedded with correct output path
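The fourth checklist item — all pre-declared UT/ST IDs exist in test code — can be automated with a simple scan (a sketch; a plain substring match suffices because IDs like `UT-S01-01` are unique tokens):

```typescript
// Return the declared case IDs that do NOT appear anywhere in the
// generated test sources, so the batch can be flagged as incomplete.
function missingCaseIds(declaredIds: string[], testSources: string[]): string[] {
  const all = testSources.join("\n");
  return declaredIds.filter((id) => !all.includes(id));
}
```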
100
+
101
+ ### Step 6: Guide Next Steps
102
+
103
+ After all batches complete, guide the user to run `openlogos verify` for Gate 3.5 acceptance.
104
+
105
+ ## Output Specification
106
+
107
+ - **Business code**: `sourceRoots.src` (default `src/`)
108
+ - **Test code**: `sourceRoots.test` (default `test/`)
109
+ - **JSONL results**: `logos/resources/verify/test-results.jsonl`
110
+ - **Implementation manifest**: `logos/resources/implementation/implementation-manifest.md`
111
+
112
+ ## Recommended Prompts
113
+
114
+ - `Help me implement S01`
115
+ - `Execute Phase 3 Step 4, batch by scenario`
116
+ - `Please execute Phase 3 Step 4 for S01. Deliver business code + test code + OpenLogos reporter in each batch.`
@@ -0,0 +1,214 @@
1
+ ---
2
+ name: code-reviewer
3
+ description: "Review code for OpenLogos methodology compliance, including YAML validity checks. Use when reviewing code changes, checking pull requests, or performing code quality analysis."
4
+ ---
5
+
6
+ # Skill: Code Reviewer
7
+
8
+ > Review AI-generated code by performing systematic validation against the full OpenLogos specification chain (API YAML, sequence diagram EX cases, DB DDL), ensuring code is fully consistent with design documents, covers all exception paths, and meets security requirements.
9
+
10
+ ## Trigger Conditions
11
+
12
+ - User requests a code review
13
+ - User mentions "Phase 3 Step 4", "code audit", "code review"
14
+ - AI has just generated code that needs quality verification
15
+ - Final check before deployment
16
+ - Need to locate code issues after orchestration test failures
17
+
18
+ ## Prerequisites
19
+
20
+ - `logos/resources/api/` contains API YAML specifications
21
+ - `logos/resources/prd/3-technical-plan/2-scenario-implementation/` contains scenario sequence diagrams (with EX cases)
22
+ - `logos/resources/database/` contains DB DDL
23
+ - The code to be reviewed is accessible
24
+
25
+ For projects without APIs (pure CLI / libraries), API consistency checks can be skipped; focus on sequence diagram coverage and exception handling instead.
26
+
27
+ ## Core Capabilities
28
+
29
+ 1. Validate code implementation consistency with API YAML specifications
30
+ 2. Check whether exception handling covers all EX cases
31
+ 3. Check whether DB operations conform to DDL design
32
+ 4. Check security policies (authentication, RLS, input validation)
33
+ 5. Check code style and best practices
34
+ 6. Output a structured review report
35
+
36
+ ## Execution Steps
37
+
38
+ ### Step 1: Load Specification Context
39
+
40
+ **Pre-check — YAML Validity (before anything else):**
41
+
42
+ Before loading API specs, validate that all `logos/resources/api/*.yaml` files are syntactically valid YAML and conform to the OpenAPI 3.x schema. If any file fails parsing (e.g., unquoted special characters in `description` fields), report it as a **Critical** blocker immediately — do not proceed with the rest of the review until YAML errors are fixed.
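The specific failure mode named above (unquoted special characters in `description` fields) can be pre-screened with a heuristic lint like the sketch below. This is only a quick blocker detector — a real review should still run each file through a proper YAML parser and an OpenAPI 3.x validator.

```typescript
// Flag lines where a plain-scalar description/summary value contains
// ": ", e.g. `description: Returns 200: OK`, which breaks YAML parsing.
// Returns 1-based line numbers of suspect lines.
function suspectLines(yamlText: string): number[] {
  const bad: number[] = [];
  yamlText.split("\n").forEach((line, i) => {
    const m = line.match(/^\s*(description|summary):\s*(.+)$/);
    if (!m) return;
    const value = m[2].trim();
    const safe =
      /^["'].*["']$/.test(value) || value.startsWith("|") || value.startsWith(">");
    if (!safe && value.includes(": ")) bad.push(i + 1);
  });
  return bad;
}
```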
43
+
44
+ **Then** read the following files to establish a "reference baseline" for the code review:
45
+
46
+ - **API YAML** (`logos/resources/api/*.yaml`): Extract endpoint inventory, record each endpoint's path, method, request body schema, response schema, and status codes
47
+ - **Scenario Sequence Diagrams** (`logos/resources/prd/3-technical-plan/2-scenario-implementation/`): Extract all EX exception case IDs and expected behaviors
48
+ - **DB DDL** (`logos/resources/database/`): Extract table structures, column types, constraints, and indexes
49
+ - **`logos-project.yaml`**: Read `tech_stack` to confirm the technology stack, `external_dependencies` to confirm external dependencies
50
+
51
+ Summarize into a review checklist:
52
+
53
+ ```markdown
54
+ Review scope: S01-related code
55
+ - API endpoints: 4 (auth.yaml)
56
+ - EX exception cases: 7 (EX-2.1 ~ EX-5.2)
57
+ - DB tables: 2 (users, profiles)
58
+ - Security policies: 2 RLS rules
59
+ ```
60
+
61
+ ### Step 2: API Consistency Review
62
+
63
+ Compare code implementation against API YAML specification endpoint by endpoint:
64
+
65
+ **Checklist**:
66
+
67
+ | Check Item | Description | Severity |
68
+ |------------|-------------|----------|
69
+ | Path Match | Whether route paths in code exactly match `paths` in YAML | Critical |
70
+ | HTTP Method | Whether GET/POST/PUT/DELETE matches | Critical |
71
+ | Request Body Fields | Whether code reads all required fields defined in YAML `requestBody.schema` | Critical |
72
+ | Request Body Validation | Whether field type, format (email/uuid), minLength and other constraints are validated in code | Warning |
73
+ | Response Fields | Whether JSON field names and types returned by code match YAML `responses.schema` | Critical |
74
+ | Status Codes | Whether HTTP status codes returned in normal and error cases match YAML definitions | Critical |
75
+ | Error Response Format | Whether error responses follow the unified `{ code, message, details? }` format | Warning |
76
+ | YAML Validity | All `logos/resources/api/*.yaml` files parse as valid YAML and valid OpenAPI 3.x — unquoted special characters (`:`, `→`, `#`) in `description`/`summary` values are a common failure mode | Critical |
77
+
78
+ **Output format**:
79
+
80
+ ```markdown
81
+ ### API Consistency
82
+
83
+ | Endpoint | Check Item | Status | Notes |
84
+ |----------|------------|--------|-------|
85
+ | POST /api/auth/register | Request body fields | ✅ | email, password both read |
86
+ | POST /api/auth/register | Response status code | ❌ Critical | Registration success returns 200, YAML defines 201 |
87
+ | POST /api/auth/register | Error code | ❌ Warning | Duplicate email returns generic 400, YAML defines 409 |
88
+ ```
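The endpoint-by-endpoint comparison can start from a simple set diff of the two inventories. The `"POST /api/auth/register"` key convention below is illustrative, not prescribed:

```typescript
// Compare the endpoint inventory extracted from API YAML against the
// routes found in code: `missing` = specified but not implemented,
// `extra` = implemented but absent from the spec.
function endpointDiff(
  specEndpoints: string[],
  codeEndpoints: string[],
): { missing: string[]; extra: string[] } {
  const spec = new Set(specEndpoints);
  const code = new Set(codeEndpoints);
  return {
    missing: Array.from(spec).filter((e) => !code.has(e)),
    extra: Array.from(code).filter((e) => !spec.has(e)),
  };
}
```

Entries in `extra` deserve attention too: an undocumented route often means the API YAML is out of date rather than the code being wrong.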
89
+
90
+ ### Step 3: Exception Handling Coverage Review
91
+
92
+ Map all EX exception cases from sequence diagrams to error handling in code one by one:
93
+
94
+ 1. List all EX case IDs and their expected behaviors for the scenario
95
+ 2. Search for corresponding try/catch, if/else, error handlers in code
96
+ 3. Flag uncovered EX cases
97
+
98
+ **Key checks**:
99
+
100
+ - Whether each EX case has a corresponding code branch
101
+ - Whether the correct HTTP status code and error code are returned in exception scenarios
102
+ - Whether there are "silently swallowed exceptions" (empty catch blocks or catch blocks that only log without returning errors)
103
+ - Whether external service calls (DB, third-party APIs) all have timeout and error handling
104
+ - Whether there are exception handlers in code that don't exist in sequence diagrams (which may indicate sequence diagram omissions)
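The "silently swallowed exceptions" check above can be partly mechanized. The regex scan below is a rough sketch that only catches literally empty catch blocks; an AST pass (e.g. via the TypeScript compiler API) would be more precise:

```typescript
// Count catch blocks whose body is empty or whitespace-only —
// the most blatant form of a swallowed exception.
function emptyCatchCount(source: string): number {
  const matches = source.match(/catch\s*(\([^)]*\))?\s*\{\s*\}/g);
  return matches ? matches.length : 0;
}
```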
105
+
106
+ **Output format**:
107
+
108
+ ```markdown
109
+ ### Exception Handling Coverage
110
+
111
+ | EX ID | Exception Description | Code Coverage | Notes |
112
+ |-------|----------------------|---------------|-------|
113
+ | EX-2.1 | Email already registered | ✅ | Returns 409, format correct |
114
+ | EX-2.2 | Auth service unavailable | ❌ Critical | No try/catch wrapping the supabase.auth.signUp call |
115
+ | EX-4.1 | profiles write failure | ❌ Critical | auth.users record not rolled back after INSERT failure |
116
+ ```
117
+
118
+ ### Step 4: DB Operations Review
119
+
120
+ Check whether database operations in code conform to DDL design:
121
+
122
+ **Checklist**:
123
+
124
+ - **Table and column names**: Whether table/column names referenced in code match DDL (no typos, case differences)
125
+ - **Field types**: Whether value types passed in code match DDL definitions (e.g., for an `INTEGER` amount field in DDL, whether code passes cents instead of dollars)
126
+ - **Constraint compliance**: Whether NOT NULL fields always have values, whether UNIQUE fields have conflict handling, whether CHECK constraint enum values have corresponding constants in code
127
+ - **Transaction usage**: Whether multi-table write operations are wrapped in transactions
128
+ - **Migration consistency**: Whether the latest fields in DDL are used in code (avoid DDL being updated but code not following up)
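The table/column name check can be mechanized by extracting column names from the DDL and diffing them against identifiers referenced in code. The parser below is a deliberately rough sketch for a single, simply formatted `CREATE TABLE` statement, not a general SQL parser:

```typescript
// Pull column names out of one CREATE TABLE statement, skipping
// constraint lines (FOREIGN KEY / CHECK / UNIQUE / PRIMARY / CONSTRAINT).
function ddlColumns(createTable: string): Set<string> {
  const body = createTable.slice(createTable.indexOf("(") + 1, createTable.lastIndexOf(")"));
  const skip = ["foreign", "check", "unique", "primary", "constraint"];
  const cols = new Set<string>();
  for (const part of body.split(",")) {
    const m = part.trim().match(/^([a-z_][a-z0-9_]*)\s/i);
    if (m && !skip.includes(m[1].toLowerCase())) cols.add(m[1]);
  }
  return cols;
}

// Column names referenced in code that the DDL does not define —
// typos and case differences show up here.
function unknownColumns(referenced: string[], createTable: string): string[] {
  const known = ddlColumns(createTable);
  return referenced.filter((c) => !known.has(c));
}
```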
129
+
130
+ ### Step 5: Security Review
131
+
132
+ Check the security implementation of the code:
133
+
134
+ | Check Item | Description | Severity |
135
+ |------------|-------------|----------|
136
+ | Authentication Check | Whether endpoints requiring authentication verify token/session before processing logic | Critical |
137
+ | Authorization Check | Whether users can only access their own data (owner check) | Critical |
138
+ | Input Validation | Whether user input has type validation and length limits (prevent injection, prevent XSS) | Critical |
139
+ | Sensitive Data | Whether responses leak password hashes, internal IDs, or stack traces | Critical |
140
+ | RLS Dependency | If relying on PostgreSQL RLS, whether code correctly sets the `auth.uid()` context | Warning |
141
+ | SQL Injection | Whether parameterized queries are used (string-concatenated SQL is prohibited) | Critical |
142
+ | Rate Limiting | Whether critical endpoints (login, registration) have rate limiting against brute force | Warning |
143
+
144
+ ### Step 6: Output Review Report
145
+
146
+ Summarize all findings by severity and generate a structured report:
147
+
148
+ ```markdown
149
+ # Code Review Report: S01 User Registration
150
+
151
+ ## Review Scope
152
+ - Scenario: S01
153
+ - Endpoints: 4
154
+ - EX cases: 7
155
+ - Code files: src/api/auth/register.ts, src/api/auth/login.ts
156
+
157
+ ## Review Summary
158
+
159
+ | Severity | Count |
160
+ |----------|-------|
161
+ | 🔴 Critical | 2 |
162
+ | 🟡 Warning | 3 |
163
+ | 🔵 Info | 1 |
164
+
165
+ ## Critical Findings
166
+
167
+ ### [C1] POST /api/auth/register status code mismatch
168
+ - **Spec source**: auth.yaml → register → responses.201
169
+ - **Issue**: Code returns 200, spec defines 201
170
+ - **Fix suggestion**: Change `res.status(200)` to `res.status(201)`
171
+
172
+ ### [C2] EX-2.2 unhandled: Auth service unavailable
173
+ - **Spec source**: S01 sequence diagram → EX-2.2
174
+ - **Issue**: `supabase.auth.signUp()` call is not wrapped in try/catch
175
+ - **Fix suggestion**: Add try/catch, return 503 on timeout or 5xx
176
+
177
+ ## Warning Findings
178
+ ...
179
+
180
+ ## Info Findings
181
+ ...
182
+ ```
183
+
184
+ **Report principles**:
185
+ - Critical issues must be fixed before proceeding to orchestration acceptance
186
+ - Warning issues are recommended to fix but do not block delivery
187
+ - Info items are improvement suggestions that can be addressed later
188
+ - Every finding must reference a spec source (API YAML, EX ID, DDL)
189
+
190
+ ## Output Specification
191
+
192
+ - Review report is output directly in the conversation (not written to a file)
193
+ - Categorized by severity: Critical / Warning / Info
194
+ - Each finding format: ID + spec source + issue description + fix suggestion
195
+ - End with a summary and next-step recommendation (e.g., "Fix 2 Critical issues, then run orchestration acceptance")
196
+
197
+ ## Best Practices
198
+
199
+ - **Consistency first**: Code must be fully consistent with API YAML — field names, types, and status codes must not deviate. Most production bugs come from subtle inconsistencies between code and specs
200
+ - **Exception handling is the focus**: Most bugs occur in exception paths; carefully check that every EX case has a corresponding catch/error handler
201
+ - **No shortcuts on security**: Authentication checks, RLS policies, input validation — any missing item is a Critical issue
202
+ - **Don't over-review**: Code style issues should be marked as Info and not block delivery. The core goal of the review is "code matches specs", not "code is perfect"
203
+ - **Run tests before reviewing**: If the code can run, execute orchestration tests first and use failing cases to pinpoint issues — this is more efficient than reading code line by line
204
+ - **Watch for compensation logic**: If multi-step writes (e.g., first creating an auth user then writing a profile) fail midway, check whether there is a rollback or compensation mechanism — this is the most commonly missed Critical issue
205
+
206
+ ## Recommended Prompts
207
+
208
+ The following prompts can be copied directly for use with AI:
209
+
210
+ - `Help me do a code review`
211
+ - `Help me check if this code conforms to the API YAML spec`
212
+ - `Review the code implementation related to S01`
213
+ - `Help me check if exception handling is complete`
214
+ - `Help me check if security policies are in place`
@@ -0,0 +1,259 @@
1
+ ---
2
+ name: db-designer
3
+ description: "Design database schema based on API and scenario requirements. Use when scenarios exist but logos/resources/database/ is empty."
4
+ ---
5
+
6
+ # Skill: DB Designer
7
+
8
+ > Derive database table structures from API specifications and generate SQL DDL in the appropriate dialect. The database type is determined during Phase 3 Step 0 technology selection, ensuring that field types, constraints, indexes, and security policies are fully aligned with API endpoints.
9
+
10
+ ## Trigger Conditions
11
+
12
+ - User requests database design or SQL writing
13
+ - User mentions "Phase 3 Step 2", "DB design", "table structure"
14
+ - API YAML specifications already exist and database design needs to be derived
15
+ - User provides a data model that needs to be converted to DDL
16
+
17
+ ## Core Capabilities
18
+
19
+ 1. Derive table structures from API request/response structures
20
+ 2. Read `tech_stack.database` from `logos-project.yaml` to determine the database type
21
+ 3. Generate SQL DDL in the corresponding database dialect
22
+ 4. Design indexes with rationale for each
23
+ 5. Design security policies (RLS / application-level permissions)
24
+ 6. Add comments to every table and every field
25
+
26
+ ## Prerequisites
27
+
28
+ - `logos/resources/api/` contains API YAML specifications (output from api-designer)
29
+ - `tech_stack.database` in `logos-project.yaml` is filled in
30
+
31
+ If the API directory is empty, prompt the user to complete the API design (api-designer) in Phase 3 Step 2 first. If `tech_stack.database` is not filled in, prompt the user to complete Phase 3 Step 0 (architecture-designer) first.
32
+
33
+ ## Execution Steps
34
+
35
+ ### Step 1: Determine Database Type
36
+
37
+ Read the `tech_stack` field from `logos/logos-project.yaml` to determine the database type and dialect:
38
+
39
+ - PostgreSQL → Use features like UUID, TIMESTAMPTZ, RLS, JSONB, etc.
40
+ - MySQL → Use features like InnoDB, utf8mb4, TIMESTAMP, etc.
41
+ - SQLite → Use simplified types like INTEGER PRIMARY KEY, TEXT, etc.
42
+ - Other → Confirm with the user and select the closest dialect
43
+
44
+ ### Step 2: Extract Data Entities
45
+
46
+ Extract all data entities that need to be persisted from the API YAML:
47
+
48
+ 1. Scan `requestBody` and `responses` across all endpoints to identify core data objects
49
+ 2. Distinguish between "needs persistence" and "transfer-only" data:
50
+ - Objects with CRUD operations → need a table (e.g., `users`, `projects`)
51
+ - Objects that only appear in requests/responses but are not stored directly → no table needed (e.g., `loginRequest`)
52
+ 3. Annotate each object with its source API endpoint
53
+
54
+ Output an entity checklist for user confirmation:
55
+
56
+ ```markdown
57
+ Identified N data entities requiring persistence from API specifications:
58
+
59
+ | # | Entity | Source Endpoint | Core Fields |
60
+ |---|--------|----------------|-------------|
61
+ | 1 | users | auth.yaml → register, login | email, password, status |
62
+ | 2 | projects | projects.yaml → create, list, get | name, description, owner_id |
63
+ | 3 | subscriptions | billing.yaml → subscribe | plan, status, expires_at |
64
+ ```
65
+
66
+ ### Step 3: Design Table Structures
67
+
68
+ Design complete table structures for each entity, following the current database dialect:
69
+
70
+ **Every table must include**:
71
+ - Primary key (UUID or auto-increment ID, depending on dialect)
72
+ - Business fields (mapped from API schema, with types converted to database types)
73
+ - Audit fields: `created_at`, `updated_at`
74
+ - Soft delete field: `deleted_at` (as needed)
75
+ - Field constraints: `NOT NULL`, `UNIQUE`, `CHECK`, `DEFAULT`
76
+
77
+ **Type mapping principles**:
78
+ - API `string + format: email` → `TEXT NOT NULL` (with CHECK constraint or application-level validation)
79
+ - API `string + format: uuid` → `UUID` (PostgreSQL) / `CHAR(36)` (MySQL)
80
+ - API `integer` → `INTEGER` / `BIGINT`
81
+ - API `boolean` → `BOOLEAN` (PostgreSQL) / `TINYINT(1)` (MySQL)
82
+ - API `string + enum` → `TEXT + CHECK` constraint (listing enum values)
83
+ - Monetary fields → `INTEGER` (store in cents), **DECIMAL/FLOAT is prohibited**
84
+
85
+ **Example (PostgreSQL)**:
86
+
87
+ ```sql
88
+ -- Users table (source: auth.yaml → register, login)
89
+ CREATE TABLE users (
90
+ id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
91
+ email TEXT NOT NULL UNIQUE,
92
+ password TEXT NOT NULL,
93
+ status TEXT NOT NULL DEFAULT 'pending'
94
+ CHECK (status IN ('pending', 'active', 'disabled')),
95
+ created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
96
+ updated_at TIMESTAMPTZ NOT NULL DEFAULT now()
97
+ );
98
+ ```
99
+
100
+ **Example (SQLite — using `@comment` structured annotations)**:
101
+
102
+ ```sql
103
+ -- Users table (source: auth.yaml → register, login)
104
+ CREATE TABLE users (
105
+ -- @comment User unique identifier, UUID v4 string
106
+ id TEXT PRIMARY KEY NOT NULL,
107
+ -- @comment User email, normalized to lowercase
108
+ email TEXT NOT NULL UNIQUE,
109
+ -- @comment Argon2id password hash, stores hash only
110
+ password_hash TEXT NOT NULL,
111
+ -- @comment Creation time, ISO 8601 format
112
+ created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now')),
113
+ -- @comment Last update time
114
+ updated_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now'))
115
+ );
116
+ -- @table-comment users Users table storing core registered user information
117
+ ```
118
+
119
+ > **SQLite Comment Convention**: SQLite does not support `COMMENT ON` syntax. You MUST use `-- @comment` preceding annotations (for columns) and `-- @table-comment <table> <description>` trailing annotations (for tables). See `logos/spec/sql-comment-convention.md` for details.
120
+
121
+ ### Step 4: Design Table Relationships
122
+
123
+ Design foreign keys based on entity relationships in the API:
124
+
125
+ 1. Derive relationships from nested paths and reference fields in API endpoints (e.g., `/api/projects/:projectId/members` → `project_members` table linking `projects` and `users`)
126
+ 2. Determine relationship types (one-to-many, many-to-many)
127
+ 3. Design foreign key constraints and cascade strategies:
128
+ - `ON DELETE CASCADE`: child records are deleted when the parent record is deleted (e.g., user deleted → projects deleted)
129
+ - `ON DELETE SET NULL`: child records are retained but the foreign key is set to null when the parent is deleted
130
+ - `ON DELETE RESTRICT`: prevent deletion of the parent record if child records exist
131
+
132
+ ### Step 5: Design Security Policies
133
+
134
+ Design corresponding security mechanisms based on the database type:
135
+
136
+ **PostgreSQL — Row-Level Security (RLS)**:
137
+
138
+ ```sql
139
+ ALTER TABLE projects ENABLE ROW LEVEL SECURITY;
140
+
141
+ CREATE POLICY projects_owner_policy ON projects
142
+ USING (owner_id = auth.uid());
143
+ ```
144
+
145
+ - Enable RLS on all tables containing user data
146
+ - Design at least one Policy per table (owner / admin / public)
147
+ - Document the correspondence between RLS policies and the API authentication scheme
148
+
149
+ **MySQL — Application-Level Permissions**:
150
+
151
+ - Annotate data access permissions in table comments (owner-only / admin / public)
152
+ - Do not implement permission control in DDL; delegate to the application layer
153
+
154
+ ### Step 6: Design Indexes
155
+
156
+ Design indexes for common query patterns, with a rationale for each index:
157
+
158
+ ```sql
159
+ -- User lookup by email (login scenario, source: S02)
160
+ CREATE UNIQUE INDEX idx_users_email ON users(email);
161
+
162
+ -- Project lookup by owner (project list, source: S04 Step 1)
163
+ CREATE INDEX idx_projects_owner ON projects(owner_id);
164
+ ```
165
+
166
+ Index design principles:
167
+ - Foreign key columns: indexes are mandatory (to avoid full table scans on JOINs)
168
+ - Unique constraint columns: unique indexes are created automatically
169
+ - High-frequency query columns: determine based on API query parameters
170
+ - Composite indexes: consider for multi-condition queries (leftmost prefix rule)
171
+ - Avoid over-indexing: limit index count on write-heavy tables
172
+
173
+ ### Step 7: Output Complete DDL
174
+
175
+ Organize the DDL file in the following order:
176
+
177
+ 1. File header comment (source, database type, generation timestamp)
178
+ 2. Base tables (tables without foreign key dependencies first)
179
+ 3. Association tables (tables with foreign key dependencies after)
180
+ 4. Indexes
181
+ 5. Security policies (RLS / Policy)
182
+ 6. Table and field comments:
183
+ - PostgreSQL: use `COMMENT ON TABLE` / `COMMENT ON COLUMN`
184
+ - MySQL: use inline `COMMENT`
185
+ - SQLite: use `-- @comment` (columns) + `-- @table-comment` (tables), see SQLite comment rules below
186
+
187
+ Add a comment above each DDL block noting the source API endpoint.
188
+
189
+ **SQLite Comment Rules (MUST Follow)**:
190
+
191
+ When `tech_stack.database` is SQLite, you MUST use the following structured comment format:
192
+
193
+ 1. **Column comments**: write `-- @comment <description>` on the line **immediately above** the column definition
194
+ - **No blank lines** allowed between `-- @comment` and the column (blank lines break the association)
195
+ - Multi-line: consecutive `-- @comment` lines are concatenated automatically
196
+ 2. **Table comments**: write `-- @table-comment <table_name> <description>` on the line **immediately after** `CREATE TABLE ... ();`
197
+ 3. Constraint lines (`FOREIGN KEY`, standalone `CHECK`, `UNIQUE`) do **not** need `-- @comment`
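To show how a consumer would read these rules back, here is a sketch of an annotation parser (illustrative only — `logos/spec/sql-comment-convention.md` is authoritative):

```typescript
// Parse `-- @comment` annotations: consecutive @comment lines attach to
// the column definition immediately below; a blank line breaks the
// association, per the rules above.
function columnComments(ddl: string): Map<string, string> {
  const out = new Map<string, string>();
  let pending: string[] = [];
  for (const raw of ddl.split("\n")) {
    const line = raw.trim();
    const m = line.match(/^--\s*@comment\s+(.*)$/);
    if (m) { pending.push(m[1]); continue; }
    if (line === "") { pending = []; continue; } // blank line breaks the link
    if (pending.length > 0) {
      const col = line.match(/^([a-z_][a-z0-9_]*)/i);
      if (col) out.set(col[1], pending.join(" "));
      pending = [];
    }
  }
  return out;
}
```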
198
+
199
+ ## Output Specification
200
+
201
+ - File format: SQL (dialect determined by `tech_stack.database`)
202
+ - Storage location: `logos/resources/database/`
203
+ - Single file output: `schema.sql` (simple projects); or split by domain: `auth.sql`, `billing.sql` (complex projects)
204
+ - Every table must have a comment (PostgreSQL: `COMMENT ON TABLE`; MySQL: `COMMENT = '...'`; SQLite: `-- @table-comment`)
205
+ - Every field must have a comment (PostgreSQL: `COMMENT ON COLUMN`; MySQL: `COMMENT '...'` after field definition; SQLite: `-- @comment`)
206
+ - Add a SQL comment above each DDL block noting the source API endpoint
207
+
208
+ ## Database Dialect Quick Reference
209
+
210
+ | Feature | PostgreSQL | MySQL | SQLite |
211
+ |---------|-----------|-------|--------|
212
+ | UUID Primary Key | `UUID DEFAULT gen_random_uuid()` | `CHAR(36) DEFAULT (UUID())` or `BINARY(16)` | `TEXT PRIMARY KEY NOT NULL` (app-generated UUID) |
213
+ | Timestamp Type | `TIMESTAMPTZ` | `DATETIME` / `TIMESTAMP` (mind timezone handling) | `TEXT` (ISO 8601 string) |
214
+ | JSON Support | `JSONB` (indexable) | `JSON` (limited functionality) | `TEXT` (app-layer JSON serialization) |
215
+ | Row-Level Security | RLS (`ENABLE ROW LEVEL SECURITY`) | Not supported; application layer | Not supported; application layer |
216
+ | Table Comment | `COMMENT ON TABLE t IS '...'` | `CREATE TABLE t (...) COMMENT = '...'` | `-- @table-comment t description` |
217
+ | Column Comment | `COMMENT ON COLUMN t.c IS '...'` | `col_name TYPE COMMENT '...'` | `-- @comment description` (preceding line) |
218
+
219
+ ## Best Practices
220
+
221
+ ### General (All Databases)
222
+
223
+ - **Store monetary values as INTEGER in cents**: DECIMAL/FLOAT is prohibited to avoid floating-point precision issues
224
+ - **Soft delete**: prefer a `deleted_at` timestamp field over physical deletion
225
+ - **Audit fields**: every table should include `created_at` and `updated_at`
226
+ - **Timestamp fields with timezone**: avoid timezone pitfalls
227
+ - **Field names aligned with API**: DB column names should map predictably to API YAML field names (e.g., API `userId` → DB `user_id`, as long as the mapping rule is consistent), reducing unnecessary transformations in the code layer
228
+ - **Core tables first, auxiliary tables later**: don't try to design all tables at once — output core business tables for user review first, then add auxiliary tables
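On the integer-cents rule: the application layer then converts at the boundary. A possible sketch (helper names are illustrative) that parses decimal strings instead of doing float arithmetic:

```typescript
// Convert a decimal amount string (up to 2 fraction digits) to integer
// cents, and back — no floating point involved at any step.
function toCents(amount: string): number {
  const m = amount.match(/^(\d+)(?:\.(\d{1,2}))?$/);
  if (!m) throw new Error(`invalid amount: ${amount}`);
  const frac = m[2] ? m[2].padEnd(2, "0") : "00";
  return parseInt(m[1], 10) * 100 + parseInt(frac, 10);
}

function fromCents(cents: number): string {
  const sign = cents < 0 ? "-" : "";
  const abs = Math.abs(cents);
  return `${sign}${Math.floor(abs / 100)}.${String(abs % 100).padStart(2, "0")}`;
}
```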
229
+
230
+ ### PostgreSQL-Specific
231
+
232
+ - **Primary key**: `id UUID DEFAULT gen_random_uuid() PRIMARY KEY`
233
+ - **Timestamp type**: use `TIMESTAMPTZ`
234
+ - **RLS**: enable on all tables with `ALTER TABLE ... ENABLE ROW LEVEL SECURITY;`
235
+ - **JSONB**: prefer JSONB for unstructured storage and create GIN indexes
236
+
237
+ ### MySQL-Specific
238
+
239
+ - **Primary key**: `id CHAR(36) DEFAULT (UUID()) PRIMARY KEY` or auto-increment BIGINT
240
+ - **Timestamp type**: use `TIMESTAMP` (automatic timezone conversion) or `DATETIME` (stored as-is)
241
+ - **Character set**: specify `CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci` when creating tables
242
+ - **Engine**: always use `ENGINE=InnoDB`
243
+
244
+ ### SQLite-Specific
245
+
246
+ - **Primary key**: `TEXT PRIMARY KEY NOT NULL` (app-generated UUID v4) or `INTEGER PRIMARY KEY AUTOINCREMENT`
247
+ - **Timestamp type**: use `TEXT` with ISO 8601 strings, default `DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now'))`
248
+ - **Foreign keys**: must execute `PRAGMA foreign_keys = ON;` at connection time
249
+ - **Comments**: MUST use `-- @comment` / `-- @table-comment` structured annotations (see `logos/spec/sql-comment-convention.md`)
250
+ - **No triggers**: `updated_at` must be refreshed at the application layer; do not rely on `ON UPDATE` triggers
251
+
252
+ ## Recommended Prompts
253
+
254
+ The following prompts can be copied directly for use with AI:
255
+
256
+ - `Help me design the database`
257
+ - `Derive database DDL from the API specifications`
258
+ - `Help me design the database tables involved in S01`
259
+ - `Help me add indexes and RLS policies to the existing table structures`