specrails-core 3.1.0 → 3.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/commands/setup.md CHANGED
@@ -689,12 +689,78 @@ Ask the user where they manage their product backlog:

  Where do you track your product backlog?

- 1. **GitHub Issues** — uses `gh` CLI to read/create issues with labels and VPC scores
- 2. **JIRA** — uses JIRA CLI or REST API to read/create tickets in a JIRA project
- 3. **None** — skip backlog commands (you can still use /implement with text descriptions)
+ 1. **Local tickets** (recommended) — lightweight JSON-based ticket management built into the project.
+ No external tools required. Tickets stored in `.claude/local-tickets.json`, version-controlled and diffable.
+ 2. **GitHub Issues** — uses `gh` CLI to read/create issues with labels and VPC scores
+ 3. **JIRA** — uses JIRA CLI or REST API to read/create tickets in a JIRA project
+ 4. **None** — skip backlog commands (you can still use /implement with text descriptions)
  ```

- Wait for the user's choice. Set `BACKLOG_PROVIDER` to `github`, `jira`, or `none`.
+ Wait for the user's choice. Set `BACKLOG_PROVIDER` to `local`, `github`, `jira`, or `none`.
+
+ #### If Local Tickets
+
+ No external tools or credentials required. Initialize the storage file:
+
+ 1. Copy `templates/local-tickets-schema.json` to `$SPECRAILS_DIR/local-tickets.json`
+ 2. Set `last_updated` to the current ISO-8601 timestamp
+
+ Store configuration in `$SPECRAILS_DIR/backlog-config.json`:
+ ```json
+ {
+   "provider": "local",
+   "write_access": true,
+   "git_auto": true
+ }
+ ```
+
+ Local tickets are always read-write — there is no "read only" mode since the file is local.
+
+ **Ticket schema** — each entry in the `tickets` map has these fields:
+
+ ```json
+ {
+   "id": 1,
+   "title": "Feature title",
+   "description": "Markdown description",
+   "status": "todo",
+   "priority": "medium",
+   "labels": ["area:frontend", "effort:medium"],
+   "assignee": null,
+   "prerequisites": [],
+   "metadata": {
+     "vpc_scores": {},
+     "effort_level": "Medium",
+     "user_story": "",
+     "area": ""
+   },
+   "comments": [],
+   "created_at": "<ISO-8601>",
+   "updated_at": "<ISO-8601>",
+   "created_by": "user",
+   "source": "manual"
+ }
+ ```
+
+ **Status values:** `todo`, `in_progress`, `done`, `cancelled`
+ **Priority values:** `critical`, `high`, `medium`, `low`
+ **Labels:** Freeform strings following the `area:*` and `effort:*` convention
+ **Source values:** `manual`, `product-backlog`, `propose-spec`
+
+ **Advisory file locking protocol** (CLI agents and hub server must both follow this):
+
+ The `revision` counter in the JSON root enables optimistic concurrency — increment it on **every** write. The lock file prevents concurrent corruption:
+
+ 1. **Acquire lock:** Check for `$SPECRAILS_DIR/local-tickets.json.lock`
+    - If the file exists and its `timestamp` is less than 30 seconds old: wait 500ms and retry (max 5 attempts before aborting with an error)
+    - If the file exists and its `timestamp` is 30+ seconds old (stale): delete it and proceed
+    - If no lock file exists: proceed immediately
+ 2. **Create lock file:** Write `{"agent": "<agent-name-or-process>", "timestamp": "<ISO-8601>"}` to `$SPECRAILS_DIR/local-tickets.json.lock`
+ 3. **Minimal lock window:** Read the JSON → modify in memory → write back → release
+ 4. **Release lock:** Delete `$SPECRAILS_DIR/local-tickets.json.lock`
+ 5. **Always increment `revision`** by 1 and update `last_updated` on every successful write
+
+ The hub server uses `proper-lockfile` (or equivalent) to honor the same protocol via the `.lock` file path.

  #### If GitHub Issues

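The locking steps added above can be sketched as follows — an illustrative Python sketch only, not part of the package; the helper names (`acquire_lock`, `write_with_lock`) are hypothetical, and it assumes the lock file's `timestamp` is a timezone-aware ISO-8601 string as written by this same code:

```python
import json
import os
import time
from datetime import datetime, timezone

LOCK_STALE_SECONDS = 30   # a lock older than this is considered stale
RETRY_DELAY = 0.5         # wait 500ms between attempts
MAX_ATTEMPTS = 5          # abort after 5 failed attempts

def acquire_lock(lock_path, agent="cli-agent"):
    """Acquire the advisory lock, retrying while a fresh lock exists."""
    for _ in range(MAX_ATTEMPTS):
        if os.path.exists(lock_path):
            with open(lock_path) as f:
                held = json.load(f)
            held_at = datetime.fromisoformat(held["timestamp"])
            age = (datetime.now(timezone.utc) - held_at).total_seconds()
            if age < LOCK_STALE_SECONDS:
                time.sleep(RETRY_DELAY)   # fresh lock: wait and retry
                continue
            os.remove(lock_path)          # stale lock: delete and proceed
        with open(lock_path, "w") as f:
            json.dump({"agent": agent,
                       "timestamp": datetime.now(timezone.utc).isoformat()}, f)
        return True
    raise TimeoutError("could not acquire local-tickets lock")

def write_with_lock(store_path, mutate):
    """Minimal lock window: read -> mutate in memory -> write back -> release."""
    lock_path = store_path + ".lock"
    acquire_lock(lock_path)
    try:
        with open(store_path) as f:
            data = json.load(f)
        mutate(data)
        data["revision"] += 1             # bump revision on every write
        data["last_updated"] = datetime.now(timezone.utc).isoformat()
        with open(store_path, "w") as f:
            json.dump(data, f, indent=2)
    finally:
        os.remove(lock_path)              # always release the lock
```

Note the release in `finally`: even a failed write must not leave a live lock behind, since other agents would then stall for the full 30-second staleness window.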
@@ -1059,6 +1125,27 @@ When adapting `update-product-driven-backlog.md` and `product-backlog.md`, subst

  **When `IS_OSS=false`**: All Kai-related persona references are omitted. `{{MAX_SCORE}}` reduces by 5. Tables and inline scores contain only user-generated personas.

+ #### Local Tickets (`BACKLOG_PROVIDER=local`)
+
+ For the local provider, backlog placeholders resolve to **inline file-operation instructions** embedded in the generated command markdown — not shell commands. Agents execute these by reading/writing `$SPECRAILS_DIR/local-tickets.json` directly using their file tools.
+
+ All write operations must follow the **advisory file locking protocol** defined in Phase 3.2. Always increment `revision` and update `last_updated` on every write.
+
+ | Placeholder | Substituted value |
+ |-------------|-------------------|
+ | `{{BACKLOG_PROVIDER_NAME}}` | `Local Tickets` |
+ | `{{BACKLOG_PREFLIGHT}}` | `[[ -f "$SPECRAILS_DIR/local-tickets.json" ]] && echo "Local tickets storage: OK" \|\| echo "WARNING: $SPECRAILS_DIR/local-tickets.json not found — run /setup to initialize"` |
+ | `{{BACKLOG_FETCH_CMD}}` | Read `$SPECRAILS_DIR/local-tickets.json`. Parse the `tickets` map and return all entries where `status` is `"todo"` or `"in_progress"`. |
+ | `{{BACKLOG_FETCH_ALL_CMD}}` | Read `$SPECRAILS_DIR/local-tickets.json`. Parse the `tickets` map and return all entries regardless of status. |
+ | `{{BACKLOG_FETCH_CLOSED_CMD}}` | Read `$SPECRAILS_DIR/local-tickets.json`. Parse the `tickets` map and return all entries where `status` is `"done"` or `"cancelled"`. |
+ | `{{BACKLOG_VIEW_CMD}}` | Read `$SPECRAILS_DIR/local-tickets.json`. Parse JSON and return the full ticket object at `tickets["{id}"]`, or an error if not found. |
+ | `{{BACKLOG_CREATE_CMD}}` | Write to `$SPECRAILS_DIR/local-tickets.json` using the advisory locking protocol: acquire lock → read file → set `id = next_id`, increment `next_id`, set all ticket fields, set `created_at` and `updated_at` to now, bump `revision`, update `last_updated` → write → release lock. |
+ | `{{BACKLOG_UPDATE_CMD}}` | Write to `$SPECRAILS_DIR/local-tickets.json` using the advisory locking protocol: acquire lock → read file → update fields in `tickets["{id}"]`, set `updated_at` to now, bump `revision`, update `last_updated` → write → release lock. |
+ | `{{BACKLOG_DELETE_CMD}}` | Write to `$SPECRAILS_DIR/local-tickets.json` using the advisory locking protocol: acquire lock → read file → delete `tickets["{id}"]`, bump `revision`, update `last_updated` → write → release lock. |
+ | `{{BACKLOG_COMMENT_CMD}}` | Write to `$SPECRAILS_DIR/local-tickets.json` using the advisory locking protocol: acquire lock → read file → append `{"author": "<agent-name>", "body": "<comment>", "created_at": "<ISO-8601>"}` to `tickets["{id}"].comments` (create the array if absent), set `updated_at` to now, bump `revision`, update `last_updated` → write → release lock. |
+ | `{{BACKLOG_PARTIAL_COMMENT_CMD}}` | Same as `{{BACKLOG_COMMENT_CMD}}` but append `{"author": "<agent-name>", "body": "<comment>", "type": "progress", "created_at": "<ISO-8601>"}`. |
+ | `{{BACKLOG_INIT_LABELS_CMD}}` | No label initialization required. Local tickets use freeform label strings. Standard label conventions: `area:frontend`, `area:backend`, `area:api`, `effort:low`, `effort:medium`, `effort:high`. |
+
  #### GitHub Issues (`BACKLOG_PROVIDER=github`)
  - Issue fetch: `gh issue list --label "product-driven-backlog" --state open --limit 100 --json number,title,labels,body`
  - Issue create: `gh issue create --title "..." --label "..." --body "..."`
@@ -1087,7 +1174,7 @@ When adapting `update-product-driven-backlog.md` and `product-backlog.md`, subst
  }
  ```

- The command templates use `{{BACKLOG_FETCH_CMD}}`, `{{BACKLOG_CREATE_CMD}}`, `{{BACKLOG_VIEW_CMD}}`, and `{{BACKLOG_PREFLIGHT}}` placeholders that get filled with the provider-specific commands.
+ The command templates use `{{BACKLOG_FETCH_CMD}}`, `{{BACKLOG_CREATE_CMD}}`, `{{BACKLOG_VIEW_CMD}}`, `{{BACKLOG_PREFLIGHT}}`, and related placeholders that get filled with the provider-specific inline instructions (for `local`) or commands (for `github`, `jira`). The `{{BACKLOG_PROVIDER_NAME}}` placeholder is substituted with a human-readable provider label in all three cases.

  ### 4.4 Generate rules

@@ -35,5 +35,11 @@
    "refactor-recommender",
    "health-check",
    "compat-check"
-  ]
+  ],
+  "ticketProvider": {
+    "type": "local",
+    "storageFile": "local-tickets.json",
+    "lockFile": "local-tickets.json.lock",
+    "capabilities": ["crud", "labels", "status", "priorities", "dependencies", "comments"]
+  }
  }
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "specrails-core",
-   "version": "3.1.0",
+   "version": "3.2.0",
    "description": "AI agent workflow system for Claude Code — installs 12 specialized agents, orchestration commands, and persona-driven product discovery into any repository",
    "bin": {
      "specrails-core": "bin/specrails-core.js"
@@ -24,12 +24,22 @@ Check the environment variable `CLAUDE_CODE_ENTRYPOINT`. If it contains `remote_

  ### Checks to run (sequential, fail-fast)

- #### 1. GitHub CLI authentication
+ #### 1. Backlog provider availability

+ Read `.claude/backlog-config.json` and extract `BACKLOG_PROVIDER`.
+
+ **If `BACKLOG_PROVIDER=local`:**
  ```bash
- gh auth status 2>&1
+ [[ -f "$SPECRAILS_DIR/local-tickets.json" ]] && echo "Local tickets storage: OK" || echo "WARNING: local-tickets.json not found"
  ```
+ - Set `LOCAL_TICKETS_AVAILABLE=true/false` based on file existence.
+ - Set `GH_AVAILABLE=false` (GitHub CLI not needed for local provider).
+ - Set `BACKLOG_AVAILABLE=true` if local-tickets.json exists.

+ **Otherwise:**
+ ```bash
+ gh auth status 2>&1
+ ```
  - Set `GH_AVAILABLE=true/false` for later phases.

  #### 2. OpenSpec CLI
@@ -107,7 +117,7 @@ Initialize conflict-tracking variables:
  - Set `SINGLE_MODE = true`. No worktrees, no parallelism.
  - **Skip Phase 1 and Phase 2** — go directly to Phase 3a.

- **If the user passed issue/ticket references** (e.g. `#85, #71` for GitHub or `PROJ-85, PROJ-71` for JIRA):
+ **If the user passed issue/ticket references** (e.g. `#85, #71` for GitHub, `#1, #2` for local tickets, or `PROJ-85, PROJ-71` for JIRA):
  - Fetch each issue/ticket:
  ```bash
  {{BACKLOG_VIEW_CMD}}
@@ -120,7 +130,44 @@ Initialize conflict-tracking variables:

  After fetching issue refs, capture a baseline snapshot for conflict detection.

- **If `GH_AVAILABLE=true` and the input mode was issue numbers:**
+ ##### If `BACKLOG_PROVIDER=local` and input mode was issue numbers:
+
+ For each resolved ticket ID, read `$SPECRAILS_DIR/local-tickets.json` and extract the ticket object at `tickets["{id}"]`.
+
+ Build a snapshot object for each ticket:
+ - `number`: ticket `id` (integer)
+ - `title`: ticket `title` string
+ - `state`: map ticket `status` — `"done"` or `"cancelled"` → `"closed"`, otherwise → `"open"`
+ - `assignees`: `[ticket.assignee]` if non-null, else `[]`
+ - `labels`: ticket `labels` array, sorted alphabetically
+ - `body_sha`: SHA-256 of the ticket `description` string — compute with:
+   ```bash
+   echo -n "{description}" | sha256sum | cut -d' ' -f1
+   ```
+   If `sha256sum` is not available, fall back to `openssl dgst -sha256 -r` or `shasum -a 256`.
+ - `updated_at`: ticket `updated_at` value
+ - `captured_at`: current local time in ISO 8601 format
+
+ Write the following JSON to `.claude/backlog-cache.json` (overwrite fully — this establishes a fresh baseline for this run):
+
+ ```json
+ {
+   "schema_version": "1",
+   "provider": "local",
+   "last_updated": "<ISO 8601 timestamp>",
+   "written_by": "implement",
+   "issues": {
+     "<id>": { <snapshot object> },
+     ...
+   }
+ }
+ ```
+
+ If the write succeeds: set `SNAPSHOTS_CAPTURED=true`.
+
+ If the write fails: print `[backlog-cache] Warning: could not write cache. Conflict detection disabled for this run.` and set `SNAPSHOTS_CAPTURED=false`. Do NOT abort the pipeline.
+
+ ##### If `GH_AVAILABLE=true` and input mode was issue numbers (GitHub/JIRA):

  For each resolved issue number, run:

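The local snapshot mapping added in the hunk above can be sketched in Python, with `hashlib` standing in for the `sha256sum` shell pipeline. This is an illustrative sketch, not package code; the function name is hypothetical and the field names are taken from the list in the diff:

```python
import hashlib
from datetime import datetime, timezone

def snapshot_from_ticket(ticket):
    """Build a Phase 0 baseline snapshot object from a local ticket dict."""
    status = ticket["status"]
    return {
        "number": ticket["id"],
        "title": ticket["title"],
        # "done"/"cancelled" map to "closed"; "todo"/"in_progress" map to "open"
        "state": "closed" if status in ("done", "cancelled") else "open",
        "assignees": [ticket["assignee"]] if ticket.get("assignee") else [],
        "labels": sorted(ticket.get("labels", [])),
        # SHA-256 of the raw description string, hex-encoded
        "body_sha": hashlib.sha256(ticket["description"].encode()).hexdigest(),
        "updated_at": ticket["updated_at"],
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }
```

Sorting `labels` (and normalizing `assignees` to a list) keeps the snapshot order-independent, so later diffs flag real edits rather than incidental reordering.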
@@ -161,9 +208,9 @@ If the write succeeds: set `SNAPSHOTS_CAPTURED=true`.

  If the write fails (e.g., `.claude/` directory does not exist): print `[backlog-cache] Warning: could not write cache. Conflict detection disabled for this run.` and set `SNAPSHOTS_CAPTURED=false`. Do NOT abort the pipeline.

- **If `GH_AVAILABLE=false` or input was not issue numbers:**
+ ##### Otherwise (no backlog available or non-issue input):

- Set `SNAPSHOTS_CAPTURED=false`. Print: `[conflict-check] Snapshot skipped — GH unavailable or non-issue input.`
+ Set `SNAPSHOTS_CAPTURED=false`. Print: `[conflict-check] Snapshot skipped — backlog unavailable or non-issue input.`

  #### Gitignore advisory

@@ -267,7 +314,9 @@ Pick the single idea with the best impact/effort ratio from each exploration. Pr

  Otherwise, re-fetch each issue in scope and diff against the Phase 0 snapshot:

- For each issue number in `ISSUE_REFS`:
+ **If `BACKLOG_PROVIDER=local`:** For each ticket ID in `ISSUE_REFS`, read `$SPECRAILS_DIR/local-tickets.json` and extract the ticket at `tickets["{id}"]`. If the ticket does not exist (deleted): treat as a CRITICAL conflict — field `"state"`, was `<cached state>`, now `"deleted"`. Otherwise, reconstruct a current snapshot using the same mapping as the Phase 0 local snapshot.
+
+ **If `BACKLOG_PROVIDER=github`:** For each issue number in `ISSUE_REFS`:

  ```bash
  gh issue view {number} --json number,title,state,assignees,labels,body,updatedAt
@@ -275,7 +324,7 @@ gh issue view {number} --json number,title,state,assignees,labels,body,updatedAt

  If the `gh` command returns non-zero (issue deleted or inaccessible): treat as a CRITICAL conflict — field `"state"`, was `<cached state>`, now `"deleted"`.

- Otherwise, reconstruct a current snapshot (same shape as Phase 0: sort `assignees` and `labels`, compute `body_sha`).
+ In both cases, reconstruct a current snapshot (same shape as Phase 0: sort `assignees` and `labels`, compute `body_sha`).

  **Short-circuit:** If `current.updatedAt == cached.updated_at`, mark the issue as clean and skip field comparison.

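The re-fetch-and-diff step with its short-circuit can be sketched as below. This is an illustrative sketch under assumptions: the function name is hypothetical, and the compared field set is taken from the Phase 0 snapshot shape described in this diff:

```python
def diff_snapshot(cached, current):
    """Compare a cached Phase 0 snapshot against a re-fetched one.

    Returns [] when clean; otherwise a list of (field, was, now) conflicts.
    """
    # Short-circuit: an unchanged updated_at means the ticket was not touched,
    # so field-by-field comparison can be skipped entirely.
    if current["updated_at"] == cached["updated_at"]:
        return []
    conflicts = []
    for field in ("state", "title", "assignees", "labels", "body_sha"):
        if current[field] != cached[field]:
            conflicts.append((field, cached[field], current[field]))
    return conflicts
```

Because `assignees` and `labels` are sorted before caching, plain equality is a reliable comparison here.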
@@ -848,6 +897,9 @@ This check is independent of Phase 3a.0. Even if the user chose to continue thro

  Re-fetch each issue in `ISSUE_REFS` and diff against `.claude/backlog-cache.json` using the same algorithm as Phase 3a.0:

+ **If `BACKLOG_PROVIDER=local`:** Read `$SPECRAILS_DIR/local-tickets.json` and extract each ticket by ID.
+
+ **If `BACKLOG_PROVIDER=github`:**
  ```bash
  gh issue view {number} --json number,title,state,assignees,labels,body,updatedAt
  ```
@@ -881,8 +933,12 @@ Record skipped operations to `.cache-manifest.json` under `skipped_operations`:
  - `"git: commit"`
  - `"git: push"`
  - `"github: pr creation"` (if `GH_AVAILABLE=true`)
- - `"github: issue comment #N"` for each issue in scope (if `BACKLOG_WRITE=true`)
- - `"github: issue close #N (via PR merge)"` for each fully resolved issue (if `BACKLOG_WRITE=true`)
+ - If `BACKLOG_PROVIDER=local` and `BACKLOG_WRITE=true`:
+   - `"local: ticket comment #{id}"` for each ticket in scope
+   - `"local: ticket status update #{id}"` for each fully resolved ticket
+ - If `BACKLOG_PROVIDER=github` and `BACKLOG_WRITE=true`:
+   - `"github: issue comment #N"` for each issue in scope
+   - `"github: issue close #N (via PR merge)"` for each fully resolved issue

  Then skip the rest of Phase 4c and proceed directly to Phase 4e.

@@ -934,17 +990,19 @@ All implementation is complete and CI checks pass.
  #### Backlog updates (both modes)

  **If `BACKLOG_WRITE=true`:**
- - For fully resolved issues/tickets: add a comment noting completion and reference the PR. Do NOT close the issue explicitly — use `Closes #N` in the PR body so GitHub/JIRA closes it automatically when the PR is merged:
+ - For fully resolved issues/tickets: add a comment noting completion and reference the PR:
  ```bash
  {{BACKLOG_COMMENT_CMD}}
  ```
-   - GitHub: `gh issue comment {number} --body "Implemented in PR #XX. All acceptance criteria met."`
-   - JIRA: `jira issue comment {key} --message "Implemented in PR #XX. All acceptance criteria met."`
-   - Ensure the PR body includes `Closes #N` for each fully resolved issue (GitHub auto-closes on merge)
+   - **Local:** Update the ticket status to `"done"` using `{{BACKLOG_UPDATE_CMD}}` and add a comment: `"Implemented in PR #XX. All acceptance criteria met."` via `{{BACKLOG_COMMENT_CMD}}`. Local tickets are closed directly — there is no auto-close-on-merge mechanism.
+   - **GitHub:** `gh issue comment {number} --body "Implemented in PR #XX. All acceptance criteria met."` — do NOT close the issue explicitly. Use `Closes #N` in the PR body so GitHub auto-closes on merge.
+   - **JIRA:** `jira issue comment {key} --message "Implemented in PR #XX. All acceptance criteria met."`
+   - For GitHub/JIRA: ensure the PR body includes `Closes #N` for each fully resolved issue (auto-closes on merge)
  - For partially resolved issues/tickets: add a comment noting progress:
  ```bash
  {{BACKLOG_PARTIAL_COMMENT_CMD}}
  ```
+   - **Local:** Additionally update the ticket status to `"in_progress"` via `{{BACKLOG_UPDATE_CMD}}` if it is still `"todo"`.

  **If `BACKLOG_WRITE=false`:**
  - Do NOT create, modify, or comment on any issues/tickets.
@@ -161,6 +161,33 @@ The product-analyst receives this prompt:

  7. **[Orchestrator]** After the product-analyst completes, write issue snapshots to `.claude/backlog-cache.json`.

+ #### If provider=local — Cache from Local Tickets
+
+ Read `$SPECRAILS_DIR/local-tickets.json` and parse the `tickets` map. For each ticket with `"product-driven-backlog"` in its `labels` array and `status` not `"cancelled"`, build a snapshot object:
+ - `number`: ticket `id` (integer)
+ - `title`: ticket `title` string
+ - `state`: map ticket `status` — `"done"` or `"cancelled"` → `"closed"`, otherwise → `"open"`
+ - `assignees`: `[ticket.assignee]` if non-null, else `[]`
+ - `labels`: ticket `labels` array, sorted alphabetically
+ - `body_sha`: SHA-256 of the ticket `description` string — compute with:
+   ```bash
+   echo -n "{description}" | sha256sum | cut -d' ' -f1
+   ```
+   If `sha256sum` is not available, fall back to `openssl dgst -sha256 -r` or `shasum -a 256`.
+ - `updated_at`: ticket `updated_at` value
+ - `captured_at`: current local time in ISO 8601 format
+
+ Write to `.claude/backlog-cache.json` with:
+ - `schema_version`: `"1"`
+ - `provider`: `"local"`
+ - `last_updated`: current ISO 8601 timestamp
+ - `written_by`: `"product-backlog"`
+ - `issues`: the map keyed by string ticket ID
+
+ If the write fails: print `[backlog-cache] Warning: could not write cache. Continuing.` Do not abort.
+
+ #### If provider=github — Cache from GitHub Issues
+
  **Guard:** If `GH_AVAILABLE=false` (from Phase 0 pre-flight), print `[backlog-cache] Skipped — GH unavailable.` and return. Do not attempt the write.

  **Fetch all open backlog issues in one call:**
@@ -193,3 +220,7 @@ The product-analyst receives this prompt:
  - `issues`: the merged map keyed by string issue number

  If the write fails (e.g., `.claude/` directory does not exist): print `[backlog-cache] Warning: could not write cache. Continuing.` Do not abort.
+
+ #### If provider=jira or provider=none
+
+ Print `[backlog-cache] Skipped — provider does not support cache.` Do not attempt the write.
@@ -42,3 +42,59 @@ Output ONLY the following structured markdown. Do not add any preamble or explan
  ## Estimated Complexity
  [One of: Low (< 1 day) / Medium (1-3 days) / High (3-7 days) / Very High (> 1 week)]
  [One sentence justifying the estimate]
+
+ ---
+
+ ## Backlog Sync
+
+ After generating the proposal, read `.claude/backlog-config.json` to determine `BACKLOG_PROVIDER` and `BACKLOG_WRITE`.
+
+ ### If provider=local — Create Local Ticket
+
+ Create a local ticket from the proposal output:
+
+ ```
+ {{BACKLOG_CREATE_CMD}}
+ ```
+
+ Set the following fields:
+ - `title`: The Spec Title from the proposal
+ - `description`: The full structured proposal markdown (all sections from Problem Statement through Estimated Complexity)
+ - `status`: `"todo"`
+ - `priority`: Map Estimated Complexity — Low → `"low"`, Medium → `"medium"`, High/Very High → `"high"`
+ - `labels`: `["spec-proposal"]`
+ - `source`: `"propose-spec"`
+ - `created_by`: `"sr-product-engineer"`
+
+ Print: `Created local ticket #{id}: {title}`
+
+ ### If provider=github and BACKLOG_WRITE=true — Create GitHub Issue
+
+ ```bash
+ {{BACKLOG_CREATE_CMD}}
+ ```
+
+ Create a GitHub Issue with:
+ - Title: The Spec Title
+ - Body: Full structured proposal markdown
+ - Labels: `spec-proposal`
+
+ Print: `Created GitHub Issue #{number}: {title}`
+
+ ### If provider=jira and BACKLOG_WRITE=true — Create JIRA Story
+
+ Create a JIRA Story using the same authentication and API pattern as `/sr:update-product-driven-backlog`:
+ - Summary: The Spec Title
+ - Description: Full structured proposal in Atlassian Document Format
+ - Labels: `spec-proposal`
+
+ Print: `Created JIRA ticket {key}: {title}`
+
+ ### If BACKLOG_WRITE=false or provider=none
+
+ Do NOT create any tickets. Print:
+ ```
+ Spec proposal ready. Create a ticket manually if desired:
+   Title: {Spec Title}
+   Complexity: {Estimated Complexity}
+ ```
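The complexity-to-priority mapping added above is small enough to sketch directly. An illustrative Python sketch (the function name is hypothetical; it assumes the complexity string begins with the level exactly as emitted by the proposal template):

```python
def priority_from_complexity(complexity):
    """Map a proposal's Estimated Complexity line to a local ticket priority."""
    if complexity.startswith("Low"):
        return "low"
    if complexity.startswith("Medium"):
        return "medium"
    # "High" and "Very High" both collapse to "high"
    return "high"
```

Matching on the prefix rather than the full string lets the parenthesized duration (e.g. `"Low (< 1 day)"`) pass through unchanged.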
@@ -68,7 +68,7 @@ After the Explore agent completes:
  1. **Display** results to the user.

  2. Read `.claude/backlog-config.json` and extract:
-    - `BACKLOG_PROVIDER` (`github`, `jira`, or `none`)
+    - `BACKLOG_PROVIDER` (`local`, `github`, `jira`, or `none`)
     - `BACKLOG_WRITE` (from `write_access`)

  ### If `BACKLOG_WRITE=false` — Display only (no sync)
@@ -98,6 +98,46 @@ After the Explore agent completes:

  4. **Do NOT** create, modify, or comment on any issues/tickets.

+ ### If provider=local — Sync to Local Tickets
+
+ Local tickets are always read-write. Sync directly to `$SPECRAILS_DIR/local-tickets.json`.
+
+ 3. **Fetch existing local tickets** to avoid duplicates:
+    ```
+    {{BACKLOG_FETCH_ALL_CMD}}
+    ```
+    Collect all ticket titles into a duplicate-check set.
+
+ 4. **Initialize labels** (idempotent):
+    ```
+    {{BACKLOG_INIT_LABELS_CMD}}
+    ```
+
+ 5. **For each proposed feature, create a local ticket** (skip if title matches an existing ticket):
+    ```
+    {{BACKLOG_CREATE_CMD}}
+    ```
+    Set the following fields on each new ticket:
+    - `title`: Feature name
+    - `description`: Full VPC body markdown (same format as the GitHub/JIRA issue body above)
+    - `status`: `"todo"`
+    - `priority`: Map effort to priority — Low effort → `"high"` priority, Medium → `"medium"`, High → `"low"`
+    - `labels`: `["product-driven-backlog", "area:{area}"]`
+    - `metadata.vpc_scores`: Object with per-persona scores from the VPC evaluation
+    - `metadata.effort_level`: `"High"`, `"Medium"`, or `"Low"`
+    - `metadata.user_story`: The user story text
+    - `metadata.area`: The area name (without `area:` prefix)
+    - `prerequisites`: Array of ticket IDs for any features this depends on (empty if none)
+    - `source`: `"product-backlog"`
+    - `created_by`: `"sr-product-manager"`
+
+ 6. **Report** sync results:
+    ```
+    Product discovery complete:
+    - Created: {N} new feature ideas as local tickets
+    - Skipped: {N} duplicates (already exist)
+    ```
+
  ### If provider=github and BACKLOG_WRITE=true — Sync to GitHub Issues

  3. **Fetch existing product-driven backlog items** to avoid duplicates:
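The field mapping for local sync in the hunk above, including the inverted effort-to-priority rule (cheap wins get scheduled first), can be sketched as follows. Illustrative only: the `feature` dict and the function name are hypothetical stand-ins for whatever shape the orchestrator passes around:

```python
def build_backlog_ticket(feature):
    """Map one proposed feature (hypothetical dict) to local ticket fields."""
    return {
        "title": feature["name"],
        "description": feature["vpc_body"],  # full VPC body markdown
        "status": "todo",
        # Inverted mapping: Low effort -> high priority, High effort -> low
        "priority": {"Low": "high", "Medium": "medium", "High": "low"}[feature["effort"]],
        "labels": ["product-driven-backlog", f"area:{feature['area']}"],
        "metadata": {
            "vpc_scores": feature["vpc_scores"],
            "effort_level": feature["effort"],
            "user_story": feature["user_story"],
            "area": feature["area"],  # stored without the "area:" prefix
        },
        "prerequisites": [],
        "source": "product-backlog",
        "created_by": "sr-product-manager",
    }
```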
@@ -0,0 +1,7 @@
+ {
+   "schema_version": "1.0",
+   "revision": 0,
+   "last_updated": null,
+   "next_id": 1,
+   "tickets": {}
+ }
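The template above seeds the storage file during `/setup` (copy the defaults, then stamp `last_updated`). A Python sketch of that initialization, with a hypothetical function name and the template defaults inlined rather than read from disk:

```python
import json
from datetime import datetime, timezone

def init_local_tickets(path):
    """Write a fresh local-tickets store from the schema template defaults."""
    data = {
        "schema_version": "1.0",
        "revision": 0,
        # Template ships null; setup stamps the current ISO-8601 timestamp
        "last_updated": datetime.now(timezone.utc).isoformat(),
        "next_id": 1,
        "tickets": {},
    }
    with open(path, "w") as f:
        json.dump(data, f, indent=2)
    return data
```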