@goondocks/myco 0.6.4 → 0.9.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (288)
  1. package/.claude-plugin/marketplace.json +2 -3
  2. package/.claude-plugin/plugin.json +3 -3
  3. package/CONTRIBUTING.md +37 -30
  4. package/README.md +64 -28
  5. package/bin/myco-run +2 -0
  6. package/dist/agent-run-EFICNTAU.js +34 -0
  7. package/dist/agent-run-EFICNTAU.js.map +1 -0
  8. package/dist/agent-tasks-RXJ7Z5NG.js +180 -0
  9. package/dist/agent-tasks-RXJ7Z5NG.js.map +1 -0
  10. package/dist/chunk-2T7RPVPP.js +116 -0
  11. package/dist/chunk-2T7RPVPP.js.map +1 -0
  12. package/dist/chunk-3K5WGSJ4.js +165 -0
  13. package/dist/chunk-3K5WGSJ4.js.map +1 -0
  14. package/dist/chunk-46PWOKSI.js +26 -0
  15. package/dist/chunk-46PWOKSI.js.map +1 -0
  16. package/dist/chunk-4LPQ26CK.js +277 -0
  17. package/dist/chunk-4LPQ26CK.js.map +1 -0
  18. package/dist/chunk-5PEUFJ6U.js +92 -0
  19. package/dist/chunk-5PEUFJ6U.js.map +1 -0
  20. package/dist/chunk-5VZ52A4T.js +136 -0
  21. package/dist/chunk-5VZ52A4T.js.map +1 -0
  22. package/dist/chunk-BUSP3OJB.js +103 -0
  23. package/dist/chunk-BUSP3OJB.js.map +1 -0
  24. package/dist/chunk-D7TYRPRM.js +7312 -0
  25. package/dist/chunk-D7TYRPRM.js.map +1 -0
  26. package/dist/chunk-DCXRSSBP.js +22 -0
  27. package/dist/chunk-DCXRSSBP.js.map +1 -0
  28. package/dist/chunk-E4VLWIJC.js +2 -0
  29. package/dist/chunk-FFAYUQ5N.js +39 -0
  30. package/dist/chunk-FFAYUQ5N.js.map +1 -0
  31. package/dist/chunk-IB76KGBY.js +2 -0
  32. package/dist/chunk-JMJJEQ3P.js +486 -0
  33. package/dist/chunk-JMJJEQ3P.js.map +1 -0
  34. package/dist/{chunk-N33KUCFP.js → chunk-JTYZRPX5.js} +1 -9
  35. package/dist/chunk-JTYZRPX5.js.map +1 -0
  36. package/dist/{chunk-NLUE6CYG.js → chunk-JYOOJCPQ.js} +33 -17
  37. package/dist/chunk-JYOOJCPQ.js.map +1 -0
  38. package/dist/{chunk-Z74SDEKE.js → chunk-KB4DGYIY.js} +91 -9
  39. package/dist/chunk-KB4DGYIY.js.map +1 -0
  40. package/dist/{chunk-ERG2IEWX.js → chunk-KH64DHOY.js} +3 -7413
  41. package/dist/chunk-KH64DHOY.js.map +1 -0
  42. package/dist/chunk-KV4OC4H3.js +498 -0
  43. package/dist/chunk-KV4OC4H3.js.map +1 -0
  44. package/dist/chunk-KYLDNM7H.js +66 -0
  45. package/dist/chunk-KYLDNM7H.js.map +1 -0
  46. package/dist/chunk-LPUQPDC2.js +19 -0
  47. package/dist/chunk-LPUQPDC2.js.map +1 -0
  48. package/dist/chunk-M5XWW7UI.js +97 -0
  49. package/dist/chunk-M5XWW7UI.js.map +1 -0
  50. package/dist/chunk-MHSCMET3.js +275 -0
  51. package/dist/chunk-MHSCMET3.js.map +1 -0
  52. package/dist/chunk-MYX5NCRH.js +45 -0
  53. package/dist/chunk-MYX5NCRH.js.map +1 -0
  54. package/dist/chunk-OXZSXYAT.js +877 -0
  55. package/dist/chunk-OXZSXYAT.js.map +1 -0
  56. package/dist/chunk-PB6TOLRQ.js +35 -0
  57. package/dist/chunk-PB6TOLRQ.js.map +1 -0
  58. package/dist/chunk-PT5IC642.js +162 -0
  59. package/dist/chunk-PT5IC642.js.map +1 -0
  60. package/dist/chunk-QIK2XSDQ.js +187 -0
  61. package/dist/chunk-QIK2XSDQ.js.map +1 -0
  62. package/dist/chunk-RJ6ZQKG5.js +26 -0
  63. package/dist/chunk-RJ6ZQKG5.js.map +1 -0
  64. package/dist/{chunk-YIQLYIHW.js → chunk-TRUJLI6K.js} +29 -43
  65. package/dist/chunk-TRUJLI6K.js.map +1 -0
  66. package/dist/chunk-U3IBO3O3.js +41 -0
  67. package/dist/chunk-U3IBO3O3.js.map +1 -0
  68. package/dist/{chunk-7WHF2OIZ.js → chunk-UBZPD4HN.js} +25 -7
  69. package/dist/chunk-UBZPD4HN.js.map +1 -0
  70. package/dist/{chunk-HIN3UVOG.js → chunk-V7XG6V6C.js} +20 -11
  71. package/dist/chunk-V7XG6V6C.js.map +1 -0
  72. package/dist/chunk-WGTCA2NU.js +84 -0
  73. package/dist/chunk-WGTCA2NU.js.map +1 -0
  74. package/dist/{chunk-O6PERU7U.js → chunk-XNOCTDHF.js} +2 -2
  75. package/dist/chunk-YDN4OM33.js +80 -0
  76. package/dist/chunk-YDN4OM33.js.map +1 -0
  77. package/dist/cli-ODLFRIYS.js +128 -0
  78. package/dist/cli-ODLFRIYS.js.map +1 -0
  79. package/dist/client-EYOTW3JU.js +19 -0
  80. package/dist/client-MXRNQ5FI.js +13 -0
  81. package/dist/{config-IBS6KOLQ.js → config-UR5BSGVX.js} +21 -34
  82. package/dist/config-UR5BSGVX.js.map +1 -0
  83. package/dist/detect-H5OPI7GD.js +17 -0
  84. package/dist/detect-H5OPI7GD.js.map +1 -0
  85. package/dist/detect-providers-Q42OD4OS.js +26 -0
  86. package/dist/detect-providers-Q42OD4OS.js.map +1 -0
  87. package/dist/doctor-JLKTXDEH.js +258 -0
  88. package/dist/doctor-JLKTXDEH.js.map +1 -0
  89. package/dist/executor-ONSDHPGX.js +1441 -0
  90. package/dist/executor-ONSDHPGX.js.map +1 -0
  91. package/dist/init-6GWY345B.js +198 -0
  92. package/dist/init-6GWY345B.js.map +1 -0
  93. package/dist/init-wizard-UONLDYLI.js +294 -0
  94. package/dist/init-wizard-UONLDYLI.js.map +1 -0
  95. package/dist/llm-BV3QNVRD.js +17 -0
  96. package/dist/llm-BV3QNVRD.js.map +1 -0
  97. package/dist/loader-SH67XD54.js +28 -0
  98. package/dist/loader-SH67XD54.js.map +1 -0
  99. package/dist/loader-XVXKZZDH.js +18 -0
  100. package/dist/loader-XVXKZZDH.js.map +1 -0
  101. package/dist/{chunk-H7PRCVGQ.js → logs-QZVYF6FP.js} +74 -5
  102. package/dist/logs-QZVYF6FP.js.map +1 -0
  103. package/dist/main-BMCL7CPO.js +4393 -0
  104. package/dist/main-BMCL7CPO.js.map +1 -0
  105. package/dist/openai-embeddings-C265WRNK.js +14 -0
  106. package/dist/openai-embeddings-C265WRNK.js.map +1 -0
  107. package/dist/openrouter-U6VFCRX2.js +14 -0
  108. package/dist/openrouter-U6VFCRX2.js.map +1 -0
  109. package/dist/post-compact-OWFSOITU.js +26 -0
  110. package/dist/post-compact-OWFSOITU.js.map +1 -0
  111. package/dist/post-tool-use-DOUM7CGQ.js +56 -0
  112. package/dist/post-tool-use-DOUM7CGQ.js.map +1 -0
  113. package/dist/post-tool-use-failure-SG3C7PE6.js +28 -0
  114. package/dist/post-tool-use-failure-SG3C7PE6.js.map +1 -0
  115. package/dist/pre-compact-3J33CHXQ.js +25 -0
  116. package/dist/pre-compact-3J33CHXQ.js.map +1 -0
  117. package/dist/provider-check-3WBPZADE.js +12 -0
  118. package/dist/provider-check-3WBPZADE.js.map +1 -0
  119. package/dist/registry-J4XTWARS.js +25 -0
  120. package/dist/registry-J4XTWARS.js.map +1 -0
  121. package/dist/resolution-events-TFEQPVKS.js +12 -0
  122. package/dist/resolution-events-TFEQPVKS.js.map +1 -0
  123. package/dist/resolve-3FEUV462.js +9 -0
  124. package/dist/resolve-3FEUV462.js.map +1 -0
  125. package/dist/{restart-XCMILOL5.js → restart-2VM33WOB.js} +10 -6
  126. package/dist/{restart-XCMILOL5.js.map → restart-2VM33WOB.js.map} +1 -1
  127. package/dist/search-ZGQR5MDE.js +91 -0
  128. package/dist/search-ZGQR5MDE.js.map +1 -0
  129. package/dist/{server-6UDN35QN.js → server-6KMBJCHZ.js} +308 -517
  130. package/dist/server-6KMBJCHZ.js.map +1 -0
  131. package/dist/session-Z2FXDDG6.js +68 -0
  132. package/dist/session-Z2FXDDG6.js.map +1 -0
  133. package/dist/session-end-FLVX32LE.js +38 -0
  134. package/dist/session-end-FLVX32LE.js.map +1 -0
  135. package/dist/session-start-UCLK7PXE.js +169 -0
  136. package/dist/session-start-UCLK7PXE.js.map +1 -0
  137. package/dist/setup-digest-4KDSXAIV.js +15 -0
  138. package/dist/setup-digest-4KDSXAIV.js.map +1 -0
  139. package/dist/setup-llm-GKMCHURK.js +81 -0
  140. package/dist/setup-llm-GKMCHURK.js.map +1 -0
  141. package/dist/src/agent/definitions/agent.yaml +35 -0
  142. package/dist/src/agent/definitions/tasks/digest-only.yaml +84 -0
  143. package/dist/src/agent/definitions/tasks/extract-only.yaml +87 -0
  144. package/dist/src/agent/definitions/tasks/full-intelligence.yaml +472 -0
  145. package/dist/src/agent/definitions/tasks/graph-maintenance.yaml +92 -0
  146. package/dist/src/agent/definitions/tasks/review-session.yaml +132 -0
  147. package/dist/src/agent/definitions/tasks/supersession-sweep.yaml +86 -0
  148. package/dist/src/agent/definitions/tasks/title-summary.yaml +88 -0
  149. package/dist/src/agent/prompts/agent.md +121 -0
  150. package/dist/src/agent/prompts/orchestrator.md +91 -0
  151. package/dist/src/cli.js +1 -8
  152. package/dist/src/cli.js.map +1 -1
  153. package/dist/src/daemon/main.js +1 -8
  154. package/dist/src/daemon/main.js.map +1 -1
  155. package/dist/src/hooks/post-tool-use.js +3 -50
  156. package/dist/src/hooks/post-tool-use.js.map +1 -1
  157. package/dist/src/hooks/session-end.js +3 -32
  158. package/dist/src/hooks/session-end.js.map +1 -1
  159. package/dist/src/hooks/session-start.js +2 -8
  160. package/dist/src/hooks/session-start.js.map +1 -1
  161. package/dist/src/hooks/stop.js +3 -42
  162. package/dist/src/hooks/stop.js.map +1 -1
  163. package/dist/src/hooks/user-prompt-submit.js +3 -53
  164. package/dist/src/hooks/user-prompt-submit.js.map +1 -1
  165. package/dist/src/mcp/server.js +1 -8
  166. package/dist/src/mcp/server.js.map +1 -1
  167. package/dist/src/prompts/digest-system.md +1 -1
  168. package/dist/src/symbionts/manifests/claude-code.yaml +16 -0
  169. package/dist/src/symbionts/manifests/cursor.yaml +14 -0
  170. package/dist/stats-IUJPZSVZ.js +94 -0
  171. package/dist/stats-IUJPZSVZ.js.map +1 -0
  172. package/dist/stop-XRQLLXST.js +42 -0
  173. package/dist/stop-XRQLLXST.js.map +1 -0
  174. package/dist/stop-failure-2CAJJKRG.js +26 -0
  175. package/dist/stop-failure-2CAJJKRG.js.map +1 -0
  176. package/dist/subagent-start-MWWQTZMQ.js +26 -0
  177. package/dist/subagent-start-MWWQTZMQ.js.map +1 -0
  178. package/dist/subagent-stop-PJXYGRXB.js +28 -0
  179. package/dist/subagent-stop-PJXYGRXB.js.map +1 -0
  180. package/dist/task-completed-4LFRJVGI.js +27 -0
  181. package/dist/task-completed-4LFRJVGI.js.map +1 -0
  182. package/dist/ui/assets/index-DZrElonz.js +744 -0
  183. package/dist/ui/assets/index-TkeiYbZB.css +1 -0
  184. package/dist/ui/favicon.svg +7 -7
  185. package/dist/ui/fonts/Inter-Variable.woff2 +0 -0
  186. package/dist/ui/fonts/JetBrainsMono-Variable.woff2 +0 -0
  187. package/dist/ui/fonts/Newsreader-Italic-Variable.woff2 +0 -0
  188. package/dist/ui/fonts/Newsreader-Variable.woff2 +0 -0
  189. package/dist/ui/index.html +2 -2
  190. package/dist/user-prompt-submit-KSM3AR6P.js +59 -0
  191. package/dist/user-prompt-submit-KSM3AR6P.js.map +1 -0
  192. package/dist/{verify-TOWQHPBX.js → verify-UDAYVX37.js} +17 -22
  193. package/dist/verify-UDAYVX37.js.map +1 -0
  194. package/dist/{version-36RVCQA6.js → version-KLBN4HZT.js} +3 -4
  195. package/dist/version-KLBN4HZT.js.map +1 -0
  196. package/hooks/hooks.json +82 -5
  197. package/package.json +6 -3
  198. package/skills/myco/SKILL.md +10 -10
  199. package/skills/myco/references/cli-usage.md +15 -13
  200. package/skills/myco/references/vault-status.md +3 -3
  201. package/skills/myco/references/wisdom.md +4 -4
  202. package/skills/myco-curate/SKILL.md +86 -0
  203. package/dist/chunk-2ZIBCEYO.js +0 -113
  204. package/dist/chunk-2ZIBCEYO.js.map +0 -1
  205. package/dist/chunk-4RMSHZE4.js +0 -107
  206. package/dist/chunk-4RMSHZE4.js.map +0 -1
  207. package/dist/chunk-4XVKZ3WA.js +0 -1078
  208. package/dist/chunk-4XVKZ3WA.js.map +0 -1
  209. package/dist/chunk-6FQISQNA.js +0 -61
  210. package/dist/chunk-6FQISQNA.js.map +0 -1
  211. package/dist/chunk-7WHF2OIZ.js.map +0 -1
  212. package/dist/chunk-ERG2IEWX.js.map +0 -1
  213. package/dist/chunk-FPRXMJLT.js +0 -56
  214. package/dist/chunk-FPRXMJLT.js.map +0 -1
  215. package/dist/chunk-GENQ5QGP.js +0 -37
  216. package/dist/chunk-GENQ5QGP.js.map +0 -1
  217. package/dist/chunk-H7PRCVGQ.js.map +0 -1
  218. package/dist/chunk-HIN3UVOG.js.map +0 -1
  219. package/dist/chunk-HYVT345Y.js +0 -159
  220. package/dist/chunk-HYVT345Y.js.map +0 -1
  221. package/dist/chunk-J4D4CROB.js +0 -143
  222. package/dist/chunk-J4D4CROB.js.map +0 -1
  223. package/dist/chunk-MDLSAFPP.js +0 -99
  224. package/dist/chunk-MDLSAFPP.js.map +0 -1
  225. package/dist/chunk-N33KUCFP.js.map +0 -1
  226. package/dist/chunk-NL6WQO56.js +0 -65
  227. package/dist/chunk-NL6WQO56.js.map +0 -1
  228. package/dist/chunk-NLUE6CYG.js.map +0 -1
  229. package/dist/chunk-P723N2LP.js +0 -147
  230. package/dist/chunk-P723N2LP.js.map +0 -1
  231. package/dist/chunk-QLUE3BUL.js +0 -161
  232. package/dist/chunk-QLUE3BUL.js.map +0 -1
  233. package/dist/chunk-QN4W3JUA.js +0 -43
  234. package/dist/chunk-QN4W3JUA.js.map +0 -1
  235. package/dist/chunk-RGVBGTD6.js +0 -21
  236. package/dist/chunk-RGVBGTD6.js.map +0 -1
  237. package/dist/chunk-TWSTAVLO.js +0 -132
  238. package/dist/chunk-TWSTAVLO.js.map +0 -1
  239. package/dist/chunk-UP4P4OAA.js +0 -4423
  240. package/dist/chunk-UP4P4OAA.js.map +0 -1
  241. package/dist/chunk-YIQLYIHW.js.map +0 -1
  242. package/dist/chunk-YTFXA4RX.js +0 -86
  243. package/dist/chunk-YTFXA4RX.js.map +0 -1
  244. package/dist/chunk-Z74SDEKE.js.map +0 -1
  245. package/dist/cli-IHILSS6N.js +0 -97
  246. package/dist/cli-IHILSS6N.js.map +0 -1
  247. package/dist/client-AGFNR2S4.js +0 -12
  248. package/dist/config-IBS6KOLQ.js.map +0 -1
  249. package/dist/curate-3D4GHKJH.js +0 -78
  250. package/dist/curate-3D4GHKJH.js.map +0 -1
  251. package/dist/detect-providers-XEP4QA3R.js +0 -35
  252. package/dist/detect-providers-XEP4QA3R.js.map +0 -1
  253. package/dist/digest-7HLJXL77.js +0 -85
  254. package/dist/digest-7HLJXL77.js.map +0 -1
  255. package/dist/init-ARQ53JOR.js +0 -109
  256. package/dist/init-ARQ53JOR.js.map +0 -1
  257. package/dist/logs-IENORIYR.js +0 -84
  258. package/dist/logs-IENORIYR.js.map +0 -1
  259. package/dist/main-6AGPIMH2.js +0 -5715
  260. package/dist/main-6AGPIMH2.js.map +0 -1
  261. package/dist/rebuild-Q2ACEB6F.js +0 -64
  262. package/dist/rebuild-Q2ACEB6F.js.map +0 -1
  263. package/dist/reprocess-CDEFGQOV.js +0 -79
  264. package/dist/reprocess-CDEFGQOV.js.map +0 -1
  265. package/dist/search-7W25SKCB.js +0 -120
  266. package/dist/search-7W25SKCB.js.map +0 -1
  267. package/dist/server-6UDN35QN.js.map +0 -1
  268. package/dist/session-F326AWCH.js +0 -44
  269. package/dist/session-F326AWCH.js.map +0 -1
  270. package/dist/session-start-K6IGAC7H.js +0 -192
  271. package/dist/session-start-K6IGAC7H.js.map +0 -1
  272. package/dist/setup-digest-X5PN27F4.js +0 -15
  273. package/dist/setup-llm-S5OHQJXK.js +0 -15
  274. package/dist/src/prompts/classification.md +0 -43
  275. package/dist/stats-TTSDXGJV.js +0 -58
  276. package/dist/stats-TTSDXGJV.js.map +0 -1
  277. package/dist/templates-XPRBOWCE.js +0 -38
  278. package/dist/templates-XPRBOWCE.js.map +0 -1
  279. package/dist/ui/assets/index-08wKT7wS.css +0 -1
  280. package/dist/ui/assets/index-CMSMi4Jb.js +0 -369
  281. package/dist/verify-TOWQHPBX.js.map +0 -1
  282. package/skills/setup/SKILL.md +0 -174
  283. package/skills/setup/references/model-recommendations.md +0 -83
  284. package/dist/{client-AGFNR2S4.js.map → chunk-E4VLWIJC.js.map} +0 -0
  285. package/dist/{setup-digest-X5PN27F4.js.map → chunk-IB76KGBY.js.map} +0 -0
  286. package/dist/{chunk-O6PERU7U.js.map → chunk-XNOCTDHF.js.map} +0 -0
  287. package/dist/{setup-llm-S5OHQJXK.js.map → client-EYOTW3JU.js.map} +0 -0
  288. package/dist/{version-36RVCQA6.js.map → client-MXRNQ5FI.js.map} +0 -0
@@ -0,0 +1,68 @@
+ import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
+ import {
+ initVaultDb
+ } from "./chunk-KB4DGYIY.js";
+ import "./chunk-SAKJMNSR.js";
+ import "./chunk-UBZPD4HN.js";
+ import {
+ getSession,
+ listSessions
+ } from "./chunk-4LPQ26CK.js";
+ import "./chunk-MYX5NCRH.js";
+ import "./chunk-TRUJLI6K.js";
+ import "./chunk-5VZ52A4T.js";
+ import "./chunk-WGTCA2NU.js";
+ import "./chunk-PB6TOLRQ.js";
+ import "./chunk-LPUQPDC2.js";
+ import "./chunk-PZUWP5VK.js";
+
+ // src/cli/session.ts
+ async function run(args, vaultDir) {
+ const idOrLatest = args[0];
+ const cleanup = initVaultDb(vaultDir);
+ try {
+ const sessions = listSessions({ limit: 100 });
+ if (sessions.length === 0) {
+ console.log("No sessions found");
+ return;
+ }
+ let targetId;
+ if (!idOrLatest || idOrLatest === "latest") {
+ targetId = sessions[0].id;
+ } else {
+ const match = sessions.find((s) => s.id.includes(idOrLatest));
+ if (!match) {
+ console.error(`Session not found: ${idOrLatest}`);
+ console.log("Available:", sessions.map((s) => s.id.slice(0, 12)).join(", "));
+ return;
+ }
+ targetId = match.id;
+ }
+ const target = getSession(targetId);
+ if (!target) {
+ console.error(`Failed to fetch session: ${targetId}`);
+ return;
+ }
+ console.log(`Session: ${target.id}`);
+ console.log(`Status: ${target.status}`);
+ if (target.title) console.log(`Title: ${target.title}`);
+ if (target.branch) console.log(`Branch: ${target.branch}`);
+ if (target.user) console.log(`User: ${target.user}`);
+ console.log(`Started: ${new Date(target.started_at * 1e3).toISOString()}`);
+ if (target.ended_at) console.log(`Ended: ${new Date(target.ended_at * 1e3).toISOString()}`);
+ if (target.prompt_count) console.log(`Prompts: ${target.prompt_count}`);
+ if (target.tool_count) console.log(`Tools: ${target.tool_count}`);
+ if (target.summary) console.log(`
+ Summary:
+ ${target.summary}`);
+ } catch (err) {
+ console.error("Failed to read vault database:", err.message);
+ process.exit(1);
+ } finally {
+ cleanup();
+ }
+ }
+ export {
+ run
+ };
+ //# sourceMappingURL=session-Z2FXDDG6.js.map
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/cli/session.ts"],"sourcesContent":["/**\n * CLI: myco session — display session info via direct SQLite reads.\n *\n * Opens the database directly (WAL mode allows concurrent reads).\n * Does NOT require the daemon to be running.\n */\n\nimport { listSessions, getSession } from '@myco/db/queries/sessions.js';\nimport { initVaultDb } from './shared.js';\n\n// ---------------------------------------------------------------------------\n// Command\n// ---------------------------------------------------------------------------\n\nexport async function run(args: string[], vaultDir: string): Promise<void> {\n const idOrLatest = args[0];\n\n const cleanup = initVaultDb(vaultDir);\n try {\n const sessions = listSessions({ limit: 100 });\n if (sessions.length === 0) {\n console.log('No sessions found');\n return;\n }\n\n // Resolve target session ID\n let targetId: string;\n if (!idOrLatest || idOrLatest === 'latest') {\n targetId = sessions[0].id;\n } else {\n const match = sessions.find((s) => s.id.includes(idOrLatest));\n if (!match) {\n console.error(`Session not found: ${idOrLatest}`);\n console.log('Available:', sessions.map((s) => s.id.slice(0, 12)).join(', '));\n return;\n }\n targetId = match.id;\n }\n\n // Fetch full session detail\n const target = getSession(targetId);\n if (!target) {\n console.error(`Failed to fetch session: ${targetId}`);\n return;\n }\n\n console.log(`Session: ${target.id}`);\n console.log(`Status: ${target.status}`);\n if (target.title) console.log(`Title: ${target.title}`);\n if (target.branch) console.log(`Branch: ${target.branch}`);\n if (target.user) console.log(`User: ${target.user}`);\n console.log(`Started: ${new Date(target.started_at * 1000).toISOString()}`);\n if (target.ended_at) console.log(`Ended: ${new Date(target.ended_at * 1000).toISOString()}`);\n if (target.prompt_count) console.log(`Prompts: ${target.prompt_count}`);\n if (target.tool_count) console.log(`Tools: ${target.tool_count}`);\n if 
(target.summary) console.log(`\\nSummary:\\n${target.summary}`);\n } catch (err) {\n console.error('Failed to read vault database:', (err as Error).message);\n process.exit(1);\n } finally {\n cleanup();\n }\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;;AAcA,eAAsB,IAAI,MAAgB,UAAiC;AACzE,QAAM,aAAa,KAAK,CAAC;AAEzB,QAAM,UAAU,YAAY,QAAQ;AACpC,MAAI;AACF,UAAM,WAAW,aAAa,EAAE,OAAO,IAAI,CAAC;AAC5C,QAAI,SAAS,WAAW,GAAG;AACzB,cAAQ,IAAI,mBAAmB;AAC/B;AAAA,IACF;AAGA,QAAI;AACJ,QAAI,CAAC,cAAc,eAAe,UAAU;AAC1C,iBAAW,SAAS,CAAC,EAAE;AAAA,IACzB,OAAO;AACL,YAAM,QAAQ,SAAS,KAAK,CAAC,MAAM,EAAE,GAAG,SAAS,UAAU,CAAC;AAC5D,UAAI,CAAC,OAAO;AACV,gBAAQ,MAAM,sBAAsB,UAAU,EAAE;AAChD,gBAAQ,IAAI,cAAc,SAAS,IAAI,CAAC,MAAM,EAAE,GAAG,MAAM,GAAG,EAAE,CAAC,EAAE,KAAK,IAAI,CAAC;AAC3E;AAAA,MACF;AACA,iBAAW,MAAM;AAAA,IACnB;AAGA,UAAM,SAAS,WAAW,QAAQ;AAClC,QAAI,CAAC,QAAQ;AACX,cAAQ,MAAM,4BAA4B,QAAQ,EAAE;AACpD;AAAA,IACF;AAEA,YAAQ,IAAI,YAAY,OAAO,EAAE,EAAE;AACnC,YAAQ,IAAI,YAAY,OAAO,MAAM,EAAE;AACvC,QAAI,OAAO,MAAO,SAAQ,IAAI,YAAY,OAAO,KAAK,EAAE;AACxD,QAAI,OAAO,OAAQ,SAAQ,IAAI,YAAY,OAAO,MAAM,EAAE;AAC1D,QAAI,OAAO,KAAM,SAAQ,IAAI,YAAY,OAAO,IAAI,EAAE;AACtD,YAAQ,IAAI,YAAY,IAAI,KAAK,OAAO,aAAa,GAAI,EAAE,YAAY,CAAC,EAAE;AAC1E,QAAI,OAAO,SAAU,SAAQ,IAAI,YAAY,IAAI,KAAK,OAAO,WAAW,GAAI,EAAE,YAAY,CAAC,EAAE;AAC7F,QAAI,OAAO,aAAc,SAAQ,IAAI,YAAY,OAAO,YAAY,EAAE;AACtE,QAAI,OAAO,WAAY,SAAQ,IAAI,YAAY,OAAO,UAAU,EAAE;AAClE,QAAI,OAAO,QAAS,SAAQ,IAAI;AAAA;AAAA,EAAe,OAAO,OAAO,EAAE;AAAA,EACjE,SAAS,KAAK;AACZ,YAAQ,MAAM,kCAAmC,IAAc,OAAO;AACtE,YAAQ,KAAK,CAAC;AAAA,EAChB,UAAE;AACA,YAAQ;AAAA,EACV;AACF;","names":[]}
@@ -0,0 +1,38 @@
+ import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
+ import {
+ readStdin
+ } from "./chunk-XNOCTDHF.js";
+ import {
+ DaemonClient
+ } from "./chunk-TRUJLI6K.js";
+ import "./chunk-5VZ52A4T.js";
+ import "./chunk-WGTCA2NU.js";
+ import "./chunk-PB6TOLRQ.js";
+ import "./chunk-LPUQPDC2.js";
+ import {
+ resolveVaultDir
+ } from "./chunk-JTYZRPX5.js";
+ import "./chunk-PZUWP5VK.js";
+
+ // src/hooks/session-end.ts
+ import fs from "fs";
+ import path from "path";
+ async function main() {
+ const VAULT_DIR = resolveVaultDir();
+ if (!fs.existsSync(path.join(VAULT_DIR, "myco.yaml"))) return;
+ try {
+ const input = JSON.parse(await readStdin());
+ const sessionId = input.session_id ?? process.env.MYCO_SESSION_ID;
+ const client = new DaemonClient(VAULT_DIR);
+ if (sessionId) {
+ await client.post("/sessions/unregister", { session_id: sessionId });
+ }
+ } catch (error) {
+ process.stderr.write(`[myco] session-end error: ${error.message}
+ `);
+ }
+ }
+ export {
+ main
+ };
+ //# sourceMappingURL=session-end-FLVX32LE.js.map
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/hooks/session-end.ts"],"sourcesContent":["import { DaemonClient } from './client.js';\nimport { readStdin } from './read-stdin.js';\nimport { resolveVaultDir } from '../vault/resolve.js';\nimport fs from 'node:fs';\nimport path from 'node:path';\n\nexport async function main() {\n const VAULT_DIR = resolveVaultDir();\n if (!fs.existsSync(path.join(VAULT_DIR, 'myco.yaml'))) return;\n\n try {\n const input = JSON.parse(await readStdin());\n const sessionId = input.session_id ?? process.env.MYCO_SESSION_ID;\n\n const client = new DaemonClient(VAULT_DIR);\n if (sessionId) {\n await client.post('/sessions/unregister', { session_id: sessionId });\n }\n } catch (error) {\n process.stderr.write(`[myco] session-end error: ${(error as Error).message}\\n`);\n }\n}\n"],"mappings":";;;;;;;;;;;;;;;;;AAGA,OAAO,QAAQ;AACf,OAAO,UAAU;AAEjB,eAAsB,OAAO;AAC3B,QAAM,YAAY,gBAAgB;AAClC,MAAI,CAAC,GAAG,WAAW,KAAK,KAAK,WAAW,WAAW,CAAC,EAAG;AAEvD,MAAI;AACF,UAAM,QAAQ,KAAK,MAAM,MAAM,UAAU,CAAC;AAC1C,UAAM,YAAY,MAAM,cAAc,QAAQ,IAAI;AAElD,UAAM,SAAS,IAAI,aAAa,SAAS;AACzC,QAAI,WAAW;AACb,YAAM,OAAO,KAAK,wBAAwB,EAAE,YAAY,UAAU,CAAC;AAAA,IACrE;AAAA,EACF,SAAS,OAAO;AACd,YAAQ,OAAO,MAAM,6BAA8B,MAAgB,OAAO;AAAA,CAAI;AAAA,EAChF;AACF;","names":[]}
@@ -0,0 +1,169 @@
+ import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
+ import {
+ listSpores
+ } from "./chunk-3K5WGSJ4.js";
+ import {
+ listSessions
+ } from "./chunk-4LPQ26CK.js";
+ import {
+ createSchema
+ } from "./chunk-KV4OC4H3.js";
+ import {
+ loadConfig
+ } from "./chunk-MHSCMET3.js";
+ import "./chunk-D7TYRPRM.js";
+ import "./chunk-E4VLWIJC.js";
+ import "./chunk-KH64DHOY.js";
+ import {
+ getDatabase,
+ initDatabase,
+ vaultDbPath
+ } from "./chunk-MYX5NCRH.js";
+ import {
+ readStdin
+ } from "./chunk-XNOCTDHF.js";
+ import {
+ DaemonClient
+ } from "./chunk-TRUJLI6K.js";
+ import {
+ CONTEXT_SESSION_PREVIEW_CHARS,
+ CONTEXT_SPORE_PREVIEW_CHARS,
+ EXCLUDED_SPORE_STATUSES,
+ estimateTokens
+ } from "./chunk-5VZ52A4T.js";
+ import "./chunk-WGTCA2NU.js";
+ import "./chunk-PB6TOLRQ.js";
+ import "./chunk-LPUQPDC2.js";
+ import {
+ resolveVaultDir
+ } from "./chunk-JTYZRPX5.js";
+ import "./chunk-PZUWP5VK.js";
+
+ // src/context/injector.ts
+ var CONTEXT_SESSION_LIMIT = 10;
+ var CONTEXT_SESSION_DISPLAY_LIMIT = 5;
+ var CONTEXT_SPORE_FETCH_LIMIT = 20;
+ var CONTEXT_SPORE_DISPLAY_LIMIT = 5;
+ var DEFAULT_SESSIONS_BUDGET = 500;
+ var DEFAULT_SPORES_BUDGET = 300;
+ var DEFAULT_TEAM_BUDGET = 200;
+ var DEFAULT_CONTEXT_MAX_TOKENS = 1200;
+ async function buildInjectedContext(_config, context) {
+ try {
+ getDatabase();
+ } catch {
+ return emptyContext();
+ }
+ const [sessions, spores] = await Promise.all([
+ listSessions({ limit: CONTEXT_SESSION_LIMIT }),
+ listSpores({ limit: CONTEXT_SPORE_FETCH_LIMIT, status: "active" })
+ ]);
+ const sessionsText = formatLayer(
+ "Recent Sessions",
+ sessions.slice(0, CONTEXT_SESSION_DISPLAY_LIMIT).map((s) => {
+ const title = s.title ?? s.id;
+ const summary = (s.summary ?? "").slice(0, CONTEXT_SESSION_PREVIEW_CHARS);
+ const branchLabel = s.branch === context.branch ? " (same branch)" : "";
+ return `- **${title}**: ${summary}${branchLabel}`;
+ }),
+ DEFAULT_SESSIONS_BUDGET
+ );
+ const filteredSpores = spores.filter(
+ (s) => !EXCLUDED_SPORE_STATUSES.has(s.status)
+ );
+ const sporesText = formatLayer(
+ "Relevant Spores",
+ filteredSpores.slice(0, CONTEXT_SPORE_DISPLAY_LIMIT).map(
+ (s) => `- **${s.id}** (${s.observation_type}): ${s.content.slice(0, CONTEXT_SPORE_PREVIEW_CHARS)}`
+ ),
+ DEFAULT_SPORES_BUDGET
+ );
+ const teamText = formatLayer("Team Activity", [], DEFAULT_TEAM_BUDGET);
+ const allLayers = [sessionsText, sporesText, teamText].filter(Boolean);
+ const parts = [];
+ let totalTokens = 0;
+ for (const layer of allLayers) {
+ const layerTokens = estimateTokens(layer);
+ if (totalTokens + layerTokens > DEFAULT_CONTEXT_MAX_TOKENS) break;
+ parts.push(layer);
+ totalTokens += layerTokens;
+ }
+ const fullText = parts.join("\n\n");
+ return {
+ text: fullText,
+ tokenEstimate: totalTokens,
+ layers: {
+ sessions: sessionsText,
+ spores: sporesText,
+ team: teamText
+ }
+ };
+ }
+ function emptyContext() {
+ return {
+ text: "",
+ tokenEstimate: 0,
+ layers: { sessions: "", spores: "", team: "" }
+ };
+ }
+ function formatLayer(heading, items, budget) {
+ if (items.length === 0) return "";
+ let text = `### ${heading}
+ `;
+ let currentTokens = estimateTokens(text);
+ for (const item of items) {
+ const itemTokens = estimateTokens(item);
+ if (currentTokens + itemTokens > budget) break;
+ text += item + "\n";
+ currentTokens += itemTokens;
+ }
+ return text.trim();
+ }
+
+ // src/hooks/session-start.ts
+ import { execFileSync } from "child_process";
+ import fs from "fs";
+ import path from "path";
+ async function main() {
+ const VAULT_DIR = resolveVaultDir();
+ if (!fs.existsSync(path.join(VAULT_DIR, "myco.yaml"))) return;
+ try {
+ const config = loadConfig(VAULT_DIR);
+ const client = new DaemonClient(VAULT_DIR);
+ const healthy = await client.ensureRunning();
+ const input = JSON.parse(await readStdin());
+ const sessionId = input.session_id ?? `s-${Date.now()}`;
+ let branch;
+ try {
+ branch = execFileSync("git", ["rev-parse", "--abbrev-ref", "HEAD"], { encoding: "utf-8" }).trim();
+ } catch {
+ }
+ if (healthy) {
+ await client.post("/sessions/register", {
+ session_id: sessionId,
+ branch,
+ started_at: (/* @__PURE__ */ new Date()).toISOString()
+ });
+ const contextResult = await client.post("/context", { session_id: sessionId, branch });
+ if (contextResult.ok && contextResult.data?.text) {
+ if (contextResult.data.source === "digest") {
+ process.stderr.write(`[myco] Injecting digest extract (tier ${contextResult.data.tier})
+ `);
+ }
+ process.stdout.write(contextResult.data.text);
+ return;
+ }
+ }
+ const db = initDatabase(vaultDbPath(VAULT_DIR));
+ createSchema(db);
+ const injected = await buildInjectedContext(config, { branch });
+ if (injected.text) process.stdout.write(injected.text);
+ } catch (error) {
+ process.stderr.write(`[myco] session-start error: ${error.message}
+ `);
+ }
+ }
+ export {
+ main
+ };
+ //# sourceMappingURL=session-start-UCLK7PXE.js.map
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/context/injector.ts","../src/hooks/session-start.ts"],"sourcesContent":["/**\n * Context injector — assembles context from SQLite for hook injection.\n *\n * Queries sessions and spores from SQLite. For prompt-submit context,\n * semantic search is deferred to Phase 2 (requires daemon vector store).\n * If no data exists (zero-config), returns empty context gracefully.\n */\n\nimport { getDatabase } from '@myco/db/client.js';\nimport { listSessions } from '@myco/db/queries/sessions.js';\nimport { listSpores } from '@myco/db/queries/spores.js';\nimport type { MycoConfig } from '@myco/config/schema.js';\nimport {\n estimateTokens,\n CONTEXT_SESSION_PREVIEW_CHARS,\n CONTEXT_SPORE_PREVIEW_CHARS,\n PROMPT_CONTEXT_MIN_LENGTH,\n EXCLUDED_SPORE_STATUSES,\n} from '@myco/constants.js';\n\n// ---------------------------------------------------------------------------\n// Constants\n// ---------------------------------------------------------------------------\n\n/** Max recent sessions to include in context. */\nconst CONTEXT_SESSION_LIMIT = 10;\n\n/** Max sessions displayed after scoring. */\nconst CONTEXT_SESSION_DISPLAY_LIMIT = 5;\n\n/** Max spores to fetch for scoring. */\nconst CONTEXT_SPORE_FETCH_LIMIT = 20;\n\n/** Max spores displayed after scoring. */\nconst CONTEXT_SPORE_DISPLAY_LIMIT = 5;\n\n/** Default token budget for sessions layer. */\nconst DEFAULT_SESSIONS_BUDGET = 500;\n\n/** Default token budget for spores layer. */\nconst DEFAULT_SPORES_BUDGET = 300;\n\n/** Default token budget for team layer. */\nconst DEFAULT_TEAM_BUDGET = 200;\n\n/** Default total context max tokens. 
*/\nconst DEFAULT_CONTEXT_MAX_TOKENS = 1200;\n\n// ---------------------------------------------------------------------------\n// Types\n// ---------------------------------------------------------------------------\n\ninterface InjectionContext {\n branch?: string;\n}\n\ninterface InjectedContext {\n text: string;\n tokenEstimate: number;\n layers: {\n sessions: string;\n spores: string;\n team: string;\n };\n}\n\n// ---------------------------------------------------------------------------\n// Public API\n// ---------------------------------------------------------------------------\n\n/**\n * Build injected context from SQLite data.\n *\n * Returns empty context gracefully when no data exists (zero-config behavior).\n */\nexport async function buildInjectedContext(\n _config: MycoConfig,\n context: InjectionContext,\n): Promise<InjectedContext> {\n // Verify database is available — return empty if not\n try {\n getDatabase();\n } catch {\n return emptyContext();\n }\n\n // Fetch sessions and spores in parallel\n const [sessions, spores] = await Promise.all([\n listSessions({ limit: CONTEXT_SESSION_LIMIT }),\n listSpores({ limit: CONTEXT_SPORE_FETCH_LIMIT, status: 'active' }),\n ]);\n\n // Layer 1: Recent sessions\n const sessionsText = formatLayer(\n 'Recent Sessions',\n sessions.slice(0, CONTEXT_SESSION_DISPLAY_LIMIT).map((s) => {\n const title = s.title ?? s.id;\n const summary = (s.summary ?? '').slice(0, CONTEXT_SESSION_PREVIEW_CHARS);\n const branchLabel = s.branch === context.branch ? 
' (same branch)' : '';\n return `- **${title}**: ${summary}${branchLabel}`;\n }),\n DEFAULT_SESSIONS_BUDGET,\n );\n\n // Layer 2: Relevant spores (exclude superseded/archived)\n const filteredSpores = spores.filter((s) =>\n !EXCLUDED_SPORE_STATUSES.has(s.status),\n );\n const sporesText = formatLayer(\n 'Relevant Spores',\n filteredSpores.slice(0, CONTEXT_SPORE_DISPLAY_LIMIT).map((s) =>\n `- **${s.id}** (${s.observation_type}): ${s.content.slice(0, CONTEXT_SPORE_PREVIEW_CHARS)}`,\n ),\n DEFAULT_SPORES_BUDGET,\n );\n\n // Layer 3: Team activity (placeholder — populated in Phase 2)\n const teamText = formatLayer('Team Activity', [], DEFAULT_TEAM_BUDGET);\n\n // Enforce total max_tokens budget\n const allLayers = [sessionsText, sporesText, teamText].filter(Boolean);\n const parts: string[] = [];\n let totalTokens = 0;\n\n for (const layer of allLayers) {\n const layerTokens = estimateTokens(layer);\n if (totalTokens + layerTokens > DEFAULT_CONTEXT_MAX_TOKENS) break;\n parts.push(layer);\n totalTokens += layerTokens;\n }\n\n const fullText = parts.join('\\n\\n');\n\n return {\n text: fullText,\n tokenEstimate: totalTokens,\n layers: {\n sessions: sessionsText,\n spores: sporesText,\n team: teamText,\n },\n };\n}\n\n/**\n * Build per-prompt context using semantic search on spores.\n *\n * Semantic search via the daemon's in-process vector store is deferred to\n * Phase 2. For now, returns empty context. 
The hook (`user-prompt-submit`)\n * routes through the daemon API at `/context/prompt`, which will implement\n * vector search when ready.\n */\nexport async function buildPromptContext(\n prompt: string,\n _config: MycoConfig,\n): Promise<InjectedContext> {\n if (prompt.length < PROMPT_CONTEXT_MIN_LENGTH) {\n return emptyContext();\n }\n\n // Per-prompt semantic search deferred to Phase 2 (requires daemon vector store)\n return emptyContext();\n}\n\n// ---------------------------------------------------------------------------\n// Helpers\n// ---------------------------------------------------------------------------\n\nfunction emptyContext(): InjectedContext {\n return {\n text: '',\n tokenEstimate: 0,\n layers: { sessions: '', spores: '', team: '' },\n };\n}\n\nfunction formatLayer(heading: string, items: string[], budget: number): string {\n if (items.length === 0) return '';\n\n let text = `### ${heading}\\n`;\n let currentTokens = estimateTokens(text);\n\n for (const item of items) {\n const itemTokens = estimateTokens(item);\n if (currentTokens + itemTokens > budget) break;\n text += item + '\\n';\n currentTokens += itemTokens;\n }\n\n return text.trim();\n}\n","import { DaemonClient } from './client.js';\nimport { readStdin } from './read-stdin.js';\nimport { loadConfig } from '../config/loader.js';\nimport { buildInjectedContext } from '../context/injector.js';\nimport { initDatabase, vaultDbPath } from '../db/client.js';\nimport { createSchema } from '../db/schema.js';\nimport { resolveVaultDir } from '../vault/resolve.js';\nimport { execFileSync } from 'node:child_process';\nimport fs from 'node:fs';\nimport path from 'node:path';\n\nexport async function main() {\n const VAULT_DIR = resolveVaultDir();\n if (!fs.existsSync(path.join(VAULT_DIR, 'myco.yaml'))) return;\n\n try {\n const config = loadConfig(VAULT_DIR);\n const client = new DaemonClient(VAULT_DIR);\n const healthy = await client.ensureRunning();\n\n const input = JSON.parse(await 
readStdin());\n const sessionId = input.session_id ?? `s-${Date.now()}`;\n\n let branch: string | undefined;\n try {\n branch = execFileSync('git', ['rev-parse', '--abbrev-ref', 'HEAD'], { encoding: 'utf-8' }).trim();\n } catch { /* not a git repo */ }\n\n if (healthy) {\n await client.post('/sessions/register', {\n session_id: sessionId,\n branch,\n started_at: new Date().toISOString(),\n });\n\n const contextResult = await client.post('/context', { session_id: sessionId, branch });\n\n if (contextResult.ok && contextResult.data?.text) {\n if (contextResult.data.source === 'digest') {\n process.stderr.write(`[myco] Injecting digest extract (tier ${contextResult.data.tier})\\n`);\n }\n process.stdout.write(contextResult.data.text);\n return;\n }\n }\n\n // Degraded: local SQLite context only\n const db = initDatabase(vaultDbPath(VAULT_DIR));\n createSchema(db);\n const injected = await buildInjectedContext(config, { branch });\n if (injected.text) process.stdout.write(injected.text);\n } catch (error) {\n process.stderr.write(`[myco] session-start error: ${(error as Error).message}\\n`);\n 
}\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAyBA,IAAM,wBAAwB;AAG9B,IAAM,gCAAgC;AAGtC,IAAM,4BAA4B;AAGlC,IAAM,8BAA8B;AAGpC,IAAM,0BAA0B;AAGhC,IAAM,wBAAwB;AAG9B,IAAM,sBAAsB;AAG5B,IAAM,6BAA6B;AA6BnC,eAAsB,qBACpB,SACA,SAC0B;AAE1B,MAAI;AACF,gBAAY;AAAA,EACd,QAAQ;AACN,WAAO,aAAa;AAAA,EACtB;AAGA,QAAM,CAAC,UAAU,MAAM,IAAI,MAAM,QAAQ,IAAI;AAAA,IAC3C,aAAa,EAAE,OAAO,sBAAsB,CAAC;AAAA,IAC7C,WAAW,EAAE,OAAO,2BAA2B,QAAQ,SAAS,CAAC;AAAA,EACnE,CAAC;AAGD,QAAM,eAAe;AAAA,IACnB;AAAA,IACA,SAAS,MAAM,GAAG,6BAA6B,EAAE,IAAI,CAAC,MAAM;AAC1D,YAAM,QAAQ,EAAE,SAAS,EAAE;AAC3B,YAAM,WAAW,EAAE,WAAW,IAAI,MAAM,GAAG,6BAA6B;AACxE,YAAM,cAAc,EAAE,WAAW,QAAQ,SAAS,mBAAmB;AACrE,aAAO,OAAO,KAAK,OAAO,OAAO,GAAG,WAAW;AAAA,IACjD,CAAC;AAAA,IACD;AAAA,EACF;AAGA,QAAM,iBAAiB,OAAO;AAAA,IAAO,CAAC,MACpC,CAAC,wBAAwB,IAAI,EAAE,MAAM;AAAA,EACvC;AACA,QAAM,aAAa;AAAA,IACjB;AAAA,IACA,eAAe,MAAM,GAAG,2BAA2B,EAAE;AAAA,MAAI,CAAC,MACxD,OAAO,EAAE,EAAE,OAAO,EAAE,gBAAgB,MAAM,EAAE,QAAQ,MAAM,GAAG,2BAA2B,CAAC;AAAA,IAC3F;AAAA,IACA;AAAA,EACF;AAGA,QAAM,WAAW,YAAY,iBAAiB,CAAC,GAAG,mBAAmB;AAGrE,QAAM,YAAY,CAAC,cAAc,YAAY,QAAQ,EAAE,OAAO,OAAO;AACrE,QAAM,QAAkB,CAAC;AACzB,MAAI,cAAc;AAElB,aAAW,SAAS,WAAW;AAC7B,UAAM,cAAc,eAAe,KAAK;AACxC,QAAI,cAAc,cAAc,2BAA4B;AAC5D,UAAM,KAAK,KAAK;AAChB,mBAAe;AAAA,EACjB;AAEA,QAAM,WAAW,MAAM,KAAK,MAAM;AAElC,SAAO;AAAA,IACL,MAAM;AAAA,IACN,eAAe;AAAA,IACf,QAAQ;AAAA,MACN,UAAU;AAAA,MACV,QAAQ;AAAA,MACR,MAAM;AAAA,IACR;AAAA,EACF;AACF;AA0BA,SAAS,eAAgC;AACvC,SAAO;AAAA,IACL,MAAM;AAAA,IACN,eAAe;AAAA,IACf,QAAQ,EAAE,UAAU,IAAI,QAAQ,IAAI,MAAM,GAAG;AAAA,EAC/C;AACF;AAEA,SAAS,YAAY,SAAiB,OAAiB,QAAwB;AAC7E,MAAI,MAAM,WAAW,EAAG,QAAO;AAE/B,MAAI,OAAO,OAAO,OAAO;AAAA;AACzB,MAAI,gBAAgB,eAAe,IAAI;AAEvC,aAAW,QAAQ,OAAO;AACxB,UAAM,aAAa,eAAe,IAAI;AACtC,QAAI,gBAAgB,aAAa,OAAQ;AACzC,YAAQ,OAAO;AACf,qBAAiB;AAAA,EACnB;AAEA,SAAO,KAAK,KAAK;AACnB;;;ACvLA,SAAS,oBAAoB;AAC7B,OAAO,QAAQ;AACf,OAAO,UAAU;AAEjB,eAAsB,OAAO;AAC3B,QAAM,YAAY,gBAAgB;AAClC,MAAI,CAAC,GAAG,WAAW,KAAK,KAAK,WAAW,WAAW,CAAC,EAAG;AAEvD,MAAI;AACF,UAAM,SAAS,WAAW,SAAS;AACnC,UAAM,SAAS,IAAI,aAAa,SAAS;A
ACzC,UAAM,UAAU,MAAM,OAAO,cAAc;AAE3C,UAAM,QAAQ,KAAK,MAAM,MAAM,UAAU,CAAC;AAC1C,UAAM,YAAY,MAAM,cAAc,KAAK,KAAK,IAAI,CAAC;AAErD,QAAI;AACJ,QAAI;AACF,eAAS,aAAa,OAAO,CAAC,aAAa,gBAAgB,MAAM,GAAG,EAAE,UAAU,QAAQ,CAAC,EAAE,KAAK;AAAA,IAClG,QAAQ;AAAA,IAAuB;AAE/B,QAAI,SAAS;AACX,YAAM,OAAO,KAAK,sBAAsB;AAAA,QACtC,YAAY;AAAA,QACZ;AAAA,QACA,aAAY,oBAAI,KAAK,GAAE,YAAY;AAAA,MACrC,CAAC;AAED,YAAM,gBAAgB,MAAM,OAAO,KAAK,YAAY,EAAE,YAAY,WAAW,OAAO,CAAC;AAErF,UAAI,cAAc,MAAM,cAAc,MAAM,MAAM;AAChD,YAAI,cAAc,KAAK,WAAW,UAAU;AAC1C,kBAAQ,OAAO,MAAM,yCAAyC,cAAc,KAAK,IAAI;AAAA,CAAK;AAAA,QAC5F;AACA,gBAAQ,OAAO,MAAM,cAAc,KAAK,IAAI;AAC5C;AAAA,MACF;AAAA,IACF;AAGA,UAAM,KAAK,aAAa,YAAY,SAAS,CAAC;AAC9C,iBAAa,EAAE;AACf,UAAM,WAAW,MAAM,qBAAqB,QAAQ,EAAE,OAAO,CAAC;AAC9D,QAAI,SAAS,KAAM,SAAQ,OAAO,MAAM,SAAS,IAAI;AAAA,EACvD,SAAS,OAAO;AACd,YAAQ,OAAO,MAAM,+BAAgC,MAAgB,OAAO;AAAA,CAAI;AAAA,EAClF;AACF;","names":[]}
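The injector embedded in the sourcemap above assembles context in layers (sessions, spores, team) and enforces both a per-layer token budget and a total cap. A minimal standalone sketch of that packing loop; the chars/4 `estimateTokens` heuristic is an assumption here, since the real helper lives in `@myco/constants`:

```typescript
// Rough token estimate: assumed chars/4 heuristic (the real estimateTokens
// in @myco/constants may differ).
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Pack formatted layers until the next one would exceed the total budget,
// mirroring the loop over allLayers in buildInjectedContext.
function packLayers(layers: string[], maxTokens: number): { text: string; tokens: number } {
  const parts: string[] = [];
  let total = 0;
  for (const layer of layers.filter(Boolean)) {
    const layerTokens = estimateTokens(layer);
    if (total + layerTokens > maxTokens) break; // drop the layer, don't truncate it
    parts.push(layer);
    total += layerTokens;
  }
  return { text: parts.join('\n\n'), tokens: total };
}
```

Note the loop breaks rather than truncating: a layer that would push the total past the cap is dropped whole, and `formatLayer` applies the same break-not-truncate rule per item within a layer.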
@@ -0,0 +1,15 @@
+ import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
+ import "./chunk-PZUWP5VK.js";
+
+ // src/cli/setup-digest.ts
+ async function run(_args, _vaultDir) {
+   console.log("Digest configuration has been removed in v3.");
+   console.log("The Myco agent (Claude Agent SDK) manages intelligence processing.");
+   console.log("");
+   console.log('Use "myco setup-llm" to configure the embedding provider, or');
+   console.log('use "myco config set embedding.model <model>" to change the embedding model.');
+ }
+ export {
+   run
+ };
+ //# sourceMappingURL=setup-digest-4KDSXAIV.js.map
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/cli/setup-digest.ts"],"sourcesContent":["export async function run(_args: string[], _vaultDir: string): Promise<void> {\n console.log('Digest configuration has been removed in v3.');\n console.log('The Myco agent (Claude Agent SDK) manages intelligence processing.');\n console.log('');\n console.log('Use \"myco setup-llm\" to configure the embedding provider, or');\n console.log('use \"myco config set embedding.model <model>\" to change the embedding model.');\n}\n"],"mappings":";;;;AAAA,eAAsB,IAAI,OAAiB,WAAkC;AAC3E,UAAQ,IAAI,8CAA8C;AAC1D,UAAQ,IAAI,oEAAoE;AAChF,UAAQ,IAAI,EAAE;AACd,UAAQ,IAAI,8DAA8D;AAC1E,UAAQ,IAAI,8EAA8E;AAC5F;","names":[]}
@@ -0,0 +1,81 @@
+ import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
+ import {
+   withEmbedding
+ } from "./chunk-M5XWW7UI.js";
+ import "./chunk-KB4DGYIY.js";
+ import {
+   parseStringFlag
+ } from "./chunk-SAKJMNSR.js";
+ import "./chunk-UBZPD4HN.js";
+ import {
+   loadConfig,
+   updateConfig
+ } from "./chunk-MHSCMET3.js";
+ import "./chunk-D7TYRPRM.js";
+ import "./chunk-E4VLWIJC.js";
+ import "./chunk-KH64DHOY.js";
+ import "./chunk-MYX5NCRH.js";
+ import "./chunk-TRUJLI6K.js";
+ import "./chunk-5VZ52A4T.js";
+ import "./chunk-WGTCA2NU.js";
+ import "./chunk-PB6TOLRQ.js";
+ import "./chunk-LPUQPDC2.js";
+ import "./chunk-PZUWP5VK.js";
+
+ // src/cli/setup-llm.ts
+ import fs from "fs";
+ import path from "path";
+ var DAEMON_STATE_FILENAME = "daemon.json";
+ var USAGE = `Usage: myco setup-llm [options]
+
+ Configure embedding provider settings.
+
+ In v3, LLM configuration is managed by the Myco agent (Claude Agent SDK).
+ Only embedding settings are user-configurable.
+
+ Options:
+   --embedding-provider <name>   Embedding provider (ollama, openai-compatible)
+   --embedding-model <name>      Embedding model name
+   --embedding-url <url>         Embedding provider base URL
+   --show                        Show current settings and exit
+ `;
+ async function run(args, vaultDir) {
+   if (args.includes("--show")) {
+     const config = loadConfig(vaultDir);
+     console.log(JSON.stringify(config.embedding, null, 2));
+     return;
+   }
+   if (args.length === 0) {
+     console.log(USAGE);
+     return;
+   }
+   const llmProvider = parseStringFlag(args, "--llm-provider");
+   const llmModel = parseStringFlag(args, "--llm-model");
+   const llmUrl = parseStringFlag(args, "--llm-url");
+   const llmContextWindow = parseStringFlag(args, "--llm-context-window");
+   const llmMaxTokens = parseStringFlag(args, "--llm-max-tokens");
+   if (llmProvider || llmModel || llmUrl || llmContextWindow || llmMaxTokens) {
+     console.log("Note: LLM configuration is managed by the Myco agent. LLM flags are ignored.");
+   }
+   const updates = {};
+   const embeddingProvider = parseStringFlag(args, "--embedding-provider");
+   if (embeddingProvider !== void 0) updates.provider = embeddingProvider;
+   const embeddingModel = parseStringFlag(args, "--embedding-model");
+   if (embeddingModel !== void 0) updates.model = embeddingModel;
+   const embeddingUrl = parseStringFlag(args, "--embedding-url");
+   if (embeddingUrl !== void 0) updates.base_url = embeddingUrl;
+   const updated = updateConfig(vaultDir, (config) => withEmbedding(config, updates));
+   console.log("Embedding configuration updated.");
+   console.log(JSON.stringify(updated.embedding, null, 2));
+   if (embeddingModel !== void 0) {
+     console.log("\nWarning: changing the embedding model requires a full vector index rebuild.");
+     console.log("Run: myco rebuild");
+   }
+   if (fs.existsSync(path.join(vaultDir, DAEMON_STATE_FILENAME))) {
+     console.log("\nNote: restart the daemon for changes to take effect (myco restart)");
+   }
+ }
+ export {
+   run
+ };
+ //# sourceMappingURL=setup-llm-GKMCHURK.js.map
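`setup-llm` leans on a `parseStringFlag` helper to pull `--flag value` pairs out of `argv`. A plausible minimal implementation, for illustration only (the real one is bundled in `chunk-SAKJMNSR.js` and may handle more edge cases):

```typescript
// Hypothetical sketch of parseStringFlag: returns the token following the
// flag, or undefined when the flag is absent or has no trailing value.
function parseStringFlag(args: string[], flag: string): string | undefined {
  const i = args.indexOf(flag);
  return i !== -1 && i + 1 < args.length ? args[i + 1] : undefined;
}
```

With this shape, absent flags come back `undefined`, which is why the command checks `!== void 0` before adding a field to `updates` rather than testing truthiness.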
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/cli/setup-llm.ts"],"sourcesContent":["import fs from 'node:fs';\nimport path from 'node:path';\nimport { loadConfig, updateConfig } from '../config/loader.js';\nimport { withEmbedding } from '../config/updates.js';\nimport { parseStringFlag } from './shared.js';\nimport type { EmbeddingProviderConfig } from '../config/schema.js';\n\nconst DAEMON_STATE_FILENAME = 'daemon.json';\n\nconst USAGE = `Usage: myco setup-llm [options]\n\nConfigure embedding provider settings.\n\nIn v3, LLM configuration is managed by the Myco agent (Claude Agent SDK).\nOnly embedding settings are user-configurable.\n\nOptions:\n --embedding-provider <name> Embedding provider (ollama, openai-compatible)\n --embedding-model <name> Embedding model name\n --embedding-url <url> Embedding provider base URL\n --show Show current settings and exit\n`;\n\nexport async function run(args: string[], vaultDir: string): Promise<void> {\n // Show current settings\n if (args.includes('--show')) {\n const config = loadConfig(vaultDir);\n console.log(JSON.stringify(config.embedding, null, 2));\n return;\n }\n\n // No flags = show usage\n if (args.length === 0) {\n console.log(USAGE);\n return;\n }\n\n // Warn about removed LLM flags\n const llmProvider = parseStringFlag(args, '--llm-provider');\n const llmModel = parseStringFlag(args, '--llm-model');\n const llmUrl = parseStringFlag(args, '--llm-url');\n const llmContextWindow = parseStringFlag(args, '--llm-context-window');\n const llmMaxTokens = parseStringFlag(args, '--llm-max-tokens');\n if (llmProvider || llmModel || llmUrl || llmContextWindow || llmMaxTokens) {\n console.log('Note: LLM configuration is managed by the Myco agent. 
LLM flags are ignored.');\n }\n\n // Build partial embedding update from flags\n const updates: Partial<EmbeddingProviderConfig> = {};\n\n const embeddingProvider = parseStringFlag(args, '--embedding-provider');\n if (embeddingProvider !== undefined) updates.provider = embeddingProvider as EmbeddingProviderConfig['provider'];\n\n const embeddingModel = parseStringFlag(args, '--embedding-model');\n if (embeddingModel !== undefined) updates.model = embeddingModel;\n\n const embeddingUrl = parseStringFlag(args, '--embedding-url');\n if (embeddingUrl !== undefined) updates.base_url = embeddingUrl;\n\n // Apply the update through the single write gate\n const updated = updateConfig(vaultDir, (config) => withEmbedding(config, updates));\n\n console.log('Embedding configuration updated.');\n console.log(JSON.stringify(updated.embedding, null, 2));\n\n if (embeddingModel !== undefined) {\n console.log('\\nWarning: changing the embedding model requires a full vector index rebuild.');\n console.log('Run: myco rebuild');\n }\n\n if (fs.existsSync(path.join(vaultDir, DAEMON_STATE_FILENAME))) {\n console.log('\\nNote: restart the daemon for changes to take effect (myco restart)');\n 
}\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;AAAA,OAAO,QAAQ;AACf,OAAO,UAAU;AAMjB,IAAM,wBAAwB;AAE9B,IAAM,QAAQ;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAcd,eAAsB,IAAI,MAAgB,UAAiC;AAEzE,MAAI,KAAK,SAAS,QAAQ,GAAG;AAC3B,UAAM,SAAS,WAAW,QAAQ;AAClC,YAAQ,IAAI,KAAK,UAAU,OAAO,WAAW,MAAM,CAAC,CAAC;AACrD;AAAA,EACF;AAGA,MAAI,KAAK,WAAW,GAAG;AACrB,YAAQ,IAAI,KAAK;AACjB;AAAA,EACF;AAGA,QAAM,cAAc,gBAAgB,MAAM,gBAAgB;AAC1D,QAAM,WAAW,gBAAgB,MAAM,aAAa;AACpD,QAAM,SAAS,gBAAgB,MAAM,WAAW;AAChD,QAAM,mBAAmB,gBAAgB,MAAM,sBAAsB;AACrE,QAAM,eAAe,gBAAgB,MAAM,kBAAkB;AAC7D,MAAI,eAAe,YAAY,UAAU,oBAAoB,cAAc;AACzE,YAAQ,IAAI,8EAA8E;AAAA,EAC5F;AAGA,QAAM,UAA4C,CAAC;AAEnD,QAAM,oBAAoB,gBAAgB,MAAM,sBAAsB;AACtE,MAAI,sBAAsB,OAAW,SAAQ,WAAW;AAExD,QAAM,iBAAiB,gBAAgB,MAAM,mBAAmB;AAChE,MAAI,mBAAmB,OAAW,SAAQ,QAAQ;AAElD,QAAM,eAAe,gBAAgB,MAAM,iBAAiB;AAC5D,MAAI,iBAAiB,OAAW,SAAQ,WAAW;AAGnD,QAAM,UAAU,aAAa,UAAU,CAAC,WAAW,cAAc,QAAQ,OAAO,CAAC;AAEjF,UAAQ,IAAI,kCAAkC;AAC9C,UAAQ,IAAI,KAAK,UAAU,QAAQ,WAAW,MAAM,CAAC,CAAC;AAEtD,MAAI,mBAAmB,QAAW;AAChC,YAAQ,IAAI,+EAA+E;AAC3F,YAAQ,IAAI,mBAAmB;AAAA,EACjC;AAEA,MAAI,GAAG,WAAW,KAAK,KAAK,UAAU,qBAAqB,CAAC,GAAG;AAC7D,YAAQ,IAAI,sEAAsE;AAAA,EACpF;AACF;","names":[]}
@@ -0,0 +1,35 @@
+ # Built-in Myco agent definition.
+ # Loaded at startup and registered in the agents table as source: 'built-in'.
+
+ name: myco-agent
+ displayName: Myco Agent
+ description: >
+   Default intelligence agent that processes captured session data, extracts
+   observations (spores), builds the knowledge graph, manages spore lifecycle,
+   and synthesizes digest extracts.
+ model: claude-sonnet-4-20250514
+ maxTurns: 30
+ timeoutSeconds: 300
+ systemPromptPath: ../prompts/agent.md
+
+ tools:
+   # Read tools
+   - vault_unprocessed
+   - vault_spores
+   - vault_sessions
+   - vault_search_fts
+   - vault_search_semantic
+   - vault_state
+   - vault_entities
+   - vault_edges
+   # Write tools
+   - vault_create_spore
+   - vault_create_entity
+   - vault_create_edge
+   - vault_resolve_spore
+   - vault_update_session
+   - vault_mark_processed
+   - vault_read_digest
+   - vault_write_digest
+   - vault_set_state
+   - vault_report
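Per the header comment, this definition is loaded at startup and registered as `source: 'built-in'`. A hypothetical sketch of the shape check a loader might run on the parsed document (field names from the YAML above; the package's actual loader and error handling are not shown in this diff):

```typescript
// Hypothetical validation of a parsed agent definition. The input is the
// plain object a YAML parser would produce from the file above.
function validateAgentDef(raw: Record<string, unknown>): string[] {
  const errors: string[] = [];
  if (typeof raw.name !== 'string' || raw.name.length === 0) {
    errors.push('name is required');
  }
  if (typeof raw.maxTurns !== 'number' || raw.maxTurns <= 0) {
    errors.push('maxTurns must be a positive number');
  }
  if (!Array.isArray(raw.tools) || raw.tools.length === 0) {
    errors.push('tools must be a non-empty list');
  }
  return errors; // empty array means the definition passed
}
```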
@@ -0,0 +1,84 @@
+ # =============================================================================
+ # Built-in Task: Digest Only
+ # =============================================================================
+ # Regenerate digest extracts without processing new data.
+ # Useful after manual edits to spores or after a consolidation run.
+ # =============================================================================
+
+ name: digest-only
+ displayName: Digest Only
+ description: >
+   Regenerate digest extracts at all token tiers using current vault knowledge.
+   Does not process new data or extract spores. Useful after manual edits
+   to spores or graph data.
+ agent: myco-agent
+ isDefault: false
+ maxTurns: 28
+ timeoutSeconds: 1800
+
+ prompt: |
+   Regenerate digest extracts from current vault state.
+
+   ## Phase 1 — Assess Current State (budget: 4 turns)
+
+   1. Call `vault_state` for context
+   2. Call `vault_read_digest` (no tier param) to see current digest metadata
+      and assess whether any tiers exist and how fresh they are
+   3. Call `vault_spores` with status "active" to see current observations
+   4. Call `vault_sessions` to understand recent activity
+
+   If the current digest is very recent and no meaningful vault changes have
+   occurred since it was written (check spore counts and session dates), call
+   `vault_report` with action "skip" and reason "digest is already current",
+   then stop. Only proceed if there is genuine new material to incorporate.
+
+   ## Phase 2 — Survey New Material (budget: 5 turns)
+
+   Based on the spore types and tags you saw in Phase 1, construct targeted
+   search queries. Search for specific themes you identified, not generic
+   categories. For example, if Phase 1 showed new "gotcha" spores about SQLite
+   and new "decision" spores about caching, search for those specific topics.
+
+   Call `vault_search_semantic` with these targeted queries to gather the
+   material you will integrate into the digest.
+
+   The existing digest is your baseline — your goal is to integrate new
+   material into it, not start from scratch.
+
+   ## Phase 3 — Update Extracts (budget: 25 turns)
+
+   Update ALL 3 tiers in this order (largest first — each subsequent tier
+   is a compression of the one above):
+   1. Tier 10000 — Full institutional knowledge (update if any new content)
+   2. Tier 5000 — Deep onboarding (update if new trade-offs or patterns)
+   3. Tier 1500 — Executive briefing (most compressed — update if
+      important new decisions or gotchas)
+
+   For each tier: read current content via `vault_read_digest`, then write
+   the updated version via `vault_write_digest`. Each tier is self-contained.
+   Preserve well-crafted existing content — integrate new material, don't
+   rewrite from scratch.
+
+   If a tier genuinely has no new material to integrate, skip it — but
+   do not skip tiers just to save turns. All 3 tiers should be kept current.
+
+   Prioritize within each tier: recent insights > active decisions >
+   unresolved gotchas > architectural patterns > historical context.
+
+   ## Phase 4 — Report (budget: 1 turn)
+
+   Call `vault_report` with action "digest":
+   - Tiers updated: [list]
+   - Tiers skipped (already current): [list]
+   - Active spores used as substrate: N
+   - Key themes incorporated
+
+ toolOverrides:
+   - vault_spores
+   - vault_sessions
+   - vault_search_semantic
+   - vault_state
+   - vault_read_digest
+   - vault_write_digest
+   - vault_set_state
+   - vault_report
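The three digest tiers above are fixed token sizes, each smaller tier a compression of the larger one. A consumer deciding which extract to inject could pick the largest tier that fits its context budget; a hypothetical sketch (tier values taken from the task above, the selection logic itself is an assumption, not code from this package):

```typescript
// Digest tiers from the task definition, ordered largest first.
const DIGEST_TIERS = [10000, 5000, 1500] as const;

// Pick the largest tier whose token size fits within the caller's budget,
// or undefined when even the smallest tier does not fit.
function pickDigestTier(budgetTokens: number): number | undefined {
  return DIGEST_TIERS.find((tier) => tier <= budgetTokens);
}
```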
@@ -0,0 +1,87 @@
+ # =============================================================================
+ # Built-in Task: Extract Only
+ # =============================================================================
+ # Quick pass: extract spores from unprocessed batches and update summaries.
+ # No graph building, no consolidation, no digest.
+ # =============================================================================
+
+ name: extract-only
+ displayName: Extract Only
+ description: >
+   Extract observations (spores) from unprocessed prompt batches without
+   building graph entities/edges or regenerating digest. Useful for a
+   quick pass that captures knowledge without full reasoning overhead.
+ agent: myco-agent
+ isDefault: false
+ maxTurns: 30
+ timeoutSeconds: 300
+
+ prompt: |
+   Extract spores from unprocessed batches. No graph or digest work.
+   Budget: ~25 turns.
+
+   ## Phase 1 — Read State (budget: 2 turns)
+
+   1. Call `vault_state` to get cursor (`last_processed_batch_id`)
+   2. Call `vault_unprocessed` with `after_id`
+
+   If no unprocessed batches, report "no work" and finish.
+
+   ## Phase 2 — Extract (budget: 18 turns)
+
+   Budget ~3 turns per batch. If more batches than turns allow, prioritize
+   batches with longer prompts (these contain more substantive work).
+
+   For each batch:
+   1. Read `user_prompt` — understand what the developer asked
+   2. Identify candidate observations — genuine insights only, not activity logs
+   3. **Group candidates by topic across all batches** before searching.
+      One search call per topic is cheaper than one per batch.
+
+   For each TOPIC group:
+   1. Use `vault_search_semantic` to find existing spores that already cover this topic
+   2. If a similar spore exists:
+      - If the new observation adds meaningful new detail: call
+        `vault_create_spore` then supersede the old via `vault_resolve_spore`
+        action "supersede" with new_spore_id
+      - If the new observation adds nothing new: skip it entirely
+   3. If no similar spore exists: call `vault_create_spore`
+
+   After processing each batch:
+   4. Call `vault_mark_processed`
+   5. Update cursor via `vault_set_state`
+
+   ## Phase 3 — Session Summaries (budget: 4 turns)
+
+   For each session touched:
+   1. Call `vault_sessions` to check existing title/summary
+   2. If missing or stale, review the prompt batches you just processed —
+      use the user_prompt and response_summary from those batches as
+      your primary source for the title and summary
+   3. Call `vault_update_session` with BOTH title and summary
+
+   Title: describes WHAT WAS ACCOMPLISHED (under 80 chars, sentence case).
+   Synthesize from the full arc of batches, not just the first prompt.
+   NEVER use file paths, directory names, or the user's first message.
+   Summary: 2-4 sentences — rich in detail (files, tools, outcomes).
+
+   ## Phase 4 — Report (budget: 1 turn)
+
+   Call `vault_report` with action "extract":
+   - Batches processed: N
+   - Spores created: N (by type)
+   - Sessions updated: N
+
+ toolOverrides:
+   - vault_unprocessed
+   - vault_spores
+   - vault_sessions
+   - vault_search_fts
+   - vault_search_semantic
+   - vault_state
+   - vault_create_spore
+   - vault_resolve_spore
+   - vault_update_session
+   - vault_mark_processed
+   - vault_set_state
+   - vault_report
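Phase 2's create/supersede/skip branching can be summarized as a small pure function. This is a sketch of the decision rules described above, not code that ships in the package:

```typescript
type SporeAction = 'create' | 'supersede' | 'skip';

// Mirrors the Phase 2 rules: no similar spore -> create; a similar spore
// plus meaningful new detail -> create then supersede the old one;
// otherwise skip the candidate entirely.
function sporeAction(similarSporeExists: boolean, addsNewDetail: boolean): SporeAction {
  if (!similarSporeExists) return 'create';
  return addsNewDetail ? 'supersede' : 'skip';
}
```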