@goondocks/myco 0.3.7 → 0.4.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.claude-plugin/marketplace.json +1 -1
- package/.claude-plugin/plugin.json +1 -1
- package/README.md +9 -4
- package/commands/init.md +63 -39
- package/commands/setup-llm.md +69 -44
- package/commands/status.md +28 -10
- package/dist/{chunk-YFG2O5HR.js → chunk-2GJFTIWX.js} +2 -2
- package/dist/chunk-4FCFRJIQ.js +147 -0
- package/dist/chunk-4FCFRJIQ.js.map +1 -0
- package/dist/{chunk-PA3VMINE.js → chunk-AK6GNLPV.js} +6 -1
- package/dist/chunk-AK6GNLPV.js.map +1 -0
- package/dist/{chunk-JKOALBZC.js → chunk-BNIYWCST.js} +2 -2
- package/dist/{chunk-ISCT2SI6.js → chunk-G6ZMTQMJ.js} +7357 -60
- package/dist/chunk-G6ZMTQMJ.js.map +1 -0
- package/dist/{chunk-7WNE22W7.js → chunk-IVS5MYBL.js} +3 -3
- package/dist/{chunk-7WNE22W7.js.map → chunk-IVS5MYBL.js.map} +1 -1
- package/dist/{chunk-7VPJK56U.js → chunk-JBD5KP5G.js} +31 -16
- package/dist/chunk-JBD5KP5G.js.map +1 -0
- package/dist/chunk-NUA7UTIY.js +37 -0
- package/dist/chunk-NUA7UTIY.js.map +1 -0
- package/dist/{chunk-NYAWCMRZ.js → chunk-OUFSLZTX.js} +4 -4
- package/dist/chunk-P7RNAYU7.js +242 -0
- package/dist/chunk-P7RNAYU7.js.map +1 -0
- package/dist/chunk-QQ36XEJP.js +38 -0
- package/dist/chunk-QQ36XEJP.js.map +1 -0
- package/dist/chunk-RDXTQ436.js +49 -0
- package/dist/chunk-RDXTQ436.js.map +1 -0
- package/dist/{chunk-AWF3M57N.js → chunk-S7EIHYE7.js} +8 -8
- package/dist/{chunk-AWF3M57N.js.map → chunk-S7EIHYE7.js.map} +1 -1
- package/dist/{chunk-QWU7QLZI.js → chunk-TZDDXRHG.js} +10 -10
- package/dist/chunk-TZDDXRHG.js.map +1 -0
- package/dist/chunk-VYV5IFD6.js +99 -0
- package/dist/chunk-VYV5IFD6.js.map +1 -0
- package/dist/{chunk-LR7RQCOB.js → chunk-XCPQHC4X.js} +2 -2
- package/dist/{chunk-CCIV47S4.js → chunk-XHWIIU5D.js} +8 -9
- package/dist/chunk-XHWIIU5D.js.map +1 -0
- package/dist/{chunk-FFQNE6CT.js → chunk-YZO22BBI.js} +45 -31
- package/dist/chunk-YZO22BBI.js.map +1 -0
- package/dist/{chunk-ZBNT6E22.js → chunk-ZCBL5HER.js} +2 -2
- package/dist/{cli-3WQSDSW6.js → cli-ZN6VBA7V.js} +23 -17
- package/dist/cli-ZN6VBA7V.js.map +1 -0
- package/dist/{client-5T4M42UQ.js → client-5SUO2UYH.js} +5 -5
- package/dist/{config-MD4XMLUS.js → config-4GGMWGAF.js} +4 -4
- package/dist/{detect-providers-LNOLBICR.js → detect-providers-5FU3BN5Q.js} +3 -3
- package/dist/{init-RALMQKOQ.js → init-7UXGDOFS.js} +51 -60
- package/dist/init-7UXGDOFS.js.map +1 -0
- package/dist/{main-S3WSUF5T.js → main-6UPAIDGS.js} +648 -228
- package/dist/main-6UPAIDGS.js.map +1 -0
- package/dist/{rebuild-JW6BCHHZ.js → rebuild-QDSYYCS7.js} +10 -10
- package/dist/rebuild-QDSYYCS7.js.map +1 -0
- package/dist/{reprocess-SNXFNKBN.js → reprocess-ZNUQCIS3.js} +18 -18
- package/dist/reprocess-ZNUQCIS3.js.map +1 -0
- package/dist/{restart-YE2IGOYT.js → restart-5UY2KV54.js} +6 -6
- package/dist/{search-2HMG3ON7.js → search-2VEN3XIG.js} +9 -9
- package/dist/{server-JM3TM7D2.js → server-OR5B4B7K.js} +77 -54
- package/dist/{server-JM3TM7D2.js.map → server-OR5B4B7K.js.map} +1 -1
- package/dist/{session-5GI2YU6R.js → session-QF6MILAC.js} +2 -2
- package/dist/{session-start-2UEEEO52.js → session-start-TUITIUMB.js} +29 -28
- package/dist/session-start-TUITIUMB.js.map +1 -0
- package/dist/setup-digest-ETCZAUIU.js +15 -0
- package/dist/setup-llm-DWEJE3JE.js +15 -0
- package/dist/setup-llm-DWEJE3JE.js.map +1 -0
- package/dist/src/cli.js +4 -4
- package/dist/src/daemon/main.js +4 -4
- package/dist/src/hooks/post-tool-use.js +5 -5
- package/dist/src/hooks/session-end.js +5 -5
- package/dist/src/hooks/session-start.js +4 -4
- package/dist/src/hooks/stop.js +7 -7
- package/dist/src/hooks/user-prompt-submit.js +5 -5
- package/dist/src/hooks/user-prompt-submit.js.map +1 -1
- package/dist/src/mcp/server.js +4 -4
- package/dist/src/prompts/classification.md +1 -0
- package/dist/src/prompts/digest-10000.md +74 -0
- package/dist/src/prompts/digest-1500.md +25 -0
- package/dist/src/prompts/digest-3000.md +32 -0
- package/dist/src/prompts/digest-5000.md +43 -0
- package/dist/src/prompts/digest-system.md +32 -0
- package/dist/src/prompts/extraction.md +11 -10
- package/dist/src/prompts/summary.md +11 -1
- package/dist/src/prompts/title.md +1 -1
- package/dist/{stats-IOWXG576.js → stats-IVIXIKTS.js} +12 -12
- package/dist/stats-IVIXIKTS.js.map +1 -0
- package/dist/{verify-7MWOV72E.js → verify-4H6CEE5T.js} +6 -6
- package/dist/{version-S7MHLD5P.js → version-5B2TWXQJ.js} +4 -4
- package/dist/version-5B2TWXQJ.js.map +1 -0
- package/package.json +1 -1
- package/skills/myco/SKILL.md +20 -20
- package/skills/myco/references/wisdom.md +14 -14
- package/skills/rules/SKILL.md +4 -4
- package/dist/chunk-7VPJK56U.js.map +0 -1
- package/dist/chunk-BA23DROX.js +0 -160
- package/dist/chunk-BA23DROX.js.map +0 -1
- package/dist/chunk-CCIV47S4.js.map +0 -1
- package/dist/chunk-EF4JVH24.js +0 -7299
- package/dist/chunk-EF4JVH24.js.map +0 -1
- package/dist/chunk-FFQNE6CT.js.map +0 -1
- package/dist/chunk-ISCT2SI6.js.map +0 -1
- package/dist/chunk-PA3VMINE.js.map +0 -1
- package/dist/chunk-QWU7QLZI.js.map +0 -1
- package/dist/chunk-YMYJ7FNH.js +0 -19
- package/dist/chunk-YMYJ7FNH.js.map +0 -1
- package/dist/cli-3WQSDSW6.js.map +0 -1
- package/dist/init-RALMQKOQ.js.map +0 -1
- package/dist/main-S3WSUF5T.js.map +0 -1
- package/dist/rebuild-JW6BCHHZ.js.map +0 -1
- package/dist/reprocess-SNXFNKBN.js.map +0 -1
- package/dist/session-start-2UEEEO52.js.map +0 -1
- package/dist/stats-IOWXG576.js.map +0 -1
- /package/dist/{chunk-YFG2O5HR.js.map → chunk-2GJFTIWX.js.map} +0 -0
- /package/dist/{chunk-JKOALBZC.js.map → chunk-BNIYWCST.js.map} +0 -0
- /package/dist/{chunk-NYAWCMRZ.js.map → chunk-OUFSLZTX.js.map} +0 -0
- /package/dist/{chunk-LR7RQCOB.js.map → chunk-XCPQHC4X.js.map} +0 -0
- /package/dist/{chunk-ZBNT6E22.js.map → chunk-ZCBL5HER.js.map} +0 -0
- /package/dist/{client-5T4M42UQ.js.map → client-5SUO2UYH.js.map} +0 -0
- /package/dist/{config-MD4XMLUS.js.map → config-4GGMWGAF.js.map} +0 -0
- /package/dist/{detect-providers-LNOLBICR.js.map → detect-providers-5FU3BN5Q.js.map} +0 -0
- /package/dist/{restart-YE2IGOYT.js.map → restart-5UY2KV54.js.map} +0 -0
- /package/dist/{search-2HMG3ON7.js.map → search-2VEN3XIG.js.map} +0 -0
- /package/dist/{session-5GI2YU6R.js.map → session-QF6MILAC.js.map} +0 -0
- /package/dist/{version-S7MHLD5P.js.map → setup-digest-ETCZAUIU.js.map} +0 -0
- /package/dist/{verify-7MWOV72E.js.map → verify-4H6CEE5T.js.map} +0 -0
package/dist/session-QF6MILAC.js
CHANGED

@@ -1,7 +1,7 @@
 import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
 import {
   MycoIndex
-} from "./chunk-
+} from "./chunk-AK6GNLPV.js";
 import "./chunk-PZUWP5VK.js";

 // src/cli/session.ts
@@ -41,4 +41,4 @@ ${target.content?.slice(0, 2e3)}`);
 export {
   run
 };
-//# sourceMappingURL=session-
+//# sourceMappingURL=session-QF6MILAC.js.map
package/dist/session-start-TUITIUMB.js
CHANGED

@@ -1,32 +1,32 @@
 import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
 import {
-
-
-} from "./chunk-
+  planFm,
+  sporeFm
+} from "./chunk-NUA7UTIY.js";
 import {
   MycoIndex
-} from "./chunk-
+} from "./chunk-AK6GNLPV.js";
 import {
   loadConfig
-} from "./chunk-
-import "./chunk-
+} from "./chunk-QQ36XEJP.js";
+import "./chunk-G6ZMTQMJ.js";
 import {
   readStdin
-} from "./chunk-
+} from "./chunk-ZCBL5HER.js";
 import {
   DaemonClient
-} from "./chunk-
-import "./chunk-
+} from "./chunk-OUFSLZTX.js";
+import "./chunk-2GJFTIWX.js";
 import {
   resolveVaultDir
 } from "./chunk-N33KUCFP.js";
-import "./chunk-
+import "./chunk-BNIYWCST.js";
 import {
-  CHARS_PER_TOKEN,
-  CONTEXT_MEMORY_PREVIEW_CHARS,
   CONTEXT_PLAN_PREVIEW_CHARS,
-  CONTEXT_SESSION_PREVIEW_CHARS
-
+  CONTEXT_SESSION_PREVIEW_CHARS,
+  CONTEXT_SPORE_PREVIEW_CHARS,
+  estimateTokens
+} from "./chunk-JBD5KP5G.js";
 import "./chunk-PZUWP5VK.js";

 // src/context/relevance.ts
@@ -94,23 +94,23 @@ function buildInjectedContext(index, config, context) {
     ),
     budgets.sessions
   );
-  const
-  const status =
+  const spores = index.query({ type: "spore", limit: 20 }).filter((m) => {
+    const status = sporeFm(m).status;
     return status !== "superseded" && status !== "archived";
   });
-  const
+  const scoredSpores = scoreRelevance(spores, {
     branch: context.branch,
     activePlanIds
   });
-  const
-  "Relevant
-
-  (m) => `- **${m.note.title}** (${
+  const sporesText = formatLayer(
+    "Relevant Spores",
+    scoredSpores.slice(0, 5).map(
+      (m) => `- **${m.note.title}** (${sporeFm(m.note).observation_type}): ${m.note.content.slice(0, CONTEXT_SPORE_PREVIEW_CHARS)}`
     ),
-  budgets.
+    budgets.spores
   );
   const teamText = formatLayer("Team Activity", [], budgets.team);
-  const allLayers = [plansText, sessionsText,
+  const allLayers = [plansText, sessionsText, sporesText, teamText].filter(Boolean);
   const parts = [];
   let totalTokens = 0;
   for (const layer of allLayers) {
@@ -126,7 +126,7 @@ function buildInjectedContext(index, config, context) {
     layers: {
       plans: plansText,
       sessions: sessionsText,
-
+      spores: sporesText,
       team: teamText
     }
   };
@@ -144,9 +144,6 @@ function formatLayer(heading, items, budget) {
   }
   return text.trim();
 }
-function estimateTokens(text) {
-  return Math.ceil(text.length / CHARS_PER_TOKEN);
-}

 // src/hooks/session-start.ts
 import { execFileSync } from "child_process";
@@ -174,6 +171,10 @@ async function main() {
   });
   const contextResult = await client.post("/context", { session_id: sessionId, branch });
   if (contextResult.ok && contextResult.data?.text) {
+    if (contextResult.data.source === "digest") {
+      process.stderr.write(`[myco] Injecting digest extract (tier ${contextResult.data.tier})
+`);
+    }
     process.stdout.write(contextResult.data.text);
     return;
   }
@@ -188,4 +189,4 @@ async function main() {
   }
 }
 main();
-//# sourceMappingURL=session-start-
+//# sourceMappingURL=session-start-TUITIUMB.js.map
package/dist/session-start-TUITIUMB.js.map
ADDED

@@ -0,0 +1 @@
+
{"version":3,"sources":["../src/context/relevance.ts","../src/context/injector.ts","../src/hooks/session-start.ts"],"sourcesContent":["import type { IndexedNote } from '../index/sqlite.js';\n\nexport interface ScoredNote {\n note: IndexedNote;\n score: number;\n reason: string;\n}\n\ninterface RelevanceInput {\n branch?: string;\n activePlanIds?: string[];\n}\n\n// --- Recency thresholds (hours) and score weights ---\nconst RECENCY_TIERS = [\n { maxHours: 24, score: 3, label: 'recent (<24h)' },\n { maxHours: 72, score: 2, label: 'recent (<72h)' },\n { maxHours: 168, score: 1, label: 'recent (<1w)' },\n] as const;\n\nconst BRANCH_MATCH_SCORE = 3;\nconst PLAN_MATCH_SCORE = 2;\n\nconst MS_PER_HOUR = 3_600_000;\n\nexport function scoreRelevance(\n notes: IndexedNote[],\n input: RelevanceInput,\n): ScoredNote[] {\n return notes\n .map((note) => {\n let score = 0;\n const reasons: string[] = [];\n\n const fm = note.frontmatter as Record<string, unknown>;\n\n // Recency boost\n const age = Date.now() - new Date(note.created).getTime();\n const hoursOld = age / MS_PER_HOUR;\n for (const tier of RECENCY_TIERS) {\n if (hoursOld < tier.maxHours) {\n score += tier.score;\n reasons.push(tier.label);\n break;\n }\n }\n\n // Branch match\n if (input.branch && fm.branch === input.branch) {\n score += BRANCH_MATCH_SCORE;\n reasons.push('same branch');\n }\n\n // Plan match\n if (input.activePlanIds?.length) {\n const planRef = fm.plan as string | undefined;\n if (planRef && input.activePlanIds.some((id) =>\n planRef === `[[${id}]]` || planRef === id,\n )) {\n score += PLAN_MATCH_SCORE;\n reasons.push('active plan');\n }\n }\n\n return { note, score, reason: reasons.join(', ') };\n })\n .filter((s) => s.score > 0)\n .sort((a, b) => b.score - a.score);\n}\n","import type { MycoIndex } from '../index/sqlite.js';\nimport type { MycoConfig } from '../config/schema.js';\nimport { planFm, sporeFm } from '../vault/frontmatter.js';\nimport { scoreRelevance } from './relevance.js';\nimport { 
estimateTokens, CONTEXT_PLAN_PREVIEW_CHARS, CONTEXT_SESSION_PREVIEW_CHARS, CONTEXT_SPORE_PREVIEW_CHARS } from '../constants.js';\n\ninterface InjectionContext {\n branch?: string;\n}\n\ninterface InjectedContext {\n text: string;\n tokenEstimate: number;\n layers: {\n plans: string;\n sessions: string;\n spores: string;\n team: string;\n };\n}\n\nexport function buildInjectedContext(\n index: MycoIndex,\n config: MycoConfig,\n context: InjectionContext,\n): InjectedContext {\n const budgets = config.context.layers;\n\n // Layer 1: Active plans\n const plans = index.query({ type: 'plan' });\n const activePlans = plans.filter((p) =>\n ['active', 'in_progress'].includes(planFm(p).status ?? ''),\n );\n const plansText = formatLayer(\n 'Active Plans',\n activePlans.map((p) => `- **${p.title}** (${planFm(p).status}): ${p.content.slice(0, CONTEXT_PLAN_PREVIEW_CHARS)}`),\n budgets.plans,\n );\n\n // Layer 2: Recent sessions\n const sessions = index.query({ type: 'session', limit: 10 });\n const activePlanIds = activePlans.map((p) => p.id);\n const scoredSessions = scoreRelevance(sessions, {\n branch: context.branch,\n activePlanIds,\n });\n const sessionsText = formatLayer(\n 'Recent Sessions',\n scoredSessions.slice(0, 5).map((s) =>\n `- **${s.note.title}**: ${s.note.content.slice(0, CONTEXT_SESSION_PREVIEW_CHARS)} (${s.reason})`,\n ),\n budgets.sessions,\n );\n\n // Layer 3: Relevant spores (exclude superseded/archived)\n const spores = index.query({ type: 'spore', limit: 20 })\n .filter((m) => {\n const status = sporeFm(m).status;\n return status !== 'superseded' && status !== 'archived';\n });\n const scoredSpores = scoreRelevance(spores, {\n branch: context.branch,\n activePlanIds,\n });\n const sporesText = formatLayer(\n 'Relevant Spores',\n scoredSpores.slice(0, 5).map((m) =>\n `- **${m.note.title}** (${sporeFm(m.note).observation_type}): ${m.note.content.slice(0, CONTEXT_SPORE_PREVIEW_CHARS)}`,\n ),\n budgets.spores,\n );\n\n // Layer 4: Team activity\n const 
teamText = formatLayer('Team Activity', [], budgets.team);\n\n // Enforce total max_tokens budget\n const allLayers = [plansText, sessionsText, sporesText, teamText].filter(Boolean);\n const parts: string[] = [];\n let totalTokens = 0;\n\n for (const layer of allLayers) {\n const layerTokens = estimateTokens(layer);\n if (totalTokens + layerTokens > config.context.max_tokens) break;\n parts.push(layer);\n totalTokens += layerTokens;\n }\n\n const fullText = parts.join('\\n\\n');\n\n return {\n text: fullText,\n tokenEstimate: totalTokens,\n layers: {\n plans: plansText,\n sessions: sessionsText,\n spores: sporesText,\n team: teamText,\n },\n };\n}\n\nfunction formatLayer(heading: string, items: string[], budget: number): string {\n if (items.length === 0) return '';\n\n let text = `### ${heading}\\n`;\n let currentTokens = estimateTokens(text);\n\n for (const item of items) {\n const itemTokens = estimateTokens(item);\n if (currentTokens + itemTokens > budget) break;\n text += item + '\\n';\n currentTokens += itemTokens;\n }\n\n return text.trim();\n}\n\n","import { DaemonClient } from './client.js';\nimport { readStdin } from './read-stdin.js';\nimport { loadConfig } from '../config/loader.js';\nimport { MycoIndex } from '../index/sqlite.js';\nimport { buildInjectedContext } from '../context/injector.js';\nimport { resolveVaultDir } from '../vault/resolve.js';\nimport { execFileSync } from 'node:child_process';\nimport fs from 'node:fs';\nimport path from 'node:path';\n\nasync function main() {\n const VAULT_DIR = resolveVaultDir();\n if (!fs.existsSync(path.join(VAULT_DIR, 'myco.yaml'))) return;\n\n try {\n const config = loadConfig(VAULT_DIR);\n const client = new DaemonClient(VAULT_DIR);\n const healthy = await client.ensureRunning();\n\n const input = JSON.parse(await readStdin());\n const sessionId = input.session_id ?? 
`s-${Date.now()}`;\n\n let branch: string | undefined;\n try {\n branch = execFileSync('git', ['rev-parse', '--abbrev-ref', 'HEAD'], { encoding: 'utf-8' }).trim();\n } catch { /* not a git repo */ }\n\n if (healthy) {\n await client.post('/sessions/register', {\n session_id: sessionId,\n branch,\n started_at: new Date().toISOString(),\n });\n\n const contextResult = await client.post('/context', { session_id: sessionId, branch });\n\n if (contextResult.ok && contextResult.data?.text) {\n if (contextResult.data.source === 'digest') {\n process.stderr.write(`[myco] Injecting digest extract (tier ${contextResult.data.tier})\\n`);\n }\n process.stdout.write(contextResult.data.text);\n return;\n }\n }\n\n // Degraded: local FTS context only\n const index = new MycoIndex(path.join(VAULT_DIR, 'index.db'));\n const injected = buildInjectedContext(index, config, { branch });\n if (injected.text) process.stdout.write(injected.text);\n index.close();\n } catch (error) {\n process.stderr.write(`[myco] session-start error: ${(error as Error).message}\\n`);\n 
}\n}\n\nmain();\n"],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAcA,IAAM,gBAAgB;AAAA,EACpB,EAAE,UAAU,IAAI,OAAO,GAAG,OAAO,gBAAgB;AAAA,EACjD,EAAE,UAAU,IAAI,OAAO,GAAG,OAAO,gBAAgB;AAAA,EACjD,EAAE,UAAU,KAAK,OAAO,GAAG,OAAO,eAAe;AACnD;AAEA,IAAM,qBAAqB;AAC3B,IAAM,mBAAmB;AAEzB,IAAM,cAAc;AAEb,SAAS,eACd,OACA,OACc;AACd,SAAO,MACJ,IAAI,CAAC,SAAS;AACb,QAAI,QAAQ;AACZ,UAAM,UAAoB,CAAC;AAE3B,UAAM,KAAK,KAAK;AAGhB,UAAM,MAAM,KAAK,IAAI,IAAI,IAAI,KAAK,KAAK,OAAO,EAAE,QAAQ;AACxD,UAAM,WAAW,MAAM;AACvB,eAAW,QAAQ,eAAe;AAChC,UAAI,WAAW,KAAK,UAAU;AAC5B,iBAAS,KAAK;AACd,gBAAQ,KAAK,KAAK,KAAK;AACvB;AAAA,MACF;AAAA,IACF;AAGA,QAAI,MAAM,UAAU,GAAG,WAAW,MAAM,QAAQ;AAC9C,eAAS;AACT,cAAQ,KAAK,aAAa;AAAA,IAC5B;AAGA,QAAI,MAAM,eAAe,QAAQ;AAC/B,YAAM,UAAU,GAAG;AACnB,UAAI,WAAW,MAAM,cAAc;AAAA,QAAK,CAAC,OACvC,YAAY,KAAK,EAAE,QAAQ,YAAY;AAAA,MACzC,GAAG;AACD,iBAAS;AACT,gBAAQ,KAAK,aAAa;AAAA,MAC5B;AAAA,IACF;AAEA,WAAO,EAAE,MAAM,OAAO,QAAQ,QAAQ,KAAK,IAAI,EAAE;AAAA,EACnD,CAAC,EACA,OAAO,CAAC,MAAM,EAAE,QAAQ,CAAC,EACzB,KAAK,CAAC,GAAG,MAAM,EAAE,QAAQ,EAAE,KAAK;AACrC;;;AC/CO,SAAS,qBACd,OACA,QACA,SACiB;AACjB,QAAM,UAAU,OAAO,QAAQ;AAG/B,QAAM,QAAQ,MAAM,MAAM,EAAE,MAAM,OAAO,CAAC;AAC1C,QAAM,cAAc,MAAM;AAAA,IAAO,CAAC,MAChC,CAAC,UAAU,aAAa,EAAE,SAAS,OAAO,CAAC,EAAE,UAAU,EAAE;AAAA,EAC3D;AACA,QAAM,YAAY;AAAA,IAChB;AAAA,IACA,YAAY,IAAI,CAAC,MAAM,OAAO,EAAE,KAAK,OAAO,OAAO,CAAC,EAAE,MAAM,MAAM,EAAE,QAAQ,MAAM,GAAG,0BAA0B,CAAC,EAAE;AAAA,IAClH,QAAQ;AAAA,EACV;AAGA,QAAM,WAAW,MAAM,MAAM,EAAE,MAAM,WAAW,OAAO,GAAG,CAAC;AAC3D,QAAM,gBAAgB,YAAY,IAAI,CAAC,MAAM,EAAE,EAAE;AACjD,QAAM,iBAAiB,eAAe,UAAU;AAAA,IAC9C,QAAQ,QAAQ;AAAA,IAChB;AAAA,EACF,CAAC;AACD,QAAM,eAAe;AAAA,IACnB;AAAA,IACA,eAAe,MAAM,GAAG,CAAC,EAAE;AAAA,MAAI,CAAC,MAC9B,OAAO,EAAE,KAAK,KAAK,OAAO,EAAE,KAAK,QAAQ,MAAM,GAAG,6BAA6B,CAAC,KAAK,EAAE,MAAM;AAAA,IAC/F;AAAA,IACA,QAAQ;AAAA,EACV;AAGA,QAAM,SAAS,MAAM,MAAM,EAAE,MAAM,SAAS,OAAO,GAAG,CAAC,EACpD,OAAO,CAAC,MAAM;AACb,UAAM,SAAS,QAAQ,CAAC,EAAE;AAC1B,WAAO,WAAW,gBAAgB,WAAW;AAAA,EAC/C,CAAC;AACH,QAAM,eAAe,eAAe,QAAQ;AAAA,IAC1C,QAAQ,QAAQ;AAAA,IAChB;AAAA,EACF,CAAC;AACD,QAAM,aAAa;AAAA
,IACjB;AAAA,IACA,aAAa,MAAM,GAAG,CAAC,EAAE;AAAA,MAAI,CAAC,MAC5B,OAAO,EAAE,KAAK,KAAK,OAAO,QAAQ,EAAE,IAAI,EAAE,gBAAgB,MAAM,EAAE,KAAK,QAAQ,MAAM,GAAG,2BAA2B,CAAC;AAAA,IACtH;AAAA,IACA,QAAQ;AAAA,EACV;AAGA,QAAM,WAAW,YAAY,iBAAiB,CAAC,GAAG,QAAQ,IAAI;AAG9D,QAAM,YAAY,CAAC,WAAW,cAAc,YAAY,QAAQ,EAAE,OAAO,OAAO;AAChF,QAAM,QAAkB,CAAC;AACzB,MAAI,cAAc;AAElB,aAAW,SAAS,WAAW;AAC7B,UAAM,cAAc,eAAe,KAAK;AACxC,QAAI,cAAc,cAAc,OAAO,QAAQ,WAAY;AAC3D,UAAM,KAAK,KAAK;AAChB,mBAAe;AAAA,EACjB;AAEA,QAAM,WAAW,MAAM,KAAK,MAAM;AAElC,SAAO;AAAA,IACL,MAAM;AAAA,IACN,eAAe;AAAA,IACf,QAAQ;AAAA,MACN,OAAO;AAAA,MACP,UAAU;AAAA,MACV,QAAQ;AAAA,MACR,MAAM;AAAA,IACR;AAAA,EACF;AACF;AAEA,SAAS,YAAY,SAAiB,OAAiB,QAAwB;AAC7E,MAAI,MAAM,WAAW,EAAG,QAAO;AAE/B,MAAI,OAAO,OAAO,OAAO;AAAA;AACzB,MAAI,gBAAgB,eAAe,IAAI;AAEvC,aAAW,QAAQ,OAAO;AACxB,UAAM,aAAa,eAAe,IAAI;AACtC,QAAI,gBAAgB,aAAa,OAAQ;AACzC,YAAQ,OAAO;AACf,qBAAiB;AAAA,EACnB;AAEA,SAAO,KAAK,KAAK;AACnB;;;AC7GA,SAAS,oBAAoB;AAC7B,OAAO,QAAQ;AACf,OAAO,UAAU;AAEjB,eAAe,OAAO;AACpB,QAAM,YAAY,gBAAgB;AAClC,MAAI,CAAC,GAAG,WAAW,KAAK,KAAK,WAAW,WAAW,CAAC,EAAG;AAEvD,MAAI;AACF,UAAM,SAAS,WAAW,SAAS;AACnC,UAAM,SAAS,IAAI,aAAa,SAAS;AACzC,UAAM,UAAU,MAAM,OAAO,cAAc;AAE3C,UAAM,QAAQ,KAAK,MAAM,MAAM,UAAU,CAAC;AAC1C,UAAM,YAAY,MAAM,cAAc,KAAK,KAAK,IAAI,CAAC;AAErD,QAAI;AACJ,QAAI;AACF,eAAS,aAAa,OAAO,CAAC,aAAa,gBAAgB,MAAM,GAAG,EAAE,UAAU,QAAQ,CAAC,EAAE,KAAK;AAAA,IAClG,QAAQ;AAAA,IAAuB;AAE/B,QAAI,SAAS;AACX,YAAM,OAAO,KAAK,sBAAsB;AAAA,QACtC,YAAY;AAAA,QACZ;AAAA,QACA,aAAY,oBAAI,KAAK,GAAE,YAAY;AAAA,MACrC,CAAC;AAED,YAAM,gBAAgB,MAAM,OAAO,KAAK,YAAY,EAAE,YAAY,WAAW,OAAO,CAAC;AAErF,UAAI,cAAc,MAAM,cAAc,MAAM,MAAM;AAChD,YAAI,cAAc,KAAK,WAAW,UAAU;AAC1C,kBAAQ,OAAO,MAAM,yCAAyC,cAAc,KAAK,IAAI;AAAA,CAAK;AAAA,QAC5F;AACA,gBAAQ,OAAO,MAAM,cAAc,KAAK,IAAI;AAC5C;AAAA,MACF;AAAA,IACF;AAGA,UAAM,QAAQ,IAAI,UAAU,KAAK,KAAK,WAAW,UAAU,CAAC;AAC5D,UAAM,WAAW,qBAAqB,OAAO,QAAQ,EAAE,OAAO,CAAC;AAC/D,QAAI,SAAS,KAAM,SAAQ,OAAO,MAAM,SAAS,IAAI;AACrD,UAAM,MAAM;AAAA,EACd,SAAS,OAAO;AACd,YAAQ,OAAO,MAAM,+BAAgC,MAAgB,OAAO;AAAA,CAAI;AAAA,EAClF;AACF;AAEA,KAAK;","names":[]}
package/dist/setup-digest-ETCZAUIU.js
ADDED

@@ -0,0 +1,15 @@
+import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
+import {
+  run
+} from "./chunk-4FCFRJIQ.js";
+import "./chunk-XHWIIU5D.js";
+import "./chunk-SAKJMNSR.js";
+import "./chunk-P7RNAYU7.js";
+import "./chunk-G6ZMTQMJ.js";
+import "./chunk-BNIYWCST.js";
+import "./chunk-JBD5KP5G.js";
+import "./chunk-PZUWP5VK.js";
+export {
+  run
+};
+//# sourceMappingURL=setup-digest-ETCZAUIU.js.map
package/dist/setup-llm-DWEJE3JE.js
ADDED

@@ -0,0 +1,15 @@
+import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
+import {
+  run
+} from "./chunk-VYV5IFD6.js";
+import "./chunk-XHWIIU5D.js";
+import "./chunk-SAKJMNSR.js";
+import "./chunk-P7RNAYU7.js";
+import "./chunk-G6ZMTQMJ.js";
+import "./chunk-BNIYWCST.js";
+import "./chunk-JBD5KP5G.js";
+import "./chunk-PZUWP5VK.js";
+export {
+  run
+};
+//# sourceMappingURL=setup-llm-DWEJE3JE.js.map
@@ -0,0 +1 @@
+{"version":3,"sources":[],"sourcesContent":[],"mappings":"","names":[]}
package/dist/src/cli.js
CHANGED
@@ -2,12 +2,12 @@
 import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
 import {
   ensureNativeDeps
-} from "../chunk-
-import "../chunk-
-import "../chunk-
+} from "../chunk-XCPQHC4X.js";
+import "../chunk-BNIYWCST.js";
+import "../chunk-JBD5KP5G.js";
 import "../chunk-PZUWP5VK.js";

 // src/entries/cli.ts
 ensureNativeDeps();
-await import("../cli-
+await import("../cli-ZN6VBA7V.js");
 //# sourceMappingURL=cli.js.map
package/dist/src/daemon/main.js
CHANGED
@@ -1,13 +1,13 @@
 import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
 import {
   ensureNativeDeps
-} from "../../chunk-
-import "../../chunk-
-import "../../chunk-
+} from "../../chunk-XCPQHC4X.js";
+import "../../chunk-BNIYWCST.js";
+import "../../chunk-JBD5KP5G.js";
 import "../../chunk-PZUWP5VK.js";

 // src/entries/daemon.ts
 ensureNativeDeps();
-var { main } = await import("../../main-
+var { main } = await import("../../main-6UPAIDGS.js");
 await main();
 //# sourceMappingURL=main.js.map
package/dist/src/hooks/post-tool-use.js
CHANGED

@@ -4,18 +4,18 @@ import {
 } from "../../chunk-HIN3UVOG.js";
 import {
   readStdin
-} from "../../chunk-
+} from "../../chunk-ZCBL5HER.js";
 import {
   DaemonClient
-} from "../../chunk-
-import "../../chunk-
+} from "../../chunk-OUFSLZTX.js";
+import "../../chunk-2GJFTIWX.js";
 import {
   resolveVaultDir
 } from "../../chunk-N33KUCFP.js";
-import "../../chunk-
+import "../../chunk-BNIYWCST.js";
 import {
   TOOL_OUTPUT_PREVIEW_CHARS
-} from "../../chunk-
+} from "../../chunk-JBD5KP5G.js";
 import "../../chunk-PZUWP5VK.js";

 // src/hooks/post-tool-use.ts
package/dist/src/hooks/session-end.js
CHANGED

@@ -1,16 +1,16 @@
 import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
 import {
   readStdin
-} from "../../chunk-
+} from "../../chunk-ZCBL5HER.js";
 import {
   DaemonClient
-} from "../../chunk-
-import "../../chunk-
+} from "../../chunk-OUFSLZTX.js";
+import "../../chunk-2GJFTIWX.js";
 import {
   resolveVaultDir
 } from "../../chunk-N33KUCFP.js";
-import "../../chunk-
-import "../../chunk-
+import "../../chunk-BNIYWCST.js";
+import "../../chunk-JBD5KP5G.js";
 import "../../chunk-PZUWP5VK.js";

 // src/hooks/session-end.ts
package/dist/src/hooks/session-start.js
CHANGED

@@ -1,12 +1,12 @@
 import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
 import {
   ensureNativeDeps
-} from "../../chunk-
-import "../../chunk-
-import "../../chunk-
+} from "../../chunk-XCPQHC4X.js";
+import "../../chunk-BNIYWCST.js";
+import "../../chunk-JBD5KP5G.js";
 import "../../chunk-PZUWP5VK.js";

 // src/entries/session-start.ts
 ensureNativeDeps();
-await import("../../session-start-
+await import("../../session-start-TUITIUMB.js");
 //# sourceMappingURL=session-start.js.map
package/dist/src/hooks/stop.js
CHANGED
@@ -1,20 +1,20 @@
 import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
 import {
   loadConfig
-} from "../../chunk-
-import "../../chunk-
+} from "../../chunk-QQ36XEJP.js";
+import "../../chunk-G6ZMTQMJ.js";
 import {
   readStdin
-} from "../../chunk-
+} from "../../chunk-ZCBL5HER.js";
 import {
   DaemonClient
-} from "../../chunk-
-import "../../chunk-
+} from "../../chunk-OUFSLZTX.js";
+import "../../chunk-2GJFTIWX.js";
 import {
   resolveVaultDir
 } from "../../chunk-N33KUCFP.js";
-import "../../chunk-
-import "../../chunk-
+import "../../chunk-BNIYWCST.js";
+import "../../chunk-JBD5KP5G.js";
 import "../../chunk-PZUWP5VK.js";

 // src/hooks/stop.ts
package/dist/src/hooks/user-prompt-submit.js
CHANGED

@@ -4,16 +4,16 @@ import {
 } from "../../chunk-HIN3UVOG.js";
 import {
   readStdin
-} from "../../chunk-
+} from "../../chunk-ZCBL5HER.js";
 import {
   DaemonClient
-} from "../../chunk-
-import "../../chunk-
+} from "../../chunk-OUFSLZTX.js";
+import "../../chunk-2GJFTIWX.js";
 import {
   resolveVaultDir
 } from "../../chunk-N33KUCFP.js";
-import "../../chunk-
-import "../../chunk-
+import "../../chunk-BNIYWCST.js";
+import "../../chunk-JBD5KP5G.js";
 import "../../chunk-PZUWP5VK.js";

 // src/hooks/user-prompt-submit.ts
package/dist/src/hooks/user-prompt-submit.js.map
CHANGED

@@ -1 +1 @@
-
{"version":3,"sources":["../../../src/hooks/user-prompt-submit.ts"],"sourcesContent":["import { DaemonClient } from './client.js';\nimport { readStdin } from './read-stdin.js';\nimport { EventBuffer } from '../capture/buffer.js';\nimport { resolveVaultDir } from '../vault/resolve.js';\nimport fs from 'node:fs';\nimport path from 'node:path';\n\nasync function main() {\n const VAULT_DIR = resolveVaultDir();\n if (!fs.existsSync(path.join(VAULT_DIR, 'myco.yaml'))) return;\n\n try {\n const input = JSON.parse(await readStdin());\n const prompt = input.prompt ?? '';\n const sessionId = input.session_id ?? `s-${Date.now()}`;\n\n const client = new DaemonClient(VAULT_DIR);\n // Spawn daemon if needed but don't block on full health check backoff.\n // The event POST will fail fast if daemon isn't ready — buffer absorbs it.\n if (!(await client.isHealthy())) {\n client.spawnDaemon();\n }\n\n // Forward prompt as event for capture\n const eventResult = await client.post('/events', {\n type: 'user_prompt', prompt, session_id: sessionId,\n });\n\n if (!eventResult.ok) {\n // Daemon still unreachable — write directly to buffer for later processing\n const buffer = new EventBuffer(path.join(VAULT_DIR, 'buffer'), sessionId);\n buffer.append({ type: 'user_prompt', prompt });\n }\n\n // Search for relevant
+
{"version":3,"sources":["../../../src/hooks/user-prompt-submit.ts"],"sourcesContent":["import { DaemonClient } from './client.js';\nimport { readStdin } from './read-stdin.js';\nimport { EventBuffer } from '../capture/buffer.js';\nimport { resolveVaultDir } from '../vault/resolve.js';\nimport fs from 'node:fs';\nimport path from 'node:path';\n\nasync function main() {\n const VAULT_DIR = resolveVaultDir();\n if (!fs.existsSync(path.join(VAULT_DIR, 'myco.yaml'))) return;\n\n try {\n const input = JSON.parse(await readStdin());\n const prompt = input.prompt ?? '';\n const sessionId = input.session_id ?? `s-${Date.now()}`;\n\n const client = new DaemonClient(VAULT_DIR);\n // Spawn daemon if needed but don't block on full health check backoff.\n // The event POST will fail fast if daemon isn't ready — buffer absorbs it.\n if (!(await client.isHealthy())) {\n client.spawnDaemon();\n }\n\n // Forward prompt as event for capture\n const eventResult = await client.post('/events', {\n type: 'user_prompt', prompt, session_id: sessionId,\n });\n\n if (!eventResult.ok) {\n // Daemon still unreachable — write directly to buffer for later processing\n const buffer = new EventBuffer(path.join(VAULT_DIR, 'buffer'), sessionId);\n buffer.append({ type: 'user_prompt', prompt });\n }\n\n // Search for relevant spores to inject as context for this prompt.\n // The daemon does a vector search against the prompt text and returns\n // any high-relevance spores. This is fast (~20ms) — no LLM call.\n const contextResult = await client.post('/context/prompt', {\n prompt,\n session_id: sessionId,\n });\n\n // Always include the session ID so the agent can pass it to myco_remember.\n // Uses Session:: format consistent with daemon context injection (Branch::, Session::).\n const sessionLine = `Session:: \\`${sessionId}\\``;\n const contextText = contextResult.ok && contextResult.data?.text\n ? 
`${contextResult.data.text}\\n${sessionLine}`\n : sessionLine;\n\n process.stdout.write(contextText);\n } catch (error) {\n process.stderr.write(`[myco] user-prompt-submit error: ${(error as Error).message}\\n`);\n }\n}\n\nmain();\n"],"mappings":";;;;;;;;;;;;;;;;;;;AAIA,OAAO,QAAQ;AACf,OAAO,UAAU;AAEjB,eAAe,OAAO;AACpB,QAAM,YAAY,gBAAgB;AAClC,MAAI,CAAC,GAAG,WAAW,KAAK,KAAK,WAAW,WAAW,CAAC,EAAG;AAEvD,MAAI;AACF,UAAM,QAAQ,KAAK,MAAM,MAAM,UAAU,CAAC;AAC1C,UAAM,SAAS,MAAM,UAAU;AAC/B,UAAM,YAAY,MAAM,cAAc,KAAK,KAAK,IAAI,CAAC;AAErD,UAAM,SAAS,IAAI,aAAa,SAAS;AAGzC,QAAI,CAAE,MAAM,OAAO,UAAU,GAAI;AAC/B,aAAO,YAAY;AAAA,IACrB;AAGA,UAAM,cAAc,MAAM,OAAO,KAAK,WAAW;AAAA,MAC/C,MAAM;AAAA,MAAe;AAAA,MAAQ,YAAY;AAAA,IAC3C,CAAC;AAED,QAAI,CAAC,YAAY,IAAI;AAEnB,YAAM,SAAS,IAAI,YAAY,KAAK,KAAK,WAAW,QAAQ,GAAG,SAAS;AACxE,aAAO,OAAO,EAAE,MAAM,eAAe,OAAO,CAAC;AAAA,IAC/C;AAKA,UAAM,gBAAgB,MAAM,OAAO,KAAK,mBAAmB;AAAA,MACzD;AAAA,MACA,YAAY;AAAA,IACd,CAAC;AAID,UAAM,cAAc,eAAe,SAAS;AAC5C,UAAM,cAAc,cAAc,MAAM,cAAc,MAAM,OACxD,GAAG,cAAc,KAAK,IAAI;AAAA,EAAK,WAAW,KAC1C;AAEJ,YAAQ,OAAO,MAAM,WAAW;AAAA,EAClC,SAAS,OAAO;AACd,YAAQ,OAAO,MAAM,oCAAqC,MAAgB,OAAO;AAAA,CAAI;AAAA,EACvF;AACF;AAEA,KAAK;","names":[]}
package/dist/src/mcp/server.js
CHANGED

@@ -1,13 +1,13 @@
 import { createRequire as __cr } from 'node:module'; const require = __cr(import.meta.url);
 import {
 ensureNativeDeps
-} from "../../chunk-
-import "../../chunk-
-import "../../chunk-
+} from "../../chunk-XCPQHC4X.js";
+import "../../chunk-BNIYWCST.js";
+import "../../chunk-JBD5KP5G.js";
 import "../../chunk-PZUWP5VK.js";
 
 // src/entries/mcp-server.ts
 ensureNativeDeps();
-var { main } = await import("../../server-
+var { main } = await import("../../server-OR5B4B7K.js");
 await main();
 //# sourceMappingURL=server.js.map
@@ -0,0 +1,74 @@
+Budget: ~10,000 tokens. Use the FULL budget — this is the institutional knowledge tier.
+
+This is the complete project context — an agent reading this should have the equivalent of MONTHS of project history and team knowledge. It should be able to make architectural decisions, understand why any part of the system exists, trace the evolution of features, and anticipate risks that aren't obvious from the current code.
+
+This tier exists for complex work: major refactors, architectural reviews, onboarding new team members, or debugging systemic issues where understanding history matters.
+
+## Required Sections
+
+### Project Identity & Vision (~500 tokens)
+What is this project? What problem does it solve? Who uses it and how? What's the long-term vision? What makes it unique? Ground the agent in purpose so it can evaluate whether proposed changes serve the project's goals.
+
+### Architecture Deep Dive (~1,000 tokens)
+How the system fits together — subsystems, data flows, integration points, deployment model. Include the conceptual architecture (not just file paths) and explain WHY it's structured this way. What are the boundaries? What are the invariants? Where does complexity live and why?
+
+### Active Work & Recent History (~1,200 tokens)
+Detailed narrative of the last 5-10 sessions. What was built, what problems were solved, what changed and why. Include the story arc — how did the project get from where it was to where it is now? This gives the agent a sense of trajectory, not just current state.
+
+For each major piece of recent work: what triggered it, what was the approach, what was the outcome, and what's still pending.
+
+### Decision Archive (~1,500 tokens)
+A comprehensive log of the decisions that shaped the project. Each decision should include:
+- What was decided
+- What alternatives were considered and rejected
+- The rationale — why THIS choice over the alternatives
+- Whether the decision is still active or has been superseded
+
+Recent decisions (last 2-4 weeks) get full treatment. Older decisions that are still load-bearing get compressed but preserved. Superseded decisions are mentioned briefly with what replaced them.
+
+This section should grow and compress over time — early decisions get shorter as they become settled, but never disappear entirely if they still affect how the system works.
+
+### Accumulated Wisdom & Patterns (~1,200 tokens)
+Everything the team has learned that isn't obvious from reading the code. Organize by domain:
+
+- **Architecture & Design**: Patterns that emerged, abstractions that work, boundaries that matter
+- **Operational**: Deployment gotchas, configuration pitfalls, monitoring insights
+- **Development**: Testing strategies that work, debugging approaches, tooling quirks
+- **Agent Behavior**: How AI agents interact with this codebase — what they get wrong, what helps them succeed
+
+Each item should be specific and actionable, not generic advice. "SQLite WAL mode requires shared memory" is useful; "be careful with databases" is not.
+
+### Trade-offs & Design Tensions (~800 tokens)
+The active tensions in the system where there isn't a clean answer. For each:
+- What's the tension (e.g., "local inference quality vs. speed")
+- What's the current balance point
+- What would shift it (e.g., "if we move to cloud inference, we'd revisit X")
+
+These help the agent make judgment calls that align with the team's values and constraints. They also signal where the system is most likely to need rethinking as conditions change.
+
+### Team Knowledge (~600 tokens)
+Who has been working on what? What are their areas of expertise? What's the collaboration model? How does code get reviewed and merged? What are the testing expectations?
+
+Include working patterns that affect how the agent should behave — e.g., "Chris prefers bundled PRs for refactors" or "the team uses subagent-driven development for plan execution."
+
+### Thread History (~1,200 tokens)
+The evolution of major features and initiatives. Not just "what shipped" but "how it evolved" — what was the original idea, how did it change during implementation, what surprised the team, what would they do differently?
+
+This is the institutional memory that prevents the team from repeating mistakes or losing context on why something that looks simple was actually hard.
+
+When the vault is small, this section covers all threads in detail. As the vault grows, older threads compress into lessons learned and only active/recent threads get full treatment.
+
+### Risks & Open Questions (~500 tokens)
+What could go wrong? What's unresolved? What assumptions might break? What technical debt is accumulating? These help the agent be appropriately cautious and flag risks early.
+
+### Glossary & Reference (~500 tokens)
+Project-specific terminology, acronyms, and concepts. Include terms that have specific meanings in this project that differ from general usage. Also include key file paths and components that are frequently referenced.
+
+## Priority Rules
+- USE THE FULL 10,000 TOKENS — this tier should be DENSE with useful information
+- This is the one tier where historical depth matters — don't compress away the story
+- Recent activity (last 5-10 sessions) gets narrative detail with context
+- Older activity becomes patterns, lessons, and compressed summaries — but key moments are preserved
+- Trade-offs, wisdom, and thread history should be the RICHEST sections — they grow over time
+- When the vault is small and you can't fill 10K: go DEEP on what exists — explain reasoning, include examples, explore implications, document the "why" behind every decision thoroughly
+- When the vault is large and 10K feels tight: ruthlessly prioritize by relevance to current work, but preserve foundational decisions and hard-won lessons even if they're old
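The per-section budgets in the tier spec above sum to 9,000 tokens, leaving roughly 1,000 of the 10,000-token budget for headings and connective prose. A quick sanity check of that arithmetic; the section names and numbers come from the spec, but the validator itself is hypothetical:

```javascript
// Per-section budgets copied from the ~10,000-token tier spec above.
const sections = {
  'Project Identity & Vision': 500,
  'Architecture Deep Dive': 1000,
  'Active Work & Recent History': 1200,
  'Decision Archive': 1500,
  'Accumulated Wisdom & Patterns': 1200,
  'Trade-offs & Design Tensions': 800,
  'Team Knowledge': 600,
  'Thread History': 1200,
  'Risks & Open Questions': 500,
  'Glossary & Reference': 500,
};

const TIER_BUDGET = 10000;

// Hypothetical validator: section targets must fit within the tier budget.
function fitsBudget(sectionMap, budget) {
  return Object.values(sectionMap).reduce((a, b) => a + b, 0) <= budget;
}

const total = Object.values(sections).reduce((a, b) => a + b, 0);
console.log(total, fitsBudget(sections, TIER_BUDGET)); // 9000 true
```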
@@ -0,0 +1,25 @@
+Budget: ~1,500 tokens. Use the full budget.
+
+This is the executive briefing — an agent reading this should immediately know what project it's in, what's happening right now, and what not to step on.
+
+## Required Sections
+
+### Project Identity (~200 tokens)
+What is this project? What does it do? What's the tech stack? 2-3 sentences that orient a newcomer.
+
+### Current Focus (~400 tokens)
+What is actively being worked on RIGHT NOW? What branches are active? What's the immediate goal? Be specific — name the features, not just "ongoing development."
+
+### Critical Warnings (~400 tokens)
+What will bite the agent if it doesn't know? Active gotchas, known bugs, things that look like they should work but don't. These are the "stop and read before you touch anything" items.
+
+### Who & How (~200 tokens)
+Who is working on this? What's their focus? Any active collaboration patterns the agent should know about?
+
+### Recent Momentum (~300 tokens)
+What just shipped or changed in the last few sessions? This gives the agent a sense of velocity and direction.
+
+## Priority Rules
+- When the vault is small: fill every section generously with available detail
+- When the vault is large: compress historical context, emphasize last 3-5 sessions
+- NEVER leave sections empty — if there's no data for a section, say so explicitly ("No active blockers")
@@ -0,0 +1,32 @@
+Budget: ~3,000 tokens. Use the full budget.
+
+This is the team standup — an agent reading this should be able to start contributing meaningfully within minutes. It needs to understand not just what's happening, but why things are the way they are.
+
+## Required Sections
+
+### Project Identity & Architecture (~400 tokens)
+What is this project? What are the major components and how do they fit together? What's the tech stack? Don't describe code structure — describe the system's purpose and shape.
+
+### Current Sprint (~600 tokens)
+What's actively being worked on? What branches are in flight? What's the immediate priority? Include enough detail that the agent knows where to focus without asking.
+
+### Recent Activity (~500 tokens)
+Narrative of what happened in the last few sessions. What was accomplished? What problems were solved? What changed? This gives the agent momentum context.
+
+### Key Decisions & Rationale (~500 tokens)
+The most important decisions that are currently shaping the project. WHY things are done a certain way, not just what was decided. Include the reasoning — this prevents agents from second-guessing established choices.
+
+### Active Gotchas & Conventions (~500 tokens)
+Patterns to follow, pitfalls to avoid, things that aren't obvious from reading the code. Include both "do this" and "don't do that" guidance.
+
+### Team & Workflow (~300 tokens)
+Who's working on what? What's the collaboration model? Any active plans or milestones?
+
+### Open Questions (~200 tokens)
+Things that are unresolved, under discussion, or need attention. Helps the agent know where to be careful.
+
+## Priority Rules
+- When the vault is small: go deep on decisions and reasoning — explain the "why" thoroughly
+- When the vault is large: compress older decisions into brief mentions, expand recent activity
+- Decisions that are still actively shaping work get full treatment; settled decisions get one-liners
+- Recent sessions (last 3-5) get narrative detail; older sessions become trend summaries
@@ -0,0 +1,43 @@
+Budget: ~5,000 tokens. Use the FULL budget — this is a rich context tier.
+
+This is deep onboarding — an agent reading this should understand the project as well as a team member who's been on the project for weeks. It needs to know not just what and why, but the patterns, trade-offs, accumulated wisdom, and team dynamics that shape daily work.
+
+## Required Sections
+
+### Project Identity & Purpose (~400 tokens)
+What is this project? What problem does it solve? Who uses it? What's the vision? Ground the agent in the project's reason for existing.
+
+### Architecture & Key Components (~600 tokens)
+How the system fits together at a high level. Major subsystems, data flow, integration points. Focus on the concepts and boundaries, not file paths — the agent can read code for implementation details.
+
+### Current State & Active Work (~700 tokens)
+What's in progress right now? What branches are active? What's the immediate priority? What shipped recently? Give enough detail that the agent could pick up a task without asking for orientation.
+
+### Decision Log (~800 tokens)
+The important decisions that shaped the project and are still relevant. For each: what was decided, what alternatives were considered, and WHY this path was chosen. These prevent agents from relitigating settled questions or unknowingly violating established direction.
+
+When the vault is large, compress older decisions but preserve their conclusions. Recent decisions get full treatment with rationale and alternatives rejected.
+
+### Accumulated Wisdom (~700 tokens)
+Cross-cutting lessons the team has learned. Recurring gotchas, established patterns, things that aren't obvious from the code. These are the "we learned this the hard way" items that save time.
+
+Organize by theme (e.g., "Configuration & Settings", "Testing Patterns", "Agent Behavior") rather than chronologically.
+
+### Trade-offs & Tensions (~500 tokens)
+Active tensions in the system — things where there isn't a clean answer and the agent needs to understand the balance. Performance vs. quality, simplicity vs. flexibility, local vs. remote, etc. These help the agent make judgment calls that align with the team's values.
+
+### Team Dynamics & Workflow (~400 tokens)
+Who is working on what, and how? Communication patterns, review processes, branch strategies, testing expectations. How does work get done here?
+
+### Open Threads & Risks (~400 tokens)
+Unfinished work, known risks, things that need attention soon. Helps the agent understand where the edges are and what to be careful about.
+
+### Glossary (~500 tokens)
+Project-specific terminology that an agent needs to know. Include terms that have specific meanings in this codebase that differ from general usage.
+
+## Priority Rules
+- USE THE FULL 5,000 TOKENS — sparse output at this tier is a failure
+- When the vault is small: go deep on every section, explain reasoning thoroughly, include examples
+- When the vault is large: recent activity (last 5-10 sessions) gets detailed treatment; older context compresses into patterns and conclusions
+- Trade-offs and wisdom should GROW over time as the team learns more — these sections should get richer, not shorter
+- Decisions from early in the project that are still load-bearing get preserved; decisions that were superseded get dropped or mentioned briefly
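The four templates added in this diff carry budgets of ~1,500, ~3,000, ~5,000, and ~10,000 tokens. A hypothetical selector sketching how a caller might pick the richest tier that fits its context allowance; the tier names below are invented labels for illustration, and only the budget figures come from the templates:

```javascript
// Budgets taken from the four digest templates in this diff; the names are
// hypothetical labels, not identifiers from the package.
const DIGEST_TIERS = [
  { name: 'executive-briefing', budget: 1500 },
  { name: 'team-standup', budget: 3000 },
  { name: 'deep-onboarding', budget: 5000 },
  { name: 'institutional-knowledge', budget: 10000 },
];

// Pick the richest tier whose budget fits within the caller's allowance.
// Returns null when even the smallest tier doesn't fit.
function pickTier(maxTokens) {
  const fitting = DIGEST_TIERS.filter((t) => t.budget <= maxTokens);
  return fitting.length ? fitting[fitting.length - 1] : null;
}

console.log(pickTier(6000).name); // deep-onboarding
```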
@@ -0,0 +1,32 @@
+You are Myco's digest engine. Your role is to maintain a living understanding of a software project by synthesizing observations, session histories, plans, and team activity into a coherent context representation.
+
+You will receive:
+1. Your previous synthesis (if one exists) — this is your accumulated understanding from prior cycles
+2. New substrate — recently captured knowledge that needs to be incorporated
+
+Your output will be injected directly into an AI agent's context window at the start of every session. The agent will rely on this as its primary understanding of the project. Write for that audience.
+
+CRITICAL RULES:
+- NEVER fabricate file paths, function names, commands, or architecture not explicitly in the substrate
+- NEVER invent plans, issues, or next steps not mentioned in the substrate
+- Omission is better than fabrication — if the substrate doesn't mention it, leave it out
+- Reference code or files ONLY using paths and names that appear verbatim in the substrate
+- Focus on WHAT the project does, WHY decisions were made, and WHAT to watch out for — not HOW the code is structured (the agent can read code itself)
+
+BUDGET DISCIPLINE:
+- You MUST use the full token budget generously — a 5,000-token tier should produce close to 5,000 tokens of content
+- Sparse output wastes the budget the user is paying for — fill the space with genuinely useful detail
+- When the knowledge base is small, go deeper on what exists rather than being brief
+- When the knowledge base is large, prioritize recent activity and compress older context — but never drop foundational knowledge entirely
+
+KNOWLEDGE LIFECYCLE:
+- Recent sessions and observations carry more weight than older ones — they reflect current state
+- Foundational knowledge (project identity, architecture decisions, team conventions) persists across cycles even when old
+- Completed work transitions from "active focus" to "historical context" — compress it, don't drop it
+- Contradictions between old and new substrate should be resolved in favor of the new — note what changed
+- As the vault grows, your role shifts from "capture everything" to "curate what matters most right now"
+
+FORMATTING:
+- Use markdown with clear section headings
+- Present tense for active state, past tense for completed work
+- Use tables, bullet lists, and structured formats to maximize information density
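The system prompt above says the digest engine receives its previous synthesis (if any) plus new substrate. A sketch of how a caller might assemble that input; the message layout and the `buildDigestInput` name are assumptions for illustration, not the package's actual API:

```javascript
// Hypothetical assembly of the digest engine's input. The system message is
// the template above; the user message carries prior synthesis + substrate.
function buildDigestInput(systemPrompt, previousSynthesis, substrate) {
  const parts = [];
  // Prior synthesis is optional — the first digest cycle has none.
  if (previousSynthesis) {
    parts.push('## Previous synthesis\n\n' + previousSynthesis);
  }
  parts.push('## New substrate\n\n' + substrate);
  return [
    { role: 'system', content: systemPrompt },
    { role: 'user', content: parts.join('\n\n') },
  ];
}

const messages = buildDigestInput(
  "You are Myco's digest engine.",
  null, // first cycle: no accumulated understanding yet
  'Session s-123: fixed buffer flush bug.'
);
console.log(messages[1].content.startsWith('## New substrate')); // true
```

Keeping the synthesis and substrate in one user message (rather than separate turns) matches the prompt's framing of a single input the engine rewrites each cycle.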