@jlongo78/agent-spaces 0.7.5 → 0.7.6
This diff shows the changes between publicly available package versions as they appear in their respective public registries, and is provided for informational purposes only.
- package/.next/standalone/.claude/settings.local.json +55 -0
- package/.next/standalone/.claude/spaces-env.json +1 -0
- package/.next/standalone/.next/BUILD_ID +1 -1
- package/.next/standalone/.next/app-path-routes-manifest.json +2 -1
- package/.next/standalone/.next/build-manifest.json +5 -5
- package/.next/standalone/.next/prerender-manifest.json +27 -3
- package/.next/standalone/.next/required-server-files.json +19 -19
- package/.next/standalone/.next/routes-manifest.json +6 -0
- package/.next/standalone/.next/server/app/(desktop)/admin/analytics/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/admin/analytics/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/admin/users/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/admin/users/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/analytics/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/analytics/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/cortex/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/cortex/page/react-loadable-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/cortex/page.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/(desktop)/cortex/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/network/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/network/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/projects/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/projects/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/sessions/[id]/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/sessions/[id]/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/sessions/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/sessions/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/settings/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/settings/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/terminal/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/terminal/page.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/(desktop)/terminal/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/terminal/pane/[id]/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/terminal/pane/[id]/page.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/(desktop)/terminal/pane/[id]/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/terminal/remote/[nodeId]/[workspaceId]/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/terminal/remote/[nodeId]/[workspaceId]/page.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/(desktop)/terminal/remote/[nodeId]/[workspaceId]/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/(desktop)/workspaces/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/(desktop)/workspaces/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/_global-error/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/_global-error.html +2 -2
- package/.next/standalone/.next/server/app/_global-error.rsc +1 -1
- package/.next/standalone/.next/server/app/_global-error.segments/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/_global-error.segments/_full.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/_global-error.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/_global-error.segments/_index.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/_global-error.segments/_tree.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/_not-found/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/_not-found/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/_not-found.html +1 -1
- package/.next/standalone/.next/server/app/_not-found.rsc +2 -2
- package/.next/standalone/.next/server/app/_not-found.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/_not-found.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/_not-found.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/_not-found.segments/_not-found/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/_not-found.segments/_not-found.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/_not-found.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/admin/analytics.html +1 -1
- package/.next/standalone/.next/server/app/admin/analytics.rsc +7 -6
- package/.next/standalone/.next/server/app/admin/analytics.segments/!KGRlc2t0b3Ap/admin/analytics/__PAGE__.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/admin/analytics.segments/!KGRlc2t0b3Ap/admin/analytics.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/admin/analytics.segments/!KGRlc2t0b3Ap/admin.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/admin/analytics.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/admin/analytics.segments/_full.segment.rsc +7 -6
- package/.next/standalone/.next/server/app/admin/analytics.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/admin/analytics.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/admin/analytics.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/admin/users.html +1 -1
- package/.next/standalone/.next/server/app/admin/users.rsc +2 -2
- package/.next/standalone/.next/server/app/admin/users.segments/!KGRlc2t0b3Ap/admin/users/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/admin/users.segments/!KGRlc2t0b3Ap/admin/users.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/admin/users.segments/!KGRlc2t0b3Ap/admin.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/admin/users.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/admin/users.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/admin/users.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/admin/users.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/admin/users.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/analytics.html +1 -1
- package/.next/standalone/.next/server/app/analytics.rsc +3 -3
- package/.next/standalone/.next/server/app/analytics.segments/!KGRlc2t0b3Ap/analytics/__PAGE__.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/analytics.segments/!KGRlc2t0b3Ap/analytics.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/analytics.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/analytics.segments/_full.segment.rsc +3 -3
- package/.next/standalone/.next/server/app/analytics.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/analytics.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/analytics.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/api/analytics/overview/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/bulk/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/config/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/context/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/curation/assess/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/curation/publish/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/curation/refine/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/curation/review/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/curation/seed/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/export/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/federation/pending/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/federation/resolve/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/federation/search/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/federation/teach/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/graph/edges/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/graph/entities/[id]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/graph/entities/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/graph/populate/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/import/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/import/status/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/ingest/bootstrap/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/ingest/status/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/knowledge/[id]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/knowledge/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/lobes/[id]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/lobes/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/lobes/share/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/marketplace/browse/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/marketplace/preview/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/mcp/call/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/mcp/tools/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/search/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/settings/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/status/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/timeline/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/usage/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/cortex/workspace/[id]/context/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/events/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/folders/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/network/handshake/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/network/projects/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/network/search/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/network/sessions/[id]/messages/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/network/sessions/[id]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/network/sessions/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/network/workspaces/[id]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/network/workspaces/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/panes/[id]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/panes/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/projects/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/search/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/sessions/[id]/chat/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/sessions/[id]/messages/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/sessions/[id]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/sessions/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/sync/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/tags/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/tier/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/workspaces/[id]/context/[key]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/workspaces/[id]/context/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/workspaces/[id]/messages/[msgId]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/workspaces/[id]/messages/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/workspaces/[id]/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/workspaces/[id]/sessions/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/api/workspaces/route.js.nft.json +1 -1
- package/.next/standalone/.next/server/app/cortex.html +1 -1
- package/.next/standalone/.next/server/app/cortex.rsc +3 -3
- package/.next/standalone/.next/server/app/cortex.segments/!KGRlc2t0b3Ap/cortex/__PAGE__.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/cortex.segments/!KGRlc2t0b3Ap/cortex.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/cortex.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/cortex.segments/_full.segment.rsc +3 -3
- package/.next/standalone/.next/server/app/cortex.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/cortex.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/cortex.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/login/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/login/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/login.html +1 -1
- package/.next/standalone/.next/server/app/login.rsc +2 -2
- package/.next/standalone/.next/server/app/login.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/login.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/login.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/login.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/login.segments/login/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/login.segments/login.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/m/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/m/projects/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/m/projects/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/m/projects.html +1 -1
- package/.next/standalone/.next/server/app/m/projects.rsc +2 -2
- package/.next/standalone/.next/server/app/m/projects.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/projects.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/projects.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/projects.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/projects.segments/m/projects/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/projects.segments/m/projects.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/projects.segments/m.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/sessions/[id]/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/m/sessions/[id]/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/m/sessions/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/m/sessions/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/m/sessions.html +1 -1
- package/.next/standalone/.next/server/app/m/sessions.rsc +2 -2
- package/.next/standalone/.next/server/app/m/sessions.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/sessions.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/sessions.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/sessions.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/sessions.segments/m/sessions/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/sessions.segments/m/sessions.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/sessions.segments/m.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/settings/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/m/settings/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/m/settings.html +1 -1
- package/.next/standalone/.next/server/app/m/settings.rsc +2 -2
- package/.next/standalone/.next/server/app/m/settings.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/settings.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/settings.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/settings.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/settings.segments/m/settings/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/settings.segments/m/settings.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/settings.segments/m.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/terminal/page/build-manifest.json +3 -3
- package/.next/standalone/.next/server/app/m/terminal/page_client-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/app/m/terminal.html +1 -1
- package/.next/standalone/.next/server/app/m/terminal.rsc +2 -2
- package/.next/standalone/.next/server/app/m/terminal.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/terminal.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/terminal.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/terminal.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m/terminal.segments/m/terminal/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/terminal.segments/m/terminal.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m/terminal.segments/m.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m.html +1 -1
- package/.next/standalone/.next/server/app/m.rsc +2 -2
- package/.next/standalone/.next/server/app/m.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/m.segments/m/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/m.segments/m.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/network.html +1 -1
- package/.next/standalone/.next/server/app/network.rsc +2 -2
- package/.next/standalone/.next/server/app/network.segments/!KGRlc2t0b3Ap/network/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/network.segments/!KGRlc2t0b3Ap/network.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/network.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/network.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/network.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/network.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/network.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/projects.html +1 -1
- package/.next/standalone/.next/server/app/projects.rsc +2 -2
- package/.next/standalone/.next/server/app/projects.segments/!KGRlc2t0b3Ap/projects/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/projects.segments/!KGRlc2t0b3Ap/projects.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/projects.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/projects.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/projects.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/projects.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/projects.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/sessions.html +1 -1
- package/.next/standalone/.next/server/app/sessions.rsc +2 -2
- package/.next/standalone/.next/server/app/sessions.segments/!KGRlc2t0b3Ap/sessions/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/sessions.segments/!KGRlc2t0b3Ap/sessions.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/sessions.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/sessions.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/sessions.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/sessions.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/sessions.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/settings.html +1 -1
- package/.next/standalone/.next/server/app/settings.rsc +2 -2
- package/.next/standalone/.next/server/app/settings.segments/!KGRlc2t0b3Ap/settings/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/settings.segments/!KGRlc2t0b3Ap/settings.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/settings.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/settings.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/settings.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/settings.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/settings.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/terminal.html +1 -1
- package/.next/standalone/.next/server/app/terminal.rsc +3 -3
- package/.next/standalone/.next/server/app/terminal.segments/!KGRlc2t0b3Ap/terminal/__PAGE__.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/terminal.segments/!KGRlc2t0b3Ap/terminal.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/terminal.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/terminal.segments/_full.segment.rsc +3 -3
- package/.next/standalone/.next/server/app/terminal.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/terminal.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/terminal.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/vr/page/app-paths-manifest.json +3 -0
- package/.next/standalone/.next/server/app/vr/page/build-manifest.json +18 -0
- package/.next/standalone/.next/server/app/vr/page/next-font-manifest.json +11 -0
- package/.next/standalone/.next/server/app/vr/page/react-loadable-manifest.json +11 -0
- package/.next/standalone/.next/server/app/vr/page/server-reference-manifest.json +4 -0
- package/.next/standalone/.next/server/app/vr/page.js +17 -0
- package/.next/standalone/.next/server/app/vr/page.js.map +5 -0
- package/.next/standalone/.next/server/app/vr/page.js.nft.json +1 -0
- package/.next/standalone/.next/server/app/vr/page_client-reference-manifest.js +2 -0
- package/.next/standalone/.next/server/app/vr.html +1 -0
- package/.next/standalone/.next/server/app/vr.meta +15 -0
- package/.next/standalone/.next/server/app/vr.rsc +21 -0
- package/.next/standalone/.next/server/app/vr.segments/_full.segment.rsc +21 -0
- package/.next/standalone/.next/server/app/vr.segments/_head.segment.rsc +6 -0
- package/.next/standalone/.next/server/app/vr.segments/_index.segment.rsc +6 -0
- package/.next/standalone/.next/server/app/vr.segments/_tree.segment.rsc +4 -0
- package/.next/standalone/.next/server/app/vr.segments/vr/__PAGE__.segment.rsc +9 -0
- package/.next/standalone/.next/server/app/vr.segments/vr.segment.rsc +4 -0
- package/.next/standalone/.next/server/app/workspaces.html +1 -1
- package/.next/standalone/.next/server/app/workspaces.rsc +2 -2
- package/.next/standalone/.next/server/app/workspaces.segments/!KGRlc2t0b3Ap/workspaces/__PAGE__.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/workspaces.segments/!KGRlc2t0b3Ap/workspaces.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/workspaces.segments/!KGRlc2t0b3Ap.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/workspaces.segments/_full.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/workspaces.segments/_head.segment.rsc +1 -1
- package/.next/standalone/.next/server/app/workspaces.segments/_index.segment.rsc +2 -2
- package/.next/standalone/.next/server/app/workspaces.segments/_tree.segment.rsc +2 -2
- package/.next/standalone/.next/server/app-paths-manifest.json +2 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__0041efe4._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__00bf0ace._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__08a68343._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__0add852f._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__0c113ed0._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__0e1a27e0._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__0e71d908._.js +3 -3
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__0e9142f3._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__10e47926._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__1194f2c1._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__1665dc78._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__175cbabf._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__19c2d094._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__1adae357._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__1d359752._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__1e8fabeb._.js +3 -3
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__1f8deca0._.js +8 -8
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__253fdda1._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__28e6434f._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__2a386564._.js +3 -3
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__2acbd703._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__2acefabb._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__2c20fb38._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__309132cd._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__33fec964._.js +3 -3
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__3786d8ae._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__3ae92407._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__3beda9fe._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__3e3f25a1._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__4619e9bd._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__4a051043._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__50208a5f._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__508002e4._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__5086c373._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__5913e097._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__5b5f68d2._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__5c1f2459._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__5ec8c977._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__5f8c694a._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__63cebc6c._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__64d30d4d._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__6c54fc2e._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__6dc1fb7e._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__6e568102._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__6faa04c0._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__727d05f1._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__74a34dc3._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__75d12b32._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__7e7250a4._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__8309e0a4._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__86cc0e2b._.js +6 -6
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__8915603e._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__89c2565a._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__8d178ad9._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__93ee06f3._.js +3 -3
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__9e4c154a._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__a1fbc199._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__a9d2e1d3._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__ae53d343._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__b3a04cef._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__b4270b77._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__b6b6ce60._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__b9545dd9._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__c200e21a._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__c3c74ca4._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__c88b63f7._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__cba5f007._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__cbf4ceb0._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__cefdba2f._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__cf9e82bb._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__d15515e3._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__d2897392._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__d3b2d856._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__d73273ca._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__d8417eb6._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__db4726bc._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__dc2a55de._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__dc6e2e5f._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__e0d4690b._.js +3 -3
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__e3ecfd17._.js +3 -3
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__e678dd53._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__e9223f55._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__ea630076._.js +3 -3
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__eb8acb65._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__f26ca49d._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__f33e1101._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__f3a4c668._.js +1 -1
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__f515f865._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__fceb5d60._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__fed41403._.js +2 -2
- package/.next/standalone/.next/server/chunks/[root-of-the-server]__ff2e98c2._.js +2 -2
- package/.next/standalone/.next/server/chunks/node_modules_next_dist_esm_build_templates_app-route_339169c8.js +1 -1
- package/.next/standalone/.next/server/chunks/node_modules_next_dist_esm_build_templates_app-route_97dac613.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__0d8d81ca._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__1425c64f._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__1d2ce8f1._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__31137509._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__3633a587._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__3c79441b._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__4ca0f26b._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__5b90d3ad._.js +3 -0
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__62a0b363._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__66aca5d4._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__68205a46._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__69fd2efa._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__85dcf0f7._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__8c53a5da._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__aecb1873._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__b02cd143._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__b9bcde11._.js +3 -0
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__cac90169._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__d25de2f0._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__e2f86be8._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__ee626b5b._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__f39a9e98._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__f3c566cd._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/[root-of-the-server]__f76aa221._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_149d7fd4._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_2e0dd6a7._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_3cd2355c._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_3d206597._.js +4 -0
- package/.next/standalone/.next/server/chunks/ssr/_47cc9af0._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_5cf334fd._.js +3 -0
- package/.next/standalone/.next/server/chunks/ssr/_7082788b._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_7154d8ae._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_75bb1b9a._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/{_aeeff784._.js → _81abf587._.js} +2 -2
- package/.next/standalone/.next/server/chunks/ssr/_8acf81e2._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_8c36feb8._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_91e9bb86._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_ac4c1838._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_ad8515fc._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_b1f49e81._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_c0fe7614._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_d4825f5a._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_da10a9f4._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_db0abd0a._.js +3 -0
- package/.next/standalone/.next/server/chunks/ssr/_dee5d4a1._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_ef482c0c._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_efe43d2f._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_f4a4e116._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_f4d525d2._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/_f4e57187._.js +3 -0
- package/.next/standalone/.next/server/chunks/ssr/_next-internal_server_app_vr_page_actions_3fb70d92.js +3 -0
- package/.next/standalone/.next/server/chunks/ssr/node_modules_32f9d62f._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/node_modules_next_dist_esm_build_templates_app-page_02f39477.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/node_modules_next_dist_esm_eedfc1fd._.js +1 -1
- package/.next/standalone/.next/server/chunks/ssr/src_40fa36ce._.js +7 -0
- package/.next/standalone/.next/server/chunks/ssr/src_app_(desktop)_cortex_page_tsx_0f33d8b3._.js +3 -0
- package/.next/standalone/.next/server/edge/chunks/[root-of-the-server]__32a0045c._.js +1 -1
- package/.next/standalone/.next/server/edge/chunks/_d73df637._.js +1 -1
- package/.next/standalone/.next/server/middleware-build-manifest.js +3 -3
- package/.next/standalone/.next/server/middleware-manifest.json +5 -5
- package/.next/standalone/.next/server/next-font-manifest.js +1 -1
- package/.next/standalone/.next/server/next-font-manifest.json +4 -0
- package/.next/standalone/.next/server/pages/404.html +1 -1
- package/.next/standalone/.next/server/pages/500.html +2 -2
- package/.next/standalone/.next/server/server-reference-manifest.js +1 -1
- package/.next/standalone/.next/server/server-reference-manifest.json +1 -1
- package/.next/standalone/.next/static/chunks/045c83caa4d15373.js +1 -0
- package/.next/standalone/.next/static/chunks/07ea09e6024a523b.js +1 -0
- package/.next/standalone/.next/static/chunks/232d8aae4fefab70.js +1 -0
- package/.next/standalone/.next/static/chunks/2ad22562bb37ecad.js +1011 -0
- package/.next/standalone/.next/static/chunks/396eac60f496f178.js +1 -0
- package/.next/standalone/.next/static/chunks/{a4e5c700421eaa46.js → 412140a02893327a.js} +1 -1
- package/.next/standalone/.next/static/chunks/481cc11ae80b08b1.js +1 -0
- package/.next/standalone/.next/static/chunks/5325351ef49cb65f.js +1 -0
- package/.next/standalone/.next/static/chunks/559735e598ca3cbb.js +1 -0
- package/.next/standalone/.next/static/chunks/59c63d5af5cf3daf.js +1 -0
- package/.next/standalone/.next/static/chunks/5d5d7b0095dd52ae.js +1 -0
- package/.next/standalone/.next/static/chunks/6ae575967d091df4.js +1 -0
- package/.next/standalone/.next/static/chunks/7a7c0d9d875332a3.js +1 -0
- package/.next/standalone/.next/static/chunks/7f8455bb855a6c84.js +1 -0
- package/.next/standalone/.next/static/chunks/898f380eba90427a.js +1 -0
- package/.next/standalone/.next/static/chunks/95339e55722bb4ca.js +5 -0
- package/.next/standalone/.next/static/chunks/9cd594813c539df9.js +1 -0
- package/.next/standalone/.next/static/chunks/ad1423eed05d129b.js +1 -0
- package/.next/standalone/.next/static/chunks/ae7b146884c67d2a.js +1 -0
- package/.next/standalone/.next/static/chunks/b84072d72aa86417.js +1 -0
- package/.next/standalone/.next/static/chunks/c1a95aebf6725f64.css +3 -0
- package/.next/standalone/.next/static/chunks/c515eb77d9410aa0.js +5 -0
- package/.next/standalone/.next/static/chunks/{9899cf4c2bdbe61d.js → d9ae203a7f123546.js} +2 -2
- package/.next/standalone/.next/static/chunks/e23f20b51a75a5bb.js +757 -0
- package/.next/standalone/.next/static/chunks/fdc09bd135846960.js +1 -0
- package/.next/standalone/.next/static/chunks/ff0196911449e745.js +1 -0
- package/.next/standalone/.next/static/chunks/{turbopack-4c21186b79fb4c10.js → turbopack-e1a0994ed4af988c.js} +1 -1
- package/.next/standalone/.spaces/cortex-context.md +70 -0
- package/.next/standalone/bin/cortex-hook.sh +62 -62
- package/.next/standalone/bin/cortex-mcp.js +60 -60
- package/.next/standalone/docs/superpowers/plans/2026-03-13-cortex-wiring.md +1387 -1387
- package/.next/standalone/docs/superpowers/plans/2026-03-14-cortex-v2-entity-graph.md +1923 -1923
- package/.next/standalone/docs/superpowers/plans/2026-03-14-cortex-v2-knowledge-evolution.md +1113 -1113
- package/.next/standalone/docs/superpowers/plans/2026-03-15-cortex-v2-boundary-engine.md +853 -853
- package/.next/standalone/docs/superpowers/plans/2026-03-15-cortex-v2-context-engine.md +1274 -1274
- package/.next/standalone/docs/superpowers/plans/2026-03-15-cortex-v2-signal-ingestion.md +933 -933
- package/.next/standalone/docs/superpowers/plans/2026-03-16-cortex-lobes.md +1080 -1080
- package/.next/standalone/docs/superpowers/plans/2026-03-16-cortex-v2-gravity-system.md +768 -768
- package/.next/standalone/docs/superpowers/plans/2026-03-16-cortex-v2-ui.md +1108 -1108
- package/.next/standalone/docs/superpowers/plans/2026-03-18-cortex-ui-integration.md +1846 -1846
- package/.next/standalone/docs/superpowers/specs/2026-03-13-cortex-wiring-design.md +268 -268
- package/.next/standalone/docs/superpowers/specs/2026-03-14-cortex-v2-design.md +623 -623
- package/.next/standalone/docs/superpowers/specs/2026-03-16-cortex-lobes-design.md +263 -263
- package/.next/standalone/docs/superpowers/specs/2026-03-16-cortex-v2-ui-design.md +240 -240
- package/.next/standalone/docs/superpowers/specs/2026-03-18-cortex-ui-integration-design.md +341 -341
- package/.next/standalone/node_modules/@img/sharp-win32-x64/lib/sharp-win32-x64.node +0 -0
- package/.next/standalone/node_modules/@img/{sharp-linux-x64 → sharp-win32-x64}/package.json +39 -46
- package/.next/standalone/package.json +103 -102
- package/.next/standalone/server.js +1 -1
- package/.next/standalone/src/app/(desktop)/cortex/page.tsx +78 -78
- package/.next/standalone/src/app/api/cortex/context/route.ts +78 -78
- package/.next/standalone/src/app/api/cortex/curation/assess/route.ts +27 -27
- package/.next/standalone/src/app/api/cortex/curation/publish/route.ts +23 -23
- package/.next/standalone/src/app/api/cortex/curation/refine/route.ts +23 -23
- package/.next/standalone/src/app/api/cortex/curation/review/route.ts +29 -29
- package/.next/standalone/src/app/api/cortex/curation/seed/route.ts +23 -23
- package/.next/standalone/src/app/api/cortex/export/route.ts +40 -40
- package/.next/standalone/src/app/api/cortex/federation/pending/route.ts +20 -20
- package/.next/standalone/src/app/api/cortex/federation/resolve/route.ts +43 -43
- package/.next/standalone/src/app/api/cortex/federation/search/route.ts +35 -35
- package/.next/standalone/src/app/api/cortex/federation/teach/route.ts +76 -76
- package/.next/standalone/src/app/api/cortex/graph/edges/route.ts +112 -112
- package/.next/standalone/src/app/api/cortex/graph/entities/[id]/route.ts +73 -73
- package/.next/standalone/src/app/api/cortex/graph/entities/route.ts +75 -75
- package/.next/standalone/src/app/api/cortex/graph/populate/route.ts +203 -203
- package/.next/standalone/src/app/api/cortex/import/route.ts +75 -75
- package/.next/standalone/src/app/api/cortex/import/status/route.ts +15 -15
- package/.next/standalone/src/app/api/cortex/ingest/bootstrap/route.ts +29 -29
- package/.next/standalone/src/app/api/cortex/ingest/status/route.ts +15 -15
- package/.next/standalone/src/app/api/cortex/knowledge/[id]/route.ts +91 -91
- package/.next/standalone/src/app/api/cortex/knowledge/route.ts +93 -93
- package/.next/standalone/src/app/api/cortex/lobes/[id]/route.ts +67 -67
- package/.next/standalone/src/app/api/cortex/lobes/route.ts +22 -22
- package/.next/standalone/src/app/api/cortex/lobes/share/route.ts +80 -80
- package/.next/standalone/src/app/api/cortex/marketplace/browse/route.ts +43 -43
- package/.next/standalone/src/app/api/cortex/marketplace/preview/route.ts +46 -46
- package/.next/standalone/src/app/api/cortex/mcp/call/route.ts +11 -11
- package/.next/standalone/src/app/api/cortex/mcp/tools/route.ts +6 -6
- package/.next/standalone/src/app/api/cortex/search/route.ts +43 -43
- package/.next/standalone/src/app/api/cortex/settings/route.ts +33 -33
- package/.next/standalone/src/app/api/cortex/status/route.ts +169 -169
- package/.next/standalone/src/app/api/cortex/timeline/route.ts +42 -42
- package/.next/standalone/src/app/api/cortex/usage/route.ts +31 -31
- package/.next/standalone/src/app/api/cortex/workspace/[id]/context/route.ts +41 -41
- package/.next/standalone/src/components/cortex/constants.ts +29 -29
- package/.next/standalone/src/components/cortex/cortex-dashboard.tsx +304 -304
- package/.next/standalone/src/components/cortex/cortex-indicator.tsx +44 -44
- package/.next/standalone/src/components/cortex/cortex-panel.tsx +140 -140
- package/.next/standalone/src/components/cortex/cortex-settings.tsx +221 -221
- package/.next/standalone/src/components/cortex/curation-tab.tsx +810 -810
- package/.next/standalone/src/components/cortex/entity-detail.tsx +101 -101
- package/.next/standalone/src/components/cortex/entity-graph.tsx +382 -382
- package/.next/standalone/src/components/cortex/import-dialog.tsx +212 -212
- package/.next/standalone/src/components/cortex/injection-badge.tsx +72 -72
- package/.next/standalone/src/components/cortex/knowledge-card.tsx +109 -109
- package/.next/standalone/src/components/cortex/knowledge-tab.tsx +158 -158
- package/.next/standalone/src/components/cortex/lobe-settings.tsx +215 -215
- package/.next/standalone/src/components/cortex/marketplace-card.tsx +126 -126
- package/.next/standalone/src/components/cortex/marketplace-tab.tsx +113 -113
- package/.next/standalone/src/lib/cortex/config.ts +40 -40
- package/.next/standalone/src/lib/cortex/debug.ts +10 -10
- package/.next/standalone/src/lib/cortex/distillation/usage-store.ts +18 -18
- package/.next/standalone/src/lib/cortex/graph/resolver.ts +10 -10
- package/.next/standalone/src/lib/cortex/graph/types.ts +22 -22
- package/.next/standalone/src/lib/cortex/index.ts +56 -56
- package/.next/standalone/src/lib/cortex/ingestion/bootstrap.ts +14 -14
- package/.next/standalone/src/lib/cortex/knowledge/compat.ts +14 -14
- package/.next/standalone/src/lib/cortex/knowledge/contradiction.ts +10 -10
- package/.next/standalone/src/lib/cortex/knowledge/types.ts +67 -67
- package/.next/standalone/src/lib/cortex/lobes/config.ts +16 -16
- package/.next/standalone/src/lib/cortex/lobes/resolver.ts +8 -8
- package/.next/standalone/src/lib/cortex/lobes/shares.ts +14 -14
- package/.next/standalone/src/lib/cortex/mcp/server.ts +8 -8
- package/.next/standalone/src/lib/cortex/portability/exporter.ts +6 -6
- package/.next/standalone/src/lib/cortex/portability/importer.ts +10 -10
- package/.next/standalone/src/lib/cortex/retrieval/context-engine.ts +10 -10
- package/.next/standalone/src/lib/cortex/types.ts +39 -39
- package/.next/standalone/tsconfig.json +34 -34
- package/LICENSE +661 -661
- package/README.md +131 -131
- package/bin/cortex-hook.sh +62 -62
- package/bin/cortex-mcp.js +60 -60
- package/bin/fix-standalone-externals.js +79 -79
- package/bin/lib/auto-setup.js +110 -110
- package/bin/mdns-service.js +171 -171
- package/bin/postinstall.js +35 -35
- package/bin/setup-admin.js +195 -195
- package/bin/spaces-dev.js +208 -208
- package/bin/spaces-install.js +599 -599
- package/bin/spaces-reset-totp.js +50 -50
- package/bin/spaces-service.js +1020 -1020
- package/bin/spaces-setup.js +253 -253
- package/bin/spaces.js +776 -776
- package/bin/ssh-auth-keys.sh +68 -68
- package/bin/terminal-server.js +1683 -1649
- package/package.json +103 -102
- package/.next/standalone/.next/server/chunks/ssr/_078dd64d._.js +0 -3
- package/.next/standalone/.next/server/chunks/ssr/_701606d5._.js +0 -3
- package/.next/standalone/.next/server/chunks/ssr/_72b1de37._.js +0 -3
- package/.next/standalone/.next/server/chunks/ssr/_950142a4._.js +0 -3
- package/.next/standalone/.next/server/chunks/ssr/src_components_terminal_terminal-pane_tsx_803c5e2c._.js +0 -7
- package/.next/standalone/.next/static/chunks/18f168665aef1aab.js +0 -1
- package/.next/standalone/.next/static/chunks/25b7a243a404a1a7.js +0 -1
- package/.next/standalone/.next/static/chunks/4a50d2a3e9bc9b41.js +0 -1
- package/.next/standalone/.next/static/chunks/6c78a1dfa7ec2959.css +0 -3
- package/.next/standalone/.next/static/chunks/7e0091ab6c5ee8bd.js +0 -1
- package/.next/standalone/.next/static/chunks/869f562dc32e55f4.js +0 -1
- package/.next/standalone/.next/static/chunks/8b3f4572fec83caa.js +0 -5
- package/.next/standalone/.next/static/chunks/8d5419afc4b9116b.js +0 -1
- package/.next/standalone/.next/static/chunks/9b2c5451f0b67975.js +0 -1
- package/.next/standalone/.next/static/chunks/ac339e970df82fa5.js +0 -5
- package/.next/standalone/.next/static/chunks/e7772d64463868eb.js +0 -1
- package/.next/standalone/node_modules/@img/sharp-libvips-linux-x64/README.md +0 -46
- package/.next/standalone/node_modules/@img/sharp-libvips-linux-x64/lib/glib-2.0/include/glibconfig.h +0 -221
- package/.next/standalone/node_modules/@img/sharp-libvips-linux-x64/lib/index.js +0 -1
- package/.next/standalone/node_modules/@img/sharp-libvips-linux-x64/lib/libvips-cpp.so.8.17.3 +0 -0
- package/.next/standalone/node_modules/@img/sharp-libvips-linux-x64/package.json +0 -42
- package/.next/standalone/node_modules/@img/sharp-libvips-linuxmusl-x64/README.md +0 -46
- package/.next/standalone/node_modules/@img/sharp-libvips-linuxmusl-x64/lib/glib-2.0/include/glibconfig.h +0 -221
- package/.next/standalone/node_modules/@img/sharp-libvips-linuxmusl-x64/lib/index.js +0 -1
- package/.next/standalone/node_modules/@img/sharp-libvips-linuxmusl-x64/lib/libvips-cpp.so.8.17.3 +0 -0
- package/.next/standalone/node_modules/@img/sharp-libvips-linuxmusl-x64/package.json +0 -42
- package/.next/standalone/node_modules/@img/sharp-libvips-linuxmusl-x64/versions.json +0 -30
- package/.next/standalone/node_modules/@img/sharp-linux-x64/lib/sharp-linux-x64.node +0 -0
- package/.next/standalone/node_modules/@img/sharp-linuxmusl-x64/lib/sharp-linuxmusl-x64.node +0 -0
- package/.next/standalone/node_modules/@img/sharp-linuxmusl-x64/package.json +0 -46
- /package/.next/standalone/.next/static/{77VYbwIoyxFNr5xevTrCu → ncDe4k4gvD0788HAnq_3G}/_buildManifest.js +0 -0
- /package/.next/standalone/.next/static/{77VYbwIoyxFNr5xevTrCu → ncDe4k4gvD0788HAnq_3G}/_clientMiddlewareManifest.json +0 -0
- /package/.next/standalone/.next/static/{77VYbwIoyxFNr5xevTrCu → ncDe4k4gvD0788HAnq_3G}/_ssgManifest.js +0 -0
- /package/.next/standalone/node_modules/@img/{sharp-libvips-linux-x64 → sharp-win32-x64}/versions.json +0 -0
@@ -1,1387 +1,1387 @@
# Cortex Wiring Implementation Plan

> **For agentic workers:** REQUIRED: Use superpowers:subagent-driven-development (if subagents available) or superpowers:executing-plans to implement this plan. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Wire up 5 existing-but-unused Cortex subsystems (dedup, extractors, staleness, distillation, MCP server) and fix 2 bugs.

**Architecture:** Layered wiring in dependency order. Each layer is independently shippable. The pipeline gains dedup + classification before embedding. Search gains staleness scoring. Background distillation extracts structured knowledge via LLM. MCP server exposes Cortex to Claude Code as tools.

**Tech Stack:** TypeScript, Vitest, LanceDB, Node.js `fetch` (for LLM API calls), MCP stdio transport

**Spec:** `docs/superpowers/specs/2026-03-13-cortex-wiring-design.md`

---

## Chunk 1: Deduplication (Layer 1)
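Chunk 1's deduplicator compares embedding vectors by L2 distance (the pipeline later treats an L2 distance under 0.05 as a duplicate). For unit-normalized embeddings, squared L2 distance and cosine similarity carry the same information, since ‖a−b‖² = 2(1−cos θ). A quick self-contained check of that identity (these helper functions are illustrative only, not from the codebase):

```typescript
// Relationship between L2 distance and cosine similarity for unit vectors.
function dot(a: number[], b: number[]): number {
  return a.reduce((s, x, i) => s + x * b[i], 0);
}
function norm(a: number[]): number {
  return Math.sqrt(dot(a, a));
}
function normalize(a: number[]): number[] {
  const n = norm(a);
  return a.map(x => x / n);
}
function l2(a: number[], b: number[]): number {
  return Math.sqrt(a.reduce((s, x, i) => s + (x - b[i]) ** 2, 0));
}

const a = normalize([0.9, 0.1, 0.4]);
const b = normalize([0.8, 0.2, 0.5]);
const cos = dot(a, b);
const dist = l2(a, b);
// For unit vectors: dist^2 === 2 * (1 - cos), up to float error.
console.log(Math.abs(dist * dist - 2 * (1 - cos)) < 1e-9); // true
```

So a small L2 threshold on normalized vectors doubles as a high cosine-similarity threshold, which is why the plan can mix the two vocabularies.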
### Task 1: Add `textHash` to deduplicator

**Files:**
- Modify: `src/lib/cortex/ingestion/deduplicator.ts`
- Modify: `tests/lib/cortex/ingestion/deduplicator.test.ts`

- [ ] **Step 1: Write failing test for `textHash`**

In `tests/lib/cortex/ingestion/deduplicator.test.ts`, add:

```typescript
import { cosineSimilarity, isDuplicate, textHash } from '@/lib/cortex/ingestion/deduplicator';

describe('textHash', () => {
  it('returns consistent SHA-256 hex for same input', () => {
    const h1 = textHash('hello world');
    const h2 = textHash('hello world');
    expect(h1).toBe(h2);
    expect(h1).toMatch(/^[a-f0-9]{64}$/);
  });

  it('normalizes whitespace before hashing', () => {
    const h1 = textHash('hello world\n\n');
    const h2 = textHash('hello world');
    expect(h1).toBe(h2);
  });

  it('returns different hashes for different text', () => {
    expect(textHash('foo')).not.toBe(textHash('bar'));
  });
});
```

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/ingestion/deduplicator.test.ts`
Expected: FAIL — `textHash` is not exported

- [ ] **Step 3: Implement `textHash`**

In `src/lib/cortex/ingestion/deduplicator.ts`, add at the top:

```typescript
import { createHash } from 'crypto';

export function textHash(text: string): string {
  const normalized = text.replace(/\s+/g, ' ').trim();
  return createHash('sha256').update(normalized).digest('hex');
}
```

- [ ] **Step 4: Run test to verify it passes**

Run: `npx vitest run tests/lib/cortex/ingestion/deduplicator.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/ingestion/deduplicator.ts tests/lib/cortex/ingestion/deduplicator.test.ts
git commit -m "feat(cortex): add textHash for dedup hash check"
```

---
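The normalization in `textHash` collapses every whitespace run to a single space before hashing, so chunks that differ only in spacing or trailing newlines dedupe to the same key. A standalone demonstration mirroring the implementation above (Node `crypto` only):

```typescript
import { createHash } from 'crypto';

// Mirrors the textHash implementation: collapse whitespace, trim, SHA-256 hex.
function textHash(text: string): string {
  const normalized = text.replace(/\s+/g, ' ').trim();
  return createHash('sha256').update(normalized).digest('hex');
}

console.log(textHash('Add  auth\n\n') === textHash('Add auth')); // true
console.log(textHash('Add auth') === textHash('Add oauth'));     // false
```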
### Task 2: Add `updateAccessCount` to store

**Files:**
- Modify: `src/lib/cortex/store.ts`
- Modify: `tests/lib/cortex/store.test.ts`

- [ ] **Step 1: Write failing test**

In `tests/lib/cortex/store.test.ts`, add a test for `updateAccessCount`:

```typescript
it('updates access_count on a unit', async () => {
  // Assumes a unit with known id has been added in a beforeEach or earlier test
  // Add a unit first
  const unit = makeUnit({ id: 'access-test', access_count: 0 });
  await store.add('personal', unit);

  await store.updateAccessCount('personal', 'access-test');

  // Retrieve and check — use browse since we need to find by id
  const results = await store.browse('personal', 100);
  const found = results.find(r => r.id === 'access-test');
  expect(found).toBeDefined();
  expect(found!.access_count).toBe(1);
});
```

Note: adapt to the existing test file's setup patterns (mock store vs real LanceDB). If the test file uses mocks, test the method signature exists and is callable. If it uses real LanceDB, test the actual update.

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/store.test.ts`
Expected: FAIL — `updateAccessCount` does not exist

- [ ] **Step 3: Implement `updateAccessCount`**

In `src/lib/cortex/store.ts`, add after the `delete` method (~line 163):

```typescript
async updateAccessCount(layerKey: string, id: string): Promise<void> {
  const conn = await this.getConnection(layerKey);
  const tableNames = await conn.tableNames();
  if (!tableNames.includes(TABLE_NAME)) return;

  const table = await conn.openTable(TABLE_NAME);
  const safeId = id.replace(/'/g, "''");
  // LanceDB doesn't support UPDATE; delete + re-add with bumped count
  // Use query().where() instead of vectorSearch to avoid dimension dependency
  const rows = await table.query()
    .where(`id = '${safeId}'`).limit(1).toArray();
  if (rows.length === 0) return;

  const row = rows[0];
  await table.delete(`id = '${safeId}'`);
  row.access_count = (row.access_count || 0) + 1;
  row.last_accessed = new Date().toISOString();
  await table.add([row]);
}
```

- [ ] **Step 4: Run test to verify it passes**

Run: `npx vitest run tests/lib/cortex/store.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/store.ts tests/lib/cortex/store.test.ts
git commit -m "feat(cortex): add updateAccessCount to store"
```

---
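Because Task 2 updates a row by deleting it and re-adding it with a bumped count, the bump logic can be modeled in isolation against a plain array standing in for the LanceDB table. A sketch under that assumption (`bumpAccessCount` and `Row` are hypothetical names, not the store implementation):

```typescript
// In-memory sketch of the delete + re-add access-count bump.
interface Row { id: string; access_count: number; last_accessed?: string }

function bumpAccessCount(table: Row[], id: string): Row[] {
  const row = table.find(r => r.id === id);
  if (!row) return table;                       // unknown id: no-op
  const rest = table.filter(r => r.id !== id);  // "delete"
  return [...rest, {                            // "re-add" with bumped count
    ...row,
    access_count: (row.access_count || 0) + 1,
    last_accessed: new Date().toISOString(),
  }];
}

const t = bumpAccessCount([{ id: 'access-test', access_count: 0 }], 'access-test');
console.log(t[0].access_count); // 1
```

The real method adds the single-quote escaping and table-existence guards shown above; the read-modify-write shape is the same.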
### Task 3: Wire dedup into the pipeline
|
|
157
|
-
|
|
158
|
-
**Files:**
|
|
159
|
-
- Modify: `src/lib/cortex/ingestion/pipeline.ts`
|
|
160
|
-
- Modify: `tests/lib/cortex/ingestion/pipeline.test.ts`
|
|
161
|
-
|
|
162
|
-
- [ ] **Step 1: Write failing test — duplicate is skipped**
|
|
163
|
-
|
|
164
|
-
In `tests/lib/cortex/ingestion/pipeline.test.ts`, first add `updateAccessCount` to the shared `mockStore` in the `beforeEach` block:
|
|
165
|
-
|
|
166
|
-
```typescript
|
|
167
|
-
// In the existing beforeEach, add to mockStore:
|
|
168
|
-
mockStore.updateAccessCount = vi.fn().mockResolvedValue(undefined);
|
|
169
|
-
```
|
|
170
|
-
|
|
171
|
-
Then add these tests:
|
|
172
|
-
|
|
173
|
-
```typescript
|
|
174
|
-
it('skips duplicate chunks (hash match)', async () => {
|
|
175
|
-
const msg = { role: 'human', content: 'Add auth', timestamp: new Date().toISOString() };
|
|
176
|
-
const msgs = [msg, { role: 'assistant', content: 'Done.', timestamp: new Date().toISOString() }];
|
|
177
|
-
const ctx = { sessionId: 's1', workspaceId: 1, agentType: 'claude' as const, projectPath: '/p' };
|
|
178
|
-
|
|
179
|
-
// Ingest twice with identical content
|
|
180
|
-
await pipeline.ingest(msgs, ctx);
|
|
181
|
-
await pipeline.ingest(msgs, ctx);
|
|
182
|
-
|
|
183
|
-
// store.add should only be called once (second ingest is hash-deduped)
|
|
184
|
-
expect(mockStore.add).toHaveBeenCalledTimes(1);
|
|
185
|
-
});
|
|
186
|
-
|
|
187
|
-
it('skips cosine-similar chunks and bumps access count', async () => {
|
|
188
|
-
// store.search returns a near-match with L2 distance below threshold
|
|
189
|
-
mockStore.search.mockResolvedValueOnce([{
|
|
190
|
-
id: 'existing-1', text: 'similar text', _distance: 0.01,
|
|
191
|
-
access_count: 0, confidence: 0.8,
|
|
192
|
-
}]);
|
|
193
|
-
|
|
194
|
-
const msgs = [
|
|
195
|
-
{ role: 'human', content: 'Slightly different auth', timestamp: new Date().toISOString() },
|
|
196
|
-
{ role: 'assistant', content: 'Done.', timestamp: new Date().toISOString() },
|
|
197
|
-
];
|
|
198
|
-
const ctx = { sessionId: 's2', workspaceId: 1, agentType: 'claude' as const, projectPath: '/p' };
|
|
199
|
-
|
|
200
|
-
await pipeline.ingest(msgs, ctx);

expect(mockStore.updateAccessCount).toHaveBeenCalledWith(expect.any(String), 'existing-1');
expect(mockStore.add).not.toHaveBeenCalled();
});
```

- [ ] **Step 2: Run tests to verify they fail**

Run: `npx vitest run tests/lib/cortex/ingestion/pipeline.test.ts`
Expected: FAIL — pipeline doesn't dedup yet

- [ ] **Step 3: Implement dedup in pipeline**

Modify `src/lib/cortex/ingestion/pipeline.ts`:

```typescript
import crypto from 'crypto';
import type { EmbeddingProvider } from '../embeddings';
import type { CortexStore } from '../store';
import type { KnowledgeUnit, RawChunk } from '../knowledge/types';
import { getConfidenceBase } from '../knowledge/types';
import { chunkMessages, type SessionMessage, type ChunkContext } from './chunker';
import { textHash } from './deduplicator';

export interface IngestionResult {
  chunksCreated: number;
  chunksEmbedded: number;
  chunksSkipped: number;
  errors: string[];
}

const COSINE_DEDUP_THRESHOLD = 0.05; // L2 distance < 0.05 = duplicate

export class IngestionPipeline {
  private hashSet = new Set<string>();

  constructor(
    private embedding: EmbeddingProvider,
    private store: CortexStore,
  ) {}

  async ingest(
    messages: SessionMessage[],
    ctx: ChunkContext,
  ): Promise<IngestionResult> {
    const result: IngestionResult = {
      chunksCreated: 0, chunksEmbedded: 0, chunksSkipped: 0, errors: [],
    };

    // Tier 1: Fast pass — chunk messages
    let chunks: RawChunk[];
    try {
      chunks = chunkMessages(messages, ctx);
    } catch (err) {
      result.errors.push(`Tier 1 error: ${err}`);
      return result;
    }
    result.chunksCreated = chunks.length;

    // Tier 2: Dedup, embed, and store
    const BATCH_SIZE = 50;
    for (let i = 0; i < chunks.length; i += BATCH_SIZE) {
      const batch = chunks.slice(i, i + BATCH_SIZE);

      // Phase 1: Hash dedup — filter out exact duplicates before embedding
      const novel: RawChunk[] = [];
      for (const chunk of batch) {
        const hash = textHash(chunk.text);
        if (this.hashSet.has(hash)) {
          result.chunksSkipped++;
        } else {
          this.hashSet.add(hash);
          novel.push(chunk);
        }
      }

      if (novel.length === 0) continue;

      try {
        const texts = novel.map(c => c.text);
        const vectors = await this.embedding.embed(texts);

        for (let j = 0; j < novel.length; j++) {
          const chunk = novel[j];
          const vector = vectors[j];
          const layerKey = chunk.layer === 'workspace' && chunk.workspace_id
            ? `workspace/${chunk.workspace_id}`
            : chunk.layer;

          // Phase 2: Cosine dedup — check store for near-matches
          try {
            const nearestResults = await this.store.search(layerKey, vector, 1);
            if (nearestResults.length > 0) {
              const nearest = nearestResults[0] as any;
              const distance = nearest._distance ?? 1;
              if (distance < COSINE_DEDUP_THRESHOLD) {
                await this.store.updateAccessCount(layerKey, nearest.id);
                result.chunksSkipped++;
                continue;
              }
            }
          } catch {
            // Store may be empty or table not created yet — proceed with add
          }

          const unit: KnowledgeUnit = {
            id: crypto.randomUUID(),
            vector,
            text: chunk.text,
            type: chunk.type,
            layer: chunk.layer,
            workspace_id: chunk.workspace_id,
            session_id: chunk.session_id,
            agent_type: chunk.agent_type,
            project_path: chunk.project_path,
            file_refs: chunk.file_refs,
            confidence: getConfidenceBase(chunk.type),
            created: new Date().toISOString(),
            source_timestamp: chunk.source_timestamp,
            stale_score: 0,
            access_count: 0,
            last_accessed: null,
            metadata: chunk.metadata,
          };

          await this.store.add(layerKey, unit);
          result.chunksEmbedded++;
        }
      } catch (err) {
        result.errors.push(`Tier 2 batch error: ${err}`);
        result.chunksSkipped += novel.length;
      }
    }

    return result;
  }
}
```
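The Phase 1 hash dedup relies on `textHash` from `./deduplicator`, which is not reproduced in this plan. A minimal sketch of what such a helper might look like, assuming it normalizes whitespace and case before hashing (the normalization rules here are illustrative, not the package's actual implementation):

```typescript
import { createHash } from 'crypto';

// Hypothetical sketch of textHash: normalize whitespace and case so
// trivially reformatted chunks collide, then take a SHA-256 hex digest.
// The real deduplicator module may normalize differently.
export function textHash(text: string): string {
  const normalized = text.trim().toLowerCase().replace(/\s+/g, ' ');
  return createHash('sha256').update(normalized).digest('hex');
}
```

Under this scheme `textHash('Hello  world')` and `textHash('hello world')` collide, which is the behavior Phase 1 wants: exact and near-exact repeats are skipped before any embedding cost is paid.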

- [ ] **Step 4: Run tests to verify they pass**

Run: `npx vitest run tests/lib/cortex/ingestion/pipeline.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Run full cortex test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS (no regressions)

- [ ] **Step 6: Commit**

```bash
git add src/lib/cortex/ingestion/pipeline.ts tests/lib/cortex/ingestion/pipeline.test.ts
git commit -m "feat(cortex): wire dedup into ingestion pipeline (hash + cosine)"
```

---

## Chunk 2: Extractors (Layer 2)

### Task 4: Wire extractors into pipeline

**Files:**
- Modify: `src/lib/cortex/ingestion/pipeline.ts`
- Modify: `tests/lib/cortex/ingestion/pipeline.test.ts`

- [ ] **Step 1: Write failing test — error/fix detection**

In `tests/lib/cortex/ingestion/pipeline.test.ts`, add:

```typescript
it('classifies error/fix chunks via extractors', async () => {
  const msgs = [
    { role: 'human', content: 'I got TypeError: cannot read undefined', timestamp: new Date().toISOString() },
    { role: 'assistant', content: 'Fixed by adding null check before access.', timestamp: new Date().toISOString() },
  ];
  const ctx = { sessionId: 's3', workspaceId: 1, agentType: 'claude' as const, projectPath: '/p' };

  await pipeline.ingest(msgs, ctx);

  const addCall = mockStore.add.mock.calls[0];
  const storedUnit = addCall[1];
  expect(storedUnit.type).toBe('error_fix');
});

it('classifies decision chunks via extractors', async () => {
  const msgs = [
    { role: 'human', content: 'Which framework?', timestamp: new Date().toISOString() },
    { role: 'assistant', content: 'We decided to use Next.js for the frontend because of SSR support.', timestamp: new Date().toISOString() },
  ];
  const ctx = { sessionId: 's4', workspaceId: 1, agentType: 'claude' as const, projectPath: '/p' };

  await pipeline.ingest(msgs, ctx);

  const addCall = mockStore.add.mock.calls[0];
  const storedUnit = addCall[1];
  expect(storedUnit.type).toBe('decision');
});
```

- [ ] **Step 2: Run tests to verify they fail**

Run: `npx vitest run tests/lib/cortex/ingestion/pipeline.test.ts`
Expected: FAIL — chunks still typed as `conversation`

- [ ] **Step 3: Add extraction step to pipeline**

In `src/lib/cortex/ingestion/pipeline.ts`, add import at the top:

```typescript
import { detectErrorFixPairs, extractDecisionPatterns, extractCommands } from './extractors';
```

Add a private method to classify chunks. Call it after `chunkMessages()` but before the Tier 2 loop:

```typescript
/** Enrich chunk types using regex extractors. Mutates chunks in place. */
private classifyChunks(chunks: RawChunk[]): void {
  for (const chunk of chunks) {
    const errorFixes = detectErrorFixPairs(chunk.text);
    const decisions = extractDecisionPatterns(chunk.text);
    const commands = extractCommands(chunk.text);

    // Priority: decision > error_fix > conversation (default)
    if (decisions.length > 0) {
      chunk.type = 'decision';
      chunk.metadata.decisions = decisions;
    } else if (errorFixes.length > 0) {
      chunk.type = 'error_fix';
      chunk.metadata.error_fixes = errorFixes;
    }

    if (commands.length > 0) {
      chunk.metadata.commands = commands;
    }
  }
}
```

In the `ingest` method, call it after chunking (after `result.chunksCreated = chunks.length;`):

```typescript
// Tier 1.5: Classify chunks via regex extractors
this.classifyChunks(chunks);
```
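The extractor functions themselves live in `./extractors` and are not shown in this plan. A hedged sketch of the shape they might take, so the classification step above is concrete (the cue words, regexes, and return types here are guesswork, not the module's real patterns):

```typescript
// Illustrative sketch only: the real extractors module may use different
// patterns and richer return types than plain matched sentences.
export function extractDecisionPatterns(text: string): string[] {
  // Split into sentences, keep those containing a decision cue.
  const sentences = text.split(/(?<=[.!?])\s+/);
  const cue = /\b(decided to|chose|settled on|going with|will use)\b/i;
  return sentences.filter(s => cue.test(s));
}

export function detectErrorFixPairs(text: string): string[] {
  // Flag chunks that mention both an error and a fix.
  const hasError = /\b(\w*Error|exception|stack trace|failed)\b/i.test(text);
  const hasFix = /\b(fixed|resolved|solution|workaround)\b/i.test(text);
  return hasError && hasFix ? [text] : [];
}
```

On the failing tests above, a chunk containing "We decided to use Next.js..." would yield a decision match, and the TypeError/"Fixed by..." chunk would yield an error/fix pair, so the priority logic in `classifyChunks` assigns `decision` and `error_fix` respectively.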

- [ ] **Step 4: Run tests to verify they pass**

Run: `npx vitest run tests/lib/cortex/ingestion/pipeline.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Run full cortex test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS

- [ ] **Step 6: Commit**

```bash
git add src/lib/cortex/ingestion/pipeline.ts tests/lib/cortex/ingestion/pipeline.test.ts
git commit -m "feat(cortex): wire extractors into pipeline for chunk classification"
```

---

## Chunk 3: Staleness (Layer 3)

### Task 5: Add staleness scoring to search

**Files:**
- Modify: `src/lib/cortex/retrieval/search.ts`
- Modify: `tests/lib/cortex/retrieval/search.test.ts`

- [ ] **Step 1: Write failing test**

In `tests/lib/cortex/retrieval/search.test.ts`, add a test that verifies stale results score lower than fresh results. The test should mock two results: one with file refs pointing to a recently modified file, one without. Check that the stale one ranks lower.

```typescript
it('reduces score for stale results (modified file refs)', async () => {
  // Mock store.search returning two results
  // Result A: has file_refs ['src/auth.ts'], source_timestamp = 30 days ago
  // Result B: no file_refs, same similarity
  // After staleness, A should rank lower than B

  // Provide a fileStatFn that returns a recent mtime for src/auth.ts
  const thirtyDaysAgo = new Date(Date.now() - 30 * 86400000).toISOString();
  const yesterday = new Date(Date.now() - 86400000).toISOString();

  // Note: store.search() returns deserialized data (file_refs as array, metadata as object)
  mockStore.search.mockResolvedValue([
    { id: 'a', text: 'auth', file_refs: ['src/auth.ts'], source_timestamp: thirtyDaysAgo, confidence: 0.8, stale_score: 0, created: thirtyDaysAgo, _distance: 0.1, access_count: 0, last_accessed: null, metadata: {}, type: 'decision', layer: 'personal', workspace_id: null, session_id: null, agent_type: 'claude', project_path: '/project' },
    { id: 'b', text: 'other', file_refs: [], source_timestamp: thirtyDaysAgo, confidence: 0.8, stale_score: 0, created: thirtyDaysAgo, _distance: 0.1, access_count: 0, last_accessed: null, metadata: {}, type: 'decision', layer: 'personal', workspace_id: null, session_id: null, agent_type: 'claude', project_path: '/project' },
  ]);

  const search = new CortexSearch(mockStore, {
    fileStat: async (filepath: string) => {
      if (filepath.includes('auth.ts')) return { mtime: new Date(yesterday) };
      return null;
    },
  });

  const results = await search.search([0.1, 0.2, 0.3], { limit: 2 });
  // Result B should rank higher (not stale)
  expect(results[0].id).toBe('b');
  expect(results[1].id).toBe('a');
  expect(results[1].stale_score).toBeGreaterThan(0);
});
```

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/retrieval/search.test.ts`
Expected: FAIL — `CortexSearch` doesn't accept `fileStat` option

- [ ] **Step 3: Add staleness computation to search**

Modify `src/lib/cortex/retrieval/search.ts`:

```typescript
import { computeFileStaleScore } from '../knowledge/staleness';
import fs from 'fs';
import path from 'path';

export interface SearchDeps {
  /** Optional fs.stat wrapper — injectable for testing. */
  fileStat?: (filepath: string) => Promise<{ mtime: Date } | null>;
}

export class CortexSearch {
  private fileStat: (filepath: string) => Promise<{ mtime: Date } | null>;

  constructor(private store: CortexStore, deps: SearchDeps = {}) {
    this.fileStat = deps.fileStat ?? defaultFileStat;
  }
```

Add a helper and update the search loop to compute staleness on the top results:

```typescript
private async computeStaleness(
  unit: KnowledgeUnit,
  statFn?: (fp: string) => Promise<{ mtime: Date } | null>,
): Promise<number> {
  if (unit.file_refs.length === 0) return 0;
  const doStat = statFn ?? this.fileStat;

  const fileModTimes: Record<string, string> = {};
  for (const ref of unit.file_refs) {
    const fullPath = unit.project_path ? path.join(unit.project_path, ref) : ref;
    try {
      const stat = await doStat(fullPath);
      if (stat) fileModTimes[ref] = stat.mtime.toISOString();
    } catch { /* file doesn't exist or not accessible */ }
  }

  return computeFileStaleScore({
    fileRefs: unit.file_refs,
    sourceTimestamp: unit.source_timestamp,
    fileModTimes,
  });
}
```
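`computeFileStaleScore` comes from `../knowledge/staleness` and is not reproduced in this plan. A plausible sketch of its contract, assuming the score is simply the fraction of referenced files modified after the knowledge was captured (the real function may weight or smooth this differently):

```typescript
// Assumed input shape, mirroring the call site in computeStaleness above.
interface StaleScoreInput {
  fileRefs: string[];
  sourceTimestamp: string;                 // ISO timestamp the knowledge was learned
  fileModTimes: Record<string, string>;    // file ref -> ISO mtime (missing = unknown)
}

// Illustrative sketch: the score rises with the share of referenced files
// that changed after this knowledge unit was learned. 0 = fresh, 1 = fully stale.
export function computeFileStaleScore(input: StaleScoreInput): number {
  if (input.fileRefs.length === 0) return 0;
  const learnedAt = Date.parse(input.sourceTimestamp);
  let modified = 0;
  for (const ref of input.fileRefs) {
    const mtime = input.fileModTimes[ref];
    if (mtime && Date.parse(mtime) > learnedAt) modified++;
  }
  return modified / input.fileRefs.length;
}
```

With this contract, the failing test above behaves as expected: result A's single ref was modified yesterday (after its 30-day-old `source_timestamp`), giving a positive stale score, while result B has no refs and stays at 0.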

In the search method, after collecting `allResults` and before the final sort, compute staleness on the top candidates:

```typescript
// Compute staleness on top candidates (cache stat calls per search request)
const candidates = allResults
  .sort((a, b) => b.relevance_score - a.relevance_score)
  .slice(0, limit * 2);

const statCache = new Map<string, { mtime: Date } | null>();
const cachedFileStat = async (fp: string) => {
  if (!statCache.has(fp)) statCache.set(fp, await this.fileStat(fp));
  return statCache.get(fp)!;
};

for (const result of candidates) {
  const staleScore = await this.computeStaleness(result, cachedFileStat);
  if (staleScore > 0) {
    result.stale_score = staleScore;
    // Recompute relevance with staleness
    result.relevance_score = computeRelevanceScore({
      similarity: result.similarity,
      confidence: result.confidence,
      stale_score: staleScore,
      created: result.created,
    }) * (LAYER_WEIGHTS[result.layer] ?? 0.5);
  }
}

candidates.sort((a, b) => b.relevance_score - a.relevance_score);
return candidates.slice(0, limit);
```

Add the default `fileStat`:

```typescript
async function defaultFileStat(filepath: string): Promise<{ mtime: Date } | null> {
  try {
    const stat = await fs.promises.stat(filepath);
    return { mtime: stat.mtime };
  } catch {
    return null;
  }
}
```

- [ ] **Step 4: Update CortexSearch instantiation in `index.ts`**

In `src/lib/cortex/index.ts` (line 43), no change needed — the `SearchDeps` param is optional and defaults to real `fs.stat`.

- [ ] **Step 5: Run tests to verify they pass**

Run: `npx vitest run tests/lib/cortex/retrieval/search.test.ts`
Expected: ALL PASS

Note: existing tests create `CortexSearch(store)` without deps — that's fine since deps is optional.

- [ ] **Step 6: Commit**

```bash
git add src/lib/cortex/retrieval/search.ts tests/lib/cortex/retrieval/search.test.ts
git commit -m "feat(cortex): wire staleness scoring into search"
```

---

### Task 6: Add staleness badge to knowledge card

**Files:**
- Modify: `src/components/cortex/knowledge-card.tsx`

- [ ] **Step 1: Add staleness indicator**

In `src/components/cortex/knowledge-card.tsx`, update the `KnowledgeCardProps` interface to include `stale_score`:

```typescript
interface KnowledgeCardProps {
  unit: {
    id: string;
    text: string;
    type: string;
    confidence: number;
    created: string;
    session_id?: string | null;
    layer: string;
    stale_score?: number;
  };
  onDelete?: (id: string) => void;
}
```

Add the amber badge after the type badge (inside the flex row at line 43-46):

```tsx
{(unit.stale_score ?? 0) > 0.3 && (
  <span
    className="text-[10px] px-1.5 py-0.5 rounded font-medium bg-amber-500/20 text-amber-400"
    title="Referenced files have changed since this was learned"
  >
    stale
  </span>
)}
```

- [ ] **Step 2: Verify visually** — No automated test. Check by opening the Cortex panel in a browser if available. Otherwise verify the component compiles:

Run the project-wide type check with `npx tsc --noEmit` (passing a single file to `tsc` makes it ignore `tsconfig.json`, so prefer the full check), or just run the full build.

- [ ] **Step 3: Commit**

```bash
git add src/components/cortex/knowledge-card.tsx
git commit -m "feat(cortex): add staleness badge to knowledge card"
```

---

## Chunk 4: Distillation (Layer 4)

### Task 7: Create `callLLM` implementation

**Files:**
- Create: `src/lib/cortex/distillation/llm.ts`
- Create: `tests/lib/cortex/distillation/llm.test.ts`

- [ ] **Step 1: Write failing test**

Create `tests/lib/cortex/distillation/llm.test.ts`:

```typescript
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { createCallLLM, detectLLMProvider } from '@/lib/cortex/distillation/llm';

describe('detectLLMProvider', () => {
  const originalEnv = process.env;

  beforeEach(() => { process.env = { ...originalEnv }; });
  afterEach(() => { process.env = originalEnv; });

  it('detects anthropic when ANTHROPIC_API_KEY is set', () => {
    process.env.ANTHROPIC_API_KEY = 'sk-ant-test';
    expect(detectLLMProvider()).toBe('anthropic');
  });

  it('detects openai when OPENAI_API_KEY is set', () => {
    delete process.env.ANTHROPIC_API_KEY;
    process.env.OPENAI_API_KEY = 'sk-test';
    expect(detectLLMProvider()).toBe('openai');
  });

  it('returns null when no keys are set', () => {
    delete process.env.ANTHROPIC_API_KEY;
    delete process.env.OPENAI_API_KEY;
    expect(detectLLMProvider()).toBeNull();
  });
});

describe('createCallLLM', () => {
  it('returns null when no provider available', () => {
    delete process.env.ANTHROPIC_API_KEY;
    delete process.env.OPENAI_API_KEY;
    expect(createCallLLM()).toBeNull();
  });
});
```

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/distillation/llm.test.ts`
Expected: FAIL — module doesn't exist

- [ ] **Step 3: Implement `llm.ts`**

Create `src/lib/cortex/distillation/llm.ts`:

```typescript
type LLMProvider = 'anthropic' | 'openai';

export function detectLLMProvider(): LLMProvider | null {
  if (process.env.ANTHROPIC_API_KEY) return 'anthropic';
  if (process.env.OPENAI_API_KEY) return 'openai';
  return null;
}

export function createCallLLM(): ((system: string, user: string) => Promise<string>) | null {
  const provider = detectLLMProvider();
  if (!provider) return null;

  if (provider === 'anthropic') {
    return async (system: string, user: string) => {
      const res = await fetch('https://api.anthropic.com/v1/messages', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'x-api-key': process.env.ANTHROPIC_API_KEY!,
          'anthropic-version': '2023-06-01',
        },
        body: JSON.stringify({
          model: 'claude-haiku-4-5-20251001',
          max_tokens: 2048,
          system,
          messages: [{ role: 'user', content: user }],
        }),
      });
      if (!res.ok) throw new Error(`Anthropic API error: ${res.status}`);
      const data = await res.json();
      return data.content?.[0]?.text ?? '';
    };
  }

  return async (system: string, user: string) => {
    const res = await fetch('https://api.openai.com/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`,
      },
      body: JSON.stringify({
        model: 'gpt-4o-mini',
        messages: [
          { role: 'system', content: system },
          { role: 'user', content: user },
        ],
        max_tokens: 2048,
      }),
    });
    if (!res.ok) throw new Error(`OpenAI API error: ${res.status}`);
    const data = await res.json();
    return data.choices?.[0]?.message?.content ?? '';
  };
}
```
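The happy path of the fetch-based callers can be exercised without network access or API keys by stubbing `globalThis.fetch`. A sketch of the technique, using `callAnthropic` as a hypothetical trimmed stand-in for the Anthropic branch above (the real tests would import the module instead):

```typescript
// callAnthropic is a trimmed, illustrative copy of the Anthropic branch of
// createCallLLM: POST the prompt, check res.ok, pull text out of content[0].
async function callAnthropic(system: string, user: string): Promise<string> {
  const res = await fetch('https://api.anthropic.com/v1/messages', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ system, messages: [{ role: 'user', content: user }] }),
  });
  if (!res.ok) throw new Error(`Anthropic API error: ${res.status}`);
  const data = await res.json();
  return data.content?.[0]?.text ?? '';
}

// Stub fetch to return a canned Anthropic-shaped response, call, then restore.
const realFetch = globalThis.fetch;
globalThis.fetch = (async () => ({
  ok: true,
  status: 200,
  json: async () => ({ content: [{ text: 'distilled summary' }] }),
})) as unknown as typeof fetch;

const out = await callAnthropic('sys', 'usr'); // out === 'distilled summary'
globalThis.fetch = realFetch;
```

In a vitest suite the same idea is usually expressed as `vi.stubGlobal('fetch', vi.fn(...))`, which also handles restoration.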

- [ ] **Step 4: Run test to verify it passes**

Run: `npx vitest run tests/lib/cortex/distillation/llm.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/distillation/llm.ts tests/lib/cortex/distillation/llm.test.ts
git commit -m "feat(cortex): add callLLM with auto-detect for Haiku/GPT-4o-mini"
```

---

### Task 8: Create distillation queue

**Files:**
- Create: `src/lib/cortex/distillation/queue.ts`
- Create: `tests/lib/cortex/distillation/queue.test.ts`

- [ ] **Step 1: Write failing test**

Create `tests/lib/cortex/distillation/queue.test.ts`:

```typescript
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import { DistillationQueue } from '@/lib/cortex/distillation/queue';
import fs from 'fs';
import path from 'path';
import os from 'os';

describe('DistillationQueue', () => {
  let tmpDir: string;
  let queue: DistillationQueue;

  beforeEach(() => {
    tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'cortex-queue-'));
    queue = new DistillationQueue(tmpDir);
  });

  afterEach(() => { fs.rmSync(tmpDir, { recursive: true, force: true }); });

  const entry = { text: 'chunk text 1', layerKey: 'personal', workspaceId: null, agentType: 'claude' };

  it('enqueues and retrieves chunks by id', () => {
    queue.enqueue('id1', entry);
    queue.enqueue('id2', { ...entry, text: 'chunk text 2' });

    const texts = queue.getTexts(['id1', 'id2']);
    expect(texts).toEqual(['chunk text 1', 'chunk text 2']);
  });

  it('getEntries returns full context', () => {
    queue.enqueue('id1', { text: 'ws text', layerKey: 'workspace/5', workspaceId: 5, agentType: 'claude' });
    const entries = queue.getEntries(['id1']);
    expect(entries[0].layerKey).toBe('workspace/5');
    expect(entries[0].workspaceId).toBe(5);
  });

  it('removes processed entries', () => {
    queue.enqueue('id1', entry);
    queue.remove(['id1']);
    expect(queue.getTexts(['id1'])).toEqual([]);
  });

  it('persists to disk and recovers', () => {
    queue.enqueue('id1', entry);

    const queue2 = new DistillationQueue(tmpDir);
    expect(queue2.pendingIds()).toEqual(['id1']);
    expect(queue2.getTexts(['id1'])).toEqual(['chunk text 1']);
  });
});
```

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/distillation/queue.test.ts`
Expected: FAIL — module doesn't exist

- [ ] **Step 3: Implement queue**

Create `src/lib/cortex/distillation/queue.ts`:

```typescript
import fs from 'fs';
import path from 'path';

const QUEUE_FILE = 'distill-queue.json';

interface QueueEntry {
  text: string;
  layerKey: string;
  workspaceId: number | null;
  agentType: string;
}

/** Simple file-backed queue mapping chunk IDs to their text + context. */
export class DistillationQueue {
  private data: Record<string, QueueEntry> = {};
  private filePath: string;

  constructor(cortexDir: string) {
    this.filePath = path.join(cortexDir, QUEUE_FILE);
    this.load();
  }

  private load(): void {
    try {
      if (fs.existsSync(this.filePath)) {
        this.data = JSON.parse(fs.readFileSync(this.filePath, 'utf-8'));
      }
    } catch { this.data = {}; }
  }

  private save(): void {
    fs.writeFileSync(this.filePath, JSON.stringify(this.data));
  }

  enqueue(id: string, entry: QueueEntry): void {
    this.data[id] = entry;
    this.save();
  }

  getEntries(ids: string[]): QueueEntry[] {
    return ids.map(id => this.data[id]).filter((e): e is QueueEntry => e !== undefined);
  }

  getTexts(ids: string[]): string[] {
    return this.getEntries(ids).map(e => e.text);
  }

  remove(ids: string[]): void {
    for (const id of ids) delete this.data[id];
    this.save();
  }

  pendingIds(): string[] {
    return Object.keys(this.data);
  }
}
```

- [ ] **Step 4: Run test to verify it passes**

Run: `npx vitest run tests/lib/cortex/distillation/queue.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/distillation/queue.ts tests/lib/cortex/distillation/queue.test.ts
git commit -m "feat(cortex): add file-backed distillation queue"
```

---

### Task 9: Wire distillation into Cortex singleton

**Files:**
- Modify: `src/lib/cortex/index.ts`
- Modify: `src/lib/cortex/ingestion/pipeline.ts`

- [ ] **Step 1: Add distillation wiring to `index.ts`**

In `src/lib/cortex/index.ts`, add imports:

```typescript
import { Distiller } from './distillation/distiller';
import { DistillationScheduler } from './distillation/scheduler';
import { DistillationQueue } from './distillation/queue';
import { createCallLLM } from './distillation/llm';
```

Update the `CortexInstance` interface to include the queue and scheduler:

```typescript
export interface CortexInstance {
  config: CortexConfig;
  store: CortexStore;
  search: CortexSearch;
  pipeline: IngestionPipeline;
  embedding: EmbeddingProvider;
  sync?: FederationSync;
  distillQueue?: DistillationQueue;
  distillScheduler?: DistillationScheduler;
}
```

In `getCortex()`, after creating the pipeline, wire distillation:

```typescript
// Initialize distillation if enabled and LLM provider available
let distillQueue: DistillationQueue | undefined;
let distillScheduler: DistillationScheduler | undefined;

if (config.ingestion.distillation) {
  const callLLM = createCallLLM();
  if (callLLM) {
    distillQueue = new DistillationQueue(cortexDir);
    const distiller = new Distiller(store, embedding, callLLM);

    distillScheduler = new DistillationScheduler(async (chunkIds) => {
      const entries = distillQueue!.getEntries(chunkIds);
      if (entries.length === 0) return;

      // Group by layerKey so workspace chunks go to the correct layer
      const byLayer = new Map<string, { texts: string[]; ctx: { workspaceId: number | null; agentType: string } }>();
      for (const e of entries) {
        if (!byLayer.has(e.layerKey)) {
          byLayer.set(e.layerKey, { texts: [], ctx: { workspaceId: e.workspaceId, agentType: e.agentType } });
        }
        byLayer.get(e.layerKey)!.texts.push(e.text);
      }

      for (const [layerKey, { texts, ctx }] of byLayer) {
        await distiller.distill(texts, layerKey, ctx);
      }
      distillQueue!.remove(chunkIds);
    });

    // Re-enqueue any pending items from previous session
    const pendingIds = distillQueue.pendingIds();
    if (pendingIds.length > 0) {
      distillScheduler.enqueue(pendingIds);
    }
  }
}

const instance: CortexInstance = {
  config, store, search, pipeline, embedding,
  distillQueue, distillScheduler,
};
```
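`DistillationScheduler` (from `./distillation/scheduler`) is used above but not shown: the wiring assumes it accepts a batch callback and exposes `enqueue(ids)`. A minimal sketch of that contract as a debounced batcher (the debounce interval and internals are guesswork; only the constructor and `enqueue` signatures are taken from the usage above):

```typescript
// Illustrative sketch of a debounced batch scheduler: IDs accumulate while
// enqueues keep arriving, and the process callback fires once per quiet
// period with everything collected so far.
export class DistillationScheduler {
  private pending = new Set<string>();
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private process: (chunkIds: string[]) => Promise<void>,
    private debounceMs = 100, // hypothetical default
  ) {}

  enqueue(ids: string[]): void {
    for (const id of ids) this.pending.add(id);
    if (this.timer) clearTimeout(this.timer);
    this.timer = setTimeout(() => void this.flush(), this.debounceMs);
  }

  private async flush(): Promise<void> {
    const batch = [...this.pending];
    this.pending.clear();
    this.timer = null;
    if (batch.length > 0) await this.process(batch);
  }
}
```

Because the callback receives whole batches, the grouping-by-layer logic in `getCortex()` runs one distillation pass per layer per quiet period rather than one per chunk.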

- [ ] **Step 2: Add distillation enqueue to pipeline**

In `src/lib/cortex/ingestion/pipeline.ts`, add an optional queue property:

```typescript
import type { DistillationQueue } from '../distillation/queue';
import type { DistillationScheduler } from '../distillation/scheduler';

const DISTILLABLE_TYPES = new Set(['decision', 'error_fix']);

export class IngestionPipeline {
  private hashSet = new Set<string>();
  distillQueue?: DistillationQueue;
  distillScheduler?: DistillationScheduler;
  // ... existing constructor
```

After `await this.store.add(layerKey, unit);` and `result.chunksEmbedded++;`, add:

```typescript
// Enqueue for distillation if the type qualifies
if (this.distillQueue && this.distillScheduler && DISTILLABLE_TYPES.has(unit.type)) {
  this.distillQueue.enqueue(unit.id, {
    text: unit.text,
    layerKey,
    workspaceId: unit.workspace_id,
    agentType: unit.agent_type,
  });
  this.distillScheduler.enqueue([unit.id]);
}
```

Back in `index.ts`, wire the queue and scheduler into the pipeline:

```typescript
pipeline.distillQueue = distillQueue;
pipeline.distillScheduler = distillScheduler;
```

- [ ] **Step 3: Run full cortex test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS

- [ ] **Step 4: Commit**

```bash
git add src/lib/cortex/index.ts src/lib/cortex/ingestion/pipeline.ts
git commit -m "feat(cortex): wire distillation scheduler into singleton and pipeline"
```

---

## Chunk 5: MCP Server (Layer 5) + Bug Fixes

### Task 10: Create MCP server entry point

**Spec deviation note:** The spec says to adapt `mcp/server.ts` into an HTTP client wrapper. Instead, we leave `mcp/server.ts` unchanged and create thin API routes (`/api/cortex/mcp/tools` and `/api/cortex/mcp/call`) that import from it. The stdio entry point (`bin/cortex-mcp.js`) proxies to these routes. This is a cleaner separation — `server.ts` stays as the canonical tool handler, usable both server-side and via HTTP.

**Files:**
- Create: `bin/cortex-mcp.js`
- Create: `src/app/api/cortex/mcp/tools/route.ts`
- Create: `src/app/api/cortex/mcp/call/route.ts`
|
|
1094
|
-
- Modify: `bin/terminal-server.js:471-519`
|
|
1095
|
-
|
|
1096
|
-
- [ ] **Step 1: Create `bin/cortex-mcp.js`**
|
|
1097
|
-
|
|
1098
|
-
This is a stdio MCP server that proxies to the Spaces HTTP API. It reads `SPACES_URL` from the environment (set during registration).
|
|
1099
|
-
|
|
1100
|
-
```javascript
|
|
1101
|
-
#!/usr/bin/env node
|
|
1102
|
-
'use strict';
|
|
1103
|
-
|
|
1104
|
-
const readline = require('readline');
|
|
1105
|
-
|
|
1106
|
-
const SPACES_URL = process.env.SPACES_URL || 'http://localhost:3457';
|
|
1107
|
-
const INTERNAL_TOKEN = process.env.SPACES_INTERNAL_TOKEN || '';
|
|
1108
|
-
|
|
1109
|
-
// MCP stdio transport: read JSON-RPC from stdin, write to stdout
|
|
1110
|
-
const rl = readline.createInterface({ input: process.stdin });
|
|
1111
|
-
|
|
1112
|
-
rl.on('line', async (line) => {
|
|
1113
|
-
let msg;
|
|
1114
|
-
try { msg = JSON.parse(line); } catch { return; }
|
|
1115
|
-
|
|
1116
|
-
if (msg.method === 'initialize') {
|
|
1117
|
-
respond(msg.id, {
|
|
1118
|
-
protocolVersion: '2024-11-05',
|
|
1119
|
-
capabilities: { tools: { listChanged: false } },
|
|
1120
|
-
serverInfo: { name: 'cortex', version: '1.0.0' },
|
|
1121
|
-
});
|
|
1122
|
-
return;
|
|
1123
|
-
}
|
|
1124
|
-
|
|
1125
|
-
if (msg.method === 'notifications/initialized') return; // no response needed
|
|
1126
|
-
|
|
1127
|
-
if (msg.method === 'tools/list') {
|
|
1128
|
-
const tools = await fetchJSON('/api/cortex/mcp/tools');
|
|
1129
|
-
respond(msg.id, { tools: tools || [] });
|
|
1130
|
-
return;
|
|
1131
|
-
}
|
|
1132
|
-
|
|
1133
|
-
if (msg.method === 'tools/call') {
|
|
1134
|
-
const { name, arguments: args } = msg.params;
|
|
1135
|
-
const result = await fetchJSON('/api/cortex/mcp/call', {
|
|
1136
|
-
method: 'POST',
|
|
1137
|
-
body: JSON.stringify({ name, args }),
|
|
1138
|
-
});
|
|
1139
|
-
respond(msg.id, result || { content: [{ type: 'text', text: 'Error calling tool' }], isError: true });
|
|
1140
|
-
return;
|
|
1141
|
-
}
|
|
1142
|
-
|
|
1143
|
-
// Unknown method
|
|
1144
|
-
respond(msg.id, null, { code: -32601, message: `Method not found: ${msg.method}` });
|
|
1145
|
-
});
|
|
1146
|
-
|
|
1147
|
-
function respond(id, result, error) {
|
|
1148
|
-
const msg = { jsonrpc: '2.0', id };
|
|
1149
|
-
if (error) msg.error = error;
|
|
1150
|
-
else msg.result = result;
|
|
1151
|
-
process.stdout.write(JSON.stringify(msg) + '\n');
|
|
1152
|
-
}
|
|
1153
|
-
|
|
1154
|
-
async function fetchJSON(path, opts = {}) {
|
|
1155
|
-
try {
|
|
1156
|
-
const headers = { 'Content-Type': 'application/json' };
|
|
1157
|
-
if (INTERNAL_TOKEN) headers['x-spaces-internal'] = INTERNAL_TOKEN;
|
|
1158
|
-
const res = await fetch(`${SPACES_URL}${path}`, { ...opts, headers: { ...headers, ...opts.headers } });
|
|
1159
|
-
if (res.ok) return await res.json();
|
|
1160
|
-
return null;
|
|
1161
|
-
} catch { return null; }
|
|
1162
|
-
}
|
|
1163
|
-
```
|
|
1164
|
-
|
|
1165
|
-
- [ ] **Step 2: Create MCP API routes**
|
|
1166
|
-
|
|
1167
|
-
Create `src/app/api/cortex/mcp/tools/route.ts`:
|
|
1168
|
-
|
|
1169
|
-
```typescript
|
|
1170
|
-
import { NextResponse } from 'next/server';
|
|
1171
|
-
import { CORTEX_TOOLS } from '@/lib/cortex/mcp/server';
|
|
1172
|
-
|
|
1173
|
-
export async function GET() {
|
|
1174
|
-
return NextResponse.json(CORTEX_TOOLS);
|
|
1175
|
-
}
|
|
1176
|
-
```
|
|
1177
|
-
|
|
1178
|
-
Create `src/app/api/cortex/mcp/call/route.ts`:
|
|
1179
|
-
|
|
1180
|
-
```typescript
|
|
1181
|
-
import { NextResponse } from 'next/server';
|
|
1182
|
-
import type { NextRequest } from 'next/server';
|
|
1183
|
-
import { getCortex } from '@/lib/cortex';
|
|
1184
|
-
import { handleToolCall } from '@/lib/cortex/mcp/server';
|
|
1185
|
-
|
|
1186
|
-
export async function POST(request: NextRequest) {
|
|
1187
|
-
const { name, args } = await request.json();
|
|
1188
|
-
const cortex = await getCortex();
|
|
1189
|
-
const result = await handleToolCall(name, args || {}, cortex);
|
|
1190
|
-
return NextResponse.json(result);
|
|
1191
|
-
}
|
|
1192
|
-
```
|
|
1193
|
-
|
|
1194
|
-
- [ ] **Step 3: Register MCP server in `writeCortexHookConfig`**
|
|
1195
|
-
|
|
1196
|
-
In `bin/terminal-server.js`, in the `writeCortexHookConfig` function (~line 487), after the hooks setup, add MCP server registration:
|
|
1197
|
-
|
|
1198
|
-
```javascript
|
|
1199
|
-
// Register Cortex MCP server
|
|
1200
|
-
const mcpServer = path.resolve(__dirname, 'cortex-mcp.js');
|
|
1201
|
-
if (!settings.mcpServers) settings.mcpServers = {};
|
|
1202
|
-
settings.mcpServers.cortex = {
|
|
1203
|
-
command: 'node',
|
|
1204
|
-
args: [mcpServer],
|
|
1205
|
-
env: {
|
|
1206
|
-
SPACES_URL: `http://localhost:${httpPort || 3457}`,
|
|
1207
|
-
SPACES_INTERNAL_TOKEN: (process.env.SPACES_SESSION_SECRET || '').slice(0, 16),
|
|
1208
|
-
},
|
|
1209
|
-
};
|
|
1210
|
-
```
|
|
1211
|
-
|
|
1212
|
-
The `httpPort` can be read from `httpServer.address().port` if available, or default to 3457.
|
|
1213
|
-
|
|
1214
|
-
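That fallback can be sketched as follows, assuming `httpServer` is a standard Node `http.Server`; note that `address()` can return a string or `null` for non-TCP transports, hence the object check:

```typescript
import type { Server } from 'http';
import type { AddressInfo } from 'net';

// Resolve the port the HTTP server is actually bound to, falling back to 3457.
function resolveHttpPort(httpServer?: Server | null): number {
  const addr = httpServer?.address();
  if (addr && typeof addr === 'object') return (addr as AddressInfo).port;
  return 3457;
}
```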
- [ ] **Step 4: Run full test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add bin/cortex-mcp.js src/app/api/cortex/mcp/ bin/terminal-server.js
git commit -m "feat(cortex): mount MCP server with stdio transport and API routes"
```

---

### Task 11: Fix client-side layer filtering bug

**Spec deviation note:** The spec suggests 3 separate requests (one per tab). Instead, we send 1 request with a `layer` query param based on the active tab. This is simpler (fewer concurrent requests) and results refresh on tab switch.

**Files:**
- Modify: `src/app/api/cortex/search/route.ts`
- Modify: `src/components/cortex/cortex-panel.tsx`

- [ ] **Step 1: Add `layer` param to search API**

In `src/app/api/cortex/search/route.ts`, read a `layer` param and pass it to both browse and search:

```typescript
const layer = url.searchParams.get('layer') as any;

// Browse mode
if (!query) {
  const layers = layer ? [layer] : ['personal', 'workspace', 'team'] as const;
  const results: any[] = [];
  for (const l of layers) {
    const items = await cortex.store.browse(l, limit);
    results.push(...items);
  }
  return NextResponse.json({ results: results.slice(0, limit) });
}

// Search mode
const [queryVector] = await cortex.embedding.embed([query]);
const results = await cortex.search.search(queryVector, {
  workspaceId: workspaceId ? parseInt(workspaceId, 10) : null,
  layers: layer ? [layer] : undefined,
  limit,
});
```

- [ ] **Step 2: Update panel to fetch per-layer**

In `src/components/cortex/cortex-panel.tsx`, replace `fetchBrowse` and `handleSearch` with a single `fetchResults` that includes the `layer` param based on `activeTab`:

```typescript
const fetchResults = useCallback(async (searchQuery?: string) => {
  setLoading(true);
  try {
    const params = new URLSearchParams({ limit: '20', layer: activeTab });
    if (searchQuery) params.set('q', searchQuery);
    const res = await fetch(api(`/api/cortex/search?${params}`));
    if (res.ok) {
      const data = await res.json();
      setResults(data.results || []);
    }
  } catch { /* ignore */ }
  setLoading(false);
}, [activeTab]);
```

Update the `useEffect` to include `activeTab` and call `fetchResults`:

```typescript
useEffect(() => {
  if (open) {
    fetchStats();
    fetchResults(query || undefined);
  }
}, [open, activeTab, fetchStats, fetchResults]);
```

Update `handleSearch` to call `fetchResults`:

```typescript
const handleSearch = () => {
  fetchResults(query.trim() || undefined);
};
```

Remove the client-side filter `const filtered = results.filter(r => r.layer === activeTab);` since `results` are already per-layer, and replace `filtered.map(unit =>` with `results.map(unit =>` in the render.

- [ ] **Step 3: Verify — run build or type check**

Run: `npx tsc --noEmit` (or `npm run build`)

- [ ] **Step 4: Commit**

```bash
git add src/app/api/cortex/search/route.ts src/components/cortex/cortex-panel.tsx
git commit -m "fix(cortex): query per-layer instead of client-side filtering"
```

---

### Task 12: Fix exporter hardcoded 384-dim vector

**Files:**
- Modify: `src/lib/cortex/portability/exporter.ts`

- [ ] **Step 1: Read the exporter to find the hardcoded line**

Look for `new Array(384)` in `exporter.ts`.

- [ ] **Step 2: Fix — accept dimensions parameter**

Update the export function signature to accept `dimensions: number` and replace the hardcoded array:

```typescript
const dummyVector = new Array(dimensions).fill(0);
```

Thread `dimensions` from the calling context (the MCP `cortex_export` handler and the API route both have access to `cortex.embedding.dimensions`).

- [ ] **Step 3: Update callers**

Update `src/app/api/cortex/export/route.ts` and `src/lib/cortex/mcp/server.ts` to pass `embedding.dimensions` to the export function.
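A self-contained sketch of the threading, where `exportUnits` is a stand-in name for the real exporter entry point (the actual signature may differ):

```typescript
// Stand-in for the exporter entry point in exporter.ts (name assumed).
function exportUnits(opts: { dimensions: number }): { dummyVector: number[] } {
  // Placeholder vector sized to the live embedding model instead of a hardcoded 384
  const dummyVector = new Array<number>(opts.dimensions).fill(0);
  return { dummyVector };
}

// Caller side (API route / MCP handler): pass the provider's dimensions through.
const embedding = { dimensions: 768 }; // e.g. a 768-dim model
const { dummyVector } = exportUnits({ dimensions: embedding.dimensions });
```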
- [ ] **Step 4: Run tests**

Run: `npx vitest run tests/lib/cortex/portability/exporter.test.ts`
Expected: ALL PASS (update test mocks if dimensions param is now required)

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/portability/exporter.ts src/app/api/cortex/export/ src/lib/cortex/mcp/server.ts
git commit -m "fix(cortex): use actual embedding dimensions in exporter instead of hardcoded 384"
```

---

## Final Verification

### Task 13: Full test suite + smoke test

- [ ] **Step 1: Run full cortex test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS

- [ ] **Step 2: Run type check**

Run: `npx tsc --noEmit`
Expected: No errors

- [ ] **Step 3: Build**

Run: `npm run build`
Expected: Build succeeds

- [ ] **Step 4: Commit any remaining fixes**

If any tests or build issues found, fix and commit.

- [ ] **Step 5: Final commit summarizing all changes**

If all individual commits are clean, no summary needed. Otherwise:

```bash
git log --oneline -15  # verify commit history looks clean
```
# Cortex Wiring Implementation Plan

> **For agentic workers:** REQUIRED: Use superpowers:subagent-driven-development (if subagents available) or superpowers:executing-plans to implement this plan. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Wire up 5 existing-but-unused Cortex subsystems (dedup, extractors, staleness, distillation, MCP server) and fix 2 bugs.

**Architecture:** Layered wiring in dependency order. Each layer is independently shippable. The pipeline gains dedup + classification before embedding. Search gains staleness scoring. Background distillation extracts structured knowledge via LLM. MCP server exposes Cortex to Claude Code as tools.

**Tech Stack:** TypeScript, Vitest, LanceDB, Node.js `fetch` (for LLM API calls), MCP stdio transport

**Spec:** `docs/superpowers/specs/2026-03-13-cortex-wiring-design.md`

---

## Chunk 1: Deduplication (Layer 1)

### Task 1: Add `textHash` to deduplicator

**Files:**
- Modify: `src/lib/cortex/ingestion/deduplicator.ts`
- Modify: `tests/lib/cortex/ingestion/deduplicator.test.ts`

- [ ] **Step 1: Write failing test for `textHash`**

In `tests/lib/cortex/ingestion/deduplicator.test.ts`, add:

```typescript
import { cosineSimilarity, isDuplicate, textHash } from '@/lib/cortex/ingestion/deduplicator';

describe('textHash', () => {
  it('returns consistent SHA-256 hex for same input', () => {
    const h1 = textHash('hello world');
    const h2 = textHash('hello world');
    expect(h1).toBe(h2);
    expect(h1).toMatch(/^[a-f0-9]{64}$/);
  });

  it('normalizes whitespace before hashing', () => {
    const h1 = textHash('hello world\n\n');
    const h2 = textHash('hello world');
    expect(h1).toBe(h2);
  });

  it('returns different hashes for different text', () => {
    expect(textHash('foo')).not.toBe(textHash('bar'));
  });
});
```

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/ingestion/deduplicator.test.ts`
Expected: FAIL — `textHash` is not exported

- [ ] **Step 3: Implement `textHash`**

In `src/lib/cortex/ingestion/deduplicator.ts`, add at the top:

```typescript
import { createHash } from 'crypto';

export function textHash(text: string): string {
  const normalized = text.replace(/\s+/g, ' ').trim();
  return createHash('sha256').update(normalized).digest('hex');
}
```

- [ ] **Step 4: Run test to verify it passes**

Run: `npx vitest run tests/lib/cortex/ingestion/deduplicator.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/ingestion/deduplicator.ts tests/lib/cortex/ingestion/deduplicator.test.ts
git commit -m "feat(cortex): add textHash for dedup hash check"
```

---

### Task 2: Add `updateAccessCount` to store

**Files:**
- Modify: `src/lib/cortex/store.ts`
- Modify: `tests/lib/cortex/store.test.ts`

- [ ] **Step 1: Write failing test**

In `tests/lib/cortex/store.test.ts`, add a test for `updateAccessCount`:

```typescript
it('updates access_count on a unit', async () => {
  // Add a unit first
  const unit = makeUnit({ id: 'access-test', access_count: 0 });
  await store.add('personal', unit);

  await store.updateAccessCount('personal', 'access-test');

  // Retrieve and check — use browse since we need to find by id
  const results = await store.browse('personal', 100);
  const found = results.find(r => r.id === 'access-test');
  expect(found).toBeDefined();
  expect(found!.access_count).toBe(1);
});
```

Note: adapt to the existing test file's setup patterns (mock store vs real LanceDB). If the test file uses mocks, test that the method exists and is callable. If it uses real LanceDB, test the actual update.

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/store.test.ts`
Expected: FAIL — `updateAccessCount` does not exist

- [ ] **Step 3: Implement `updateAccessCount`**

In `src/lib/cortex/store.ts`, add after the `delete` method (~line 163):

```typescript
async updateAccessCount(layerKey: string, id: string): Promise<void> {
  const conn = await this.getConnection(layerKey);
  const tableNames = await conn.tableNames();
  if (!tableNames.includes(TABLE_NAME)) return;

  const table = await conn.openTable(TABLE_NAME);
  const safeId = id.replace(/'/g, "''");
  // LanceDB doesn't support UPDATE; delete + re-add with bumped count.
  // Use query().where() instead of vectorSearch to avoid dimension dependency.
  const rows = await table.query()
    .where(`id = '${safeId}'`).limit(1).toArray();
  if (rows.length === 0) return;

  const row = rows[0];
  await table.delete(`id = '${safeId}'`);
  row.access_count = (row.access_count || 0) + 1;
  row.last_accessed = new Date().toISOString();
  await table.add([row]);
}
```

- [ ] **Step 4: Run test to verify it passes**

Run: `npx vitest run tests/lib/cortex/store.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/store.ts tests/lib/cortex/store.test.ts
git commit -m "feat(cortex): add updateAccessCount to store"
```
---

### Task 3: Wire dedup into the pipeline

**Files:**
- Modify: `src/lib/cortex/ingestion/pipeline.ts`
- Modify: `tests/lib/cortex/ingestion/pipeline.test.ts`

- [ ] **Step 1: Write failing test — duplicate is skipped**

In `tests/lib/cortex/ingestion/pipeline.test.ts`, first add `updateAccessCount` to the shared `mockStore` in the `beforeEach` block:

```typescript
// In the existing beforeEach, add to mockStore:
mockStore.updateAccessCount = vi.fn().mockResolvedValue(undefined);
```

Then add these tests:

```typescript
it('skips duplicate chunks (hash match)', async () => {
  const msg = { role: 'human', content: 'Add auth', timestamp: new Date().toISOString() };
  const msgs = [msg, { role: 'assistant', content: 'Done.', timestamp: new Date().toISOString() }];
  const ctx = { sessionId: 's1', workspaceId: 1, agentType: 'claude' as const, projectPath: '/p' };

  // Ingest twice with identical content
  await pipeline.ingest(msgs, ctx);
  await pipeline.ingest(msgs, ctx);

  // store.add should only be called once (second ingest is hash-deduped)
  expect(mockStore.add).toHaveBeenCalledTimes(1);
});

it('skips cosine-similar chunks and bumps access count', async () => {
  // store.search returns a near-match with L2 distance below threshold
  mockStore.search.mockResolvedValueOnce([{
    id: 'existing-1', text: 'similar text', _distance: 0.01,
    access_count: 0, confidence: 0.8,
  }]);

  const msgs = [
    { role: 'human', content: 'Slightly different auth', timestamp: new Date().toISOString() },
    { role: 'assistant', content: 'Done.', timestamp: new Date().toISOString() },
  ];
  const ctx = { sessionId: 's2', workspaceId: 1, agentType: 'claude' as const, projectPath: '/p' };

  await pipeline.ingest(msgs, ctx);

  expect(mockStore.updateAccessCount).toHaveBeenCalledWith(expect.any(String), 'existing-1');
  expect(mockStore.add).not.toHaveBeenCalled();
});
```

- [ ] **Step 2: Run tests to verify they fail**

Run: `npx vitest run tests/lib/cortex/ingestion/pipeline.test.ts`
Expected: FAIL — pipeline doesn't dedup yet

- [ ] **Step 3: Implement dedup in pipeline**

Modify `src/lib/cortex/ingestion/pipeline.ts`:

```typescript
import crypto from 'crypto';
import type { EmbeddingProvider } from '../embeddings';
import type { CortexStore } from '../store';
import type { KnowledgeUnit, RawChunk } from '../knowledge/types';
import { getConfidenceBase } from '../knowledge/types';
import { chunkMessages, type SessionMessage, type ChunkContext } from './chunker';
import { textHash } from './deduplicator';

export interface IngestionResult {
  chunksCreated: number;
  chunksEmbedded: number;
  chunksSkipped: number;
  errors: string[];
}

const COSINE_DEDUP_THRESHOLD = 0.05; // L2 distance < 0.05 = duplicate

export class IngestionPipeline {
  private hashSet = new Set<string>();

  constructor(
    private embedding: EmbeddingProvider,
    private store: CortexStore,
  ) {}

  async ingest(
    messages: SessionMessage[],
    ctx: ChunkContext,
  ): Promise<IngestionResult> {
    const result: IngestionResult = {
      chunksCreated: 0, chunksEmbedded: 0, chunksSkipped: 0, errors: [],
    };

    // Tier 1: Fast pass — chunk messages
    let chunks: RawChunk[];
    try {
      chunks = chunkMessages(messages, ctx);
    } catch (err) {
      result.errors.push(`Tier 1 error: ${err}`);
      return result;
    }
    result.chunksCreated = chunks.length;

    // Tier 2: Dedup, embed, and store
    const BATCH_SIZE = 50;
    for (let i = 0; i < chunks.length; i += BATCH_SIZE) {
      const batch = chunks.slice(i, i + BATCH_SIZE);

      // Phase 1: Hash dedup — filter out exact duplicates before embedding
      const novel: RawChunk[] = [];
      for (const chunk of batch) {
        const hash = textHash(chunk.text);
        if (this.hashSet.has(hash)) {
          result.chunksSkipped++;
        } else {
          this.hashSet.add(hash);
          novel.push(chunk);
        }
      }

      if (novel.length === 0) continue;

      try {
        const texts = novel.map(c => c.text);
        const vectors = await this.embedding.embed(texts);

        for (let j = 0; j < novel.length; j++) {
          const chunk = novel[j];
          const vector = vectors[j];
          const layerKey = chunk.layer === 'workspace' && chunk.workspace_id
            ? `workspace/${chunk.workspace_id}`
            : chunk.layer;

          // Phase 2: Cosine dedup — check store for near-matches
          try {
            const nearestResults = await this.store.search(layerKey, vector, 1);
            if (nearestResults.length > 0) {
              const nearest = nearestResults[0] as any;
              const distance = nearest._distance ?? 1;
              if (distance < COSINE_DEDUP_THRESHOLD) {
                await this.store.updateAccessCount(layerKey, nearest.id);
                result.chunksSkipped++;
                continue;
              }
            }
          } catch {
            // Store may be empty or table not created yet — proceed with add
          }

          const unit: KnowledgeUnit = {
            id: crypto.randomUUID(),
            vector,
            text: chunk.text,
            type: chunk.type,
            layer: chunk.layer,
            workspace_id: chunk.workspace_id,
            session_id: chunk.session_id,
            agent_type: chunk.agent_type,
            project_path: chunk.project_path,
            file_refs: chunk.file_refs,
            confidence: getConfidenceBase(chunk.type),
            created: new Date().toISOString(),
            source_timestamp: chunk.source_timestamp,
            stale_score: 0,
            access_count: 0,
            last_accessed: null,
            metadata: chunk.metadata,
          };

          await this.store.add(layerKey, unit);
          result.chunksEmbedded++;
        }
      } catch (err) {
        result.errors.push(`Tier 2 batch error: ${err}`);
        result.chunksSkipped += novel.length;
      }
    }

    return result;
  }
}
```

- [ ] **Step 4: Run tests to verify they pass**

Run: `npx vitest run tests/lib/cortex/ingestion/pipeline.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Run full cortex test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS (no regressions)

- [ ] **Step 6: Commit**

```bash
git add src/lib/cortex/ingestion/pipeline.ts tests/lib/cortex/ingestion/pipeline.test.ts
git commit -m "feat(cortex): wire dedup into ingestion pipeline (hash + cosine)"
```
---
|
|
358
|
+
|
|
359
|
+
## Chunk 2: Extractors (Layer 2)
|
|
360
|
+
|
|
361
|
+
### Task 4: Wire extractors into pipeline
|
|
362
|
+
|
|
363
|
+
**Files:**
|
|
364
|
+
- Modify: `src/lib/cortex/ingestion/pipeline.ts`
|
|
365
|
+
- Modify: `tests/lib/cortex/ingestion/pipeline.test.ts`
|
|
366
|
+
|
|
367
|
+
- [ ] **Step 1: Write failing test — error/fix detection**
|
|
368
|
+
|
|
369
|
+
In `tests/lib/cortex/ingestion/pipeline.test.ts`, add:
|
|
370
|
+
|
|
371
|
+
```typescript
|
|
372
|
+
it('classifies error/fix chunks via extractors', async () => {
|
|
373
|
+
const msgs = [
|
|
374
|
+
{ role: 'human', content: 'I got TypeError: cannot read undefined', timestamp: new Date().toISOString() },
|
|
375
|
+
{ role: 'assistant', content: 'Fixed by adding null check before access.', timestamp: new Date().toISOString() },
|
|
376
|
+
];
|
|
377
|
+
const ctx = { sessionId: 's3', workspaceId: 1, agentType: 'claude' as const, projectPath: '/p' };
|
|
378
|
+
|
|
379
|
+
await pipeline.ingest(msgs, ctx);
|
|
380
|
+
|
|
381
|
+
const addCall = mockStore.add.mock.calls[0];
|
|
382
|
+
const storedUnit = addCall[1];
|
|
383
|
+
    expect(storedUnit.type).toBe('error_fix');
  });

  it('classifies decision chunks via extractors', async () => {
    const msgs = [
      { role: 'human', content: 'Which framework?', timestamp: new Date().toISOString() },
      { role: 'assistant', content: 'We decided to use Next.js for the frontend because of SSR support.', timestamp: new Date().toISOString() },
    ];
    const ctx = { sessionId: 's4', workspaceId: 1, agentType: 'claude' as const, projectPath: '/p' };

    await pipeline.ingest(msgs, ctx);

    const addCall = mockStore.add.mock.calls[0];
    const storedUnit = addCall[1];
    expect(storedUnit.type).toBe('decision');
  });
```

- [ ] **Step 2: Run tests to verify they fail**

Run: `npx vitest run tests/lib/cortex/ingestion/pipeline.test.ts`
Expected: FAIL — chunks still typed as `conversation`

- [ ] **Step 3: Add extraction step to pipeline**

In `src/lib/cortex/ingestion/pipeline.ts`, add import at the top:

```typescript
import { detectErrorFixPairs, extractDecisionPatterns, extractCommands } from './extractors';
```

Add a private method to classify chunks. Call it after `chunkMessages()` but before the Tier 2 loop:

```typescript
/** Enrich chunk types using regex extractors. Mutates chunks in place. */
private classifyChunks(chunks: RawChunk[]): void {
  for (const chunk of chunks) {
    const errorFixes = detectErrorFixPairs(chunk.text);
    const decisions = extractDecisionPatterns(chunk.text);
    const commands = extractCommands(chunk.text);

    // Priority: decision > error_fix > conversation (default)
    if (decisions.length > 0) {
      chunk.type = 'decision';
      chunk.metadata.decisions = decisions;
    } else if (errorFixes.length > 0) {
      chunk.type = 'error_fix';
      chunk.metadata.error_fixes = errorFixes;
    }

    if (commands.length > 0) {
      chunk.metadata.commands = commands;
    }
  }
}
```

In the `ingest` method, call it after chunking (after `result.chunksCreated = chunks.length;`):

```typescript
// Tier 1.5: Classify chunks via regex extractors
this.classifyChunks(chunks);
```

- [ ] **Step 4: Run tests to verify they pass**

Run: `npx vitest run tests/lib/cortex/ingestion/pipeline.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Run full cortex test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS

- [ ] **Step 6: Commit**

```bash
git add src/lib/cortex/ingestion/pipeline.ts tests/lib/cortex/ingestion/pipeline.test.ts
git commit -m "feat(cortex): wire extractors into pipeline for chunk classification"
```

---

## Chunk 3: Staleness (Layer 3)

### Task 5: Add staleness scoring to search

**Files:**
- Modify: `src/lib/cortex/retrieval/search.ts`
- Modify: `tests/lib/cortex/retrieval/search.test.ts`

- [ ] **Step 1: Write failing test**

In `tests/lib/cortex/retrieval/search.test.ts`, add a test that verifies stale results score lower than fresh results. The test should mock two results: one with file refs pointing to a recently modified file, one without. Check that the stale one ranks lower.

```typescript
it('reduces score for stale results (modified file refs)', async () => {
  // Mock store.search returning two results
  // Result A: has file_refs ['src/auth.ts'], source_timestamp = 30 days ago
  // Result B: no file_refs, same similarity
  // After staleness, A should rank lower than B

  // Provide a fileStatFn that returns a recent mtime for src/auth.ts
  const thirtyDaysAgo = new Date(Date.now() - 30 * 86400000).toISOString();
  const yesterday = new Date(Date.now() - 86400000).toISOString();

  // Note: store.search() returns deserialized data (file_refs as array, metadata as object)
  mockStore.search.mockResolvedValue([
    { id: 'a', text: 'auth', file_refs: ['src/auth.ts'], source_timestamp: thirtyDaysAgo, confidence: 0.8, stale_score: 0, created: thirtyDaysAgo, _distance: 0.1, access_count: 0, last_accessed: null, metadata: {}, type: 'decision', layer: 'personal', workspace_id: null, session_id: null, agent_type: 'claude', project_path: '/project' },
    { id: 'b', text: 'other', file_refs: [], source_timestamp: thirtyDaysAgo, confidence: 0.8, stale_score: 0, created: thirtyDaysAgo, _distance: 0.1, access_count: 0, last_accessed: null, metadata: {}, type: 'decision', layer: 'personal', workspace_id: null, session_id: null, agent_type: 'claude', project_path: '/project' },
  ]);

  const search = new CortexSearch(mockStore, {
    fileStat: async (filepath: string) => {
      if (filepath.includes('auth.ts')) return { mtime: new Date(yesterday) };
      return null;
    },
  });

  const results = await search.search([0.1, 0.2, 0.3], { limit: 2 });
  // Result B should rank higher (not stale)
  expect(results[0].id).toBe('b');
  expect(results[1].id).toBe('a');
  expect(results[1].stale_score).toBeGreaterThan(0);
});
```

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/retrieval/search.test.ts`
Expected: FAIL — `CortexSearch` doesn't accept `fileStat` option

- [ ] **Step 3: Add staleness computation to search**

Modify `src/lib/cortex/retrieval/search.ts`:

```typescript
import { computeFileStaleScore } from '../knowledge/staleness';
import fs from 'fs';
import path from 'path';

export interface SearchDeps {
  /** Optional fs.stat wrapper — injectable for testing. */
  fileStat?: (filepath: string) => Promise<{ mtime: Date } | null>;
}

export class CortexSearch {
  private fileStat: (filepath: string) => Promise<{ mtime: Date } | null>;

  constructor(private store: CortexStore, deps: SearchDeps = {}) {
    this.fileStat = deps.fileStat ?? defaultFileStat;
  }
```

Add a helper and update the search loop to compute staleness on the top results:

```typescript
private async computeStaleness(
  unit: KnowledgeUnit,
  statFn?: (fp: string) => Promise<{ mtime: Date } | null>,
): Promise<number> {
  if (unit.file_refs.length === 0) return 0;
  const doStat = statFn ?? this.fileStat;

  const fileModTimes: Record<string, string> = {};
  for (const ref of unit.file_refs) {
    const fullPath = unit.project_path ? path.join(unit.project_path, ref) : ref;
    try {
      const stat = await doStat(fullPath);
      if (stat) fileModTimes[ref] = stat.mtime.toISOString();
    } catch { /* file doesn't exist or not accessible */ }
  }

  return computeFileStaleScore({
    fileRefs: unit.file_refs,
    sourceTimestamp: unit.source_timestamp,
    fileModTimes,
  });
}
```

In the search method, after collecting `allResults` and before the final sort, compute staleness on the top candidates:

```typescript
// Compute staleness on top candidates (cache stat calls per search request)
const candidates = allResults
  .sort((a, b) => b.relevance_score - a.relevance_score)
  .slice(0, limit * 2);

const statCache = new Map<string, { mtime: Date } | null>();
const cachedFileStat = async (fp: string) => {
  if (!statCache.has(fp)) statCache.set(fp, await this.fileStat(fp));
  return statCache.get(fp)!;
};

for (const result of candidates) {
  const staleScore = await this.computeStaleness(result, cachedFileStat);
  if (staleScore > 0) {
    result.stale_score = staleScore;
    // Recompute relevance with staleness
    result.relevance_score = computeRelevanceScore({
      similarity: result.similarity,
      confidence: result.confidence,
      stale_score: staleScore,
      created: result.created,
    }) * (LAYER_WEIGHTS[result.layer] ?? 0.5);
  }
}

candidates.sort((a, b) => b.relevance_score - a.relevance_score);
return candidates.slice(0, limit);
```

Add the default `fileStat`:

```typescript
async function defaultFileStat(filepath: string): Promise<{ mtime: Date } | null> {
  try {
    const stat = await fs.promises.stat(filepath);
    return { mtime: stat.mtime };
  } catch {
    return null;
  }
}
```

- [ ] **Step 4: Update CortexSearch instantiation in `index.ts`**

In `src/lib/cortex/index.ts` (line 43), no change needed — the `SearchDeps` param is optional and defaults to real `fs.stat`.

- [ ] **Step 5: Run tests to verify they pass**

Run: `npx vitest run tests/lib/cortex/retrieval/search.test.ts`
Expected: ALL PASS

Note: existing tests create `CortexSearch(store)` without deps — that's fine since deps is optional.

- [ ] **Step 6: Commit**

```bash
git add src/lib/cortex/retrieval/search.ts tests/lib/cortex/retrieval/search.test.ts
git commit -m "feat(cortex): wire staleness scoring into search"
```

---

### Task 6: Add staleness badge to knowledge card

**Files:**
- Modify: `src/components/cortex/knowledge-card.tsx`

- [ ] **Step 1: Add staleness indicator**

In `src/components/cortex/knowledge-card.tsx`, update the `KnowledgeCardProps` interface to include `stale_score`:

```typescript
interface KnowledgeCardProps {
  unit: {
    id: string;
    text: string;
    type: string;
    confidence: number;
    created: string;
    session_id?: string | null;
    layer: string;
    stale_score?: number;
  };
  onDelete?: (id: string) => void;
}
```

Add the amber badge after the type badge (inside the flex row at line 43-46):

```tsx
{(unit.stale_score ?? 0) > 0.3 && (
  <span
    className="text-[10px] px-1.5 py-0.5 rounded font-medium bg-amber-500/20 text-amber-400"
    title="Referenced files have changed since this was learned"
  >
    stale
  </span>
)}
```

- [ ] **Step 2: Verify visually** — No automated test. Check by opening the Cortex panel in the browser if available. Otherwise verify the component compiles:

Run: `npx tsc --noEmit src/components/cortex/knowledge-card.tsx` or just run the full build check. (Note: passing a file directly to `tsc` ignores `tsconfig.json`, so the full build check is the more reliable option for JSX and path aliases.)

- [ ] **Step 3: Commit**

```bash
git add src/components/cortex/knowledge-card.tsx
git commit -m "feat(cortex): add staleness badge to knowledge card"
```

---

## Chunk 4: Distillation (Layer 4)

### Task 7: Create `callLLM` implementation

**Files:**
- Create: `src/lib/cortex/distillation/llm.ts`
- Create: `tests/lib/cortex/distillation/llm.test.ts`

- [ ] **Step 1: Write failing test**

Create `tests/lib/cortex/distillation/llm.test.ts`:

```typescript
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import { createCallLLM, detectLLMProvider } from '@/lib/cortex/distillation/llm';

describe('detectLLMProvider', () => {
  const originalEnv = process.env;

  beforeEach(() => { process.env = { ...originalEnv }; });
  afterEach(() => { process.env = originalEnv; });

  it('detects anthropic when ANTHROPIC_API_KEY is set', () => {
    process.env.ANTHROPIC_API_KEY = 'sk-ant-test';
    expect(detectLLMProvider()).toBe('anthropic');
  });

  it('detects openai when OPENAI_API_KEY is set', () => {
    delete process.env.ANTHROPIC_API_KEY;
    process.env.OPENAI_API_KEY = 'sk-test';
    expect(detectLLMProvider()).toBe('openai');
  });

  it('returns null when no keys are set', () => {
    delete process.env.ANTHROPIC_API_KEY;
    delete process.env.OPENAI_API_KEY;
    expect(detectLLMProvider()).toBeNull();
  });
});

describe('createCallLLM', () => {
  // Restore env here too, so deleting keys doesn't leak into other suites
  const originalEnv = process.env;

  beforeEach(() => { process.env = { ...originalEnv }; });
  afterEach(() => { process.env = originalEnv; });

  it('returns null when no provider available', () => {
    delete process.env.ANTHROPIC_API_KEY;
    delete process.env.OPENAI_API_KEY;
    expect(createCallLLM()).toBeNull();
  });
});
```

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/distillation/llm.test.ts`
Expected: FAIL — module doesn't exist

- [ ] **Step 3: Implement `llm.ts`**

Create `src/lib/cortex/distillation/llm.ts`:

```typescript
type LLMProvider = 'anthropic' | 'openai';

export function detectLLMProvider(): LLMProvider | null {
  if (process.env.ANTHROPIC_API_KEY) return 'anthropic';
  if (process.env.OPENAI_API_KEY) return 'openai';
  return null;
}

export function createCallLLM(): ((system: string, user: string) => Promise<string>) | null {
  const provider = detectLLMProvider();
  if (!provider) return null;

  if (provider === 'anthropic') {
    return async (system: string, user: string) => {
      const res = await fetch('https://api.anthropic.com/v1/messages', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'x-api-key': process.env.ANTHROPIC_API_KEY!,
          'anthropic-version': '2023-06-01',
        },
        body: JSON.stringify({
          model: 'claude-haiku-4-5-20251001',
          max_tokens: 2048,
          system,
          messages: [{ role: 'user', content: user }],
        }),
      });
      if (!res.ok) throw new Error(`Anthropic API error: ${res.status}`);
      const data = await res.json();
      return data.content?.[0]?.text ?? '';
    };
  }

  return async (system: string, user: string) => {
    const res = await fetch('https://api.openai.com/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`,
      },
      body: JSON.stringify({
        model: 'gpt-4o-mini',
        messages: [
          { role: 'system', content: system },
          { role: 'user', content: user },
        ],
        max_tokens: 2048,
      }),
    });
    if (!res.ok) throw new Error(`OpenAI API error: ${res.status}`);
    const data = await res.json();
    return data.choices?.[0]?.message?.content ?? '';
  };
}
```

- [ ] **Step 4: Run test to verify it passes**

Run: `npx vitest run tests/lib/cortex/distillation/llm.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/distillation/llm.ts tests/lib/cortex/distillation/llm.test.ts
git commit -m "feat(cortex): add callLLM with auto-detect for Haiku/GPT-4o-mini"
```

---

### Task 8: Create distillation queue

**Files:**
- Create: `src/lib/cortex/distillation/queue.ts`
- Create: `tests/lib/cortex/distillation/queue.test.ts`

- [ ] **Step 1: Write failing test**

Create `tests/lib/cortex/distillation/queue.test.ts`:

```typescript
import { describe, it, expect, beforeEach, afterEach } from 'vitest';
import { DistillationQueue } from '@/lib/cortex/distillation/queue';
import fs from 'fs';
import path from 'path';
import os from 'os';

describe('DistillationQueue', () => {
  let tmpDir: string;
  let queue: DistillationQueue;

  beforeEach(() => {
    tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'cortex-queue-'));
    queue = new DistillationQueue(tmpDir);
  });

  afterEach(() => { fs.rmSync(tmpDir, { recursive: true, force: true }); });

  const entry = { text: 'chunk text 1', layerKey: 'personal', workspaceId: null, agentType: 'claude' };

  it('enqueues and retrieves chunks by id', () => {
    queue.enqueue('id1', entry);
    queue.enqueue('id2', { ...entry, text: 'chunk text 2' });

    const texts = queue.getTexts(['id1', 'id2']);
    expect(texts).toEqual(['chunk text 1', 'chunk text 2']);
  });

  it('getEntries returns full context', () => {
    queue.enqueue('id1', { text: 'ws text', layerKey: 'workspace/5', workspaceId: 5, agentType: 'claude' });
    const entries = queue.getEntries(['id1']);
    expect(entries[0].layerKey).toBe('workspace/5');
    expect(entries[0].workspaceId).toBe(5);
  });

  it('removes processed entries', () => {
    queue.enqueue('id1', entry);
    queue.remove(['id1']);
    expect(queue.getTexts(['id1'])).toEqual([]);
  });

  it('persists to disk and recovers', () => {
    queue.enqueue('id1', entry);

    const queue2 = new DistillationQueue(tmpDir);
    expect(queue2.pendingIds()).toEqual(['id1']);
    expect(queue2.getTexts(['id1'])).toEqual(['chunk text 1']);
  });
});
```

- [ ] **Step 2: Run test to verify it fails**

Run: `npx vitest run tests/lib/cortex/distillation/queue.test.ts`
Expected: FAIL — module doesn't exist

- [ ] **Step 3: Implement queue**

Create `src/lib/cortex/distillation/queue.ts`:

```typescript
import fs from 'fs';
import path from 'path';

const QUEUE_FILE = 'distill-queue.json';

interface QueueEntry {
  text: string;
  layerKey: string;
  workspaceId: number | null;
  agentType: string;
}

/** Simple file-backed queue mapping chunk IDs to their text + context. */
export class DistillationQueue {
  private data: Record<string, QueueEntry> = {};
  private filePath: string;

  constructor(cortexDir: string) {
    this.filePath = path.join(cortexDir, QUEUE_FILE);
    this.load();
  }

  private load(): void {
    try {
      if (fs.existsSync(this.filePath)) {
        this.data = JSON.parse(fs.readFileSync(this.filePath, 'utf-8'));
      }
    } catch { this.data = {}; }
  }

  private save(): void {
    fs.writeFileSync(this.filePath, JSON.stringify(this.data));
  }

  enqueue(id: string, entry: QueueEntry): void {
    this.data[id] = entry;
    this.save();
  }

  getEntries(ids: string[]): QueueEntry[] {
    return ids.map(id => this.data[id]).filter((e): e is QueueEntry => e !== undefined);
  }

  getTexts(ids: string[]): string[] {
    return this.getEntries(ids).map(e => e.text);
  }

  remove(ids: string[]): void {
    for (const id of ids) delete this.data[id];
    this.save();
  }

  pendingIds(): string[] {
    return Object.keys(this.data);
  }
}
```

- [ ] **Step 4: Run test to verify it passes**

Run: `npx vitest run tests/lib/cortex/distillation/queue.test.ts`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/distillation/queue.ts tests/lib/cortex/distillation/queue.test.ts
git commit -m "feat(cortex): add file-backed distillation queue"
```

---

### Task 9: Wire distillation into Cortex singleton

**Files:**
- Modify: `src/lib/cortex/index.ts`
- Modify: `src/lib/cortex/ingestion/pipeline.ts`

- [ ] **Step 1: Add distillation wiring to `index.ts`**

In `src/lib/cortex/index.ts`, add imports:

```typescript
import { Distiller } from './distillation/distiller';
import { DistillationScheduler } from './distillation/scheduler';
import { DistillationQueue } from './distillation/queue';
import { createCallLLM } from './distillation/llm';
```

Update the `CortexInstance` interface to include the queue and scheduler:

```typescript
export interface CortexInstance {
  config: CortexConfig;
  store: CortexStore;
  search: CortexSearch;
  pipeline: IngestionPipeline;
  embedding: EmbeddingProvider;
  sync?: FederationSync;
  distillQueue?: DistillationQueue;
  distillScheduler?: DistillationScheduler;
}
```

In `getCortex()`, after creating the pipeline, wire distillation:

```typescript
// Initialize distillation if enabled and LLM provider available
let distillQueue: DistillationQueue | undefined;
let distillScheduler: DistillationScheduler | undefined;

if (config.ingestion.distillation) {
  const callLLM = createCallLLM();
  if (callLLM) {
    distillQueue = new DistillationQueue(cortexDir);
    const distiller = new Distiller(store, embedding, callLLM);

    distillScheduler = new DistillationScheduler(async (chunkIds) => {
      const entries = distillQueue!.getEntries(chunkIds);
      if (entries.length === 0) return;

      // Group by layerKey so workspace chunks go to the correct layer
      const byLayer = new Map<string, { texts: string[]; ctx: { workspaceId: number | null; agentType: string } }>();
      for (const e of entries) {
        if (!byLayer.has(e.layerKey)) {
          byLayer.set(e.layerKey, { texts: [], ctx: { workspaceId: e.workspaceId, agentType: e.agentType } });
        }
        byLayer.get(e.layerKey)!.texts.push(e.text);
      }

      for (const [layerKey, { texts, ctx }] of byLayer) {
        await distiller.distill(texts, layerKey, ctx);
      }
      distillQueue!.remove(chunkIds);
    });

    // Re-enqueue any pending items from previous session
    const pendingIds = distillQueue.pendingIds();
    if (pendingIds.length > 0) {
      distillScheduler.enqueue(pendingIds);
    }
  }
}

const instance: CortexInstance = {
  config, store, search, pipeline, embedding,
  distillQueue, distillScheduler,
};
```

- [ ] **Step 2: Add distillation enqueue to pipeline**

In `src/lib/cortex/ingestion/pipeline.ts`, add an optional queue property:

```typescript
import type { DistillationQueue } from '../distillation/queue';
import type { DistillationScheduler } from '../distillation/scheduler';

const DISTILLABLE_TYPES = new Set(['decision', 'error_fix']);

export class IngestionPipeline {
  private hashSet = new Set<string>();
  distillQueue?: DistillationQueue;
  distillScheduler?: DistillationScheduler;
  // ... existing constructor
```

After `await this.store.add(layerKey, unit);` and `result.chunksEmbedded++;`, add:

```typescript
// Enqueue for distillation if the type qualifies
if (this.distillQueue && this.distillScheduler && DISTILLABLE_TYPES.has(unit.type)) {
  this.distillQueue.enqueue(unit.id, {
    text: unit.text,
    layerKey,
    workspaceId: unit.workspace_id,
    agentType: unit.agent_type,
  });
  this.distillScheduler.enqueue([unit.id]);
}
```

Back in `index.ts`, wire the queue and scheduler into the pipeline:

```typescript
pipeline.distillQueue = distillQueue;
pipeline.distillScheduler = distillScheduler;
```

- [ ] **Step 3: Run full cortex test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS

- [ ] **Step 4: Commit**

```bash
git add src/lib/cortex/index.ts src/lib/cortex/ingestion/pipeline.ts
git commit -m "feat(cortex): wire distillation scheduler into singleton and pipeline"
```

---

## Chunk 5: MCP Server (Layer 5) + Bug Fixes

### Task 10: Create MCP server entry point

**Spec deviation note:** The spec says to adapt `mcp/server.ts` into an HTTP client wrapper. Instead, we leave `mcp/server.ts` unchanged and create thin API routes (`/api/cortex/mcp/tools` and `/api/cortex/mcp/call`) that import from it. The stdio entry point (`bin/cortex-mcp.js`) proxies to these routes. This is a cleaner separation — `server.ts` stays as the canonical tool handler, usable both server-side and via HTTP.

**Files:**
- Create: `bin/cortex-mcp.js`
- Create: `src/app/api/cortex/mcp/tools/route.ts`
- Create: `src/app/api/cortex/mcp/call/route.ts`
- Modify: `bin/terminal-server.js:471-519`

- [ ] **Step 1: Create `bin/cortex-mcp.js`**

This is a stdio MCP server that proxies to the Spaces HTTP API. It reads `SPACES_URL` from the environment (set during registration).

```javascript
#!/usr/bin/env node
'use strict';

const readline = require('readline');

const SPACES_URL = process.env.SPACES_URL || 'http://localhost:3457';
const INTERNAL_TOKEN = process.env.SPACES_INTERNAL_TOKEN || '';

// MCP stdio transport: read JSON-RPC from stdin, write to stdout
const rl = readline.createInterface({ input: process.stdin });

rl.on('line', async (line) => {
  let msg;
  try { msg = JSON.parse(line); } catch { return; }

  if (msg.method === 'initialize') {
    respond(msg.id, {
      protocolVersion: '2024-11-05',
      capabilities: { tools: { listChanged: false } },
      serverInfo: { name: 'cortex', version: '1.0.0' },
    });
    return;
  }

  if (msg.method === 'notifications/initialized') return; // no response needed

  if (msg.method === 'tools/list') {
    const tools = await fetchJSON('/api/cortex/mcp/tools');
    respond(msg.id, { tools: tools || [] });
    return;
  }

  if (msg.method === 'tools/call') {
    const { name, arguments: args } = msg.params;
    const result = await fetchJSON('/api/cortex/mcp/call', {
      method: 'POST',
      body: JSON.stringify({ name, args }),
    });
    respond(msg.id, result || { content: [{ type: 'text', text: 'Error calling tool' }], isError: true });
    return;
  }

  // Unknown method
  respond(msg.id, null, { code: -32601, message: `Method not found: ${msg.method}` });
});

function respond(id, result, error) {
  const msg = { jsonrpc: '2.0', id };
  if (error) msg.error = error;
  else msg.result = result;
  process.stdout.write(JSON.stringify(msg) + '\n');
}

async function fetchJSON(path, opts = {}) {
  try {
    const headers = { 'Content-Type': 'application/json' };
    if (INTERNAL_TOKEN) headers['x-spaces-internal'] = INTERNAL_TOKEN;
    const res = await fetch(`${SPACES_URL}${path}`, { ...opts, headers: { ...headers, ...opts.headers } });
    if (res.ok) return await res.json();
    return null;
  } catch { return null; }
}
```

|
|
1165
|
+
- [ ] **Step 2: Create MCP API routes**
|
|
1166
|
+
|
|
1167
|
+
Create `src/app/api/cortex/mcp/tools/route.ts`:
|
|
1168
|
+
|
|
1169
|
+
```typescript
|
|
1170
|
+
import { NextResponse } from 'next/server';
|
|
1171
|
+
import { CORTEX_TOOLS } from '@/lib/cortex/mcp/server';
|
|
1172
|
+
|
|
1173
|
+
export async function GET() {
|
|
1174
|
+
return NextResponse.json(CORTEX_TOOLS);
|
|
1175
|
+
}
|
|
1176
|
+
```
|
|
1177
|
+
|
|
1178
|
+
Create `src/app/api/cortex/mcp/call/route.ts`:
|
|
1179
|
+
|
|
1180
|
+
```typescript
|
|
1181
|
+
import { NextResponse } from 'next/server';
|
|
1182
|
+
import type { NextRequest } from 'next/server';
|
|
1183
|
+
import { getCortex } from '@/lib/cortex';
|
|
1184
|
+
import { handleToolCall } from '@/lib/cortex/mcp/server';
|
|
1185
|
+
|
|
1186
|
+
export async function POST(request: NextRequest) {
|
|
1187
|
+
const { name, args } = await request.json();
|
|
1188
|
+
const cortex = await getCortex();
|
|
1189
|
+
const result = await handleToolCall(name, args || {}, cortex);
|
|
1190
|
+
return NextResponse.json(result);
|
|
1191
|
+
}
|
|
1192
|
+
```
|
|
1193
|
+
|
|
1194
|
+
- [ ] **Step 3: Register MCP server in `writeCortexHookConfig`**
|
|
1195
|
+
|
|
1196
|
+
In `bin/terminal-server.js`, in the `writeCortexHookConfig` function (~line 487), after the hooks setup, add MCP server registration:
|
|
1197
|
+
|
|
1198
|
+
```javascript
|
|
1199
|
+
// Register Cortex MCP server
|
|
1200
|
+
const mcpServer = path.resolve(__dirname, 'cortex-mcp.js');
|
|
1201
|
+
if (!settings.mcpServers) settings.mcpServers = {};
|
|
1202
|
+
settings.mcpServers.cortex = {
|
|
1203
|
+
command: 'node',
|
|
1204
|
+
args: [mcpServer],
|
|
1205
|
+
env: {
|
|
1206
|
+
SPACES_URL: `http://localhost:${httpPort || 3457}`,
|
|
1207
|
+
SPACES_INTERNAL_TOKEN: (process.env.SPACES_SESSION_SECRET || '').slice(0, 16),
|
|
1208
|
+
},
|
|
1209
|
+
};
|
|
1210
|
+
```
|
|
1211
|
+
|
|
1212
|
+
The `httpPort` can be read from `httpServer.address().port` if available, or default to 3457.

- [ ] **Step 4: Run full test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS

- [ ] **Step 5: Commit**

```bash
git add bin/cortex-mcp.js src/app/api/cortex/mcp/ bin/terminal-server.js
git commit -m "feat(cortex): mount MCP server with stdio transport and API routes"
```

---

### Task 11: Fix client-side layer filtering bug

**Spec deviation note:** The spec suggests 3 separate requests (one per tab). Instead, we send 1 request with a `layer` query param based on the active tab. This is simpler (fewer concurrent requests), and results refresh on tab switch.

**Files:**
- Modify: `src/app/api/cortex/search/route.ts`
- Modify: `src/components/cortex/cortex-panel.tsx`

- [ ] **Step 1: Add `layer` param to search API**

In `src/app/api/cortex/search/route.ts`, read a `layer` param and pass it to both browse and search:

```typescript
const layer = url.searchParams.get('layer') as any;

// Browse mode
if (!query) {
  const layers = layer ? [layer] : (['personal', 'workspace', 'team'] as const);
  const results: any[] = [];
  for (const l of layers) {
    const items = await cortex.store.browse(l, limit);
    results.push(...items);
  }
  return NextResponse.json({ results: results.slice(0, limit) });
}

// Search mode
const [queryVector] = await cortex.embedding.embed([query]);
const results = await cortex.search.search(queryVector, {
  workspaceId: workspaceId ? parseInt(workspaceId, 10) : null,
  layers: layer ? [layer] : undefined,
  limit,
});
```
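The `as any` cast accepts whatever string arrives in the query; if stricter typing is wanted, a small guard can narrow it instead (a sketch only; `CortexLayer` is an assumed name for the store's layer union):

```typescript
type CortexLayer = 'personal' | 'workspace' | 'team';

const VALID_LAYERS: readonly string[] = ['personal', 'workspace', 'team'];

// Narrow an untrusted query-string value to a known layer, or null.
function parseLayer(raw: string | null): CortexLayer | null {
  return raw !== null && VALID_LAYERS.includes(raw) ? (raw as CortexLayer) : null;
}
```

Then `const layer = parseLayer(url.searchParams.get('layer'));` keeps unknown values from ever reaching the store.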

- [ ] **Step 2: Update panel to fetch per-layer**

In `src/components/cortex/cortex-panel.tsx`, replace `fetchBrowse` and `handleSearch` with a single `fetchResults` that includes the `layer` param based on `activeTab`:

```typescript
const fetchResults = useCallback(async (searchQuery?: string) => {
  setLoading(true);
  try {
    const params = new URLSearchParams({ limit: '20', layer: activeTab });
    if (searchQuery) params.set('q', searchQuery);
    const res = await fetch(api(`/api/cortex/search?${params}`));
    if (res.ok) {
      const data = await res.json();
      setResults(data.results || []);
    }
  } catch { /* ignore */ }
  setLoading(false);
}, [activeTab]);
```

Update the `useEffect` to include `activeTab` and call `fetchResults`:

```typescript
useEffect(() => {
  if (open) {
    fetchStats();
    fetchResults(query || undefined);
  }
}, [open, activeTab, fetchStats, fetchResults]);
```

Update `handleSearch` to call `fetchResults`:

```typescript
const handleSearch = () => {
  fetchResults(query.trim() || undefined);
};
```

Remove `const filtered = results.filter(r => r.layer === activeTab);` and use `results` directly in the render: the API now returns per-layer results, so replace `filtered.map(unit =>` with `results.map(unit =>`.

- [ ] **Step 3: Verify with a type check or build**

Run: `npx tsc --noEmit` (or `npm run build`)

- [ ] **Step 4: Commit**

```bash
git add src/app/api/cortex/search/route.ts src/components/cortex/cortex-panel.tsx
git commit -m "fix(cortex): query per-layer instead of client-side filtering"
```

---

### Task 12: Fix exporter hardcoded 384-dim vector

**Files:**
- Modify: `src/lib/cortex/portability/exporter.ts`

- [ ] **Step 1: Read the exporter to find the hardcoded line**

Look for `new Array(384)` in `exporter.ts`.

- [ ] **Step 2: Accept a `dimensions` parameter**

Update the export function signature to accept `dimensions: number` and replace the hardcoded array:

```typescript
const dummyVector = new Array(dimensions).fill(0);
```

Thread `dimensions` from the calling context (the MCP `cortex_export` handler and the API route both have access to `cortex.embedding.dimensions`).
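As a minimal sketch (the helper name `buildDummyVector` is an assumption; the real exporter may inline this), the replacement looks like:

```typescript
// Hypothetical helper illustrating the fix: the vector length now comes
// from the caller instead of a hardcoded 384.
function buildDummyVector(dimensions: number): number[] {
  return new Array(dimensions).fill(0);
}
```

A caller would pass `cortex.embedding.dimensions`, so a 768-dim model exports 768-length vectors instead of silently truncated 384-length ones.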

- [ ] **Step 3: Update callers**

Update `src/app/api/cortex/export/route.ts` and `src/lib/cortex/mcp/server.ts` to pass `embedding.dimensions` to the export function.

- [ ] **Step 4: Run tests**

Run: `npx vitest run tests/lib/cortex/portability/exporter.test.ts`
Expected: ALL PASS (update test mocks if the dimensions param is now required)

- [ ] **Step 5: Commit**

```bash
git add src/lib/cortex/portability/exporter.ts src/app/api/cortex/export/ src/lib/cortex/mcp/server.ts
git commit -m "fix(cortex): use actual embedding dimensions in exporter instead of hardcoded 384"
```

---

## Final Verification

### Task 13: Full test suite + smoke test

- [ ] **Step 1: Run full cortex test suite**

Run: `npx vitest run tests/lib/cortex/`
Expected: ALL PASS

- [ ] **Step 2: Run type check**

Run: `npx tsc --noEmit`
Expected: No errors

- [ ] **Step 3: Build**

Run: `npm run build`
Expected: Build succeeds

- [ ] **Step 4: Commit any remaining fixes**

If any test or build issues are found, fix and commit.

- [ ] **Step 5: Final commit summarizing all changes**

If all individual commits are clean, no summary is needed. Otherwise:

```bash
git log --oneline -15  # verify commit history looks clean
```
|