@bodhi-ventures/aiocs 0.1.0 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE CHANGED
@@ -1,6 +1,6 @@
 MIT License
 
-Copyright (c) 2026 jmucha
+Copyright (c) 2026 Bodhi Ventures
 
 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal
package/README.md CHANGED
@@ -30,25 +30,24 @@ For testing or local overrides, set:
 ```bash
 npm install -g @bodhi-ventures/aiocs
 docs --version
+docs --help
+command -v aiocs-mcp
 ```
 
-For repository development:
+Zero-install fallback:
 
 ```bash
-pnpm install
-pnpm build
+npx -y -p @bodhi-ventures/aiocs docs --version
+npx -y -p @bodhi-ventures/aiocs aiocs-mcp
 ```
 
-Run the CLI during development with:
+For repository development only:
 
 ```bash
+pnpm install
+pnpm build
 pnpm dev -- --help
-```
-
-Or after build:
-
-```bash
-./dist/cli.js --help
+pnpm dev:mcp
 ```
 
 For AI agents, prefer the root-level `--json` flag for one-shot commands:
@@ -57,9 +56,9 @@ For AI agents, prefer the root-level `--json` flag for one-shot commands:
 docs --json version
 docs --json doctor
 docs --json init --no-fetch
-pnpm dev -- --json source list
-pnpm dev -- --json search "maker flow" --source hyperliquid
-pnpm dev -- --json show 42
+docs --json source list
+docs --json search "maker flow" --source hyperliquid
+docs --json show 42
 ```
 
 `--json` emits exactly one JSON document to stdout with this envelope:
@@ -109,19 +108,17 @@ GitHub Actions publishes `@bodhi-ventures/aiocs` publicly to npm and creates the
 
 ## Codex integration
 
-For Codex-first setup, automatic-use guidance, MCP recommendations, and subagent examples, see [docs/codex-integration.md](./docs/codex-integration.md).
+For Codex-first setup, automatic-use guidance, MCP recommendations, and agent definitions, see [docs/codex-integration.md](./docs/codex-integration.md).
 
-## Built-in sources
+## Managed sources
 
-Initial source specs are shipped in `sources/`:
+The open-source repo bundles `hyperliquid` in `sources/`. Additional machine-local source specs
+belong in `~/.aiocs/sources`.
 
-- `synthetix`
-- `hyperliquid`
-- `lighter`
-- `nado`
-- `ethereal`
+`docs init` bootstraps both managed locations, so source behavior is the same regardless of
+whether a spec lives in the repo or in `~/.aiocs/sources`.
 
-Bootstrap them in one command:
+Bootstrap managed sources in one command:
 
 ```bash
 docs init --no-fetch
@@ -142,50 +139,50 @@ Register a source:
 ```bash
 mkdir -p ~/.aiocs/sources
 cp /path/to/source.yaml ~/.aiocs/sources/my-source.yaml
-pnpm dev -- source upsert ~/.aiocs/sources/my-source.yaml
-pnpm dev -- source upsert /path/to/source.yaml
-pnpm dev -- source list
+docs source upsert ~/.aiocs/sources/my-source.yaml
+docs source upsert /path/to/source.yaml
+docs source list
 ```
 
 Fetch and snapshot docs:
 
 ```bash
-pnpm dev -- refresh due hyperliquid
-pnpm dev -- snapshot list hyperliquid
-pnpm dev -- refresh due
+docs refresh due hyperliquid
+docs snapshot list hyperliquid
+docs refresh due
 ```
 
 Force fetch remains available for explicit maintenance:
 
 ```bash
-pnpm dev -- fetch hyperliquid
-pnpm dev -- fetch all
+docs fetch hyperliquid
+docs fetch all
 ```
 
 Link docs to a local project:
 
 ```bash
-pnpm dev -- project link /absolute/path/to/project hyperliquid lighter
-pnpm dev -- project unlink /absolute/path/to/project lighter
+docs project link /absolute/path/to/project hyperliquid lighter
+docs project unlink /absolute/path/to/project lighter
 ```
 
 Search and inspect results:
 
 ```bash
-pnpm dev -- search "maker flow" --source hyperliquid
-pnpm dev -- search "maker flow" --source hyperliquid --mode lexical
-pnpm dev -- search "maker flow" --source hyperliquid --mode hybrid
-pnpm dev -- search "maker flow" --source hyperliquid --mode semantic
-pnpm dev -- search "maker flow" --all
-pnpm dev -- search "maker flow" --source hyperliquid --limit 5 --offset 0
-pnpm dev -- show 42
-pnpm dev -- canary hyperliquid
-pnpm dev -- diff hyperliquid
-pnpm dev -- embeddings status
-pnpm dev -- embeddings backfill all
-pnpm dev -- embeddings run
-pnpm dev -- backup export /absolute/path/to/backup
-pnpm dev -- verify coverage hyperliquid /absolute/path/to/reference.md
+docs search "maker flow" --source hyperliquid
+docs search "maker flow" --source hyperliquid --mode lexical
+docs search "maker flow" --source hyperliquid --mode hybrid
+docs search "maker flow" --source hyperliquid --mode semantic
+docs search "maker flow" --all
+docs search "maker flow" --source hyperliquid --limit 5 --offset 0
+docs show 42
+docs canary hyperliquid
+docs diff hyperliquid
+docs embeddings status
+docs embeddings backfill all
+docs embeddings run
+docs backup export /absolute/path/to/backup
+docs verify coverage hyperliquid /absolute/path/to/reference.md
 ```
 
 When `docs search` runs inside a linked project, it automatically scopes to that project's linked sources unless `--source` or `--all` is provided.
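The project-scoped search above pairs naturally with the root-level `--json` flag. As a hedged sketch of a consumer (the `ok`/`data` envelope matches the example shown elsewhere in this diff; the `results` payload shape is an illustrative assumption, not the documented schema):

```javascript
// Hypothetical consumer of the single JSON document `docs --json` writes to
// stdout. `ok` and `data` follow the envelope shown elsewhere in this diff;
// the `results` shape is an illustrative assumption.
function parseDocsJson(stdout) {
  const envelope = JSON.parse(stdout);
  if (!envelope.ok) {
    throw new Error(`docs command failed: ${stdout}`);
  }
  return envelope.data;
}

// Simulated stdout for `docs --json search "maker flow" --source hyperliquid`:
const simulatedStdout = JSON.stringify({
  ok: true,
  data: { results: [{ id: 42, title: "Maker flow" }] }
});
const data = parseDocsJson(simulatedStdout);
console.log(data.results[0].id); // 42 (from the simulated payload above)
```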
@@ -281,22 +278,22 @@ All one-shot commands support `--json`:
 Representative examples:
 
 ```bash
-pnpm dev -- --json doctor
-pnpm dev -- --json init --no-fetch
-pnpm dev -- --json source list
-pnpm dev -- --json source upsert sources/hyperliquid.yaml
-pnpm dev -- --json refresh due hyperliquid
-pnpm dev -- --json canary hyperliquid
-pnpm dev -- --json refresh due
-pnpm dev -- --json diff hyperliquid
-pnpm dev -- --json embeddings status
-pnpm dev -- --json embeddings backfill all
-pnpm dev -- --json embeddings clear hyperliquid
-pnpm dev -- --json embeddings run
-pnpm dev -- --json project link /absolute/path/to/project hyperliquid lighter
-pnpm dev -- --json snapshot list hyperliquid
-pnpm dev -- --json backup export /absolute/path/to/backup
-pnpm dev -- --json verify coverage hyperliquid /absolute/path/to/reference.md
+docs --json doctor
+docs --json init --no-fetch
+docs --json source list
+docs --json source upsert sources/hyperliquid.yaml
+docs --json refresh due hyperliquid
+docs --json canary hyperliquid
+docs --json refresh due
+docs --json diff hyperliquid
+docs --json embeddings status
+docs --json embeddings backfill all
+docs --json embeddings clear hyperliquid
+docs --json embeddings run
+docs --json project link /absolute/path/to/project hyperliquid lighter
+docs --json snapshot list hyperliquid
+docs --json backup export /absolute/path/to/backup
+docs --json verify coverage hyperliquid /absolute/path/to/reference.md
 ```
 
 For multi-result commands like `fetch`, `refresh due`, and `search`, `data` contains structured collections rather than line-by-line output:
@@ -323,8 +320,7 @@ For multi-result commands like `fetch`, `refresh due`, and `search`, `data` cont
 `aiocs` ships a first-class long-running refresh process:
 
 ```bash
-pnpm dev -- daemon
-./dist/cli.js daemon
+docs daemon
 ```
 
 The daemon bootstraps source specs from the configured directories, refreshes due sources, sleeps for the configured interval, and repeats.
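The cycle described above can be sketched as a plain loop. This is illustrative only, not the actual implementation: `bootstrap` and `refreshDue` are stand-ins for the daemon's internals.

```javascript
// Illustrative sketch of the daemon cycle described above: bootstrap specs
// once, then refresh due sources, sleep for the configured interval, repeat.
// `bootstrap` and `refreshDue` are hypothetical stand-ins for the real code.
import { setTimeout as sleep } from "timers/promises";

async function daemonLoop({ bootstrap, refreshDue, intervalMs, cycles = Infinity }) {
  await bootstrap();
  for (let cycle = 0; cycle < cycles; cycle++) {
    await refreshDue();
    await sleep(intervalMs);
  }
}
```

With `cycles` left at `Infinity` the loop runs until the process is stopped, which matches the daemon's long-running behavior.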
@@ -355,7 +351,7 @@ For local agents, the daemon keeps the shared catalog under `~/.aiocs` warm whil
 `docs daemon --json` is intentionally different from one-shot commands. Because it is long-running, it emits one JSON event per line:
 
 ```bash
-./dist/cli.js --json daemon
+docs --json daemon
 ```
 
 Example event stream:
@@ -371,7 +367,13 @@ Example event stream:
 `aiocs` also ships an MCP server binary for tool-native agent integrations:
 
 ```bash
+command -v aiocs-mcp
 aiocs-mcp
+```
+
+For repository development only:
+
+```bash
 pnpm dev:mcp
 ```
 
@@ -416,7 +418,7 @@ Successful MCP results use an envelope:
   "ok": true,
   "data": {
     "name": "@bodhi-ventures/aiocs",
-    "version": "0.1.0"
+    "version": "0.1.1"
   }
 }
 ```
@@ -43,9 +43,108 @@ function toAiocsError(error) {
   return new AiocsError(AIOCS_ERROR_CODES.internalError, String(error));
 }
 
-// src/catalog/catalog.ts
+// src/runtime/paths.ts
+import { homedir } from "os";
+import { join as join2, relative, resolve, sep } from "path";
 import { mkdirSync } from "fs";
-import { join, resolve as resolve2 } from "path";
+
+// src/runtime/bundled-sources.ts
+import { existsSync } from "fs";
+import { dirname, join } from "path";
+import { fileURLToPath } from "url";
+function findPackageRoot(startDir) {
+  let currentDir = startDir;
+  while (true) {
+    if (existsSync(join(currentDir, "package.json")) && existsSync(join(currentDir, "sources"))) {
+      return currentDir;
+    }
+    const parentDir = dirname(currentDir);
+    if (parentDir === currentDir) {
+      throw new Error(`Could not locate aiocs package root from ${startDir}`);
+    }
+    currentDir = parentDir;
+  }
+}
+function getBundledSourcesDir() {
+  const currentFilePath = fileURLToPath(import.meta.url);
+  const packageRoot = findPackageRoot(dirname(currentFilePath));
+  return join(packageRoot, "sources");
+}
+
+// src/runtime/paths.ts
+var PORTABLE_USER_SOURCES_PREFIX = "~/.aiocs/sources";
+var PORTABLE_BUNDLED_SOURCES_PREFIX = "aiocs://bundled";
+var CONTAINER_USER_SOURCES_DIR = "/root/.aiocs/sources";
+var CONTAINER_BUNDLED_SOURCES_DIR = "/app/sources";
+function expandTilde(path) {
+  if (path === "~") {
+    return homedir();
+  }
+  if (path.startsWith("~/")) {
+    return join2(homedir(), path.slice(2));
+  }
+  return path;
+}
+function getAiocsDataDir(env = process.env) {
+  const override = env.AIOCS_DATA_DIR;
+  if (override) {
+    mkdirSync(expandTilde(override), { recursive: true });
+    return expandTilde(override);
+  }
+  const target = join2(homedir(), ".aiocs", "data");
+  mkdirSync(target, { recursive: true });
+  return target;
+}
+function getAiocsConfigDir(env = process.env) {
+  const override = env.AIOCS_CONFIG_DIR;
+  if (override) {
+    mkdirSync(expandTilde(override), { recursive: true });
+    return expandTilde(override);
+  }
+  const target = join2(homedir(), ".aiocs", "config");
+  mkdirSync(target, { recursive: true });
+  return target;
+}
+function getAiocsSourcesDir(env = process.env) {
+  const override = env.AIOCS_SOURCES_DIR;
+  if (override) {
+    mkdirSync(expandTilde(override), { recursive: true });
+    return expandTilde(override);
+  }
+  const target = join2(homedir(), ".aiocs", "sources");
+  mkdirSync(target, { recursive: true });
+  return target;
+}
+function isWithinRoot(candidatePath, rootPath) {
+  return candidatePath === rootPath || candidatePath.startsWith(`${rootPath}${sep}`);
+}
+function toPortablePath(prefix, rootPath, candidatePath) {
+  const relativePath = relative(rootPath, candidatePath).split(sep).join("/");
+  return relativePath ? `${prefix}/${relativePath}` : prefix;
+}
+function canonicalizeManagedSpecPath(specPath, env = process.env) {
+  if (specPath === PORTABLE_USER_SOURCES_PREFIX || specPath.startsWith(`${PORTABLE_USER_SOURCES_PREFIX}/`) || specPath === PORTABLE_BUNDLED_SOURCES_PREFIX || specPath.startsWith(`${PORTABLE_BUNDLED_SOURCES_PREFIX}/`)) {
+    return specPath;
+  }
+  const resolvedPath = resolve(specPath);
+  const userRoots = [resolve(getAiocsSourcesDir(env)), CONTAINER_USER_SOURCES_DIR];
+  for (const rootPath of userRoots) {
+    if (isWithinRoot(resolvedPath, rootPath)) {
+      return toPortablePath(PORTABLE_USER_SOURCES_PREFIX, rootPath, resolvedPath);
+    }
+  }
+  const bundledRoots = [resolve(getBundledSourcesDir()), CONTAINER_BUNDLED_SOURCES_DIR];
+  for (const rootPath of bundledRoots) {
+    if (isWithinRoot(resolvedPath, rootPath)) {
+      return toPortablePath(PORTABLE_BUNDLED_SOURCES_PREFIX, rootPath, resolvedPath);
+    }
+  }
+  return resolvedPath;
+}
+
+// src/catalog/catalog.ts
+import { mkdirSync as mkdirSync2 } from "fs";
+import { join as join3, resolve as resolve3 } from "path";
 import { randomUUID } from "crypto";
 import Database from "better-sqlite3";
 
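The intent of `canonicalizeManagedSpecPath` above is that a spec registered from a shell path, a container path, or an already-portable path maps to one stable key. A reduced sketch of the mapping step, using `toPortablePath` as added in the hunk above; the concrete paths here are illustrative examples:

```javascript
// toPortablePath, as added above: rewrite an absolute path under a managed
// root into the portable prefixed form. Example paths are illustrative.
import { relative, sep } from "path";

function toPortablePath(prefix, rootPath, candidatePath) {
  const relativePath = relative(rootPath, candidatePath).split(sep).join("/");
  return relativePath ? `${prefix}/${relativePath}` : prefix;
}

// On a machine whose user sources dir resolves to /home/u/.aiocs/sources:
toPortablePath("~/.aiocs/sources", "/home/u/.aiocs/sources", "/home/u/.aiocs/sources/my-source.yaml");
// → "~/.aiocs/sources/my-source.yaml" on POSIX
```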
@@ -152,12 +251,12 @@ function buildSnapshotFingerprint(input) {
 
 // src/catalog/project-scope.ts
 import { realpathSync } from "fs";
-import { resolve } from "path";
+import { resolve as resolve2 } from "path";
 function isWithin(candidate, root) {
   return candidate === root || candidate.startsWith(`${root}/`);
 }
 function canonicalizeProjectPath(path) {
-  const resolved = resolve(path);
+  const resolved = resolve2(path);
   try {
     return realpathSync.native(resolved);
   } catch {
@@ -511,9 +610,9 @@ function assertPaginationValue(value, field, fallback) {
   return value;
 }
 function openCatalog(options) {
-  const dataDir = resolve2(options.dataDir);
-  mkdirSync(dataDir, { recursive: true });
-  const db = new Database(join(dataDir, "catalog.sqlite"));
+  const dataDir = resolve3(options.dataDir);
+  mkdirSync2(dataDir, { recursive: true });
+  const db = new Database(join3(dataDir, "catalog.sqlite"));
   initSchema(db);
   const listProjectLinks = () => {
     const rows = db.prepare("SELECT project_path, source_id FROM project_links ORDER BY project_path, source_id").all();
@@ -788,7 +887,7 @@ function openCatalog(options) {
     const timestamp = nowIso();
     const configHash = sha256(stableStringify(spec));
     const existing = db.prepare("SELECT id, created_at, next_due_at, next_canary_due_at, config_hash FROM sources WHERE id = ?").get(spec.id);
-    const resolvedSpecPath = options2?.specPath ? resolve2(options2.specPath) : null;
+    const resolvedSpecPath = options2?.specPath ? canonicalizeManagedSpecPath(options2.specPath) : null;
     const nextDueAt = !existing ? timestamp : existing.config_hash === configHash ? existing.next_due_at : timestamp;
     const canaryConfig = resolveSourceCanary(spec);
     const nextCanaryDueAt = !existing ? timestamp : existing.config_hash === configHash ? existing.next_canary_due_at ?? addHoursIso(canaryConfig.everyHours) : timestamp;
@@ -851,7 +950,7 @@ function openCatalog(options) {
     return rows.map((row) => ({
       id: row.id,
       label: row.label,
-      specPath: row.spec_path,
+      specPath: row.spec_path ? canonicalizeManagedSpecPath(row.spec_path) : null,
       nextDueAt: row.next_due_at,
       isDue: Date.parse(row.next_due_at) <= Date.now(),
       nextCanaryDueAt: row.next_canary_due_at,
@@ -1086,8 +1185,9 @@ function openCatalog(options) {
       return [];
     }
     const activeSourceKeys = new Set(
-      input.activeSources.map((source) => `${source.sourceId}::${resolve2(source.specPath)}`)
+      input.activeSources.map((source) => `${source.sourceId}::${canonicalizeManagedSpecPath(source.specPath)}`)
     );
+    const normalizedManagedRoots = input.managedRoots.map((managedRoot) => canonicalizeManagedSpecPath(managedRoot));
     const rows = db.prepare(`
       SELECT id, spec_path
       FROM sources
@@ -1098,8 +1198,8 @@ function openCatalog(options) {
       if (!row.spec_path) {
         return false;
       }
-      const normalizedSpecPath = resolve2(row.spec_path);
-      return input.managedRoots.some(
+      const normalizedSpecPath = canonicalizeManagedSpecPath(row.spec_path);
+      return normalizedManagedRoots.some(
         (managedRoot) => normalizedSpecPath === managedRoot || normalizedSpecPath.startsWith(`${managedRoot}/`)
       ) && !activeSourceKeys.has(`${row.id}::${normalizedSpecPath}`);
     }).map((row) => row.id);
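In isolation, the pruning rule above is: drop a cataloged source whose canonical spec path sits under a managed root but whose `id::path` key is no longer among the active sources. A self-contained restatement, with illustrative row shapes and names:

```javascript
// Illustrative restatement of the filter above: prune sources under a managed
// root whose `${id}::${specPath}` key is not among the active sources.
function findRemovableSourceIds(rows, managedRoots, activeKeys) {
  return rows
    .filter((row) => {
      if (!row.specPath) {
        return false;
      }
      const underManagedRoot = managedRoots.some(
        (root) => row.specPath === root || row.specPath.startsWith(`${root}/`)
      );
      return underManagedRoot && !activeKeys.has(`${row.id}::${row.specPath}`);
    })
    .map((row) => row.id);
}

findRemovableSourceIds(
  [
    { id: "stale", specPath: "~/.aiocs/sources/stale.yaml" },
    { id: "external", specPath: "/elsewhere/external.yaml" }
  ],
  ["~/.aiocs/sources"],
  new Set()
);
// → ["stale"] — "external" sits outside every managed root, so it is kept
```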
@@ -1718,57 +1818,14 @@ function openCatalog(options) {
   };
 }
 
-// src/runtime/paths.ts
-import { homedir } from "os";
-import { join as join2 } from "path";
-import { mkdirSync as mkdirSync2 } from "fs";
-function expandTilde(path) {
-  if (path === "~") {
-    return homedir();
-  }
-  if (path.startsWith("~/")) {
-    return join2(homedir(), path.slice(2));
-  }
-  return path;
-}
-function getAiocsDataDir(env = process.env) {
-  const override = env.AIOCS_DATA_DIR;
-  if (override) {
-    mkdirSync2(expandTilde(override), { recursive: true });
-    return expandTilde(override);
-  }
-  const target = join2(homedir(), ".aiocs", "data");
-  mkdirSync2(target, { recursive: true });
-  return target;
-}
-function getAiocsConfigDir(env = process.env) {
-  const override = env.AIOCS_CONFIG_DIR;
-  if (override) {
-    mkdirSync2(expandTilde(override), { recursive: true });
-    return expandTilde(override);
-  }
-  const target = join2(homedir(), ".aiocs", "config");
-  mkdirSync2(target, { recursive: true });
-  return target;
-}
-function getAiocsSourcesDir(env = process.env) {
-  const override = env.AIOCS_SOURCES_DIR;
-  if (override) {
-    mkdirSync2(expandTilde(override), { recursive: true });
-    return expandTilde(override);
-  }
-  const target = join2(homedir(), ".aiocs", "sources");
-  mkdirSync2(target, { recursive: true });
-  return target;
-}
-
 // src/daemon.ts
-import { resolve as resolve4 } from "path";
+import { existsSync as existsSync2 } from "fs";
+import { resolve as resolve5 } from "path";
 import { setTimeout as sleep2 } from "timers/promises";
 
 // src/fetch/fetch-source.ts
 import { mkdirSync as mkdirSync3, writeFileSync } from "fs";
-import { join as join3 } from "path";
+import { join as join4 } from "path";
 import { setTimeout as sleep } from "timers/promises";
 import { chromium } from "playwright";
 
@@ -2051,11 +2108,11 @@ async function extractRawMarkdownPage(url, response) {
   };
 }
 function persistSnapshotPages(input, snapshotId, pages) {
-  const snapshotDir = join3(input.dataDir, "sources", input.sourceId, "snapshots", snapshotId, "pages");
+  const snapshotDir = join4(input.dataDir, "sources", input.sourceId, "snapshots", snapshotId, "pages");
   mkdirSync3(snapshotDir, { recursive: true });
   pages.forEach((page, index) => {
     const filename = `${String(index + 1).padStart(3, "0")}-${slugify(page.title)}.md`;
-    writeFileSync(join3(snapshotDir, filename), page.markdown, "utf8");
+    writeFileSync(join4(snapshotDir, filename), page.markdown, "utf8");
   });
 }
 function resolveEnvValue(name, env) {
@@ -2404,6 +2461,30 @@ function getEmbeddingModelKey(config) {
 function normalizeBaseUrl(baseUrl) {
   return baseUrl.endsWith("/") ? baseUrl.slice(0, -1) : baseUrl;
 }
+function normalizeEmbeddingWhitespace(value) {
+  return value.replace(/\s+/g, " ").trim();
+}
+function truncateEmbeddingText(value, maxChars) {
+  if (value.length <= maxChars) {
+    return value;
+  }
+  const slice = value.slice(0, maxChars);
+  const lastWhitespace = slice.lastIndexOf(" ");
+  if (lastWhitespace >= Math.floor(maxChars * 0.8)) {
+    return slice.slice(0, lastWhitespace).trim();
+  }
+  return slice.trim();
+}
+function prepareTextForEmbedding(markdown, maxChars) {
+  const withoutComments = markdown.replace(/<!--[\s\S]*?-->/g, " ");
+  const withoutImages = withoutComments.replace(/!\[([^\]]*)\]\(([^)]+)\)/g, "$1");
+  const withoutLinks = withoutImages.replace(/\[([^\]]+)\]\(([^)]+)\)/g, "$1");
+  const withoutHtml = withoutLinks.replace(/<[^>]+>/g, " ");
+  const withoutCodeFenceMarkers = withoutHtml.replace(/```[^\n]*\n/g, "\n").replace(/```/g, "\n");
+  const withoutInlineCodeTicks = withoutCodeFenceMarkers.replace(/`([^`]+)`/g, "$1");
+  const normalized = normalizeEmbeddingWhitespace(withoutInlineCodeTicks);
+  return truncateEmbeddingText(normalized, maxChars);
+}
 async function parseJsonResponse(response) {
   const text = await response.text();
   if (!text) {
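To make the new preprocessing concrete, here are the helpers from the hunk above (reproduced from this diff) applied to a small sample; links unwrap to their text, inline code loses its backticks, and whitespace collapses before the word-boundary-aware truncation:

```javascript
// The embedding-preparation helpers added in the hunk above, reproduced here
// so the example is self-contained and runnable.
function normalizeEmbeddingWhitespace(value) {
  return value.replace(/\s+/g, " ").trim();
}
function truncateEmbeddingText(value, maxChars) {
  if (value.length <= maxChars) {
    return value;
  }
  const slice = value.slice(0, maxChars);
  const lastWhitespace = slice.lastIndexOf(" ");
  // Cut at a word boundary when one falls in the last 20% of the window.
  if (lastWhitespace >= Math.floor(maxChars * 0.8)) {
    return slice.slice(0, lastWhitespace).trim();
  }
  return slice.trim();
}
function prepareTextForEmbedding(markdown, maxChars) {
  const withoutComments = markdown.replace(/<!--[\s\S]*?-->/g, " ");
  const withoutImages = withoutComments.replace(/!\[([^\]]*)\]\(([^)]+)\)/g, "$1");
  const withoutLinks = withoutImages.replace(/\[([^\]]+)\]\(([^)]+)\)/g, "$1");
  const withoutHtml = withoutLinks.replace(/<[^>]+>/g, " ");
  const withoutCodeFenceMarkers = withoutHtml.replace(/```[^\n]*\n/g, "\n").replace(/```/g, "\n");
  const withoutInlineCodeTicks = withoutCodeFenceMarkers.replace(/`([^`]+)`/g, "$1");
  const normalized = normalizeEmbeddingWhitespace(withoutInlineCodeTicks);
  return truncateEmbeddingText(normalized, maxChars);
}

prepareTextForEmbedding("See [the docs](https://example.com) and `docs show 42`.", 4000);
// → "See the docs and docs show 42."
```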
@@ -2422,6 +2503,7 @@ async function embedTexts(config, texts) {
   if (texts.length === 0) {
     return [];
   }
+  const preparedTexts = texts.map((text) => prepareTextForEmbedding(text, config.ollamaMaxInputChars));
   const response = await fetch(`${normalizeBaseUrl(config.ollamaBaseUrl)}/api/embed`, {
     method: "POST",
     headers: {
@@ -2430,7 +2512,7 @@ async function embedTexts(config, texts) {
     signal: AbortSignal.timeout(config.ollamaTimeoutMs),
     body: JSON.stringify({
       model: config.ollamaEmbeddingModel,
-      input: texts
+      input: preparedTexts
     })
   }).catch((error) => {
     throw new AiocsError(
@@ -2823,29 +2905,6 @@ async function processEmbeddingJobs(input) {
   };
 }
 
-// src/runtime/bundled-sources.ts
-import { existsSync } from "fs";
-import { dirname, join as join4 } from "path";
-import { fileURLToPath } from "url";
-function findPackageRoot(startDir) {
-  let currentDir = startDir;
-  while (true) {
-    if (existsSync(join4(currentDir, "package.json")) && existsSync(join4(currentDir, "sources"))) {
-      return currentDir;
-    }
-    const parentDir = dirname(currentDir);
-    if (parentDir === currentDir) {
-      throw new Error(`Could not locate aiocs package root from ${startDir}`);
-    }
-    currentDir = parentDir;
-  }
-}
-function getBundledSourcesDir() {
-  const currentFilePath = fileURLToPath(import.meta.url);
-  const packageRoot = findPackageRoot(dirname(currentFilePath));
-  return join4(packageRoot, "sources");
-}
-
 // src/runtime/hybrid-config.ts
 function parsePositiveInteger(value, field, fallback) {
   if (typeof value === "undefined" || value.trim() === "") {
@@ -2888,7 +2947,8 @@ function getHybridRuntimeConfig(env = process.env) {
     embeddingProvider: "ollama",
     ollamaBaseUrl: env.AIOCS_OLLAMA_BASE_URL ?? "http://127.0.0.1:11434",
     ollamaEmbeddingModel: env.AIOCS_OLLAMA_EMBEDDING_MODEL ?? "nomic-embed-text",
-    ollamaTimeoutMs: parsePositiveInteger(env.AIOCS_OLLAMA_TIMEOUT_MS, "AIOCS_OLLAMA_TIMEOUT_MS", 1e3),
+    ollamaTimeoutMs: parsePositiveInteger(env.AIOCS_OLLAMA_TIMEOUT_MS, "AIOCS_OLLAMA_TIMEOUT_MS", 1e4),
+    ollamaMaxInputChars: parsePositiveInteger(env.AIOCS_OLLAMA_MAX_INPUT_CHARS, "AIOCS_OLLAMA_MAX_INPUT_CHARS", 4e3),
     embeddingBatchSize: parsePositiveInteger(env.AIOCS_EMBEDDING_BATCH_SIZE, "AIOCS_EMBEDDING_BATCH_SIZE", 32),
     embeddingJobsPerCycle: parsePositiveInteger(env.AIOCS_EMBEDDING_JOB_LIMIT_PER_CYCLE, "AIOCS_EMBEDDING_JOB_LIMIT_PER_CYCLE", 2),
     lexicalCandidateWindow: parsePositiveInteger(env.AIOCS_LEXICAL_CANDIDATE_WINDOW, "AIOCS_LEXICAL_CANDIDATE_WINDOW", 40),
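For readers skimming the exponent literals above: `1e4` is a 10000 ms timeout and `4e3` a 4000-character input cap. A sketch of how such env-driven defaults resolve; the guard clause matches the opening lines of `parsePositiveInteger` shown in the hybrid-config hunk above, while the parse-and-validate tail is an assumed completion, not the verbatim implementation:

```javascript
// Sketch of env-override resolution for the hybrid runtime config. The guard
// clause is from this diff; the rest of the body is an assumption.
function parsePositiveInteger(value, field, fallback) {
  if (typeof value === "undefined" || value.trim() === "") {
    return fallback;
  }
  const parsed = Number.parseInt(value, 10);
  if (!Number.isInteger(parsed) || parsed <= 0) {
    throw new Error(`${field} must be a positive integer`);
  }
  return parsed;
}

parsePositiveInteger(undefined, "AIOCS_OLLAMA_TIMEOUT_MS", 1e4); // → 10000 (default)
parsePositiveInteger("2500", "AIOCS_OLLAMA_TIMEOUT_MS", 1e4);    // → 2500 (env override)
```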
@@ -2900,13 +2960,13 @@ function getHybridRuntimeConfig(env = process.env) {
 // src/spec/source-spec-files.ts
 import { access, readdir } from "fs/promises";
 import { constants as fsConstants } from "fs";
-import { extname as extname2, join as join5, resolve as resolve3 } from "path";
+import { extname as extname2, join as join5, resolve as resolve4 } from "path";
 var SOURCE_SPEC_EXTENSIONS = /* @__PURE__ */ new Set([".yaml", ".yml", ".json"]);
 function uniqueResolvedPaths(paths) {
   const seen = /* @__PURE__ */ new Set();
   const unique = [];
   for (const rawPath of paths) {
-    const normalized = resolve3(rawPath);
+    const normalized = resolve4(rawPath);
     if (seen.has(normalized)) {
       continue;
     }
@@ -2973,10 +3033,11 @@ function parseBoolean(raw, variableName) {
 function parseDaemonConfig(env, options = {}) {
   const intervalMinutes = env.AIOCS_DAEMON_INTERVAL_MINUTES ? parsePositiveInteger2(env.AIOCS_DAEMON_INTERVAL_MINUTES, "AIOCS_DAEMON_INTERVAL_MINUTES") : DEFAULT_INTERVAL_MINUTES;
   const fetchOnStart = env.AIOCS_DAEMON_FETCH_ON_START ? parseBoolean(env.AIOCS_DAEMON_FETCH_ON_START, "AIOCS_DAEMON_FETCH_ON_START") : true;
+  const defaultContainerSourceDir = options.containerSourceDir ?? (existsSync2(DEFAULT_CONTAINER_SOURCE_DIR) ? DEFAULT_CONTAINER_SOURCE_DIR : void 0);
   const defaultSourceDirs = uniqueResolvedPaths([
     options.bundledSourceDir ?? getBundledSourcesDir(),
     options.userSourceDir ?? getAiocsSourcesDir(env),
-    options.containerSourceDir ?? DEFAULT_CONTAINER_SOURCE_DIR
+    ...defaultContainerSourceDir ? [defaultContainerSourceDir] : []
   ]);
   const sourceSpecDirs = env.AIOCS_SOURCE_SPEC_DIRS ? uniqueResolvedPaths(
     env.AIOCS_SOURCE_SPEC_DIRS.split(",").map((entry) => entry.trim()).filter(Boolean)
@@ -3023,7 +3084,7 @@ async function bootstrapSourceSpecs(input) {
     throw new Error(`No source spec files found in configured directories: ${normalizedSourceSpecDirs.join(", ")}`);
   }
   const removedSourceIds = input.catalog.removeManagedSources({
-    managedRoots: existingDirs.map((sourceSpecDir) => resolve4(sourceSpecDir)),
+    managedRoots: existingDirs.map((sourceSpecDir) => resolve5(sourceSpecDir)),
     activeSources: sources.map((source) => ({
       sourceId: source.sourceId,
       specPath: source.specPath
@@ -3208,7 +3269,7 @@ async function startDaemon(input) {
 // package.json
 var package_default = {
   name: "@bodhi-ventures/aiocs",
-  version: "0.1.0",
+  version: "0.1.2",
   license: "MIT",
   type: "module",
   description: "Local-only documentation store, fetcher, and search CLI for AI agents.",
@@ -3287,11 +3348,11 @@ var packageVersion = package_default.version;
 var packageDescription = package_default.description;
 
 // src/services.ts
-import { resolve as resolve7 } from "path";
+import { resolve as resolve8 } from "path";
 
 // src/backup.ts
 import { cp, mkdir, readdir as readdir2, readFile as readFile2, rename, rm, stat, writeFile } from "fs/promises";
-import { basename, dirname as dirname2, join as join6, resolve as resolve5 } from "path";
+import { basename, dirname as dirname2, join as join6, resolve as resolve6 } from "path";
 import { randomUUID as randomUUID2 } from "crypto";
 import Database2 from "better-sqlite3";
 var CATALOG_DB_FILENAME = "catalog.sqlite";
@@ -3427,9 +3488,9 @@ async function prepareReplacementTarget(backupDir, targetDir) {
   return stagingDir;
 }
 async function exportBackup(input) {
-  const dataDir = resolve5(input.dataDir);
-  const outputDir = resolve5(input.outputDir);
-  const configDir = input.configDir ? resolve5(input.configDir) : void 0;
+  const dataDir = resolve6(input.dataDir);
+  const outputDir = resolve6(input.outputDir);
+  const configDir = input.configDir ? resolve6(input.configDir) : void 0;
   await assertSourceDirExists(dataDir);
   if (!await isDirectoryEmpty(outputDir)) {
     if (!input.replaceExisting) {
@@ -3465,9 +3526,9 @@ async function exportBackup(input) {
   };
 }
 async function importBackup(input) {
-  const inputDir = resolve5(input.inputDir);
-  const dataDir = resolve5(input.dataDir);
-  const configDir = input.configDir ? resolve5(input.configDir) : void 0;
+  const inputDir = resolve6(input.inputDir);
+  const dataDir = resolve6(input.dataDir);
+  const configDir = input.configDir ? resolve6(input.configDir) : void 0;
   const { manifest, backupDataDir, backupConfigDir } = await loadValidatedBackupPayload(inputDir);
   if (!await isDirectoryEmpty(dataDir)) {
     if (!input.replaceExisting) {
@@ -3511,7 +3572,7 @@ async function importBackup(input) {
 
 // src/coverage.ts
 import { readFile as readFile3 } from "fs/promises";
-import { resolve as resolve6 } from "path";
+import { resolve as resolve7 } from "path";
 function normalizeText(value) {
   return value.replace(/[`*_~]+/g, "").replace(/\s+/g, " ").trim().toLowerCase();
 }
@@ -3560,7 +3621,7 @@ async function verifyCoverageAgainstReferences(corpus, referenceFiles) {
     body: 0
   };
   for (const referenceFile of referenceFiles) {
-    const resolvedReferenceFile = resolve6(referenceFile);
+    const resolvedReferenceFile = resolve7(referenceFile);
     let raw;
     try {
       raw = await readFile3(resolvedReferenceFile, "utf8");
@@ -4226,7 +4287,7 @@ function withCatalog(run) {
   return Promise.resolve(run(ctx)).finally(() => ctx.catalog.close());
 }
 async function upsertSourceFromSpecFile(specFile) {
-  const specPath = resolve7(specFile);
+  const specPath = resolve8(specFile);
   const spec = await loadSourceSpec(specPath);
   const result = await withCatalog(({ catalog }) => catalog.upsertSource(spec, { specPath }));
   return {
@@ -4322,7 +4383,7 @@ async function diffSnapshotsForSource(input) {
   return withCatalog(({ catalog }) => catalog.diffSnapshots(input));
 }
 async function linkProjectSources(projectPath, sourceIds) {
-  const resolvedProjectPath = resolve7(projectPath);
+  const resolvedProjectPath = resolve8(projectPath);
   await withCatalog(({ catalog }) => {
     catalog.linkProject(resolvedProjectPath, sourceIds);
   });
@@ -4332,7 +4393,7 @@ async function linkProjectSources(projectPath, sourceIds) {
   };
 }
 async function unlinkProjectSources(projectPath, sourceIds) {
-  const resolvedProjectPath = resolve7(projectPath);
+  const resolvedProjectPath = resolve8(projectPath);
   await withCatalog(({ catalog }) => {
     catalog.unlinkProject(resolvedProjectPath, sourceIds);
   });
@@ -4342,7 +4403,7 @@ async function unlinkProjectSources(projectPath, sourceIds) {
4342
4403
  };
4343
4404
  }
4344
4405
  async function searchCatalog(query, options) {
4345
- const cwd = options.project ? resolve7(options.project) : process.cwd();
4406
+ const cwd = options.project ? resolve8(options.project) : process.cwd();
4346
4407
  const explicitSources = options.source.length > 0;
4347
4408
  const results = await withCatalog(({ catalog }) => {
4348
4409
  const hybridConfig = getHybridRuntimeConfig();
@@ -4398,14 +4459,19 @@ async function verifyCoverage(input) {
4398
4459
  return verifyCoverageAgainstReferences(corpus, input.referenceFiles);
4399
4460
  });
4400
4461
  }
4401
- async function initBuiltInSources(options) {
4402
- const sourceSpecDir = options?.sourceSpecDir ?? getBundledSourcesDir();
4462
+ async function initManagedSources(options) {
4463
+ const sourceSpecDirs = uniqueResolvedPaths(
4464
+ options?.sourceSpecDirs ?? [
4465
+ getBundledSourcesDir(),
4466
+ getAiocsSourcesDir()
4467
+ ]
4468
+ );
4403
4469
  const fetched = options?.fetch ?? false;
4404
4470
  const userSourceDir = getAiocsSourcesDir();
4405
4471
  return withCatalog(async ({ catalog, dataDir }) => {
4406
4472
  const bootstrapped = await bootstrapSourceSpecs({
4407
4473
  catalog,
4408
- sourceSpecDirs: [sourceSpecDir],
4474
+ sourceSpecDirs,
4409
4475
  strictSourceSpecDirs: true
4410
4476
  });
4411
4477
  const fetchResults = [];
@@ -4429,7 +4495,7 @@ async function initBuiltInSources(options) {
4429
4495
  });
4430
4496
  }
4431
4497
  return {
4432
- sourceSpecDir,
4498
+ sourceSpecDirs,
4433
4499
  userSourceDir,
4434
4500
  fetched,
4435
4501
  initializedSources: bootstrapped.sources,
@@ -4503,9 +4569,9 @@ export {
4503
4569
  AIOCS_ERROR_CODES,
4504
4570
  AiocsError,
4505
4571
  toAiocsError,
4506
- openCatalog,
4507
4572
  getAiocsDataDir,
4508
4573
  getAiocsConfigDir,
4574
+ openCatalog,
4509
4575
  parseDaemonConfig,
4510
4576
  startDaemon,
4511
4577
  packageName,
@@ -4523,7 +4589,7 @@ export {
4523
4589
  searchCatalog,
4524
4590
  showChunk,
4525
4591
  verifyCoverage,
4526
- initBuiltInSources,
4592
+ initManagedSources,
4527
4593
  getManagedSourceSpecDirectories,
4528
4594
  getDoctorReport,
4529
4595
  exportCatalogBackup,
package/dist/cli.js CHANGED
@@ -13,7 +13,7 @@ import {
  getEmbeddingStatus,
  getManagedSourceSpecDirectories,
  importCatalogBackup,
- initBuiltInSources,
+ initManagedSources,
  linkProjectSources,
  listSnapshotsForSource,
  listSources,
@@ -31,7 +31,7 @@ import {
  unlinkProjectSources,
  upsertSourceFromSpecFile,
  verifyCoverage
- } from "./chunk-ID3PUSMY.js";
+ } from "./chunk-CZ6C4YUX.js";

  // src/cli.ts
  import { Command, CommanderError as CommanderError2 } from "commander";
@@ -289,21 +289,22 @@ program.command("version").description("Show the current aiocs version.").action
  human: packageVersion
  }));
  });
- program.command("init").description("Register bundled built-in source specs and optionally fetch them.").option("--fetch", "fetch built-in sources immediately").option("--no-fetch", "skip immediate fetching after bootstrapping").action(async (options, command) => {
+ program.command("init").description("Register managed source specs from the bundled repo directory and ~/.aiocs/sources, then optionally fetch them.").option("--fetch", "fetch managed sources immediately").option("--no-fetch", "skip immediate fetching after bootstrapping").action(async (options, command) => {
  await executeCommand(command, "init", async () => {
- const result = await initBuiltInSources({
+ const result = await initManagedSources({
  fetch: options.fetch ?? false
  });
  return {
  data: result,
  human: [
- `Initialized ${result.initializedSources.length} built-in sources from ${result.sourceSpecDir}`,
+ `Initialized ${result.initializedSources.length} managed sources from ${result.sourceSpecDirs.length} directories`,
+ ...result.sourceSpecDirs.map((directory) => `Managed source dir: ${directory}`),
  `User-managed source specs live under ${getManagedSourceSpecDirectories().userSourceDir}`,
  ...result.removedSourceIds.length > 0 ? [`Removed managed sources: ${result.removedSourceIds.join(", ")}`] : [],
  ...result.fetchResults.length > 0 ? result.fetchResults.map((entry) => {
  const verb = entry.reused ? "Reused" : "Fetched";
  return `${verb} ${entry.sourceId} -> ${entry.snapshotId} (${entry.pageCount} pages)`;
- }) : [result.fetched ? "No built-in sources were fetched." : "Skipped fetching built-in sources."]
+ }) : [result.fetched ? "No managed sources were fetched." : "Skipped fetching managed sources."]
  ]
  };
  });
@@ -10,7 +10,7 @@ import {
  getDoctorReport,
  getEmbeddingStatus,
  importCatalogBackup,
- initBuiltInSources,
+ initManagedSources,
  linkProjectSources,
  listSnapshotsForSource,
  listSources,
@@ -26,7 +26,7 @@ import {
  unlinkProjectSources,
  upsertSourceFromSpecFile,
  verifyCoverage
- } from "./chunk-ID3PUSMY.js";
+ } from "./chunk-CZ6C4YUX.js";

  // src/mcp-server.ts
  import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
@@ -235,7 +235,7 @@ var toolHandlers = {
  version: packageVersion
  }),
  doctor: async () => getDoctorReport(),
- init: async (args = {}) => initBuiltInSources({
+ init: async (args = {}) => initManagedSources({
  ...typeof args.fetch === "boolean" ? { fetch: args.fetch } : {}
  }),
  source_upsert: async (args = {}) => upsertSourceFromSpecFile(args.specFile),
@@ -350,12 +350,12 @@ registerAiocsTool(
  "init",
  {
  title: "Init",
- description: "Bootstrap the bundled built-in source specs and optionally fetch them.",
+ description: "Bootstrap managed source specs from the bundled repo directory and ~/.aiocs/sources, then optionally fetch them.",
  inputSchema: z.object({
  fetch: z.boolean().optional()
  }),
  outputSchema: z.object({
- sourceSpecDir: z.string(),
+ sourceSpecDirs: z.array(z.string()),
  userSourceDir: z.string(),
  fetched: z.boolean(),
  initializedSources: z.array(z.object({
@@ -24,7 +24,7 @@ This design keeps `aiocs` as the canonical docs system:

  ## Why This Shape

- The current `aiocs` search path in [catalog.ts](/Users/jmucha/repos/mandex/aiocs/src/catalog/catalog.ts) is pure FTS5 BM25 over the latest successful snapshots. That is excellent for exact docs lookups, versioned terms, and API names. It is weaker for:
+ The current `aiocs` search path in [catalog.ts](../src/catalog/catalog.ts) is pure FTS5 BM25 over the latest successful snapshots. That is excellent for exact docs lookups, versioned terms, and API names. It is weaker for:

  - synonym-heavy prompts
  - conceptual questions
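The design-doc hunk above contrasts pure FTS5 BM25 search with a hybrid path, and the bundled code calls `getHybridRuntimeConfig()`, but the diff does not show how lexical and semantic rankings are actually merged. One common technique for that merge, shown purely as an illustration and not as the package's implementation, is reciprocal rank fusion:

```javascript
// Reciprocal rank fusion: combine two ranked lists of result ids.
// k = 60 is the conventional damping constant from the RRF literature.
function rrf(lexicalIds, semanticIds, k = 60) {
  const scores = new Map();
  for (const list of [lexicalIds, semanticIds]) {
    list.forEach((id, rank) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + rank + 1));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}
```

An id that appears in both lists accumulates score from both, which is exactly what helps the synonym-heavy and conceptual queries BM25 alone handles poorly.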
package/docs/README.md CHANGED
@@ -9,4 +9,4 @@ Good candidates:
  - operational runbooks
  - decisions worth preserving across sessions
  - Codex integration guidance in `codex-integration.md`
- - reusable agent examples under `examples/codex-agents/`
+ - reusable repo-managed agent definitions under `../agents/`
@@ -9,7 +9,14 @@ Install the CLI and MCP binary globally:
  ```bash
  npm install -g @bodhi-ventures/aiocs
  docs --version
- aiocs-mcp
+ command -v aiocs-mcp
+ ```
+
+ If global install is unavailable, use `npx` only as a fallback:
+
+ ```bash
+ npx -y -p @bodhi-ventures/aiocs docs --version
+ npx -y -p @bodhi-ventures/aiocs aiocs-mcp
  ```

  The `aiocs-mcp` process is an MCP stdio server, so running it directly will wait for MCP clients instead of printing interactive help. The useful validation commands are:
@@ -38,8 +45,9 @@ Codex does not automatically invoke a custom subagent just because one exists. T
  To make Codex discover `aiocs` automatically on this machine, expose the skill in the global Codex skill directory:

  ```bash
+ AIOCS_REPO=/absolute/path/to/your/aiocs/checkout
  mkdir -p ~/.codex/skills
- ln -sfn /Users/jmucha/repos/mandex/aiocs/skills/aiocs ~/.codex/skills/aiocs
+ ln -sfn "$AIOCS_REPO/skills/aiocs" ~/.codex/skills/aiocs
  ```

  Once that symlink exists, Codex can load the `aiocs` skill directly from the global skills catalog and prefer local docs without you explicitly calling a subagent.
@@ -48,18 +56,19 @@ Once that symlink exists, Codex can load the `aiocs` skill directly from the glo

  There are two supported subagent patterns:

- - Repo example for development and debugging:
- [`docs/examples/codex-agents/aiocs-docs-specialist.example.toml`](/Users/jmucha/repos/mandex/aiocs/docs/examples/codex-agents/aiocs-docs-specialist.example.toml)
+ - Repo-managed agent definition:
+ [`agents/aiocs-docs-specialist.toml`](../agents/aiocs-docs-specialist.toml)
  - Install-ready global agent definition:
- [`/Users/jmucha/repos/ai-skills/agents/aiocs-docs-specialist.toml`](/Users/jmucha/repos/ai-skills/agents/aiocs-docs-specialist.toml)
+ `ai-skills/agents/aiocs-docs-specialist.toml` from your local `ai-skills` checkout

- The repo example is intentionally development-oriented and uses a checkout-local MCP command. The global agent points at the globally installed `aiocs-mcp` binary.
+ The repo-managed agent definition and the install-ready global agent both point at the globally installed `aiocs-mcp` binary so Codex uses the published package by default.

  To expose the install-ready global agent to Codex on this machine:

  ```bash
+ AI_SKILLS_REPO=/absolute/path/to/your/ai-skills/checkout
  mkdir -p ~/.codex/agents
- ln -sfn /Users/jmucha/repos/ai-skills/agents/aiocs-docs-specialist.toml ~/.codex/agents/aiocs-docs-specialist.toml
+ ln -sfn "$AI_SKILLS_REPO/agents/aiocs-docs-specialist.toml" ~/.codex/agents/aiocs-docs-specialist.toml
  ```

  ## Suggested Codex flows
@@ -73,7 +73,7 @@ This section documents the stable top-level `data` payload per command.
  ```json
  {
  "name": "@bodhi-ventures/aiocs",
- "version": "0.1.0"
+ "version": "0.1.1"
  }
  ```

@@ -81,7 +81,11 @@

  ```json
  {
- "sourceSpecDir": "/absolute/path/to/aiocs/sources",
+ "sourceSpecDirs": [
+ "/absolute/path/to/aiocs/sources",
+ "<home>/.aiocs/sources"
+ ],
+ "userSourceDir": "<home>/.aiocs/sources",
  "fetched": false,
  "initializedSources": [
  {
@@ -402,7 +406,7 @@ Summary status values:
  "manifest": {
  "formatVersion": 1,
  "createdAt": "2026-03-26T10:00:00.000Z",
- "packageVersion": "0.1.0",
+ "packageVersion": "0.1.1",
  "entries": [
  {
  "relativePath": "data/catalog.sqlite",
@@ -419,12 +423,12 @@ Summary status values:
  ```json
  {
  "inputDir": "/absolute/path/to/backup",
- "dataDir": "/Users/example/.aiocs/data",
- "configDir": "/Users/example/.aiocs/config",
+ "dataDir": "<home>/.aiocs/data",
+ "configDir": "<home>/.aiocs/config",
  "manifest": {
  "formatVersion": 1,
  "createdAt": "2026-03-26T10:00:00.000Z",
- "packageVersion": "0.1.0",
+ "packageVersion": "0.1.1",
  "entries": []
  }
  }
@@ -495,7 +499,7 @@ Successful MCP tool results:
  "ok": true,
  "data": {
  "name": "@bodhi-ventures/aiocs",
- "version": "0.1.0"
+ "version": "0.1.1"
  }
  }
  ```
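Every payload above sits inside the documented `{ ok, data?, error? }` envelope that the CLI emits as exactly one JSON document in `--json` mode. A minimal consumer sketch, assuming only the envelope fields shown in this contract (`parseEnvelope` is a hypothetical helper, not part of the package):

```javascript
// Parse one `docs --json <command>` stdout document and unwrap its data payload.
// On failure envelopes, surface the machine-readable error.code when present.
function parseEnvelope(stdout) {
  const doc = JSON.parse(stdout);
  if (!doc.ok) {
    const code = doc.error && doc.error.code ? doc.error.code : "unknown";
    throw new Error(`aiocs command failed: ${code}`);
  }
  return doc.data;
}

// Example input matching the documented `version` payload above:
const sample = '{"ok":true,"data":{"name":"@bodhi-ventures/aiocs","version":"0.1.1"}}';
```

Unwrapping at one choke point keeps agents from scattering `ok` checks across every call site.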
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@bodhi-ventures/aiocs",
- "version": "0.1.0",
+ "version": "0.1.2",
  "license": "MIT",
  "type": "module",
  "description": "Local-only documentation store, fetcher, and search CLI for AI agents.",
@@ -17,7 +17,7 @@ Use this skill when you need authoritative local documentation search, inspectio

  - Prefer `aiocs` before live web browsing when the requested docs may already be in the local catalog.
  - Check `source_list` or scoped `search` before assuming a source is missing.
- - Use `aiocs` first for supported built-in sources and for any repo or machine that already relies on `~/.aiocs`.
+ - Use `aiocs` first for the bundled `hyperliquid` source and for any repo or machine that already relies on `~/.aiocs`.
  - If a source is missing, only add it when it is worth curating for future reuse.
  - Prefer `refresh due <source-id>` over force `fetch <source-id>` whenever freshness is the real goal.
  - Do not use `fetch all` as a normal answering path; reserve it for explicit user requests or maintenance flows.
@@ -32,6 +32,8 @@ Use this skill when you need authoritative local documentation search, inspectio
  1. Prefer `aiocs-mcp` when an MCP client can use it directly.
  2. Otherwise use the CLI with the root `--json` flag.
  3. Avoid parsing human-formatted CLI output unless there is no alternative.
+ 4. Assume `docs` and `aiocs-mcp` come from the globally installed `@bodhi-ventures/aiocs` package unless the user explicitly asks for a checkout-local development build.
+ 5. Use `npx -y -p @bodhi-ventures/aiocs ...` only as a fallback when the global install is unavailable.

  ## Search defaults for agents

@@ -49,7 +51,7 @@ Validate the local runtime:
  docs --json doctor
  ```

- Bootstrap the bundled built-in sources:
+ Bootstrap managed sources from the repo bundle and `~/.aiocs/sources`:

  ```bash
  docs --json init --no-fetch
@@ -170,5 +172,5 @@ The `aiocs-mcp` server exposes the same core operations without shell parsing:
  - Newly added or changed sources become due immediately, so `refresh due <source-id>` is the safe first refresh path after upsert.
  - CLI failures expose machine-readable `error.code` fields in `--json` mode.
  - MCP tool results use `{ ok, data?, error? }` envelopes, and `batch` can reduce multiple small MCP round trips.
- - For exact CLI payloads, see `/Users/jmucha/repos/mandex/aiocs/docs/json-contract.md`.
- - For Codex setup and subagent examples, see `/Users/jmucha/repos/mandex/aiocs/docs/codex-integration.md`.
+ - For exact CLI payloads, see `docs/json-contract.md`.
+ - For Codex setup and subagent examples, see `docs/codex-integration.md`.
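The skill rules above reduce to invoking the globally installed `docs` binary with root-level `--json` flags. A sketch of the argv an agent might assemble for the documented scoped search (`buildSearchArgv` is a hypothetical helper; the flag names come straight from the skill text and the README examples):

```javascript
// Build argv for `docs --json search "<query>" [--source <id>]`.
// The root --json flag must precede the subcommand, matching the documented usage.
function buildSearchArgv(query, sourceId) {
  const argv = ["--json", "search", query];
  if (sourceId) {
    argv.push("--source", sourceId);
  }
  return argv;
}
```

Passing an argv array to a process spawner (rather than interpolating a shell string) also sidesteps quoting issues for multi-word queries like `"maker flow"`.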
@@ -1,21 +0,0 @@
- name = "aiocs_docs_specialist"
- description = "Development example specialist for local aiocs documentation search, drift checks, diffs, and health verification through a checkout-local MCP server."
- model = "gpt-5.4-mini"
- model_reasoning_effort = "high"
- sandbox_mode = "read-only"
- nickname_candidates = ["Index", "Ledger", "Compass"]
- developer_instructions = """
- Use aiocs as the first stop for local documentation work.
- Prefer aiocs before live browsing when the requested docs may already exist in the local catalog.
- Check source presence and freshness with source_list before assuming docs are missing or stale.
- Default to search mode auto, switch to lexical for exact identifiers, prefer refresh_due for targeted freshness checks, and use batch when multiple aiocs tool calls are needed in one task.
- If the parent agent explicitly asks for aiocs write operations and a source is missing but likely to be reused, add a spec under ~/.aiocs/sources, upsert it, then refresh only that source.
- Avoid fetch all unless the parent agent explicitly asks for broad maintenance.
- When returning results, include sourceId, snapshotId, and pageUrl when they materially improve traceability.
- Do not edit aiocs source specs, catalog contents, or daemon configuration unless the parent agent explicitly asks for it.
- If aiocs health is in doubt, run doctor before assuming the catalog is broken.
- """
-
- [mcp_servers.aiocs]
- command = "pnpm"
- args = ["--dir", "/absolute/path/to/aiocs", "dev:mcp"]
@@ -1,20 +0,0 @@
- id: ethereal
- label: Ethereal Docs
- startUrls:
- - https://docs.ethereal.trade/
- allowedHosts:
- - docs.ethereal.trade
- discovery:
- include:
- - https://docs.ethereal.trade/**
- exclude: []
- maxPages: 500
- extract:
- strategy: clipboardButton
- interactions:
- - action: click
- selector: button[aria-label="Copy page"]
- normalize:
- prependSourceComment: true
- schedule:
- everyHours: 24
@@ -1,24 +0,0 @@
- id: lighter
- label: Lighter Docs
- startUrls:
- - https://apidocs.lighter.xyz/docs/get-started
- - https://apidocs.lighter.xyz/reference/status
- allowedHosts:
- - apidocs.lighter.xyz
- discovery:
- include:
- - https://apidocs.lighter.xyz/docs/**
- - https://apidocs.lighter.xyz/reference/**
- exclude: []
- maxPages: 800
- extract:
- strategy: clipboardButton
- interactions:
- - action: click
- selector: 'button:has-text("Ask AI")'
- - action: click
- selector: 'div[role="button"]:has-text("Copy Markdown")'
- normalize:
- prependSourceComment: true
- schedule:
- everyHours: 24
package/sources/nado.yaml DELETED
@@ -1,22 +0,0 @@
- id: nado
- label: Nado Docs
- startUrls:
- - https://docs.nado.xyz/
- allowedHosts:
- - docs.nado.xyz
- discovery:
- include:
- - https://docs.nado.xyz/**
- exclude: []
- maxPages: 500
- extract:
- strategy: clipboardButton
- interactions:
- - action: click
- selector: button[aria-label="More"]
- - action: click
- selector: '[role="menuitem"]:has-text("Copy page")'
- normalize:
- prependSourceComment: true
- schedule:
- everyHours: 24
@@ -1,24 +0,0 @@
- id: synthetix
- label: Synthetix Docs
- startUrls:
- - https://developers.synthetix.io/
- allowedHosts:
- - developers.synthetix.io
- discovery:
- include:
- - https://developers.synthetix.io/**
- exclude: []
- maxPages: 500
- extract:
- strategy: clipboardButton
- interactions:
- - action: hover
- selector: .vocs_AiCtaDropdown
- - action: click
- selector: .vocs_AiCtaDropdown_buttonRight
- - action: click
- selector: '[role="menuitem"]:has-text("Copy")'
- normalize:
- prependSourceComment: true
- schedule:
- everyHours: 24