@bodhi-ventures/aiocs 0.1.1 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -30,25 +30,24 @@ For testing or local overrides, set:
  ```bash
  npm install -g @bodhi-ventures/aiocs
  docs --version
+ docs --help
+ command -v aiocs-mcp
  ```

- For repository development:
+ Zero-install fallback:

  ```bash
- pnpm install
- pnpm build
+ npx -y -p @bodhi-ventures/aiocs docs --version
+ npx -y -p @bodhi-ventures/aiocs aiocs-mcp
  ```

- Run the CLI during development with:
+ For repository development only:

  ```bash
+ pnpm install
+ pnpm build
  pnpm dev -- --help
- ```
-
- Or after build:
-
- ```bash
- ./dist/cli.js --help
+ pnpm dev:mcp
  ```

  For AI agents, prefer the root-level `--json` flag for one-shot commands:
@@ -57,9 +56,9 @@ For AI agents, prefer the root-level `--json` flag for one-shot commands:
  docs --json version
  docs --json doctor
  docs --json init --no-fetch
- pnpm dev -- --json source list
- pnpm dev -- --json search "maker flow" --source hyperliquid
- pnpm dev -- --json show 42
+ docs --json source list
+ docs --json search "maker flow" --source hyperliquid
+ docs --json show 42
  ```

  `--json` emits exactly one JSON document to stdout with this envelope:
@@ -109,7 +108,7 @@ GitHub Actions publishes `@bodhi-ventures/aiocs` publicly to npm and creates the

  ## Codex integration

- For Codex-first setup, automatic-use guidance, MCP recommendations, and subagent examples, see [docs/codex-integration.md](./docs/codex-integration.md).
+ For Codex-first setup, automatic-use guidance, MCP recommendations, and agent definitions, see [docs/codex-integration.md](./docs/codex-integration.md).

  ## Managed sources

@@ -140,50 +139,50 @@ Register a source:
  ```bash
  mkdir -p ~/.aiocs/sources
  cp /path/to/source.yaml ~/.aiocs/sources/my-source.yaml
- pnpm dev -- source upsert ~/.aiocs/sources/my-source.yaml
- pnpm dev -- source upsert /path/to/source.yaml
- pnpm dev -- source list
+ docs source upsert ~/.aiocs/sources/my-source.yaml
+ docs source upsert /path/to/source.yaml
+ docs source list
  ```

  Fetch and snapshot docs:

  ```bash
- pnpm dev -- refresh due hyperliquid
- pnpm dev -- snapshot list hyperliquid
- pnpm dev -- refresh due
+ docs refresh due hyperliquid
+ docs snapshot list hyperliquid
+ docs refresh due
  ```

  Force fetch remains available for explicit maintenance:

  ```bash
- pnpm dev -- fetch hyperliquid
- pnpm dev -- fetch all
+ docs fetch hyperliquid
+ docs fetch all
  ```

  Link docs to a local project:

  ```bash
- pnpm dev -- project link /absolute/path/to/project hyperliquid lighter
- pnpm dev -- project unlink /absolute/path/to/project lighter
+ docs project link /absolute/path/to/project hyperliquid lighter
+ docs project unlink /absolute/path/to/project lighter
  ```

  Search and inspect results:

  ```bash
- pnpm dev -- search "maker flow" --source hyperliquid
- pnpm dev -- search "maker flow" --source hyperliquid --mode lexical
- pnpm dev -- search "maker flow" --source hyperliquid --mode hybrid
- pnpm dev -- search "maker flow" --source hyperliquid --mode semantic
- pnpm dev -- search "maker flow" --all
- pnpm dev -- search "maker flow" --source hyperliquid --limit 5 --offset 0
- pnpm dev -- show 42
- pnpm dev -- canary hyperliquid
- pnpm dev -- diff hyperliquid
- pnpm dev -- embeddings status
- pnpm dev -- embeddings backfill all
- pnpm dev -- embeddings run
- pnpm dev -- backup export /absolute/path/to/backup
- pnpm dev -- verify coverage hyperliquid /absolute/path/to/reference.md
+ docs search "maker flow" --source hyperliquid
+ docs search "maker flow" --source hyperliquid --mode lexical
+ docs search "maker flow" --source hyperliquid --mode hybrid
+ docs search "maker flow" --source hyperliquid --mode semantic
+ docs search "maker flow" --all
+ docs search "maker flow" --source hyperliquid --limit 5 --offset 0
+ docs show 42
+ docs canary hyperliquid
+ docs diff hyperliquid
+ docs embeddings status
+ docs embeddings backfill all
+ docs embeddings run
+ docs backup export /absolute/path/to/backup
+ docs verify coverage hyperliquid /absolute/path/to/reference.md
  ```

  When `docs search` runs inside a linked project, it automatically scopes to that project's linked sources unless `--source` or `--all` is provided.
@@ -279,22 +278,22 @@ All one-shot commands support `--json`:
  Representative examples:

  ```bash
- pnpm dev -- --json doctor
- pnpm dev -- --json init --no-fetch
- pnpm dev -- --json source list
- pnpm dev -- --json source upsert sources/hyperliquid.yaml
- pnpm dev -- --json refresh due hyperliquid
- pnpm dev -- --json canary hyperliquid
- pnpm dev -- --json refresh due
- pnpm dev -- --json diff hyperliquid
- pnpm dev -- --json embeddings status
- pnpm dev -- --json embeddings backfill all
- pnpm dev -- --json embeddings clear hyperliquid
- pnpm dev -- --json embeddings run
- pnpm dev -- --json project link /absolute/path/to/project hyperliquid lighter
- pnpm dev -- --json snapshot list hyperliquid
- pnpm dev -- --json backup export /absolute/path/to/backup
- pnpm dev -- --json verify coverage hyperliquid /absolute/path/to/reference.md
+ docs --json doctor
+ docs --json init --no-fetch
+ docs --json source list
+ docs --json source upsert sources/hyperliquid.yaml
+ docs --json refresh due hyperliquid
+ docs --json canary hyperliquid
+ docs --json refresh due
+ docs --json diff hyperliquid
+ docs --json embeddings status
+ docs --json embeddings backfill all
+ docs --json embeddings clear hyperliquid
+ docs --json embeddings run
+ docs --json project link /absolute/path/to/project hyperliquid lighter
+ docs --json snapshot list hyperliquid
+ docs --json backup export /absolute/path/to/backup
+ docs --json verify coverage hyperliquid /absolute/path/to/reference.md
  ```

  For multi-result commands like `fetch`, `refresh due`, and `search`, `data` contains structured collections rather than line-by-line output:
@@ -321,8 +320,7 @@ For multi-result commands like `fetch`, `refresh due`, and `search`, `data` cont
  `aiocs` ships a first-class long-running refresh process:

  ```bash
- pnpm dev -- daemon
- ./dist/cli.js daemon
+ docs daemon
  ```

  The daemon bootstraps source specs from the configured directories, refreshes due sources, sleeps for the configured interval, and repeats.
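The bootstrap/refresh/sleep cycle described above can be sketched minimally; `refreshDueSources`, `intervalMs`, and `maxCycles` here are hypothetical stand-ins for illustration, not the package's actual internals:

```javascript
// Minimal sketch of a refresh daemon loop: do work, sleep, repeat.
// All names here are hypothetical stand-ins, not aiocs's real API.
import { setTimeout as sleep } from "timers/promises";

async function runDaemonLoop({ refreshDueSources, intervalMs, maxCycles = Infinity }) {
  let cycles = 0;
  while (cycles < maxCycles) {
    await refreshDueSources(); // refresh every source that is currently due
    cycles += 1;
    if (cycles < maxCycles) {
      await sleep(intervalMs); // sleep for the configured interval, then repeat
    }
  }
  return cycles;
}
```

A real daemon would run with `maxCycles = Infinity`; bounding it makes the loop testable.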
@@ -353,7 +351,7 @@ For local agents, the daemon keeps the shared catalog under `~/.aiocs` warm whil
  `docs daemon --json` is intentionally different from one-shot commands. Because it is long-running, it emits one JSON event per line:

  ```bash
- ./dist/cli.js --json daemon
+ docs --json daemon
  ```

  Example event stream:
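Because the daemon emits one JSON event per line (newline-delimited JSON), a consumer can parse its stdout line by line; the event fields in `sample` below are hypothetical, not the real aiocs event schema:

```javascript
// Parse a newline-delimited JSON (NDJSON) stream chunk into event objects.
// The event shapes in `sample` are hypothetical illustrations.
function parseNdjson(chunk) {
  return chunk
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line));
}

const sample = '{"event":"cycle_start"}\n{"event":"cycle_end","refreshed":2}\n';
const events = parseNdjson(sample);
```

In practice the consumer would buffer partial lines across stream chunks before parsing.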
@@ -369,7 +367,13 @@ Example event stream:
  `aiocs` also ships an MCP server binary for tool-native agent integrations:

  ```bash
+ command -v aiocs-mcp
  aiocs-mcp
+ ```
+
+ For repository development only:
+
+ ```bash
  pnpm dev:mcp
  ```

@@ -43,9 +43,108 @@ function toAiocsError(error) {
  return new AiocsError(AIOCS_ERROR_CODES.internalError, String(error));
  }

- // src/catalog/catalog.ts
+ // src/runtime/paths.ts
+ import { homedir } from "os";
+ import { join as join2, relative, resolve, sep } from "path";
  import { mkdirSync } from "fs";
- import { join, resolve as resolve2 } from "path";
+
+ // src/runtime/bundled-sources.ts
+ import { existsSync } from "fs";
+ import { dirname, join } from "path";
+ import { fileURLToPath } from "url";
+ function findPackageRoot(startDir) {
+ let currentDir = startDir;
+ while (true) {
+ if (existsSync(join(currentDir, "package.json")) && existsSync(join(currentDir, "sources"))) {
+ return currentDir;
+ }
+ const parentDir = dirname(currentDir);
+ if (parentDir === currentDir) {
+ throw new Error(`Could not locate aiocs package root from ${startDir}`);
+ }
+ currentDir = parentDir;
+ }
+ }
+ function getBundledSourcesDir() {
+ const currentFilePath = fileURLToPath(import.meta.url);
+ const packageRoot = findPackageRoot(dirname(currentFilePath));
+ return join(packageRoot, "sources");
+ }
+
+ // src/runtime/paths.ts
+ var PORTABLE_USER_SOURCES_PREFIX = "~/.aiocs/sources";
+ var PORTABLE_BUNDLED_SOURCES_PREFIX = "aiocs://bundled";
+ var CONTAINER_USER_SOURCES_DIR = "/root/.aiocs/sources";
+ var CONTAINER_BUNDLED_SOURCES_DIR = "/app/sources";
+ function expandTilde(path) {
+ if (path === "~") {
+ return homedir();
+ }
+ if (path.startsWith("~/")) {
+ return join2(homedir(), path.slice(2));
+ }
+ return path;
+ }
+ function getAiocsDataDir(env = process.env) {
+ const override = env.AIOCS_DATA_DIR;
+ if (override) {
+ mkdirSync(expandTilde(override), { recursive: true });
+ return expandTilde(override);
+ }
+ const target = join2(homedir(), ".aiocs", "data");
+ mkdirSync(target, { recursive: true });
+ return target;
+ }
+ function getAiocsConfigDir(env = process.env) {
+ const override = env.AIOCS_CONFIG_DIR;
+ if (override) {
+ mkdirSync(expandTilde(override), { recursive: true });
+ return expandTilde(override);
+ }
+ const target = join2(homedir(), ".aiocs", "config");
+ mkdirSync(target, { recursive: true });
+ return target;
+ }
+ function getAiocsSourcesDir(env = process.env) {
+ const override = env.AIOCS_SOURCES_DIR;
+ if (override) {
+ mkdirSync(expandTilde(override), { recursive: true });
+ return expandTilde(override);
+ }
+ const target = join2(homedir(), ".aiocs", "sources");
+ mkdirSync(target, { recursive: true });
+ return target;
+ }
+ function isWithinRoot(candidatePath, rootPath) {
+ return candidatePath === rootPath || candidatePath.startsWith(`${rootPath}${sep}`);
+ }
+ function toPortablePath(prefix, rootPath, candidatePath) {
+ const relativePath = relative(rootPath, candidatePath).split(sep).join("/");
+ return relativePath ? `${prefix}/${relativePath}` : prefix;
+ }
+ function canonicalizeManagedSpecPath(specPath, env = process.env) {
+ if (specPath === PORTABLE_USER_SOURCES_PREFIX || specPath.startsWith(`${PORTABLE_USER_SOURCES_PREFIX}/`) || specPath === PORTABLE_BUNDLED_SOURCES_PREFIX || specPath.startsWith(`${PORTABLE_BUNDLED_SOURCES_PREFIX}/`)) {
+ return specPath;
+ }
+ const resolvedPath = resolve(specPath);
+ const userRoots = [resolve(getAiocsSourcesDir(env)), CONTAINER_USER_SOURCES_DIR];
+ for (const rootPath of userRoots) {
+ if (isWithinRoot(resolvedPath, rootPath)) {
+ return toPortablePath(PORTABLE_USER_SOURCES_PREFIX, rootPath, resolvedPath);
+ }
+ }
+ const bundledRoots = [resolve(getBundledSourcesDir()), CONTAINER_BUNDLED_SOURCES_DIR];
+ for (const rootPath of bundledRoots) {
+ if (isWithinRoot(resolvedPath, rootPath)) {
+ return toPortablePath(PORTABLE_BUNDLED_SOURCES_PREFIX, rootPath, resolvedPath);
+ }
+ }
+ return resolvedPath;
+ }
+
+ // src/catalog/catalog.ts
+ import { mkdirSync as mkdirSync2 } from "fs";
+ import { join as join3, resolve as resolve3 } from "path";
  import { randomUUID } from "crypto";
  import Database from "better-sqlite3";

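The path helpers added in the hunk above are small enough to exercise standalone; this reproduces `expandTilde` and `toPortablePath` from the diff (with plain `join`/`relative` instead of the bundler's aliased imports):

```javascript
import { homedir } from "os";
import { join, relative, sep } from "path";

// Reproduced from the diff above: expand a leading "~" to the home directory.
function expandTilde(path) {
  if (path === "~") return homedir();
  if (path.startsWith("~/")) return join(homedir(), path.slice(2));
  return path;
}

// Reproduced from the diff above: rewrite an absolute path under a known
// root into a portable, prefix-relative form (forward slashes on all OSes).
function toPortablePath(prefix, rootPath, candidatePath) {
  const relativePath = relative(rootPath, candidatePath).split(sep).join("/");
  return relativePath ? `${prefix}/${relativePath}` : prefix;
}
```

On a POSIX machine, `toPortablePath("~/.aiocs/sources", "/root/.aiocs/sources", "/root/.aiocs/sources/my.yaml")` yields `~/.aiocs/sources/my.yaml`, which is why the catalog can store spec paths that survive moves between host and container.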
@@ -152,12 +251,12 @@ function buildSnapshotFingerprint(input) {

  // src/catalog/project-scope.ts
  import { realpathSync } from "fs";
- import { resolve } from "path";
+ import { resolve as resolve2 } from "path";
  function isWithin(candidate, root) {
  return candidate === root || candidate.startsWith(`${root}/`);
  }
  function canonicalizeProjectPath(path) {
- const resolved = resolve(path);
+ const resolved = resolve2(path);
  try {
  return realpathSync.native(resolved);
  } catch {
@@ -511,9 +610,9 @@ function assertPaginationValue(value, field, fallback) {
  return value;
  }
  function openCatalog(options) {
- const dataDir = resolve2(options.dataDir);
- mkdirSync(dataDir, { recursive: true });
- const db = new Database(join(dataDir, "catalog.sqlite"));
+ const dataDir = resolve3(options.dataDir);
+ mkdirSync2(dataDir, { recursive: true });
+ const db = new Database(join3(dataDir, "catalog.sqlite"));
  initSchema(db);
  const listProjectLinks = () => {
  const rows = db.prepare("SELECT project_path, source_id FROM project_links ORDER BY project_path, source_id").all();
@@ -788,7 +887,7 @@ function openCatalog(options) {
  const timestamp = nowIso();
  const configHash = sha256(stableStringify(spec));
  const existing = db.prepare("SELECT id, created_at, next_due_at, next_canary_due_at, config_hash FROM sources WHERE id = ?").get(spec.id);
- const resolvedSpecPath = options2?.specPath ? resolve2(options2.specPath) : null;
+ const resolvedSpecPath = options2?.specPath ? canonicalizeManagedSpecPath(options2.specPath) : null;
  const nextDueAt = !existing ? timestamp : existing.config_hash === configHash ? existing.next_due_at : timestamp;
  const canaryConfig = resolveSourceCanary(spec);
  const nextCanaryDueAt = !existing ? timestamp : existing.config_hash === configHash ? existing.next_canary_due_at ?? addHoursIso(canaryConfig.everyHours) : timestamp;
@@ -851,7 +950,7 @@ function openCatalog(options) {
  return rows.map((row) => ({
  id: row.id,
  label: row.label,
- specPath: row.spec_path,
+ specPath: row.spec_path ? canonicalizeManagedSpecPath(row.spec_path) : null,
  nextDueAt: row.next_due_at,
  isDue: Date.parse(row.next_due_at) <= Date.now(),
  nextCanaryDueAt: row.next_canary_due_at,
@@ -1086,8 +1185,9 @@ function openCatalog(options) {
  return [];
  }
  const activeSourceKeys = new Set(
- input.activeSources.map((source) => `${source.sourceId}::${resolve2(source.specPath)}`)
+ input.activeSources.map((source) => `${source.sourceId}::${canonicalizeManagedSpecPath(source.specPath)}`)
  );
+ const normalizedManagedRoots = input.managedRoots.map((managedRoot) => canonicalizeManagedSpecPath(managedRoot));
  const rows = db.prepare(`
  SELECT id, spec_path
  FROM sources
@@ -1098,8 +1198,8 @@ function openCatalog(options) {
  if (!row.spec_path) {
  return false;
  }
- const normalizedSpecPath = resolve2(row.spec_path);
- return input.managedRoots.some(
+ const normalizedSpecPath = canonicalizeManagedSpecPath(row.spec_path);
+ return normalizedManagedRoots.some(
  (managedRoot) => normalizedSpecPath === managedRoot || normalizedSpecPath.startsWith(`${managedRoot}/`)
  ) && !activeSourceKeys.has(`${row.id}::${normalizedSpecPath}`);
  }).map((row) => row.id);
@@ -1718,57 +1818,14 @@ function openCatalog(options) {
  };
  }

- // src/runtime/paths.ts
- import { homedir } from "os";
- import { join as join2 } from "path";
- import { mkdirSync as mkdirSync2 } from "fs";
- function expandTilde(path) {
- if (path === "~") {
- return homedir();
- }
- if (path.startsWith("~/")) {
- return join2(homedir(), path.slice(2));
- }
- return path;
- }
- function getAiocsDataDir(env = process.env) {
- const override = env.AIOCS_DATA_DIR;
- if (override) {
- mkdirSync2(expandTilde(override), { recursive: true });
- return expandTilde(override);
- }
- const target = join2(homedir(), ".aiocs", "data");
- mkdirSync2(target, { recursive: true });
- return target;
- }
- function getAiocsConfigDir(env = process.env) {
- const override = env.AIOCS_CONFIG_DIR;
- if (override) {
- mkdirSync2(expandTilde(override), { recursive: true });
- return expandTilde(override);
- }
- const target = join2(homedir(), ".aiocs", "config");
- mkdirSync2(target, { recursive: true });
- return target;
- }
- function getAiocsSourcesDir(env = process.env) {
- const override = env.AIOCS_SOURCES_DIR;
- if (override) {
- mkdirSync2(expandTilde(override), { recursive: true });
- return expandTilde(override);
- }
- const target = join2(homedir(), ".aiocs", "sources");
- mkdirSync2(target, { recursive: true });
- return target;
- }
-
  // src/daemon.ts
- import { resolve as resolve4 } from "path";
+ import { existsSync as existsSync2 } from "fs";
+ import { resolve as resolve5 } from "path";
  import { setTimeout as sleep2 } from "timers/promises";

  // src/fetch/fetch-source.ts
  import { mkdirSync as mkdirSync3, writeFileSync } from "fs";
- import { join as join3 } from "path";
+ import { join as join4 } from "path";
  import { setTimeout as sleep } from "timers/promises";
  import { chromium } from "playwright";

@@ -2051,11 +2108,11 @@ async function extractRawMarkdownPage(url, response) {
  };
  }
  function persistSnapshotPages(input, snapshotId, pages) {
- const snapshotDir = join3(input.dataDir, "sources", input.sourceId, "snapshots", snapshotId, "pages");
+ const snapshotDir = join4(input.dataDir, "sources", input.sourceId, "snapshots", snapshotId, "pages");
  mkdirSync3(snapshotDir, { recursive: true });
  pages.forEach((page, index) => {
  const filename = `${String(index + 1).padStart(3, "0")}-${slugify(page.title)}.md`;
- writeFileSync(join3(snapshotDir, filename), page.markdown, "utf8");
+ writeFileSync(join4(snapshotDir, filename), page.markdown, "utf8");
  });
  }
  function resolveEnvValue(name, env) {
@@ -2404,6 +2461,30 @@ function getEmbeddingModelKey(config) {
  function normalizeBaseUrl(baseUrl) {
  return baseUrl.endsWith("/") ? baseUrl.slice(0, -1) : baseUrl;
  }
+ function normalizeEmbeddingWhitespace(value) {
+ return value.replace(/\s+/g, " ").trim();
+ }
+ function truncateEmbeddingText(value, maxChars) {
+ if (value.length <= maxChars) {
+ return value;
+ }
+ const slice = value.slice(0, maxChars);
+ const lastWhitespace = slice.lastIndexOf(" ");
+ if (lastWhitespace >= Math.floor(maxChars * 0.8)) {
+ return slice.slice(0, lastWhitespace).trim();
+ }
+ return slice.trim();
+ }
+ function prepareTextForEmbedding(markdown, maxChars) {
+ const withoutComments = markdown.replace(/<!--[\s\S]*?-->/g, " ");
+ const withoutImages = withoutComments.replace(/!\[([^\]]*)\]\(([^)]+)\)/g, "$1");
+ const withoutLinks = withoutImages.replace(/\[([^\]]+)\]\(([^)]+)\)/g, "$1");
+ const withoutHtml = withoutLinks.replace(/<[^>]+>/g, " ");
+ const withoutCodeFenceMarkers = withoutHtml.replace(/```[^\n]*\n/g, "\n").replace(/```/g, "\n");
+ const withoutInlineCodeTicks = withoutCodeFenceMarkers.replace(/`([^`]+)`/g, "$1");
+ const normalized = normalizeEmbeddingWhitespace(withoutInlineCodeTicks);
+ return truncateEmbeddingText(normalized, maxChars);
+ }
  async function parseJsonResponse(response) {
  const text = await response.text();
  if (!text) {
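The embedding pre-processing added in the hunk above (strip markdown syntax, collapse whitespace, truncate near a word boundary) can be exercised standalone; this reproduces the three functions from the diff:

```javascript
// Reproduced from the diff above: collapse runs of whitespace.
function normalizeEmbeddingWhitespace(value) {
  return value.replace(/\s+/g, " ").trim();
}

// Reproduced from the diff above: cut at maxChars, preferring the last
// word boundary if it falls in the final 20% of the budget.
function truncateEmbeddingText(value, maxChars) {
  if (value.length <= maxChars) {
    return value;
  }
  const slice = value.slice(0, maxChars);
  const lastWhitespace = slice.lastIndexOf(" ");
  if (lastWhitespace >= Math.floor(maxChars * 0.8)) {
    return slice.slice(0, lastWhitespace).trim();
  }
  return slice.trim();
}

// Reproduced from the diff above: strip comments, images, links, HTML tags,
// and code markers before normalizing and truncating.
function prepareTextForEmbedding(markdown, maxChars) {
  const withoutComments = markdown.replace(/<!--[\s\S]*?-->/g, " ");
  const withoutImages = withoutComments.replace(/!\[([^\]]*)\]\(([^)]+)\)/g, "$1");
  const withoutLinks = withoutImages.replace(/\[([^\]]+)\]\(([^)]+)\)/g, "$1");
  const withoutHtml = withoutLinks.replace(/<[^>]+>/g, " ");
  const withoutCodeFenceMarkers = withoutHtml.replace(/```[^\n]*\n/g, "\n").replace(/```/g, "\n");
  const withoutInlineCodeTicks = withoutCodeFenceMarkers.replace(/`([^`]+)`/g, "$1");
  const normalized = normalizeEmbeddingWhitespace(withoutInlineCodeTicks);
  return truncateEmbeddingText(normalized, maxChars);
}
```

Link text and inline-code content survive; only the markup is dropped, so `embedTexts` sends prose-like input to the embedding model.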
@@ -2422,6 +2503,7 @@ async function embedTexts(config, texts) {
  if (texts.length === 0) {
  return [];
  }
+ const preparedTexts = texts.map((text) => prepareTextForEmbedding(text, config.ollamaMaxInputChars));
  const response = await fetch(`${normalizeBaseUrl(config.ollamaBaseUrl)}/api/embed`, {
  method: "POST",
  headers: {
@@ -2430,7 +2512,7 @@ async function embedTexts(config, texts) {
  signal: AbortSignal.timeout(config.ollamaTimeoutMs),
  body: JSON.stringify({
  model: config.ollamaEmbeddingModel,
- input: texts
+ input: preparedTexts
  })
  }).catch((error) => {
  throw new AiocsError(
@@ -2823,29 +2905,6 @@ async function processEmbeddingJobs(input) {
  };
  }

- // src/runtime/bundled-sources.ts
- import { existsSync } from "fs";
- import { dirname, join as join4 } from "path";
- import { fileURLToPath } from "url";
- function findPackageRoot(startDir) {
- let currentDir = startDir;
- while (true) {
- if (existsSync(join4(currentDir, "package.json")) && existsSync(join4(currentDir, "sources"))) {
- return currentDir;
- }
- const parentDir = dirname(currentDir);
- if (parentDir === currentDir) {
- throw new Error(`Could not locate aiocs package root from ${startDir}`);
- }
- currentDir = parentDir;
- }
- }
- function getBundledSourcesDir() {
- const currentFilePath = fileURLToPath(import.meta.url);
- const packageRoot = findPackageRoot(dirname(currentFilePath));
- return join4(packageRoot, "sources");
- }
-
  // src/runtime/hybrid-config.ts
  function parsePositiveInteger(value, field, fallback) {
  if (typeof value === "undefined" || value.trim() === "") {
@@ -2888,7 +2947,8 @@ function getHybridRuntimeConfig(env = process.env) {
  embeddingProvider: "ollama",
  ollamaBaseUrl: env.AIOCS_OLLAMA_BASE_URL ?? "http://127.0.0.1:11434",
  ollamaEmbeddingModel: env.AIOCS_OLLAMA_EMBEDDING_MODEL ?? "nomic-embed-text",
- ollamaTimeoutMs: parsePositiveInteger(env.AIOCS_OLLAMA_TIMEOUT_MS, "AIOCS_OLLAMA_TIMEOUT_MS", 1e3),
+ ollamaTimeoutMs: parsePositiveInteger(env.AIOCS_OLLAMA_TIMEOUT_MS, "AIOCS_OLLAMA_TIMEOUT_MS", 1e4),
+ ollamaMaxInputChars: parsePositiveInteger(env.AIOCS_OLLAMA_MAX_INPUT_CHARS, "AIOCS_OLLAMA_MAX_INPUT_CHARS", 4e3),
  embeddingBatchSize: parsePositiveInteger(env.AIOCS_EMBEDDING_BATCH_SIZE, "AIOCS_EMBEDDING_BATCH_SIZE", 32),
  embeddingJobsPerCycle: parsePositiveInteger(env.AIOCS_EMBEDDING_JOB_LIMIT_PER_CYCLE, "AIOCS_EMBEDDING_JOB_LIMIT_PER_CYCLE", 2),
  lexicalCandidateWindow: parsePositiveInteger(env.AIOCS_LEXICAL_CANDIDATE_WINDOW, "AIOCS_LEXICAL_CANDIDATE_WINDOW", 40),
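The config hunk above raises the Ollama timeout default to 1e4 (10 s) and adds a 4e3-character input cap, both overridable via environment variables through `parsePositiveInteger`. Only that function's signature and its undefined/empty fallback branch appear in this diff; the validation branch below is an assumption sketched for illustration:

```javascript
// Sketch of the env-with-fallback pattern used by getHybridRuntimeConfig.
// The rejection branch is an assumption; the diff only shows the signature
// and the undefined/empty-string fallback.
function parsePositiveInteger(value, field, fallback) {
  if (typeof value === "undefined" || value.trim() === "") {
    return fallback; // unset or blank env var: use the built-in default
  }
  const parsed = Number(value);
  if (!Number.isInteger(parsed) || parsed <= 0) {
    throw new Error(`${field} must be a positive integer, got "${value}"`);
  }
  return parsed;
}
```

With an empty environment, `AIOCS_OLLAMA_TIMEOUT_MS` falls back to 10000 and `AIOCS_OLLAMA_MAX_INPUT_CHARS` to 4000 under this version.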
@@ -2900,13 +2960,13 @@ function getHybridRuntimeConfig(env = process.env) {
  // src/spec/source-spec-files.ts
  import { access, readdir } from "fs/promises";
  import { constants as fsConstants } from "fs";
- import { extname as extname2, join as join5, resolve as resolve3 } from "path";
+ import { extname as extname2, join as join5, resolve as resolve4 } from "path";
  var SOURCE_SPEC_EXTENSIONS = /* @__PURE__ */ new Set([".yaml", ".yml", ".json"]);
  function uniqueResolvedPaths(paths) {
  const seen = /* @__PURE__ */ new Set();
  const unique = [];
  for (const rawPath of paths) {
- const normalized = resolve3(rawPath);
+ const normalized = resolve4(rawPath);
  if (seen.has(normalized)) {
  continue;
  }
@@ -2973,10 +3033,11 @@ function parseBoolean(raw, variableName) {
  function parseDaemonConfig(env, options = {}) {
  const intervalMinutes = env.AIOCS_DAEMON_INTERVAL_MINUTES ? parsePositiveInteger2(env.AIOCS_DAEMON_INTERVAL_MINUTES, "AIOCS_DAEMON_INTERVAL_MINUTES") : DEFAULT_INTERVAL_MINUTES;
  const fetchOnStart = env.AIOCS_DAEMON_FETCH_ON_START ? parseBoolean(env.AIOCS_DAEMON_FETCH_ON_START, "AIOCS_DAEMON_FETCH_ON_START") : true;
+ const defaultContainerSourceDir = options.containerSourceDir ?? (existsSync2(DEFAULT_CONTAINER_SOURCE_DIR) ? DEFAULT_CONTAINER_SOURCE_DIR : void 0);
  const defaultSourceDirs = uniqueResolvedPaths([
  options.bundledSourceDir ?? getBundledSourcesDir(),
  options.userSourceDir ?? getAiocsSourcesDir(env),
- options.containerSourceDir ?? DEFAULT_CONTAINER_SOURCE_DIR
+ ...defaultContainerSourceDir ? [defaultContainerSourceDir] : []
  ]);
  const sourceSpecDirs = env.AIOCS_SOURCE_SPEC_DIRS ? uniqueResolvedPaths(
  env.AIOCS_SOURCE_SPEC_DIRS.split(",").map((entry) => entry.trim()).filter(Boolean)
@@ -3023,7 +3084,7 @@ async function bootstrapSourceSpecs(input) {
  throw new Error(`No source spec files found in configured directories: ${normalizedSourceSpecDirs.join(", ")}`);
  }
  const removedSourceIds = input.catalog.removeManagedSources({
- managedRoots: existingDirs.map((sourceSpecDir) => resolve4(sourceSpecDir)),
+ managedRoots: existingDirs.map((sourceSpecDir) => resolve5(sourceSpecDir)),
  activeSources: sources.map((source) => ({
  sourceId: source.sourceId,
  specPath: source.specPath
@@ -3208,7 +3269,7 @@ async function startDaemon(input) {
  // package.json
  var package_default = {
  name: "@bodhi-ventures/aiocs",
- version: "0.1.1",
+ version: "0.1.2",
  license: "MIT",
  type: "module",
  description: "Local-only documentation store, fetcher, and search CLI for AI agents.",
@@ -3287,11 +3348,11 @@ var packageVersion = package_default.version;
  var packageDescription = package_default.description;

  // src/services.ts
- import { resolve as resolve7 } from "path";
+ import { resolve as resolve8 } from "path";

  // src/backup.ts
  import { cp, mkdir, readdir as readdir2, readFile as readFile2, rename, rm, stat, writeFile } from "fs/promises";
- import { basename, dirname as dirname2, join as join6, resolve as resolve5 } from "path";
+ import { basename, dirname as dirname2, join as join6, resolve as resolve6 } from "path";
  import { randomUUID as randomUUID2 } from "crypto";
  import Database2 from "better-sqlite3";
  var CATALOG_DB_FILENAME = "catalog.sqlite";
@@ -3427,9 +3488,9 @@ async function prepareReplacementTarget(backupDir, targetDir) {
  return stagingDir;
  }
  async function exportBackup(input) {
- const dataDir = resolve5(input.dataDir);
- const outputDir = resolve5(input.outputDir);
- const configDir = input.configDir ? resolve5(input.configDir) : void 0;
+ const dataDir = resolve6(input.dataDir);
+ const outputDir = resolve6(input.outputDir);
+ const configDir = input.configDir ? resolve6(input.configDir) : void 0;
  await assertSourceDirExists(dataDir);
  if (!await isDirectoryEmpty(outputDir)) {
  if (!input.replaceExisting) {
@@ -3465,9 +3526,9 @@ async function exportBackup(input) {
  };
  }
  async function importBackup(input) {
- const inputDir = resolve5(input.inputDir);
- const dataDir = resolve5(input.dataDir);
- const configDir = input.configDir ? resolve5(input.configDir) : void 0;
+ const inputDir = resolve6(input.inputDir);
+ const dataDir = resolve6(input.dataDir);
+ const configDir = input.configDir ? resolve6(input.configDir) : void 0;
  const { manifest, backupDataDir, backupConfigDir } = await loadValidatedBackupPayload(inputDir);
  if (!await isDirectoryEmpty(dataDir)) {
  if (!input.replaceExisting) {
@@ -3511,7 +3572,7 @@ async function importBackup(input) {

  // src/coverage.ts
  import { readFile as readFile3 } from "fs/promises";
- import { resolve as resolve6 } from "path";
+ import { resolve as resolve7 } from "path";
  function normalizeText(value) {
  return value.replace(/[`*_~]+/g, "").replace(/\s+/g, " ").trim().toLowerCase();
  }
@@ -3560,7 +3621,7 @@ async function verifyCoverageAgainstReferences(corpus, referenceFiles) {
  body: 0
  };
  for (const referenceFile of referenceFiles) {
- const resolvedReferenceFile = resolve6(referenceFile);
+ const resolvedReferenceFile = resolve7(referenceFile);
  let raw;
  try {
  raw = await readFile3(resolvedReferenceFile, "utf8");
@@ -4226,7 +4287,7 @@ function withCatalog(run) {
  return Promise.resolve(run(ctx)).finally(() => ctx.catalog.close());
  }
  async function upsertSourceFromSpecFile(specFile) {
- const specPath = resolve7(specFile);
+ const specPath = resolve8(specFile);
  const spec = await loadSourceSpec(specPath);
  const result = await withCatalog(({ catalog }) => catalog.upsertSource(spec, { specPath }));
  return {
@@ -4322,7 +4383,7 @@ async function diffSnapshotsForSource(input) {
  return withCatalog(({ catalog }) => catalog.diffSnapshots(input));
  }
  async function linkProjectSources(projectPath, sourceIds) {
- const resolvedProjectPath = resolve7(projectPath);
+ const resolvedProjectPath = resolve8(projectPath);
  await withCatalog(({ catalog }) => {
  catalog.linkProject(resolvedProjectPath, sourceIds);
  });
@@ -4332,7 +4393,7 @@ async function linkProjectSources(projectPath, sourceIds) {
  };
  }
  async function unlinkProjectSources(projectPath, sourceIds) {
- const resolvedProjectPath = resolve7(projectPath);
+ const resolvedProjectPath = resolve8(projectPath);
  await withCatalog(({ catalog }) => {
  catalog.unlinkProject(resolvedProjectPath, sourceIds);
  });
@@ -4342,7 +4403,7 @@ async function unlinkProjectSources(projectPath, sourceIds) {
  };
  }
  async function searchCatalog(query, options) {
- const cwd = options.project ? resolve7(options.project) : process.cwd();
+ const cwd = options.project ? resolve8(options.project) : process.cwd();
  const explicitSources = options.source.length > 0;
  const results = await withCatalog(({ catalog }) => {
  const hybridConfig = getHybridRuntimeConfig();
@@ -4508,9 +4569,9 @@ export {
  AIOCS_ERROR_CODES,
  AiocsError,
  toAiocsError,
- openCatalog,
  getAiocsDataDir,
  getAiocsConfigDir,
+ openCatalog,
  parseDaemonConfig,
  startDaemon,
  packageName,
package/dist/cli.js CHANGED
@@ -31,7 +31,7 @@ import {
  unlinkProjectSources,
  upsertSourceFromSpecFile,
  verifyCoverage
- } from "./chunk-AJ5NZDK4.js";
+ } from "./chunk-CZ6C4YUX.js";

  // src/cli.ts
  import { Command, CommanderError as CommanderError2 } from "commander";
@@ -26,7 +26,7 @@ import {
  unlinkProjectSources,
  upsertSourceFromSpecFile,
  verifyCoverage
- } from "./chunk-AJ5NZDK4.js";
+ } from "./chunk-CZ6C4YUX.js";

  // src/mcp-server.ts
  import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
package/docs/README.md CHANGED
@@ -9,4 +9,4 @@ Good candidates:
  - operational runbooks
  - decisions worth preserving across sessions
  - Codex integration guidance in `codex-integration.md`
- - reusable agent examples under `examples/codex-agents/`
+ - reusable repo-managed agent definitions under `../agents/`
@@ -9,7 +9,14 @@ Install the CLI and MCP binary globally:
  ```bash
  npm install -g @bodhi-ventures/aiocs
  docs --version
- aiocs-mcp
+ command -v aiocs-mcp
+ ```
+
+ If a global install is unavailable, use `npx` only as a fallback:
+
+ ```bash
+ npx -y -p @bodhi-ventures/aiocs docs --version
+ npx -y -p @bodhi-ventures/aiocs aiocs-mcp
  ```

  The `aiocs-mcp` process is an MCP stdio server, so running it directly waits for an MCP client instead of printing interactive help. The useful validation commands are:
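The stdio caveat above can be enforced with a small guard before launching the server; a minimal POSIX-shell sketch (the `run_mcp` helper is hypothetical and not part of the package):

```shell
# Hypothetical guard (not shipped with @bodhi-ventures/aiocs): refuse to
# start an MCP stdio server on an interactive terminal, where it would
# appear to hang while waiting for an MCP client on stdin/stdout.
run_mcp() {
  if [ -t 0 ] || [ -t 1 ]; then
    echo "refusing to start $1 on a TTY; connect an MCP client instead" >&2
    return 1
  fi
  "$@"
}

# A wrapper script for an MCP client could then call: run_mcp aiocs-mcp
```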
@@ -49,12 +56,12 @@ Once that symlink exists, Codex can load the `aiocs` skill directly from the glo

  There are two supported subagent patterns:

- - Repo example for development and debugging:
-   [`docs/examples/codex-agents/aiocs-docs-specialist.example.toml`](examples/codex-agents/aiocs-docs-specialist.example.toml)
+ - Repo-managed agent definition:
+   [`agents/aiocs-docs-specialist.toml`](../agents/aiocs-docs-specialist.toml)
  - Install-ready global agent definition:
    `ai-skills/agents/aiocs-docs-specialist.toml` from your local `ai-skills` checkout

- The repo example is intentionally development-oriented and uses a checkout-local MCP command. The global agent points at the globally installed `aiocs-mcp` binary.
+ The repo-managed agent definition and the install-ready global agent both point at the globally installed `aiocs-mcp` binary, so Codex uses the published package by default.

  To expose the install-ready global agent to Codex on this machine:

package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@bodhi-ventures/aiocs",
- "version": "0.1.1",
+ "version": "0.1.2",
  "license": "MIT",
  "type": "module",
  "description": "Local-only documentation store, fetcher, and search CLI for AI agents.",
@@ -32,6 +32,8 @@ Use this skill when you need authoritative local documentation search, inspectio
  1. Prefer `aiocs-mcp` when an MCP client can use it directly.
  2. Otherwise use the CLI with the root `--json` flag.
  3. Avoid parsing human-formatted CLI output unless there is no alternative.
+ 4. Assume `docs` and `aiocs-mcp` come from the globally installed `@bodhi-ventures/aiocs` package unless the user explicitly asks for a checkout-local development build.
+ 5. Use `npx -y -p @bodhi-ventures/aiocs ...` only as a fallback when the global install is unavailable.

  ## Search defaults for agents

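Guidance items 4 and 5 in the hunk above can be sketched as a small POSIX-shell helper; `pick_runner` is a hypothetical name for illustration, not something the package ships:

```shell
# Hypothetical helper (not part of @bodhi-ventures/aiocs): resolve how to
# invoke a tool, preferring a globally installed binary and falling back
# to the npx one-shot form only when it is absent from PATH.
pick_runner() {
  if command -v "$1" >/dev/null 2>&1; then
    printf '%s\n' "$1"
  else
    printf 'npx -y -p @bodhi-ventures/aiocs %s\n' "$1"
  fi
}
```

For example, `pick_runner docs` prints `docs` when the global install is present and the npx fallback command line otherwise.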
@@ -1,21 +0,0 @@
- name = "aiocs_docs_specialist"
- description = "Development example specialist for local aiocs documentation search, drift checks, diffs, and health verification through a checkout-local MCP server."
- model = "gpt-5.4-mini"
- model_reasoning_effort = "high"
- sandbox_mode = "read-only"
- nickname_candidates = ["Index", "Ledger", "Compass"]
- developer_instructions = """
- Use aiocs as the first stop for local documentation work.
- Prefer aiocs before live browsing when the requested docs may already exist in the local catalog.
- Check source presence and freshness with source_list before assuming docs are missing or stale.
- Default to search mode auto, switch to lexical for exact identifiers, prefer refresh_due for targeted freshness checks, and use batch when multiple aiocs tool calls are needed in one task.
- If the parent agent explicitly asks for aiocs write operations and a source is missing but likely to be reused, add a spec under ~/.aiocs/sources, upsert it, then refresh only that source.
- Avoid fetch all unless the parent agent explicitly asks for broad maintenance.
- When returning results, include sourceId, snapshotId, and pageUrl when they materially improve traceability.
- Do not edit aiocs source specs, catalog contents, or daemon configuration unless the parent agent explicitly asks for it.
- If aiocs health is in doubt, run doctor before assuming the catalog is broken.
- """
-
- [mcp_servers.aiocs]
- command = "pnpm"
- args = ["--dir", "/absolute/path/to/aiocs", "dev:mcp"]
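The file deleted above wired its MCP server to a checkout-local `pnpm dev:mcp` command. Per the docs changes in this release, the install-ready replacement points at the globally installed binary instead; a hypothetical sketch of that one entry (the actual `agents/aiocs-docs-specialist.toml` may differ):

```toml
# Hypothetical sketch, not the shipped file: the install-ready agent
# invokes the globally installed aiocs-mcp binary rather than a
# checkout-local pnpm command.
[mcp_servers.aiocs]
command = "aiocs-mcp"
```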