akm-cli 0.6.0-rc2 → 0.6.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -10,9 +10,17 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
 
  ### Added
 
+ - **`akm workflow validate <ref|path>`** — new subcommand that validates a workflow markdown file or ref, surfacing every error in one pass (without running a full reindex).
+ - **`akm feedback` now accepts any indexed ref** — previously type-restricted. `memory:`, `vault:`, `workflow:`, and `wiki:` refs all work. Vault feedback never echoes vault values.
+ - **`akm upgrade` runs post-upgrade tasks automatically.** After a successful upgrade, the new binary is invoked as a child process running `akm index`, which auto-migrates any legacy `stashes` → `sources` config keys via `loadConfig` and rebuilds the index against the new schema (the `DB_VERSION` 8 → 9 bump forces a rebuild). Pass `--skip-post-upgrade` to opt out (config migration still runs on the next `akm` invocation; you would just need to run `akm index` yourself). The result is reported in the `postUpgrade` field of the upgrade response.
  - **`writable` flag on sources.** New optional `SourceConfigEntry.writable` controls whether write commands (`akm remember`, `akm import`, `akm save`, `akm clone`) may target the source. Defaults: `true` for `filesystem`, `false` for `git` / `website` / `npm`. `writable: true` on `website` or `npm` is rejected at config load with `ConfigError("writable: true is only supported on filesystem and git sources")`.
  - **`defaultWriteTarget` root config key.** Names the source that receives writes when no `--target` flag is given. Resolution order: `--target` → `defaultWriteTarget` → `stashDir` (working stash) → `ConfigError("no writable source configured; run \`akm init\`")`. There is no implicit "first writable in `sources[]` order" fallback.
 
+ ### Changed
+
+ - **Workflows are now stored as validated `WorkflowDocument` JSON** — each workflow is compiled into a `WorkflowDocument` with line-anchored `SourceRef`s back into the source markdown, cached in a new `workflow_documents` table in `index.db`. The run engine reads from the cache on `akm workflow next` instead of re-parsing markdown each step.
+ - **Feedback events flow into utility recomputation** — positive/negative feedback signals now feed utility scoring alongside search/show events. Telemetry records both `entry_ref` and `entry_id` so feedback signals survive a reindex.
+
  ### Changed (breaking)
 
  - **v1 architecture refactor.** The internal architecture was rebuilt around a single minimal `SourceProvider` interface (`{ name, kind, init, path, sync? }`), a unified FTS5 index that owns search and show, and a single `writeAssetToSource` helper that owns all writes. The CLI command surface and all user-visible config keys are unchanged. See `docs/migration/v1.md` for the full guide.
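The `postUpgrade` field mentioned above carries the `{ ok, skipped, exitCode?, message }` shape that `runPostUpgradeTasks` (later in this diff) returns. A minimal sketch of branching on it — the helper name `describePostUpgrade` is illustrative, not part of akm:

```javascript
// Branch on the postUpgrade result shape returned by `akm upgrade`.
// Field names mirror runPostUpgradeTasks in this diff; the helper itself
// is a hypothetical consumer, not akm API.
function describePostUpgrade(result) {
  if (result.skipped) return "skipped"; // --skip-post-upgrade was passed
  return result.ok ? "rebuilt" : "failed"; // outcome of the child `akm index`
}

const skipped = { ok: true, skipped: true };
const rebuilt = { ok: true, skipped: false, exitCode: 0 };
const failed = { ok: false, skipped: false, exitCode: 1 };
```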
@@ -30,6 +38,10 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
  - **Migration**: curated registry authors should regenerate their `index.json` (rename `kits` → `stashes`, drop legacy keyword filtering). Publishers should add the `akm-stash` keyword/topic and remove `akm-kit`/`agentikit`.
  - **`akm registry` description**: changed from "Manage kit registries" to "Manage stash registries".
 
+ ### Migration / Breaking
+
+ - **`DB_VERSION` bumped 8 → 9.** On first run after upgrade, the version-mismatch path in `ensureSchema()` drops + recreates all `index.db` tables (preserving `usage_events` via a typed backup); the next `akm index` rebuilds the index. `workflow.db` (run state) is unaffected.
+
  ### Removed (breaking)
 
  - **OpenViking source provider.** The `openviking` source kind is no longer supported. Configs that contain one fail to load with `ConfigError("openviking is not supported in akm v1. …")` and a hint pointing to `akm config sources remove <name>`. API-backed sources will return as a separate `QuerySource` tier post-v1. To downgrade in the meantime, pin to `akm-cli@0.5`.
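The `DB_VERSION` rebuild described above can be modeled as a pure decision: keep the database if the stored version matches, otherwise start fresh while carrying `usage_events` across. This is a hypothetical in-memory sketch — the real `ensureSchema()` lives in the indexer and works against SQLite tables, not plain objects:

```javascript
// Hypothetical model of the DB_VERSION 8 → 9 rebuild path: drop and
// recreate index tables, preserving usage_events via a backup copy.
// Shapes and names here are illustrative only.
function rebuildIfNeeded(db, DB_VERSION = 9) {
  if (db.version === DB_VERSION) return db; // up to date, nothing to do
  const backup = [...db.usageEvents]; // "typed backup" of usage_events
  // entries start empty; the next `akm index` repopulates them
  return { version: DB_VERSION, entries: [], usageEvents: backup };
}

const old = { version: 8, entries: [{ id: 1 }], usageEvents: [{ event_type: "show" }] };
const fresh = rebuildIfNeeded(old);
// fresh has no entries yet, but the usage_events history survived
```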
package/dist/cli.js CHANGED
@@ -25,7 +25,7 @@ import { ConfigError, NotFoundError, UsageError } from "./core/errors";
  import { getCacheDir, getDbPath, getDefaultStashDir } from "./core/paths";
  import { setQuiet, warn } from "./core/warn";
  import { resolveWriteTarget, writeAssetToSource } from "./core/write-source";
- import { closeDatabase, openDatabase } from "./indexer/db";
+ import { closeDatabase, findEntryIdByRef, openDatabase } from "./indexer/db";
  import { akmIndex } from "./indexer/indexer";
  import { resolveSourceEntries } from "./indexer/search-source";
  import { insertUsageEvent } from "./indexer/usage-events";
@@ -115,7 +115,10 @@ const initCommand = defineCommand({
  },
  async run({ args }) {
  await runWithJsonErrors(async () => {
- const result = await akmInit({ dir: args.dir });
+ // Accept both historical spellings for backwards compatibility with
+ // older docs/scripts that used `--stashDir`.
+ const legacyDir = parseFlagValue(process.argv, "--stashDir") ?? parseFlagValue(process.argv, "--stash-dir");
+ const result = await akmInit({ dir: args.dir ?? legacyDir });
  output("init", result);
  });
  },
@@ -378,6 +381,11 @@ const upgradeCommand = defineCommand({
  description: "Skip checksum verification (not recommended)",
  default: false,
  },
+ "skip-post-upgrade": {
+ type: "boolean",
+ description: "Skip the post-upgrade `akm index` rebuild (config auto-migration still runs on next `akm` invocation)",
+ default: false,
+ },
  },
  async run({ args }) {
  await runWithJsonErrors(async () => {
@@ -387,7 +395,8 @@ const upgradeCommand = defineCommand({
  return;
  }
  const skipChecksum = getHyphenatedBoolean(args, "skip-checksum");
- const result = await performUpgrade(check, { force: args.force, skipChecksum });
+ const skipPostUpgrade = getHyphenatedBoolean(args, "skip-post-upgrade");
+ const result = await performUpgrade(check, { force: args.force, skipChecksum, skipPostUpgrade });
  output("upgrade", result);
  });
  },
@@ -436,7 +445,8 @@ const showCommand = defineCommand({
  }
  }
  const cliDetail = getOutputMode().detail;
- const showDetail = cliDetail === "summary" ? "summary" : undefined;
+ const explicitDetail = parseFlagValue(process.argv, "--detail");
+ const showDetail = explicitDetail === "brief" ? "brief" : cliDetail === "summary" ? "summary" : undefined;
  const result = await akmShowUnified({ ref: args.ref, view, detail: showDetail });
  output("show", result);
  });
@@ -781,7 +791,7 @@ const registryCommand = defineCommand({
  const feedbackCommand = defineCommand({
  meta: {
  name: "feedback",
- description: "Record positive or negative feedback for a stash asset",
+ description: "Record positive or negative feedback for any indexed stash asset",
  },
  args: {
  ref: { type: "positional", description: "Asset ref (type:name)", required: true },
@@ -795,6 +805,7 @@ const feedbackCommand = defineCommand({
  if (!ref) {
  throw new UsageError("Asset ref is required. Usage: akm feedback <ref> --positive|--negative");
  }
+ parseAssetRef(ref);
  if (args.positive && args.negative) {
  throw new UsageError("Specify either --positive or --negative, not both.");
  }
@@ -805,9 +816,14 @@ const feedbackCommand = defineCommand({
  const metadata = args.note ? JSON.stringify({ note: args.note }) : undefined;
  const db = openDatabase();
  try {
+ const entryId = findEntryIdByRef(db, ref);
+ if (entryId === undefined) {
+ throw new UsageError(`Ref "${ref}" is not in the current index. Run "akm index" and try again.`);
+ }
  insertUsageEvent(db, {
  event_type: "feedback",
  entry_ref: ref,
+ entry_id: entryId,
  signal,
  metadata,
  });
@@ -1520,6 +1536,53 @@ function listVaultsRecursive(listKeysFn) {
  walk(vaultsDir);
  return result;
  }
+ function wasRefMisparsedAsFlagValue(ref, flag, flagValue) {
+ const argv = process.argv.slice(2);
+ const vaultIndex = argv.indexOf("vault");
+ const listIndex = vaultIndex >= 0 ? argv.indexOf("list", vaultIndex + 1) : -1;
+ const tokens = listIndex >= 0 ? argv.slice(listIndex + 1) : argv;
+ let flagIndex = -1;
+ let flagConsumesNextToken = false;
+ for (let i = 0; i < tokens.length; i += 1) {
+ const token = tokens[i];
+ if (token === flag) {
+ flagIndex = i;
+ flagConsumesNextToken = true;
+ break;
+ }
+ if (token === `${flag}=${flagValue}`) {
+ flagIndex = i;
+ break;
+ }
+ }
+ if (flagIndex === -1)
+ return false;
+ // If the same token appeared before the flag, the user explicitly passed it
+ // as the positional ref and it was not consumed by the output flag.
+ if (tokens.slice(0, flagIndex).includes(ref))
+ return false;
+ // Skip past either `--flag value` (2 tokens) or `--flag=value` (1 token)
+ // before checking whether the ref appears elsewhere as a real positional.
+ const TOKENS_AFTER_SPACE_FLAG = 2;
+ const TOKENS_AFTER_EQUALS_FLAG = 1;
+ const firstTokenAfterFlag = flagIndex + (flagConsumesNextToken ? TOKENS_AFTER_SPACE_FLAG : TOKENS_AFTER_EQUALS_FLAG);
+ if (tokens.slice(firstTokenAfterFlag).includes(ref))
+ return false;
+ return true;
+ }
+ function resolveVaultListRef(ref) {
+ if (ref === undefined)
+ return undefined;
+ const parsedFormat = parseFlagValue(process.argv, "--format");
+ if (parsedFormat !== undefined && ref === parsedFormat && wasRefMisparsedAsFlagValue(ref, "--format", parsedFormat)) {
+ return undefined;
+ }
+ const parsedDetail = parseFlagValue(process.argv, "--detail");
+ if (parsedDetail !== undefined && ref === parsedDetail && wasRefMisparsedAsFlagValue(ref, "--detail", parsedDetail)) {
+ return undefined;
+ }
+ return ref;
+ }
  const vaultListCommand = defineCommand({
  meta: { name: "list", description: "List vaults, or list keys (no values) inside one vault" },
  args: {
@@ -1528,8 +1591,9 @@ const vaultListCommand = defineCommand({
  run({ args }) {
  return runWithJsonErrors(async () => {
  const { listKeys, listEntries } = await import("./commands/vault.js");
- if (args.ref) {
- const { name, absPath } = resolveVaultPath(args.ref);
+ const effectiveRef = resolveVaultListRef(args.ref);
+ if (effectiveRef) {
+ const { name, absPath } = resolveVaultPath(effectiveRef);
  if (!fs.existsSync(absPath)) {
  throw new NotFoundError(`Vault not found: vault:${name}`);
  }
@@ -6,7 +6,7 @@
  */
  import fs from "node:fs";
  import path from "node:path";
- import { resolveStashDir } from "../core/common";
+ import { isWithin, resolveStashDir } from "../core/common";
  import { loadConfig } from "../core/config";
  import { NotFoundError, UsageError } from "../core/errors";
  import { akmIndex } from "../indexer/indexer";
@@ -14,6 +14,7 @@ import { removeLockEntry, upsertLockEntry } from "../integrations/lockfile";
  import { parseRegistryRef } from "../registry/resolve";
  import { syncFromRef } from "../sources/providers/sync-from-ref";
  import { ensureWebsiteMirror } from "../sources/providers/website";
+ import { listWikis, resolveWikisRoot } from "../wiki/wiki";
  import { auditInstallCandidate, deriveRegistryLabels, enforceRegistryInstallPolicy, formatInstallAuditFailure, } from "./install-audit";
  import { removeInstalledRegistryEntry, upsertInstalledRegistryEntry } from "./source-add";
  import { removeStash } from "./source-manage";
@@ -57,6 +58,31 @@ export async function akmListSources(input) {
  status: { exists: directoryExists(entry.stashRoot) },
  });
  }
+ if (!kindFilter || kindFilter.includes("filesystem")) {
+ const wikisRoot = resolveWikisRoot(stashDir);
+ const seenPaths = new Set(sources
+ .map((source) => source.path)
+ .filter((sourcePath) => typeof sourcePath === "string")
+ .map((sourcePath) => path.resolve(sourcePath)));
+ for (const wiki of listWikis(stashDir)) {
+ // `listWikis()` also includes externally-registered wikis. `akm list`
+ // should synthesize source entries here only for stash-owned wiki dirs.
+ if (!isWithin(wiki.path, wikisRoot))
+ continue;
+ const resolvedPath = path.resolve(wiki.path);
+ if (seenPaths.has(resolvedPath))
+ continue;
+ seenPaths.add(resolvedPath);
+ sources.push({
+ name: wiki.name,
+ kind: "filesystem",
+ wiki: wiki.name,
+ path: wiki.path,
+ writable: true,
+ status: { exists: directoryExists(wiki.path) },
+ });
+ }
+ }
  return {
  schemaVersion: 1,
  stashDir,
@@ -80,6 +80,7 @@ export async function checkForUpdate(currentVersion) {
  export async function performUpgrade(check, opts) {
  const { currentVersion, latestVersion, installMethod } = check;
  const force = opts?.force === true;
+ const skipPostUpgrade = opts?.skipPostUpgrade === true;
  // All install methods can short-circuit here unless the user explicitly forces an upgrade.
  if (!check.updateAvailable && !force) {
  return {
@@ -113,6 +114,7 @@ export async function performUpgrade(check, opts) {
  upgraded: true,
  installMethod,
  message: `akm upgraded via ${installMethod}`,
+ postUpgrade: runPostUpgradeTasks("akm", { skip: skipPostUpgrade }),
  };
  }
  if (installMethod === "unknown") {
@@ -283,8 +285,72 @@ export async function performUpgrade(check, opts) {
  installMethod,
  binaryPath: execPath,
  checksumVerified,
+ // For binary installs, the new binary now lives at execPath; spawn it
+ // directly so the post-upgrade work runs against the new code.
+ postUpgrade: runPostUpgradeTasks(execPath, { skip: skipPostUpgrade }),
  };
  }
+ /**
+ * Run the post-upgrade tasks against the *new* binary as a child process.
+ *
+ * Why a child process: the running akm process still has the old code in
+ * memory. Calling loadConfig()/akmIndex() in-process would use the old
+ * implementations and miss any DB_VERSION / config-key changes the new
+ * release introduces.
+ *
+ * The new binary's `akm index` does the work for us:
+ * 1. loadConfig() runs at startup — auto-migrates legacy `stashes` →
+ * `sources` if the on-disk config still uses the old key.
+ * 2. ensureSchema() detects DB_VERSION mismatch and rebuilds index.db
+ * tables (preserving usage_events).
+ * 3. The full reindex repopulates entries + workflow_documents + FTS.
+ */
+ function runPostUpgradeTasks(akmBin, opts) {
+ if (opts.skip) {
+ return {
+ ok: true,
+ skipped: true,
+ message: "Skipped post-upgrade tasks. Run `akm index` manually to migrate config and rebuild the index.",
+ };
+ }
+ try {
+ const result = childProcess.spawnSync(akmBin, ["index"], {
+ encoding: "utf8",
+ env: process.env,
+ stdio: "pipe",
+ });
+ if (result.error) {
+ return {
+ ok: false,
+ skipped: false,
+ message: `Post-upgrade tasks could not start: ${result.error.message}. Run \`akm index\` manually.`,
+ };
+ }
+ if (result.status !== 0) {
+ const detail = (result.stderr ?? "").trim() || (result.stdout ?? "").trim() || `exit code ${result.status}`;
+ return {
+ ok: false,
+ skipped: false,
+ exitCode: result.status,
+ message: `Post-upgrade \`akm index\` failed (${detail}). Run \`akm index\` manually.`,
+ };
+ }
+ return {
+ ok: true,
+ skipped: false,
+ exitCode: 0,
+ message: "Config migrated (if needed) and index rebuilt against the new binary.",
+ };
+ }
+ catch (err) {
+ const detail = err instanceof Error ? err.message : String(err);
+ return {
+ ok: false,
+ skipped: false,
+ message: `Post-upgrade tasks failed: ${detail}. Run \`akm index\` manually.`,
+ };
+ }
+ }
  function parseChecksumForFile(checksumsText, filename) {
  for (const line of checksumsText.split("\n")) {
  const trimmed = line.trim();
@@ -25,15 +25,15 @@ import { parseAssetRef } from "../core/asset-ref";
  import { loadConfig } from "../core/config";
  import { NotFoundError, UsageError } from "../core/errors";
  import { parseFrontmatter, toStringOrUndefined } from "../core/frontmatter";
- import { closeDatabase, openDatabase } from "../indexer/db";
+ import { closeDatabase, findEntryIdByRef, openDatabase } from "../indexer/db";
  import { buildFileContext, buildRenderContext, getRenderer, runMatchers } from "../indexer/file-context";
  import { lookup } from "../indexer/indexer";
  import { loadStashFile } from "../indexer/metadata";
  import { buildEditHint, findSourceForPath, isEditable, resolveSourceEntries } from "../indexer/search-source";
+ import { insertUsageEvent } from "../indexer/usage-events";
  import { resolveSourcesForOrigin } from "../registry/origin-resolve";
  // Eagerly import source providers to trigger self-registration.
  import "../sources/providers/index";
- import { insertUsageEvent } from "../indexer/usage-events";
  import { resolveAssetPath } from "../sources/resolve";
  /**
  * Show a wiki root (no page path) — returns the same payload as
@@ -98,8 +98,8 @@ function resolveRegisteredWikiAssetPath(wikiRoot, wikiName, assetName) {
  * type-dir resolution if the index has no row. Spec §6.2; no remote provider
  * fallback.
  *
- * When `detail` is `"summary"`, the response omits content/template/prompt and
- * returns only compact metadata (name, type, description, tags, parameters).
+ * When `detail` is `"brief"` or `"summary"`, the response omits
+ * content/template/prompt and returns compact metadata.
  */
  export async function akmShowUnified(input) {
  const ref = input.ref.trim();
@@ -136,15 +136,10 @@ function logShowEvent(ref, existingDb) {
  try {
  const db = existingDb ?? openDatabase();
  try {
- const parsed = parseAssetRef(ref);
- const safeName = parsed.name.replace(/%/g, "\\%").replace(/_/g, "\\_");
- const row = db
- .prepare("SELECT id FROM entries WHERE entry_key LIKE ? ESCAPE '\\' AND entry_type = ? LIMIT 1")
- .get(`%:${parsed.type}:${safeName}`, parsed.type);
  insertUsageEvent(db, {
  event_type: "show",
  entry_ref: ref,
- entry_id: row?.id,
+ entry_id: findEntryIdByRef(db, ref),
  });
  }
  finally {
@@ -256,6 +251,9 @@ export async function showLocal(input) {
  editable,
  ...(!editable ? { editHint: buildEditHint(assetPath, parsed.type, parsed.name, source?.registryId) } : {}),
  };
+ if (input.detail === "brief") {
+ return buildBriefResponse(fullResponse, assetPath);
+ }
  if (input.detail === "summary") {
  return buildSummaryResponse(fullResponse, assetPath);
  }
@@ -275,6 +273,24 @@ export async function showByRef(ref) {
  const body = await fs.promises.readFile(entry.filePath, "utf8");
  return { filePath: entry.filePath, body };
  }
+ /**
+ * Build a reduced brief response from a full ShowResponse.
+ *
+ * Keeps routing/identification fields while omitting content/template/prompt.
+ */
+ function buildBriefResponse(full, assetPath) {
+ const summary = buildSummaryResponse(full, assetPath);
+ return {
+ type: summary.type,
+ name: summary.name,
+ path: summary.path,
+ ...(summary.description ? { description: summary.description } : {}),
+ ...(summary.action ? { action: summary.action } : {}),
+ ...(summary.run ? { run: summary.run } : {}),
+ ...(summary.origin !== undefined ? { origin: summary.origin } : {}),
+ ...(full.editable !== undefined ? { editable: full.editable } : {}),
+ };
+ }
  /**
  * Build a compact summary response from a full ShowResponse.
  *
@@ -108,20 +108,7 @@ export function saveConfig(config) {
  const dir = path.dirname(configPath);
  fs.mkdirSync(dir, { recursive: true });
  const sanitized = sanitizeConfigForWrite(config);
- const tmpPath = `${configPath}.tmp.${process.pid}.${Math.random().toString(36).slice(2)}`;
- try {
- fs.writeFileSync(tmpPath, `${JSON.stringify(sanitized, null, 2)}\n`, "utf8");
- fs.renameSync(tmpPath, configPath);
- }
- catch (err) {
- try {
- fs.unlinkSync(tmpPath);
- }
- catch {
- /* ignore cleanup failure */
- }
- throw err;
- }
+ writeConfigObject(configPath, sanitized);
  }
  /**
  * Strip apiKey fields before writing config to disk.
@@ -230,12 +217,9 @@ function pickKnownKeys(raw) {
  else {
  const legacyStashes = parseStashesConfig(raw.stashes);
  if (legacyStashes) {
- // Backwards-compat migration: `stashes[]` → `sources[]` in-memory.
- // Emit a one-time deprecation warning and carry the value forward as
- // `sources`. The renamed key is persisted on the next `akm config` write.
- warn('Config key "stashes" is deprecated; rename it to "sources" in your config file ' +
- `(edit it directly at ${_getConfigPath()}). ` +
- "Your configuration has been loaded successfully — no manual action is required right now.");
+ // Backwards-compat fallback: configs that still carry `stashes[]` are
+ // normalized to `sources[]` after the raw file loader has had a chance to
+ // auto-migrate the on-disk key.
  config.sources = legacyStashes;
  }
  }
@@ -264,20 +248,16 @@ function pickKnownKeys(raw) {
  }
  function readNormalizedConfig(configPath) {
  const raw = readConfigObject(configPath);
- const expanded = raw ? expandEnvVars(raw) : undefined;
+ const migrated = raw ? maybeAutoMigrateLegacyStashes(configPath, raw) : undefined;
+ const expanded = migrated ? expandEnvVars(migrated) : undefined;
  return expanded ? pickKnownKeys(expanded) : undefined;
  }
- function readNormalizedConfigFromText(_configPath, text) {
- let raw;
- try {
- raw = JSON.parse(stripJsonComments(text));
- }
- catch {
+ function readNormalizedConfigFromText(configPath, text) {
+ const raw = parseConfigObjectFromText(text);
+ if (!raw)
  return undefined;
- }
- if (typeof raw !== "object" || raw === null || Array.isArray(raw))
- return undefined;
- const expanded = expandEnvVars(raw);
+ const migrated = maybeAutoMigrateLegacyStashes(configPath, raw);
+ const expanded = expandEnvVars(migrated);
  return pickKnownKeys(expanded);
  }
  function parseOutputConfig(value) {
@@ -340,6 +320,14 @@ function expandEnvVars(value, fieldName) {
  function readConfigObject(configPath) {
  try {
  const text = fs.readFileSync(configPath, "utf8");
+ return parseConfigObjectFromText(text);
+ }
+ catch {
+ return undefined;
+ }
+ }
+ function parseConfigObjectFromText(text) {
+ try {
  const raw = JSON.parse(stripJsonComments(text));
  if (typeof raw !== "object" || raw === null || Array.isArray(raw))
  return undefined;
@@ -349,6 +337,47 @@ function readConfigObject(configPath) {
  return undefined;
  }
  }
+ /**
+ * Best-effort on-disk config migration for the legacy `stashes` key.
+ *
+ * When a config file still uses `stashes` and does not already define
+ * `sources`, rewrite the file in place with `sources` replacing `stashes`,
+ * emit a one-time notice on success, and return the migrated object. If the
+ * rewrite fails, emit a warning and return the original object so the loader
+ * can still continue with an in-memory fallback.
+ */
+ function maybeAutoMigrateLegacyStashes(configPath, raw) {
+ if (Object.hasOwn(raw, "sources") || !Object.hasOwn(raw, "stashes")) {
+ return raw;
+ }
+ const migrated = Object.fromEntries(Object.entries(raw).map(([key, value]) => (key === "stashes" ? ["sources", value] : [key, value])));
+ try {
+ writeConfigObject(configPath, migrated);
+ warn('Config migrated: "stashes" → "sources" in config.json');
+ return migrated;
+ }
+ catch {
+ warn('Failed to migrate "stashes" → "sources" in config.json; continuing with the legacy key in memory. ' +
+ "Check file permissions or rename the key manually if this persists.");
+ return raw;
+ }
+ }
+ function writeConfigObject(configPath, config) {
+ const tmpPath = `${configPath}.tmp.${process.pid}.${Math.random().toString(36).slice(2)}`;
+ try {
+ fs.writeFileSync(tmpPath, `${JSON.stringify(config, null, 2)}\n`, "utf8");
+ fs.renameSync(tmpPath, configPath);
+ }
+ catch (err) {
+ try {
+ fs.unlinkSync(tmpPath);
+ }
+ catch {
+ /* ignore cleanup failure */
+ }
+ throw err;
+ }
+ }
  /**
  * Strip JavaScript-style comments from a JSON string (JSONC support).
  * Handles // line comments and /* block comments while preserving
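The on-disk rename in `maybeAutoMigrateLegacyStashes` boils down to a key-preserving object rewrite. This standalone sketch mirrors that rewrite minus the file I/O and `warn()` plumbing (helper name `renameLegacyKey` is ours, not akm's):

```javascript
// Rename a legacy `stashes` key to `sources` while preserving key order,
// and leave configs untouched when `sources` already exists (or no legacy
// key is present) — same guard as maybeAutoMigrateLegacyStashes above.
function renameLegacyKey(raw) {
  if (Object.hasOwn(raw, "sources") || !Object.hasOwn(raw, "stashes")) return raw;
  return Object.fromEntries(
    Object.entries(raw).map(([k, v]) => (k === "stashes" ? ["sources", v] : [k, v])),
  );
}

const migrated = renameLegacyKey({ stashes: [{ name: "main" }], output: "json" });
// migrated now has `sources` in the position `stashes` occupied
```

Returning the same object when no migration is needed keeps the caller's identity checks (and repeated loads) cheap.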
@@ -17,7 +17,7 @@ import { defaultRendererRegistry } from "../core/asset-registry";
  import { deriveCanonicalAssetNameFromStashRoot } from "../core/asset-spec";
  import { getDbPath } from "../core/paths";
  import { warn } from "../core/warn";
- import { closeDatabase, getAllEntries, getEntryById, getEntryCount, getMeta, getUtilityScoresByIds, openDatabase, searchFts, searchVec, } from "./db";
+ import { closeDatabase, getAllEntries, getEntryById, getEntryCount, getMeta, getUtilityScoresByIds, openDatabase, sanitizeFtsQuery, searchFts, searchVec, } from "./db";
  import { getRenderer } from "./file-context";
  import { generateMetadataFlat, loadStashFile, shouldIndexStashFile } from "./metadata";
  import { buildSearchText } from "./search-fields";
@@ -115,8 +115,11 @@ export async function searchLocal(input) {
  }
  // ── Database search ─────────────────────────────────────────────────────────
  async function searchDatabase(db, query, searchType, limit, stashDir, allSourceDirs, config, sources, rendererRegistry = defaultRendererRegistry) {
+ const hasSearchableTokens = query.length > 0 && sanitizeFtsQuery(query).length > 0;
+ // Empty queries — including ones that sanitize down to no searchable FTS
+ // tokens such as "." — should enumerate matching entries instead of
+ // returning an empty result set from FTS.
+ if (!hasSearchableTokens) {
  const typeFilter = searchType === "any" ? undefined : searchType;
  const allEntries = getAllEntries(db, typeFilter);
  // Deduplicate by file path — multiple entries can share the same file
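`sanitizeFtsQuery`'s implementation is not shown in this diff; this illustrative stand-in assumes it keeps only word-like FTS tokens, which is enough to show why a query like `"."` now falls through to the enumerate-all path rather than an empty FTS result:

```javascript
// Hypothetical stand-in for sanitizeFtsQuery (the real one lives in ./db
// and may differ): keep only letter/digit/underscore runs as FTS tokens.
function sanitizeFtsQuerySketch(query) {
  return (query.match(/[\p{L}\p{N}_]+/gu) ?? []).join(" ");
}

sanitizeFtsQuerySketch(".");          // no searchable tokens → empty string
sanitizeFtsQuerySketch("db.version"); // punctuation splits into two tokens
```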
@@ -2,6 +2,7 @@ import { Database } from "bun:sqlite";
  import fs from "node:fs";
  import { createRequire } from "node:module";
  import path from "node:path";
+ import { parseAssetRef } from "../core/asset-ref";
  import { getDbPath } from "../core/paths";
  import { warn } from "../core/warn";
  import { cosineSimilarity } from "../llm/embedders/types";
@@ -717,6 +718,14 @@ export function getAllEntries(db, entryType) {
  }
  return entries;
  }
+ export function findEntryIdByRef(db, ref) {
+ const parsed = parseAssetRef(ref);
+ const suffix = `${parsed.type}:${parsed.name}`;
+ const row = db
+ .prepare("SELECT id FROM entries WHERE entry_type = ? AND substr(entry_key, length(entry_key) - length(?) + 1) = ? LIMIT 1")
+ .get(parsed.type, suffix, suffix);
+ return row?.id;
+ }
  export function getEntryCount(db) {
  const row = db.prepare("SELECT COUNT(*) AS cnt FROM entries").get();
  return row.cnt;
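`findEntryIdByRef` matches on the `type:name` suffix of `entry_key` (the keys appear to carry a source prefix, judging by the `%:type:name` LIKE pattern the old `logShowEvent` used). The SQL `substr()` comparison is an exact suffix check, equivalent to `endsWith()`, so unlike the old LIKE lookup it needs no `%`/`_` escaping:

```javascript
// JS model of the substr()-based suffix match in findEntryIdByRef.
// The separate entry_type column filter in the real query guards against
// one type name being a suffix of another.
function matchesRef(entryKey, type, name) {
  return entryKey.endsWith(`${type}:${name}`);
}

matchesRef("stash-main:skill:code-review", "skill", "code-review");   // matches
matchesRef("stash-main:skill:code-review_x", "skill", "code-review"); // no match
// With LIKE, the `_` in "code-review_x" would have been a single-char wildcard.
```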
@@ -772,8 +772,10 @@ const USAGE_EVENT_RETENTION_DAYS = 90;
  * For each entry:
  * - Count search appearances (event_type = 'search')
  * - Count show events (event_type = 'show')
+ * - Count positive/negative feedback events
  * - Compute select_rate = showCount / searchCount, clamped to [0, 1]
- * - Update utility via EMA: utility = previousUtility * 0.7 + selectRate * 0.3
+ * - Convert feedback counts into a positive-only feedback_rate
+ * - Update utility via EMA from the stronger of select_rate / feedback_rate
  *
  * Also purges usage_events older than 90 days and ensures the M-1
  * usage_events table exists before querying.
@@ -803,6 +805,8 @@ export function recomputeUtilityScores(db) {
  SELECT entry_id,
  SUM(CASE WHEN event_type = 'search' THEN 1 ELSE 0 END) AS search_count,
  SUM(CASE WHEN event_type = 'show' THEN 1 ELSE 0 END) AS show_count,
+ SUM(CASE WHEN event_type = 'feedback' AND signal = 'positive' THEN 1 ELSE 0 END) AS positive_feedback_count,
+ SUM(CASE WHEN event_type = 'feedback' AND signal = 'negative' THEN 1 ELSE 0 END) AS negative_feedback_count,
  MAX(created_at) AS last_used_at
  FROM usage_events
  WHERE entry_id IS NOT NULL
@@ -821,8 +825,11 @@ export function recomputeUtilityScores(db) {
  }
  for (const row of usageRows) {
  const selectRate = row.search_count > 0 ? Math.min(1, row.show_count / row.search_count) : 0;
+ const feedbackTotal = row.positive_feedback_count + row.negative_feedback_count;
+ const feedbackRate = feedbackTotal > 0 ? Math.max(0, row.positive_feedback_count - row.negative_feedback_count) / feedbackTotal : 0;
+ const effectiveRate = Math.max(selectRate, feedbackRate);
  const prevUtility = existingScores.get(row.entry_id) ?? 0;
- const utility = prevUtility * emaDecay + selectRate * emaNew;
+ const utility = prevUtility * emaDecay + effectiveRate * emaNew;
  upsertUtilityScore(db, row.entry_id, {
  utility,
  showCount: row.show_count,
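The update above can be sketched standalone: `select_rate` and the new `feedback_rate` compete, and the stronger of the two feeds the EMA (0.7/0.3 are the `emaDecay`/`emaNew` constants the doc comment gives; the compact `row` shape here is ours, not the SQL row):

```javascript
// Pure-function model of the utility EMA with feedback signals.
function nextUtility(prev, row) {
  const selectRate = row.search > 0 ? Math.min(1, row.show / row.search) : 0;
  const total = row.pos + row.neg;
  // Positive-only rate: net positive feedback over total, floored at 0.
  const feedbackRate = total > 0 ? Math.max(0, row.pos - row.neg) / total : 0;
  return prev * 0.7 + Math.max(selectRate, feedbackRate) * 0.3;
}

// Never surfaced in search, but explicitly upvoted twice — feedback alone
// now moves the score: 0 * 0.7 + 1.0 * 0.3 = 0.3.
nextUtility(0, { search: 0, show: 0, pos: 2, neg: 0 });
// Balanced feedback cancels out: max(0, 1 - 1) / 2 = 0.
```

Taking `max(selectRate, feedbackRate)` (rather than summing) means feedback can only raise the effective rate, never dilute a strong select rate.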
@@ -255,10 +255,9 @@ const WIKI_INFRA_FILES = new Set(["schema.md", "index.md", "log.md"]);
  * Apply wiki-specific index exclusions while leaving all other stash files
  * untouched.
  *
- * - In a normal stash, excludes `wikis/<name>/raw/**` and wiki-root
- * `schema.md`, `index.md`, `log.md`.
- * - In a wiki-root stash source (`wikiName`), excludes `raw/**` and those same
- * root-level infrastructure files.
+ * - In a normal stash, excludes wiki-root `schema.md`, `index.md`, `log.md`.
+ * - In a wiki-root stash source (`wikiName`), excludes those same root-level
+ * infrastructure files.
  */
  export function shouldIndexStashFile(stashRoot, file, options) {
  const relPath = path.relative(stashRoot, file);
@@ -268,8 +267,6 @@ export function shouldIndexStashFile(stashRoot, file, options) {
  if (segments.length === 0)
  return true;
  if (options?.treatStashRootAsWikiRoot) {
- if (segments[0] === "raw")
- return false;
  return !(segments.length === 1 && WIKI_INFRA_FILES.has(segments[0]));
  }
  const wikisIdx = segments.indexOf("wikis");
@@ -278,8 +275,6 @@ export function shouldIndexStashFile(stashRoot, file, options) {
  const wikiRelativeSegments = segments.slice(wikisIdx + 2);
  if (wikiRelativeSegments.length === 0)
  return true;
- if (wikiRelativeSegments[0] === "raw")
- return false;
  return !(wikiRelativeSegments.length === 1 && WIKI_INFRA_FILES.has(wikiRelativeSegments[0]));
  }
  /**
@@ -125,6 +125,8 @@ akm workflow validate workflows/foo.md # Validate a workflow file or ref
  akm workflow next workflow:ship-release # Start or resume the next workflow step
  akm feedback skill:code-review --positive # Record that an asset helped
  akm feedback agent:reviewer --negative # Record that an asset missed the mark
+ akm feedback memory:deployment-notes --positive # Works for memories too
+ akm feedback vault:prod --positive # Records vault feedback without surfacing values
  \`\`\`
 
  Use \`akm feedback\` whenever an asset materially helps or fails so future search
@@ -433,7 +433,7 @@ export function formatWikiRemovePlain(r) {
  const preserved = r.preservedRaw === true;
  const removed = Array.isArray(r.removed) ? r.removed.length : 0;
  const base = `Removed wiki ${String(r.name ?? "?")} (${removed} path(s))`;
- return preserved ? `${base}; preserved ${String(r.rawPath ?? "raw/")}` : base;
+ return preserved ? `${base}; raw/ preserved at ${String(r.rawPath ?? "raw/")}` : base;
  }
  export function formatWikiPagesPlain(r) {
  const pages = Array.isArray(r.pages) ? r.pages : [];
package/dist/wiki/wiki.js CHANGED
@@ -76,7 +76,7 @@ function registeredWikiSources(stashDir) {
  export function resolveWikiSource(stashDir, name) {
  validateWikiName(name);
  const wikiDir = resolveWikiDir(stashDir, name);
- if (fs.existsSync(wikiDir)) {
+ if (fs.existsSync(wikiDir) && isRecognizedStashWiki(wikiDir)) {
  return { name, path: wikiDir, mode: "stash" };
  }
  const external = registeredWikiSources(stashDir).find((source) => source.name === name);
@@ -87,7 +87,7 @@ export function resolveWikiSource(stashDir, name) {
  export function ensureWikiNameAvailable(stashDir, name) {
  validateWikiName(name);
  const wikiDir = resolveWikiDir(stashDir, name);
- if (fs.existsSync(wikiDir)) {
+ if (fs.existsSync(wikiDir) && isRecognizedStashWiki(wikiDir)) {
  throw new UsageError(`Wiki already exists: ${name}.`, "RESOURCE_ALREADY_EXISTS");
  }
  const external = registeredWikiSources(stashDir).find((source) => source.name === name);
@@ -99,8 +99,9 @@ export function ensureWikiNameAvailable(stashDir, name) {
  * Walk a wiki directory and bucket files into pages vs raws.
  *
  * "Pages" are any `.md` files under the wiki root EXCEPT `schema.md`,
- * `index.md`, `log.md`, or anything under `raw/`. This matches the set the
- * agent edits, and the set `akm wiki pages` exposes.
+ * `index.md`, or `log.md`. Raw sources are bucketed separately so callers can
+ * distinguish authored pages from ingested source material while still
+ * surfacing both.
  *
  * Returns two mtime signals:
  * - `lastModifiedMs` — newest across all .md files. Used for the `show` /
@@ -164,6 +165,17 @@ function scanWikiFiles(wikiDir) {
  }
  return { pages, raws, lastModifiedMs, pagesLastModifiedMs };
  }
+ function hasWikiInfrastructure(wikiDir) {
+ for (const file of WIKI_SPECIAL_FILES) {
+ if (fs.existsSync(path.join(wikiDir, file)))
+ return true;
+ }
+ return false;
+ }
+ function isRecognizedStashWiki(wikiDir, buckets) {
+ const scanned = buckets ?? scanWikiFiles(wikiDir);
+ return scanned.pages.length > 0 || hasWikiInfrastructure(wikiDir);
+ }
  function readSchemaDescription(wikiDir) {
  const schemaPath = path.join(wikiDir, SCHEMA_MD);
  let raw;
@@ -207,6 +219,8 @@ export function listWikis(stashDir) {
  }
  const summarize = (name, dir) => {
  const buckets = scanWikiFiles(dir);
+ if (!isRecognizedStashWiki(dir, buckets))
+ return;
  const summary = {
  name,
  path: dir,
@@ -353,9 +367,11 @@ export function createWiki(stashDir, name) {
  * ignore that (e.g. idempotent cleanup) by catching.
  */
  export function removeWiki(stashDir, name, options = {}) {
- const resolved = resolveWikiSource(stashDir, name);
- const wikiDir = resolved.path;
- if (resolved.mode === "external") {
+ validateWikiName(name);
+ const wikiDir = resolveWikiDir(stashDir, name);
+ const external = registeredWikiSources(stashDir).find((source) => source.name === name);
+ const isStashWiki = fs.existsSync(wikiDir) && isRecognizedStashWiki(wikiDir);
+ if (!isStashWiki && external) {
  const config = loadUserConfig();
  const filteredSources = (config.sources ?? config.stashes ?? []).filter((entry) => entry.wikiName !== name);
  const installed = (config.installed ?? []).filter((entry) => entry.wikiName !== name);
@@ -373,8 +389,8 @@ export function removeWiki(stashDir, name, options = {}) {
  unregistered: true,
  };
  }
- if (!fs.existsSync(wikiDir)) {
- throw new NotFoundError(`Wiki not found: ${name}.`, "STASH_NOT_FOUND");
+ if (!fs.existsSync(wikiDir) || (!isStashWiki && !options.withSources)) {
+ throw new NotFoundError(wikiNotFoundMessage(name), "STASH_NOT_FOUND");
  }
  const wikisRoot = resolveWikisRoot(stashDir);
  if (!isWithin(wikiDir, wikisRoot)) {
@@ -480,15 +496,16 @@ function readPageFrontmatter(absPath) {
  return out;
  }
  /**
- * List the pages in a wiki, excluding `schema.md`, `index.md`, `log.md`, and
- * anything under `raw/`. Each entry carries its ref (`wiki:<name>/<page>`),
- * path, and frontmatter-derived fields for orientation.
+ * List the addressable markdown entries in a wiki, excluding only the
+ * infrastructure files `schema.md`, `index.md`, and `log.md`. This includes
+ * both authored pages and `raw/` sources so `wiki pages` can inventory content
+ * written via `akm wiki stash`.
  */
  export function listPages(stashDir, name) {
  const wikiDir = resolveWikiSource(stashDir, name).path;
- const { pages } = scanWikiFiles(wikiDir);
+ const { pages, raws } = scanWikiFiles(wikiDir);
  const result = [];
- for (const abs of pages) {
+ for (const abs of [...pages, ...raws]) {
  const pageName = pageNameFromPath(wikiDir, abs);
  const ref = `wiki:${name}/${pageName}`;
  const fm = readPageFrontmatter(abs);
@@ -525,7 +542,6 @@ export async function searchInWiki(input) {
  }
  throw err;
  }
- const rawDir = path.join(wikiDir, RAW_SUBDIR);
  const filtered = [];
  for (const hit of response.hits) {
  // hits can be SourceSearchHit or RegistrySearchResultHit (union); filter
@@ -541,9 +557,6 @@ export async function searchInWiki(input) {
  const basename = path.basename(stashHit.path);
  if (WIKI_SPECIAL_FILES.has(basename) && path.dirname(stashHit.path) === wikiDir)
  continue;
- // Exclude anything under raw/
- if (isWithin(stashHit.path, rawDir))
- continue;
  filtered.push(stashHit);
  }
  return { ...response, hits: filtered, registryHits: undefined };
@@ -4,6 +4,28 @@ This release ships the v1 architecture refactor on top of the earlier 0.6
  terminology cut (`kit` → `stash` in the registry wire format). The CLI
  command surface is unchanged. Most users have nothing to do.
 
+ ## Workflows: schema-driven indexing
+
+ Workflows are now compiled into a validated `WorkflowDocument` JSON shape
+ with line-anchored `SourceRef`s back into the source markdown, cached in a
+ new `workflow_documents` table in `index.db`. `akm workflow next` reads from
+ the cache instead of re-parsing markdown each step.
+
+ A new `akm workflow validate <ref|path>` subcommand surfaces every error in
+ one pass (without running a full reindex), formatted as `path:line — message`.
+
+ `DB_VERSION` is bumped 8 → 9 to introduce the table; first run after upgrade
+ drops + recreates all `index.db` tables (preserving `usage_events`). The next
+ `akm index` rebuilds. Run-state in `workflow.db` is untouched.
+
+ ## Feedback expanded to any indexed ref
+
+ `akm feedback <ref>` now accepts any indexed ref — `memory:`, `vault:`,
+ `workflow:`, `wiki:`, etc. The ref must be present in the local index.
+ Vault feedback never echoes vault values. Telemetry persists both
+ `entry_ref` and `entry_id` so signals survive a reindex, and feedback
+ events now feed into utility recomputation alongside search/show signals.
+
  Locked v1 decisions:
  - `writable` defaults to `true` on `filesystem`, `false` on `git` /
  `website` / `npm`.
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "akm-cli",
- "version": "0.6.0-rc2",
+ "version": "0.6.1",
  "type": "module",
  "description": "akm (Agent Kit Manager) — A package manager for AI agent skills, commands, tools, and knowledge. Works with Claude Code, OpenCode, Cursor, and any AI coding assistant.",
  "keywords": [