akm-cli 0.6.0-rc2 → 0.6.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -10,9 +10,17 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
 
  ### Added
 
+ - **`akm workflow validate <ref|path>`** — new subcommand that validates a workflow markdown file or ref, surfacing every error in one pass (without running a full reindex).
+ - **`akm feedback` now accepts any indexed ref** — previously type-restricted. `memory:`, `vault:`, `workflow:`, `wiki:` refs all work. Vault feedback never echoes vault values.
+ - **`akm upgrade` runs post-upgrade tasks automatically.** After a successful upgrade, the new binary is invoked as a child process running `akm index`, which auto-migrates any legacy `stashes` → `sources` config keys via `loadConfig` and rebuilds the index against the new schema (`DB_VERSION` 8 → 9 forces a rebuild). Pass `--skip-post-upgrade` to opt out (config migration still runs on the next `akm` invocation; you'd just need to run `akm index` yourself). The result is reported in the `postUpgrade` field of the upgrade response.
  - **`writable` flag on sources.** New optional `SourceConfigEntry.writable` controls whether write commands (`akm remember`, `akm import`, `akm save`, `akm clone`) may target the source. Defaults: `true` for `filesystem`, `false` for `git` / `website` / `npm`. `writable: true` on `website` or `npm` is rejected at config load with `ConfigError("writable: true is only supported on filesystem and git sources")`.
  - **`defaultWriteTarget` root config key.** Names the source that receives writes when no `--target` flag is given. Resolution order: `--target` → `defaultWriteTarget` → `stashDir` (working stash) → `ConfigError("no writable source configured; run \`akm init\`")`. There is no implicit "first writable in `sources[]` order" fallback.
 
+ ### Changed
+
+ - **Workflows are now stored as validated `WorkflowDocument` JSON** — each workflow is compiled into a `WorkflowDocument` with line-anchored `SourceRef`s back into the source markdown, cached in a new `workflow_documents` table in `index.db`. The run engine reads from the cache on `akm workflow next` instead of re-parsing markdown each step.
+ - **Feedback events flow into utility recomputation** — positive/negative feedback signals now feed utility scoring alongside search/show events. Telemetry records both `entry_ref` and `entry_id` so feedback signals survive a reindex.
+
  ### Changed (breaking)
 
  - **v1 architecture refactor.** The internal architecture was rebuilt around a single minimal `SourceProvider` interface (`{ name, kind, init, path, sync? }`), a unified FTS5 index that owns search and show, and a single `writeAssetToSource` helper that owns all writes. The CLI command surface and all user-visible config keys are unchanged. See `docs/migration/v1.md` for the full guide.
@@ -30,6 +38,10 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
  - **Migration**: a curated registry author should regenerate their `index.json` (rename `kits` → `stashes`, drop legacy keyword filtering). Publishers should add the `akm-stash` keyword/topic and remove `akm-kit`/`agentikit`.
  - **`akm registry` description**: changed from "Manage kit registries" to "Manage stash registries".
 
+ ### Migration / Breaking
+
+ - **`DB_VERSION` bumped 8 → 9.** On first run after upgrade, the version-mismatch path in `ensureSchema()` drops + recreates all `index.db` tables (preserving `usage_events` via a typed backup); the next `akm index` rebuilds the index. `workflow.db` (run state) is unaffected.
+
  ### Removed (breaking)
 
  - **OpenViking source provider.** The `openviking` source kind is no longer supported. Configs that contain one fail to load with `ConfigError("openviking is not supported in akm v1. …")` and a hint pointing to `akm config sources remove <name>`. API-backed sources will return as a separate `QuerySource` tier post-v1. To downgrade in the meantime, pin to `akm-cli@0.5`.
package/dist/cli.js CHANGED
@@ -25,7 +25,7 @@ import { ConfigError, NotFoundError, UsageError } from "./core/errors";
  import { getCacheDir, getDbPath, getDefaultStashDir } from "./core/paths";
  import { setQuiet, warn } from "./core/warn";
  import { resolveWriteTarget, writeAssetToSource } from "./core/write-source";
- import { closeDatabase, openDatabase } from "./indexer/db";
+ import { closeDatabase, findEntryIdByRef, openDatabase } from "./indexer/db";
  import { akmIndex } from "./indexer/indexer";
  import { resolveSourceEntries } from "./indexer/search-source";
  import { insertUsageEvent } from "./indexer/usage-events";
@@ -115,7 +115,10 @@ const initCommand = defineCommand({
  },
  async run({ args }) {
  await runWithJsonErrors(async () => {
- const result = await akmInit({ dir: args.dir });
+ // Accept both historical spellings for backwards compatibility with
+ // older docs/scripts that used `--stashDir`.
+ const legacyDir = parseFlagValue(process.argv, "--stashDir") ?? parseFlagValue(process.argv, "--stash-dir");
+ const result = await akmInit({ dir: args.dir ?? legacyDir });
  output("init", result);
  });
  },
@@ -378,6 +381,11 @@ const upgradeCommand = defineCommand({
  description: "Skip checksum verification (not recommended)",
  default: false,
  },
+ "skip-post-upgrade": {
+ type: "boolean",
+ description: "Skip the post-upgrade `akm index` rebuild (config auto-migration still runs on next `akm` invocation)",
+ default: false,
+ },
  },
  async run({ args }) {
  await runWithJsonErrors(async () => {
@@ -387,7 +395,8 @@ const upgradeCommand = defineCommand({
  return;
  }
  const skipChecksum = getHyphenatedBoolean(args, "skip-checksum");
- const result = await performUpgrade(check, { force: args.force, skipChecksum });
+ const skipPostUpgrade = getHyphenatedBoolean(args, "skip-post-upgrade");
+ const result = await performUpgrade(check, { force: args.force, skipChecksum, skipPostUpgrade });
  output("upgrade", result);
  });
  },
@@ -781,7 +790,7 @@ const registryCommand = defineCommand({
  const feedbackCommand = defineCommand({
  meta: {
  name: "feedback",
- description: "Record positive or negative feedback for a stash asset",
+ description: "Record positive or negative feedback for any indexed stash asset",
  },
  args: {
  ref: { type: "positional", description: "Asset ref (type:name)", required: true },
@@ -795,6 +804,7 @@ const feedbackCommand = defineCommand({
  if (!ref) {
  throw new UsageError("Asset ref is required. Usage: akm feedback <ref> --positive|--negative");
  }
+ parseAssetRef(ref);
  if (args.positive && args.negative) {
  throw new UsageError("Specify either --positive or --negative, not both.");
  }
@@ -805,9 +815,14 @@ const feedbackCommand = defineCommand({
  const metadata = args.note ? JSON.stringify({ note: args.note }) : undefined;
  const db = openDatabase();
  try {
+ const entryId = findEntryIdByRef(db, ref);
+ if (entryId === undefined) {
+ throw new UsageError(`Ref "${ref}" is not in the current index. Run "akm index" and try again.`);
+ }
  insertUsageEvent(db, {
  event_type: "feedback",
  entry_ref: ref,
+ entry_id: entryId,
  signal,
  metadata,
  });
@@ -80,6 +80,7 @@ export async function checkForUpdate(currentVersion) {
  export async function performUpgrade(check, opts) {
  const { currentVersion, latestVersion, installMethod } = check;
  const force = opts?.force === true;
+ const skipPostUpgrade = opts?.skipPostUpgrade === true;
  // All install methods can short-circuit here unless the user explicitly forces an upgrade.
  if (!check.updateAvailable && !force) {
  return {
@@ -113,6 +114,7 @@ export async function performUpgrade(check, opts) {
  upgraded: true,
  installMethod,
  message: `akm upgraded via ${installMethod}`,
+ postUpgrade: runPostUpgradeTasks("akm", { skip: skipPostUpgrade }),
  };
  }
  if (installMethod === "unknown") {
@@ -283,8 +285,72 @@ export async function performUpgrade(check, opts) {
  installMethod,
  binaryPath: execPath,
  checksumVerified,
+ // For binary installs, the new binary now lives at execPath; spawn it
+ // directly so the post-upgrade work runs against the new code.
+ postUpgrade: runPostUpgradeTasks(execPath, { skip: skipPostUpgrade }),
  };
  }
+ /**
+ * Run the post-upgrade tasks against the *new* binary as a child process.
+ *
+ * Why a child process: the running akm process still has the old code in
+ * memory. Calling loadConfig()/akmIndex() in-process would use the old
+ * implementations and miss any DB_VERSION / config-key changes the new
+ * release introduces.
+ *
+ * The new binary's `akm index` does the work for us:
+ * 1. loadConfig() runs at startup — auto-migrates legacy `stashes` →
+ * `sources` if the on-disk config still uses the old key.
+ * 2. ensureSchema() detects DB_VERSION mismatch and rebuilds index.db
+ * tables (preserving usage_events).
+ * 3. The full reindex repopulates entries + workflow_documents + FTS.
+ */
+ function runPostUpgradeTasks(akmBin, opts) {
+ if (opts.skip) {
+ return {
+ ok: true,
+ skipped: true,
+ message: "Skipped post-upgrade tasks. Run `akm index` manually to migrate config and rebuild the index.",
+ };
+ }
+ try {
+ const result = childProcess.spawnSync(akmBin, ["index"], {
+ encoding: "utf8",
+ env: process.env,
+ stdio: "pipe",
+ });
+ if (result.error) {
+ return {
+ ok: false,
+ skipped: false,
+ message: `Post-upgrade tasks could not start: ${result.error.message}. Run \`akm index\` manually.`,
+ };
+ }
+ if (result.status !== 0) {
+ const detail = (result.stderr ?? "").trim() || (result.stdout ?? "").trim() || `exit code ${result.status}`;
+ return {
+ ok: false,
+ skipped: false,
+ exitCode: result.status,
+ message: `Post-upgrade \`akm index\` failed (${detail}). Run \`akm index\` manually.`,
+ };
+ }
+ return {
+ ok: true,
+ skipped: false,
+ exitCode: 0,
+ message: "Config migrated (if needed) and index rebuilt against the new binary.",
+ };
+ }
+ catch (err) {
+ const detail = err instanceof Error ? err.message : String(err);
+ return {
+ ok: false,
+ skipped: false,
+ message: `Post-upgrade tasks failed: ${detail}. Run \`akm index\` manually.`,
+ };
+ }
+ }
  function parseChecksumForFile(checksumsText, filename) {
  for (const line of checksumsText.split("\n")) {
  const trimmed = line.trim();
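
With the hunks above, `performUpgrade` now attaches a `postUpgrade` result object to its return value. A small consumer sketch, assuming the exported functions shown above — the import path and the starting version string are illustrative:

```js
import { checkForUpdate, performUpgrade } from "./upgrade.js"; // illustrative path

const check = await checkForUpdate("0.6.0-rc2");
const result = await performUpgrade(check, { force: false, skipChecksum: false, skipPostUpgrade: false });

// postUpgrade.ok === false means the upgrade itself succeeded but the follow-up
// `akm index` did not; the message tells the user to run it manually.
if (result.upgraded && result.postUpgrade && !result.postUpgrade.ok) {
  console.warn(result.postUpgrade.message);
}
```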
@@ -25,15 +25,15 @@ import { parseAssetRef } from "../core/asset-ref";
  import { loadConfig } from "../core/config";
  import { NotFoundError, UsageError } from "../core/errors";
  import { parseFrontmatter, toStringOrUndefined } from "../core/frontmatter";
- import { closeDatabase, openDatabase } from "../indexer/db";
+ import { closeDatabase, findEntryIdByRef, openDatabase } from "../indexer/db";
  import { buildFileContext, buildRenderContext, getRenderer, runMatchers } from "../indexer/file-context";
  import { lookup } from "../indexer/indexer";
  import { loadStashFile } from "../indexer/metadata";
  import { buildEditHint, findSourceForPath, isEditable, resolveSourceEntries } from "../indexer/search-source";
+ import { insertUsageEvent } from "../indexer/usage-events";
  import { resolveSourcesForOrigin } from "../registry/origin-resolve";
  // Eagerly import source providers to trigger self-registration.
  import "../sources/providers/index";
- import { insertUsageEvent } from "../indexer/usage-events";
  import { resolveAssetPath } from "../sources/resolve";
  /**
  * Show a wiki root (no page path) — returns the same payload as
@@ -136,15 +136,10 @@ function logShowEvent(ref, existingDb) {
  try {
  const db = existingDb ?? openDatabase();
  try {
- const parsed = parseAssetRef(ref);
- const safeName = parsed.name.replace(/%/g, "\\%").replace(/_/g, "\\_");
- const row = db
- .prepare("SELECT id FROM entries WHERE entry_key LIKE ? ESCAPE '\\' AND entry_type = ? LIMIT 1")
- .get(`%:${parsed.type}:${safeName}`, parsed.type);
  insertUsageEvent(db, {
  event_type: "show",
  entry_ref: ref,
- entry_id: row?.id,
+ entry_id: findEntryIdByRef(db, ref),
  });
  }
  finally {
@@ -108,20 +108,7 @@ export function saveConfig(config) {
  const dir = path.dirname(configPath);
  fs.mkdirSync(dir, { recursive: true });
  const sanitized = sanitizeConfigForWrite(config);
- const tmpPath = `${configPath}.tmp.${process.pid}.${Math.random().toString(36).slice(2)}`;
- try {
- fs.writeFileSync(tmpPath, `${JSON.stringify(sanitized, null, 2)}\n`, "utf8");
- fs.renameSync(tmpPath, configPath);
- }
- catch (err) {
- try {
- fs.unlinkSync(tmpPath);
- }
- catch {
- /* ignore cleanup failure */
- }
- throw err;
- }
+ writeConfigObject(configPath, sanitized);
  }
  /**
  * Strip apiKey fields before writing config to disk.
@@ -230,12 +217,9 @@ function pickKnownKeys(raw) {
  else {
  const legacyStashes = parseStashesConfig(raw.stashes);
  if (legacyStashes) {
- // Backwards-compat migration: `stashes[]` → `sources[]` in-memory.
- // Emit a one-time deprecation warning and carry the value forward as
- // `sources`. The renamed key is persisted on the next `akm config` write.
- warn('Config key "stashes" is deprecated; rename it to "sources" in your config file ' +
- `(edit it directly at ${_getConfigPath()}). ` +
- "Your configuration has been loaded successfully — no manual action is required right now.");
+ // Backwards-compat fallback: configs that still carry `stashes[]` are
+ // normalized to `sources[]` after the raw file loader has had a chance to
+ // auto-migrate the on-disk key.
  config.sources = legacyStashes;
  }
  }
@@ -264,20 +248,16 @@ function pickKnownKeys(raw) {
  }
  function readNormalizedConfig(configPath) {
  const raw = readConfigObject(configPath);
- const expanded = raw ? expandEnvVars(raw) : undefined;
+ const migrated = raw ? maybeAutoMigrateLegacyStashes(configPath, raw) : undefined;
+ const expanded = migrated ? expandEnvVars(migrated) : undefined;
  return expanded ? pickKnownKeys(expanded) : undefined;
  }
- function readNormalizedConfigFromText(_configPath, text) {
- let raw;
- try {
- raw = JSON.parse(stripJsonComments(text));
- }
- catch {
+ function readNormalizedConfigFromText(configPath, text) {
+ const raw = parseConfigObjectFromText(text);
+ if (!raw)
  return undefined;
- }
- if (typeof raw !== "object" || raw === null || Array.isArray(raw))
- return undefined;
- const expanded = expandEnvVars(raw);
+ const migrated = maybeAutoMigrateLegacyStashes(configPath, raw);
+ const expanded = expandEnvVars(migrated);
  return pickKnownKeys(expanded);
  }
  function parseOutputConfig(value) {
@@ -340,6 +320,14 @@ function expandEnvVars(value, fieldName) {
  function readConfigObject(configPath) {
  try {
  const text = fs.readFileSync(configPath, "utf8");
+ return parseConfigObjectFromText(text);
+ }
+ catch {
+ return undefined;
+ }
+ }
+ function parseConfigObjectFromText(text) {
+ try {
  const raw = JSON.parse(stripJsonComments(text));
  if (typeof raw !== "object" || raw === null || Array.isArray(raw))
  return undefined;
@@ -349,6 +337,47 @@ function readConfigObject(configPath) {
  return undefined;
  }
  }
+ /**
+ * Best-effort on-disk config migration for the legacy `stashes` key.
+ *
+ * When a config file still uses `stashes` and does not already define
+ * `sources`, rewrite the file in place with `sources` replacing `stashes`,
+ * emit a one-time notice on success, and return the migrated object. If the
+ * rewrite fails, emit a warning and return the original object so the loader
+ * can still continue with an in-memory fallback.
+ */
+ function maybeAutoMigrateLegacyStashes(configPath, raw) {
+ if (Object.hasOwn(raw, "sources") || !Object.hasOwn(raw, "stashes")) {
+ return raw;
+ }
+ const migrated = Object.fromEntries(Object.entries(raw).map(([key, value]) => (key === "stashes" ? ["sources", value] : [key, value])));
+ try {
+ writeConfigObject(configPath, migrated);
+ warn('Config migrated: "stashes" → "sources" in config.json');
+ return migrated;
+ }
+ catch {
+ warn('Failed to migrate "stashes" → "sources" in config.json; continuing with the legacy key in memory. ' +
+ "Check file permissions or rename the key manually if this persists.");
+ return raw;
+ }
+ }
+ function writeConfigObject(configPath, config) {
+ const tmpPath = `${configPath}.tmp.${process.pid}.${Math.random().toString(36).slice(2)}`;
+ try {
+ fs.writeFileSync(tmpPath, `${JSON.stringify(config, null, 2)}\n`, "utf8");
+ fs.renameSync(tmpPath, configPath);
+ }
+ catch (err) {
+ try {
+ fs.unlinkSync(tmpPath);
+ }
+ catch {
+ /* ignore cleanup failure */
+ }
+ throw err;
+ }
+ }
  /**
  * Strip JavaScript-style comments from a JSON string (JSONC support).
  * Handles // line comments and /* block comments while preserving
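
For reference, the on-disk effect of `maybeAutoMigrateLegacyStashes` is a single key rename; entry contents and the other keys are untouched. An illustrative before/after — the source entry fields shown here are hypothetical:

```js
// Before: the legacy key is still present on disk.
const before = {
  stashDir: "~/.akm/stash",
  stashes: [{ name: "team-docs", kind: "git", url: "https://example.com/team-docs.git" }],
};

// After: the same entries live under `sources`, written atomically via
// writeConfigObject (tmp file + rename); key order is preserved.
const after = {
  stashDir: "~/.akm/stash",
  sources: [{ name: "team-docs", kind: "git", url: "https://example.com/team-docs.git" }],
};
```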
@@ -2,6 +2,7 @@ import { Database } from "bun:sqlite";
  import fs from "node:fs";
  import { createRequire } from "node:module";
  import path from "node:path";
+ import { parseAssetRef } from "../core/asset-ref";
  import { getDbPath } from "../core/paths";
  import { warn } from "../core/warn";
  import { cosineSimilarity } from "../llm/embedders/types";
@@ -717,6 +718,14 @@ export function getAllEntries(db, entryType) {
  }
  return entries;
  }
+ export function findEntryIdByRef(db, ref) {
+ const parsed = parseAssetRef(ref);
+ const suffix = `${parsed.type}:${parsed.name}`;
+ const row = db
+ .prepare("SELECT id FROM entries WHERE entry_type = ? AND substr(entry_key, length(entry_key) - length(?) + 1) = ? LIMIT 1")
+ .get(parsed.type, suffix, suffix);
+ return row?.id;
+ }
  export function getEntryCount(db) {
  const row = db.prepare("SELECT COUNT(*) AS cnt FROM entries").get();
  return row.cnt;
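
The exact suffix comparison above replaces the earlier `LIKE`-with-escaping lookup in `logShowEvent`. A quick sketch of what it matches — the `entry_key` value here is hypothetical; the point is that indexed keys end with `type:name`:

```js
const ref = "skill:code-review";                // parseAssetRef(ref) → { type: "skill", name: "code-review" }
const suffix = "skill:code-review";             // `${parsed.type}:${parsed.name}`
const entryKey = "team-docs:skill:code-review"; // hypothetical indexed entry_key

// The SQL substr(...) comparison is an exact suffix test, equivalent to:
console.log(entryKey.endsWith(suffix)); // true → this row's id becomes entry_id on the usage event
```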
@@ -772,8 +772,10 @@ const USAGE_EVENT_RETENTION_DAYS = 90;
  * For each entry:
  * - Count search appearances (event_type = 'search')
  * - Count show events (event_type = 'show')
+ * - Count positive/negative feedback events
  * - Compute select_rate = showCount / searchCount, clamped to [0, 1]
- * - Update utility via EMA: utility = previousUtility * 0.7 + selectRate * 0.3
+ * - Convert feedback counts into a positive-only feedback_rate
+ * - Update utility via EMA from the stronger of select_rate / feedback_rate
  *
  * Also purges usage_events older than 90 days and ensures the M-1
  * usage_events table exists before querying.
@@ -803,6 +805,8 @@ export function recomputeUtilityScores(db) {
  SELECT entry_id,
  SUM(CASE WHEN event_type = 'search' THEN 1 ELSE 0 END) AS search_count,
  SUM(CASE WHEN event_type = 'show' THEN 1 ELSE 0 END) AS show_count,
+ SUM(CASE WHEN event_type = 'feedback' AND signal = 'positive' THEN 1 ELSE 0 END) AS positive_feedback_count,
+ SUM(CASE WHEN event_type = 'feedback' AND signal = 'negative' THEN 1 ELSE 0 END) AS negative_feedback_count,
  MAX(created_at) AS last_used_at
  FROM usage_events
  WHERE entry_id IS NOT NULL
@@ -821,8 +825,11 @@ export function recomputeUtilityScores(db) {
  }
  for (const row of usageRows) {
  const selectRate = row.search_count > 0 ? Math.min(1, row.show_count / row.search_count) : 0;
+ const feedbackTotal = row.positive_feedback_count + row.negative_feedback_count;
+ const feedbackRate = feedbackTotal > 0 ? Math.max(0, row.positive_feedback_count - row.negative_feedback_count) / feedbackTotal : 0;
+ const effectiveRate = Math.max(selectRate, feedbackRate);
  const prevUtility = existingScores.get(row.entry_id) ?? 0;
- const utility = prevUtility * emaDecay + selectRate * emaNew;
+ const utility = prevUtility * emaDecay + effectiveRate * emaNew;
  upsertUtilityScore(db, row.entry_id, {
  utility,
  showCount: row.show_count,
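
A worked example of the new scoring, with values chosen for illustration; `emaDecay` / `emaNew` are taken as 0.7 / 0.3 per the "utility = previousUtility * 0.7 + selectRate * 0.3" comment in the earlier hunk:

```js
// Sketch: one entry with 10 search appearances, 2 shows, 3 positive and 1 negative feedback.
const emaDecay = 0.7, emaNew = 0.3;
const searchCount = 10, showCount = 2;
const positive = 3, negative = 1;

const selectRate = Math.min(1, showCount / searchCount);                       // 0.2
const feedbackRate = Math.max(0, positive - negative) / (positive + negative); // 2 / 4 = 0.5
const effectiveRate = Math.max(selectRate, feedbackRate);                      // 0.5 — feedback wins here

const prevUtility = 0.4;
const utility = prevUtility * emaDecay + effectiveRate * emaNew;               // 0.4*0.7 + 0.5*0.3 = 0.43
```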
@@ -125,6 +125,8 @@ akm workflow validate workflows/foo.md # Validate a workflow file or ref
  akm workflow next workflow:ship-release # Start or resume the next workflow step
  akm feedback skill:code-review --positive # Record that an asset helped
  akm feedback agent:reviewer --negative # Record that an asset missed the mark
+ akm feedback memory:deployment-notes --positive # Works for memories too
+ akm feedback vault:prod --positive # Records vault feedback without surfacing values
  \`\`\`
 
  Use \`akm feedback\` whenever an asset materially helps or fails so future search
@@ -433,7 +433,7 @@ export function formatWikiRemovePlain(r) {
  const preserved = r.preservedRaw === true;
  const removed = Array.isArray(r.removed) ? r.removed.length : 0;
  const base = `Removed wiki ${String(r.name ?? "?")} (${removed} path(s))`;
- return preserved ? `${base}; preserved ${String(r.rawPath ?? "raw/")}` : base;
+ return preserved ? `${base}; raw/ preserved at ${String(r.rawPath ?? "raw/")}` : base;
  }
  export function formatWikiPagesPlain(r) {
  const pages = Array.isArray(r.pages) ? r.pages : [];
package/dist/wiki/wiki.js CHANGED
@@ -76,7 +76,7 @@ function registeredWikiSources(stashDir) {
  export function resolveWikiSource(stashDir, name) {
  validateWikiName(name);
  const wikiDir = resolveWikiDir(stashDir, name);
- if (fs.existsSync(wikiDir)) {
+ if (fs.existsSync(wikiDir) && isRecognizedStashWiki(wikiDir)) {
  return { name, path: wikiDir, mode: "stash" };
  }
  const external = registeredWikiSources(stashDir).find((source) => source.name === name);
@@ -87,7 +87,7 @@ export function resolveWikiSource(stashDir, name) {
  export function ensureWikiNameAvailable(stashDir, name) {
  validateWikiName(name);
  const wikiDir = resolveWikiDir(stashDir, name);
- if (fs.existsSync(wikiDir)) {
+ if (fs.existsSync(wikiDir) && isRecognizedStashWiki(wikiDir)) {
  throw new UsageError(`Wiki already exists: ${name}.`, "RESOURCE_ALREADY_EXISTS");
  }
  const external = registeredWikiSources(stashDir).find((source) => source.name === name);
@@ -164,6 +164,17 @@ function scanWikiFiles(wikiDir) {
  }
  return { pages, raws, lastModifiedMs, pagesLastModifiedMs };
  }
+ function hasWikiInfrastructure(wikiDir) {
+ for (const file of WIKI_SPECIAL_FILES) {
+ if (fs.existsSync(path.join(wikiDir, file)))
+ return true;
+ }
+ return false;
+ }
+ function isRecognizedStashWiki(wikiDir, buckets) {
+ const scanned = buckets ?? scanWikiFiles(wikiDir);
+ return scanned.pages.length > 0 || hasWikiInfrastructure(wikiDir);
+ }
  function readSchemaDescription(wikiDir) {
  const schemaPath = path.join(wikiDir, SCHEMA_MD);
  let raw;
@@ -207,6 +218,8 @@ export function listWikis(stashDir) {
  }
  const summarize = (name, dir) => {
  const buckets = scanWikiFiles(dir);
+ if (!isRecognizedStashWiki(dir, buckets))
+ return;
  const summary = {
  name,
  path: dir,
@@ -353,9 +366,11 @@ export function createWiki(stashDir, name) {
  * ignore that (e.g. idempotent cleanup) by catching.
  */
  export function removeWiki(stashDir, name, options = {}) {
- const resolved = resolveWikiSource(stashDir, name);
- const wikiDir = resolved.path;
- if (resolved.mode === "external") {
+ validateWikiName(name);
+ const wikiDir = resolveWikiDir(stashDir, name);
+ const external = registeredWikiSources(stashDir).find((source) => source.name === name);
+ const isStashWiki = fs.existsSync(wikiDir) && isRecognizedStashWiki(wikiDir);
+ if (!isStashWiki && external) {
  const config = loadUserConfig();
  const filteredSources = (config.sources ?? config.stashes ?? []).filter((entry) => entry.wikiName !== name);
  const installed = (config.installed ?? []).filter((entry) => entry.wikiName !== name);
@@ -373,8 +388,8 @@ export function removeWiki(stashDir, name, options = {}) {
  unregistered: true,
  };
  }
- if (!fs.existsSync(wikiDir)) {
- throw new NotFoundError(`Wiki not found: ${name}.`, "STASH_NOT_FOUND");
+ if (!fs.existsSync(wikiDir) || (!isStashWiki && !options.withSources)) {
+ throw new NotFoundError(wikiNotFoundMessage(name), "STASH_NOT_FOUND");
  }
  const wikisRoot = resolveWikisRoot(stashDir);
  if (!isWithin(wikiDir, wikisRoot)) {
@@ -4,6 +4,28 @@ This release ships the v1 architecture refactor on top of the earlier 0.6
  terminology cut (`kit` → `stash` in the registry wire format). The CLI
  command surface is unchanged. Most users have nothing to do.
 
+ ## Workflows: schema-driven indexing
+
+ Workflows are now compiled into a validated `WorkflowDocument` JSON shape
+ with line-anchored `SourceRef`s back into the source markdown, cached in a
+ new `workflow_documents` table in `index.db`. `akm workflow next` reads from
+ the cache instead of re-parsing markdown each step.
+
+ A new `akm workflow validate <ref|path>` subcommand surfaces every error in
+ one pass (without running a full reindex), formatted as `path:line — message`.
+
+ `DB_VERSION` is bumped 8 → 9 to introduce the table; the first run after upgrade
+ drops + recreates all `index.db` tables (preserving `usage_events`), and the next
+ `akm index` rebuilds them. Run state in `workflow.db` is untouched.
+
+ ## Feedback expanded to any indexed ref
+
+ `akm feedback <ref>` now accepts any indexed ref — `memory:`, `vault:`,
+ `workflow:`, `wiki:`, and so on. The ref must be present in the local index.
+ Vault feedback never echoes vault values. Telemetry persists both
+ `entry_ref` and `entry_id` so signals survive a reindex, and feedback
+ events now feed into utility recomputation alongside search/show signals.
+
  Locked v1 decisions:
  - `writable` defaults to `true` on `filesystem`, `false` on `git` /
  `website` / `npm`.
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "akm-cli",
- "version": "0.6.0-rc2",
+ "version": "0.6.0",
  "type": "module",
  "description": "akm (Agent Kit Manager) — A package manager for AI agent skills, commands, tools, and knowledge. Works with Claude Code, OpenCode, Cursor, and any AI coding assistant.",
  "keywords": [