skilld 0.7.0 → 0.8.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -8,7 +8,7 @@
 
 ## Why?
 
- Agents have [knowledge cutoffs](https://platform.claude.com/docs/en/about-claude/models/overview#latest-models-comparison) and predict the most likely code from training data, not the best code. For example, [Vue v3.5](https://blog.vuejs.org/posts/vue-3-5) was released over 16 months ago and agents refuse so use its features out-of-the-box. This is because they're trained on conventions not the latest docs.
+ Agents have [knowledge cutoffs](https://platform.claude.com/docs/en/about-claude/models/overview#latest-models-comparison) and predict the most likely code from training data, not the best code. For example, [Vue v3.5](https://blog.vuejs.org/posts/vue-3-5) was released over 16 months ago and agents refuse to use its features out-of-the-box. This is because they're trained on conventions not the latest docs.
 
 Methods of getting the right context to your agent require either manual curation, author opt-in, or external servers. See [the landscape](#the-landscape)
 for more details.
@@ -59,7 +59,7 @@ If you need to re-configure skilld, just run `npx -y skilld config` to update yo
 
 ## FAQ
 
- ### Why doesn't the skills run?
+ ### Why don't the skills run?
 
 Try this in your project/user prompt:
 
@@ -72,16 +72,16 @@ For each skill, determine YES/NO relevance and invoke all YES skills before proc
 
 Context7 is an MCP that fetches raw doc chunks at query time. You get different results each prompt, no curation, and it requires their server. Skilld is local-first: it generates a SKILL.md that lives in your project, tied to your actual package versions. No MCP dependency, no per-prompt latency, and it goes further with LLM-enhanced sections, prompt injection sanitization, and semantic search.
 
- ### Will I be prompt injection?
+ ### Will I be prompt injected?
 
 Skilld pulls issues from GitHub which could be abused for potential prompt injection.
 
- Skilld treats all data as unstrusted, running in permissioned environments and using best practices to avoid injections.
- However, always be cautious when using skills from untrusted source.
+ Skilld treats all data as untrusted, running in permissioned environments and using best practices to avoid injections.
+ However, always be cautious when using skills from untrusted sources.
 
 ### Do skills update when my deps update?
 
- Yes. Run `skilld update` to regenerate outdated skills, or add `skilld --prepare -b` to your prepare script and they regenerate in the background whenever you install packages.
+ Yes. Run `skilld update` to regenerate outdated skills, or add `skilld update -b` to your prepare script and they regenerate in the background whenever you install packages.
 
 ## Installation
 
@@ -102,7 +102,7 @@ Add to `package.json` to keep skills fresh on install:
 ```json
 {
   "scripts": {
-    "prepare": "skilld --prepare -b"
+    "prepare": "skilld update -b"
   }
 }
 ```
@@ -169,8 +169,6 @@ skilld config
 | `--force` | `-f` | `false` | Ignore all caches, re-fetch docs and regenerate |
 | `--model` | `-m` | config default | LLM model for skill generation (sonnet, haiku, opus, etc.) |
 | `--debug` | | `false` | Save raw LLM output to logs/ for each section |
- | `--prepare` | | `false` | Non-interactive sync for prepare hook (outdated only) |
- | `--background` | `-b` | `false` | Run `--prepare` in a detached background process |
 
 ## The Landscape
 
@@ -194,6 +192,19 @@ Several approaches exist for steering agent knowledge. Each fills a different ni
 - **[skills-npm](https://github.com/antfu/skills-npm)**: the ideal end-state: zero-token skills shipped by the package author, but requires every maintainer to opt in.
 - **skilld**: generates version-aware skills from existing docs, changelogs, issues, and discussions. Works for any package without author opt-in.
 
+ ## Telemetry
+
+ Skilld sends anonymous install events to [skills.sh](https://skills.sh/) so skills can be discovered and ranked. No personal information is collected.
+
+ Telemetry is automatically disabled in CI environments.
+
+ To opt out, set either environment variable:
+
+ ```bash
+ DISABLE_TELEMETRY=1
+ DO_NOT_TRACK=1
+ ```
+
 ## Related
 
 - [skills-npm](https://github.com/antfu/skills-npm) - Convention for shipping agent skills in npm packages
@@ -1 +1 @@
- {"version":3,"file":"config.mjs","names":[],"sources":["../../src/cache/version.ts","../../src/cache/config.ts"],"sourcesContent":["/**\n * Version utilities\n */\n\nimport { resolve } from 'pathe'\nimport { REFERENCES_DIR } from './config'\n\n/** Validate npm package name (scoped or unscoped) */\nconst VALID_PKG_NAME = /^(?:@[a-z0-9][-a-z0-9._]*\\/)?[a-z0-9][-a-z0-9._]*$/\n\n/** Validate version string (semver-ish, no path separators) */\nconst VALID_VERSION = /^[a-z0-9][-\\w.+]*$/i\n\n/**\n * Get exact version key for cache keying\n */\nexport function getVersionKey(version: string): string {\n return version\n}\n\n/**\n * Get cache key for a package: name@version\n */\nexport function getCacheKey(name: string, version: string): string {\n return `${name}@${getVersionKey(version)}`\n}\n\n/**\n * Get path to cached package references.\n * Validates name/version to prevent path traversal.\n */\nexport function getCacheDir(name: string, version: string): string {\n if (!VALID_PKG_NAME.test(name))\n throw new Error(`Invalid package name: ${name}`)\n if (!VALID_VERSION.test(version))\n throw new Error(`Invalid version: ${version}`)\n\n const dir = resolve(REFERENCES_DIR, getCacheKey(name, version))\n if (!dir.startsWith(REFERENCES_DIR))\n throw new Error(`Path traversal detected: ${dir}`)\n return dir\n}\n","/**\n * Cache configuration\n */\n\nimport { homedir } from 'node:os'\nimport { join } from 'pathe'\nimport { getCacheKey } from './version'\n\n/** Global cache directory */\nexport const CACHE_DIR = join(homedir(), '.skilld')\n\n/** References subdirectory */\nexport const REFERENCES_DIR = join(CACHE_DIR, 'references')\n\n/** Get search DB path for a specific package@version */\nexport function getPackageDbPath(name: string, version: string): string {\n return join(REFERENCES_DIR, getCacheKey(name, version), 'search.db')\n}\n"],"mappings":";;AAQA,MAAM,iBAAiB;AAGvB,MAAM,gBAAgB;AAKtB,SAAgB,cAAc,SAAyB;AACrD,QAAO;;AAMT,SAAgB,YAAY,MAAc,SAAyB;AACjE,QAAO,GAAG,KAAK,GAAG,cAAc,QAAQ;;AAO1C,SAAgB,YAAY,MAAc,SAAyB;AACjE,KAAI,CAAC,eAAe,KAAK,KAAK,CAC5B,OAAM,IAAI,MAAM,yBAAyB,OAAO;AAClD,KAAI,CAAC,cAAc,KAAK,QAAQ,CAC9B,OAAM,IAAI,MAAM,oBAAoB,UAAU;CAEhD,MAAM,MAAM,QAAQ,gBAAgB,YAAY,MAAM,QAAQ,CAAC;AAC/D,KAAI,CAAC,IAAI,WAAW,eAAe,CACjC,OAAM,IAAI,MAAM,4BAA4B,MAAM;AACpD,QAAO;;AC/BT,MAAa,YAAY,KAAK,SAAS,EAAE,UAAU;AAGnD,MAAa,iBAAiB,KAAK,WAAW,aAAa;AAG3D,SAAgB,iBAAiB,MAAc,SAAyB;AACtE,QAAO,KAAK,gBAAgB,YAAY,MAAM,QAAQ,EAAE,YAAY"}
+ {"version":3,"file":"config.mjs","names":[],"sources":["../../src/cache/version.ts","../../src/cache/config.ts"],"sourcesContent":["/**\n * Version utilities\n */\n\nimport { resolve } from 'pathe'\nimport { REFERENCES_DIR } from './config.ts'\n\n/** Validate npm package name (scoped or unscoped) */\nconst VALID_PKG_NAME = /^(?:@[a-z0-9][-a-z0-9._]*\\/)?[a-z0-9][-a-z0-9._]*$/\n\n/** Validate version string (semver-ish, no path separators) */\nconst VALID_VERSION = /^[a-z0-9][-\\w.+]*$/i\n\n/**\n * Get exact version key for cache keying\n */\nexport function getVersionKey(version: string): string {\n return version\n}\n\n/**\n * Get cache key for a package: name@version\n */\nexport function getCacheKey(name: string, version: string): string {\n return `${name}@${getVersionKey(version)}`\n}\n\n/**\n * Get path to cached package references.\n * Validates name/version to prevent path traversal.\n */\nexport function getCacheDir(name: string, version: string): string {\n if (!VALID_PKG_NAME.test(name))\n throw new Error(`Invalid package name: ${name}`)\n if (!VALID_VERSION.test(version))\n throw new Error(`Invalid version: ${version}`)\n\n const dir = resolve(REFERENCES_DIR, getCacheKey(name, version))\n if (!dir.startsWith(REFERENCES_DIR))\n throw new Error(`Path traversal detected: ${dir}`)\n return dir\n}\n","/**\n * Cache configuration\n */\n\nimport { homedir } from 'node:os'\nimport { join } from 'pathe'\nimport { getCacheKey } from './version.ts'\n\n/** Global cache directory */\nexport const CACHE_DIR = join(homedir(), '.skilld')\n\n/** References subdirectory */\nexport const REFERENCES_DIR = join(CACHE_DIR, 'references')\n\n/** Get search DB path for a specific package@version */\nexport function getPackageDbPath(name: string, version: string): string {\n return join(REFERENCES_DIR, getCacheKey(name, version), 'search.db')\n}\n"],"mappings":";;AAQA,MAAM,iBAAiB;AAGvB,MAAM,gBAAgB;AAKtB,SAAgB,cAAc,SAAyB;AACrD,QAAO;;AAMT,SAAgB,YAAY,MAAc,SAAyB;AACjE,QAAO,GAAG,KAAK,GAAG,cAAc,QAAQ;;AAO1C,SAAgB,YAAY,MAAc,SAAyB;AACjE,KAAI,CAAC,eAAe,KAAK,KAAK,CAC5B,OAAM,IAAI,MAAM,yBAAyB,OAAO;AAClD,KAAI,CAAC,cAAc,KAAK,QAAQ,CAC9B,OAAM,IAAI,MAAM,oBAAoB,UAAU;CAEhD,MAAM,MAAM,QAAQ,gBAAgB,YAAY,MAAM,QAAQ,CAAC;AAC/D,KAAI,CAAC,IAAI,WAAW,eAAe,CACjC,OAAM,IAAI,MAAM,4BAA4B,MAAM;AACpD,QAAO;;AC/BT,MAAa,YAAY,KAAK,SAAS,EAAE,UAAU;AAGnD,MAAa,iBAAiB,KAAK,WAAW,aAAa;AAG3D,SAAgB,iBAAiB,MAAc,SAAyB;AACtE,QAAO,KAAK,gBAAgB,YAAY,MAAM,QAAQ,EAAE,YAAY"}
@@ -1,6 +1,6 @@
 import { t as __exportAll } from "./chunk.mjs";
 import { _ as writeSections, b as sanitizeMarkdown, h as readCachedSection, y as repairMarkdown } from "./storage.mjs";
- import { o as getFilePatterns, t as yamlEscape } from "./yaml.mjs";
+ import { l as getFilePatterns, o as mapInsert, t as yamlEscape } from "./yaml.mjs";
 import { homedir } from "node:os";
 import { dirname, join, relative } from "pathe";
 import { existsSync, lstatSync, mkdirSync, readFileSync, readdirSync, realpathSync, symlinkSync, unlinkSync, writeFileSync } from "node:fs";
@@ -544,7 +544,21 @@ function getAgentVersion(agentType) {
 return null;
 }
 }
- function apiChangesSection({ packageName, version, hasReleases, hasChangelog, hasIssues, hasDiscussions, features }) {
+ function maxLines(min, max, sectionCount) {
+ const scale = budgetScale(sectionCount);
+ return Math.max(min, Math.round(max * scale));
+ }
+ function maxItems(min, max, sectionCount) {
+ const scale = budgetScale(sectionCount);
+ return Math.max(min, Math.round(max * scale));
+ }
+ function budgetScale(sectionCount) {
+ if (!sectionCount || sectionCount <= 1) return 1;
+ if (sectionCount === 2) return .85;
+ if (sectionCount === 3) return .7;
+ return .6;
+ }
+ function apiChangesSection({ packageName, version, hasReleases, hasChangelog, hasIssues, hasDiscussions, features, enabledSectionCount }) {
 const [, major, minor] = version?.match(/^(\d+)\.(\d+)/) ?? [];
 const searchHints = [];
 if (features?.search !== false) {
@@ -623,7 +637,7 @@ This section documents version-specific API changes — prioritize recent major/
 
 Each item: ⚠️ (breaking/deprecated) or ✨ (new) + API name + what changed + source link.`,
 rules: [
- "- **API Changes:** 8-12 items from version history, MAX 80 lines",
+ `- **API Changes:** ${maxItems(6, 12, enabledSectionCount)} items from version history, MAX ${maxLines(50, 80, enabledSectionCount)} lines`,
 "- Prioritize recent major/minor releases over old patch versions",
 "- Focus on APIs that CHANGED, not general conventions or gotchas",
 "- New APIs get ✨, deprecated/breaking get ⚠️",
@@ -632,7 +646,7 @@ Each item: ⚠️ (breaking/deprecated) or ✨ (new) + API name + what changed +
 ].filter(Boolean)
 };
 }
- function bestPracticesSection({ packageName, hasIssues, hasDiscussions, hasReleases, hasChangelog, features }) {
+ function bestPracticesSection({ packageName, hasIssues, hasDiscussions, hasReleases, hasChangelog, features, enabledSectionCount }) {
 const searchHints = [];
 if (features?.search !== false) searchHints.push(`\`npx -y skilld search "recommended" -p ${packageName}\``, `\`npx -y skilld search "avoid" -p ${packageName}\``);
 const referenceWeights = [{
@@ -691,13 +705,13 @@ async function fetchUser(id: string, signal?: AbortSignal) {
 
 Each item: ✅ + pattern name + why it's preferred + source link. Code block only when the pattern isn't obvious from the title. Use the most relevant language tag (ts, vue, css, json, etc).`,
 rules: [
- "- **5-10 best practice items**",
- "- **MAX 150 lines** for best practices section",
+ `- **${maxItems(4, 10, enabledSectionCount)} best practice items**`,
+ `- **MAX ${maxLines(80, 150, enabledSectionCount)} lines** for best practices section`,
 "- **Only link files confirmed to exist** via Glob or Read — no guessed paths"
 ]
 };
 }
- function customSection({ heading, body }) {
+ function customSection({ heading, body }, enabledSectionCount) {
 return {
 task: `**Custom section — "${heading}":**\n${body}`,
 format: `Custom section format:
@@ -706,10 +720,10 @@ function customSection({ heading, body }) {
 
 Content addressing the user's instructions above, using concise examples and source links.
 \`\`\``,
- rules: [`- **Custom section "${heading}":** MAX 80 lines, use \`## ${heading}\` heading`]
+ rules: [`- **Custom section "${heading}":** MAX ${maxLines(50, 80, enabledSectionCount)} lines, use \`## ${heading}\` heading`]
 };
 }
- function apiSection({ hasReleases, hasChangelog, hasIssues, hasDiscussions }) {
+ function apiSection({ hasReleases, hasChangelog, hasIssues, hasDiscussions, enabledSectionCount }) {
 const referenceWeights = [{
 name: "Docs",
 path: "./.skilld/docs/",
@@ -769,7 +783,7 @@ useNuxtData, usePreviewMode, prerenderRoutes
 
 Comma-separated names per group. One line per doc page. Annotate version when APIs are recent additions. For single-doc packages, use a flat comma list.`,
 rules: [
- "- **Doc Map:** names only, grouped by doc page, MAX 25 lines",
+ `- **Doc Map:** names only, grouped by doc page, MAX ${maxLines(15, 25, enabledSectionCount)} lines`,
 "- Skip entirely for packages with fewer than 5 exports or only 1 doc page",
 "- Prioritize new/recent exports over well-established APIs",
 "- No signatures, no descriptions — the linked doc IS the description",
@@ -859,7 +873,7 @@ function getSectionDef(section, ctx, customPrompt) {
 case "api-changes": return apiChangesSection(ctx);
 case "best-practices": return bestPracticesSection(ctx);
 case "api": return apiSection(ctx);
- case "custom": return customPrompt ? customSection(customPrompt) : null;
+ case "custom": return customPrompt ? customSection(customPrompt, ctx.enabledSectionCount) : null;
 }
 }
 function buildSectionPrompt(opts) {
@@ -876,7 +890,8 @@ function buildSectionPrompt(opts) {
 hasDiscussions,
 hasReleases,
 hasChangelog,
- features: opts.features
+ features: opts.features,
+ enabledSectionCount: opts.enabledSectionCount
 }, customPrompt);
 if (!sectionDef) return "";
 const outputFile = SECTION_OUTPUT_FILES[section];
@@ -914,7 +929,8 @@ function buildAllSectionPrompts(opts) {
 for (const section of opts.sections) {
 const prompt = buildSectionPrompt({
 ...opts,
- section
+ section,
+ enabledSectionCount: opts.sections.length
 });
 if (prompt) result.set(section, prompt);
 }
@@ -1026,15 +1042,15 @@ function generatePackageHeader({ name, description, version, releasedAt, depende
 }
 lines.push("");
 const refs = [];
- refs.push(`[package.json](./.skilld/pkg/package.json)`);
+ refs.push(`[package.json](./.skilld/pkg/package.json) — exports, entry points`);
 if (packages && packages.length > 1) for (const pkg of packages) {
 const shortName = pkg.name.split("/").pop().toLowerCase();
 refs.push(`[pkg-${shortName}](./.skilld/pkg-${shortName}/package.json)`);
 }
- if (pkgFiles?.includes("README.md")) refs.push(`[README](./.skilld/pkg/README.md)`);
- if (hasIssues) refs.push(`[GitHub Issues](./.skilld/issues/_INDEX.md)`);
- if (hasDiscussions) refs.push(`[GitHub Discussions](./.skilld/discussions/_INDEX.md)`);
- if (hasReleases) refs.push(`[Releases](./.skilld/releases/_INDEX.md)`);
+ if (pkgFiles?.includes("README.md")) refs.push(`[README](./.skilld/pkg/README.md) — setup, basic usage`);
+ if (hasIssues) refs.push(`[GitHub Issues](./.skilld/issues/_INDEX.md) — bugs, workarounds, edge cases`);
+ if (hasDiscussions) refs.push(`[GitHub Discussions](./.skilld/discussions/_INDEX.md) — Q&A, patterns, recipes`);
+ if (hasReleases) refs.push(`[Releases](./.skilld/releases/_INDEX.md) — changelog, breaking changes, new APIs`);
 if (refs.length > 0) lines.push(`**References:** ${refs.join(" • ")}`);
 return lines.join("\n");
 }
@@ -1063,7 +1079,8 @@ function expandRepoName(repoUrl) {
 function generateFrontmatter({ name, version, description: pkgDescription, globs, body, generatedBy, dirName, packages, repoUrl }) {
 const patterns = globs ?? getFilePatterns(name);
 const globHint = patterns?.length ? ` or working with ${patterns.join(", ")} files` : "";
- const descSuffix = pkgDescription ? ` (${pkgDescription.replace(/\.?\s*$/, "")})` : "";
+ const rawDesc = pkgDescription?.replace(/[<>]/g, "").replace(/\.?\s*$/, "");
+ const cleanDesc = rawDesc && rawDesc.length > 200 ? `${rawDesc.slice(0, 197)}...` : rawDesc;
 const editHint = globHint ? `editing${globHint} or code importing` : "writing code importing";
 let desc;
 if (packages && packages.length > 1) {
@@ -1073,14 +1090,17 @@ function generateFrontmatter({ name, version, description: pkgDescription, globs
 allKeywords.add(pkg.name);
 for (const kw of expandPackageName(pkg.name)) allKeywords.add(kw);
 }
- desc = `ALWAYS use when ${editHint} ${importList}. Consult for debugging, best practices, or modifying ${[...allKeywords].join(", ")}.${descSuffix}`;
+ const keywordList = [...allKeywords].join(", ");
+ desc = `${cleanDesc ? `${cleanDesc}. ` : ""}ALWAYS use when ${editHint} ${importList}. Consult for debugging, best practices, or modifying ${keywordList}.`;
 } else {
 const allKeywords = /* @__PURE__ */ new Set();
 allKeywords.add(name);
 for (const kw of expandPackageName(name)) allKeywords.add(kw);
 if (repoUrl) for (const kw of expandRepoName(repoUrl)) allKeywords.add(kw);
- desc = `ALWAYS use when ${editHint} "${name}". Consult for debugging, best practices, or modifying ${[...allKeywords].join(", ")}.${descSuffix}`;
+ const nameList = [...allKeywords].join(", ");
+ desc = `${cleanDesc ? `${cleanDesc}. ` : ""}ALWAYS use when ${editHint} "${name}". Consult for debugging, best practices, or modifying ${nameList}.`;
 }
+ if (desc.length > 1024) desc = `${desc.slice(0, 1021)}...`;
 const lines = [
 "---",
 `name: ${dirName ?? sanitizeName(name)}`,
@@ -1161,6 +1181,8 @@ function buildArgs$2(model, skillDir, symlinkDirs) {
 `Write(${skilldDir}/**)`,
 `Bash(*skilld search*)`
 ].join(" "),
+ "--disallowedTools",
+ "WebSearch WebFetch Task",
 "--add-dir",
 skillDir,
 ...symlinkDirs.flatMap((d) => ["--add-dir", d]),
@@ -1174,7 +1196,6 @@ function parseLine$2(line) {
 const evt = obj.event;
 if (!evt) return {};
 if (evt.type === "content_block_delta" && evt.delta?.type === "text_delta") return { textDelta: evt.delta.text };
- if (evt.type === "content_block_start" && evt.content_block?.type === "tool_use") return { toolName: evt.content_block.name };
 return {};
 }
 if (obj.type === "assistant" && obj.message?.content) {
@@ -1350,6 +1371,71 @@ function parseLine(line) {
 } catch {}
 return {};
 }
+ const TOOL_VERBS = {
+ Read: "Reading",
+ Glob: "Searching",
+ Grep: "Searching",
+ Write: "Writing",
+ Bash: "Running",
+ read_file: "Reading",
+ glob_tool: "Searching",
+ write_file: "Writing"
+ };
+ function createToolProgress(log) {
+ const pending = /* @__PURE__ */ new Map();
+ let timer = null;
+ let lastEmitted = "";
+ function flush() {
+ const parts = [];
+ for (const [section, { verb, path, count }] of pending) {
+ const suffix = count > 1 ? ` \x1B[90m(+${count - 1})\x1B[0m` : "";
+ parts.push(`\x1B[90m[${section}]\x1B[0m ${verb} ${path}${suffix}`);
+ }
+ const msg = parts.join(" ");
+ if (msg && msg !== lastEmitted) {
+ log.message(msg);
+ lastEmitted = msg;
+ }
+ pending.clear();
+ timer = null;
+ }
+ return ({ type, chunk, section }) => {
+ if (type === "text") {
+ log.message(`${section ? `\x1B[90m[${section}]\x1B[0m ` : ""}Writing...`);
+ return;
+ }
+ if (type !== "reasoning" || !chunk.startsWith("[")) return;
+ const key = section ?? "";
+ const match = chunk.match(/^\[(\w+)(?:,\s\w+)*(?::\s(.+))?\]$/);
+ if (!match) return;
+ const rawName = match[1];
+ const hint = match[2] ?? "";
+ let verb = TOOL_VERBS[rawName] ?? rawName;
+ let path = hint || "...";
+ if (rawName === "Bash" && hint) {
+ const searchMatch = hint.match(/skilld search\s+"([^"]+)"/);
+ if (searchMatch) {
+ verb = "skilld search:";
+ path = searchMatch[1];
+ } else path = hint.length > 60 ? `${hint.slice(0, 57)}...` : hint;
+ } else path = shortenPath(path);
+ if (rawName === "Write") {
+ if (timer) flush();
+ const prefix = section ? `\x1B[90m[${section}]\x1B[0m ` : "";
+ log.message(`${prefix}Writing ${path}`);
+ return;
+ }
+ const entry = mapInsert(pending, key, () => ({
+ verb,
+ path,
+ count: 0
+ }));
+ entry.verb = verb;
+ entry.path = path;
+ entry.count++;
+ if (!timer) timer = setTimeout(flush, 400);
+ };
+ }
 const CLI_DEFS = [
 claude_exports,
 gemini_exports,
@@ -1403,7 +1489,7 @@ async function getAvailableModels() {
 function resolveReferenceDirs(skillDir) {
 const refsDir = join(skillDir, ".skilld");
 if (!existsSync(refsDir)) return [];
- return readdirSync(refsDir).map((entry) => join(refsDir, entry)).filter((p) => lstatSync(p).isSymbolicLink()).map((p) => realpathSync(p));
+ return readdirSync(refsDir).map((entry) => join(refsDir, entry)).filter((p) => lstatSync(p).isSymbolicLink() && existsSync(p)).map((p) => realpathSync(p));
 }
 const CACHE_DIR = join(homedir(), ".skilld", "llm-cache");
 function normalizePromptForHash(prompt) {
@@ -1492,7 +1578,7 @@ function optimizeSection(opts) {
 if (evt.fullText) accumulatedText = evt.fullText;
 if (evt.writeContent) lastWriteContent = evt.writeContent;
 if (evt.toolName) {
- const hint = evt.toolHint ? `[${evt.toolName}: ${shortenPath(evt.toolHint)}]` : `[${evt.toolName}]`;
+ const hint = evt.toolHint ? `[${evt.toolName}: ${evt.toolHint}]` : `[${evt.toolName}]`;
 onProgress?.({
 chunk: hint,
 type: "reasoning",
@@ -1759,6 +1845,7 @@ function cleanSectionOutput(content) {
 if (/\b(?:function|const |let |var |export |return |import |async |class )\b/.test(preamble)) cleaned = cleaned.slice(firstMarker.index).trim();
 }
 cleaned = sanitizeMarkdown(cleaned);
+ if (!/^##\s/m.test(cleaned) && !/⚠️|✅|✨/.test(cleaned)) return "";
 return cleaned;
 }
 const NUXT_CONFIG_FILES = [
@@ -1910,6 +1997,6 @@ function isNodeBuiltin(pkg) {
 const base = pkg.startsWith("node:") ? pkg.slice(5) : pkg;
 return NODE_BUILTINS.has(base.split("/")[0]);
 }
- export { detectTargetAgent as _, optimizeDocs as a, installSkillForAgents as c, unlinkSkillFromAgents as d, SECTION_MERGE_ORDER as f, detectInstalledAgents as g, buildSectionPrompt as h, getModelName as i, linkSkillToAgents as l, buildAllSectionPrompts as m, getAvailableModels as n, generateSkillMd as o, SECTION_OUTPUT_FILES as p, getModelLabel as r, computeSkillDirName as s, detectImportedPackages as t, sanitizeName as u, getAgentVersion as v, targets as y };
+ export { targets as S, maxItems as _, getModelName as a, detectTargetAgent as b, computeSkillDirName as c, sanitizeName as d, unlinkSkillFromAgents as f, buildSectionPrompt as g, buildAllSectionPrompts as h, getModelLabel as i, installSkillForAgents as l, SECTION_OUTPUT_FILES as m, createToolProgress as n, optimizeDocs as o, SECTION_MERGE_ORDER as p, getAvailableModels as r, generateSkillMd as s, detectImportedPackages as t, linkSkillToAgents as u, maxLines as v, getAgentVersion as x, detectInstalledAgents as y };
 
 //# sourceMappingURL=detect-imports.mjs.map
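
The most notable behavioral change in the compiled diff above is the new per-section prompt budget. As a readability aid only, here is a hand-restated TypeScript sketch of that scaling logic; the constants and formula are copied from the bundled output, while the source layout and type annotations are assumptions (the package ships only the minified build).

```ts
// Hypothetical restatement of the bundled budget helpers shown in the diff above.
// The scale factors (1 / 0.85 / 0.7 / 0.6) and the max/round formula are taken
// verbatim from the compiled output; identifier names match the bundle.

function budgetScale(sectionCount?: number): number {
  if (!sectionCount || sectionCount <= 1) return 1
  if (sectionCount === 2) return 0.85
  if (sectionCount === 3) return 0.7
  return 0.6
}

function maxLines(min: number, max: number, sectionCount?: number): number {
  return Math.max(min, Math.round(max * budgetScale(sectionCount)))
}

function maxItems(min: number, max: number, sectionCount?: number): number {
  return Math.max(min, Math.round(max * budgetScale(sectionCount)))
}

// Example: with 3 enabled sections, the "API Changes" rule becomes
// maxItems(6, 12, 3) === 8 items and maxLines(50, 80, 3) === 56 lines.
```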