@hi-man/himan 0.3.0 → 0.3.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +21 -0
- package/README.md +4 -4
- package/dist/adapters/git/repo-manager.js +21 -3
- package/dist/adapters/resource/resource-scanner.js +73 -18
- package/dist/adapters/source/git-source-adapter.js +176 -24
- package/dist/cli/source-commands.js +6 -0
- package/dist/services/index.js +79 -24
- package/docs/development.md +4 -3
- package/docs/error-codes.md +2 -2
- package/docs/mvp/README.md +3 -3
- package/docs/mvp/create-resource.md +2 -5
- package/docs/mvp/impl.md +3 -3
- package/package.json +5 -4
package/CHANGELOG.md
CHANGED
@@ -6,6 +6,27 @@ The format is based on Keep a Changelog, and this project follows semver for the
 
 ## [Unreleased]
 
+## [0.3.1] - 2026-05-07
+
+### Added
+
+- Added the `common-project-changelog` skill to enforce changelog and version history placement rules.
+- Added `scripts/release-changelog.mjs` so package version scripts release `[Unreleased]` changelog entries into the new version section.
+
+### Changed
+
+- Changed `himan source init-docs --force` to list existing source resources in generated docs.
+- Changed `himan source init-docs` to commit and push generated source docs when files changed.
+- Changed generated source docs to show the latest tagged resource version when one exists.
+- Changed `himan publish` to allow resources without `himan.yaml` when their default entry file exists.
+- Changed resource discovery to infer `rule`, `command`, and `skill` resources from default entry files when `himan.yaml` is absent.
+- Changed `himan publish` to reinstall the published version in copy mode, update the lock file, and remove the resource dev directory.
+- Changed package version scripts to archive `[Unreleased]` changelog entries after version bumps.
+
+### Fixed
+
+- Fixed `himan source init-docs --force` so existing Codex-style skills with `SKILL.md` front matter are included in generated source docs.
+
 ## [0.3.0] - 2026-05-07
 
 ### Added
package/README.md
CHANGED
@@ -91,10 +91,10 @@ your-himan-source/
 - `README.md`: entry document for the source repository; recommended content includes the resource directory layout, suggested installation methods, the default agent policy, an index of common resources, and maintenance conventions.
 - `CHANGELOG.md`: source-repository-level change log; recommended content includes added, changed, deprecated, and removed resources, plus notes for important releases.
 - `rules/`, `commands/`, `skills/`: grouped by resource type; each subdirectory is one himan resource.
-- `himan.yaml
-- `content.md` / `SKILL.md
+- `himan.yaml`: optional resource metadata; when present, himan uses it for scanning, validation, and for reading the entry file and default agents.
+- `content.md` / `SKILL.md`: the resource's main entry; without `himan.yaml`, `rule` / `command` resources default to `content.md` and `skill` resources default to `SKILL.md`.
 
-Run `himan source init-docs` to generate root doc templates for the current default source; by default it only creates missing files, and `--force`
+Run `himan source init-docs` to generate root doc templates for the current default source. By default it only creates missing files; `--force` overwrites existing `README.md` / `CHANGELOG.md` and folds the source's existing `rule`, `command`, and `skill` resources into the README resource index and the initial CHANGELOG entries. Resource references prefer the latest semver version found in Git tags; resources that do not yet have `himan.yaml` are recognized by their default entry file, and skills additionally read the `skills/<name>/SKILL.md` front matter. `--dry-run` previews the result. When files actually change, the command commits and pushes to the current Git source.
 
 `himan create` and `himan publish` automatically maintain the source root docs:
 
@@ -161,7 +161,7 @@ your-himan-source/
 Note: resource and project commands all use `--agent` to specify the target Agent.
 If `--agent` is not passed explicitly, `create` / `install` pick the most suitable of the current project default agent, the global default agent, the resource metadata, and the built-in default `cursor`; `dev` prefers the agent recorded in the lock file.
 
-`publish` prefers the matching directory under the project's `.himan/dev`
+`publish` prefers the matching directory under the project's `.himan/dev`, falling back to the matching directory in the source repository. If the resource directory contains `himan.yaml`, the metadata and entry file are validated before publishing; without `himan.yaml`, minimal metadata is inferred from the default entry file and published, and no `himan.yaml` is force-created. Publishing requires pushable Git permissions. The publish commit includes the resource directory plus the automatically maintained source-root `README.md` / `CHANGELOG.md`. After a successful publish, the new version is reinstalled from the store into the project targets in `copy` mode, the lock file is updated, and the corresponding `.himan/dev/<type>/<name>` dev directory is removed.
 
 In `--json` mode, failures emit machine-readable error JSON on `stderr`. Error codes are defined in [docs/error-codes.md](./docs/error-codes.md).
 
package/dist/adapters/git/repo-manager.js
CHANGED

@@ -48,6 +48,20 @@ export class RepoManager {
     }
     async commitTagAndPush(repoDir, message, tag, branch, paths = ["."]) {
         const git = simpleGit(repoDir);
+        await this.commitChanges(git, message, paths, true);
+        await git.addTag(tag);
+        await this.pushCurrentBranch(git, branch);
+        await git.pushTags("origin");
+    }
+    async commitAndPush(repoDir, message, branch, paths = ["."]) {
+        const git = simpleGit(repoDir);
+        const committed = await this.commitChanges(git, message, paths, false);
+        if (!committed)
+            return false;
+        await this.pushCurrentBranch(git, branch);
+        return true;
+    }
+    async commitChanges(git, message, paths, requireChanges) {
         const pathspecs = paths.length > 0 ? paths : ["."];
         await git.add(pathspecs);
         const stagedFiles = await git.raw([

@@ -58,14 +72,18 @@ export class RepoManager {
             ...pathspecs,
         ]);
         if (!stagedFiles.trim()) {
-
+            if (requireChanges) {
+                throw new HimanError(errorCodes.PUBLISH_NO_CHANGES, "No changes to publish.");
+            }
+            return false;
         }
         await git.commit(message, pathspecs);
-
+        return true;
+    }
+    async pushCurrentBranch(git, branch) {
         const currentBranch = (await git.raw(["rev-parse", "--abbrev-ref", "HEAD"])).trim();
         const targetBranch = branch ?? currentBranch;
         await git.push("origin", targetBranch);
-        await git.pushTags("origin");
     }
     async exists(targetPath) {
         try {
package/dist/adapters/resource/resource-scanner.js
CHANGED

@@ -15,28 +15,80 @@ export class ResourceScanner {
         const result = [];
         for (const resourceDir of resourceDirs) {
             const yamlPath = path.join(baseDir, resourceDir.name, "himan.yaml");
-            if (
+            if (await this.exists(yamlPath)) {
+                const raw = await fs.readFile(yamlPath, "utf8");
+                const parsed = YAML.parse(raw);
+                if (!parsed)
+                    continue;
+                if (parsed.type !== type)
+                    continue;
+                if (!parsed.name || !parsed.entry)
+                    continue;
+                result.push({
+                    name: parsed.name,
+                    type,
+                    entry: parsed.entry,
+                    description: parsed.description,
+                    agents: Array.isArray(parsed.agents)
+                        ? (parsed.agents ?? [])
+                        : (parsed.targets ?? []),
+                });
                 continue;
-
-            const
-            if (
-
-            if (parsed.type !== type)
-                continue;
-            if (!parsed.name || !parsed.entry)
-                continue;
-            result.push({
-                name: parsed.name,
-                type,
-                entry: parsed.entry,
-                description: parsed.description,
-                agents: Array.isArray(parsed.agents)
-                    ? (parsed.agents ?? [])
-                    : (parsed.targets ?? []),
-            });
+            }
+            const inferred = await this.inferResourceMeta(path.join(baseDir, resourceDir.name), resourceDir.name, type);
+            if (inferred)
+                result.push(inferred);
         }
         return result;
     }
+    async inferResourceMeta(resourceDir, dirName, type) {
+        const entry = this.getDefaultEntry(type);
+        const entryPath = path.join(resourceDir, entry);
+        if (!(await this.exists(entryPath)))
+            return undefined;
+        const metadata = type === "skill" ? await this.readSkillFrontMatter(entryPath) : null;
+        return {
+            name: this.readStringMetadata(metadata, "name") ?? dirName,
+            type,
+            entry,
+            description: this.readStringMetadata(metadata, "description"),
+            agents: this.readStringArrayMetadata(metadata, "agents") ??
+                this.readStringArrayMetadata(metadata, "targets") ??
+                [],
+        };
+    }
+    async readSkillFrontMatter(skillPath) {
+        const raw = await fs.readFile(skillPath, "utf8");
+        const match = /^---\r?\n([\s\S]*?)\r?\n---(?:\r?\n|$)/.exec(raw.trimStart());
+        if (!match)
+            return null;
+        try {
+            const parsed = YAML.parse(match[1]);
+            return typeof parsed === "object" && parsed !== null && !Array.isArray(parsed)
+                ? parsed
+                : null;
+        }
+        catch {
+            return null;
+        }
+    }
+    readStringMetadata(metadata, key) {
+        const value = metadata?.[key];
+        if (typeof value !== "string")
+            return undefined;
+        const trimmed = value.trim();
+        return trimmed ? trimmed : undefined;
+    }
+    readStringArrayMetadata(metadata, key) {
+        const value = metadata?.[key];
+        if (!Array.isArray(value))
+            return undefined;
+        const items = value
+            .filter((item) => typeof item === "string")
+            .map((item) => item.trim())
+            .filter(Boolean);
+        return items.length > 0 ? items : undefined;
+    }
     async exists(targetPath) {
         try {
             await fs.access(targetPath);

@@ -53,4 +105,7 @@ export class ResourceScanner {
         return "commands";
         return "skills";
     }
+    getDefaultEntry(type) {
+        return type === "skill" ? "SKILL.md" : "content.md";
+    }
 }
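The scanner, the source adapter, and the services layer all reuse the same front-matter regex, `/^---\r?\n([\s\S]*?)\r?\n---(?:\r?\n|$)/`, to pull metadata out of a `SKILL.md` without requiring `himan.yaml`. A small runnable sketch of what that regex matches; `parseSimpleYaml` is a hypothetical stand-in for `YAML.parse` (flat `key: value` pairs only) so the snippet runs without the yaml dependency:

```javascript
// Same front-matter regex as readSkillFrontMatter in the diff above.
const FRONT_MATTER = /^---\r?\n([\s\S]*?)\r?\n---(?:\r?\n|$)/;

// Stand-in for YAML.parse (assumption: flat `key: value` lines only).
function parseSimpleYaml(block) {
  const out = {};
  for (const line of block.split(/\r?\n/)) {
    const idx = line.indexOf(":");
    if (idx > 0) out[line.slice(0, idx).trim()] = line.slice(idx + 1).trim();
  }
  return out;
}

function readFrontMatter(raw) {
  // trimStart mirrors the package's tolerance for leading whitespace/BOM-ish noise
  const match = FRONT_MATTER.exec(raw.trimStart());
  return match ? parseSimpleYaml(match[1]) : null;
}

const skill = readFrontMatter("---\nname: demo-skill\ndescription: Example skill\n---\n# Demo\n");
// skill → { name: "demo-skill", description: "Example skill" }
```

A file with no leading `---` block yields `null`, which is why entry-only resources without front matter still get inferred with the directory name as `name`.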
package/dist/adapters/source/git-source-adapter.js
CHANGED

@@ -27,7 +27,7 @@ export class GitSourceAdapter {
         const repoId = this.sourceConfig?.repoId ?? "default";
         const typeDir = this.getTypeDir(type);
         const baseDir = path.join(repoDir, typeDir);
-        const metadataHash = await this.getResourceMetadataHash(baseDir);
+        const metadataHash = await this.getResourceMetadataHash(baseDir, type);
         const cached = await this.indexStore.get(repoId, type);
         if (cached && cached.metadataHash === metadataHash) {
             return cached.resources;

@@ -51,20 +51,22 @@ export class GitSourceAdapter {
     async publish(type, name, version, sourceDir) {
         const repoDir = this.getRepoDir();
         const targetDir = path.join(repoDir, `${type}s`, name);
-        const
+        const metadataResult = await this.validatePublishResource(type, name, sourceDir);
         const sameDir = await this.isSameDirectory(sourceDir, targetDir);
         if (!sameDir) {
             await fs.rm(targetDir, { recursive: true, force: true });
             await fs.mkdir(path.dirname(targetDir), { recursive: true });
             await fs.cp(sourceDir, targetDir, { recursive: true });
         }
-
-
-
+        if (metadataResult.shouldWriteMetadata) {
+            const yamlPath = path.join(targetDir, "himan.yaml");
+            const metadata = { ...metadataResult.metadata, version };
+            await fs.writeFile(yamlPath, YAML.stringify(metadata), "utf8");
+        }
         const docsPaths = await this.maintainSourceDocs(repoDir, {
             section: "Changed",
             line: `- Published \`${type}/${name}@${version}\`.`,
-        });
+        }, new Map([[this.getResourceVersionOverrideKey(type, name), version]]));
         const tag = `${type}/${name}@${version}`;
         await this.repoManager.commitTagAndPush(repoDir, `publish ${type}/${name}@${version}`, tag, undefined, [
             path.relative(repoDir, targetDir),

@@ -118,10 +120,11 @@ export class GitSourceAdapter {
             },
             {
                 path: path.join(repoDir, "CHANGELOG.md"),
-                content: this.buildChangelogContent(),
+                content: await this.buildChangelogContent(repoDir),
             },
         ];
         const results = [];
+        const changedPaths = [];
         for (const file of files) {
             const exists = await this.exists(file.path);
             const action = exists ? (options.force ? "updated" : "skipped") : "created";

@@ -129,12 +132,17 @@ export class GitSourceAdapter {
             results.push({ path: file.path, action, reason });
             if (!options.dryRun && action !== "skipped") {
                 await fs.writeFile(file.path, file.content, "utf8");
+                changedPaths.push(path.relative(repoDir, file.path));
             }
         }
+        const committed = !options.dryRun &&
+            changedPaths.length > 0 &&
+            (await this.repoManager.commitAndPush(repoDir, "docs: init source docs", undefined, changedPaths));
         return {
             sourceDir: repoDir,
             files: results,
             dryRun: Boolean(options.dryRun),
+            committed,
         };
     }
     getRepoDir() {

@@ -164,7 +172,10 @@ export class GitSourceAdapter {
     async validatePublishResource(type, name, resourceDir) {
         const yamlPath = path.join(resourceDir, "himan.yaml");
         if (!(await this.exists(yamlPath))) {
-
+            return {
+                metadata: await this.inferPublishResourceMetadata(type, name, resourceDir),
+                shouldWriteMetadata: false,
+            };
         }
         const raw = await fs.readFile(yamlPath, "utf8");
         let parsed;

@@ -210,11 +221,45 @@ export class GitSourceAdapter {
             throw this.invalidResourceMetadata(type, name, `Resource entry is not a file: ${entry}`, { yamlPath, entry, entryPath });
         }
         return {
-
+            metadata: {
+                ...parsed,
+                name,
+                type,
+                entry,
+            },
+            shouldWriteMetadata: true,
+        };
+    }
+    async inferPublishResourceMetadata(type, name, resourceDir) {
+        const entry = this.getDefaultEntry(type);
+        const entryPath = path.join(resourceDir, entry);
+        let entryStat;
+        try {
+            entryStat = await fs.stat(entryPath);
+        }
+        catch (error) {
+            if (!this.isNotFoundError(error)) {
+                throw error;
+            }
+            throw this.invalidResourceMetadata(type, name, `Missing himan.yaml and default entry file for publish: ${entry}`, { yamlPath: path.join(resourceDir, "himan.yaml"), entry, entryPath });
+        }
+        if (!entryStat.isFile()) {
+            throw this.invalidResourceMetadata(type, name, `Default resource entry is not a file: ${entry}`, { entry, entryPath });
+        }
+        const frontMatter = type === "skill" ? await this.readSkillFrontMatter(entryPath) : null;
+        const metadata = {
             name,
             type,
             entry,
         };
+        const description = this.readStringMetadata(frontMatter, "description");
+        if (description)
+            metadata.description = description;
+        const agents = this.readStringArrayMetadata(frontMatter, "agents") ??
+            this.readStringArrayMetadata(frontMatter, "targets");
+        if (agents)
+            metadata.agents = agents;
+        return metadata;
     }
     invalidResourceMetadata(type, name, message, details) {
         return new HimanError(errorCodes.INVALID_RESOURCE_METADATA, `Invalid metadata for ${type}/${name}: ${message}`, details);

@@ -229,7 +274,7 @@ export class GitSourceAdapter {
         return "commands";
         return "skills";
     }
-    async getResourceMetadataHash(baseDir) {
+    async getResourceMetadataHash(baseDir, type) {
         const hash = createHash("sha256");
         hash.update("himan-resource-index-v1");
         if (!(await this.exists(baseDir))) {

@@ -255,6 +300,18 @@ export class GitSourceAdapter {
                 throw error;
             }
             hash.update("\0yaml-missing");
+            const entryPath = path.join(baseDir, resourceDirName, this.getDefaultEntry(type));
+            try {
+                const raw = await fs.readFile(entryPath);
+                hash.update("\0entry:");
+                hash.update(raw);
+            }
+            catch (entryError) {
+                if (!this.isNotFoundError(entryError)) {
+                    throw entryError;
+                }
+                hash.update("\0entry-missing");
+            }
         }
     }
         return hash.digest("hex");

@@ -277,8 +334,8 @@ export class GitSourceAdapter {
     }
         return `# ${name}\n\nDescribe skill workflow here.\n`;
     }
-    async buildReadmeContent(repoDir) {
-        const resourceLines = await this.buildResourceIndex(repoDir);
+    async buildReadmeContent(repoDir, versionOverrides = new Map()) {
+        const resourceLines = await this.buildResourceIndex(repoDir, versionOverrides);
         const repo = this.sourceConfig?.repo ?? "<git_url>";
         return [
             `# ${this.getSourceTitle()}`,

@@ -309,7 +366,8 @@ export class GitSourceAdapter {
             "",
         ].join("\n");
     }
-    buildChangelogContent() {
+    async buildChangelogContent(repoDir) {
+        const resourceLines = await this.buildExistingResourceChangelogLines(repoDir);
         return [
             "# Changelog",
             "",

@@ -320,23 +378,21 @@ export class GitSourceAdapter {
             "### Added",
             "",
             "- Initial source README/CHANGELOG scaffold.",
+            ...resourceLines,
             "",
         ].join("\n");
     }
-    async buildResourceIndex(repoDir) {
+    async buildResourceIndex(repoDir, versionOverrides = new Map()) {
         const sections = [];
         for (const type of RESOURCE_TYPES) {
-            const resources =
+            const resources = await this.collectResourceDocsItems(repoDir, type);
             sections.push(`### ${this.getTypeLabel(type)}`, "");
             if (resources.length === 0) {
                 sections.push(`- No ${type} resources yet.`, "");
                 continue;
             }
             for (const resource of resources) {
-                const
-                const ref = version
-                    ? `${resource.type}/${resource.name}@${version}`
-                    : `${resource.type}/${resource.name}`;
+                const ref = await this.getResourceRef(repoDir, resource.type, resource.name, versionOverrides);
                 sections.push(`- \`${ref}\`${resource.description ? `: ${resource.description}` : ""}`);
             }
             sections.push("");

@@ -346,21 +402,98 @@ export class GitSourceAdapter {
         }
         return sections;
     }
-    async
-        const
+    async buildExistingResourceChangelogLines(repoDir) {
+        const lines = [];
+        for (const type of RESOURCE_TYPES) {
+            const resources = await this.collectResourceDocsItems(repoDir, type);
+            for (const resource of resources) {
+                const ref = await this.getResourceRef(repoDir, resource.type, resource.name);
+                lines.push(`- Documented existing resource \`${ref}\`.`);
+            }
+        }
+        return lines;
+    }
+    async collectResourceDocsItems(repoDir, type) {
+        const resources = await this.scanner.scanByType(repoDir, type);
+        const items = resources.map((resource) => ({
+            name: resource.name,
+            type: resource.type,
+            description: resource.description,
+        }));
+        const managedNames = new Set(resources.map((resource) => resource.name));
+        items.push(...(await this.scanEntryBasedDocsItems(repoDir, type, managedNames)));
+        return items.sort((a, b) => a.name.localeCompare(b.name));
+    }
+    async scanEntryBasedDocsItems(repoDir, type, managedNames) {
+        const baseDir = path.join(repoDir, this.getTypeDir(type));
+        if (!(await this.exists(baseDir)))
+            return [];
+        const entries = await fs.readdir(baseDir, { withFileTypes: true });
+        const items = [];
+        for (const entry of entries) {
+            if (!entry.isDirectory())
+                continue;
+            const resourceEntry = this.getDefaultEntry(type);
+            const entryPath = path.join(baseDir, entry.name, resourceEntry);
+            if (!(await this.exists(entryPath)))
+                continue;
+            const metadata = type === "skill" ? await this.readSkillFrontMatter(entryPath) : null;
+            const name = this.readStringMetadata(metadata, "name") ?? entry.name;
+            if (managedNames.has(name) || managedNames.has(entry.name))
+                continue;
+            items.push({
+                name,
+                type,
+                description: this.readStringMetadata(metadata, "description"),
+            });
+        }
+        return items;
+    }
+    async readSkillFrontMatter(skillPath) {
+        const raw = await fs.readFile(skillPath, "utf8");
+        const match = /^---\r?\n([\s\S]*?)\r?\n---(?:\r?\n|$)/.exec(raw.trimStart());
+        if (!match)
+            return null;
+        try {
+            const parsed = YAML.parse(match[1]);
+            return this.isRecord(parsed) ? parsed : null;
+        }
+        catch {
+            return null;
+        }
+    }
+    readStringMetadata(metadata, key) {
+        const value = metadata?.[key];
+        if (typeof value !== "string")
+            return undefined;
+        const trimmed = value.trim();
+        return trimmed ? trimmed : undefined;
+    }
+    readStringArrayMetadata(metadata, key) {
+        const value = metadata?.[key];
+        if (!Array.isArray(value))
+            return undefined;
+        const items = value
+            .filter((item) => typeof item === "string")
+            .map((item) => item.trim())
+            .filter(Boolean);
+        return items.length > 0 ? items : undefined;
+    }
+    async maintainSourceDocs(repoDir, changelogEntry, versionOverrides = new Map()) {
+        const readmePath = await this.updateReadmeResourceIndex(repoDir, versionOverrides);
         const changelogPath = await this.updateChangelog(repoDir, changelogEntry);
         return [readmePath, changelogPath];
     }
-    async updateReadmeResourceIndex(repoDir) {
+    async updateReadmeResourceIndex(repoDir, versionOverrides = new Map()) {
         const readmePath = path.join(repoDir, "README.md");
         if (!(await this.exists(readmePath))) {
-            await fs.writeFile(readmePath, await this.buildReadmeContent(repoDir), "utf8");
+            await fs.writeFile(readmePath, await this.buildReadmeContent(repoDir, versionOverrides), "utf8");
             return readmePath;
         }
         const current = await fs.readFile(readmePath, "utf8");
         const resourceSection = [
             README_RESOURCES_START,
-            ...(await this.buildResourceIndex(repoDir)),
+            ...(await this.buildResourceIndex(repoDir, versionOverrides)),
             README_RESOURCES_END,
         ].join("\n");
         const updated = this.replaceOrAppendReadmeResourceSection(current, resourceSection);

@@ -461,6 +594,25 @@ export class GitSourceAdapter {
             return undefined;
         }
     }
+    async getResourceRef(repoDir, type, name, versionOverrides = new Map()) {
+        const version = versionOverrides.get(this.getResourceVersionOverrideKey(type, name)) ??
+            (await this.readLatestTaggedResourceVersion(repoDir, type, name)) ??
+            (await this.readResourceVersion(repoDir, type, name));
+        return this.formatResourceRef(type, name, version);
+    }
+    getResourceVersionOverrideKey(type, name) {
+        return `${type}/${name}`;
+    }
+    async readLatestTaggedResourceVersion(repoDir, type, name) {
+        const versions = (await this.repoManager.listTags(repoDir, `${type}/${name}@*`))
+            .map((tag) => tag.split("@").at(1) ?? "")
+            .filter((version) => semver.valid(version))
+            .sort(semver.rcompare);
+        return versions.at(0);
+    }
+    formatResourceRef(type, name, version) {
+        return version ? `${type}/${name}@${version}` : `${type}/${name}`;
+    }
     getSourceTitle() {
         const repo = this.sourceConfig?.repo?.replace(/\/$/, "");
         const repoName = repo?.split(/[/:]/).at(-1)?.replace(/\.git$/, "");
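The reference-resolution precedence added above is: an explicit override in the `versionOverrides` map (e.g. the version being published right now) wins over the newest `type/name@x.y.z` Git tag, which in turn wins over the version read from the repo. A runnable sketch of that lookup; `compareSemver` is a simplified stand-in for the `semver` package (plain numeric `x.y.z` only), and `resolveRef` is a hypothetical condensation of `getResourceRef` / `readLatestTaggedResourceVersion` / `formatResourceRef`, not the package API:

```javascript
// Simplified stand-in for semver compare (assumption: numeric x.y.z only).
function compareSemver(a, b) {
  const pa = a.split(".").map(Number);
  const pb = b.split(".").map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] - pb[i];
  }
  return 0;
}

// Override map first, then the highest matching `type/name@version` tag,
// then the bare ref when no version is known.
function resolveRef(type, name, tags, overrides = new Map()) {
  const key = `${type}/${name}`;
  const tagged = tags
    .filter((tag) => tag.startsWith(`${key}@`))
    .map((tag) => tag.split("@").at(1) ?? "")
    .sort(compareSemver)
    .at(-1);
  const version = overrides.get(key) ?? tagged;
  return version ? `${key}@${version}` : key;
}

const tags = ["rule/base@0.1.0", "rule/base@0.2.0", "skill/x@1.0.0"];
const ref = resolveRef("rule", "base", tags); // "rule/base@0.2.0"
```

During publish the adapter seeds the override map with the version it is about to tag, so the generated docs already reference the new version before the tag exists.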
package/dist/cli/source-commands.js
CHANGED

@@ -75,6 +75,12 @@ export function registerSourceCommands(command, services, options) {
         for (const file of result.files) {
             process.stdout.write(`- ${file.action} ${file.path}${file.reason ? ` (${file.reason})` : ""}\n`);
         }
+        if (result.committed) {
+            process.stdout.write("Committed and pushed source docs changes.\n");
+        }
+        else if (!result.dryRun) {
+            process.stdout.write("No source docs changes to commit.\n");
+        }
         });
     });
 }
package/dist/services/index.js
CHANGED
|
@@ -198,6 +198,7 @@ export class ServiceFactory {
|
|
|
198
198
|
async publish(type, name, releaseType, projectDir) {
|
|
199
199
|
const source = await this.loadSourceFromConfig();
|
|
200
200
|
const sourceDir = await this.resolvePublishSourceDir(type, name, projectDir);
|
|
201
|
+
const existingInstallInfo = await this.tryResolveInstalledResource(projectDir, type, name);
|
|
201
202
|
const history = await source.history(type, name);
|
|
202
203
|
const latest = history[0]?.version ?? "0.0.0";
|
|
203
204
|
const nextVersion = this.versions.nextVersion(latest, releaseType);
|
|
@@ -208,28 +209,31 @@ export class ServiceFactory {
|
|
|
208
209
|
if (!(await this.exists(storePath))) {
|
|
209
210
|
await source.pull(type, name, nextVersion, storePath);
|
|
210
211
|
}
|
|
211
|
-
const agentsFromMeta = normalizeAgents((await this.readResourceMetaFromDir(storePath))?.agents);
|
|
212
212
|
const locked = await this.getLockedResource(projectDir, type, name);
|
|
213
|
+
const resourceMeta = await this.readResourceMetaFromDir(storePath, type);
|
|
214
|
+
const configuredAgents = await this.getConfiguredAgents(projectDir);
|
|
213
215
|
const nextAgents = locked?.agents?.length
|
|
214
216
|
? normalizeAgents(locked.agents)
|
|
215
|
-
:
|
|
216
|
-
|
|
217
|
+
: existingInstallInfo?.agents.length
|
|
218
|
+
? normalizeAgents(existingInstallInfo.agents)
|
|
219
|
+
: configuredAgents ?? normalizeAgents(resourceMeta?.agents);
|
|
220
|
+
const installMode = "copy";
|
|
217
221
|
const linkPaths = getProjectResourcePaths(projectDir, type, name, nextAgents);
|
|
218
222
|
for (const linkPath of linkPaths) {
|
|
219
|
-
|
|
220
|
-
await this.materializeResource(storePath, linkPath, installMode);
|
|
221
|
-
}
|
|
222
|
-
}
|
|
223
|
-
if (locked) {
|
|
224
|
-
const sourceInfo = await this.getLockSourceInfo();
|
|
225
|
-
await this.lockStore.upsertResource(projectDir, sourceInfo, {
|
|
226
|
-
type,
|
|
227
|
-
name,
|
|
228
|
-
version: nextVersion,
|
|
229
|
-
agents: nextAgents,
|
|
230
|
-
mode: installMode,
|
|
231
|
-
});
|
|
223
|
+
await this.materializeResource(storePath, linkPath, installMode);
|
|
232
224
|
}
|
|
225
|
+
const sourceInfo = await this.getLockSourceInfo();
|
|
226
|
+
await this.lockStore.upsertResource(projectDir, sourceInfo, {
|
|
227
|
+
type,
|
|
228
|
+
name,
|
|
229
|
+
version: nextVersion,
|
|
230
|
+
agents: nextAgents,
|
|
231
|
+
mode: installMode,
|
|
232
|
+
});
|
|
233
|
+
await fs.rm(this.getProjectDevPath(projectDir, type, name), {
|
|
234
|
+
recursive: true,
|
|
235
|
+
force: true,
|
|
236
|
+
});
|
|
233
237
|
return { type, name, version: result.version, tag: result.tag };
|
|
234
238
|
}
|
|
235
239
|
async create(type, name, options, projectDir) {
|
|
@@ -274,7 +278,7 @@ export class ServiceFactory {
|
|
|
274
278
|
if (!(await this.exists(storePath))) {
|
|
275
279
|
await source.pull(type, name, resolvedVersion, storePath);
|
|
276
280
|
}
|
|
277
|
-
const resourceMeta = await this.readResourceMetaFromDir(storePath);
|
|
281
|
+
const resourceMeta = await this.readResourceMetaFromDir(storePath, type);
|
|
278
282
|
const effectiveTargets = await this.resolveEffectiveAgents(projectDir, agents, resourceMeta?.agents);
|
|
279
283
|
const linkPaths = getProjectResourcePaths(projectDir, type, name, effectiveTargets);
|
|
280
284
|
for (const linkPath of linkPaths) {
|
|
@@ -428,7 +432,7 @@ export class ServiceFactory {
|
|
|
428
432
|
throw new HimanError(errorCodes.INSTALL_NOT_FOUND, `Installed resource link not found for ${type}/${name}. Run install first.`);
|
|
429
433
|
}
|
|
430
434
|
const installedPath = await fs.realpath(existingCandidates[0].path);
|
|
431
|
-
const resourceMeta = await this.readResourceMetaFromDir(installedPath);
|
|
435
|
+
const resourceMeta = await this.readResourceMetaFromDir(installedPath, type);
|
|
432
436
|
const agentsFromMeta = resourceMeta?.agents?.length
|
|
433
437
|
? normalizeAgents(resourceMeta.agents)
|
|
434
438
|
: undefined;
|
|
@@ -441,6 +445,18 @@ export class ServiceFactory {
|
|
|
441
445
|
```diff
             mode: "link",
         };
     }
+    async tryResolveInstalledResource(projectDir, type, name) {
+        try {
+            return await this.resolveInstalledResource(projectDir, type, name);
+        }
+        catch (error) {
+            if (error instanceof HimanError &&
+                error.code === errorCodes.INSTALL_NOT_FOUND) {
+                return undefined;
+            }
+            throw error;
+        }
+    }
     async resolveEffectiveAgents(projectDir, explicitAgents, fallbackAgents) {
         if (explicitAgents?.length) {
             return normalizeAgents(explicitAgents);
@@ -464,15 +480,51 @@ export class ServiceFactory {
         }
         return undefined;
     }
-    async readResourceMetaFromDir(resourceDir) {
+    async readResourceMetaFromDir(resourceDir, type) {
         const yamlPath = path.join(resourceDir, "himan.yaml");
-        if (
+        if (await this.exists(yamlPath)) {
+            const raw = await fs.readFile(yamlPath, "utf8");
+            const parsed = YAML.parse(raw) ??
+                null;
+            if (!parsed)
+                return null;
+            return { agents: parsed.agents ?? parsed.targets };
+        }
+        if (type !== "skill")
             return null;
-        const
-
-        if (!parsed)
+        const entryPath = path.join(resourceDir, this.getDefaultEntry(type));
+        if (!(await this.exists(entryPath)))
             return null;
-
+        const metadata = await this.readFrontMatter(entryPath);
+        return {
+            agents: this.readStringArrayMetadata(metadata, "agents") ??
+                this.readStringArrayMetadata(metadata, "targets"),
+        };
+    }
+    async readFrontMatter(filePath) {
+        const raw = await fs.readFile(filePath, "utf8");
+        const match = /^---\r?\n([\s\S]*?)\r?\n---(?:\r?\n|$)/.exec(raw.trimStart());
+        if (!match)
+            return null;
+        try {
+            const parsed = YAML.parse(match[1]);
+            return typeof parsed === "object" && parsed !== null && !Array.isArray(parsed)
+                ? parsed
+                : null;
+        }
+        catch {
+            return null;
+        }
+    }
+    readStringArrayMetadata(metadata, key) {
+        const value = metadata?.[key];
+        if (!Array.isArray(value))
+            return undefined;
+        const items = value
+            .filter((item) => typeof item === "string")
+            .map((item) => item.trim())
+            .filter(Boolean);
+        return items.length > 0 ? items : undefined;
     }
     async exists(targetPath) {
         try {
@@ -512,6 +564,9 @@ export class ServiceFactory {
             return "commands";
         return "skills";
     }
+    getDefaultEntry(type) {
+        return type === "skill" ? "SKILL.md" : "content.md";
+    }
     validateCreateInput(type, name, options) {
         if (!["rule", "command", "skill"].includes(type)) {
             throw new HimanError(errorCodes.UNSUPPORTED_RESOURCE_TYPE, `Unsupported resource type for create: ${type}`);
```
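The new `readFrontMatter` accepts front matter only at the very start of the file, delimited by `---` lines, and returns `null` for anything else. A standalone sketch of that extraction, with a naive `key: value` line parser standing in for the package's `YAML.parse` call:

```javascript
// Mirrors the anchored front matter regex from the diff above; the naive
// line parser below is a stand-in for the real YAML.parse dependency.
const FRONT_MATTER_RE = /^---\r?\n([\s\S]*?)\r?\n---(?:\r?\n|$)/;

function extractFrontMatter(markdown) {
  const match = FRONT_MATTER_RE.exec(markdown.trimStart());
  if (!match) return null;
  const parsed = {};
  for (const line of match[1].split(/\r?\n/)) {
    const idx = line.indexOf(":");
    if (idx > 0) parsed[line.slice(0, idx).trim()] = line.slice(idx + 1).trim();
  }
  return parsed;
}

const doc = "---\nname: code-review\ndescription: Review helper\n---\n# Body\n";
console.log(extractFrontMatter(doc)); // { name: 'code-review', description: 'Review helper' }
console.log(extractFrontMatter("# No front matter here")); // null
```

Because the regex is anchored and non-greedy, a `---` thematic break later in the document body is never mistaken for a front matter delimiter.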
package/docs/development.md
CHANGED
```diff
@@ -41,9 +41,9 @@ pnpm test
 Run `pnpm run verify` locally (type check, unit tests, `build`) and confirm it passes before opening a PR. The PR automatically runs the same set of core checks.
 
 2. **Update `version` in `package.json` and `CHANGELOG.md`**
-   npm does not allow publishing the same version number twice. Before merging into `master`, in the PR
+   npm does not allow publishing the same version number twice. Before merging into `master`, record user-visible changes in the `[Unreleased]` section of [CHANGELOG.md](../CHANGELOG.md) within the PR, then change the version to a number that does not yet exist on the registry.
    - Edit the `version` field by hand, or
-   -
+   - Run one of these on the branch (bumps the version and archives `[Unreleased]` into the new version section; does **not** publish): `pnpm run version:patch` / `version:minor` / `version:major` (they use `npm version … --no-git-tag-version` and then run `scripts/release-changelog.mjs`; you still `git add` / `commit` the version and changelog changes yourself).
    Git tag convention: matches `version`, prefixed with **`v`** (e.g. `1.2.0` → tag `v1.2.0`).
 
 3. **Merge into `master`**
@@ -66,7 +66,8 @@
 | `pnpm run release:dry` | Checks + `npm publish --dry-run` (rehearsal, nothing is uploaded) |
 | `pnpm run release:test` | Checks + bumps the version to a `*-test.*` prerelease number and publishes under the **`@test` tag** |
 | `pnpm run release` | Checks + publishes **latest** (for maintainers publishing locally; **write `pnpm run release`**, not the bare `pnpm publish`, they are not the same flow) |
-| `pnpm run
+| `pnpm run changelog:release` | Archives `[Unreleased]` in `CHANGELOG.md` under the current `package.json` version |
+| `pnpm run version:patch` / `version:minor` / `version:major` | Bumps the `package.json` version and invokes `changelog:release`; does not publish |
 
 After publishing a test tag, install with e.g. `npm i @hi-man/himan@test`.
 
```
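`scripts/release-changelog.mjs` itself does not appear in this diff, so its exact behavior is an assumption; a minimal sketch of the documented idea (archive the `[Unreleased]` entries under a new dated version heading while leaving an empty `[Unreleased]` section on top) might look like:

```javascript
// Hypothetical sketch of scripts/release-changelog.mjs (the real script is
// not shown in this diff): insert a dated version heading right after the
// `[Unreleased]` marker, so existing entries end up under the new version
// and `[Unreleased]` is left empty for the next cycle.
function releaseChangelog(changelog, version, date) {
  const marker = "## [Unreleased]";
  const idx = changelog.indexOf(marker);
  if (idx === -1) throw new Error("no [Unreleased] section found");
  const head = changelog.slice(0, idx + marker.length);
  const rest = changelog.slice(idx + marker.length);
  return `${head}\n\n## [${version}] - ${date}${rest}`;
}

const before = "# Changelog\n\n## [Unreleased]\n\n### Added\n\n- New skill.\n";
console.log(releaseChangelog(before, "0.3.1", "2026-05-07"));
```

The real script would read the current version from `package.json` and rewrite `CHANGELOG.md` in place; only the heading-insertion idea is shown here.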
package/docs/error-codes.md
CHANGED
```diff
@@ -110,8 +110,8 @@
 ### `E_INVALID_RESOURCE_METADATA`
 
 - **Meaning**: The resource metadata is invalid; the resource cannot be published or read as a valid resource.
-- **Common triggers**: `
--
+- **Common triggers**: `himan.yaml` exists but `name/type/entry` does not match, the entry file that `entry` points to does not exist, or `himan.yaml` is missing and the default entry file does not exist either.
+- **Suggested handling**: If you use `himan.yaml`, check that `name`, `type`, and `entry` agree with the command arguments and file layout; if you are not using `himan.yaml` yet, check that the default entry file exists: `content.md` for `rule` / `command`, `SKILL.md` for `skill`.
 
 ### `E_PUBLISH_NO_CHANGES`
 
```
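The trigger list above can be condensed into a small decision function. This is an illustrative sketch, not himan's actual code; `validateResourceDir` and its input shapes are invented for the example:

```javascript
// Illustrative sketch (not himan's actual code) of the publish-time check
// described above. `files` lists names present in the resource directory;
// `meta` is the parsed himan.yaml, or null when the file is absent.
function defaultEntry(type) {
  return type === "skill" ? "SKILL.md" : "content.md";
}

function validateResourceDir(type, name, files, meta) {
  if (meta) {
    // himan.yaml present: name/type must match and the entry file must exist.
    if (meta.name !== name || meta.type !== type) return "E_INVALID_RESOURCE_METADATA";
    if (!files.includes(meta.entry)) return "E_INVALID_RESOURCE_METADATA";
    return "ok";
  }
  // No himan.yaml: fall back to the type's default entry file.
  return files.includes(defaultEntry(type)) ? "ok" : "E_INVALID_RESOURCE_METADATA";
}

console.log(validateResourceDir("skill", "pdf-tools", ["SKILL.md"], null)); // ok
console.log(validateResourceDir("rule", "code-review", ["notes.md"], null)); // E_INVALID_RESOURCE_METADATA
```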
package/docs/mvp/README.md
CHANGED
```diff
@@ -30,7 +30,7 @@
 ### 2.2 `list`
 
 - `himan list [type]`; `--json` optional
--
+- Scans each type directory in the source repo; reads `himan.yaml` first and, when it is missing, infers the resource from the default entry file; returns the name, description, target agents, entry file, and so on.
 
 ### 2.3 `history`
 
@@ -64,7 +64,7 @@
 - Published content comes from the project's `.himan/dev/<type>/<name>` when present, otherwise from the matching resource directory in the source repo.
 - New version: bumps from the latest semver among existing tags; with no history at all, counting starts from `0.0.0`.
 - Writes back to the source repo, commits, tags, pushes, and syncs that version into the local store.
--
+- After a successful publish, reinstalls into the project targets in copy mode from the new store version, updates the lock, and deletes the matching `.himan/dev/<type>/<name>`.
 
 ### 2.7 `create`
 
@@ -106,7 +106,7 @@
 - `.himan/dev/<type>/<name>`: editable development-state copy of a resource
 
 **Resource layout inside the source repo:**
-- `rules/<name>/`, `commands/<name>/`, `skills/<name
+- `rules/<name>/`, `commands/<name>/`, `skills/<name>/`; each may contain `himan.yaml` and holds the conventional entry file (e.g. `content.md`, `SKILL.md`).
 
 ### 3.3 Technical dependencies (overview)
 
```
package/docs/mvp/create-resource.md
CHANGED
````diff
@@ -56,7 +56,7 @@ repo/
 - `CHANGELOG.md`: records source-level resource additions, changes, deprecations, removals, and notable release notes
 - `rules/`, `commands/`, `skills/`: resource-type root directories, scanned by himan
 
-Use `himan source init-docs` to generate root doc templates. By default the command only creates the missing `README.md` / `CHANGELOG.md`; existing files are kept unless `--force` is passed explicitly. `--dry-run`
+Use `himan source init-docs` to generate root doc templates. By default the command only creates the missing `README.md` / `CHANGELOG.md`; existing files are kept unless `--force` is passed explicitly. When overwriting docs, `--force` scans the `rule`, `command`, and `skill` resources already in the current source, writes a resource index into the README, and records the catalogued resources in the initial CHANGELOG entry. Resource references prefer the latest semver version found in Git tags, falling back to the `version` in `himan.yaml` when no tag exists. Resources that do not yet have a `himan.yaml` are recognized by their default entry; for skills, `name` and `description` are additionally read from the front matter of `skills/<name>/SKILL.md`. `--dry-run` only reports the create, overwrite, or skip actions that would run, without writing to disk. When files actually change, the command commits and pushes to the current Git source.
 
 `create` and `publish` maintain the root docs automatically:
 
@@ -115,17 +115,14 @@ himan install rule code-review
 ```text
 repo/
   rules/<name>/
-    himan.yaml
     content.md
   commands/<name>/
-    himan.yaml
     content.md
   skills/<name>/
-    himan.yaml
     SKILL.md
 ```
 
-Minimal `himan.yaml` field example:
+`himan.yaml` is a recommended but optional resource metadata file. When present, publish validates `name`, `type`, `entry`, and the entry file; when absent, publish infers minimal metadata from the default entry: `content.md` for `rule` / `command`, `SKILL.md` for `skill`. Minimal `himan.yaml` field example:
 
 ```yaml
 name: code-review
````
package/docs/mvp/impl.md
CHANGED
```diff
@@ -30,8 +30,8 @@
 
 ### 2.2 `list [type]`
 
--
--
+- Scans subdirectories by type inside the cached repo; reads `himan.yaml` first and infers the resource from the default entry file when it is missing
+- Validates the type and required fields (e.g. name, entry) of existing metadata; non-conforming directories are skipped
 - Supports human-readable and `--json` output
 
 ### 2.3 `history <type> <name>`
@@ -58,7 +58,7 @@
 - Next version: based on the latest historical tag; with no history, increments from `0.0.0` by patch/minor/major
 - Syncs the content back to the canonical path in the cached repo, updates the version field in the metadata, commits, tags, pushes
 - Pulls the content for the new tag into a new version directory in the store
--
+- Reinstalls the matching type targets in the project in copy mode from the new store version, updates the lock, and deletes the matching `.himan/dev/<type>/<name>` development directory
 
 ### 2.7 `create <type> <name>`
 
```
package/package.json
CHANGED
```diff
@@ -1,6 +1,6 @@
 {
     "name": "@hi-man/himan",
-    "version": "0.3.
+    "version": "0.3.1",
     "description": "Prompt and agent asset management CLI",
     "keywords": [
         "ai",
@@ -59,9 +59,10 @@
     "release": "pnpm run verify && npm publish",
     "release:dry": "pnpm run verify && npm publish --dry-run",
     "release:test": "pnpm run verify && npm version prerelease --preid test --no-git-tag-version && npm publish --tag test",
-    "
-    "version:
-    "version:
+    "changelog:release": "node scripts/release-changelog.mjs",
+    "version:patch": "npm version patch --no-git-tag-version && pnpm run changelog:release",
+    "version:minor": "npm version minor --no-git-tag-version && pnpm run changelog:release",
+    "version:major": "npm version major --no-git-tag-version && pnpm run changelog:release"
 },
 "dependencies": {
     "commander": "^14.0.3",
```