@devport-kr/portki 0.1.1 → 0.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +117 -35
  2. package/dist/cli.js +686 -85
  3. package/package.json +1 -1
package/README.md CHANGED
@@ -10,18 +10,19 @@
  - **Chunked Generation**: For large repositories, the documentation is split into sections; each section is generated and validated individually, then recorded in the local session state.
  - **Pure Markdown output**: The final wiki is emitted as `README.md` and per-section `.md` files rather than into a database.
  - **Incremental Update**: Instead of re-analyzing the entire codebase every time, only the files changed since the last wiki-generation commit, and the sections they affect, are tracked and refreshed.
+ - **One-command AI agent integration**: Generate a wiki from Claude Code, Codex, or Gemini CLI with a single command.
 
- ## Installation
-
- A Node.js environment is required.
+ ## Quick Start
 
  ```bash
- npm install
+ npx @devport-kr/portki facebook/react
  ```
 
- The public version does not require a `.env` file.
+ This one line analyzes the repository and generates `handoff.md`. An AI agent then reads that file and completes the wiki.
+
+ ## Installation
 
- After release, it can be used without a local clone.
+ Node.js 20 or later is required. No `.env` configuration is needed.
 
  ```bash
  npx @devport-kr/portki help
@@ -34,31 +35,88 @@ npm install -g @devport-kr/portki
  portki help
  ```
 
- ## GitHub Releases
+ To check the health of your environment:
+
+ ```bash
+ portki doctor
+ ```
 
- GitHub Releases can be created from the `public` branch.
+ ## Using with AI Agents
+
+ portki is designed to be used alongside AI coding agents. Running `portki owner/repo` produces a `handoff.md` that an agent can follow. Installing the adapter for your agent environment makes this even more convenient.
+
+ ### Claude Code
+
+ ```bash
+ # 1. Install the adapter (once, from the project root)
+ portki install --agent claude
+
+ # 2. Run via slash command inside Claude Code
+ /portki facebook/react
+ ```
 
- - Pushing a `v*` tag to a commit on the `public` branch creates a Release automatically.
- - A Release can also be created by running the Action manually (`workflow_dispatch`).
- - Each Release includes auto-generated notes plus source archives (`.tar.gz`, `.zip`), the npm package tarball (`devport-kr-portki-<version>.tgz`), and SHA256 checksum files.
- - With the npm publish workflow enabled, the npm package can be published from the same `v*` tag.
+ `portki install --agent claude` creates `.claude/commands/portki.md`. From then on, typing `/portki owner/repo` in Claude Code has portki analyze the repository and generate handoff.md, after which Claude completes the wiki by following its instructions.
+
+ You can also use it directly, without the adapter:
+
+ ```bash
+ # Run directly from the Claude Code terminal
+ portki facebook/react
+ # → the handoff.md path is printed; ask Claude to read and follow that file
+ ```
+
+ ### Codex (OpenAI)
+
+ ```bash
+ # 1. Install the adapter (once, from the project root)
+ portki install --agent codex
+ ```
+
+ This command adds a portki usage section to the current project's `AGENTS.md`. Codex reads that file automatically and can then use the portki commands.
+
+ Example prompt in Codex:
+
+ ```
+ Run portki facebook/react and build the wiki by following handoff.md
+ ```
 
- Basic procedure:
+ ### Gemini CLI
 
  ```bash
- git checkout public
- git pull origin public
+ # 1. Install the adapter (once, from the project root)
+ portki install --agent gemini
 
- # The package.json version and the tag must match.
- git tag v0.1.1
- git push origin public --follow-tags
+ # 2. Run via slash command inside Gemini
+ /portki facebook/react
  ```
 
- To also publish to npm, add `NPM_TOKEN` to the repository Secrets.
+ `portki install --agent gemini` creates `.gemini/commands/portki.md`. From then on, typing `/portki owner/repo` in the Gemini CLI starts the same handoff flow.
+
+ ### Common Workflow
+
+ Whichever agent you use, the internal flow is the same:
+
+ 1. `portki owner/repo` collects a repository snapshot, builds the analysis context, and generates `handoff.md`
+ 2. The AI agent reads `handoff.md` and follows its step-by-step instructions
+ 3. The agent writes `section-plan.json`, then runs `validate-plan`
+ 4. Each section is written and verified with `persist-section`
+ 5. `finalize` assembles the final Markdown wiki
 
- ## Recommended Execution Environments
+ You can check progress at any time:
+
+ ```bash
+ portki status owner/repo
+ ```
+
+ To resume interrupted work:
+
+ ```bash
+ portki resume owner/repo
+ ```
+
+ ## Recommended Execution Environments
  **(as of March 7, 2026)**
- | Environment | Recommended Model | (Effort (Thinking) Level) | Setup Guide |
+ | Environment | Recommended Model | Effort (Thinking) Level | Setup Guide |
  |------|-----------|--------------------------|------------------|
  | **Claude Code** | `Opus 4.6` | High | [`CLAUDE.md`](./CLAUDE.md)|
  | **Codex** | `GPT-5.4` | xHigh | [`AGENTS.md`](./AGENTS.md)|
@@ -66,11 +124,17 @@ To also publish to npm, add `NPM_TOKEN` to the repository Secrets.
 
  ## Commands
 
- All commands are run from the project root.
+ ### High-level Commands
 
- ```bash
- npx tsx src/agent.ts <command> [flags]
- ```
+ | Command | Description |
+ |------|------|
+ | `portki owner/repo` | Analyzes the repository and generates handoff.md for AI agents. |
+ | `portki status owner/repo` | Shows pipeline progress. |
+ | `portki resume owner/repo` | Regenerates the handoff from the last valid step. |
+ | `portki doctor` | Checks the health of the execution environment. |
+ | `portki install --agent <name>` | Installs an AI agent adapter (`claude`, `codex`, or `gemini`). |
+
+ ### Low-level Commands
 
  | Command | Description |
  |------|------|
@@ -84,48 +148,66 @@ npx tsx src/agent.ts <command> [flags]
 
  ## Workflow
 
- This tool recommends **chunked work** so that AI agents write wiki documents more accurately and thoroughly.
+ This tool recommends **chunked work** so that AI agents write wiki documents more accurately and thoroughly.
+
+ ### 1. AI Agent Workflow (Recommended)
+
+ This is the simplest approach: portki automates the analysis, and the AI agent writes the documentation by following handoff.md.
+
+ ```bash
+ # 1. Analyze the repository and generate the handoff
+ portki owner/repo
+
+ # 2. The AI agent reads handoff.md and completes the wiki
+ # (the agent runs validate-plan, persist-section, and finalize automatically)
+
+ # 3. Check progress (optional)
+ portki status owner/repo
+
+ # 4. Recover after an interruption (optional)
+ portki resume owner/repo
+ ```
 
- ### 1. Chunked Wiki Generation (Recommended)
+ ### 2. Manual Chunked Wiki Generation
 
  In this approach, the repository is analyzed and split into sections, then the code behind each section is examined in depth and written up.
 
  ```bash
  # 1. Collect a repository snapshot
- npx tsx src/agent.ts ingest --repo owner/repo --out devport-output/workspace/{repo-slug}-artifact.json
+ portki ingest --repo owner/repo --out devport-output/workspace/{repo-slug}-artifact.json
 
  # 2. Prepare the analysis and section plan (generate the context)
- npx tsx src/agent.ts plan-sections --artifact devport-output/workspace/{repo-slug}-artifact.json --out devport-output/workspace/{repo-slug}-plan-context.json
+ portki plan-sections --artifact devport-output/workspace/{repo-slug}-artifact.json --out devport-output/workspace/{repo-slug}-plan-context.json
 
  # (AI task: generate {repo-slug}-section-plan.json from plan-context.json and the key code)
 
  # 3. Validate the plan
- npx tsx src/agent.ts validate-plan --input devport-output/workspace/{repo-slug}-section-plan.json --context devport-output/workspace/{repo-slug}-plan-context.json --out devport-output/workspace/{repo-slug}-section-plan.json
+ portki validate-plan --input devport-output/workspace/{repo-slug}-section-plan.json --context devport-output/workspace/{repo-slug}-plan-context.json --out devport-output/workspace/{repo-slug}-section-plan.json
 
  # 4. Write each section and record it in the session (repeat until all sections are done)
  # (AI task: analyze the section's code and generate {repo-slug}-section-1-output.json)
- npx tsx src/agent.ts persist-section --plan devport-output/workspace/{repo-slug}-section-plan.json --section sec-1 --input devport-output/workspace/{repo-slug}-section-1-output.json
+ portki persist-section --plan devport-output/workspace/{repo-slug}-section-plan.json --section sec-1 --input devport-output/workspace/{repo-slug}-section-1-output.json
 
  # 5. Finalize and advance the incremental baseline
- npx tsx src/agent.ts finalize --plan devport-output/workspace/{repo-slug}-section-plan.json --advance_baseline
+ portki finalize --plan devport-output/workspace/{repo-slug}-section-plan.json --advance_baseline
  ```
 
  When `finalize` completes, `devport-output/wiki/{owner}/{repo}/README.md` and the per-section Markdown files are generated.
 
- ### 2. Incremental Update
+ ### 3. Incremental Update
 
  Starting from a previously completed wiki, only the changed files are identified and brought up to date.
 
  ```bash
  # 1. Detect changes
- npx tsx src/agent.ts detect --repo owner/repo
+ portki detect --repo owner/repo
  # Branch on the status field of the returned JSON:
  # - noop: no changes (nothing to update)
  # - incremental: regenerate only the changed code and the affected sections (impacted_section_ids)
  # - full-rebuild: too many changes; the entire wiki must be rewritten
 
  # 2. If regeneration is needed, re-collect a snapshot of the latest code
- npx tsx src/agent.ts ingest --repo owner/repo --out devport-output/workspace/{repo-slug}-artifact.json
+ portki ingest --repo owner/repo --out devport-output/workspace/{repo-slug}-artifact.json
 
  # (then rewrite only the affected sections, run persist-section on each, and finalize)
  ```
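For reference, the `{repo-slug}` workspace naming used throughout these commands can be sketched as a small helper. This is an illustrative sketch only (the function names are hypothetical; the CLI derives the slug internally), showing that the slug is the repository name without its owner:

```javascript
// Sketch of the workspace naming scheme the README describes:
// devport-output/workspace/{repo-slug}-<suffix>.
// Helper names here are illustrative, not part of the public CLI.
function repoSlug(ownerRepo) {
  // "facebook/react" -> "react"; a bare name passes through unchanged
  const parts = ownerRepo.split("/");
  return parts[parts.length - 1] ?? ownerRepo;
}

function workspacePath(ownerRepo, suffix) {
  return `devport-output/workspace/${repoSlug(ownerRepo)}-${suffix}`;
}

console.log(workspacePath("facebook/react", "artifact.json"));
// → devport-output/workspace/react-artifact.json
```

So for `facebook/react`, the artifact, plan context, and section plan all live next to each other under `devport-output/workspace/` with the `react-` prefix.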
package/dist/cli.js CHANGED
@@ -1,6 +1,6 @@
  // src/agent.ts
- import path13 from "node:path";
- import { mkdir as mkdir2, writeFile as writeFile2, readFile as readFile2 } from "node:fs/promises";
+ import path18 from "node:path";
+ import { mkdir as mkdir4, writeFile as writeFile5, readFile as readFile4 } from "node:fs/promises";
  import { pathToFileURL } from "node:url";
 
  // src/shared/load-env.ts
@@ -2848,8 +2848,8 @@ function compareDeterministic(left, right) {
  }
  function toOrderedUniquePaths(paths) {
    const unique = /* @__PURE__ */ new Set();
-   for (const path14 of paths) {
-     const normalized = normalizeRepoPath(path14);
+   for (const path19 of paths) {
+     const normalized = normalizeRepoPath(path19);
      if (normalized.length > 0) {
        unique.add(normalized);
      }
@@ -2873,15 +2873,15 @@ function mapChangedPathsToImpactedSections(input) {
    }));
    for (const section of normalizedSectionPaths) {
      const sectionPaths = section.repoPaths;
-     for (const path14 of changedPaths) {
-       if (sectionPaths.has(path14)) {
+     for (const path19 of changedPaths) {
+       if (sectionPaths.has(path19)) {
          impacted.add(section.sectionId);
-         matchedPaths.add(path14);
+         matchedPaths.add(path19);
        }
      }
    }
    const impacted_section_ids = [...impacted].sort(compareDeterministic);
-   const unmatched_changed_paths = changedPaths.filter((path14) => !matchedPaths.has(path14));
+   const unmatched_changed_paths = changedPaths.filter((path19) => !matchedPaths.has(path19));
    if (impacted_section_ids.length === 0) {
      return {
        mode: "full-rebuild-required",
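Aside from the renamed loop variable, this hunk's intent is to map changed file paths onto the sections that reference them. A simplified standalone sketch, using plain arrays instead of the bundle's normalized structures (the `"incremental"` mode name for the matched case is an assumption taken from the README's `detect` statuses, not from this hunk):

```javascript
// Simplified sketch of the changed-path -> impacted-section mapping.
// The real bundle normalizes and de-duplicates paths first; this mirrors
// only the matching logic and the full-rebuild fallback.
function mapChangedPaths(sections, changedPaths) {
  const impacted = new Set();
  const matched = new Set();
  for (const section of sections) {
    const sectionPaths = new Set(section.repoPaths);
    for (const p of changedPaths) {
      if (sectionPaths.has(p)) {
        impacted.add(section.sectionId);
        matched.add(p);
      }
    }
  }
  const impacted_section_ids = [...impacted].sort();
  const unmatched_changed_paths = changedPaths.filter((p) => !matched.has(p));
  if (impacted_section_ids.length === 0) {
    // no section claims any changed path -> the whole wiki must be rebuilt
    return { mode: "full-rebuild-required", impacted_section_ids, unmatched_changed_paths };
  }
  return { mode: "incremental", impacted_section_ids, unmatched_changed_paths };
}
```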
@@ -3933,6 +3933,602 @@ async function finalize(session, plan, options) {
  return result;
  }
 
+ // src/commands/wiki.ts
+ import path13 from "node:path";
+ import { mkdir as mkdir2, writeFile as writeFile2 } from "node:fs/promises";
+
+ // src/commands/handoff.ts
+ function renderHandoff(opts) {
+ const {
+ ownerRepo,
+ slug,
+ artifactPath,
+ planContextPath,
+ artifact,
+ planContext: ctx,
+ stepOverrides,
+ persistedSections
+ } = opts;
+ const step = (name, fallback) => stepOverrides?.[name] ?? fallback;
+ const persistedSet = new Set(persistedSections ?? []);
+ const sectionPlanPath = `devport-output/workspace/${slug}-section-plan.json`;
+ const profileLines = [
+ `- **Repo**: ${ownerRepo}`,
+ `- **Commit**: ${artifact.commit_sha.slice(0, 7)}`,
+ `- **Language**: ${ctx.profile.primaryLanguage}`,
+ `- **Type**: ${ctx.profile.projectType}`,
+ `- **Domain**: ${ctx.profile.domainHint}`,
+ `- **Files scanned**: ${artifact.files_scanned.toLocaleString("en-US")}`,
+ `- **Snapshot**: ${artifact.snapshot_path}`
+ ];
+ const keyPathsList = ctx.keyPaths.slice(0, 20).map((p) => ` - \`${p}\``).join("\n");
+ const sectionExample = JSON.stringify(
+ {
+ sectionId: "sec-1",
+ titleKo: "\uD504\uB85C\uC81D\uD2B8 \uAC1C\uC694 \uBC0F \uC544\uD0A4\uD14D\uCC98",
+ summaryKo: "\uC774 \uC139\uC158\uC5D0\uC11C\uB294 \uD504\uB85C\uC81D\uD2B8\uC758 \uBAA9\uC801\uACFC \uC804\uCCB4 \uAD6C\uC870\uB97C \uC124\uBA85\uD569\uB2C8\uB2E4.",
+ sourcePaths: ["README.md", "src/main.ts"],
+ subsections: [
+ {
+ subsectionId: "sub-1-1",
+ titleKo: "\uD504\uB85C\uC81D\uD2B8 \uC18C\uAC1C",
+ bodyKo: "(3000\uC790 \uC774\uC0C1\uC758 \uD55C\uAD6D\uC5B4 \uC124\uBA85...)",
+ sourcePaths: ["README.md"]
+ }
+ ]
+ },
+ null,
+ 2
+ );
+ const lines = [
+ `# Handoff: ${ownerRepo}`,
+ "",
+ "This file was generated by `portki`. Follow the steps below to complete the wiki generation.",
+ "",
+ "## Pipeline Status",
+ "",
+ `| Step | Status |`,
+ `|------|--------|`,
+ `| ingest | ${step("ingest", "DONE")} |`,
+ `| plan-sections | ${step("plan-sections", "DONE")} |`,
+ `| write section-plan.json | ${step("write-plan", "PENDING")} |`,
+ `| validate-plan | ${step("validate-plan", "PENDING")} |`,
+ `| write & persist sections | ${step("persist-sections", "PENDING")} |`,
+ `| finalize | ${step("finalize", "PENDING")} |`,
+ "",
+ "## Repo Profile",
+ "",
+ ...profileLines,
+ "",
+ "## Key Paths to Read",
+ "",
+ keyPathsList,
+ "",
+ "## Artifact Files",
+ "",
+ `- Artifact: \`${artifactPath}\``,
+ `- Plan context: \`${planContextPath}\``,
+ `- Section plan (you write this): \`${sectionPlanPath}\``,
+ "",
+ "## Steps",
+ "",
+ "### Step 1: Read the plan context",
+ "",
+ `Read \`${planContextPath}\` to understand the repo structure, profile, and constraints.`,
+ `Also read the README and key source files under \`${artifact.snapshot_path}\`.`,
+ "",
+ "### Step 2: Write section-plan.json",
+ "",
+ `Create \`${sectionPlanPath}\` with 4-6 sections. Each section needs:`,
+ "- `sectionId`: sec-1, sec-2, ...",
+ "- `titleKo`: Korean title (3+ chars)",
+ "- `summaryKo`: Korean summary (10+ chars)",
+ "- `focusPaths`: real paths from the snapshot",
+ "- `subsections`: array of 3+ subsections with subsectionId, titleKo, objectiveKo, targetEvidenceKinds, targetCharacterCount",
+ "",
+ "### Step 3: Validate the plan",
+ "",
+ "```bash",
+ `portki validate-plan --input ${sectionPlanPath} --context ${planContextPath} --out ${sectionPlanPath}`,
+ "```",
+ "",
+ "### Step 4: Write and persist each section",
+ "",
+ "For each section in the plan:",
+ "",
+ "1. Read the focus files listed in the section plan",
+ "2. Write a section output JSON file:",
+ "",
+ "```json",
+ sectionExample,
+ "```",
+ "",
+ "3. Persist it:",
+ "",
+ "```bash",
+ `portki persist-section --plan ${sectionPlanPath} --section sec-N --input devport-output/workspace/${slug}-section-N-output.json`,
+ "```",
+ ""
+ ];
+ if (persistedSections && persistedSections.length > 0) {
+ lines.push("**Section progress:**", "");
+ for (const secId of persistedSections) {
+ lines.push(`- [x] ${secId} (persisted)`);
+ }
+ lines.push("");
+ }
+ lines.push(
+ "### Step 5: Finalize",
+ "",
+ "```bash",
+ `portki finalize --plan ${sectionPlanPath} --advance_baseline`,
+ "```",
+ "",
+ `Output will be written to \`devport-output/wiki/${ownerRepo}/\`.`,
+ "",
+ "## Writing Rules",
+ "",
+ "- All Korean prose must use formal `\uD569\uB2C8\uB2E4` style",
+ "- Each `bodyKo` must be at least 3,000 characters",
+ "- Include at least one Mermaid architecture block across all sections",
+ "- `sourcePaths` must be real snapshot paths",
+ "- `sub-1-1` is the project introduction summary (purpose, problem space, getting-started, section map)",
+ "- Code call-flow analysis should begin in `sub-1-2` or later",
+ "- Do not use repetitive filler blocks",
+ ""
+ );
+ return lines.join("\n");
+ }
+
+ // src/commands/wiki.ts
+ function repoSlug(ownerRepo) {
+ const parts = ownerRepo.split("/");
+ return parts[parts.length - 1] ?? ownerRepo;
+ }
+ function workspacePath(ownerRepo, suffix) {
+ const slug = repoSlug(ownerRepo);
+ return `devport-output/workspace/${slug}-${suffix}`;
+ }
+ async function wikiCommand(ownerRepo, flags) {
+ const parts = ownerRepo.toLowerCase().split("/");
+ if (parts.length !== 2 || !parts[0] || !parts[1]) {
+ throw new Error(`Invalid repo format: ${ownerRepo}. Use owner/repo.`);
+ }
+ const repo = `${parts[0]}/${parts[1]}`;
+ const slug = repoSlug(repo);
+ const snapshotRoot = flags["snapshot_root"] ?? "devport-output/snapshots";
+ process.stderr.write(`[portki] ${repo} \u2014 ingesting...
+ `);
+ const ref = flags["ref"];
+ const artifact = await runIngest({
+ repo_ref: { repo, ...ref ? { ref } : {} },
+ snapshot_root: path13.resolve(snapshotRoot),
+ force_rebuild: flags["force_rebuild"] === "true"
+ });
+ const cacheLabel = artifact.idempotent_hit ? "cache hit" : "downloaded";
+ process.stderr.write(
+ ` ingest: ${artifact.commit_sha.slice(0, 7)} \u2014 ${artifact.files_scanned.toLocaleString("en-US")} files (${cacheLabel})
+ `
+ );
+ const artifactPath = workspacePath(repo, "artifact.json");
+ await mkdir2(path13.dirname(path13.resolve(artifactPath)), { recursive: true });
+ await writeFile2(
+ path13.resolve(artifactPath),
+ `${JSON.stringify(artifact, null, 2)}
+ `,
+ "utf8"
+ );
+ process.stderr.write(` plan-sections: analyzing...
+ `);
+ const ctx = await planContext(artifact);
+ const planContextPath = workspacePath(repo, "plan-context.json");
+ await writeFile2(
+ path13.resolve(planContextPath),
+ `${JSON.stringify(ctx, null, 2)}
+ `,
+ "utf8"
+ );
+ process.stderr.write(
+ ` plan-sections: ${ctx.profile.projectType}, ${ctx.profile.primaryLanguage}, ${ctx.profile.domainHint}
+ `
+ );
+ const handoffPath = workspacePath(repo, "handoff.md");
+ const handoff = renderHandoff({
+ ownerRepo: repo,
+ slug,
+ artifactPath,
+ planContextPath,
+ artifact,
+ planContext: ctx
+ });
+ await writeFile2(path13.resolve(handoffPath), handoff, "utf8");
+ process.stderr.write(
+ `
+ Done. Follow the handoff to complete wiki generation:
+ ${handoffPath}
+
+ `
+ );
+ process.stdout.write(`${path13.resolve(handoffPath)}
+ `);
+ }
+
+ // src/commands/status.ts
+ import { existsSync as existsSync4 } from "node:fs";
+ import path14 from "node:path";
+ async function statusCommand(argv) {
+ const ownerRepo = argv[0];
+ if (!ownerRepo || !ownerRepo.includes("/")) {
+ throw new Error("Usage: portki status owner/repo");
+ }
+ const repo = ownerRepo.toLowerCase();
+ const slug = repoSlug(repo);
+ process.stderr.write(`[portki] status: ${repo}
+
+ `);
+ const checks = [
+ { label: "ingest (artifact)", path: workspacePath(repo, "artifact.json"), done: false },
+ { label: "plan-sections (context)", path: workspacePath(repo, "plan-context.json"), done: false },
+ { label: "section-plan.json", path: workspacePath(repo, "section-plan.json"), done: false },
+ { label: "handoff.md", path: workspacePath(repo, "handoff.md"), done: false }
+ ];
+ for (const check of checks) {
+ check.done = existsSync4(path14.resolve(check.path));
+ const icon = check.done ? "+" : "-";
+ process.stderr.write(` [${icon}] ${check.label}
+ `);
+ }
+ const sessionPath = sessionPathForRepo(repo);
+ const session = await loadSession(sessionPath);
+ if (session) {
+ process.stderr.write(`
+ Session: ${session.sessionId}
+ `);
+ const sectionIds = Object.keys(session.sections).sort();
+ let persisted = 0;
+ for (const secId of sectionIds) {
+ const sec = session.sections[secId];
+ const icon = sec.status === "persisted" ? "+" : "-";
+ process.stderr.write(` [${icon}] ${secId}: ${sec.status}
+ `);
+ if (sec.status === "persisted") persisted++;
+ }
+ process.stderr.write(`
+ Sections: ${persisted}/${sectionIds.length} persisted
+ `);
+ } else {
+ process.stderr.write(`
+ No session found yet.
+ `);
+ }
+ const wikiReadme = `devport-output/wiki/${repo}/README.md`;
+ const finalized = existsSync4(path14.resolve(wikiReadme));
+ const fIcon = finalized ? "+" : "-";
+ process.stderr.write(` [${fIcon}] finalized (wiki output)
+ `);
+ process.stderr.write("\n");
+ }
+
+ // src/commands/resume.ts
+ import { existsSync as existsSync5 } from "node:fs";
+ import { readFile as readFile2, writeFile as writeFile3 } from "node:fs/promises";
+ import path15 from "node:path";
+ async function resumeCommand(argv) {
+ const ownerRepo = argv[0];
+ if (!ownerRepo || !ownerRepo.includes("/")) {
+ throw new Error("Usage: portki resume owner/repo");
+ }
+ const repo = ownerRepo.toLowerCase();
+ const slug = repoSlug(repo);
+ process.stderr.write(`[portki] resume: ${repo}
+ `);
+ const artifactPath = workspacePath(repo, "artifact.json");
+ if (!existsSync5(path15.resolve(artifactPath))) {
+ throw new Error(
+ `No artifact found at ${artifactPath}. Run \`portki ${repo}\` first.`
+ );
+ }
+ const artifactRaw = await readFile2(path15.resolve(artifactPath), "utf8");
+ const artifact = ingestRunArtifactSchema.parse(JSON.parse(artifactRaw));
+ const planContextPath = workspacePath(repo, "plan-context.json");
+ let ctx;
+ if (existsSync5(path15.resolve(planContextPath))) {
+ const raw = await readFile2(path15.resolve(planContextPath), "utf8");
+ ctx = PlanContextSchema.parse(JSON.parse(raw));
+ process.stderr.write(` plan-context: loaded from cache
+ `);
+ } else {
+ process.stderr.write(` plan-context: regenerating...
+ `);
+ ctx = await planContext(artifact);
+ await writeFile3(
+ path15.resolve(planContextPath),
+ `${JSON.stringify(ctx, null, 2)}
+ `,
+ "utf8"
+ );
+ }
+ const sessionPath = sessionPathForRepo(repo);
+ const session = await loadSession(sessionPath);
+ const persistedSections = [];
+ const stepOverrides = {
+ ingest: "DONE",
+ "plan-sections": "DONE"
+ };
+ if (session) {
+ for (const [secId, sec] of Object.entries(session.sections)) {
+ if (sec.status === "persisted") {
+ persistedSections.push(secId);
+ }
+ }
+ const sectionPlanPath = workspacePath(repo, "section-plan.json");
+ if (existsSync5(path15.resolve(sectionPlanPath))) {
+ stepOverrides["write-plan"] = "DONE";
+ stepOverrides["validate-plan"] = "DONE";
+ }
+ if (persistedSections.length > 0) {
+ const total = Object.keys(session.sections).length;
+ if (persistedSections.length >= total) {
+ stepOverrides["persist-sections"] = "DONE";
+ }
+ }
+ }
+ const handoffPath = workspacePath(repo, "handoff.md");
+ const handoff = renderHandoff({
+ ownerRepo: repo,
+ slug,
+ artifactPath,
+ planContextPath,
+ artifact,
+ planContext: ctx,
+ stepOverrides,
+ persistedSections
+ });
+ await writeFile3(path15.resolve(handoffPath), handoff, "utf8");
+ if (persistedSections.length > 0) {
+ process.stderr.write(
+ ` ${persistedSections.length} section(s) already persisted: ${persistedSections.join(", ")}
+ `
+ );
+ }
+ process.stderr.write(
+ `
+ Handoff regenerated:
+ ${handoffPath}
+
+ `
+ );
+ process.stdout.write(`${path15.resolve(handoffPath)}
+ `);
+ }
+
+ // src/commands/doctor.ts
+ import { existsSync as existsSync6 } from "node:fs";
+ import { access, readdir, constants } from "node:fs/promises";
+ import path16 from "node:path";
+ async function doctorCommand() {
+ process.stderr.write(`[portki] doctor
+
+ `);
+ let allOk = true;
+ const nodeVersion = process.versions.node;
+ const major = parseInt(nodeVersion.split(".")[0], 10);
+ const nodeOk = major >= 20;
+ process.stderr.write(
+ ` ${nodeOk ? "ok" : "FAIL"} Node.js ${nodeVersion}${nodeOk ? "" : " (requires >= 20)"}
+ `
+ );
+ if (!nodeOk) allOk = false;
+ const outputDir = path16.resolve("devport-output");
+ let writable = false;
+ if (existsSync6(outputDir)) {
+ try {
+ await access(outputDir, constants.W_OK);
+ writable = true;
+ } catch {
+ }
+ } else {
+ writable = true;
+ }
+ process.stderr.write(
+ ` ${writable ? "ok" : "FAIL"} devport-output/ ${existsSync6(outputDir) ? "writable" : "will be created on first run"}
+ `
+ );
+ if (!writable) allOk = false;
+ const chunkedDir = path16.resolve("devport-output/chunked");
+ let sessionCount = 0;
+ if (existsSync6(chunkedDir)) {
+ try {
+ const owners = await readdir(chunkedDir, { withFileTypes: true });
+ for (const owner of owners) {
+ if (!owner.isDirectory()) continue;
+ const repos = await readdir(path16.join(chunkedDir, owner.name), { withFileTypes: true });
+ for (const repo of repos) {
+ if (!repo.isDirectory()) continue;
+ const sessionPath = path16.join(chunkedDir, owner.name, repo.name, "session.json");
+ if (existsSync6(sessionPath)) sessionCount++;
+ }
+ }
+ } catch {
+ }
+ }
+ process.stderr.write(` ok Active sessions: ${sessionCount}
+ `);
+ const hasToken = !!process.env["GITHUB_TOKEN"];
+ process.stderr.write(
+ ` ${hasToken ? "ok" : "info"} GITHUB_TOKEN: ${hasToken ? "set" : "not set (optional, public repos work without it)"}
+ `
+ );
+ process.stderr.write(`
+ ${allOk ? "All checks passed." : "Some checks failed."}
+
+ `);
+ if (!allOk) {
+ process.exitCode = 1;
+ }
+ }
+
+ // src/commands/install.ts
4372
+ import { mkdir as mkdir3, writeFile as writeFile4, readFile as readFile3, appendFile } from "node:fs/promises";
4373
+ import { existsSync as existsSync7 } from "node:fs";
4374
+ import path17 from "node:path";
4375
+ var CLAUDE_TEMPLATE = `# portki \u2014 Generate Korean Wiki
4376
+
4377
+ GitHub \uC800\uC7A5\uC18C\uC758 \uD55C\uAD6D\uC5B4 \uC704\uD0A4\uB97C \uC0DD\uC131\uD569\uB2C8\uB2E4.
4378
+
4379
+ ## \uC2E4\uD589
4380
+
4381
+ \`\`\`
4382
+ portki $ARGUMENTS
4383
+ \`\`\`
4384
+
4385
+ ## \uC2E4\uD589 \uD6C4 \uD574\uC57C \uD560 \uC77C
4386
+
4387
+ portki\uB97C \uC2E4\uD589\uD558\uBA74 stdout\uC5D0 handoff.md \uACBD\uB85C\uAC00 \uCD9C\uB825\uB429\uB2C8\uB2E4. \uBC18\uB4DC\uC2DC \uC544\uB798 \uC21C\uC11C\uB97C \uB530\uB974\uC138\uC694:
4388
+
4389
+ 1. **handoff.md\uB97C \uC77D\uC2B5\uB2C8\uB2E4.** \uD30C\uC774\uD504\uB77C\uC778 \uC0C1\uD0DC, \uC800\uC7A5\uC18C \uD504\uB85C\uD544, \uC2A4\uB0C5\uC0F7 \uACBD\uB85C, \uC791\uC131 \uADDC\uCE59\uC774 \uBAA8\uB450 \uB4E4\uC5B4 \uC788\uC2B5\uB2C8\uB2E4.
4390
+ 2. **\uC2A4\uB0C5\uC0F7\uC758 \uC8FC\uC694 \uC18C\uC2A4 \uD30C\uC77C\uC744 \uC77D\uC2B5\uB2C8\uB2E4.** handoff.md\uC758 "Key Paths to Read" \uC139\uC158\uC5D0 \uB098\uC5F4\uB41C \uD30C\uC77C\uB4E4\uC744 \uC6B0\uC120\uC73C\uB85C \uC77D\uC73C\uC138\uC694.
4391
+ 3. **section-plan.json\uC744 \uC791\uC131\uD569\uB2C8\uB2E4.** handoff.md\uC758 \uACBD\uB85C\uC640 \uC2A4\uD0A4\uB9C8\uB97C \uB530\uB974\uC138\uC694. 4~6\uAC1C \uC139\uC158, \uAC01 \uC139\uC158 3\uAC1C \uC774\uC0C1\uC758 \uC11C\uBE0C\uC139\uC158.
4392
+ 4. **\`portki validate-plan\`\uC744 \uC2E4\uD589\uD569\uB2C8\uB2E4.** handoff.md\uC5D0 \uC801\uD78C \uC815\uD655\uD55C \uBA85\uB839\uC5B4\uB97C \uC0AC\uC6A9\uD558\uC138\uC694.
4393
+ 5. **\uAC01 \uC139\uC158\uC744 \uC791\uC131\uD569\uB2C8\uB2E4.** \uC139\uC158 \uD50C\uB79C\uC758 focusPaths\uC5D0 \uC788\uB294 \uD30C\uC77C\uB4E4\uC744 \uC77D\uACE0, section output JSON\uC744 \uC791\uC131\uD55C \uB4A4 \`portki persist-section\`\uC73C\uB85C \uAC80\uC99D\uD569\uB2C8\uB2E4. \uBAA8\uB4E0 \uC139\uC158\uC744 \uBC18\uBCF5\uD569\uB2C8\uB2E4.
4394
+ 6. **\`portki finalize\`\uB97C \uC2E4\uD589\uD569\uB2C8\uB2E4.** handoff.md\uC5D0 \uC801\uD78C \uC815\uD655\uD55C \uBA85\uB839\uC5B4\uB97C \uC0AC\uC6A9\uD558\uC138\uC694.
4395
+
4396
+ ## \uC791\uC131 \uADDC\uCE59
4397
+
4398
+ - \uBAA8\uB4E0 \uD55C\uAD6D\uC5B4 \uBB38\uC7A5\uC740 \`\uD569\uB2C8\uB2E4\` \uCCB4\uB97C \uC0AC\uC6A9\uD569\uB2C8\uB2E4.
4399
+ - \uAC01 bodyKo\uB294 \uCD5C\uC18C 3,000\uC790 \uC774\uC0C1\uC774\uC5B4\uC57C \uD569\uB2C8\uB2E4.
4400
+ - \uC804\uCCB4 \uC704\uD0A4\uC5D0 \uCD5C\uC18C \uD558\uB098\uC758 Mermaid \uC544\uD0A4\uD14D\uCC98 \uBE14\uB85D\uC744 \uD3EC\uD568\uD569\uB2C8\uB2E4.
4401
+ - sourcePaths\uB294 \uBC18\uB4DC\uC2DC \uC2A4\uB0C5\uC0F7\uC5D0 \uC2E4\uC81C\uB85C \uC874\uC7AC\uD558\uB294 \uACBD\uB85C\uC5EC\uC57C \uD569\uB2C8\uB2E4.
4402
+ - sub-1-1\uC740 \uD504\uB85C\uC81D\uD2B8 \uC18C\uAC1C \uC694\uC57D\uC785\uB2C8\uB2E4 (\uBAA9\uC801, \uBB38\uC81C \uC601\uC5ED, \uC2DC\uC791 \uAC00\uC774\uB4DC, \uB098\uBA38\uC9C0 \uC139\uC158 \uC548\uB0B4).
4403
+ - \uCF54\uB4DC \uCF5C\uD50C\uB85C\uC6B0 \uBD84\uC11D\uC740 sub-1-2\uBD80\uD130 \uC2DC\uC791\uD569\uB2C8\uB2E4.
+ - \uBC18\uBCF5\uC801\uC778 \uD544\uB7EC \uBB38\uC7A5\uC744 \uC0AC\uC6A9\uD558\uC9C0 \uB9C8\uC138\uC694.
+ `;
+ var CODEX_TEMPLATE = `
+ ## portki \u2014 Korean Wiki Generator
+
+ GitHub \uC800\uC7A5\uC18C\uC758 \uD55C\uAD6D\uC5B4 \uC704\uD0A4\uB97C \uC0DD\uC131\uD569\uB2C8\uB2E4.
+
+ ### \uC2E4\uD589
+
+ \`\`\`bash
+ portki owner/repo
+ \`\`\`
+
+ ### \uC2E4\uD589 \uD6C4 \uD574\uC57C \uD560 \uC77C
+
+ portki\uB97C \uC2E4\uD589\uD558\uBA74 stdout\uC5D0 handoff.md \uACBD\uB85C\uAC00 \uCD9C\uB825\uB429\uB2C8\uB2E4. \uBC18\uB4DC\uC2DC \uC544\uB798 \uC21C\uC11C\uB97C \uB530\uB974\uC138\uC694:
+
+ 1. **handoff.md\uB97C \uC77D\uC2B5\uB2C8\uB2E4.** \uD30C\uC774\uD504\uB77C\uC778 \uC0C1\uD0DC, \uC800\uC7A5\uC18C \uD504\uB85C\uD544, \uC2A4\uB0C5\uC0F7 \uACBD\uB85C, \uC791\uC131 \uADDC\uCE59\uC774 \uBAA8\uB450 \uB4E4\uC5B4 \uC788\uC2B5\uB2C8\uB2E4.
+ 2. **\uC2A4\uB0C5\uC0F7\uC758 \uC8FC\uC694 \uC18C\uC2A4 \uD30C\uC77C\uC744 \uC77D\uC2B5\uB2C8\uB2E4.** handoff.md\uC758 "Key Paths to Read" \uC139\uC158\uC5D0 \uB098\uC5F4\uB41C \uD30C\uC77C\uB4E4\uC744 \uC6B0\uC120\uC73C\uB85C \uC77D\uC73C\uC138\uC694.
+ 3. **section-plan.json\uC744 \uC791\uC131\uD569\uB2C8\uB2E4.** handoff.md\uC758 \uACBD\uB85C\uC640 \uC2A4\uD0A4\uB9C8\uB97C \uB530\uB974\uC138\uC694. 4~6\uAC1C \uC139\uC158, \uAC01 \uC139\uC158 3\uAC1C \uC774\uC0C1\uC758 \uC11C\uBE0C\uC139\uC158.
+ 4. **\`portki validate-plan\`\uC744 \uC2E4\uD589\uD569\uB2C8\uB2E4.** handoff.md\uC5D0 \uC801\uD78C \uC815\uD655\uD55C \uBA85\uB839\uC5B4\uB97C \uC0AC\uC6A9\uD558\uC138\uC694.
+ 5. **\uAC01 \uC139\uC158\uC744 \uC791\uC131\uD569\uB2C8\uB2E4.** \uC139\uC158 \uD50C\uB79C\uC758 focusPaths\uC5D0 \uC788\uB294 \uD30C\uC77C\uB4E4\uC744 \uC77D\uACE0, section output JSON\uC744 \uC791\uC131\uD55C \uB4A4 \`portki persist-section\`\uC73C\uB85C \uAC80\uC99D\uD569\uB2C8\uB2E4. \uBAA8\uB4E0 \uC139\uC158\uC744 \uBC18\uBCF5\uD569\uB2C8\uB2E4.
+ 6. **\`portki finalize\`\uB97C \uC2E4\uD589\uD569\uB2C8\uB2E4.** handoff.md\uC5D0 \uC801\uD78C \uC815\uD655\uD55C \uBA85\uB839\uC5B4\uB97C \uC0AC\uC6A9\uD558\uC138\uC694.
+
+ ### \uBCF4\uC870 \uBA85\uB839\uC5B4
+
+ - \`portki status owner/repo\` \u2014 \uC9C4\uD589 \uC0C1\uD669 \uD655\uC778
+ - \`portki resume owner/repo\` \u2014 \uC911\uB2E8\uB41C \uC791\uC5C5 \uC774\uC5B4\uC11C \uD558\uAE30
+
+ ### \uC791\uC131 \uADDC\uCE59
+
+ - \uBAA8\uB4E0 \uD55C\uAD6D\uC5B4 \uBB38\uC7A5\uC740 \`\uD569\uB2C8\uB2E4\` \uCCB4\uB97C \uC0AC\uC6A9\uD569\uB2C8\uB2E4.
+ - \uAC01 bodyKo\uB294 \uCD5C\uC18C 3,000\uC790 \uC774\uC0C1\uC774\uC5B4\uC57C \uD569\uB2C8\uB2E4.
+ - \uC804\uCCB4 \uC704\uD0A4\uC5D0 \uCD5C\uC18C \uD558\uB098\uC758 Mermaid \uC544\uD0A4\uD14D\uCC98 \uBE14\uB85D\uC744 \uD3EC\uD568\uD569\uB2C8\uB2E4.
+ - sourcePaths\uB294 \uBC18\uB4DC\uC2DC \uC2A4\uB0C5\uC0F7\uC5D0 \uC2E4\uC81C\uB85C \uC874\uC7AC\uD558\uB294 \uACBD\uB85C\uC5EC\uC57C \uD569\uB2C8\uB2E4.
+ - sub-1-1\uC740 \uD504\uB85C\uC81D\uD2B8 \uC18C\uAC1C \uC694\uC57D\uC785\uB2C8\uB2E4 (\uBAA9\uC801, \uBB38\uC81C \uC601\uC5ED, \uC2DC\uC791 \uAC00\uC774\uB4DC, \uB098\uBA38\uC9C0 \uC139\uC158 \uC548\uB0B4).
+ - \uCF54\uB4DC \uCF5C\uD50C\uB85C\uC6B0 \uBD84\uC11D\uC740 sub-1-2\uBD80\uD130 \uC2DC\uC791\uD569\uB2C8\uB2E4.
+ - \uBC18\uBCF5\uC801\uC778 \uD544\uB7EC \uBB38\uC7A5\uC744 \uC0AC\uC6A9\uD558\uC9C0 \uB9C8\uC138\uC694.
+ `;
+ var GEMINI_TEMPLATE = `# portki \u2014 Generate Korean Wiki
+
+ GitHub \uC800\uC7A5\uC18C\uC758 \uD55C\uAD6D\uC5B4 \uC704\uD0A4\uB97C \uC0DD\uC131\uD569\uB2C8\uB2E4.
+
+ ## \uC2E4\uD589
+
+ \`\`\`
+ portki $ARGUMENTS
+ \`\`\`
+
+ ## \uC2E4\uD589 \uD6C4 \uD574\uC57C \uD560 \uC77C
+
+ portki\uB97C \uC2E4\uD589\uD558\uBA74 stdout\uC5D0 handoff.md \uACBD\uB85C\uAC00 \uCD9C\uB825\uB429\uB2C8\uB2E4. \uBC18\uB4DC\uC2DC \uC544\uB798 \uC21C\uC11C\uB97C \uB530\uB974\uC138\uC694:
+
+ 1. **handoff.md\uB97C \uC77D\uC2B5\uB2C8\uB2E4.** \uD30C\uC774\uD504\uB77C\uC778 \uC0C1\uD0DC, \uC800\uC7A5\uC18C \uD504\uB85C\uD544, \uC2A4\uB0C5\uC0F7 \uACBD\uB85C, \uC791\uC131 \uADDC\uCE59\uC774 \uBAA8\uB450 \uB4E4\uC5B4 \uC788\uC2B5\uB2C8\uB2E4.
+ 2. **\uC2A4\uB0C5\uC0F7\uC758 \uC8FC\uC694 \uC18C\uC2A4 \uD30C\uC77C\uC744 \uC77D\uC2B5\uB2C8\uB2E4.** handoff.md\uC758 "Key Paths to Read" \uC139\uC158\uC5D0 \uB098\uC5F4\uB41C \uD30C\uC77C\uB4E4\uC744 \uC6B0\uC120\uC73C\uB85C \uC77D\uC73C\uC138\uC694.
+ 3. **section-plan.json\uC744 \uC791\uC131\uD569\uB2C8\uB2E4.** handoff.md\uC758 \uACBD\uB85C\uC640 \uC2A4\uD0A4\uB9C8\uB97C \uB530\uB974\uC138\uC694. 4~6\uAC1C \uC139\uC158, \uAC01 \uC139\uC158 3\uAC1C \uC774\uC0C1\uC758 \uC11C\uBE0C\uC139\uC158.
+ 4. **\`portki validate-plan\`\uC744 \uC2E4\uD589\uD569\uB2C8\uB2E4.** handoff.md\uC5D0 \uC801\uD78C \uC815\uD655\uD55C \uBA85\uB839\uC5B4\uB97C \uC0AC\uC6A9\uD558\uC138\uC694.
+ 5. **\uAC01 \uC139\uC158\uC744 \uC791\uC131\uD569\uB2C8\uB2E4.** \uC139\uC158 \uD50C\uB79C\uC758 focusPaths\uC5D0 \uC788\uB294 \uD30C\uC77C\uB4E4\uC744 \uC77D\uACE0, section output JSON\uC744 \uC791\uC131\uD55C \uB4A4 \`portki persist-section\`\uC73C\uB85C \uAC80\uC99D\uD569\uB2C8\uB2E4. \uBAA8\uB4E0 \uC139\uC158\uC744 \uBC18\uBCF5\uD569\uB2C8\uB2E4.
+ 6. **\`portki finalize\`\uB97C \uC2E4\uD589\uD569\uB2C8\uB2E4.** handoff.md\uC5D0 \uC801\uD78C \uC815\uD655\uD55C \uBA85\uB839\uC5B4\uB97C \uC0AC\uC6A9\uD558\uC138\uC694.
+
+ ## \uC791\uC131 \uADDC\uCE59
+
+ - \uBAA8\uB4E0 \uD55C\uAD6D\uC5B4 \uBB38\uC7A5\uC740 \`\uD569\uB2C8\uB2E4\` \uCCB4\uB97C \uC0AC\uC6A9\uD569\uB2C8\uB2E4.
+ - \uAC01 bodyKo\uB294 \uCD5C\uC18C 3,000\uC790 \uC774\uC0C1\uC774\uC5B4\uC57C \uD569\uB2C8\uB2E4.
+ - \uC804\uCCB4 \uC704\uD0A4\uC5D0 \uCD5C\uC18C \uD558\uB098\uC758 Mermaid \uC544\uD0A4\uD14D\uCC98 \uBE14\uB85D\uC744 \uD3EC\uD568\uD569\uB2C8\uB2E4.
+ - sourcePaths\uB294 \uBC18\uB4DC\uC2DC \uC2A4\uB0C5\uC0F7\uC5D0 \uC2E4\uC81C\uB85C \uC874\uC7AC\uD558\uB294 \uACBD\uB85C\uC5EC\uC57C \uD569\uB2C8\uB2E4.
+ - sub-1-1\uC740 \uD504\uB85C\uC81D\uD2B8 \uC18C\uAC1C \uC694\uC57D\uC785\uB2C8\uB2E4 (\uBAA9\uC801, \uBB38\uC81C \uC601\uC5ED, \uC2DC\uC791 \uAC00\uC774\uB4DC, \uB098\uBA38\uC9C0 \uC139\uC158 \uC548\uB0B4).
+ - \uCF54\uB4DC \uCF5C\uD50C\uB85C\uC6B0 \uBD84\uC11D\uC740 sub-1-2\uBD80\uD130 \uC2DC\uC791\uD569\uB2C8\uB2E4.
+ - \uBC18\uBCF5\uC801\uC778 \uD544\uB7EC \uBB38\uC7A5\uC744 \uC0AC\uC6A9\uD558\uC9C0 \uB9C8\uC138\uC694.
+ `;
+ async function installCommand(flags) {
+ const agent = flags["agent"];
+ if (!agent) {
+ throw new Error("Usage: portki install --agent claude|codex|gemini");
+ }
+ const normalized = agent.toLowerCase();
+ switch (normalized) {
+ case "claude": {
+ const dir = path17.resolve(".claude", "commands");
+ const filePath = path17.join(dir, "portki.md");
+ await mkdir3(dir, { recursive: true });
+ await writeFile4(filePath, CLAUDE_TEMPLATE, "utf8");
+ process.stderr.write(`[portki] Installed Claude adapter:
+ ${filePath}
+ `);
+ process.stderr.write(` Use /portki owner/repo in Claude Code.
+
+ `);
+ break;
+ }
+ case "codex": {
+ const agentsFile = path17.resolve("AGENTS.md");
+ if (existsSync7(agentsFile)) {
+ const existing = await readFile3(agentsFile, "utf8");
+ if (existing.includes("portki")) {
+ process.stderr.write(`[portki] AGENTS.md already contains portki section. Skipping.
+
+ `);
+ return;
+ }
+ await appendFile(agentsFile, CODEX_TEMPLATE, "utf8");
+ } else {
+ await writeFile4(agentsFile, CODEX_TEMPLATE.trimStart(), "utf8");
+ }
+ process.stderr.write(`[portki] Installed Codex adapter:
+ ${agentsFile}
+
+ `);
+ break;
+ }
+ case "gemini": {
+ const dir = path17.resolve(".gemini", "commands");
+ const filePath = path17.join(dir, "portki.md");
+ await mkdir3(dir, { recursive: true });
+ await writeFile4(filePath, GEMINI_TEMPLATE, "utf8");
+ process.stderr.write(`[portki] Installed Gemini adapter:
+ ${filePath}
+ `);
+ process.stderr.write(` Use /portki owner/repo in Gemini.
+
+ `);
+ break;
+ }
+ default:
+ throw new Error(`Unknown agent: ${agent}. Supported: claude, codex, gemini`);
+ }
+ }
+
  // src/agent.ts
  function getQualityGateLevel(env) {
  const val = env.DEVPORT_QUALITY_GATE_LEVEL;
@@ -3990,7 +4586,7 @@ async function ingestCommand(flags) {
  `);
  const artifact = await runIngest({
  repo_ref: { repo, ...ref ? { ref } : {} },
- snapshot_root: path13.resolve(snapshotRoot),
+ snapshot_root: path18.resolve(snapshotRoot),
  force_rebuild: flags["force_rebuild"] === "true"
  });
  const cacheLabel = artifact.idempotent_hit ? "cache hit" : "downloaded";
@@ -4003,9 +4599,9 @@ async function ingestCommand(flags) {
  const json = `${JSON.stringify(artifact, null, 2)}
  `;
  if (outFile) {
- const outPath = path13.resolve(outFile);
- await mkdir2(path13.dirname(outPath), { recursive: true });
- await writeFile2(outPath, json, "utf8");
+ const outPath = path18.resolve(outFile);
+ await mkdir4(path18.dirname(outPath), { recursive: true });
+ await writeFile5(outPath, json, "utf8");
  process.stderr.write(` artifact \u2192 ${outPath}
  `);
  } else {
@@ -4097,7 +4693,7 @@ async function packageCommand(flags) {
  const qualityGateLevel = resolveQualityGateLevel(flags, getQualityGateLevel(process.env));
  let raw;
  if (inputFile) {
- raw = await readFile2(path13.resolve(inputFile), "utf8");
+ raw = await readFile4(path18.resolve(inputFile), "utf8");
  } else {
  raw = await readStdin();
  }
@@ -4150,7 +4746,7 @@ async function packageCommand(flags) {
  async function planSectionsCommand(flags) {
  const artifactFile = requireFlag(flags, "artifact");
  const outFile = flags["out"];
- const raw = await readFile2(path13.resolve(artifactFile), "utf8");
+ const raw = await readFile4(path18.resolve(artifactFile), "utf8");
  const artifact = ingestRunArtifactSchema.parse(JSON.parse(raw));
  process.stderr.write(
  `[devport-agent] plan-sections: ${artifact.repo_ref} (${fmtNum(artifact.files_scanned)} files)
@@ -4167,9 +4763,9 @@ async function planSectionsCommand(flags) {
  const json = `${JSON.stringify(context, null, 2)}
  `;
  if (outFile) {
- const outPath = path13.resolve(outFile);
- await mkdir2(path13.dirname(outPath), { recursive: true });
- await writeFile2(outPath, json, "utf8");
+ const outPath = path18.resolve(outFile);
+ await mkdir4(path18.dirname(outPath), { recursive: true });
+ await writeFile5(outPath, json, "utf8");
  process.stderr.write(` context \u2192 ${outPath}
  `);
  } else {
@@ -4180,9 +4776,9 @@ async function validatePlanCommand(flags) {
  const inputFile = requireFlag(flags, "input");
  const contextFile = requireFlag(flags, "context");
  const outFile = flags["out"];
- const contextRaw = await readFile2(path13.resolve(contextFile), "utf8");
+ const contextRaw = await readFile4(path18.resolve(contextFile), "utf8");
  const context = PlanContextSchema.parse(JSON.parse(contextRaw));
- const planRaw = await readFile2(path13.resolve(inputFile), "utf8");
+ const planRaw = await readFile4(path18.resolve(inputFile), "utf8");
  const planJson = JSON.parse(planRaw);
  process.stderr.write(
  `[devport-agent] validate-plan: ${context.repoFullName}
@@ -4204,9 +4800,9 @@ async function validatePlanCommand(flags) {
  const json = `${JSON.stringify(validated, null, 2)}
  `;
  if (outFile) {
- const outPath = path13.resolve(outFile);
- await mkdir2(path13.dirname(outPath), { recursive: true });
- await writeFile2(outPath, json, "utf8");
+ const outPath = path18.resolve(outFile);
+ await mkdir4(path18.dirname(outPath), { recursive: true });
+ await writeFile5(outPath, json, "utf8");
  process.stderr.write(` validated plan \u2192 ${outPath}
  `);
  } else {
@@ -4218,9 +4814,9 @@ async function persistSectionCommand(flags) {
  const sectionId = requireFlag(flags, "section");
  const inputFile = requireFlag(flags, "input");
  const sessionFile = flags["session"];
- const planRaw = await readFile2(path13.resolve(planFile), "utf8");
+ const planRaw = await readFile4(path18.resolve(planFile), "utf8");
  const plan = SectionPlanOutputSchema.parse(JSON.parse(planRaw));
- const sectionRaw = await readFile2(path13.resolve(inputFile), "utf8");
+ const sectionRaw = await readFile4(path18.resolve(inputFile), "utf8");
  const sectionOutput = SectionOutputSchema.parse(JSON.parse(sectionRaw));
  if (sectionOutput.sectionId !== sectionId) {
  throw new Error(
@@ -4244,7 +4840,7 @@ async function persistSectionCommand(flags) {
  }
  process.stderr.write(` \u2713 section validation passed
  `);
- const sessionPath = sessionFile ? path13.resolve(sessionFile) : sessionPathForRepo(plan.repoFullName);
+ const sessionPath = sessionFile ? path18.resolve(sessionFile) : sessionPathForRepo(plan.repoFullName);
  let session = await loadSession(sessionPath);
  if (!session) {
  session = initSession(plan, planFile);
@@ -4256,7 +4852,7 @@ async function persistSectionCommand(flags) {
  koreanChars += sub.bodyKo.length;
  }
  session = markSectionPersisted(session, sectionId, {
- sectionOutputPath: path13.resolve(inputFile),
+ sectionOutputPath: path18.resolve(inputFile),
  chunksInserted: 0,
  claimCount: 0,
  citationCount: 0,
@@ -4282,9 +4878,9 @@ async function finalizeCommand(flags) {
  const statePath = flags["state_path"] ?? "devport-output/freshness/state.json";
  const deleteSnapshot = flags["delete_snapshot"] === "true";
  const outDir = flags["out_dir"] ?? "devport-output/wiki";
- const planRaw = await readFile2(path13.resolve(planFile), "utf8");
+ const planRaw = await readFile4(path18.resolve(planFile), "utf8");
  const plan = SectionPlanOutputSchema.parse(JSON.parse(planRaw));
- const sessionPath = sessionFile ? path13.resolve(sessionFile) : sessionPathForRepo(plan.repoFullName);
+ const sessionPath = sessionFile ? path18.resolve(sessionFile) : sessionPathForRepo(plan.repoFullName);
  const session = await loadSession(sessionPath);
  if (!session) {
  throw new Error(
@@ -4327,72 +4923,57 @@ function printHelp() {
  "portki \u2014 public CLI for generating Korean Markdown wikis from GitHub repositories",
  "Run as an installed package with `portki <command>` or from source with `npx tsx src/agent.ts <command>`.",
  "",
- "Commands:",
- " ingest Snapshot a GitHub repo and emit the ingest artifact",
- " --repo owner/repo (required)",
- " --ref branch|sha (optional, uses default branch if omitted)",
- " --out artifact.json (optional, prints to stdout if omitted)",
- " --snapshot_root (default: devport-output/snapshots)",
- " --force_rebuild (re-download even if cache is valid)",
+ "Quick start:",
+ " portki owner/repo Ingest repo + generate handoff.md for AI agents",
+ "",
+ "High-level commands:",
+ " owner/repo Ingest a repo, generate plan context and handoff.md",
+ " status Show pipeline progress portki status owner/repo",
+ " resume Regenerate handoff from state portki resume owner/repo",
+ " doctor Environment health check portki doctor",
+ " install Install agent adapter portki install --agent claude|codex|gemini",
+ "",
+ "Low-level commands:",
+ " ingest Snapshot a GitHub repo and emit the ingest artifact",
+ " --repo owner/repo (required)",
+ " --ref branch|sha (optional, uses default branch if omitted)",
+ " --out artifact.json (optional, prints to stdout if omitted)",
+ " --snapshot_root (default: devport-output/snapshots)",
+ " --force_rebuild (re-download even if cache is valid)",
  "",
- " detect Detect what changed since the last delivery",
- " --repo owner/repo (required)",
- " --state_path (default: devport-output/freshness/state.json)",
- " stdout: { status, changed_paths, impacted_section_ids, ... }",
- " status values: noop | incremental | full-rebuild",
+ " detect Detect what changed since the last delivery",
+ " --repo owner/repo (required)",
+ " --state_path (default: devport-output/freshness/state.json)",
  "",
- " package Validate AI-generated GroundedAcceptedOutput, write markdown wiki files",
- " --input accepted-output.json (optional, reads stdin if omitted)",
- " --out_dir (default: devport-output/wiki)",
- " --quality_gate_level standard|strict (default from DEVPORT_QUALITY_GATE_LEVEL)",
- " --advance_baseline save freshness state for future detect",
- " --state_path (default: devport-output/freshness/state.json)",
+ " package Validate AI-generated output, write markdown wiki files",
+ " --input accepted-output.json (optional, reads stdin if omitted)",
+ " --out_dir (default: devport-output/wiki)",
+ " --quality_gate_level standard|strict",
+ " --advance_baseline save freshness state for future detect",
  "",
- " plan-sections Analyze repo and produce planning context for AI section generation",
- " --artifact artifact.json (required)",
- " --out plan-context.json (optional, prints to stdout if omitted)",
+ " plan-sections Analyze repo and produce planning context",
+ " --artifact artifact.json (required)",
+ " --out plan-context.json (optional, prints to stdout if omitted)",
  "",
- " validate-plan Validate an AI-generated section plan against the schema",
- " --input section-plan.json (required)",
- " --context plan-context.json (required)",
- " --out section-plan.json (optional, prints to stdout if omitted)",
+ " validate-plan Validate an AI-generated section plan",
+ " --input section-plan.json (required)",
+ " --context plan-context.json (required)",
+ " --out section-plan.json (optional)",
  "",
- " persist-section Validate a single section and register it in the local session",
+ " persist-section Validate a single section and register in session",
  " --plan section-plan.json (required)",
  " --section sec-1 (required)",
  " --input section-1.json (required)",
- " --quality_gate_level standard|strict (default from DEVPORT_QUALITY_GATE_LEVEL)",
- " --session session.json (optional, auto-derived from repo name)",
  "",
- " finalize Cross-validate all sections and write the final markdown wiki bundle",
- " --plan section-plan.json (required)",
- " --session session.json (optional, auto-derived from repo name)",
- " --out_dir (default: devport-output/wiki)",
- " --advance_baseline save freshness state for future detect",
- " --state_path (default: devport-output/freshness/state.json)",
- " --delete_snapshot delete snapshot directory after successful finalize",
- "",
- "First-run workflow (monolithic):",
- " 1. portki ingest --repo owner/repo --out artifact.json",
- " 2. AI reads artifact.json + files under snapshot_path, generates GroundedAcceptedOutput",
- " 3. portki package --input accepted-output.json --advance_baseline",
- "",
- "Chunked workflow (higher quality, section-at-a-time):",
- " 1. portki ingest --repo owner/repo --out artifact.json",
- " 2. portki plan-sections --artifact artifact.json --out plan-context.json",
- " 3. AI reads plan-context.json + README + code, generates section-plan.json",
- " 4. portki validate-plan --input section-plan.json --context plan-context.json --out section-plan.json",
- " 5. For each section: AI reads focus files, writes section-N.json",
- " portki persist-section --plan section-plan.json --section sec-N --input section-N.json",
- " 6. portki finalize --plan section-plan.json --advance_baseline",
- " \u2192 writes README.md + section markdown files under devport-output/wiki/{owner}/{repo}/",
+ " finalize Cross-validate all sections and write final wiki",
+ " --plan section-plan.json (required)",
+ " --advance_baseline save freshness state",
  "",
- "Incremental update workflow:",
- " 1. portki detect --repo owner/repo",
- " \u2192 noop: done. incremental/full-rebuild: continue below",
- " 2. portki ingest --repo owner/repo --out artifact.json",
- " 3. AI regenerates (all or only impacted sections) \u2192 accepted-output.json",
- " 4. portki package --input accepted-output.json --advance_baseline",
+ "Recommended workflow:",
+ " 1. portki owner/repo \u2192 generates handoff.md",
+ " 2. AI reads handoff.md and follows step-by-step instructions",
+ " 3. portki status owner/repo \u2192 check progress anytime",
+ " 4. portki resume owner/repo \u2192 recover from interruptions",
  ""
  ].join("\n")
  );
@@ -4407,6 +4988,22 @@ async function main() {
  return;
  }
  const flags = parseFlags(argv.slice(1));
+ if (command === "install") {
+ await installCommand(flags);
+ return;
+ }
+ if (command === "doctor") {
+ await doctorCommand();
+ return;
+ }
+ if (command === "status") {
+ await statusCommand(argv.slice(1));
+ return;
+ }
+ if (command === "resume") {
+ await resumeCommand(argv.slice(1));
+ return;
+ }
  if (command === "ingest") {
  await ingestCommand(flags);
  return;
@@ -4435,7 +5032,11 @@ async function main() {
  await finalizeCommand(flags);
  return;
  }
- process.stderr.write(`[devport-agent] unknown command: ${command}
+ if (command.includes("/")) {
+ await wikiCommand(command, flags);
+ return;
+ }
+ process.stderr.write(`[portki] unknown command: ${command}
  `);
  printHelp();
  process.exitCode = 1;
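The new fallback in `main()` is what makes `portki facebook/react` work without a named subcommand: known subcommands are checked first, and any remaining first argument containing a slash is treated as an `owner/repo` reference. A minimal standalone sketch of that routing rule (a rework for illustration; `route` and the `KNOWN` set are my own names, not exports of the bundled CLI):

```javascript
// Known subcommands are matched before the owner/repo fallback, mirroring
// the if-chain added to main() in this release.
const KNOWN = new Set([
  "install", "doctor", "status", "resume",
  "ingest", "detect", "package", "plan-sections",
  "validate-plan", "persist-section", "finalize",
]);

function route(command) {
  if (KNOWN.has(command)) return { kind: "subcommand", name: command };
  // Anything with a slash is interpreted as a GitHub owner/repo reference.
  if (command.includes("/")) return { kind: "wiki", repo: command };
  return { kind: "unknown", name: command };
}

console.log(route("facebook/react").kind); // "wiki"
console.log(route("status").kind);         // "subcommand"
console.log(route("frobnicate").kind);     // "unknown"
```

Checking subcommands before the slash fallback keeps the two namespaces from colliding, and misspelled commands still fall through to the help text with a non-zero exit code.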
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@devport-kr/portki",
- "version": "0.1.1",
+ "version": "0.2.0",
  "description": "Public markdown-only CLI for generating Korean repository wikis from GitHub codebases",
  "type": "module",
  "private": false,
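The `codex` branch of the new `installCommand` is idempotent: it appends its template to an existing `AGENTS.md` only when the file does not already mention portki, and trims the template's leading newline when creating the file fresh. That merge rule can be sketched as a pure function (a hypothetical `mergeAgentsFile` helper for illustration, not part of the package):

```javascript
// Merge rule sketched from the "codex" case of installCommand:
// - no AGENTS.md yet            -> write the template, leading newline trimmed
// - file already mentions portki -> leave it untouched (idempotent install)
// - otherwise                   -> append the template as a new section
function mergeAgentsFile(existing, template) {
  if (existing === null) return template.trimStart();
  if (existing.includes("portki")) return existing;
  return existing + template;
}

const tpl = "\n## portki \u2014 Korean Wiki Generator\n";
console.log(mergeAgentsFile(null, tpl));            // fresh file, no leading blank line
console.log(mergeAgentsFile("# My agents\n", tpl)); // original content plus appended section
```

This is why running `portki install --agent codex` a second time prints the "already contains portki section" notice instead of duplicating the block.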