@biaoo/tiangong-wiki 0.2.2 → 0.3.0

package/README.md CHANGED
@@ -99,7 +99,112 @@ tiangong-wiki dashboard # open dashboard in browse
99
99
  # or: tiangong-wiki daemon run # run the daemon in the foreground for debugging
100
100
  ```
101
101
 
102
- > Environment variables are managed via `.wiki.env` (created by `tiangong-wiki setup`). The CLI prefers the nearest local `.wiki.env`, then falls back to the global default workspace config. See [references/troubleshooting.md](./references/troubleshooting.md) for the full reference.
102
+ > Environment variables are managed via `.wiki.env` (created by `tiangong-wiki setup`). The CLI prefers the nearest local `.wiki.env`, then falls back to the global default workspace config. See [references/troubleshooting.md](./references/troubleshooting.md) for the full reference. For a centralized Linux + `systemd` + Nginx deployment, see [references/centralized-service-deployment.md](./references/centralized-service-deployment.md). That deployment guide now also includes Git repository / GitHub remote setup for daemon-side commit and optional auto-push.
103
+
104
+ ## MCP Server
105
+
106
+ Tiangong Wiki ships a separate MCP adapter that talks to the daemon over HTTP. It uses the MCP Streamable HTTP transport, not stdio.
107
+
108
+ Start it after the daemon is already listening:
109
+
110
+ ```bash
111
+ tiangong-wiki daemon run
112
+ ```
113
+
114
+ In another shell:
115
+
116
+ ```bash
117
+ export WIKI_DAEMON_BASE_URL=http://127.0.0.1:8787
118
+ export WIKI_MCP_HOST=127.0.0.1
119
+ export WIKI_MCP_PORT=9400
120
+ export WIKI_MCP_PATH=/mcp
121
+
122
+ tiangong-wiki-mcp-server
123
+ ```
124
+
125
+ The server prints a JSON line like:
126
+
127
+ ```json
128
+ {"status":"listening","host":"127.0.0.1","port":9400,"healthUrl":"http://127.0.0.1:9400/health","mcpUrl":"http://127.0.0.1:9400/mcp"}
129
+ ```
130
+
131
+ If you are running from a source checkout instead of a global install:
132
+
133
+ ```bash
134
+ npm install
135
+ npm run build
136
+
137
+ WIKI_DAEMON_BASE_URL=http://127.0.0.1:8787 \
138
+ WIKI_MCP_PORT=9400 \
139
+ node mcp-server/dist/index.js
140
+
141
+ # or during development
142
+ npm run dev:mcp-server
143
+ ```
144
+
145
+ MCP-side environment variables (only `WIKI_DAEMON_BASE_URL` is required; the rest have defaults):
146
+
147
+ - `WIKI_DAEMON_BASE_URL`: base URL of the wiki daemon, for example `http://127.0.0.1:8787`
148
+ - `WIKI_MCP_HOST`: bind host for the MCP HTTP server, default `127.0.0.1`
149
+ - `WIKI_MCP_PORT`: bind port for the MCP HTTP server, default random free port
150
+ - `WIKI_MCP_PATH`: MCP route path, default `/mcp`
151
+
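Of these, only `WIKI_DAEMON_BASE_URL` lacks a usable default. A minimal sketch of how such defaults can be resolved (the `resolveMcpConfig` helper is hypothetical, not the package's actual code; only the variable names come from the list above):

```javascript
// Hypothetical helper mirroring the documented defaults; the resolution
// logic itself is illustrative, not the shipped implementation.
function resolveMcpConfig(env) {
  if (!env.WIKI_DAEMON_BASE_URL) {
    throw new Error("WIKI_DAEMON_BASE_URL is required");
  }
  return {
    daemonBaseUrl: env.WIKI_DAEMON_BASE_URL,
    host: env.WIKI_MCP_HOST ?? "127.0.0.1",
    // Port 0 asks the OS for a random free port, matching the documented default.
    port: env.WIKI_MCP_PORT ? Number.parseInt(env.WIKI_MCP_PORT, 10) : 0,
    path: env.WIKI_MCP_PATH ?? "/mcp",
  };
}

const cfg = resolveMcpConfig({ WIKI_DAEMON_BASE_URL: "http://127.0.0.1:8787" });
// cfg.host is "127.0.0.1", cfg.port is 0, cfg.path is "/mcp"
```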
152
+ Bearer token notes:
153
+
154
+ - Bearer tokens are not configured in `.wiki.env`, daemon env, or MCP env
155
+ - In the current V1 deployment model, Bearer tokens live in the reverse proxy config
156
+ - See [references/examples/centralized-service/nginx-centralized-wiki.conf](./references/examples/centralized-service/nginx-centralized-wiki.conf) for the current `map $http_authorization ...` example
157
+ - In production, keep token values in a private Nginx include file such as `/etc/nginx/snippets/wiki-auth-tokens.conf`, then `include` it from the main site config instead of hardcoding secrets in the repo
158
+
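A sketch of what that proxy-side layout could look like. The `$wiki_actor_id` variable name and the token value are placeholders, and the injected headers assume the actor-header model described below; consult the shipped `nginx-centralized-wiki.conf` example for the real configuration:

```nginx
# /etc/nginx/snippets/wiki-auth-tokens.conf  (private include, never committed)
# The map directive lives at http{} level; the token and $wiki_actor_id are placeholders.
map $http_authorization $wiki_actor_id {
    default                  "";
    "Bearer demo-token-123"  "agent:demo";
}

# In the main site config:
location /mcp {
    if ($wiki_actor_id = "") { return 401; }
    proxy_set_header x-wiki-actor-id   $wiki_actor_id;
    proxy_set_header x-wiki-actor-type "agent";
    proxy_set_header x-request-id      $request_id;  # built-in nginx variable
    proxy_pass http://127.0.0.1:9400;
}
```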
159
+ ## Using the MCP From Clients
160
+
161
+ Any MCP client that supports Streamable HTTP can connect to the MCP endpoint:
162
+
163
+ - Local debug endpoint: `http://127.0.0.1:9400/mcp`
164
+ - Health check: `http://127.0.0.1:9400/health`
165
+ - Production recommendation: expose `/mcp` behind a reverse proxy and keep daemon/MCP bound to loopback
166
+
167
+ Read tools can be called directly. Write tools such as `wiki_page_create`, `wiki_page_update`, and `wiki_sync` require these headers:
168
+
169
+ - `x-wiki-actor-id`
170
+ - `x-wiki-actor-type`
171
+ - `x-request-id`
172
+
173
+ In production, the recommended model is: the client sends `Authorization: Bearer ...` to the reverse proxy, the proxy validates the token there, and the proxy injects the actor headers before forwarding to the MCP server. For local debugging without a proxy, your client must send those headers itself when calling write tools.
174
+
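When debugging locally without a proxy, it can help to check that a request will carry everything a write tool needs. The validator below is a sketch of the documented requirement, assuming only the three header names listed above; it is not the server's actual validation code:

```javascript
// Sketch: write tools require all three actor headers, present and non-empty.
const REQUIRED_ACTOR_HEADERS = ["x-wiki-actor-id", "x-wiki-actor-type", "x-request-id"];

function missingActorHeaders(headers) {
  return REQUIRED_ACTOR_HEADERS.filter((name) => {
    const value = headers[name];
    return typeof value !== "string" || value.trim().length === 0;
  });
}
```

A client can run this over its `requestInit.headers` object before calling `wiki_page_create` and friends.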
175
+ Minimal Node.js MCP client example:
176
+
177
+ ```ts
178
+ import { Client } from "@modelcontextprotocol/sdk/client/index.js";
179
+ import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";
180
+
181
+ const transport = new StreamableHTTPClientTransport(new URL("http://127.0.0.1:9400/mcp"), {
182
+ requestInit: {
183
+ headers: {
184
+ "x-wiki-actor-id": "agent:demo",
185
+ "x-wiki-actor-type": "agent",
186
+ "x-request-id": "req-demo-1",
187
+ },
188
+ },
189
+ });
190
+
191
+ const client = new Client({ name: "demo-client", version: "1.0.0" });
192
+ await client.connect(transport);
193
+
194
+ const tools = await client.listTools();
195
+ const search = await client.callTool({
196
+ name: "wiki_search",
197
+ arguments: { query: "bayes", limit: 5 },
198
+ });
199
+ ```
200
+
201
+ Current MCP tools include:
202
+
203
+ - Query: `wiki_find`, `wiki_fts`, `wiki_search`, `wiki_graph`
204
+ - Page: `wiki_page_info`, `wiki_page_read`, `wiki_page_create`, `wiki_page_update`
205
+ - Type: `wiki_type_list`, `wiki_type_show`, `wiki_type_recommend`
206
+ - Vault: `wiki_vault_list`, `wiki_vault_queue`
207
+ - Maintenance: `wiki_sync`, `wiki_lint`
103
208
 
104
209
  ## CLI
105
210
 
package/README.zh-CN.md CHANGED
@@ -99,7 +99,112 @@ tiangong-wiki dashboard # 在浏览器中打开仪
99
99
  # or: tiangong-wiki daemon run # run the daemon in the foreground for debugging
100
100
  ```
101
101
 
102
- > Environment variables are managed via `.wiki.env` (created by `tiangong-wiki setup`). The CLI prefers the nearest local `.wiki.env`, then falls back to the global default workspace config. See [references/troubleshooting.md](./references/troubleshooting.md) for the full reference.
102
+ > Environment variables are managed via `.wiki.env` (created by `tiangong-wiki setup`). The CLI prefers the nearest local `.wiki.env`, then falls back to the global default workspace config. See [references/troubleshooting.md](./references/troubleshooting.md) for the full reference. For a centralized service deployment (Linux + `systemd` + Nginx), see [references/centralized-service-deployment.md](./references/centralized-service-deployment.md). That deployment guide now also covers Git repository initialization, GitHub remote configuration, and the Git settings for daemon auto-push.
103
+
104
+ ## MCP Server
105
+
106
+ Tiangong Wiki ships a standalone MCP adapter layer that calls the daemon over HTTP. It uses the MCP Streamable HTTP transport, not stdio.
107
+
108
+ Before starting the MCP server, make sure the daemon is already listening:
109
+
110
+ ```bash
111
+ tiangong-wiki daemon run
112
+ ```
113
+
114
+ In another terminal:
115
+
116
+ ```bash
117
+ export WIKI_DAEMON_BASE_URL=http://127.0.0.1:8787
118
+ export WIKI_MCP_HOST=127.0.0.1
119
+ export WIKI_MCP_PORT=9400
120
+ export WIKI_MCP_PATH=/mcp
121
+
122
+ tiangong-wiki-mcp-server
123
+ ```
124
+
125
+ On startup it prints a single line of JSON, for example:
126
+
127
+ ```json
128
+ {"status":"listening","host":"127.0.0.1","port":9400,"healthUrl":"http://127.0.0.1:9400/health","mcpUrl":"http://127.0.0.1:9400/mcp"}
129
+ ```
130
+
131
+ If you are not using a global install but running from the source repository:
132
+
133
+ ```bash
134
+ npm install
135
+ npm run build
136
+
137
+ WIKI_DAEMON_BASE_URL=http://127.0.0.1:8787 \
138
+ WIKI_MCP_PORT=9400 \
139
+ node mcp-server/dist/index.js
140
+
141
+ # or run in development mode
142
+ npm run dev:mcp-server
143
+ ```
144
+
145
+ MCP-side environment variables (only `WIKI_DAEMON_BASE_URL` is required; the rest have defaults):
146
+
147
+ - `WIKI_DAEMON_BASE_URL`: base URL of the wiki daemon, for example `http://127.0.0.1:8787`
148
+ - `WIKI_MCP_HOST`: bind host for the MCP HTTP server, default `127.0.0.1`
149
+ - `WIKI_MCP_PORT`: bind port for the MCP HTTP server, default random free port
150
+ - `WIKI_MCP_PATH`: MCP route path, default `/mcp`
151
+
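Of these, only `WIKI_DAEMON_BASE_URL` lacks a usable default. A minimal sketch of how such defaults can be resolved (the `resolveMcpConfig` helper is hypothetical, not the package's actual code; only the variable names come from the list above):

```javascript
// Hypothetical helper mirroring the documented defaults; the resolution
// logic itself is illustrative, not the shipped implementation.
function resolveMcpConfig(env) {
  if (!env.WIKI_DAEMON_BASE_URL) {
    throw new Error("WIKI_DAEMON_BASE_URL is required");
  }
  return {
    daemonBaseUrl: env.WIKI_DAEMON_BASE_URL,
    host: env.WIKI_MCP_HOST ?? "127.0.0.1",
    // Port 0 asks the OS for a random free port, matching the documented default.
    port: env.WIKI_MCP_PORT ? Number.parseInt(env.WIKI_MCP_PORT, 10) : 0,
    path: env.WIKI_MCP_PATH ?? "/mcp",
  };
}

const cfg = resolveMcpConfig({ WIKI_DAEMON_BASE_URL: "http://127.0.0.1:8787" });
// cfg.host is "127.0.0.1", cfg.port is 0, cfg.path is "/mcp"
```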
152
+ Bearer token notes:
153
+
154
+ - Bearer tokens are not configured in `.wiki.env`, daemon env, or MCP env
155
+ - In the current V1 deployment model, Bearer tokens are configured at the reverse proxy layer
156
+ - See the `map $http_authorization ...` example in [references/examples/centralized-service/nginx-centralized-wiki.conf](./references/examples/centralized-service/nginx-centralized-wiki.conf)
157
+ - In production, keep tokens in a private Nginx include file such as `/etc/nginx/snippets/wiki-auth-tokens.conf` and `include` it from the main site config instead of hardcoding real secrets in the repo examples
158
+
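A sketch of what that proxy-side layout could look like. The `$wiki_actor_id` variable name and the token value are placeholders, and the injected headers assume the actor-header model described below; consult the shipped `nginx-centralized-wiki.conf` example for the real configuration:

```nginx
# /etc/nginx/snippets/wiki-auth-tokens.conf  (private include, never committed)
# The map directive lives at http{} level; the token and $wiki_actor_id are placeholders.
map $http_authorization $wiki_actor_id {
    default                  "";
    "Bearer demo-token-123"  "agent:demo";
}

# In the main site config:
location /mcp {
    if ($wiki_actor_id = "") { return 401; }
    proxy_set_header x-wiki-actor-id   $wiki_actor_id;
    proxy_set_header x-wiki-actor-type "agent";
    proxy_set_header x-request-id      $request_id;  # built-in nginx variable
    proxy_pass http://127.0.0.1:9400;
}
```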
159
+ ## Using the MCP From Clients
160
+
161
+ Any MCP client that supports Streamable HTTP can connect to this server:
162
+
163
+ - Local debug endpoint: `http://127.0.0.1:9400/mcp`
164
+ - Health check: `http://127.0.0.1:9400/health`
165
+ - Production recommendation: expose only the reverse-proxied `/mcp` externally; the daemon and MCP server should listen on loopback only
166
+
167
+ Read tools can be called directly. Write tools, such as `wiki_page_create`, `wiki_page_update`, and `wiki_sync`, additionally require these headers:
168
+
169
+ - `x-wiki-actor-id`
170
+ - `x-wiki-actor-type`
171
+ - `x-request-id`
172
+
173
+ The recommended production model is: the client sends only `Authorization: Bearer ...` to the reverse proxy, the proxy validates the token at the proxy layer, and the proxy injects the actor headers before forwarding to the MCP server. If you connect to the MCP server directly for local debugging, your client must supply those headers itself to call write tools.
174
+
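When debugging locally without a proxy, it can help to check that a request will carry everything a write tool needs. The validator below is a sketch of the documented requirement, assuming only the three header names listed above; it is not the server's actual validation code:

```javascript
// Sketch: write tools require all three actor headers, present and non-empty.
const REQUIRED_ACTOR_HEADERS = ["x-wiki-actor-id", "x-wiki-actor-type", "x-request-id"];

function missingActorHeaders(headers) {
  return REQUIRED_ACTOR_HEADERS.filter((name) => {
    const value = headers[name];
    return typeof value !== "string" || value.trim().length === 0;
  });
}
```

A client can run this over its `requestInit.headers` object before calling `wiki_page_create` and friends.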
175
+ Minimal Node.js MCP client example:
176
+
177
+ ```ts
178
+ import { Client } from "@modelcontextprotocol/sdk/client/index.js";
179
+ import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";
180
+
181
+ const transport = new StreamableHTTPClientTransport(new URL("http://127.0.0.1:9400/mcp"), {
182
+ requestInit: {
183
+ headers: {
184
+ "x-wiki-actor-id": "agent:demo",
185
+ "x-wiki-actor-type": "agent",
186
+ "x-request-id": "req-demo-1",
187
+ },
188
+ },
189
+ });
190
+
191
+ const client = new Client({ name: "demo-client", version: "1.0.0" });
192
+ await client.connect(transport);
193
+
194
+ const tools = await client.listTools();
195
+ const search = await client.callTool({
196
+ name: "wiki_search",
197
+ arguments: { query: "bayes", limit: 5 },
198
+ });
199
+ ```
200
+
201
+ Current MCP tools include:
202
+
203
+ - Query: `wiki_find`, `wiki_fts`, `wiki_search`, `wiki_graph`
204
+ - Page: `wiki_page_info`, `wiki_page_read`, `wiki_page_create`, `wiki_page_update`
205
+ - Type: `wiki_type_list`, `wiki_type_show`, `wiki_type_recommend`
206
+ - Vault: `wiki_vault_list`, `wiki_vault_queue`
207
+ - Maintenance: `wiki_sync`, `wiki_lint`
103
208
 
104
209
  ## CLI
105
210
 
@@ -1,4 +1,5 @@
1
1
  import { executeServerBackedOperation, requestDaemonJson } from "../daemon/client.js";
2
+ import { buildCliWriteActor } from "../daemon/write-actor.js";
2
3
  import { createPage } from "../operations/write.js";
3
4
  import { writeJson } from "../utils/output.js";
4
5
  export function registerCreateCommand(program) {
@@ -20,7 +21,9 @@ export function registerCreateCommand(program) {
20
21
  endpoint,
21
22
  method: "POST",
22
23
  path: "/create",
24
+ timeoutMs: 310_000,
23
25
  body: {
26
+ actor: buildCliWriteActor(process.env),
24
27
  type: options.type,
25
28
  title: options.title,
26
29
  nodeId: options.nodeId ?? undefined,
@@ -1,4 +1,5 @@
1
1
  import { executeServerBackedOperation, requestDaemonJson } from "../daemon/client.js";
2
+ import { buildCliWriteActor } from "../daemon/write-actor.js";
2
3
  import { runSyncCommand } from "../operations/write.js";
3
4
  import { writeJson } from "../utils/output.js";
4
5
  export function registerSyncCommand(program) {
@@ -24,7 +25,9 @@ export function registerSyncCommand(program) {
24
25
  endpoint,
25
26
  method: "POST",
26
27
  path: "/sync",
28
+ timeoutMs: 310_000,
27
29
  body: {
30
+ actor: buildCliWriteActor(process.env),
28
31
  path: options.path ?? undefined,
29
32
  force: options.force === true,
30
33
  skipEmbedding: options.skipEmbedding === true,
@@ -1,4 +1,5 @@
1
1
  import { executeServerBackedOperation, requestDaemonJson } from "../daemon/client.js";
2
+ import { buildCliWriteActor } from "../daemon/write-actor.js";
2
3
  import { renderTemplateLintResult, runTemplateLint } from "../operations/template-lint.js";
3
4
  import { createTemplate, listTemplates, showTemplate } from "../operations/type-template.js";
4
5
  import { ensureTextOrJson, writeJson, writeText } from "../utils/output.js";
@@ -90,7 +91,9 @@ export function registerTemplateCommand(program) {
90
91
  endpoint,
91
92
  method: "POST",
92
93
  path: "/template/create",
94
+ timeoutMs: 310_000,
93
95
  body: {
96
+ actor: buildCliWriteActor(process.env),
94
97
  type: options.type,
95
98
  title: options.title,
96
99
  },
@@ -1,6 +1,7 @@
1
1
  import path from "node:path";
2
2
  import { Codex } from "@openai/codex-sdk";
3
3
  import { readWorkflowResult } from "./workflow-result.js";
4
+ import { resolveAgentSettings } from "./paths.js";
4
5
  import { readTextFileSync, writeTextFileSync } from "../utils/fs.js";
5
6
  import { AppError } from "../utils/errors.js";
6
7
  export const CODEX_WORKFLOW_VERSION = "2026-04-07";
@@ -85,16 +86,10 @@ async function runThread(thread, input) {
85
86
  continue;
86
87
  }
87
88
  if (event.type === "turn.failed") {
88
- throw new AppError("Codex workflow turn failed", "runtime", {
89
- cause: event.error.message,
90
- threadId: activeThreadId,
91
- });
89
+ throw classifyWorkflowRuntimeError("Codex workflow turn failed", event.error.message, activeThreadId);
92
90
  }
93
91
  if (event.type === "error") {
94
- throw new AppError("Codex workflow stream failed", "runtime", {
95
- cause: event.message,
96
- threadId: activeThreadId,
97
- });
92
+ throw classifyWorkflowRuntimeError("Codex workflow stream failed", event.message, activeThreadId);
98
93
  }
99
94
  }
100
95
  }
@@ -103,10 +98,7 @@ async function runThread(thread, input) {
103
98
  throw error;
104
99
  }
105
100
  const message = error instanceof Error ? error.message : String(error);
106
- throw new AppError("Codex workflow turn failed", "runtime", {
107
- cause: message,
108
- threadId: activeThreadId,
109
- });
101
+ throw classifyWorkflowRuntimeError("Codex workflow turn failed", message, activeThreadId);
110
102
  }
111
103
  if (!activeThreadId && thread.id) {
112
104
  activeThreadId = thread.id;
@@ -118,6 +110,10 @@ async function runThread(thread, input) {
118
110
  return activeThreadId;
119
111
  }
120
112
  export class CodexSdkWorkflowRunner {
113
+ options;
114
+ constructor(options = {}) {
115
+ this.options = options;
116
+ }
121
117
  // The SDK can only continue a thread by sending a new input, so queue retries
122
118
  // must not automatically resume real workflow threads inline.
123
119
  async startWorkflow(input) {
@@ -127,7 +123,7 @@ export class CodexSdkWorkflowRunner {
127
123
  modelReasoningEffort: "low",
128
124
  workingDirectory: input.workspaceRoot,
129
125
  skipGitRepoCheck: true,
130
- sandboxMode: "workspace-write",
126
+ sandboxMode: this.options.sandboxMode ?? "danger-full-access",
131
127
  networkAccessEnabled: true,
132
128
  approvalPolicy: "never",
133
129
  webSearchMode: "disabled",
@@ -143,7 +139,7 @@ export class CodexSdkWorkflowRunner {
143
139
  modelReasoningEffort: "low",
144
140
  workingDirectory: input.workspaceRoot,
145
141
  skipGitRepoCheck: true,
146
- sandboxMode: "workspace-write",
142
+ sandboxMode: this.options.sandboxMode ?? "danger-full-access",
147
143
  networkAccessEnabled: true,
148
144
  approvalPolicy: "never",
149
145
  webSearchMode: "disabled",
@@ -229,5 +225,31 @@ export function createDefaultWorkflowRunner(env = process.env) {
229
225
  const delayMs = Number.parseInt(env.WIKI_TEST_FAKE_WORKFLOW_DELAY_MS ?? "0", 10) || 0;
230
226
  return createSkipOnlyTestWorkflowRunner({ delayMs, mode: "delay-skip" });
231
227
  }
232
- return new CodexSdkWorkflowRunner();
228
+ return new CodexSdkWorkflowRunner({
229
+ sandboxMode: resolveAgentSettings(env).sandboxMode,
230
+ });
231
+ }
232
+ function isSandboxStartupFailure(message) {
233
+ const normalized = message.toLowerCase();
234
+ return (normalized.includes("bwrap") ||
235
+ normalized.includes("bubblewrap") ||
236
+ normalized.includes("uid map") ||
237
+ normalized.includes("uid_map") ||
238
+ normalized.includes("gid map") ||
239
+ normalized.includes("gid_map") ||
240
+ normalized.includes("unshare") ||
241
+ normalized.includes("operation not permitted"));
242
+ }
243
+ function classifyWorkflowRuntimeError(baseMessage, cause, threadId) {
244
+ if (isSandboxStartupFailure(cause)) {
245
+ return new AppError("Codex workflow sandbox failed to initialize", "runtime", {
246
+ cause,
247
+ threadId,
248
+ phase: "sandbox",
249
+ });
250
+ }
251
+ return new AppError(baseMessage, "runtime", {
252
+ cause,
253
+ threadId,
254
+ });
233
255
  }
package/dist/core/db.js CHANGED
@@ -149,6 +149,25 @@ function ensureBaseTables(db, embeddingDimensions) {
149
149
  key TEXT PRIMARY KEY,
150
150
  value TEXT
151
151
  );
152
+
153
+ CREATE TABLE IF NOT EXISTS daemon_write_jobs (
154
+ job_id TEXT PRIMARY KEY,
155
+ task_type TEXT NOT NULL,
156
+ status TEXT NOT NULL,
157
+ enqueued_at TEXT NOT NULL,
158
+ started_at TEXT,
159
+ finished_at TEXT,
160
+ duration_ms INTEGER,
161
+ timeout_ms INTEGER NOT NULL,
162
+ queue_depth_at_enqueue INTEGER NOT NULL DEFAULT 0,
163
+ result_summary TEXT,
164
+ error_message TEXT,
165
+ error_details TEXT
166
+ );
167
+
168
+ CREATE INDEX IF NOT EXISTS idx_daemon_write_jobs_status ON daemon_write_jobs(status);
169
+ CREATE INDEX IF NOT EXISTS idx_daemon_write_jobs_enqueued_at ON daemon_write_jobs(enqueued_at DESC);
170
+ CREATE INDEX IF NOT EXISTS idx_daemon_write_jobs_finished_at ON daemon_write_jobs(finished_at DESC);
152
171
  `);
153
172
  ensureTableColumns(db, "vault_processing_queue", {
154
173
  claimed_at: "TEXT",
@@ -6,7 +6,7 @@ import { DEFAULT_WIKI_ENV_FILE, getCliEnvironmentInfo, parseEnvFile, serializeEn
6
6
  import { resolveTemplateFilePath, loadConfig } from "./config.js";
7
7
  import { EmbeddingClient } from "./embedding.js";
8
8
  import { writeGlobalConfig } from "./global-config.js";
9
- import { parseVaultHashMode, resolveAgentSettings } from "./paths.js";
9
+ import { parseVaultHashMode, parseWikiAgentSandboxMode, resolveAgentSettings } from "./paths.js";
10
10
  import { loadSynologyConfigFromEnv, normalizeSynologyRemotePath, withSynologyClient } from "./synology.js";
11
11
  import { ensureWikiSkillInstall, formatParserSkills, inspectSkillInstall, installParserSkill, OPTIONAL_PARSER_SKILLS, parseParserSkillSelection, parseParserSkills, resolveWorkspaceRootFromWikiPath, resolveWorkspaceSkillPath, resolveWorkspaceSkillPaths, } from "./workspace-skills.js";
12
12
  import { scaffoldWorkspaceAssets } from "./workspace-bootstrap.js";
@@ -36,11 +36,16 @@ const MANAGED_ENV_KEYS = new Set([
36
36
  "WIKI_AGENT_API_KEY",
37
37
  "WIKI_AGENT_MODEL",
38
38
  "WIKI_AGENT_BATCH_SIZE",
39
+ "WIKI_AGENT_SANDBOX_MODE",
39
40
  "WIKI_PARSER_SKILLS",
40
41
  ]);
41
42
  function writeSection(output, title) {
42
43
  output.write(`\n${title}\n`);
43
44
  }
45
+ function writeWarning(output, message) {
46
+ const isTty = "isTTY" in output && output.isTTY;
47
+ output.write(isTty ? `\x1b[31m${message}\x1b[0m\n` : `${message}\n`);
48
+ }
44
49
  function resolvePackageRoot(packageRoot) {
45
50
  return packageRoot ?? path.resolve(path.dirname(fileURLToPath(import.meta.url)), "../..");
46
51
  }
@@ -73,6 +78,14 @@ function safeVaultHashMode(rawValue, defaultValue) {
73
78
  return defaultValue;
74
79
  }
75
80
  }
81
+ function safeAgentSandboxMode(rawValue) {
82
+ try {
83
+ return parseWikiAgentSandboxMode(rawValue);
84
+ }
85
+ catch {
86
+ return "danger-full-access";
87
+ }
88
+ }
76
89
  function safeBooleanFlag(rawValue, defaultValue) {
77
90
  if (rawValue === undefined || rawValue.trim().length === 0) {
78
91
  return defaultValue;
@@ -381,6 +394,7 @@ function getPathDefaults(env, cwd) {
381
394
  agentApiKey: env.WIKI_AGENT_API_KEY ?? null,
382
395
  agentModel: env.WIKI_AGENT_MODEL ?? null,
383
396
  agentBatchSize: env.WIKI_AGENT_BATCH_SIZE ?? "5",
397
+ agentSandboxMode: safeAgentSandboxMode(env.WIKI_AGENT_SANDBOX_MODE),
384
398
  parserSkills: parseParserSkills(env.WIKI_PARSER_SKILLS, { strict: false }),
385
399
  };
386
400
  }
@@ -447,14 +461,32 @@ async function collectAgentSettings(driver, ctx, defaults) {
447
461
  agentApiKey: null,
448
462
  agentModel: null,
449
463
  agentBatchSize: null,
464
+ agentSandboxMode: null,
450
465
  };
451
466
  }
467
+ writeWarning(ctx.output, "Warning: danger-full-access grants full access to the runtime workspace.");
452
468
  return {
453
469
  agentEnabled: true,
454
470
  agentBaseUrl: await promptText(driver, "WIKI_AGENT_BASE_URL", defaults.agentBaseUrl ?? "https://api.openai.com/v1", { validator: (value) => validateUrl(value, "WIKI_AGENT_BASE_URL") }),
455
471
  agentApiKey: await promptPassword(driver, "WIKI_AGENT_API_KEY", defaults.agentApiKey ?? "", { required: true }),
456
472
  agentModel: await promptText(driver, "WIKI_AGENT_MODEL", defaults.agentModel ?? "", { required: true }),
457
473
  agentBatchSize: await promptText(driver, "WIKI_AGENT_BATCH_SIZE", defaults.agentBatchSize ?? "5", { validator: (value) => validateNonNegativeInteger(value, "WIKI_AGENT_BATCH_SIZE") }),
474
+ agentSandboxMode: await driver.select({
475
+ message: "WIKI_AGENT_SANDBOX_MODE",
476
+ defaultValue: defaults.agentSandboxMode ?? "danger-full-access",
477
+ choices: [
478
+ {
479
+ value: "danger-full-access",
480
+ label: "danger-full-access",
481
+ description: "Full access to the runtime workspace. Default.",
482
+ },
483
+ {
484
+ value: "workspace-write",
485
+ label: "workspace-write",
486
+ description: "Use Codex workspace-write sandbox when the host supports it.",
487
+ },
488
+ ],
489
+ }),
458
490
  };
459
491
  }
460
492
  async function collectSynologySettings(driver, ctx, defaults) {
@@ -532,6 +564,7 @@ function buildSetupSummary(values) {
532
564
  lines.push(` WIKI_AGENT_BASE_URL: ${values.agentBaseUrl}`);
533
565
  lines.push(` WIKI_AGENT_MODEL: ${values.agentModel}`);
534
566
  lines.push(` WIKI_AGENT_BATCH_SIZE: ${values.agentBatchSize}`);
567
+ lines.push(` WIKI_AGENT_SANDBOX_MODE: ${values.agentSandboxMode}`);
535
568
  }
536
569
  if (values.vaultSource === "synology") {
537
570
  lines.push(` SYNOLOGY_BASE_URL: ${values.synologyBaseUrl}`);
@@ -569,6 +602,7 @@ function writeSetupEnvFile(values) {
569
602
  ["WIKI_AGENT_API_KEY", values.agentEnabled ? values.agentApiKey : null],
570
603
  ["WIKI_AGENT_MODEL", values.agentEnabled ? values.agentModel : null],
571
604
  ["WIKI_AGENT_BATCH_SIZE", values.agentEnabled ? values.agentBatchSize : null],
605
+ ["WIKI_AGENT_SANDBOX_MODE", values.agentEnabled ? values.agentSandboxMode : null],
572
606
  ["WIKI_PARSER_SKILLS", formatParserSkills(values.parserSkills)],
573
607
  ];
574
608
  const body = [
@@ -0,0 +1,25 @@
1
+ import { parsePage } from "./frontmatter.js";
2
+ import { normalizePageId, resolvePagePath } from "./paths.js";
3
+ import { pathExistsSync, readTextFileSync, sha256Text } from "../utils/fs.js";
4
+ export function buildPageRevision(rawMarkdown) {
5
+ if (rawMarkdown === null) {
6
+ return null;
7
+ }
8
+ return sha256Text(rawMarkdown);
9
+ }
10
+ export function readCanonicalPageSource(filePath, wikiPath, config) {
11
+ const pageId = normalizePageId(filePath, wikiPath);
12
+ const rawMarkdown = pathExistsSync(filePath) ? readTextFileSync(filePath) : null;
13
+ const parsed = rawMarkdown === null ? null : parsePage(filePath, wikiPath, config);
14
+ return {
15
+ pageId,
16
+ pagePath: filePath,
17
+ rawMarkdown,
18
+ frontmatter: parsed?.ok ? parsed.parsed.rawData : {},
19
+ revision: buildPageRevision(rawMarkdown),
20
+ };
21
+ }
22
+ export function readCanonicalPageSourceById(pageId, wikiPath, config) {
23
+ const canonicalPageId = normalizePageId(pageId, wikiPath);
24
+ return readCanonicalPageSource(resolvePagePath(canonicalPageId, wikiPath), wikiPath, config);
25
+ }
@@ -89,12 +89,20 @@ export function parseWikiAgentBackend(raw) {
89
89
  }
90
90
  throw new AppError(`WIKI_AGENT_BACKEND must be "codex-workflow", got ${raw}`, "config");
91
91
  }
92
+ export function parseWikiAgentSandboxMode(raw) {
93
+ const value = (raw ?? "danger-full-access").trim().toLowerCase();
94
+ if (value === "danger-full-access" || value === "workspace-write") {
95
+ return value;
96
+ }
97
+ throw new AppError(`WIKI_AGENT_SANDBOX_MODE must be "danger-full-access" or "workspace-write", got ${raw}`, "config");
98
+ }
92
99
  export function resolveAgentSettings(env = process.env, options = {}) {
93
100
  const enabled = parseBooleanFlag("WIKI_AGENT_ENABLED", env.WIKI_AGENT_ENABLED, false);
94
101
  const baseUrl = normalizeOptionalUrl(env.WIKI_AGENT_BASE_URL);
95
102
  const apiKey = env.WIKI_AGENT_API_KEY?.trim() || null;
96
103
  const model = env.WIKI_AGENT_MODEL?.trim() || null;
97
104
  const batchSize = parseNonNegativeInteger(env.WIKI_AGENT_BATCH_SIZE, 5, "WIKI_AGENT_BATCH_SIZE");
105
+ const sandboxMode = parseWikiAgentSandboxMode(env.WIKI_AGENT_SANDBOX_MODE);
98
106
  const workflowTimeoutSeconds = parsePositiveInteger(env.WIKI_WORKFLOW_TIMEOUT, 600, "WIKI_WORKFLOW_TIMEOUT");
99
107
  const missing = [];
100
108
  if (enabled) {
@@ -114,6 +122,7 @@ export function resolveAgentSettings(env = process.env, options = {}) {
114
122
  apiKey,
115
123
  model,
116
124
  batchSize,
125
+ sandboxMode,
117
126
  workflowTimeoutSeconds,
118
127
  configured: enabled && missing.length === 0,
119
128
  missing,
@@ -146,6 +155,7 @@ export function resolveRuntimePaths(env = process.env) {
146
155
  daemonPidPath: path.join(wikiRoot, ".wiki-daemon.pid"),
147
156
  daemonLogPath: path.join(wikiRoot, ".wiki-daemon.log"),
148
157
  daemonStatePath: path.join(wikiRoot, ".wiki-daemon.state.json"),
158
+ auditLogPath: path.join(wikiRoot, "..", ".wiki-runtime", "audit.ndjson"),
149
159
  };
150
160
  }
151
161
  export function normalizePageId(inputPath, wikiPath) {
@@ -148,6 +148,11 @@ export function readWorkflowResult(resultPath) {
148
148
  fail(`Workflow result not found: ${resultPath}`);
149
149
  }
150
150
  const rawText = readTextFileSync(resultPath);
151
+ if (!rawText.trim()) {
152
+ fail("Workflow result is empty", {
153
+ resultPath,
154
+ });
155
+ }
151
156
  let parsed;
152
157
  try {
153
158
  parsed = JSON.parse(rawText);
@@ -0,0 +1,18 @@
1
+ import { appendFileSync } from "node:fs";
2
+ import { randomUUID } from "node:crypto";
3
+ import path from "node:path";
4
+ import { ensureDirSync } from "../utils/fs.js";
5
+ import { toOffsetIso } from "../utils/time.js";
6
+ export function appendAuditEvent(paths, actor, input) {
7
+ ensureDirSync(path.dirname(paths.auditLogPath));
8
+ const event = {
9
+ eventId: randomUUID(),
10
+ timestamp: toOffsetIso(),
11
+ requestId: actor.requestId,
12
+ actorId: actor.actorId,
13
+ actorType: actor.actorType,
14
+ ...input,
15
+ };
16
+ appendFileSync(paths.auditLogPath, `${JSON.stringify(event)}\n`, "utf8");
17
+ return event;
18
+ }
@@ -124,7 +124,7 @@ export async function requestDaemonJson(options) {
124
124
  method: options.method,
125
125
  headers: options.body === undefined ? undefined : { "content-type": "application/json" },
126
126
  body: options.body === undefined ? undefined : JSON.stringify(options.body),
127
- }, 5_000);
127
+ }, options.timeoutMs ?? (options.method === "POST" ? 310_000 : 5_000));
128
128
  }
129
129
  function warnReadFallback(availability) {
130
130
  const location = availability.state