@junwu168/openshell 0.1.3 → 0.1.4
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/dist/core/audit/log-store.js +1 -1
- package/dist/core/orchestrator.d.ts +2 -2
- package/dist/core/orchestrator.js +3 -3
- package/dist/core/result.d.ts +1 -1
- package/dist/index.d.ts +3 -3
- package/dist/index.js +3 -3
- package/dist/opencode/plugin.d.ts +1 -1
- package/dist/opencode/plugin.js +8 -8
- package/package.json +6 -1
- package/.claude/settings.local.json +0 -25
- package/bun.lock +0 -368
- package/docs/superpowers/notes/2026-03-25-opencode-remote-tools-handoff.md +0 -81
- package/docs/superpowers/notes/2026-03-26-openshell-pre-release-review.md +0 -174
- package/docs/superpowers/plans/2026-03-25-opencode-remote-tools.md +0 -1656
- package/docs/superpowers/plans/2026-03-25-server-registry-cli.md +0 -54
- package/docs/superpowers/plans/2026-03-26-config-backed-credential-registry.md +0 -494
- package/docs/superpowers/plans/2026-03-26-openshell-release-prep.md +0 -639
- package/docs/superpowers/specs/2026-03-25-opencode-remote-tools-design.md +0 -378
- package/docs/superpowers/specs/2026-03-26-config-backed-credential-registry-design.md +0 -272
- package/docs/superpowers/specs/2026-03-26-openshell-release-prep-design.md +0 -197
- package/examples/opencode-local/opencode.json +0 -19
- package/scripts/openshell.ts +0 -3
- package/scripts/server-registry.ts +0 -3
- package/src/cli/openshell.ts +0 -65
- package/src/cli/server-registry.ts +0 -476
- package/src/core/audit/git-audit-repo.ts +0 -42
- package/src/core/audit/log-store.ts +0 -20
- package/src/core/audit/redact.ts +0 -4
- package/src/core/contracts.ts +0 -51
- package/src/core/orchestrator.ts +0 -1082
- package/src/core/patch.ts +0 -11
- package/src/core/paths.ts +0 -32
- package/src/core/policy.ts +0 -30
- package/src/core/registry/server-registry.ts +0 -505
- package/src/core/result.ts +0 -16
- package/src/core/ssh/ssh-runtime.ts +0 -355
- package/src/index.ts +0 -3
- package/src/opencode/plugin.ts +0 -242
- package/src/product/install.ts +0 -43
- package/src/product/opencode-config.ts +0 -118
- package/src/product/uninstall.ts +0 -47
- package/src/product/workspace-tracker.ts +0 -69
- package/tests/integration/fake-ssh-server.ts +0 -97
- package/tests/integration/install-lifecycle.test.ts +0 -85
- package/tests/integration/orchestrator.test.ts +0 -767
- package/tests/integration/ssh-runtime.test.ts +0 -122
- package/tests/unit/audit.test.ts +0 -221
- package/tests/unit/build-layout.test.ts +0 -28
- package/tests/unit/opencode-config.test.ts +0 -100
- package/tests/unit/opencode-plugin.test.ts +0 -358
- package/tests/unit/openshell-cli.test.ts +0 -60
- package/tests/unit/paths.test.ts +0 -64
- package/tests/unit/plugin-export.test.ts +0 -10
- package/tests/unit/policy.test.ts +0 -53
- package/tests/unit/release-docs.test.ts +0 -31
- package/tests/unit/result.test.ts +0 -28
- package/tests/unit/server-registry-cli.test.ts +0 -673
- package/tests/unit/server-registry.test.ts +0 -452
- package/tests/unit/workspace-tracker.test.ts +0 -57
- package/tsconfig.json +0 -14
@@ -1,1656 +0,0 @@
# Open Code v1 Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Build the first `opencode` plugin release of Open Code with explicit multi-server SSH tools, encrypted local credentials, deterministic approval policy, structured local audit logs, and git-backed snapshots for dedicated file writes.

**Architecture:** Build a TypeScript ESM package that exports an `opencode` plugin while keeping the runtime core in-process and host-agnostic. The `opencode` adapter only registers tools, wires runtime dependencies, and relies on OpenCode's permission config to surface approval prompts; the core owns registry, policy, SSH, patch application, audit, and orchestration. Dedicated file write tools carry file-level audit guarantees; `remote_exec` carries command-level audit only.

**Tech Stack:** TypeScript ESM, Bun runtime/package manager, `@opencode-ai/plugin`, `ssh2`, `diff`, `env-paths`, `keytar`, `shescape`, `testcontainers`, Bun/Node crypto APIs, local `git` CLI, `bun:test`

---

## File Map

- Create: `package.json` - package metadata, scripts, runtime dependencies, publish entrypoints.
- Create: `tsconfig.json` - ESM TypeScript compiler settings for Bun-compatible output.
- Create: `.gitignore` - ignore build output, temp fixtures, and local runtime artifacts.
- Create: `src/index.ts` - public package entry that exports the `opencode` plugin.
- Create: `src/core/contracts.ts` - shared server records, tool arg types, approval types, and tool result shapes.
- Create: `src/core/result.ts` - canonical helpers for success, partial failure, and error payloads.
- Create: `src/core/paths.ts` - user config/data path resolution for registry and audit storage.
- Create: `src/core/policy.ts` - deterministic command classification and approval decisions.
- Create: `src/core/patch.ts` - apply unified diffs locally before remote writes.
- Create: `src/core/registry/secret-provider.ts` - secret-provider interface plus test double contract.
- Create: `src/core/registry/keychain-provider.ts` - OS-keychain-backed master-key provider.
- Create: `src/core/registry/crypto.ts` - AES-GCM encrypt/decrypt helpers for registry records.
- Create: `src/core/registry/server-registry.ts` - CRUD/load/resolve logic for encrypted server definitions.
- Create: `src/core/audit/redact.ts` - redact commands, content, and secret-like values before logging.
- Create: `src/core/audit/log-store.ts` - append-only JSONL audit log writer with fail-closed preflight.
- Create: `src/core/audit/git-audit-repo.ts` - audit repo init, snapshot path mapping, and git commit creation.
- Create: `src/core/ssh/ssh-runtime.ts` - SSH command execution and SFTP-backed file operations.
- Create: `src/core/orchestrator.ts` - central validate/classify/approve/execute/audit pipeline.
- Create: `src/opencode/plugin.ts` - OpenCode custom tool definitions using `@opencode-ai/plugin`.
- Create: `tests/unit/plugin-export.test.ts`
- Create: `tests/unit/result.test.ts`
- Create: `tests/unit/server-registry.test.ts`
- Create: `tests/unit/policy.test.ts`
- Create: `tests/unit/audit.test.ts`
- Create: `tests/unit/opencode-plugin.test.ts`
- Create: `tests/integration/fake-ssh-server.ts`
- Create: `tests/integration/ssh-runtime.test.ts`
- Create: `tests/integration/orchestrator.test.ts`
- Create: `examples/opencode-local/.opencode/package.json` - local plugin fixture dependencies for manual smoke testing.
- Create: `examples/opencode-local/.opencode/plugins/open-code.ts` - local loader for the plugin package during smoke testing.
- Create: `examples/opencode-local/opencode.json` - permission rules and plugin loading config for the fixture project.
- Modify: `README.md` - setup, security model, development commands, and manual OpenCode smoke test instructions.

## Task 1: Bootstrap The Package And Test Harness

**Files:**
- Create: `package.json`
- Create: `tsconfig.json`
- Create: `.gitignore`
- Create: `src/index.ts`
- Create: `tests/unit/plugin-export.test.ts`

- [ ] **Step 1: Write the failing export smoke test**

```ts
import { describe, expect, test } from "bun:test"
import { OpenCodePlugin } from "../../src/index"

describe("package entry", () => {
  test("exports the OpenCode plugin factory", () => {
    expect(typeof OpenCodePlugin).toBe("function")
  })
})
```

- [ ] **Step 2: Run the test to verify the repo is still unimplemented**

Run: `bun test tests/unit/plugin-export.test.ts`
Expected: FAIL with a module resolution error for `../../src/index`

- [ ] **Step 3: Add the minimal package scaffold**

Run:

```bash
bun add @opencode-ai/plugin ssh2 diff env-paths keytar shescape
bun add -d typescript @types/bun testcontainers
```

Write:

```json
{
  "name": "open-code",
  "version": "0.1.0",
  "private": true,
  "type": "module",
  "exports": {
    ".": "./dist/index.js"
  },
  "dependencies": {
    "@opencode-ai/plugin": "*",
    "diff": "*",
    "env-paths": "*",
    "keytar": "*",
    "shescape": "*",
    "ssh2": "*"
  },
  "devDependencies": {
    "@types/bun": "*",
    "testcontainers": "*",
    "typescript": "*"
  },
  "scripts": {
    "build": "tsc -p tsconfig.json",
    "typecheck": "tsc --noEmit",
    "test": "bun test"
  }
}
```

```json
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "outDir": "dist",
    "rootDir": ".",
    "strict": true,
    "declaration": true,
    "skipLibCheck": true
  },
  "include": ["src/**/*.ts", "tests/**/*.ts", "examples/**/*.ts"]
}
```

```ts
export const OpenCodePlugin = async () => ({})
```

- [ ] **Step 4: Run the first green checks**

Run: `bun test tests/unit/plugin-export.test.ts`
Expected: PASS

Run: `bun run typecheck`
Expected: PASS with no TypeScript errors

- [ ] **Step 5: Commit**

```bash
git add package.json bun.lock tsconfig.json .gitignore src/index.ts tests/unit/plugin-export.test.ts
git commit -m "chore: bootstrap open code plugin package"
```

## Task 2: Lock Shared Contracts, Result Schema, And Runtime Paths

**Files:**
- Create: `src/core/contracts.ts`
- Create: `src/core/result.ts`
- Create: `src/core/paths.ts`
- Create: `tests/unit/result.test.ts`
- Modify: `src/index.ts`

- [ ] **Step 1: Write failing tests for the canonical tool result helpers**

```ts
import { describe, expect, test } from "bun:test"
import { okResult, partialFailureResult, errorResult } from "../../src/core/result"

describe("tool result helpers", () => {
  test("builds success payloads", () => {
    expect(
      okResult({
        tool: "list_servers",
        data: [],
        execution: { attempted: true, completed: true },
        audit: { logWritten: true, snapshotStatus: "not-applicable" },
      }).status,
    ).toBe("ok")
  })

  test("builds partial-failure payloads", () => {
    expect(
      partialFailureResult({
        tool: "remote_write_file",
        message: "remote write succeeded but git commit failed",
      }).status,
    ).toBe("partial_failure")
  })

  test("builds hard-error payloads", () => {
    expect(errorResult({ tool: "remote_exec", code: "POLICY_REJECTED" }).status).toBe("error")
  })
})
```

- [ ] **Step 2: Run the test to verify the contracts do not exist yet**

Run: `bun test tests/unit/result.test.ts`
Expected: FAIL with missing module errors for `src/core/result`

- [ ] **Step 3: Implement the shared types and helpers**

Write `src/core/contracts.ts`:

```ts
export type ServerID = string

export type ApprovalDecision = "allow" | "deny"
export type PolicyDecision = "auto-allow" | "approval-required" | "reject"

export type ToolStatus = "ok" | "partial_failure" | "error"

export interface ToolPayload<TData = unknown> {
  tool: string
  server?: ServerID
  data?: TData
  message?: string
  code?: string
  execution?: {
    attempted: boolean
    completed: boolean
    exitCode?: number
    stdoutBytes?: number
    stderrBytes?: number
    stdoutTruncated?: boolean
    stderrTruncated?: boolean
  }
  audit?: {
    logWritten: boolean
    snapshotStatus: "not-applicable" | "written" | "partial-failure"
  }
}

export interface ToolResult<TData = unknown> extends ToolPayload<TData> {
  status: ToolStatus
}
```

Write `src/core/result.ts`:

```ts
import type { ToolPayload, ToolResult } from "./contracts"

export const okResult = <T>(payload: ToolPayload<T>): ToolResult<T> => ({
  status: "ok",
  ...payload,
})

export const partialFailureResult = <T>(payload: ToolPayload<T>): ToolResult<T> => ({
  status: "partial_failure",
  ...payload,
})

export const errorResult = <T>(payload: ToolPayload<T>): ToolResult<T> => ({
  status: "error",
  ...payload,
})
```

Populate `execution` and `audit` in every return path from the orchestrator so adapters never need tool-specific result parsing to understand whether an action ran, finished, or wrote audit artifacts.

Write `src/core/paths.ts`:

```ts
import envPaths from "env-paths"
import { mkdir } from "node:fs/promises"

const paths = envPaths("open-code", { suffix: "" })

export const runtimePaths = {
  configDir: paths.config,
  dataDir: paths.data,
  registryFile: `${paths.config}/servers.enc.json`,
  auditLogFile: `${paths.data}/audit/actions.jsonl`,
  auditRepoDir: `${paths.data}/audit/repo`,
}

export const ensureRuntimeDirs = async () => {
  await mkdir(`${runtimePaths.dataDir}/audit`, { recursive: true })
  await mkdir(runtimePaths.configDir, { recursive: true })
}
```

Modify `src/index.ts`:

```ts
export const OpenCodePlugin = async () => ({})
export * from "./core/contracts"
```

- [ ] **Step 4: Run the focused checks**

Run: `bun test tests/unit/result.test.ts`
Expected: PASS

Run: `bun run typecheck`
Expected: PASS

- [ ] **Step 5: Commit**

```bash
git add src/index.ts src/core/contracts.ts src/core/result.ts src/core/paths.ts tests/unit/result.test.ts
git commit -m "feat: add shared contracts and runtime paths"
```

## Task 3: Implement The Encrypted Multi-Server Registry

**Files:**
- Create: `src/core/registry/secret-provider.ts`
- Create: `src/core/registry/keychain-provider.ts`
- Create: `src/core/registry/crypto.ts`
- Create: `src/core/registry/server-registry.ts`
- Create: `tests/unit/server-registry.test.ts`

- [ ] **Step 1: Write failing tests for encrypted persistence**

```ts
import { afterEach, beforeEach, describe, expect, test } from "bun:test"
import { mkdtemp, readFile, rm } from "node:fs/promises"
import { join } from "node:path"
import { tmpdir } from "node:os"
import { createServerRegistry } from "../../src/core/registry/server-registry"

describe("server registry", () => {
  let tempDir: string

  beforeEach(async () => {
    tempDir = await mkdtemp(join(tmpdir(), "open-code-registry-"))
  })

  afterEach(async () => {
    await rm(tempDir, { recursive: true, force: true })
  })

  test("stores server records encrypted at rest", async () => {
    const registry = createServerRegistry({
      registryFile: join(tempDir, "servers.enc.json"),
      secretProvider: { getMasterKey: async () => Buffer.alloc(32, 7) },
    })

    await registry.upsert({
      id: "prod-a",
      host: "10.0.0.10",
      port: 22,
      username: "root",
      auth: { kind: "password", secret: "super-secret" },
    })

    const disk = await readFile(join(tempDir, "servers.enc.json"), "utf8")
    expect(disk.includes("super-secret")).toBe(false)
    expect(await registry.resolve("prod-a")).toMatchObject({ id: "prod-a", host: "10.0.0.10" })
  })
})
```

- [ ] **Step 2: Run the registry test to see the missing implementation**

Run: `bun test tests/unit/server-registry.test.ts`
Expected: FAIL with missing module errors for `server-registry`

- [ ] **Step 3: Implement registry contracts, crypto, and persistence**

Write `src/core/registry/secret-provider.ts`:

```ts
export interface SecretProvider {
  getMasterKey(): Promise<Buffer>
}
```

Write `src/core/registry/keychain-provider.ts`:

```ts
import { randomBytes } from "node:crypto"
import keytar from "keytar"
import type { SecretProvider } from "./secret-provider"

const SERVICE = "open-code"
const ACCOUNT = "registry-master-key"

export const createKeychainSecretProvider = (): SecretProvider => ({
  async getMasterKey() {
    let secret = await keytar.getPassword(SERVICE, ACCOUNT)
    if (!secret) {
      secret = randomBytes(32).toString("base64")
      await keytar.setPassword(SERVICE, ACCOUNT, secret)
    }
    return Buffer.from(secret, "base64")
  },
})
```

Write `src/core/registry/crypto.ts`:

```ts
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto"

export const encryptJson = (plaintext: string, key: Buffer) => {
  const iv = randomBytes(12)
  const cipher = createCipheriv("aes-256-gcm", key, iv)
  const body = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()])
  const tag = cipher.getAuthTag()
  return {
    iv: iv.toString("base64"),
    tag: tag.toString("base64"),
    body: body.toString("base64"),
  }
}

export const decryptJson = (payload: { iv: string; tag: string; body: string }, key: Buffer) => {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(payload.iv, "base64"))
  decipher.setAuthTag(Buffer.from(payload.tag, "base64"))
  return Buffer.concat([
    decipher.update(Buffer.from(payload.body, "base64")),
    decipher.final(),
  ]).toString("utf8")
}
```

Write `src/core/registry/server-registry.ts`:

```ts
import { readFile, writeFile } from "node:fs/promises"
import { dirname } from "node:path"
import { mkdir } from "node:fs/promises"
import { decryptJson, encryptJson } from "./crypto"
import type { SecretProvider } from "./secret-provider"

export const createServerRegistry = ({ registryFile, secretProvider }: { registryFile: string; secretProvider: SecretProvider }) => {
  const load = async () => {
    try {
      const raw = await readFile(registryFile, "utf8")
      const payload = JSON.parse(raw)
      const key = await secretProvider.getMasterKey()
      return JSON.parse(decryptJson(payload, key)) as any[]
    } catch (error: any) {
      if (error.code === "ENOENT") return []
      throw error
    }
  }

  const save = async (records: any[]) => {
    await mkdir(dirname(registryFile), { recursive: true })
    const key = await secretProvider.getMasterKey()
    await writeFile(registryFile, JSON.stringify(encryptJson(JSON.stringify(records), key), null, 2))
  }

  return {
    async list() {
      return load()
    },
    async resolve(id: string) {
      return (await load()).find((record) => record.id === id) ?? null
    },
    async upsert(record: any) {
      const records = await load()
      const next = records.filter((item) => item.id !== record.id)
      next.push(record)
      await save(next)
    },
  }
}
```

- [ ] **Step 4: Run the registry checks**

Run: `bun test tests/unit/server-registry.test.ts`
Expected: PASS

Run: `bun run typecheck`
Expected: PASS

- [ ] **Step 5: Commit**

```bash
git add src/core/registry/secret-provider.ts src/core/registry/keychain-provider.ts src/core/registry/crypto.ts src/core/registry/server-registry.ts tests/unit/server-registry.test.ts
git commit -m "feat: add encrypted server registry"
```

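The `crypto.ts` helpers above store each registry payload as a base64-encoded IV, auth tag, and ciphertext body. Their round-trip behavior can be sanity-checked in isolation using only `node:crypto`; this is a minimal standalone sketch mirroring those helpers, with an illustrative sample record:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto"

// Encrypt a sample record the way encryptJson does: random 12-byte IV,
// AES-256-GCM, auth tag captured after final().
const key = randomBytes(32)
const iv = randomBytes(12)
const cipher = createCipheriv("aes-256-gcm", key, iv)
const body = Buffer.concat([cipher.update('{"id":"prod-a"}', "utf8"), cipher.final()])
const payload = {
  iv: iv.toString("base64"),
  tag: cipher.getAuthTag().toString("base64"),
  body: body.toString("base64"),
}

// Decrypt the way decryptJson does: restore IV and tag, then verify-and-decode.
const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(payload.iv, "base64"))
decipher.setAuthTag(Buffer.from(payload.tag, "base64"))
const plain = Buffer.concat([
  decipher.update(Buffer.from(payload.body, "base64")),
  decipher.final(),
]).toString("utf8")
console.log(plain) // {"id":"prod-a"}
```

Because GCM authenticates the ciphertext, `decipher.final()` throws if the stored payload was tampered with, which is what lets the registry fail closed on a corrupted `servers.enc.json`.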
## Task 4: Build The Deterministic Policy Engine

**Files:**
- Create: `src/core/policy.ts`
- Create: `tests/unit/policy.test.ts`

- [ ] **Step 1: Write failing tests for safe, approval-required, and rejected cases**

```ts
import { describe, expect, test } from "bun:test"
import { classifyRemoteExec } from "../../src/core/policy"

describe("remote exec policy", () => {
  test("auto-allows simple linux inspection commands", () => {
    expect(classifyRemoteExec("cat /etc/hosts").decision).toBe("auto-allow")
  })

  test("requires approval for middleware commands", () => {
    expect(classifyRemoteExec("kubectl get pods -A").decision).toBe("approval-required")
  })

  test("requires approval for shell composition", () => {
    expect(classifyRemoteExec("cat /etc/hosts | grep localhost").decision).toBe("approval-required")
  })
})
```

- [ ] **Step 2: Run the policy test and confirm it fails first**

Run: `bun test tests/unit/policy.test.ts`
Expected: FAIL with missing module errors for `policy`

- [ ] **Step 3: Implement the classifier**

Write `src/core/policy.ts`:

```ts
const SAFE_COMMANDS = new Set(["cat", "grep", "find", "ls", "pwd", "uname", "df", "free", "ps"])
const MIDDLEWARE_COMMANDS = new Set(["psql", "mysql", "redis-cli", "kubectl", "docker", "helm", "aws", "gcloud", "az"])
const SHELL_META = ["|", ">", "<", ";", "&&", "||", "$(", "`"]

export const classifyRemoteExec = (command: string) => {
  const trimmed = command.trim()
  if (!trimmed) return { decision: "reject", reason: "empty command" } as const
  if (SHELL_META.some((token) => trimmed.includes(token))) {
    return { decision: "approval-required", reason: "shell composition" } as const
  }

  const [binary] = trimmed.split(/\s+/)
  if (MIDDLEWARE_COMMANDS.has(binary)) {
    return { decision: "approval-required", reason: "middleware command" } as const
  }
  if (SAFE_COMMANDS.has(binary) || trimmed.startsWith("systemctl status")) {
    return { decision: "auto-allow", reason: "safe inspection command" } as const
  }
  return { decision: "approval-required", reason: "unknown command" } as const
}
```

- [ ] **Step 4: Run the policy checks**

Run: `bun test tests/unit/policy.test.ts`
Expected: PASS

Run: `bun run typecheck`
Expected: PASS

- [ ] **Step 5: Commit**

```bash
git add src/core/policy.ts tests/unit/policy.test.ts
git commit -m "feat: add deterministic remote exec policy"
```

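The classifier's precedence (shell metacharacters first, then the middleware list, then the safe-list, with "approval-required" as the default) is what makes it deterministic. A standalone sketch with an abbreviated inline copy of `classifyRemoteExec` (the command sets here are trimmed for illustration) shows the same decisions the plan's tests expect:

```typescript
// Abbreviated standalone copy of the classifier, same precedence order:
// shell composition is checked before the binary is even inspected.
const SAFE = new Set(["cat", "grep", "ls", "ps"])
const MIDDLEWARE = new Set(["psql", "kubectl", "docker"])
const META = ["|", ">", "<", ";", "&&", "||", "$(", "`"]

const classify = (command: string): string => {
  const trimmed = command.trim()
  if (!trimmed) return "reject"
  if (META.some((token) => trimmed.includes(token))) return "approval-required"
  const [binary] = trimmed.split(/\s+/)
  if (MIDDLEWARE.has(binary)) return "approval-required"
  if (SAFE.has(binary) || trimmed.startsWith("systemctl status")) return "auto-allow"
  return "approval-required"
}

console.log(classify("cat /etc/hosts"))              // auto-allow
console.log(classify("kubectl get pods -A"))         // approval-required
console.log(classify("cat /etc/hosts | grep local")) // approval-required
```

Checking metacharacters before the safe-list matters: `cat` alone is safe, but `cat x | sh` is not reducible to `cat`, so composed commands always escalate to approval.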
## Task 5: Add The Audit Log And Git Snapshot Engine

**Files:**
- Create: `src/core/audit/redact.ts`
- Create: `src/core/audit/log-store.ts`
- Create: `src/core/audit/git-audit-repo.ts`
- Create: `tests/unit/audit.test.ts`

- [ ] **Step 1: Write failing tests for redaction, preflight, and snapshot commits**

```ts
import { describe, expect, test } from "bun:test"
import { mkdtemp, readFile } from "node:fs/promises"
import { join } from "node:path"
import { tmpdir } from "node:os"
import { createAuditLogStore } from "../../src/core/audit/log-store"
import { createGitAuditRepo } from "../../src/core/audit/git-audit-repo"

describe("audit engine", () => {
  test("redacts secret-looking values before writing JSONL entries", async () => {
    const dir = await mkdtemp(join(tmpdir(), "open-code-audit-"))
    const store = createAuditLogStore(join(dir, "actions.jsonl"))
    await store.preflight()
    await store.append({ command: "psql postgresql://user:secret@db/app" })
    const disk = await readFile(join(dir, "actions.jsonl"), "utf8")
    expect(disk.includes("secret")).toBe(false)
  })

  test("creates a git commit for before and after snapshots", async () => {
    const dir = await mkdtemp(join(tmpdir(), "open-code-git-audit-"))
    const repo = createGitAuditRepo(dir)
    await repo.preflight()
    await repo.captureChange({
      server: "prod-a",
      path: "/etc/app.conf",
      before: "port=80\n",
      after: "port=81\n",
    })
    expect(await repo.lastCommitMessage()).toContain("prod-a")
  })
})
```

- [ ] **Step 2: Run the audit test to verify it fails**

Run: `bun test tests/unit/audit.test.ts`
Expected: FAIL with missing module errors for the audit modules

- [ ] **Step 3: Implement redaction, JSONL logging, and git-backed snapshots**

Write `src/core/audit/redact.ts`:

```ts
export const redactSecrets = (value: string) =>
  value
    .replace(/:\/\/([^:\s]+):([^@\s]+)@/g, "://$1:[REDACTED]@")
    .replace(/(password|secret|token)=([^\s]+)/gi, "$1=[REDACTED]")
```

Write `src/core/audit/log-store.ts`:

```ts
import { appendFile, mkdir, writeFile } from "node:fs/promises"
import { dirname } from "node:path"
import { redactSecrets } from "./redact"

export const createAuditLogStore = (file: string) => ({
  async preflight() {
    await mkdir(dirname(file), { recursive: true })
    await appendFile(file, "")
  },
  async append(entry: Record<string, unknown>) {
    const stamped = {
      timestamp: new Date().toISOString(),
      ...entry,
    }
    const json = JSON.stringify(stamped, (_key, value) =>
      typeof value === "string" ? redactSecrets(value) : value,
    )
    await appendFile(file, `${json}\n`)
  },
})
```

When wiring the orchestrator, always write log entries with at least `tool`, `server`, `timestamp`, `approvalStatus`, execution metadata, and either `changedPath` or `command` so both successful and failed actions are traceable.

Write `src/core/audit/git-audit-repo.ts`:

```ts
import { mkdir, writeFile } from "node:fs/promises"
import { dirname, join } from "node:path"

const run = async (cwd: string, args: string[]) => {
  const proc = Bun.spawn(["git", ...args], { cwd, stderr: "pipe", stdout: "pipe" })
  const exitCode = await proc.exited
  if (exitCode !== 0) throw new Error(await new Response(proc.stderr).text())
}

export const createGitAuditRepo = (repoDir: string) => ({
  async preflight() {
    await mkdir(repoDir, { recursive: true })
    await run(repoDir, ["init"])
    await run(repoDir, ["config", "user.name", "Open Code"])
    await run(repoDir, ["config", "user.email", "open-code@local"])
  },
  async captureChange(input: { server: string; path: string; before: string; after: string }) {
    const base = join(repoDir, input.server, input.path.replace(/^\//, ""))
    await mkdir(dirname(base), { recursive: true })
    await writeFile(`${base}.before`, input.before)
    await writeFile(`${base}.after`, input.after)
    await run(repoDir, ["add", "."])
    await run(repoDir, ["commit", "-m", `audit: ${input.server} ${input.path}`])
  },
  async lastCommitMessage() {
    const proc = Bun.spawn(["git", "log", "-1", "--pretty=%s"], { cwd: repoDir, stdout: "pipe" })
    return (await new Response(proc.stdout).text()).trim()
  },
})
```

- [ ] **Step 4: Run the audit checks**

Run: `bun test tests/unit/audit.test.ts`
Expected: PASS

Run: `bun run typecheck`
Expected: PASS

- [ ] **Step 5: Commit**

```bash
git add src/core/audit/redact.ts src/core/audit/log-store.ts src/core/audit/git-audit-repo.ts tests/unit/audit.test.ts
git commit -m "feat: add audit logging and git snapshot storage"
```

|
|
686
|
-
## Task 6: Implement SSH Runtime And Patch Application
|
|
687
|
-
|
|
688
|
-
**Files:**
|
|
689
|
-
- Create: `src/core/patch.ts`
|
|
690
|
-
- Create: `src/core/ssh/ssh-runtime.ts`
|
|
691
|
-
- Create: `tests/integration/fake-ssh-server.ts`
|
|
692
|
-
- Create: `tests/integration/ssh-runtime.test.ts`
|
|
693
|
-
|
|
694
|
-
- [ ] **Step 1: Write failing integration tests for exec, read, list, write, and patch**
|
|
695
|
-
|
|
696
|
-
```ts
|
|
697
|
-
import { afterAll, beforeAll, describe, expect, test } from "bun:test"
|
|
698
|
-
import { startFakeSshServer } from "./fake-ssh-server"
|
|
699
|
-
import { createSshRuntime } from "../../src/core/ssh/ssh-runtime"
|
|
700
|
-
|
|
701
|
-
describe("ssh runtime", () => {
|
|
702
|
-
let server: Awaited<ReturnType<typeof startFakeSshServer>>
|
|
703
|
-
let runtime: ReturnType<typeof createSshRuntime>
|
|
704
|
-
|
|
705
|
-
beforeAll(async () => {
|
|
706
|
-
server = await startFakeSshServer()
|
|
707
|
-
runtime = createSshRuntime()
|
|
708
|
-
})
|
|
709
|
-
|
|
710
|
-
afterAll(async () => {
|
|
711
|
-
await server.stop()
|
|
712
|
-
})
|
|
713
|
-
|
|
714
|
-
test("executes a safe remote command", async () => {
|
|
715
|
-
const result = await runtime.exec(server.connection, "cat /tmp/open-code/hosts")
|
|
716
|
-
expect(result.stdout).toContain("localhost")
|
|
717
|
-
})
|
|
718
|
-
|
|
719
|
-
test("writes and reads a remote file through sftp", async () => {
|
|
720
|
-
await runtime.writeFile(server.connection, "/tmp/open-code/app.conf", "port=80\n")
|
|
721
|
-
expect(await runtime.readFile(server.connection, "/tmp/open-code/app.conf")).toBe("port=80\n")
|
|
722
|
-
})
|
|
723
|
-
|
|
724
|
-
test("lists and stats remote paths", async () => {
|
|
725
|
-
const entries = await runtime.listDir(server.connection, "/tmp/open-code", false, 50)
|
|
726
|
-
expect(Array.isArray(entries)).toBe(true)
|
|
727
|
-
expect(await runtime.stat(server.connection, "/tmp/open-code/hosts")).toMatchObject({ isFile: true })
|
|
728
|
-
})
|
|
729
|
-
})
|
|
730
|
-
```
|
|
731
|
-
|
|
732
|
-
- [ ] **Step 2: Run the integration test and confirm it fails before the implementation exists**
|
|
733
|
-
|
|
734
|
-
Run: `bun test tests/integration/ssh-runtime.test.ts`
|
|
735
|
-
Expected: FAIL with missing module errors for `ssh-runtime`
|
|
736
|
-
|
|
737
|
-
- [ ] **Step 3: Implement patching and SSH/SFTP primitives**
|
|
738
|
-
|
|
739
|
-
Write `tests/integration/fake-ssh-server.ts`:
|
|
740
|
-
|
|
741
|
-
```ts
|
|
742
|
-
import { GenericContainer, Wait } from "testcontainers"
|
|
743
|
-
|
|
744
|
-
export const startFakeSshServer = async () => {
|
|
745
|
-
const container = await new GenericContainer("linuxserver/openssh-server:latest")
|
|
746
|
-
.withEnvironment({
|
|
747
|
-
USER_NAME: "open",
|
|
748
|
-
USER_PASSWORD: "openpass",
|
|
749
|
-
PASSWORD_ACCESS: "true",
|
|
750
|
-
SUDO_ACCESS: "false",
|
|
751
|
-
})
|
|
752
|
-
.withExposedPorts(2222)
|
|
753
|
-
.withWaitStrategy(Wait.forListeningPorts())
|
|
754
|
-
.start()
|
|
755
|
-
|
|
756
|
-
await container.exec([
|
|
757
|
-
"sh",
|
|
758
|
-
"-lc",
|
|
759
|
-
"mkdir -p /tmp/open-code && printf '127.0.0.1 localhost\n' > /tmp/open-code/hosts && printf 'port=80\n' > /tmp/open-code/app.conf",
|
|
760
|
-
])
|
|
761
|
-
|
|
762
|
-
return {
|
|
763
|
-
connection: {
|
|
764
|
-
host: container.getHost(),
|
|
765
|
-
port: container.getMappedPort(2222),
|
|
766
|
-
username: "open",
|
|
767
|
-
password: "openpass",
|
|
768
|
-
},
|
|
769
|
-
stop: () => container.stop(),
|
|
770
|
-
}
|
|
771
|
-
}
|
|
772
|
-
```
|
|
773
|
-
|
|
774
|
-
Write `src/core/patch.ts`:
|
|
775
|
-
|
|
776
|
-
```ts
|
|
777
|
-
import { applyPatch } from "diff"
|
|
778
|
-
|
|
779
|
-
export const applyUnifiedPatch = (source: string, patch: string) => {
|
|
780
|
-
const next = applyPatch(source, patch)
|
|
781
|
-
if (next === false) throw new Error("patch apply failed")
|
|
782
|
-
return next
|
|
783
|
-
}
|
|
784
|
-
```
|
|
785
|
-
|
|
786
|
-
Write `src/core/ssh/ssh-runtime.ts`:
|
|
787
|
-
|
|
788
|
-
```ts
|
|
789
|
-
import { Client } from "ssh2"
|
|
790
|
-
import { escape } from "shescape"
|
|
791
|
-
|
|
792
|
-
export const createSshRuntime = () => ({
|
|
793
|
-
exec(connection: any, command: string, options: { cwd?: string; timeout?: number } = {}) {
|
|
794
|
-
return new Promise<{ stdout: string; stderr: string; exitCode: number }>((resolve, reject) => {
|
|
795
|
-
const client = new Client()
|
|
796
|
-
const timer = options.timeout
|
|
797
|
-
? setTimeout(() => {
|
|
798
|
-
client.end()
|
|
799
|
-
reject(new Error(`command timed out after ${options.timeout}ms`))
|
|
800
|
-
}, options.timeout)
|
|
801
|
-
: null
|
|
802
|
-
client
|
|
803
|
-
.on("ready", () => {
|
|
804
|
-
const effective = options.cwd ? `cd ${escape(options.cwd)} && ${command}` : command
|
|
805
|
-
client.exec(effective, (error, stream) => {
|
|
806
|
-
if (error) return reject(error)
|
|
807
|
-
let stdout = ""
|
|
808
|
-
let stderr = ""
|
|
809
|
-
stream.on("data", (chunk) => (stdout += chunk.toString()))
|
|
810
|
-
stream.stderr.on("data", (chunk) => (stderr += chunk.toString()))
|
|
811
|
-
stream.on("close", (exitCode: number) => {
|
|
812
|
-
if (timer) clearTimeout(timer)
|
|
813
|
-
client.end()
|
|
814
|
-
resolve({ stdout, stderr, exitCode })
|
|
815
|
-
})
|
|
816
|
-
})
|
|
817
|
-
})
|
|
818
|
-
.on("error", reject)
|
|
819
|
-
.connect(connection)
|
|
820
|
-
})
|
|
821
|
-
},
|
|
822
|
-
readFile(connection: any, path: string) {
|
|
823
|
-
return new Promise<string>((resolve, reject) => {
|
|
824
|
-
const client = new Client()
|
|
825
|
-
client
|
|
826
|
-
.on("ready", () => {
|
|
827
|
-
client.sftp((error, sftp) => {
|
|
828
|
-
if (error) return reject(error)
|
|
829
|
-
const chunks: Buffer[] = []
|
|
830
|
-
const stream = sftp.createReadStream(path)
|
|
831
|
-
stream.on("data", (chunk) => chunks.push(Buffer.from(chunk)))
|
|
832
|
-
stream.on("error", (readError) => {
|
|
833
|
-
client.end()
|
|
834
|
-
reject(readError)
|
|
835
|
-
})
|
|
836
|
-
stream.on("close", () => {
|
|
837
|
-
client.end()
|
|
838
|
-
resolve(Buffer.concat(chunks).toString("utf8"))
|
|
839
|
-
})
|
|
840
|
-
})
|
|
841
|
-
})
|
|
842
|
-
.on("error", reject)
|
|
843
|
-
.connect(connection)
|
|
844
|
-
})
|
|
845
|
-
},
|
|
846
|
-
async writeFile(connection: any, path: string, content: string, mode?: number) {
|
|
847
|
-
return new Promise<void>((resolve, reject) => {
|
|
848
|
-
const client = new Client()
|
|
849
|
-
client
|
|
850
|
-
.on("ready", () => {
|
|
851
|
-
client.sftp((error, sftp) => {
|
|
852
|
-
if (error) return reject(error)
|
|
853
|
-
const stream = sftp.createWriteStream(path, mode ? { mode } : undefined)
|
|
854
|
-
stream.on("error", (writeError) => {
|
|
855
|
-
client.end()
|
|
856
|
-
reject(writeError)
|
|
857
|
-
})
|
|
858
|
-
stream.on("close", () => {
|
|
859
|
-
client.end()
|
|
860
|
-
resolve()
|
|
861
|
-
})
|
|
862
|
-
stream.end(content)
|
|
863
|
-
})
|
|
864
|
-
})
|
|
865
|
-
.on("error", reject)
|
|
866
|
-
.connect(connection)
|
|
867
|
-
})
|
|
868
|
-
},
|
|
869
|
-
async listDir(connection: any, path: string, recursive = false, limit = 200) {
|
|
870
|
-
if (recursive) {
|
|
871
|
-
const listed = await this.exec(connection, `find ${escape(path)} | head -n ${limit}`)
|
|
872
|
-
return listed.stdout.trim().split("\n").filter(Boolean)
|
|
873
|
-
}
|
|
874
|
-
|
|
875
|
-
return new Promise<any[]>((resolve, reject) => {
|
|
876
|
-
const client = new Client()
|
|
877
|
-
client
|
|
878
|
-
.on("ready", () => {
|
|
879
|
-
client.sftp((error, sftp) => {
|
|
880
|
-
if (error) return reject(error)
|
|
881
|
-
sftp.readdir(path, (readError, entries) => {
|
|
882
|
-
client.end()
|
|
883
|
-
if (readError) return reject(readError)
|
|
884
|
-
resolve(entries.map((entry: any) => ({ name: entry.filename, longname: entry.longname })))
|
|
885
|
-
})
|
|
886
|
-
})
|
|
887
|
-
})
|
|
888
|
-
.on("error", reject)
|
|
889
|
-
.connect(connection)
|
|
890
|
-
})
|
|
891
|
-
},
|
|
892
|
-
stat(connection: any, path: string) {
|
|
893
|
-
return new Promise<any>((resolve, reject) => {
|
|
894
|
-
const client = new Client()
|
|
895
|
-
client
|
|
896
|
-
.on("ready", () => {
|
|
897
|
-
client.sftp((error, sftp) => {
|
|
898
|
-
if (error) return reject(error)
|
|
899
|
-
sftp.stat(path, (statError, stats) => {
|
|
900
|
-
client.end()
|
|
901
|
-
if (statError) return reject(statError)
|
|
902
|
-
resolve({
|
|
903
|
-
size: stats.size,
|
|
904
|
-
mode: stats.mode,
|
|
905
|
-
isFile: stats.isFile(),
|
|
906
|
-
isDirectory: stats.isDirectory(),
|
|
907
|
-
})
|
|
908
|
-
})
|
|
909
|
-
})
|
|
910
|
-
})
|
|
911
|
-
.on("error", reject)
|
|
912
|
-
.connect(connection)
|
|
913
|
-
})
|
|
914
|
-
},
|
|
915
|
-
})
|
|
916
|
-
```
|
|
917
|
-
|
|
918
|
-
- [ ] **Step 4: Run the integration checks**
|
|
919
|
-
|
|
920
|
-
Run: `bun test tests/integration/ssh-runtime.test.ts`
|
|
921
|
-
Expected: PASS
|
|
922
|
-
|
|
923
|
-
Run: `bun run typecheck`
|
|
924
|
-
Expected: PASS
|
|
925
|
-
|
|
926
|
-
- [ ] **Step 5: Commit**
|
|
927
|
-
|
|
928
|
-
```bash
|
|
929
|
-
git add src/core/patch.ts src/core/ssh/ssh-runtime.ts tests/integration/fake-ssh-server.ts tests/integration/ssh-runtime.test.ts
|
|
930
|
-
git commit -m "feat: add ssh runtime and patch application"
|
|
931
|
-
```
|
|
932
|
-
|
|
933
|
-
## Task 7: Build The Central Orchestrator And Enforce Audit Semantics
|
|
934
|
-
|
|
935
|
-
**Files:**
|
|
936
|
-
- Create: `src/core/orchestrator.ts`
|
|
937
|
-
- Create: `tests/integration/orchestrator.test.ts`
|
|
938
|
-
|
|
939
|
-
- [ ] **Step 1: Write failing orchestrator tests for policy enforcement and partial failures**
|
|
940
|
-
|
|
941
|
-
```ts
|
|
942
|
-
import { describe, expect, test } from "bun:test"
|
|
943
|
-
import { createOrchestrator } from "../../src/core/orchestrator"
|
|
944
|
-
|
|
945
|
-
describe("tool orchestrator", () => {
|
|
946
|
-
test("auto-allows safe remote exec commands", async () => {
|
|
947
|
-
const orchestrator = createOrchestrator({
|
|
948
|
-
registry: { resolve: async () => ({ id: "prod-a" }) },
|
|
949
|
-
policy: { classifyRemoteExec: () => ({ decision: "auto-allow", reason: "safe inspection command" }) },
|
|
950
|
-
ssh: {
|
|
951
|
-
exec: async (_server: any, _command: string, options: any) => ({
|
|
952
|
-
stdout: options.cwd,
|
|
953
|
-
stderr: "",
|
|
954
|
-
exitCode: options.timeout,
|
|
955
|
-
}),
|
|
956
|
-
},
|
|
957
|
-
audit: { preflightLog: async () => {}, appendLog: async () => {} },
|
|
958
|
-
})
|
|
959
|
-
|
|
960
|
-
const result = await orchestrator.remoteExec({
|
|
961
|
-
server: "prod-a",
|
|
962
|
-
command: "cat /etc/hosts",
|
|
963
|
-
cwd: "/etc",
|
|
964
|
-
timeout: 5000,
|
|
965
|
-
})
|
|
966
|
-
expect(result).toMatchObject({ status: "ok", data: { stdout: "/etc", exitCode: 5000 } })
|
|
967
|
-
})
|
|
968
|
-
|
|
969
|
-
test("returns partial failure when audit snapshot finalization fails after a successful write", async () => {
|
|
970
|
-
const orchestrator = createOrchestrator({
|
|
971
|
-
registry: { resolve: async () => ({ id: "prod-a" }) },
|
|
972
|
-
policy: { classifyRemoteExec: () => ({ decision: "approval-required", reason: "write" }) },
|
|
973
|
-
ssh: {
|
|
974
|
-
readFile: async () => "port=80\n",
|
|
975
|
-
writeFile: async () => {},
|
|
976
|
-
},
|
|
977
|
-
audit: {
|
|
978
|
-
preflightLog: async () => {},
|
|
979
|
-
preflightSnapshots: async () => {},
|
|
980
|
-
appendLog: async () => {},
|
|
981
|
-
captureSnapshots: async () => {
|
|
982
|
-
throw new Error("git commit failed")
|
|
983
|
-
},
|
|
984
|
-
},
|
|
985
|
-
})
|
|
986
|
-
|
|
987
|
-
const result = await orchestrator.remoteWriteFile({
|
|
988
|
-
server: "prod-a",
|
|
989
|
-
path: "/tmp/app.conf",
|
|
990
|
-
content: "port=81\n",
|
|
991
|
-
mode: 0o640,
|
|
992
|
-
})
|
|
993
|
-
expect(result.status).toBe("partial_failure")
|
|
994
|
-
})
|
|
995
|
-
|
|
996
|
-
test("keeps execution and audit scoped to the addressed server", async () => {
|
|
997
|
-
const logs: any[] = []
|
|
998
|
-
const orchestrator = createOrchestrator({
|
|
999
|
-
registry: {
|
|
1000
|
-
resolve: async (id: string) => ({ id }),
|
|
1001
|
-
},
|
|
1002
|
-
policy: { classifyRemoteExec: () => ({ decision: "auto-allow", reason: "safe inspection command" }) },
|
|
1003
|
-
ssh: {
|
|
1004
|
-
exec: async (server: any) => ({ stdout: server.id, stderr: "", exitCode: 0 }),
|
|
1005
|
-
},
|
|
1006
|
-
audit: {
|
|
1007
|
-
preflightLog: async () => {},
|
|
1008
|
-
appendLog: async (entry: any) => {
|
|
1009
|
-
logs.push(entry)
|
|
1010
|
-
},
|
|
1011
|
-
},
|
|
1012
|
-
})
|
|
1013
|
-
|
|
1014
|
-
const first = await orchestrator.remoteExec({ server: "prod-a", command: "pwd" })
|
|
1015
|
-
const second = await orchestrator.remoteExec({ server: "prod-b", command: "pwd" })
|
|
1016
|
-
|
|
1017
|
-
expect(first.data.stdout).toBe("prod-a")
|
|
1018
|
-
expect(second.data.stdout).toBe("prod-b")
|
|
1019
|
-
expect(logs.map((entry) => entry.server)).toEqual(["prod-a", "prod-b"])
|
|
1020
|
-
})
|
|
1021
|
-
|
|
1022
|
-
test("keeps file writes and snapshots partitioned across two registered servers", async () => {
|
|
1023
|
-
const snapshots: any[] = []
|
|
1024
|
-
const files = new Map([
|
|
1025
|
-
["prod-a:/tmp/app.conf", "port=80\n"],
|
|
1026
|
-
["prod-b:/tmp/app.conf", "port=90\n"],
|
|
1027
|
-
])
|
|
1028
|
-
|
|
1029
|
-
const orchestrator = createOrchestrator({
|
|
1030
|
-
registry: {
|
|
1031
|
-
resolve: async (id: string) => ({ id }),
|
|
1032
|
-
},
|
|
1033
|
-
policy: { classifyRemoteExec: () => ({ decision: "auto-allow", reason: "safe inspection command" }) },
|
|
1034
|
-
ssh: {
|
|
1035
|
-
readFile: async (server: any, path: string) => files.get(`${server.id}:${path}`) ?? "",
|
|
1036
|
-
writeFile: async (server: any, path: string, content: string) => {
|
|
1037
|
-
files.set(`${server.id}:${path}`, content)
|
|
1038
|
-
},
|
|
1039
|
-
},
|
|
1040
|
-
audit: {
|
|
1041
|
-
preflightLog: async () => {},
|
|
1042
|
-
preflightSnapshots: async () => {},
|
|
1043
|
-
appendLog: async () => {},
|
|
1044
|
-
captureSnapshots: async (entry: any) => {
|
|
1045
|
-
snapshots.push(entry)
|
|
1046
|
-
},
|
|
1047
|
-
},
|
|
1048
|
-
})
|
|
1049
|
-
|
|
1050
|
-
await orchestrator.remoteWriteFile({ server: "prod-a", path: "/tmp/app.conf", content: "port=81\n" })
|
|
1051
|
-
await orchestrator.remoteWriteFile({ server: "prod-b", path: "/tmp/app.conf", content: "port=91\n" })
|
|
1052
|
-
|
|
1053
|
-
expect(snapshots).toEqual([
|
|
1054
|
-
expect.objectContaining({ server: "prod-a", path: "/tmp/app.conf", before: "port=80\n", after: "port=81\n" }),
|
|
1055
|
-
expect.objectContaining({ server: "prod-b", path: "/tmp/app.conf", before: "port=90\n", after: "port=91\n" }),
|
|
1056
|
-
])
|
|
1057
|
-
})
|
|
1058
|
-
})
|
|
1059
|
-
```
|
|
1060
|
-
|
|
1061
|
-
- [ ] **Step 2: Run the orchestrator test to verify the implementation is still missing**
|
|
1062
|
-
|
|
1063
|
-
Run: `bun test tests/integration/orchestrator.test.ts`
|
|
1064
|
-
Expected: FAIL with missing module errors for `orchestrator`
|
|
1065
|
-
|
|
1066
|
-
- [ ] **Step 3: Implement the shared execution pipeline**
|
|
1067
|
-
|
|
1068
|
-
Write `src/core/orchestrator.ts`:
|
|
1069
|
-
|
|
1070
|
-
```ts
|
|
1071
|
-
import { applyUnifiedPatch } from "./patch"
|
|
1072
|
-
import { errorResult, okResult, partialFailureResult } from "./result"
|
|
1073
|
-
import { classifyRemoteExec } from "./policy"
|
|
1074
|
-
import { escape } from "shescape"
|
|
1075
|
-
|
|
1076
|
-
export const createOrchestrator = ({ registry, policy = { classifyRemoteExec }, ssh, audit }: any) => ({
|
|
1077
|
-
async listServers() {
|
|
1078
|
-
await audit.preflightLog()
|
|
1079
|
-
const servers = await registry.list()
|
|
1080
|
-
await audit.appendLog({ tool: "list_servers", approvalStatus: "not-required", count: servers.length })
|
|
1081
|
-
return okResult({ tool: "list_servers", data: servers.map(({ auth, ...server }: any) => server) })
|
|
1082
|
-
},
|
|
1083
|
-
|
|
1084
|
-
async remoteExec(input: { server: string; command: string; cwd?: string; timeout?: number }) {
|
|
1085
|
-
await audit.preflightLog()
|
|
1086
|
-
const server = await registry.resolve(input.server)
|
|
1087
|
-
if (!server) {
|
|
1088
|
-
await audit.appendLog({ tool: "remote_exec", server: input.server, command: input.command, approvalStatus: "unknown", code: "SERVER_NOT_FOUND" })
|
|
1089
|
-
return errorResult({
|
|
1090
|
-
tool: "remote_exec",
|
|
1091
|
-
server: input.server,
|
|
1092
|
-
code: "SERVER_NOT_FOUND",
|
|
1093
|
-
execution: { attempted: false, completed: false },
|
|
1094
|
-
audit: { logWritten: true, snapshotStatus: "not-applicable" },
|
|
1095
|
-
})
|
|
1096
|
-
}
|
|
1097
|
-
|
|
1098
|
-
const decision = policy.classifyRemoteExec(input.command)
|
|
1099
|
-
if (decision.decision === "reject") {
|
|
1100
|
-
await audit.appendLog({ tool: "remote_exec", server: input.server, command: input.command, approvalStatus: "not-required", code: "POLICY_REJECTED" })
|
|
1101
|
-
return errorResult({
|
|
1102
|
-
tool: "remote_exec",
|
|
1103
|
-
server: input.server,
|
|
1104
|
-
code: "POLICY_REJECTED",
|
|
1105
|
-
message: decision.reason,
|
|
1106
|
-
execution: { attempted: false, completed: false },
|
|
1107
|
-
audit: { logWritten: true, snapshotStatus: "not-applicable" },
|
|
1108
|
-
})
|
|
1109
|
-
}
|
|
1110
|
-
|
|
1111
|
-
let executed
|
|
1112
|
-
try {
|
|
1113
|
-
executed = await ssh.exec(server, input.command, { cwd: input.cwd, timeout: input.timeout })
|
|
1114
|
-
} catch (error: any) {
|
|
1115
|
-
await audit.appendLog({ tool: "remote_exec", server: input.server, command: input.command, approvalStatus: decision.decision === "approval-required" ? "host-managed-required" : "not-required", code: "SSH_EXEC_FAILED", message: error.message })
|
|
1116
|
-
return errorResult({
|
|
1117
|
-
tool: "remote_exec",
|
|
1118
|
-
server: input.server,
|
|
1119
|
-
code: "SSH_EXEC_FAILED",
|
|
1120
|
-
message: error.message,
|
|
1121
|
-
execution: { attempted: true, completed: false },
|
|
1122
|
-
audit: { logWritten: true, snapshotStatus: "not-applicable" },
|
|
1123
|
-
})
|
|
1124
|
-
}
|
|
1125
|
-
await audit.appendLog({
|
|
1126
|
-
tool: "remote_exec",
|
|
1127
|
-
server: input.server,
|
|
1128
|
-
command: input.command,
|
|
1129
|
-
cwd: input.cwd,
|
|
1130
|
-
timeout: input.timeout,
|
|
1131
|
-
approvalStatus: decision.decision === "approval-required" ? "host-managed-required" : "not-required",
|
|
1132
|
-
policyDecision: decision.decision,
|
|
1133
|
-
approvalRequired: decision.decision === "approval-required",
|
|
1134
|
-
...executed,
|
|
1135
|
-
})
|
|
1136
|
-
return okResult({
|
|
1137
|
-
tool: "remote_exec",
|
|
1138
|
-
server: input.server,
|
|
1139
|
-
data: executed,
|
|
1140
|
-
execution: {
|
|
1141
|
-
attempted: true,
|
|
1142
|
-
completed: true,
|
|
1143
|
-
exitCode: executed.exitCode,
|
|
1144
|
-
stdoutBytes: executed.stdout.length,
|
|
1145
|
-
stderrBytes: executed.stderr.length,
|
|
1146
|
-
},
|
|
1147
|
-
audit: { logWritten: true, snapshotStatus: "not-applicable" },
|
|
1148
|
-
})
|
|
1149
|
-
},
|
|
1150
|
-
|
|
1151
|
-
async remoteReadFile(input: { server: string; path: string; offset?: number; length?: number }) {
|
|
1152
|
-
await audit.preflightLog()
|
|
1153
|
-
const server = await registry.resolve(input.server)
|
|
1154
|
-
if (!server) {
|
|
1155
|
-
await audit.appendLog({ tool: "remote_read_file", server: input.server, path: input.path, approvalStatus: "not-required", code: "SERVER_NOT_FOUND" })
|
|
1156
|
-
return errorResult({ tool: "remote_read_file", server: input.server, code: "SERVER_NOT_FOUND" })
|
|
1157
|
-
}
|
|
1158
|
-
let body: string
|
|
1159
|
-
try {
|
|
1160
|
-
body = await ssh.readFile(server, input.path)
|
|
1161
|
-
} catch (error: any) {
|
|
1162
|
-
await audit.appendLog({ tool: "remote_read_file", server: input.server, path: input.path, approvalStatus: "not-required", code: "SSH_READ_FAILED", message: error.message })
|
|
1163
|
-
return errorResult({ tool: "remote_read_file", server: input.server, code: "SSH_READ_FAILED", message: error.message })
|
|
1164
|
-
}
|
|
1165
|
-
const sliced = body.slice(input.offset ?? 0, input.length ? (input.offset ?? 0) + input.length : undefined)
|
|
1166
|
-
await audit.appendLog({ tool: "remote_read_file", server: input.server, path: input.path, approvalStatus: "not-required" })
|
|
1167
|
-
return okResult({ tool: "remote_read_file", server: input.server, data: { content: sliced } })
|
|
1168
|
-
},
|
|
1169
|
-
|
|
1170
|
-
async remoteWriteFile(input: { server: string; path: string; content: string; mode?: number }) {
|
|
1171
|
-
await audit.preflightLog()
|
|
1172
|
-
const server = await registry.resolve(input.server)
|
|
1173
|
-
if (!server) {
|
|
1174
|
-
await audit.appendLog({ tool: "remote_write_file", server: input.server, path: input.path, approvalStatus: "host-managed-required", code: "SERVER_NOT_FOUND" })
|
|
1175
|
-
return errorResult({ tool: "remote_write_file", server: input.server, code: "SERVER_NOT_FOUND" })
|
|
1176
|
-
}
|
|
1177
|
-
|
|
1178
|
-
await audit.preflightSnapshots()
|
|
1179
|
-
const before = await ssh.readFile(server, input.path).catch(() => "")
|
|
1180
|
-
try {
|
|
1181
|
-
await ssh.writeFile(server, input.path, input.content, input.mode)
|
|
1182
|
-
} catch (error: any) {
|
|
1183
|
-
await audit.appendLog({ tool: "remote_write_file", server: input.server, path: input.path, approvalStatus: "host-managed-required", code: "SSH_WRITE_FAILED", message: error.message })
|
|
1184
|
-
return errorResult({
|
|
1185
|
-
tool: "remote_write_file",
|
|
1186
|
-
server: input.server,
|
|
1187
|
-
code: "SSH_WRITE_FAILED",
|
|
1188
|
-
message: error.message,
|
|
1189
|
-
execution: { attempted: true, completed: false },
|
|
1190
|
-
audit: { logWritten: true, snapshotStatus: "not-applicable" },
|
|
1191
|
-
})
|
|
1192
|
-
}
|
|
1193
|
-
const after = await ssh.readFile(server, input.path)
|
|
1194
|
-
await audit.appendLog({
|
|
1195
|
-
tool: "remote_write_file",
|
|
1196
|
-
server: input.server,
|
|
1197
|
-
path: input.path,
|
|
1198
|
-
mode: input.mode,
|
|
1199
|
-
changedPath: input.path,
|
|
1200
|
-
approvalStatus: "host-managed-required",
|
|
1201
|
-
approvalRequired: true,
|
|
1202
|
-
})
|
|
1203
|
-
|
|
1204
|
-
try {
|
|
1205
|
-
await audit.captureSnapshots({
|
|
1206
|
-
server: input.server,
|
|
1207
|
-
path: input.path,
|
|
1208
|
-
before,
|
|
1209
|
-
after,
|
|
1210
|
-
})
|
|
1211
|
-
return okResult({
|
|
1212
|
-
tool: "remote_write_file",
|
|
1213
|
-
server: input.server,
|
|
1214
|
-
execution: { attempted: true, completed: true },
|
|
1215
|
-
audit: { logWritten: true, snapshotStatus: "written" },
|
|
1216
|
-
})
|
|
1217
|
-
} catch (error: any) {
|
|
1218
|
-
return partialFailureResult({
|
|
1219
|
-
tool: "remote_write_file",
|
|
1220
|
-
server: input.server,
|
|
1221
|
-
message: `remote write succeeded but audit finalization failed: ${error.message}`,
|
|
1222
|
-
execution: { attempted: true, completed: true },
|
|
1223
|
-
audit: { logWritten: true, snapshotStatus: "partial-failure" },
|
|
1224
|
-
})
|
|
1225
|
-
}
|
|
1226
|
-
},
|
|
1227
|
-
|
|
1228
|
-
async remotePatchFile(input: { server: string; path: string; patch: string }) {
|
|
1229
|
-
await audit.preflightLog()
|
|
1230
|
-
const server = await registry.resolve(input.server)
|
|
1231
|
-
if (!server) {
|
|
1232
|
-
await audit.appendLog({ tool: "remote_patch_file", server: input.server, path: input.path, approvalStatus: "host-managed-required", code: "SERVER_NOT_FOUND" })
|
|
1233
|
-
return errorResult({ tool: "remote_patch_file", server: input.server, code: "SERVER_NOT_FOUND" })
|
|
1234
|
-
}
|
|
1235
|
-
let before: string
|
|
1236
|
-
try {
|
|
1237
|
-
before = await ssh.readFile(server, input.path)
|
|
1238
|
-
} catch (error: any) {
|
|
1239
|
-
await audit.appendLog({ tool: "remote_patch_file", server: input.server, path: input.path, approvalStatus: "host-managed-required", code: "SSH_READ_FAILED", message: error.message })
|
|
1240
|
-
return errorResult({ tool: "remote_patch_file", server: input.server, code: "SSH_READ_FAILED", message: error.message })
|
|
1241
|
-
}
|
|
1242
|
-
let after: string
|
|
1243
|
-
try {
|
|
1244
|
-
after = applyUnifiedPatch(before, input.patch)
|
|
1245
|
-
} catch (error: any) {
|
|
1246
|
-
await audit.appendLog({ tool: "remote_patch_file", server: input.server, path: input.path, approvalStatus: "host-managed-required", code: "PATCH_APPLY_FAILED", message: error.message })
|
|
1247
|
-
return errorResult({
|
|
1248
|
-
tool: "remote_patch_file",
|
|
1249
|
-
server: input.server,
|
|
1250
|
-
code: "PATCH_APPLY_FAILED",
|
|
1251
|
-
message: error.message,
|
|
1252
|
-
})
|
|
1253
|
-
}
|
|
1254
|
-
return this.remoteWriteFile({ server: input.server, path: input.path, content: after })
|
|
1255
|
-
},
|
|
1256
|
-
|
|
1257
|
-
async remoteListDir(input: { server: string; path: string; recursive?: boolean; limit?: number }) {
|
|
1258
|
-
await audit.preflightLog()
|
|
1259
|
-
const server = await registry.resolve(input.server)
|
|
1260
|
-
if (!server) {
|
|
1261
|
-
await audit.appendLog({ tool: "remote_list_dir", server: input.server, path: input.path, approvalStatus: "not-required", code: "SERVER_NOT_FOUND" })
|
|
1262
|
-
return errorResult({ tool: "remote_list_dir", server: input.server, code: "SERVER_NOT_FOUND" })
|
|
1263
|
-
}
|
|
1264
|
-
let entries
|
|
1265
|
-
try {
|
|
1266
|
-
entries = await ssh.listDir(server, input.path, input.recursive ?? false, input.limit ?? 200)
|
|
1267
|
-
} catch (error: any) {
|
|
1268
|
-
await audit.appendLog({ tool: "remote_list_dir", server: input.server, path: input.path, approvalStatus: "not-required", code: "SSH_LIST_FAILED", message: error.message })
|
|
1269
|
-
return errorResult({ tool: "remote_list_dir", server: input.server, code: "SSH_LIST_FAILED", message: error.message })
|
|
1270
|
-
}
|
|
1271
|
-
await audit.appendLog({ tool: "remote_list_dir", server: input.server, path: input.path, approvalStatus: "not-required" })
|
|
1272
|
-
return okResult({ tool: "remote_list_dir", server: input.server, data: entries })
|
|
1273
|
-
},
|
|
1274
|
-
|
|
1275
|
-
async remoteStat(input: { server: string; path: string }) {
|
|
1276
|
-
await audit.preflightLog()
|
|
1277
|
-
const server = await registry.resolve(input.server)
|
|
1278
|
-
if (!server) {
|
|
1279
|
-
await audit.appendLog({ tool: "remote_stat", server: input.server, path: input.path, approvalStatus: "not-required", code: "SERVER_NOT_FOUND" })
|
|
1280
|
-
return errorResult({ tool: "remote_stat", server: input.server, code: "SERVER_NOT_FOUND" })
|
|
1281
|
-
}
|
|
1282
|
-
let stat
|
|
1283
|
-
try {
|
|
1284
|
-
stat = await ssh.stat(server, input.path)
|
|
1285
|
-
} catch (error: any) {
|
|
1286
|
-
await audit.appendLog({ tool: "remote_stat", server: input.server, path: input.path, approvalStatus: "not-required", code: "SSH_STAT_FAILED", message: error.message })
|
|
1287
|
-
return errorResult({ tool: "remote_stat", server: input.server, code: "SSH_STAT_FAILED", message: error.message })
|
|
1288
|
-
}
|
|
1289
|
-
await audit.appendLog({ tool: "remote_stat", server: input.server, path: input.path, approvalStatus: "not-required" })
|
|
1290
|
-
return okResult({ tool: "remote_stat", server: input.server, data: stat })
|
|
1291
|
-
},
|
|
1292
|
-
|
|
1293
|
-
async remoteFind(input: { server: string; path: string; pattern: string; glob?: string; limit?: number }) {
|
|
1294
|
-
await audit.preflightLog()
|
|
1295
|
-
const server = await registry.resolve(input.server)
|
|
1296
|
-
if (!server) {
|
|
1297
|
-
await audit.appendLog({ tool: "remote_find", server: input.server, path: input.path, approvalStatus: "not-required", code: "SERVER_NOT_FOUND" })
|
|
1298
|
-
return errorResult({ tool: "remote_find", server: input.server, code: "SERVER_NOT_FOUND" })
|
|
1299
|
-
}
|
|
1300
|
-
const command = input.glob
|
|
1301
|
-
? `find ${escape(input.path)} -name ${escape(input.glob)} | head -n ${input.limit ?? 200}`
|
|
1302
|
-
: `grep -R -n ${escape(input.pattern)} ${escape(input.path)} | head -n ${input.limit ?? 200}`
|
|
1303
|
-
let executed
|
|
1304
|
-
try {
|
|
1305
|
-
executed = await ssh.exec(server, command)
|
|
1306
|
-
} catch (error: any) {
|
|
1307
|
-
await audit.appendLog({ tool: "remote_find", server: input.server, command, approvalStatus: "not-required", code: "SSH_FIND_FAILED", message: error.message })
|
|
1308
|
-
return errorResult({ tool: "remote_find", server: input.server, code: "SSH_FIND_FAILED", message: error.message })
|
|
1309
|
-
}
|
|
1310
|
-
await audit.appendLog({ tool: "remote_find", server: input.server, command, approvalStatus: "not-required", ...executed })
|
|
1311
|
-
return okResult({ tool: "remote_find", server: input.server, data: executed })
|
|
1312
|
-
},
|
|
1313
|
-
})
|
|
1314
|
-
```
|
|
1315
|
-
|
|
1316
|
-
Apply the same `execution` and `audit` envelope to the other success and failure returns in this file, even when the snippet above shows only one representative branch per tool.
|
|
1317
|
-
|
|
1318
|
-
- [ ] **Step 4: Run the orchestrator checks**
|
|
1319
|
-
|
|
1320
|
-
Run: `bun test tests/integration/orchestrator.test.ts`
|
|
1321
|
-
Expected: PASS
|
|
1322
|
-
|
|
1323
|
-
Run: `bun test tests/unit/*.test.ts tests/integration/ssh-runtime.test.ts tests/integration/orchestrator.test.ts`
|
|
1324
|
-
Expected: PASS
|
|
1325
|
-
|
|
1326
|
-
- [ ] **Step 5: Commit**
|
|
1327
|
-
|
|
1328
|
-
```bash
|
|
1329
|
-
git add src/core/orchestrator.ts tests/integration/orchestrator.test.ts
|
|
1330
|
-
git commit -m "feat: add orchestrator for policy, audit, and ssh flows"
|
|
1331
|
-
```
|
|
1332
|
-
|
|
1333
|
-
## Task 8: Add The OpenCode Adapter And Register Explicit Tools
|
|
1334
|
-
|
|
1335
|
-
**Files:**
|
|
1336
|
-
- Create: `src/opencode/plugin.ts`
|
|
1337
|
-
- Create: `tests/unit/opencode-plugin.test.ts`
|
|
1338
|
-
- Modify: `src/index.ts`
|
|
1339
|
-
|
|
1340
|
-
- [ ] **Step 1: Write failing tests for tool registration**
|
|
1341
|
-
|
|
1342
|
-
```ts
|
|
1343
|
-
import { describe, expect, test } from "bun:test"
|
|
1344
|
-
import { OpenCodePlugin } from "../../src/index"
|
|
1345
|
-
|
|
1346
|
-
describe("OpenCode plugin", () => {
|
|
1347
|
-
test("registers the expected explicit remote tools", async () => {
|
|
1348
|
-
const hooks = await OpenCodePlugin({
|
|
1349
|
-
client: { app: { log: async () => {} } },
|
|
1350
|
-
directory: process.cwd(),
|
|
1351
|
-
worktree: process.cwd(),
|
|
1352
|
-
} as any)
|
|
1353
|
-
|
|
1354
|
-
expect(Object.keys(hooks.tool)).toEqual([
|
|
1355
|
-
"list_servers",
|
|
1356
|
-
"remote_exec",
|
|
1357
|
-
"remote_read_file",
|
|
1358
|
-
"remote_write_file",
|
|
1359
|
-
"remote_patch_file",
|
|
1360
|
-
"remote_list_dir",
|
|
1361
|
-
"remote_stat",
|
|
1362
|
-
"remote_find",
|
|
1363
|
-
])
|
|
1364
|
-
})
|
|
1365
|
-
})
|
|
1366
|
-
```
|
|
1367
|
-
|
|
1368
|
-
- [ ] **Step 2: Run the plugin test and confirm it fails before implementation**

Run: `bun test tests/unit/opencode-plugin.test.ts`
Expected: FAIL because `OpenCodePlugin` does not yet return `tool` definitions

- [ ] **Step 3: Implement the OpenCode adapter with explicit custom tools**

Write `src/opencode/plugin.ts`:

```ts
import { tool, type Plugin } from "@opencode-ai/plugin"
import { createAuditLogStore } from "../core/audit/log-store"
import { createGitAuditRepo } from "../core/audit/git-audit-repo"
import { createOrchestrator } from "../core/orchestrator"
import { runtimePaths } from "../core/paths"
import { createKeychainSecretProvider } from "../core/registry/keychain-provider"
import { createServerRegistry } from "../core/registry/server-registry"
import { createSshRuntime } from "../core/ssh/ssh-runtime"

const createRuntimeDependencies = () => ({
  registry: createServerRegistry({
    registryFile: runtimePaths.registryFile,
    secretProvider: createKeychainSecretProvider(),
  }),
  ssh: createSshRuntime(),
  audit: {
    ...createAuditLogStore(runtimePaths.auditLogFile),
    preflightLog: async () => createAuditLogStore(runtimePaths.auditLogFile).preflight(),
    preflightSnapshots: async () => createGitAuditRepo(runtimePaths.auditRepoDir).preflight(),
    captureSnapshots: async (change: any) => createGitAuditRepo(runtimePaths.auditRepoDir).captureChange(change),
  },
})

export const OpenCodePlugin: Plugin = async (_ctx) => {
  const orchestrator = createOrchestrator(createRuntimeDependencies())

  return {
    tool: {
      list_servers: tool({
        description: "List registered remote servers",
        args: {},
        async execute() {
          return orchestrator.listServers()
        },
      }),
      remote_exec: tool({
        description: "Execute a command on a named remote server over SSH",
        args: {
          server: tool.schema.string(),
          command: tool.schema.string(),
          cwd: tool.schema.string().optional(),
          timeout: tool.schema.number().optional(),
        },
        async execute(args) {
          return orchestrator.remoteExec(args)
        },
      }),
      remote_read_file: tool({
        description: "Read a file from a named remote server",
        args: {
          server: tool.schema.string(),
          path: tool.schema.string(),
          offset: tool.schema.number().optional(),
          length: tool.schema.number().optional(),
        },
        async execute(args) {
          return orchestrator.remoteReadFile(args)
        },
      }),
      remote_write_file: tool({
        description: "Write a file to a named remote server with approval and audit",
        args: {
          server: tool.schema.string(),
          path: tool.schema.string(),
          content: tool.schema.string(),
          mode: tool.schema.number().optional(),
        },
        async execute(args) {
          return orchestrator.remoteWriteFile(args)
        },
      }),
      remote_patch_file: tool({
        description: "Apply a unified diff to a remote file with approval and audit",
        args: {
          server: tool.schema.string(),
          path: tool.schema.string(),
          patch: tool.schema.string(),
        },
        async execute(args) {
          return orchestrator.remotePatchFile(args)
        },
      }),
      remote_list_dir: tool({
        description: "List a directory on a named remote server",
        args: {
          server: tool.schema.string(),
          path: tool.schema.string(),
          recursive: tool.schema.boolean().optional(),
          limit: tool.schema.number().optional(),
        },
        async execute(args) {
          return orchestrator.remoteListDir(args)
        },
      }),
      remote_stat: tool({
        description: "Get file metadata on a named remote server",
        args: {
          server: tool.schema.string(),
          path: tool.schema.string(),
        },
        async execute(args) {
          return orchestrator.remoteStat(args)
        },
      }),
      remote_find: tool({
        description: "Search a named remote server for files or content",
        args: {
          server: tool.schema.string(),
          path: tool.schema.string(),
          pattern: tool.schema.string(),
          glob: tool.schema.string().optional(),
          limit: tool.schema.number().optional(),
        },
        async execute(args) {
          return orchestrator.remoteFind(args)
        },
      }),
    },
  }
}
```
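Because every `execute` body above only delegates to the orchestrator, each tool can be exercised against a stub in isolation. A self-contained sketch of that pattern (the `defineTool` stand-in and stub names are hypothetical, not the real `@opencode-ai/plugin` API):

```typescript
// Sketch: a hypothetical stand-in for the plugin `tool()` helper plus a stubbed
// orchestrator, showing that the adapter layer adds no logic of its own.
type ToolDef = { description: string; execute: (args: unknown) => Promise<unknown> }
const defineTool = (def: ToolDef): ToolDef => def

const calls: Array<{ method: string; args: unknown }> = []
const orchestratorStub = {
  remoteExec: async (args: unknown) => {
    calls.push({ method: "remoteExec", args })
    return { ok: true, stdout: "stubbed" }
  },
}

const remote_exec = defineTool({
  description: "Execute a command on a named remote server over SSH",
  execute: async (args) => orchestratorStub.remoteExec(args),
})

// The stub records the delegated call synchronously, before the promise settles.
void remote_exec.execute({ server: "web-1", command: "uptime" })
```

If a test like this starts needing more than a one-line stub per tool, that is the signal (per the notes below) that logic is leaking into the adapter.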

Modify `src/index.ts`:

```ts
export { OpenCodePlugin } from "./opencode/plugin"
```

- [ ] **Step 4: Run the adapter checks**

Run: `bun test tests/unit/opencode-plugin.test.ts`
Expected: PASS

Run: `bun run build`
Expected: PASS and `dist/` emitted

- [ ] **Step 5: Commit**

```bash
git add src/opencode/plugin.ts src/index.ts tests/unit/opencode-plugin.test.ts
git commit -m "feat: register explicit remote tools for opencode"
```

## Task 9: Add A Local OpenCode Smoke Fixture And Finish The Docs

**Files:**
- Create: `examples/opencode-local/.opencode/package.json`
- Create: `examples/opencode-local/.opencode/plugins/open-code.ts`
- Create: `examples/opencode-local/opencode.json`
- Modify: `README.md`

- [ ] **Step 1: Write the fixture files and README sections before manual verification**

Write `examples/opencode-local/.opencode/package.json`:

```json
{
  "dependencies": {
    "@opencode-ai/plugin": "*",
    "diff": "*",
    "env-paths": "*",
    "keytar": "*",
    "shescape": "*",
    "ssh2": "*"
  }
}
```

Write `examples/opencode-local/.opencode/plugins/open-code.ts`:

```ts
export { OpenCodePlugin as default } from "../../../../dist/index.js"
```

Write `examples/opencode-local/opencode.json`:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "permission": {
    "list_servers": "allow",
    "remote_read_file": "allow",
    "remote_list_dir": "allow",
    "remote_stat": "allow",
    "remote_find": "allow",
    "remote_write_file": "ask",
    "remote_patch_file": "ask",
    "remote_exec": {
      "*": "ask",
      "cat *": "allow",
      "grep *": "allow",
      "find *": "allow",
      "ls *": "allow",
      "pwd": "allow",
      "uname *": "allow",
      "df *": "allow",
      "free *": "allow",
      "ps *": "allow",
      "systemctl status *": "allow"
    }
  }
}
```

Add to `README.md`:

```md
## Development

Run `bun install`, `bun test`, and `bun run build`.
Integration tests require Docker because `tests/integration/fake-ssh-server.ts` uses `testcontainers`.

## Manual OpenCode Smoke Test

1. `bun run build`
2. `cd examples/opencode-local`
3. `opencode`
4. Ask the agent to call `list_servers`
5. Ask the agent to call `remote_exec` with `cat /etc/hosts`
6. Ask the agent to call `remote_write_file` and confirm the approval prompt appears
```

- [ ] **Step 2: Run the automated verification before the manual smoke test**

Run: `bun test`
Expected: PASS

Run: `bun run build`
Expected: PASS

- [ ] **Step 3: Run the manual OpenCode smoke test**

Run:

```bash
cd examples/opencode-local
opencode
```

Expected:
- `list_servers`, `remote_exec`, `remote_read_file`, `remote_write_file`, `remote_patch_file`, `remote_list_dir`, `remote_stat`, and `remote_find` are visible to the host.
- `remote_exec` with `cat /etc/hosts` runs without a prompt.
- `remote_write_file` triggers an approval prompt.

If `remote_exec` object permissions do not match the `command` argument as expected, stop implementation and revisit the host adapter before shipping v1.

- [ ] **Step 4: Run the final project checks**

Run: `bun test`
Expected: PASS

Run: `bun run typecheck`
Expected: PASS

- [ ] **Step 5: Commit**

```bash
git add examples/opencode-local/.opencode/package.json examples/opencode-local/.opencode/plugins/open-code.ts examples/opencode-local/opencode.json README.md
git commit -m "docs: add local opencode smoke fixture"
```

## Final Verification

- Run: `bun test`
  - Expected: full test suite passes
- Run: `bun run typecheck`
  - Expected: no TypeScript errors
- Run: `bun run build`
  - Expected: `dist/index.js` and type declarations are emitted
- Run: manual smoke test in `examples/opencode-local`
  - Expected: explicit remote tools load, safe reads auto-run, writes prompt, audit files appear under runtime data paths

## Notes For The Implementer

- Keep the `opencode` adapter thin. If logic starts accumulating in `src/opencode/plugin.ts`, move it into `src/core/`.
- Do not weaken the audit contract. Logging preflight must happen before remote execution, and dedicated file writes must preflight snapshot storage before mutating remote files.
- Do not add session-wide approvals in v1.
- Prefer extending dedicated remote file tools over making `remote_exec` more permissive.
- If OpenCode host permissions behave differently from the documented expectation for custom tools, pause and reopen the design instead of silently changing the user-facing safety model.
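The audit-ordering rule above pins down a strict sequence for mutating tools; a synchronous, illustrative sketch (all names are hypothetical, and the real flow is async):

```typescript
// Sketch of the ordering the audit contract demands: preflight both audit
// sinks BEFORE the remote mutation, and snapshot after. Names are illustrative.
const events: string[] = []

const audit = {
  preflightLog: () => events.push("preflightLog"),             // fail fast if the log is unwritable
  preflightSnapshots: () => events.push("preflightSnapshots"), // fail fast if snapshot storage is down
  captureSnapshots: (change: { path: string }) => events.push(`snapshot:${change.path}`),
}

const writeRemoteFile = (path: string) => {
  audit.preflightLog()
  audit.preflightSnapshots()
  events.push("sshWrite") // only now mutate the remote file
  audit.captureSnapshots({ path })
}

writeRemoteFile("/etc/motd")
```

The point of the ordering is that a failed preflight aborts the write entirely, so no remote mutation can ever happen without an audit trail.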