gateproof 0.2.4 → 0.5.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (122)
  1. package/README.md +1447 -153
  2. package/dist/cloudflare/index.d.ts +4 -6
  3. package/dist/cloudflare/index.d.ts.map +1 -1
  4. package/dist/cloudflare/index.js +9 -43
  5. package/dist/cloudflare/index.js.map +1 -1
  6. package/dist/index.d.ts +263 -75
  7. package/dist/index.d.ts.map +1 -1
  8. package/dist/index.js +1327 -212
  9. package/dist/index.js.map +1 -1
  10. package/package.json +18 -48
  11. package/dist/act.d.ts +0 -78
  12. package/dist/act.d.ts.map +0 -1
  13. package/dist/act.js +0 -47
  14. package/dist/act.js.map +0 -1
  15. package/dist/action-executors.d.ts +0 -39
  16. package/dist/action-executors.d.ts.map +0 -1
  17. package/dist/action-executors.js +0 -195
  18. package/dist/action-executors.js.map +0 -1
  19. package/dist/assert.d.ts +0 -59
  20. package/dist/assert.d.ts.map +0 -1
  21. package/dist/assert.js +0 -120
  22. package/dist/assert.js.map +0 -1
  23. package/dist/authority.d.ts +0 -34
  24. package/dist/authority.d.ts.map +0 -1
  25. package/dist/authority.js +0 -141
  26. package/dist/authority.js.map +0 -1
  27. package/dist/cli/gateproof.d.ts +0 -3
  28. package/dist/cli/gateproof.d.ts.map +0 -1
  29. package/dist/cli/gateproof.js +0 -548
  30. package/dist/cli/gateproof.js.map +0 -1
  31. package/dist/cloudflare/analytics.d.ts +0 -9
  32. package/dist/cloudflare/analytics.d.ts.map +0 -1
  33. package/dist/cloudflare/analytics.js +0 -98
  34. package/dist/cloudflare/analytics.js.map +0 -1
  35. package/dist/cloudflare/cli-stream.d.ts +0 -7
  36. package/dist/cloudflare/cli-stream.d.ts.map +0 -1
  37. package/dist/cloudflare/cli-stream.js +0 -85
  38. package/dist/cloudflare/cli-stream.js.map +0 -1
  39. package/dist/cloudflare/polling-backend.d.ts +0 -18
  40. package/dist/cloudflare/polling-backend.d.ts.map +0 -1
  41. package/dist/cloudflare/polling-backend.js +0 -53
  42. package/dist/cloudflare/polling-backend.js.map +0 -1
  43. package/dist/cloudflare/workers-logs.d.ts +0 -9
  44. package/dist/cloudflare/workers-logs.d.ts.map +0 -1
  45. package/dist/cloudflare/workers-logs.js +0 -51
  46. package/dist/cloudflare/workers-logs.js.map +0 -1
  47. package/dist/constants.d.ts +0 -11
  48. package/dist/constants.d.ts.map +0 -1
  49. package/dist/constants.js +0 -11
  50. package/dist/constants.js.map +0 -1
  51. package/dist/filepath-backend.d.ts +0 -64
  52. package/dist/filepath-backend.d.ts.map +0 -1
  53. package/dist/filepath-backend.js +0 -126
  54. package/dist/filepath-backend.js.map +0 -1
  55. package/dist/filepath-protocol.d.ts +0 -214
  56. package/dist/filepath-protocol.d.ts.map +0 -1
  57. package/dist/filepath-protocol.js +0 -239
  58. package/dist/filepath-protocol.js.map +0 -1
  59. package/dist/filepath-runtime.d.ts +0 -100
  60. package/dist/filepath-runtime.d.ts.map +0 -1
  61. package/dist/filepath-runtime.js +0 -190
  62. package/dist/filepath-runtime.js.map +0 -1
  63. package/dist/http-backend.d.ts +0 -32
  64. package/dist/http-backend.d.ts.map +0 -1
  65. package/dist/http-backend.js +0 -166
  66. package/dist/http-backend.js.map +0 -1
  67. package/dist/observe.d.ts +0 -26
  68. package/dist/observe.d.ts.map +0 -1
  69. package/dist/observe.js +0 -84
  70. package/dist/observe.js.map +0 -1
  71. package/dist/prd/define-prd.d.ts +0 -7
  72. package/dist/prd/define-prd.d.ts.map +0 -1
  73. package/dist/prd/define-prd.js +0 -8
  74. package/dist/prd/define-prd.js.map +0 -1
  75. package/dist/prd/index.d.ts +0 -7
  76. package/dist/prd/index.d.ts.map +0 -1
  77. package/dist/prd/index.js +0 -8
  78. package/dist/prd/index.js.map +0 -1
  79. package/dist/prd/loop.d.ts +0 -160
  80. package/dist/prd/loop.d.ts.map +0 -1
  81. package/dist/prd/loop.js +0 -462
  82. package/dist/prd/loop.js.map +0 -1
  83. package/dist/prd/runner.d.ts +0 -19
  84. package/dist/prd/runner.d.ts.map +0 -1
  85. package/dist/prd/runner.js +0 -253
  86. package/dist/prd/runner.js.map +0 -1
  87. package/dist/prd/scope-check.d.ts +0 -28
  88. package/dist/prd/scope-check.d.ts.map +0 -1
  89. package/dist/prd/scope-check.js +0 -135
  90. package/dist/prd/scope-check.js.map +0 -1
  91. package/dist/prd/scope-defaults.d.ts +0 -75
  92. package/dist/prd/scope-defaults.d.ts.map +0 -1
  93. package/dist/prd/scope-defaults.js +0 -235
  94. package/dist/prd/scope-defaults.js.map +0 -1
  95. package/dist/prd/types.d.ts +0 -101
  96. package/dist/prd/types.d.ts.map +0 -1
  97. package/dist/prd/types.js +0 -2
  98. package/dist/prd/types.js.map +0 -1
  99. package/dist/provider.d.ts +0 -6
  100. package/dist/provider.d.ts.map +0 -1
  101. package/dist/provider.js +0 -2
  102. package/dist/provider.js.map +0 -1
  103. package/dist/report.d.ts +0 -137
  104. package/dist/report.d.ts.map +0 -1
  105. package/dist/report.js +0 -234
  106. package/dist/report.js.map +0 -1
  107. package/dist/test-helpers.d.ts +0 -12
  108. package/dist/test-helpers.d.ts.map +0 -1
  109. package/dist/test-helpers.js +0 -33
  110. package/dist/test-helpers.js.map +0 -1
  111. package/dist/types.d.ts +0 -41
  112. package/dist/types.d.ts.map +0 -1
  113. package/dist/types.js +0 -2
  114. package/dist/types.js.map +0 -1
  115. package/dist/utils.d.ts +0 -22
  116. package/dist/utils.d.ts.map +0 -1
  117. package/dist/utils.js +0 -49
  118. package/dist/utils.js.map +0 -1
  119. package/dist/validation.d.ts +0 -6
  120. package/dist/validation.d.ts.map +0 -1
  121. package/dist/validation.js +0 -38
  122. package/dist/validation.js.map +0 -1
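Reading the counts in the list above: every per-module file under `package/dist/` drops to zero lines while `package/dist/index.js` grows by 1327, which suggests 0.5.0 bundles the previously separate modules into a single entry point. A minimal sketch that totals a few of the per-file deltas (the entry list here is a hand-transcribed subset of the list above, not parsed from the page):

```typescript
// Sum added/removed line counts for a subset of the file list above.
// Pattern: individual dist modules go to +0 -N while dist/index.js
// absorbs them, i.e. a build consolidation rather than a rewrite.
const entries: Array<[string, number, number]> = [
  ["package/dist/index.js", 1327, 212],
  ["package/dist/act.js", 0, 47],
  ["package/dist/assert.js", 0, 120],
  ["package/dist/authority.js", 0, 141],
];

const added = entries.reduce((sum, [, a]) => sum + a, 0);
const removed = entries.reduce((sum, [, , r]) => sum + r, 0);
console.log(`+${added} -${removed}`); // prints "+1327 -520"
```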
package/README.md CHANGED
@@ -1,222 +1,1516 @@
-# gateproof
+# Gateproof
 
-Software is built in reverse. You know what you want before you know how to get there. TDD proved the idea: write the test first, then make it pass. Gateproof takes the next step.
+Gateproof runs the proof, sends in the worker, and keeps going until the live claim is true.
 
-Write stories. Attach gates. Let agents iterate until reality matches intent.
+## Tutorial
 
-A **gate** observes real evidence (logs, telemetry), acts (browser, shell, deploy), and asserts outcomes. A **story** is a gate with a name and a place in a plan. A **prd.ts** is a list of stories in dependency order. The agent's only job is to make the next failing gate pass.
+Goal: Start with one tiny gate that is small on purpose and complete on purpose.
 
-## The thesis
+### examples/hello-world/plan.ts
 
-Plans are solid. Implementation is liquid.
+```ts
+import { Effect } from "effect";
+import {
+  Act,
+  Assert,
+  Gate,
+  Plan,
+  createHttpObserveResource,
+  type ScopeFile,
+} from "../../src/index";
+import { HELLO_WORLD_PORT } from "./server";
 
-Any codebase can be scoped down to stories and a `prd.ts`. Multiple agents can work the same plan, falling through the same checkpoints. Once a gate passes, previous work can't break -- the gate proves it. The skill shifts from writing code to defining the right guardrails.
+const baseUrl = `http://127.0.0.1:${HELLO_WORLD_PORT}`;
 
-Gates are checkpoints that keep agents safe. They don't decide intent. They verify reality.
+const scope = {
+  spec: {
+    title: "Hello World",
+    tutorial: {
+      goal: "Prove one tiny thing.",
+      outcome: "The run only passes when the live response says hello world.",
+    },
+    howTo: {
+      task: "Run one complete gate from one file.",
+      done: "The endpoint returns 200 and the body contains hello world.",
+    },
+    explanation: {
+      summary: "Even the smallest run is still a real proof loop.",
+    },
+  },
+  plan: Plan.define({
+    goals: [
+      {
+        id: "hello-world",
+        title: "GET / returns hello world",
+        gate: Gate.define({
+          observe: createHttpObserveResource({
+            url: `${baseUrl}/`,
+          }),
+          act: [Act.exec(`curl -sf ${baseUrl}/`)],
+          assert: [
+            Assert.httpResponse({ status: 200 }),
+            Assert.responseBodyIncludes("hello world"),
+            Assert.noErrors(),
+          ],
+        }),
+      },
+    ],
+    loop: {
+      maxIterations: 1,
+      stopOnFailure: true,
+    },
+  }),
+} satisfies ScopeFile;
 
-## Why this works
+export default scope;
 
-Formal verification research established that the relationship between a specification and its implementation — called **refinement** — is itself a testable property. You don't need a theorem prover to get value from this idea. You can test refinement cheaply by running the system and checking that its behavior satisfies the spec.
+if (import.meta.main) {
+  const result = await Effect.runPromise(Plan.runLoop(scope.plan));
+  console.log(JSON.stringify(result, null, 2));
 
-Gateproof distills this into three primitives:
+  if (result.status !== "pass") {
+    process.exitCode = 1;
+  }
+}
+```
 
-1. **Observe** — collect real evidence (logs, telemetry) from a running system
-2. **Act** — trigger real behavior (browser navigation, shell commands, deploys)
-3. **Assert** — check that the evidence satisfies the specification
+Outcome: The loop only passes when the live response says hello world.
 
-Each gate is a refinement check: does the running system's behavior refine what the story claims? The PRD orders these checks by dependency, so failures localize to the first broken obligation.
+## First Case Study: Cinder
 
-This is a deliberate simplification. We trade random input generation and exhaustive coverage for something an engineer can write in minutes and an agent can iterate against in a loop. The gate is the contract. The loop is the proof search.
+### alchemy.run.ts
 
-> Lineage: the *observe → act → assert* pattern draws on property-based testing ideas from [Chen, Rizkallah et al. — "Property-Based Testing: Climbing the Stairway to Verification" (SLE 2022)](https://doi.org/10.1145/3567512.3567520), which demonstrated that refinement properties can serve as a practical, incremental path toward verified systems.
+```ts
+import alchemy from "alchemy";
+import { Buffer } from "node:buffer";
+import { mkdir } from "node:fs/promises";
+import { R2RestStateStore } from "alchemy/state";
+import {
+  CloudflareApiError,
+  DurableObjectNamespace,
+  KVNamespace,
+  R2Bucket,
+  Worker,
+  createBucket,
+  createCloudflareApi,
+  getBucket,
+} from "alchemy/cloudflare";
 
-## Install
+function requireEnv(name: string): string {
+  const value = process.env[name]?.trim();
 
-```bash
-bun add gateproof
-```
+  if (!value) {
+    throw new Error(`${name} is required for cinder provisioning`);
+  }
 
-## Minimal gate
+  return value;
+}
 
-```ts
-import { Gate, Act, Assert, createHttpObserveResource } from "gateproof";
+const githubPat = requireEnv("GITHUB_PAT");
+const webhookSecret = requireEnv("GITHUB_WEBHOOK_SECRET");
+const internalToken = requireEnv("CINDER_INTERNAL_TOKEN");
+const fixtureRepo = process.env.CINDER_FIXTURE_REPO?.trim() || "acoyfellow/cinder-prd-test";
+const fixtureBranch = process.env.CINDER_FIXTURE_BRANCH?.trim() || "main";
+const fixtureWorkflow =
+  process.env.CINDER_FIXTURE_WORKFLOW?.trim() || "cinder-proof.yml";
+const githubApiBase = "https://api.github.com";
+const stateBucketName = process.env.CINDER_STATE_BUCKET?.trim();
+const stateBucketRegion = process.env.CINDER_STATE_REGION?.trim() || "auto";
 
-const result = await Gate.run({
-  name: "post-deploy",
-  observe: createHttpObserveResource({
-    url: "https://api.example.com/health",
-  }),
-  act: [Act.wait(500)],
-  assert: [Assert.noErrors()],
-  stop: { maxMs: 10_000 },
-});
+const fixtureCargoToml = `[package]
+name = "cinder-proof"
+version = "0.1.0"
+edition = "2021"
 
-if (result.status !== "success") process.exit(1);
-```
+[dependencies]
+serde = { version = "1", features = ["derive"] }
+serde_json = "1"
+`;
 
-## Stories + PRD
+const fixtureCargoLock = `# This file is automatically @generated by Cargo.
+# It is not intended for manual editing.
+version = 4
 
-```ts
-import { definePrd, runPrd } from "gateproof/prd";
+[[package]]
+name = "cinder-proof"
+version = "0.1.0"
+dependencies = [
+ "serde",
+ "serde_json",
+]
+
+[[package]]
+name = "itoa"
+version = "1.0.17"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "92ecc6618181def0457392ccd0ee51198e065e016d1d527a7ac1b6dc7c1f09d2"
+
+[[package]]
+name = "memchr"
+version = "2.8.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "f8ca58f447f06ed17d5fc4043ce1b10dd205e060fb3ce5b979b8ed8e59ff3f79"
+
+[[package]]
+name = "proc-macro2"
+version = "1.0.106"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "8fd00f0bb2e90d81d1044c2b32617f68fcb9fa3bb7640c23e9c748e53fb30934"
+dependencies = [
+ "unicode-ident",
+]
+
+[[package]]
+name = "quote"
+version = "1.0.44"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "21b2ebcf727b7760c461f091f9f0f539b77b8e87f2fd88131e7f1b433b3cece4"
+dependencies = [
+ "proc-macro2",
+]
+
+[[package]]
+name = "serde"
+version = "1.0.228"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "9a8e94ea7f378bd32cbbd37198a4a91436180c5bb472411e48b5ec2e2124ae9e"
+dependencies = [
+ "serde_core",
+ "serde_derive",
+]
+
+[[package]]
+name = "serde_core"
+version = "1.0.228"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "41d385c7d4ca58e59fc732af25c3983b67ac852c1a25000afe1175de458b67ad"
+dependencies = [
+ "serde_derive",
+]
+
+[[package]]
+name = "serde_derive"
+version = "1.0.228"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "d540f220d3187173da220f885ab66608367b6574e925011a9353e4badda91d79"
+dependencies = [
+ "proc-macro2",
+ "quote",
+ "syn",
+]
+
+[[package]]
+name = "serde_json"
+version = "1.0.149"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "83fc039473c5595ace860d8c4fafa220ff474b3fc6bfdb4293327f1a37e94d86"
+dependencies = [
+ "itoa",
+ "memchr",
+ "serde",
+ "serde_core",
+ "zmij",
+]
+
+[[package]]
+name = "syn"
+version = "2.0.117"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "e665b8803e7b1d2a727f4023456bbbbe74da67099c585258af0ad9c5013b9b99"
+dependencies = [
+ "proc-macro2",
+ "quote",
+ "unicode-ident",
+]
+
+[[package]]
+name = "unicode-ident"
+version = "1.0.24"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "e6e4313cd5fcd3dad5cafa179702e2b244f760991f45397d14d4ebf38247da75"
+
+[[package]]
+name = "zmij"
+version = "1.0.21"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "b8848ee67ecc8aedbaf3e4122217aff892639231befc6a1b58d29fff4c2cabaa"
+`;
+
+const fixtureMainRs = `use serde::{Deserialize, Serialize};
+
+#[derive(Serialize, Deserialize)]
+struct ProofMessage {
+    ok: bool,
+    source: &'static str,
+}
+
+fn main() {
+    let message = ProofMessage {
+        ok: true,
+        source: "cinder-proof",
+    };
+
+    println!("{}", serde_json::to_string(&message).unwrap());
+}
+`;
+
+const fixtureWorkflowContents = `name: cinder-proof
+on:
+  workflow_dispatch:
+jobs:
+  cargo-build:
+    runs-on: [self-hosted, cinder]
+    timeout-minutes: 20
+    steps:
+      - uses: actions/checkout@v4
+      - name: Build fixture
+        run: cargo build --locked
+      - name: Run fixture
+        run: cargo run --locked
+`;
+
+function parseFixtureRepository(repoRef: string) {
+  const [owner, name, ...extra] = repoRef.split("/");
+
+  if (!owner || !name || extra.length > 0) {
+    throw new Error(
+      `CINDER_FIXTURE_REPO must be "owner/name" but received "${repoRef}"`,
+    );
+  }
+
+  return { owner, name };
+}
+
+async function githubRequest(
+  path: string,
+  init: RequestInit = {},
+  okStatuses: number[] = [200],
+) {
+  const response = await fetch(`${githubApiBase}${path}`, {
+    ...init,
+    headers: {
+      Accept: "application/vnd.github+json",
+      Authorization: `Bearer ${githubPat}`,
+      "User-Agent": "cinder-provisioner",
+      "X-GitHub-Api-Version": "2022-11-28",
+      ...(init.headers ?? {}),
+    },
+  });
 
-const prd = definePrd({
-  stories: [
+  if (okStatuses.includes(response.status)) {
+    return response;
+  }
+
+  const body = await response.text();
+  throw new Error(
+    `GitHub API ${init.method ?? "GET"} ${path} failed with ${response.status}: ${body}`,
+  );
+}
+
+async function ensureFixtureRepository() {
+  const { owner, name } = parseFixtureRepository(fixtureRepo);
+  const existing = await githubRequest(`/repos/${owner}/${name}`, {}, [200, 404]);
+
+  if (existing.status === 200) {
+    return (await existing.json()) as { default_branch: string; full_name: string };
+  }
+
+  const viewer = (await (
+    await githubRequest("/user")
+  ).json()) as { login: string };
+
+  if (viewer.login !== owner) {
+    throw new Error(
+      `Fixture repo ${fixtureRepo} is missing and cannot be auto-created because PAT owner "${viewer.login}" does not match "${owner}"`,
+    );
+  }
+
+  const created = await githubRequest(
+    "/user/repos",
     {
-      id: "user-signup",
-      title: "User can sign up with email",
-      gateFile: "./gates/signup.gate.ts",
+      method: "POST",
+      body: JSON.stringify({
+        name,
+        description: "Canonical GitHub proof fixture for Cinder",
+        auto_init: true,
+        private: false,
+      }),
     },
+    [201],
+  );
+
+  return (await created.json()) as { default_branch: string; full_name: string };
+}
+
+async function ensureFixtureBranch(repo: { default_branch: string }) {
+  if (fixtureBranch === repo.default_branch) {
+    return;
+  }
+
+  const { owner, name } = parseFixtureRepository(fixtureRepo);
+  const branchResponse = await githubRequest(
+    `/repos/${owner}/${name}/git/ref/heads/${encodeURIComponent(fixtureBranch)}`,
+    {},
+    [200, 404],
+  );
+
+  if (branchResponse.status === 200) {
+    return;
+  }
+
+  const defaultBranchRef = (await (
+    await githubRequest(
+      `/repos/${owner}/${name}/git/ref/heads/${encodeURIComponent(repo.default_branch)}`,
+    )
+  ).json()) as { object: { sha: string } };
+
+  await githubRequest(
+    `/repos/${owner}/${name}/git/refs`,
     {
-      id: "email-verification",
-      title: "User receives verification email",
-      gateFile: "./gates/verify.gate.ts",
-      dependsOn: ["user-signup"],
+      method: "POST",
+      body: JSON.stringify({
+        ref: `refs/heads/${fixtureBranch}`,
+        sha: defaultBranchRef.object.sha,
+      }),
     },
-  ] as const,
-});
+    [201],
+  );
+}
 
-const result = await runPrd(prd);
-if (!result.success) process.exit(1);
-```
+async function upsertFixtureFile(path: string, content: string, message: string) {
+  const { owner, name } = parseFixtureRepository(fixtureRepo);
+  const encodedPath = path
+    .split("/")
+    .map((segment) => encodeURIComponent(segment))
+    .join("/");
+  const existing = await githubRequest(
+    `/repos/${owner}/${name}/contents/${encodedPath}?ref=${encodeURIComponent(fixtureBranch)}`,
+    {},
+    [200, 404],
+  );
+  const sha =
+    existing.status === 200
+      ? ((await existing.json()) as { sha: string }).sha
+      : undefined;
 
-## Assertions
+  await githubRequest(
+    `/repos/${owner}/${name}/contents/${encodedPath}`,
+    {
+      method: "PUT",
+      body: JSON.stringify({
+        message,
+        branch: fixtureBranch,
+        content: Buffer.from(content, "utf8").toString("base64"),
+        sha,
+      }),
+    },
+    [200, 201],
+  );
+}
 
-`Assert.noErrors()`, `Assert.hasAction(name)`, `Assert.hasStage(name)`, `Assert.custom(name, fn)`, `Assert.authority(policy)`.
+async function upsertFixtureWebhook(webhookUrl: string) {
+  const { owner, name } = parseFixtureRepository(fixtureRepo);
+  const hooks = (await (
+    await githubRequest(`/repos/${owner}/${name}/hooks`)
+  ).json()) as Array<{ id: number; name: string; config?: { url?: string } }>;
+  const existing = hooks.find(
+    (hook) => hook.name === "web" && hook.config?.url === webhookUrl,
+  );
+  const body = JSON.stringify({
+    active: true,
+    events: ["workflow_job"],
+    config: {
+      url: webhookUrl,
+      content_type: "json",
+      insecure_ssl: "0",
+      secret: webhookSecret,
+    },
+  });
 
-```ts
-import { Gate, Act, Assert } from "gateproof";
-import { CloudflareProvider } from "gateproof/cloudflare";
+  if (existing) {
+    await githubRequest(
+      `/repos/${owner}/${name}/hooks/${existing.id}`,
+      {
+        method: "PATCH",
+        body,
+      },
+      [200],
+    );
+    return;
+  }
 
-const provider = CloudflareProvider({
-  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
-  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
+  await githubRequest(
+    `/repos/${owner}/${name}/hooks`,
+    {
+      method: "POST",
+      body,
+    },
+    [201],
+  );
+}
+
+async function syncFixtureRepository(webhookUrl: string) {
+  const repo = await ensureFixtureRepository();
+  await ensureFixtureBranch(repo);
+
+  await upsertFixtureFile("Cargo.toml", fixtureCargoToml, "chore: sync cinder fixture Cargo.toml");
+  await upsertFixtureFile("Cargo.lock", fixtureCargoLock, "chore: sync cinder fixture Cargo.lock");
+  await upsertFixtureFile("src/main.rs", fixtureMainRs, "chore: sync cinder fixture main.rs");
+  await upsertFixtureFile(
+    `.github/workflows/${fixtureWorkflow}`,
+    fixtureWorkflowContents,
+    "chore: sync cinder fixture workflow",
+  );
+  await upsertFixtureWebhook(webhookUrl);
+}
+
+async function ensureStateBucket(bucketName: string, locationHint: string) {
+  const accountId = requireEnv("CLOUDFLARE_ACCOUNT_ID");
+  const apiToken = requireEnv("CLOUDFLARE_API_TOKEN");
+  const api = await createCloudflareApi({ accountId, apiToken } as any);
+
+  try {
+    await getBucket(api, bucketName);
+    return { accountId, apiToken };
+  } catch (error) {
+    if (!(error instanceof CloudflareApiError) || error.status !== 404) {
+      throw error;
+    }
+  }
+
+  await createBucket(api, bucketName, { locationHint });
+  return { accountId, apiToken };
+}
+
+const stateStoreCredentials = stateBucketName
+  ? await ensureStateBucket(stateBucketName, stateBucketRegion)
+  : null;
+
+export const app = await alchemy("cinder", {
+  stage: process.env.CINDER_STAGE ?? "production",
+  ...(stateBucketName && stateStoreCredentials
+    ? {
+        stateStore: (scope) =>
+          new R2RestStateStore(scope, {
+            accountId: stateStoreCredentials.accountId,
+            apiToken: stateStoreCredentials.apiToken,
+            bucketName: stateBucketName,
+          } as any),
+      }
+    : {}),
 });
 
-const result = await Gate.run({
-  name: "checkout-flow",
-  observe: provider.observe({ backend: "analytics", dataset: "worker_logs" }),
-  act: [Act.browser({ url: "https://app.example.com/checkout" })],
-  assert: [
-    Assert.noErrors(),
-    Assert.hasAction("checkout_started"),
-    Assert.custom("has-total", (logs) => logs.some(l => (l as { data?: { total?: number } }).data?.total > 0)),
-  ],
-  stop: { maxMs: 15_000 },
+export const cacheBucket = await R2Bucket("cinder-cache", {
+  empty: false,
 });
-if (result.status !== "success") process.exit(1);
-```
 
-## Agent gates
+export const runnerState = await KVNamespace("cinder-runner-state");
 
-Spawn an AI agent in an isolated container, observe its NDJSON event stream, and assert what it's allowed to do.
+export const runnerPool = await DurableObjectNamespace("RunnerPool", {
+  className: "RunnerPool",
+  sqlite: true,
+});
 
-```ts
-import { Gate, Act, Assert } from "gateproof";
-import { setFilepathRuntime, CloudflareSandboxRuntime } from "gateproof";
-import { getSandbox } from "@cloudflare/sandbox";
-
-// 1. Wire up your container runtime (once at startup)
-setFilepathRuntime(new CloudflareSandboxRuntime({
-  getSandbox: (config) => getSandbox(env.Sandbox, `agent-${config.name}`),
-}));
-
-// 2. Run the gate
-const container = await runtime.spawn({
-  name: "fix-auth",
-  agent: "claude-code",
-  model: "claude-sonnet-4-20250514",
-  task: "Fix the null pointer in src/auth.ts",
+export const jobQueue = await DurableObjectNamespace("JobQueue", {
+  className: "JobQueue",
+  sqlite: true,
 });
 
-const observe = createFilepathObserveResource(container, "fix-auth");
-
-await Gate.run({
-  name: "fix-auth-bug",
-  observe,
-  act: [Act.wait(300_000)],
-  assert: [
-    Assert.noErrors(),
-    Assert.hasAction("commit"),
-    Assert.hasAction("done"),
-    Assert.authority({
-      canCommit: true,
-      canSpawn: false,
-      forbiddenTools: ["delete_file"],
-    }),
-  ],
-  stop: { idleMs: 5000, maxMs: 300_000 },
+export const cacheWorker = await Worker("cinder-cache-worker", {
+  entrypoint: "./crates/cinder-cache/build/worker/shim.mjs",
+  bindings: {
+    CACHE_BUCKET: cacheBucket,
+    CINDER_INTERNAL_TOKEN: alchemy.secret(internalToken),
+  },
 });
+
+export const orchestrator = await Worker("cinder-orchestrator", {
+  entrypoint: "./crates/cinder-orchestrator/build/worker/shim.mjs",
+  bindings: {
+    CACHE_BUCKET: cacheBucket,
+    RUNNER_STATE: runnerState,
+    RUNNER_POOL: runnerPool,
+    JOB_QUEUE: jobQueue,
+    GITHUB_WEBHOOK_SECRET: alchemy.secret(webhookSecret),
+    CINDER_INTERNAL_TOKEN: alchemy.secret(internalToken),
+    GITHUB_PAT: alchemy.secret(githubPat),
+    CINDER_CACHE_WORKER_URL: cacheWorker.url!,
+    CINDER_FIXTURE_REPO: fixtureRepo,
+    CINDER_FIXTURE_BRANCH: fixtureBranch,
+    CINDER_FIXTURE_WORKFLOW: fixtureWorkflow,
+  },
+});
+
+await app.finalize();
+await syncFixtureRepository(`${orchestrator.url}/webhook/github`);
+
+const runtimeDirectory = new URL("./.gateproof/", import.meta.url);
+const runtimeFile = new URL("./.gateproof/runtime.json", import.meta.url);
+
+await mkdir(runtimeDirectory, { recursive: true });
+
+await Bun.write(
+  runtimeFile,
+  `${JSON.stringify(
+    {
+      generatedAt: new Date().toISOString(),
+      stage: process.env.CINDER_STAGE ?? "production",
+      orchestratorName: orchestrator.name,
+      orchestratorUrl: orchestrator.url,
+      cacheWorkerName: cacheWorker.name,
+      cacheWorkerUrl: cacheWorker.url,
+      fixtureRepo,
+      fixtureBranch,
+      fixtureWorkflow,
+    },
+    null,
+    2,
+  )}\n`,
+);
+
+console.log(`Wrote runtime outputs to ${runtimeFile.pathname}`);
 ```
 
-`Assert.authority()` enforces governance policies against the agent's actual behavior — what it committed, spawned, and which tools it used.
+### plan.ts
 
-## Writing good gates
+ ```ts
577
+ import { Effect } from "effect";
578
+ import crypto from "node:crypto";
579
+ import { existsSync, readFileSync, rmSync } from "node:fs";
580
+ import type { ScopeFile } from "gateproof";
581
+ import {
582
+ Act,
583
+ Assert,
584
+ Gate,
585
+ Plan,
586
+ Require,
587
+ } from "gateproof";
588
+ import { Cloudflare } from "gateproof/cloudflare";
154
589
 
155
- The hardest part of gateproof is not the library — it's writing gates that actually prove what you think they prove.
590
+ type RuntimeState = {
591
+ orchestratorName?: string;
592
+ orchestratorUrl?: string;
593
+ cacheWorkerUrl?: string;
594
+ fixtureRepo?: string;
595
+ fixtureBranch?: string;
596
+ fixtureWorkflow?: string;
597
+ };
156
598
 
157
- **A weak gate passes on silence.** If your system emits no logs and your only assertion is `Assert.noErrors()`, the gate passes vacuously. Nothing was tested. Use `requirePositiveSignal: true` on stories, or assert specific evidence (`Assert.hasAction`, `Assert.hasStage`).
599
+ function isRecord(value: unknown): value is Record<string, unknown> {
600
+ return typeof value === "object" && value !== null;
601
+ }
158
602
 
159
- **A good gate is falsifiable.** Ask: "what broken implementation would still pass this gate?" If the answer is "many," the gate is too weak. Tighten it until a broken system fails.
603
+ function readOptionalEnv(name: string): string | undefined {
604
+ const value = process.env[name];
605
+ if (typeof value !== "string") {
606
+ return undefined;
607
+ }
160
608
 
161
- **Start narrow, then widen.** One specific assertion that catches a real failure is worth more than ten vague ones. You can always add assertions later — you can't take back a false pass.
609
+ const trimmed = value.trim();
610
+ return trimmed.length > 0 ? trimmed : undefined;
611
+ }
162
612
 
163
- ## The loop
613
+ function loadRuntimeState(): RuntimeState | null {
614
+ const runtimeFile = new URL("./.gateproof/runtime.json", import.meta.url);
164
615
 
165
- Gate fails. Agent reads the failure evidence. Agent fixes code. Gate re-runs. Loop until pass.
616
+ if (!existsSync(runtimeFile)) {
617
+ return null;
618
+ }
166
619
 
167
- **Bring your own agent** — the loop takes any async function:
620
+ try {
621
+ const parsed: unknown = JSON.parse(readFileSync(runtimeFile, "utf8"));
622
+ if (!isRecord(parsed)) {
623
+ return null;
624
+ }
168
625
 
169
- ```ts
170
- import { runPrdLoop } from "gateproof/prd";
171
-
172
- await runPrdLoop("./prd.ts", {
173
- agent: async (ctx) => {
174
- // ctx.failureSummary — what failed and why
175
- // ctx.recentDiff — recent git changes
176
- // ctx.prdContent — full PRD for context
177
- // ctx.failedStory — the Story object that failed
178
- // ctx.signal — AbortSignal for cancellation
179
-
180
- // Use any agent: Claude Code, Cursor, Codex, custom LLM wrapper
181
- const result = await yourAgent.fix(ctx.failureSummary);
182
- return { changes: result.files, commitMsg: "fix: resolve failing gate" };
183
- },
184
- maxIterations: 5,
626
+     return {
+       orchestratorName:
+         typeof parsed.orchestratorName === "string" ? parsed.orchestratorName : undefined,
+       orchestratorUrl:
+         typeof parsed.orchestratorUrl === "string" ? parsed.orchestratorUrl : undefined,
+       cacheWorkerUrl:
+         typeof parsed.cacheWorkerUrl === "string" ? parsed.cacheWorkerUrl : undefined,
+       fixtureRepo: typeof parsed.fixtureRepo === "string" ? parsed.fixtureRepo : undefined,
+       fixtureBranch:
+         typeof parsed.fixtureBranch === "string" ? parsed.fixtureBranch : undefined,
+       fixtureWorkflow:
+         typeof parsed.fixtureWorkflow === "string" ? parsed.fixtureWorkflow : undefined,
+     };
+   } catch {
+     return null;
+   }
+ }
+
+ function resolveLocalRunnerId(): string {
+   try {
+     const hostname = readFileSync("/etc/hostname", "utf8").trim();
+     return `cinder-${hostname || "unknown"}`;
+   } catch {
+     return "cinder-unknown";
+   }
+ }
+
+ const runtimeState = loadRuntimeState();
+ const baseUrl = readOptionalEnv("CINDER_BASE_URL") ?? runtimeState?.orchestratorUrl ?? "";
+ const cacheWorkerUrl =
+   readOptionalEnv("CINDER_CACHE_WORKER_URL") ?? runtimeState?.cacheWorkerUrl ?? "";
+ const workerName =
+   readOptionalEnv("CINDER_WORKER_NAME") ?? runtimeState?.orchestratorName ?? "cinder-orchestrator";
+ const fixtureRepo =
+   readOptionalEnv("CINDER_FIXTURE_REPO") ?? runtimeState?.fixtureRepo ?? "acoyfellow/cinder-prd-test";
+ const fixtureBranch = readOptionalEnv("CINDER_FIXTURE_BRANCH") ?? runtimeState?.fixtureBranch ?? "";
+ const fixtureWorkflow =
+   readOptionalEnv("CINDER_FIXTURE_WORKFLOW") ?? runtimeState?.fixtureWorkflow ?? "";
+ const internalToken = readOptionalEnv("CINDER_INTERNAL_TOKEN") ?? "";
+
+ const missKey = crypto.randomBytes(32).toString("hex");
+ const newKey = crypto.randomBytes(32).toString("hex");
+ const speedThresholdMs = Number(process.env.SPEED_THRESHOLD_MS ?? "60000");
+ const testRepo = process.env.TEST_REPO ?? "";
+ const harnessBaseUrl = "http://127.0.0.1:9000";
+ const harnessRunUrl = `${harnessBaseUrl}/test/run`;
+ const localRunnerId = resolveLocalRunnerId();
+ const agentLogPath = "/tmp/cinder-agent-proof.log";
+ const agentPidPath = "/tmp/cinder-agent-proof.pid";
+ const runnerJobPath = "/tmp/cinder-proof-runner-job.json";
+ const queuePayloadPath = "/tmp/cinder-proof-queue-payload.json";
+
+ let managedHarness: ReturnType<typeof Bun.spawn> | null = null;
+
+ async function canReachLocalHarness(): Promise<boolean> {
+   try {
+     const response = await fetch(harnessBaseUrl);
+     return response.ok || response.status === 404;
+   } catch {
+     return false;
+   }
+ }
+
+ async function ensureLocalHarness(): Promise<void> {
+   if (await canReachLocalHarness()) {
+     return;
+   }
+
+   managedHarness = Bun.spawn({
+     cmd: ["bun", "harness.ts"],
+     cwd: process.cwd(),
+     stdout: "inherit",
+     stderr: "inherit",
+   });
+
+   const deadline = Date.now() + 5_000;
+   while (Date.now() < deadline) {
+     if (await canReachLocalHarness()) {
+       return;
+     }
+
+     await Bun.sleep(100);
+   }
+
+   throw new Error("cinder proof harness did not start on 127.0.0.1:9000");
+ }
+
+ function stopManagedHarness(): void {
+   if (!managedHarness) {
+     return;
+   }
+
+   managedHarness.kill();
+   managedHarness = null;
+ }
+
+ function stopManagedAgent(): void {
+   if (!existsSync(agentPidPath)) {
+     return;
+   }
+
+   try {
+     const pid = Number.parseInt(readFileSync(agentPidPath, "utf8").trim(), 10);
+     if (Number.isFinite(pid)) {
+       process.kill(pid);
+     }
+   } catch {
+     // Ignore stale proof agent state.
+   }
+
+   try {
+     rmSync(agentPidPath, { force: true });
+   } catch {
+     // Ignore pidfile cleanup failures during shutdown.
+   }
+ }
+
+ async function ensureColdBuildBaseline(): Promise<void> {
+   if (readOptionalEnv("COLD_BUILD_MS")) {
+     return;
+   }
+
+   if (!testRepo) {
+     return;
+   }
+
+   try {
+     const response = await fetch(harnessRunUrl, {
+       method: "POST",
+       headers: {
+         "Content-Type": "application/json",
+       },
+       body: JSON.stringify({
+         repo: testRepo,
+         with_cache: false,
+       }),
+     });
+
+     if (!response.ok) {
+       return;
+     }
+
+     const parsed: unknown = await response.json();
+     if (!isRecord(parsed)) {
+       return;
+     }
+
+     const buildDurationMs = parsed.build_duration_ms;
+     if (typeof buildDurationMs !== "number" || !Number.isFinite(buildDurationMs)) {
+       return;
+     }
+
+     process.env.COLD_BUILD_MS = String(buildDurationMs);
+   } catch {
+     // Let the existing prerequisite fail clearly if the harness is unavailable.
+   }
+ }
+
+ await ensureColdBuildBaseline();
+
+ const workerLogs = Cloudflare.observe({
+   accountId: readOptionalEnv("CLOUDFLARE_ACCOUNT_ID") ?? "",
+   apiToken: readOptionalEnv("CLOUDFLARE_API_TOKEN") ?? "",
+   workerName,
+   sinceMs: 120_000,
+   pollInterval: 1_000,
  });
- ```
 
- Or use a pre-built agent:
+ if (!process.env.CINDER_BASE_URL && baseUrl) {
+   process.env.CINDER_BASE_URL = baseUrl;
+ }
 
- ```ts
- import { runPrdLoop, createOpenCodeAgent } from "gateproof/prd";
+ if (!process.env.CINDER_WORKER_NAME && workerName) {
+   process.env.CINDER_WORKER_NAME = workerName;
+ }
+
+ if (!process.env.CINDER_FIXTURE_REPO && fixtureRepo) {
+   process.env.CINDER_FIXTURE_REPO = fixtureRepo;
+ }
+
+ if (!process.env.CINDER_FIXTURE_BRANCH && fixtureBranch) {
+   process.env.CINDER_FIXTURE_BRANCH = fixtureBranch;
+ }
+
+ if (!process.env.CINDER_FIXTURE_WORKFLOW && fixtureWorkflow) {
+   process.env.CINDER_FIXTURE_WORKFLOW = fixtureWorkflow;
+ }
 
- await runPrdLoop("./prd.ts", {
-   agent: createOpenCodeAgent({ apiKey: process.env.OPENCODE_ZEN_API_KEY }),
-   maxIterations: 7,
+ const scope = {
+   spec: {
+     title: "Cinder",
+     tutorial: {
+       goal: "Prove cinder on a live deployment, not just deploy it.",
+       outcome:
+         "Webhook intake, queueing, runner registration, cache paths, and the speed claim all go green.",
+     },
+     howTo: {
+       task: "Run the cinder proof loop against already-provisioned infrastructure.",
+       done:
+         "Cinder only exits green when the live system can do the work and the speed claim holds.",
+     },
+     explanation: {
+       summary:
+         "alchemy.run.ts creates the infrastructure once and writes .gateproof/runtime.json. This file is only the acceptance loop for the live product.",
+     },
+   },
+   plan: Plan.define({
+     goals: [
+       {
+         id: "webhook",
+         title: "A GitHub webhook queues a runnable job",
+         gate: Gate.define({
+           observe: workerLogs,
+           prerequisites: [
+             Require.env(
+               "CLOUDFLARE_ACCOUNT_ID",
+               "CLOUDFLARE_ACCOUNT_ID is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CLOUDFLARE_API_TOKEN",
+               "CLOUDFLARE_API_TOKEN is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "GITHUB_PAT",
+               "GITHUB_PAT is required to dispatch the GitHub proof workflow.",
+             ),
+             Require.env(
+               "CINDER_FIXTURE_BRANCH",
+               "Run bun run provision first or set CINDER_FIXTURE_BRANCH for the GitHub proof fixture.",
+             ),
+             Require.env(
+               "CINDER_FIXTURE_WORKFLOW",
+               "Run bun run provision first or set CINDER_FIXTURE_WORKFLOW for the GitHub proof fixture.",
+             ),
+             Require.env(
+               "CINDER_INTERNAL_TOKEN",
+               "CINDER_INTERNAL_TOKEN is required for internal API access.",
+             ),
+             Require.env(
+               "CINDER_BASE_URL",
+               "Run bun run provision first or set CINDER_BASE_URL to the live orchestrator URL.",
+             ),
+           ],
+           act: [
+             Act.exec(
+               `bun -e 'const repo = ${JSON.stringify(fixtureRepo)};
+               const workflow = ${JSON.stringify(fixtureWorkflow)};
+               const branch = ${JSON.stringify(fixtureBranch)};
+               const token = process.env.GITHUB_PAT;
+               if (!token) {
+                 throw new Error("GITHUB_PAT is required");
+               }
+               const headers = {
+                 Accept: "application/vnd.github+json",
+                 Authorization: "Bearer " + token,
+                 "X-GitHub-Api-Version": "2022-11-28",
+               };
+               const listUrl =
+                 "https://api.github.com/repos/" +
+                 repo +
+                 "/actions/workflows/" +
+                 workflow +
+                 "/runs?event=workflow_dispatch&branch=" +
+                 encodeURIComponent(branch) +
+                 "&per_page=20";
+               const response = await fetch(listUrl, { headers });
+               if (!response.ok) {
+                 throw new Error("GitHub workflow run listing failed: " + response.status);
+               }
+               const payload = await response.json();
+               const runs = Array.isArray(payload.workflow_runs) ? payload.workflow_runs : [];
+               for (const run of runs) {
+                 if (typeof run?.id !== "number" || run.status === "completed") {
+                   continue;
+                 }
+                 const cancelResponse = await fetch(
+                   "https://api.github.com/repos/" + repo + "/actions/runs/" + run.id + "/cancel",
+                   {
+                     method: "POST",
+                     headers,
+                   },
+                 );
+                 if (!cancelResponse.ok && cancelResponse.status !== 409) {
+                   throw new Error("GitHub workflow cancel failed: " + cancelResponse.status);
+                 }
+               }'`,
+               {
+                 timeoutMs: 60_000,
+               },
+             ),
+             Act.exec(
+               `curl -sf -X POST https://api.github.com/repos/${fixtureRepo}/actions/workflows/${fixtureWorkflow}/dispatches \
+                 -H "Accept: application/vnd.github+json" \
+                 -H "Authorization: Bearer $GITHUB_PAT" \
+                 -H "X-GitHub-Api-Version: 2022-11-28" \
+                 -d '${JSON.stringify({ ref: fixtureBranch })}'`,
+             ),
+             Act.exec("sleep 25"),
+           ],
+           assert: [
+             Assert.noErrors(),
+             Assert.hasAction("webhook_received"),
+             Assert.hasAction("signature_verified"),
+             Assert.hasAction("job_queued"),
+           ],
+           timeoutMs: 40_000,
+         }),
+       },
+       {
+         id: "queue",
+         title: "A queued job can be inspected without dequeueing",
+         gate: Gate.define({
+           observe: workerLogs,
+           prerequisites: [
+             Require.env(
+               "CLOUDFLARE_ACCOUNT_ID",
+               "CLOUDFLARE_ACCOUNT_ID is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CLOUDFLARE_API_TOKEN",
+               "CLOUDFLARE_API_TOKEN is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CINDER_INTERNAL_TOKEN",
+               "CINDER_INTERNAL_TOKEN is required for queue inspection.",
+             ),
+             Require.env(
+               "CINDER_BASE_URL",
+               "Run bun run provision first or set CINDER_BASE_URL to the live orchestrator URL.",
+             ),
+           ],
+           act: [
+             Act.exec(
+               `sh -c 'curl -sf ${baseUrl}/jobs/peek \
+                 -H "Authorization: Bearer ${internalToken}" \
+                 | tee "${queuePayloadPath}"'`,
+             ),
+           ],
+           assert: [
+             Assert.noErrors(),
+             Assert.responseBodyIncludes("repo_full_name"),
+             Assert.responseBodyIncludes("repo_clone_url"),
+             Assert.responseBodyIncludes("runner_registration_url"),
+             Assert.responseBodyIncludes("runner_registration_token"),
+             Assert.responseBodyIncludes("cache_key"),
+           ],
+           timeoutMs: 8_000,
+         }),
+       },
+       {
+         id: "runner",
+         title: "A runner can execute a queued GitHub job",
+         gate: Gate.define({
+           observe: workerLogs,
+           prerequisites: [
+             Require.env(
+               "CLOUDFLARE_ACCOUNT_ID",
+               "CLOUDFLARE_ACCOUNT_ID is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CLOUDFLARE_API_TOKEN",
+               "CLOUDFLARE_API_TOKEN is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CINDER_INTERNAL_TOKEN",
+               "CINDER_INTERNAL_TOKEN is required for runner registration.",
+             ),
+             Require.env(
+               "CINDER_BASE_URL",
+               "Run bun run provision first or set CINDER_BASE_URL to the live orchestrator URL.",
+             ),
+             Require.env(
+               "GITHUB_PAT",
+               "GITHUB_PAT is required to confirm the queued GitHub run completed.",
+             ),
+           ],
+           act: [
+             Act.exec(
+               `curl -sf ${baseUrl}/jobs/peek \
+                 -H "Authorization: Bearer ${internalToken}" \
+                 > "${runnerJobPath}"`,
+             ),
+             Act.exec(
+               `bun -e 'import crypto from "node:crypto";
+               import { readFileSync } from "node:fs";
+               const payload = JSON.parse(readFileSync(${JSON.stringify(runnerJobPath)}, "utf8"));
+               if (typeof payload.cache_key !== "string" || payload.cache_key.length === 0) {
+                 throw new Error("runner job payload missing cache_key");
+               }
+               const key = payload.cache_key;
+               const token = ${JSON.stringify(internalToken)};
+               if (!token) {
+                 throw new Error("CINDER_INTERNAL_TOKEN is required for fixture cache reset");
+               }
+               let base = ${JSON.stringify(cacheWorkerUrl)};
+               const restoreProbe = await fetch(
+                 ${JSON.stringify(baseUrl)} + "/cache/restore/" + key,
+                 {
+                   method: "POST",
+                   headers: {
+                     Authorization: "Bearer " + token,
+                   },
+                 },
+               );
+               if (!restoreProbe.ok) {
+                 throw new Error("fixture cache reset probe failed: " + restoreProbe.status);
+               }
+               const restorePayload = await restoreProbe.json();
+               if (typeof restorePayload.url === "string" && restorePayload.url.length > 0) {
+                 base = new URL(restorePayload.url).origin;
+               }
+               if (!base) {
+                 throw new Error("cache worker base URL is required for fixture cache reset");
+               }
+               const exp = Math.floor(Date.now() / 1000) + 3600;
+               const sig = crypto
+                 .createHmac("sha256", token)
+                 .update("delete:" + key + ":" + exp)
+                 .digest("hex");
+               const response = await fetch(
+                 base.replace(/\\/$/, "") +
+                   "/objects/" +
+                   key +
+                   "?op=delete&exp=" +
+                   exp +
+                   "&sig=" +
+                   sig,
+                 {
+                   method: "DELETE",
+                 },
+               );
+               if (!response.ok && response.status !== 404) {
+                 throw new Error("fixture cache reset failed: " + response.status);
+               }
+               console.log("fixture cache reset");'`,
+             ),
+             Act.exec(
+               `sh -c 'if [ -f "${agentPidPath}" ] && kill -0 "$(cat "${agentPidPath}")" 2>/dev/null; then exit 0; fi; : >"${agentLogPath}"; cargo run --quiet -p cinder-agent -- --url "${baseUrl}" --token "${internalToken}" --poll-ms 250 >"${agentLogPath}" 2>&1 & echo $! >"${agentPidPath}"; sleep 5'`,
+             ),
+             Act.exec(
+               `bun -e 'import { existsSync, readFileSync } from "node:fs";
+               const payload = JSON.parse(readFileSync(${JSON.stringify(runnerJobPath)}, "utf8"));
+               if (typeof payload.run_id !== "number") {
+                 throw new Error("queue payload missing run_id");
+               }
+               if (typeof payload.repo_full_name !== "string" || payload.repo_full_name.length === 0) {
+                 throw new Error("queue payload missing repo_full_name");
+               }
+               const token = process.env.GITHUB_PAT;
+               if (!token) {
+                 throw new Error("GITHUB_PAT is required");
+               }
+               const headers = {
+                 Accept: "application/vnd.github+json",
+                 Authorization: "Bearer " + token,
+                 "X-GitHub-Api-Version": "2022-11-28",
+               };
+               const deadline = Date.now() + 600000;
+               let run = null;
+               while (Date.now() < deadline) {
+                 const response = await fetch(
+                   "https://api.github.com/repos/" + payload.repo_full_name + "/actions/runs/" + payload.run_id,
+                   { headers },
+                 );
+                 if (!response.ok) {
+                   if (response.status >= 500) {
+                     await Bun.sleep(2000);
+                     continue;
+                   }
+                   throw new Error("GitHub workflow run fetch failed: " + response.status);
+                 }
+                 run = await response.json();
+                 if (run.status === "completed") {
+                   break;
+                 }
+                 await Bun.sleep(2000);
+               }
+               if (!run || run.status !== "completed") {
+                 throw new Error("GitHub workflow run did not complete");
+               }
+               const logNeedle = "completed with exit code 0";
+               const logDeadline = Date.now() + 30000;
+               while (Date.now() < logDeadline) {
+                 if (existsSync(${JSON.stringify(agentLogPath)})) {
+                   const logContents = readFileSync(${JSON.stringify(agentLogPath)}, "utf8");
+                   if (logContents.includes(logNeedle)) {
+                     break;
+                   }
+                 }
+                 await Bun.sleep(500);
+               }
+               console.log(JSON.stringify(run));
+               if (existsSync(${JSON.stringify(agentLogPath)})) {
+                 console.log(readFileSync(${JSON.stringify(agentLogPath)}, "utf8"));
+               }
+               if (run.conclusion !== "success") {
+                 process.exit(1);
+               }'`,
+               {
+                 timeoutMs: 600_000,
+               },
+             ),
+           ],
+           assert: [
+             Assert.noErrors(),
+             Assert.hasAction("runner_registered"),
+             Assert.hasAction("runner_pool_updated"),
+             Assert.hasAction("job_dequeued"),
+             Assert.responseBodyIncludes(`"conclusion":"success"`),
+             Assert.responseBodyIncludes("starting github runner for job"),
+             Assert.responseBodyIncludes("completed with exit code 0"),
+           ],
+           timeoutMs: 600_000,
+         }),
+       },
+       {
+         id: "cache-restore",
+         title: "The fixture cache key currently restores as a cold miss",
+         gate: Gate.define({
+           prerequisites: [
+             Require.env(
+               "CLOUDFLARE_ACCOUNT_ID",
+               "CLOUDFLARE_ACCOUNT_ID is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CLOUDFLARE_API_TOKEN",
+               "CLOUDFLARE_API_TOKEN is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CINDER_INTERNAL_TOKEN",
+               "CINDER_INTERNAL_TOKEN is required for cache restore.",
+             ),
+             Require.env(
+               "CINDER_BASE_URL",
+               "Run bun run provision first or set CINDER_BASE_URL to the live orchestrator URL.",
+             ),
+           ],
+           act: [
+             Act.exec(
+               `bun -e 'import { readFileSync } from "node:fs";
+               const payload = JSON.parse(readFileSync(${JSON.stringify(runnerJobPath)}, "utf8"));
+               if (typeof payload.job_id !== "number") {
+                 throw new Error("runner job payload missing job_id");
+               }
+               const needle = "cache miss for job " + payload.job_id;
+               const deadline = Date.now() + 5000;
+               while (Date.now() < deadline) {
+                 const log = readFileSync(${JSON.stringify(agentLogPath)}, "utf8");
+                 if (log.includes(needle)) {
+                   console.log(needle);
+                   process.exit(0);
+                 }
+                 await Bun.sleep(250);
+               }
+               throw new Error("agent log missing cache miss marker for job " + payload.job_id);'`,
+             ),
+           ],
+           assert: [
+             Assert.noErrors(),
+             Assert.responseBodyIncludes("cache miss for job"),
+           ],
+           timeoutMs: 5_000,
+         }),
+       },
+       {
+         id: "cache-push",
+         title: "The cache upload path returns a real cache-worker upload URL",
+         gate: Gate.define({
+           observe: workerLogs,
+           prerequisites: [
+             Require.env(
+               "CLOUDFLARE_ACCOUNT_ID",
+               "CLOUDFLARE_ACCOUNT_ID is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CLOUDFLARE_API_TOKEN",
+               "CLOUDFLARE_API_TOKEN is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CINDER_INTERNAL_TOKEN",
+               "CINDER_INTERNAL_TOKEN is required for cache upload.",
+             ),
+             Require.env(
+               "CINDER_BASE_URL",
+               "Run bun run provision first or set CINDER_BASE_URL to the live orchestrator URL.",
+             ),
+           ],
+           act: [
+             Act.exec(
+               `sh -c 'rm -f /tmp/cinder-proof-cache-push.tar.xz /tmp/cinder-proof-cache-push-download.tar.xz /tmp/cinder-proof-cache-push-upload.json /tmp/cinder-proof-cache-push-restore.json /tmp/cinder-proof-cache-push-list.txt; tmpdir="$(mktemp -d)"; printf "proof\\n" > "$tmpdir/proof.txt"; tar -cJf /tmp/cinder-proof-cache-push.tar.xz -C "$tmpdir" proof.txt; rm -rf "$tmpdir"'`,
+             ),
+             Act.exec(
+               `bun -e 'import { writeFileSync } from "node:fs";
+               const response = await fetch(
+                 ${JSON.stringify(baseUrl)} + "/cache/upload",
+                 {
+                   method: "POST",
+                   headers: {
+                     "Content-Type": "application/json",
+                     Authorization: "Bearer " + ${JSON.stringify(internalToken)},
+                   },
+                   body: JSON.stringify({
+                     key: ${JSON.stringify(newKey)},
+                     content_type: "application/x-xz",
+                     size_bytes: 1024,
+                   }),
+                 },
+               );
+               if (!response.ok) {
+                 throw new Error("cache upload failed: " + response.status);
+               }
+               const upload = await response.json();
+               if (typeof upload.url !== "string" || upload.url.length === 0) {
+                 throw new Error("cache upload response missing url");
+               }
+               if (!upload.url.includes("/objects/")) {
+                 throw new Error("cache upload returned non-worker url");
+               }
+               writeFileSync("/tmp/cinder-proof-cache-push-upload.json", JSON.stringify(upload));
+               console.log(JSON.stringify(upload));'`,
+             ),
+             Act.exec(
+               `bun -e 'import { readFileSync } from "node:fs";
+               const upload = JSON.parse(readFileSync("/tmp/cinder-proof-cache-push-upload.json", "utf8"));
+               const archive = readFileSync("/tmp/cinder-proof-cache-push.tar.xz");
+               const response = await fetch(upload.url, {
+                 method: "PUT",
+                 body: archive,
  });
- ```
+               if (!response.ok) {
+                 throw new Error("cache object upload failed: " + response.status);
+               }
+               console.log("cache object uploaded");'`,
+             ),
+             Act.exec(
+               `bun -e 'import { writeFileSync } from "node:fs";
+               const response = await fetch(
+                 ${JSON.stringify(baseUrl)} + "/cache/restore/" + ${JSON.stringify(newKey)},
+                 {
+                   method: "POST",
+                   headers: {
+                     Authorization: "Bearer " + ${JSON.stringify(internalToken)},
+                   },
+                 },
+               );
+               if (!response.ok) {
+                 throw new Error("cache restore failed: " + response.status);
+               }
+               const restore = await response.json();
+               if (restore.miss === true) {
+                 throw new Error("cache restore returned miss after upload");
+               }
+               if (typeof restore.url !== "string" || restore.url.length === 0) {
+                 throw new Error("cache restore response missing url");
+               }
+               writeFileSync("/tmp/cinder-proof-cache-push-restore.json", JSON.stringify(restore));
+               console.log(JSON.stringify(restore));'`,
+             ),
+             Act.exec(
+               `bun -e 'import { readFileSync, writeFileSync } from "node:fs";
+               const restore = JSON.parse(readFileSync("/tmp/cinder-proof-cache-push-restore.json", "utf8"));
+               const response = await fetch(restore.url);
+               if (!response.ok) {
+                 throw new Error("cache object download failed: " + response.status);
+               }
+               const bytes = new Uint8Array(await response.arrayBuffer());
+               writeFileSync("/tmp/cinder-proof-cache-push-download.tar.xz", bytes);
+               console.log("cache object downloaded");'`,
+             ),
+             Act.exec(
+               `sh -c 'test -s /tmp/cinder-proof-cache-push-download.tar.xz && tar -tJf /tmp/cinder-proof-cache-push-download.tar.xz > /tmp/cinder-proof-cache-push-list.txt && cat /tmp/cinder-proof-cache-push-list.txt'`,
+             ),
+           ],
+           assert: [
+             Assert.noErrors(),
+             Assert.responseBodyIncludes("proof.txt"),
+           ],
+           timeoutMs: 8_000,
+         }),
+       },
+       {
+         id: "speed-claim",
+         title: "A warm workflow run can complete with a real cache hit",
+         gate: Gate.define({
+           observe: workerLogs,
+           prerequisites: [
+             Require.env(
+               "CLOUDFLARE_ACCOUNT_ID",
+               "CLOUDFLARE_ACCOUNT_ID is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "CLOUDFLARE_API_TOKEN",
+               "CLOUDFLARE_API_TOKEN is required for Cloudflare worker log observation.",
+             ),
+             Require.env(
+               "GITHUB_PAT",
+               "GITHUB_PAT is required to dispatch and confirm the warm workflow run.",
+             ),
+           ],
+           act: [
+             Act.exec(
+               `bun -e 'import { readFileSync } from "node:fs";
+               const payload = JSON.parse(readFileSync(${JSON.stringify(runnerJobPath)}, "utf8"));
+               if (typeof payload.run_id !== "number") {
+                 throw new Error("runner job payload missing run_id");
+               }
+               console.log(String(payload.run_id));' > /tmp/cinder-proof-speed-before.txt`,
+             ),
+             Act.exec(
+               `curl -sf -X POST https://api.github.com/repos/${fixtureRepo}/actions/workflows/${fixtureWorkflow}/dispatches \
+                 -H "Accept: application/vnd.github+json" \
+                 -H "Authorization: Bearer $GITHUB_PAT" \
+                 -H "X-GitHub-Api-Version: 2022-11-28" \
+                 -d '${JSON.stringify({ ref: fixtureBranch })}'`,
+             ),
+             Act.exec("sleep 5"),
+             Act.exec(
+               `bun -e 'import { readFileSync } from "node:fs";
+               const repo = ${JSON.stringify(fixtureRepo)};
+               const workflow = ${JSON.stringify(fixtureWorkflow)};
+               const branch = ${JSON.stringify(fixtureBranch)};
+               const token = process.env.GITHUB_PAT;
+               if (!token) {
+                 throw new Error("GITHUB_PAT is required");
+               }
+               const previousId = readFileSync("/tmp/cinder-proof-speed-before.txt", "utf8").trim();
+               const headers = {
+                 Accept: "application/vnd.github+json",
+                 Authorization: "Bearer " + token,
+                 "X-GitHub-Api-Version": "2022-11-28",
+               };
+               const listUrl =
+                 "https://api.github.com/repos/" +
+                 repo +
+                 "/actions/workflows/" +
+                 workflow +
+                 "/runs?event=workflow_dispatch&branch=" +
+                 encodeURIComponent(branch) +
+                 "&per_page=5";
+               const deadline = Date.now() + 600000;
+               let run = null;
+               while (Date.now() < deadline) {
+                 const listResponse = await fetch(listUrl, { headers });
+                 if (!listResponse.ok) {
+                   throw new Error("GitHub workflow run listing failed: " + listResponse.status);
+                 }
+                 const listPayload = await listResponse.json();
+                 const runs = Array.isArray(listPayload.workflow_runs) ? listPayload.workflow_runs : [];
+                 const candidate = runs.find((entry) => typeof entry?.id === "number" && String(entry.id) !== previousId);
+                 if (candidate && typeof candidate.id === "number") {
+                   const runResponse = await fetch(
+                     "https://api.github.com/repos/" + repo + "/actions/runs/" + candidate.id,
+                     { headers },
+                   );
+                   if (!runResponse.ok) {
+                     throw new Error("GitHub workflow run fetch failed: " + runResponse.status);
+                   }
+                   run = await runResponse.json();
+                   if (run.status === "completed") {
+                     break;
+                   }
+                 }
+                 await Bun.sleep(2000);
+               }
+               if (!run || run.status !== "completed") {
+                 throw new Error("warm GitHub workflow run did not complete");
+               }
+               console.log(JSON.stringify(run));
+               console.log(readFileSync(${JSON.stringify(agentLogPath)}, "utf8"));'`,
+               {
+                 timeoutMs: 600_000,
+               },
+             ),
+           ],
+           assert: [
+             Assert.noErrors(),
+             Assert.responseBodyIncludes(`"conclusion":"success"`),
+             Assert.responseBodyIncludes("cache restored for job"),
+           ],
+           timeoutMs: 600_000,
+         }),
+       },
+     ],
+     loop: {
+       maxIterations: 1,
+       stopOnFailure: true,
+     },
+     cleanup: {
+       actions: [
+         Act.exec(
+           `if [ -n "${internalToken}" ] && [ -n "${baseUrl}" ]; then curl -sf -X DELETE ${baseUrl}/runners/${localRunnerId} -H "Authorization: Bearer ${internalToken}" >/dev/null; else exit 0; fi`,
+         ),
+       ],
+     },
+   }),
+ } satisfies ScopeFile;
 
- ## Generate a PRD from plain language
+ export default scope;
 
- ```bash
- echo "Build a signup flow with email verification" | npx gateproof prdts --stdout
+ if (import.meta.main) {
+   stopManagedAgent();
+
+   if (testRepo) {
+     await ensureLocalHarness();
+   }
+
+   await ensureColdBuildBaseline();
+
+   try {
+     const result = await Effect.runPromise(
+       Plan.runLoop(scope.plan, {
+         maxIterations: scope.plan.loop?.maxIterations,
+       }),
+     );
+
+     console.log(JSON.stringify(result, null, 2));
+
+     if (result.status !== "pass") {
+       process.exitCode = 1;
+     }
+   } finally {
+     stopManagedAgent();
+     stopManagedHarness();
+   }
+
+   process.exit(process.exitCode ?? 0);
+ }
  ```
 
- ## End-to-end CLI pipeline
+ Status: Historical artifacts are available locally
+
+ The preserved Cinder files are present and typechecked against the local Gateproof package. Reproducing the live result still requires Cloudflare infrastructure and Cinder environment variables.
+
+ ## Roadmap
+
+ Gateproof is not ready to fully dogfood itself on a case study like Cinder yet. The next phase is about tightening the guardrails, not adding another rewrite.
 
- > Contributed by @grok
+ - Save the latest real proof result to disk so the loop always has a concrete last-known truth.
+ - Make finalize refuse to ship unless the saved real proof result is fully green.
+ - Separate the real proof path from side experiments so exploration can happen without polluting the proof story.
+ - Let plans choose direct evidence when log tailing is flaky, so a valid live pass does not fail on observation noise alone.
+ - Dogfood Gateproof on Cinder again only after those guardrails are in place.
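The first two roadmap items can be read as a persistence rule plus a gatekeeping rule. A minimal sketch of that pairing follows; `saveProofResult`, `canFinalize`, and the JSON shape are hypothetical illustrations, not gateproof's actual API or on-disk format.

```typescript
import { mkdtempSync, readFileSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

type ProofResult = { status: "pass" | "fail"; finishedAt: string };

// Persist the latest real proof result so the loop has a last-known truth.
function saveProofResult(path: string, result: ProofResult): void {
  writeFileSync(path, JSON.stringify(result, null, 2));
}

// Finalize refuses to ship unless the saved result is fully green.
function canFinalize(path: string): boolean {
  try {
    const saved = JSON.parse(readFileSync(path, "utf8")) as ProofResult;
    return saved.status === "pass";
  } catch {
    return false; // no saved proof result means no ship
  }
}

const resultPath = join(mkdtempSync(join(tmpdir(), "gateproof-")), "last-proof.json");
saveProofResult(resultPath, { status: "fail", finishedAt: new Date().toISOString() });
console.log(canFinalize(resultPath)); // false
saveProofResult(resultPath, { status: "pass", finishedAt: new Date().toISOString() });
console.log(canFinalize(resultPath)); // true
```

The key design point is that a missing or unreadable result file blocks finalize just like a red result does.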
+
+ ## How To
+
+ Task: Run one complete gate from one file.
+
+ Done when: The endpoint returns 200 and the body contains hello world.
+
+ Run it:
 
  ```bash
- # Natural language → prd.ts → agent loop
- echo "Build a signup flow with email verification" | npx gateproof prdts --out prd.ts
- npx gateproof smoke ./prd.ts
- bun run prd.ts
+ bun run example:hello-world:worker
+ bun run alchemy.run.ts
+ bun run plan.ts
  ```
 
- ## Docs
+ ## Breaking Changes In 0.4.0
+
+ - `Prd.*` is gone
+ - `Claim.*` is gone
+ - `plan.ts` is the canonical entrypoint
+ - `Plan.*` replaces the old front door
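A plain text scan is one way to find call sites that still use the removed namespaces. `findLegacyCalls` below is a hypothetical migration helper, not part of gateproof.

```typescript
// Hypothetical helper: flag source text that still references the
// namespaces removed in 0.4.0.
const removedNamespaces = ["Prd.", "Claim."];

function findLegacyCalls(source: string): string[] {
  return removedNamespaces.filter((ns) => source.includes(ns));
}

// A file calling the old front door trips the check:
console.log(findLegacyCalls("Prd.define({})")); // ["Prd."]
console.log(findLegacyCalls("Plan.define({})")); // []
```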
+
+ ## Reference
+
+ Files:
+ - `examples/hello-world/plan.ts`
+ - `alchemy.run.ts`
+ - `plan.ts`
+
+ Canonical gates:
+ - GET / returns hello world
+
+ Loop:
+ - `maxIterations: 1`
+ - `stopOnFailure: true`
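Read together, those two loop values mean: sweep every gate at most once, and skip the rest of a sweep as soon as one gate fails. A standalone sketch of that semantics follows; it is an illustration only, not gateproof's implementation, and `runLoop` here is a stand-in name.

```typescript
type GateResult = { id: string; pass: boolean };

// Illustrative loop: each iteration is one sweep over the plan's gates.
function runLoop(
  gates: Array<() => GateResult>,
  opts: { maxIterations: number; stopOnFailure: boolean },
): { status: "pass" | "fail"; results: GateResult[] } {
  let results: GateResult[] = [];
  for (let i = 0; i < opts.maxIterations; i++) {
    results = []; // fresh sweep each iteration
    for (const gate of gates) {
      const result = gate();
      results.push(result);
      if (!result.pass && opts.stopOnFailure) {
        break; // skip the remaining gates in this sweep
      }
    }
    if (results.every((r) => r.pass)) {
      return { status: "pass", results };
    }
  }
  return { status: "fail", results };
}

const outcome = runLoop(
  [() => ({ id: "webhook", pass: true }), () => ({ id: "queue", pass: false })],
  { maxIterations: 1, stopOnFailure: true },
);
console.log(outcome.status); // "fail"
console.log(outcome.results.length); // 2 — stopped right after the failing gate
```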
 
- Full documentation, tutorials, and API reference: [gateproof.dev/docs](https://gateproof.dev/docs)
+ Core API:
+ - `Gate.define(...)`
+ - `Plan.define(...)`
+ - `Plan.run(...)`
+ - `Plan.runLoop(...)`
+ - `Cloudflare.observe(...)`
+ - `Assert.hasAction(...)`
+ - `Assert.responseBodyIncludes(...)`
+ - `Assert.numericDeltaFromEnv(...)`
 
- ## License
+ ## Explanation
 
- MIT
+ Root plan.ts stays small. Gateproof itself is built forward.