@devinnn/docdrift 0.1.0 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -3,45 +3,71 @@
  Docs that never lie: detect drift between merged code and docs, then open low-noise, evidence-grounded remediation via Devin sessions.
 
  ## Deliverables
- - TypeScript CLI package (`docdrift`)
+
+ - **npm package**: [@devinnn/docdrift](https://www.npmjs.com/package/@devinnn/docdrift) — TypeScript CLI (`docdrift`)
  - `validate`
  - `detect --base <sha> --head <sha>`
  - `run --base <sha> --head <sha>`
  - `status --since 24h`
+ - `sla-check` — Check for doc-drift PRs open 7+ days and open a reminder issue
  - GitHub Action: `.github/workflows/devin-doc-drift.yml`
  - Repo-local config: `docdrift.yaml`
  - Demo API + OpenAPI exporter + driftable docs
  - PR template + Loom script
 
  ## Why this is low-noise
- - One PR per doc area per day (bundling rule).
- - Global PR/day cap.
- - Confidence gating and allowlist enforcement.
- - Conceptual docs default to issue escalation with targeted questions.
+
+ - **Single session, single PR** — One Devin session handles the whole docsite (API reference + guides).
+ - **Gate on API spec diff** — We only run when OpenAPI drift is detected; no session for docs-check-only failures.
+ - **requireHumanReview** — When the PR touches guides/prose, we open a follow-up issue to direct reviewer attention.
+ - **7-day SLA** — If a doc-drift PR is open 7+ days, we open a reminder issue (configurable `slaDays`; use the `sla-check` CLI or a cron workflow).
+ - Confidence gating and allowlist/exclude enforcement.
  - Idempotency key prevents duplicate actions for the same repo/SHAs/action.
 
- ## Detection tiers
- - Tier 0: docs checks (`npm run docs:check`)
- - Tier 1: OpenAPI drift (`openapi/generated.json` vs `docs/reference/openapi.json`)
- - Tier 2: heuristic path impacts (e.g. `apps/api/src/auth/**` -> `docs/guides/auth.md`)
+ ## Detection and gate
+
+ - **Gate:** We only run a Devin session when **OpenAPI drift** is detected. No drift → no session.
+ - Tier 1: OpenAPI drift (`openapi/generated.json` vs published spec)
+ - Tier 2: Heuristic path impacts from docAreas (e.g. `apps/api/src/auth/**` → guides)
 
 
  Output artifacts (under `.docdrift/`):
+
  - `drift_report.json`
  - `metrics.json` (after `run`)
 
  When you run docdrift as a package (e.g. `npx docdrift` or from another repo), all of this is written to **that repo’s** `.docdrift/` — i.e. the current working directory where the CLI is invoked, not inside the package. Add `.docdrift/` to the consuming repo’s `.gitignore` if you don’t want to commit run artifacts.
 
  ## Core flow (`docdrift run`)
+
  1. Validate config and command availability.
- 2. Build drift report.
+ 2. Build drift report. **Gate:** If no OpenAPI drift, exit (no session).
  3. Policy decision (`OPEN_PR | UPDATE_EXISTING_PR | OPEN_ISSUE | NOOP`).
- 4. Build evidence bundle (`.docdrift/evidence/<runId>/<docArea>` + tarball).
- 5. Upload attachments to Devin v1 and create session.
- 6. Poll session to terminal status.
+ 4. Build one aggregated evidence bundle for the whole docsite.
+ 5. One Devin session with whole-docsite prompt; poll to terminal status.
+ 6. If PR opened and touches `requireHumanReview` paths → create issue to direct attention.
  7. Surface result via GitHub commit comment; open issue on blocked/low-confidence paths.
- 8. Persist state in `.docdrift/state.json` and write `.docdrift/metrics.json`.
+ 8. Persist state (including `lastDocDriftPrUrl` for SLA); write `.docdrift/metrics.json`.
+
+ ## Where the docs are (this repo)
+
+ | Path | Purpose |
+ | ------------------------------------------ | ----------------------------------------------------------------------- |
+ | `apps/docs-site/openapi/openapi.json` | Published OpenAPI spec (docdrift updates this when drift is detected). |
+ | `apps/docs-site/docs/api/` | API reference MDX generated from the spec (`npm run docs:gen`). |
+ | `apps/docs-site/docs/guides/auth.md` | Conceptual auth guide (updated only for conceptual drift). |
+
+ The docsite is a Docusaurus app with `docusaurus-plugin-openapi-docs`. The **generated** spec from code lives at `openapi/generated.json` (from `npm run openapi:export`). Drift = the generated and published specs differ. Verification runs `docs:gen` and `docs:build` so the docsite actually builds.
+
+ ## How Devin updates them
+
+ 1. **Evidence bundle** — Docdrift builds a tarball with the drift report, OpenAPI diff, and impacted doc snippets, and uploads it to the Devin API as session attachments.
+ 2. **Devin session** — Devin is prompted (see `src/devin/prompts.ts`) to update only files under the allowlist (`openapi/**`, `apps/docs-site/**`), make minimal correct edits, run verification (`npm run docs:gen`, `npm run docs:build`), and open **one PR** for the whole docsite with a clear description.
+ 3. **PR** — Devin updates `apps/docs-site/openapi/openapi.json` to match the current API, runs `docs:gen` to regenerate the API reference MDX, and opens a pull request. You review and merge; the docsite builds and the docs are updated.
+
+ So the “fix” is a **PR opened by Devin** that you merge; the repo’s docs don’t change until that PR is merged.
 
  ## Local usage
+
  ```bash
  npm install
  npx tsx src/cli.ts validate
@@ -54,25 +80,29 @@ DEVIN_API_KEY=... GITHUB_TOKEN=... GITHUB_REPOSITORY=owner/repo GITHUB_SHA=<sha>
 
  You can run a full end-to-end demo locally with no remote repo. Ensure `.env` has `DEVIN_API_KEY` (and optionally `GITHUB_TOKEN` only when you have a real repo).
 
- 1. **One-time setup (already done if you have two commits with drift)**
- - Git is inited; baseline commit has docs in sync with API.
+ 1. **One-time setup (already done if you have two commits with drift)**
+ - Git is initialized; the baseline commit has docs in sync with the API.
  - A later commit changes `apps/api/src/model.ts` (e.g. `name` → `fullName`) and runs `npm run openapi:export`, so `openapi/generated.json` drifts from `docs/reference/openapi.json`.
 
  2. **Run the pipeline**
+
  ```bash
  npm install
  npx tsx src/cli.ts validate
  npx tsx src/cli.ts detect --base b0f624f --head 6030902
  ```
+
  - Use your own `git log --oneline -3` to get `base` (older) and `head` (newer) SHAs if you recreated the demo.
 
  3. **Run with Devin (no GitHub calls)**
  Omit `GITHUB_TOKEN` so the CLI does not post comments or create issues. The Devin session still runs; results are printed to stdout and written to `.docdrift/state.json` and `metrics.json`.
+
  ```bash
  export $(grep -v '^#' .env | xargs)
  unset GITHUB_TOKEN GITHUB_REPOSITORY GITHUB_SHA
  npx tsx src/cli.ts run --base b0f624f --head 6030902
  ```
+
  - `run` can take 1–3 minutes while the Devin session runs.
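Polling a session to a terminal status, as `run` does while those minutes pass, can be sketched as a bounded loop; the status names and injected fetcher below are assumptions for illustration, not the Devin API's actual response shape:

```javascript
// Hypothetical sketch: poll an injected status fetcher until the session
// reaches a terminal state or the attempt budget runs out.
async function pollToTerminal(fetchStatus, { maxAttempts = 60, delayMs = 5000 } = {}) {
  const terminal = new Set(["finished", "blocked", "expired"]); // assumed names
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await fetchStatus();
    if (terminal.has(status)) return status;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error("Session did not reach a terminal status in time");
}
```

Injecting `fetchStatus` keeps the loop testable without network access.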
 
  4. **What you’ll see**
@@ -93,14 +123,15 @@ You can run a full end-to-end demo locally with no remote repo. Ensure `.env` ha
  ## Run on GitHub
 
  1. **Create a repo** on GitHub (e.g. `your-org/docdrift`), then add the remote and push:
+
  ```bash
  git remote add origin https://github.com/your-org/docdrift.git
  git push -u origin main
  ```
 
  2. **Add secret**
- Repo → **Settings** → **Secrets and variables** → **Actions** → **New repository secret**
- - Name: `DEVIN_API_KEY`
+ Repo → **Settings** → **Secrets and variables** → **Actions** → **New repository secret**
+ - Name: `DEVIN_API_KEY`
  - Value: your Devin API key (same as in `.env` locally)
 
  `GITHUB_TOKEN` is provided automatically; the workflow uses it for commit comments and issues.
@@ -109,6 +140,25 @@ You can run a full end-to-end demo locally with no remote repo. Ensure `.env` ha
  - **Push to `main`** — runs on every push (compares previous commit vs current).
  - **Manual run** — **Actions** tab → **devin-doc-drift** → **Run workflow** (uses `HEAD` and `HEAD^` as head/base).
 
+ ## See it work (demo on GitHub)
+
+ This repo has **intentional drift**: the API has been expanded (new fields `fullName`, `avatarUrl`, `createdAt`, `role` and a new endpoint `GET /v1/users` with pagination), but the **docs are unchanged** (`docs/reference/openapi.json` and `docs/reference/api.md` still describe the old single endpoint with `id`/`name`/`email` only). Running docdrift will detect that and hand a large diff to Devin to fix via a PR. To see it:
+
+ 1. **Create a new GitHub repo** (e.g. `docdrift-demo`) so you have a clean place to run the workflow.
+ 2. **Push this project with full history** (so both commits are on `main`):
+ ```bash
+ git remote add origin https://github.com/YOUR_ORG/docdrift-demo.git
+ git push -u origin main
+ ```
+ 3. **Add secret** in that repo: **Settings** → **Secrets and variables** → **Actions** → `DEVIN_API_KEY` = your Devin API key.
+ 4. **Trigger the workflow**
+ - Either push another small commit (e.g. a README tweak), or
+ - **Actions** → **devin-doc-drift** → **Run workflow**.
+ 5. **Where to look**
+ - **Actions** → open the run → **Run Doc Drift** step: the step logs print JSON with `sessionUrl`, `prUrl`, and `outcome` for the run. Open any `sessionUrl` in your browser to see the Devin session.
+ - **Artifacts**: download **docdrift-artifacts** for `.docdrift/drift_report.json`, `.docdrift/metrics.json`, and evidence.
+ - **Devin dashboard**: sessions are tagged `docdrift`; you’ll see the run there once the step completes (often 1–3 minutes).
+
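The 7-day SLA mentioned above reduces to date arithmetic on the PR's open timestamp; a minimal sketch (the real `sla-check` also filters PRs by the `slaLabel` label):

```javascript
// Hypothetical sketch: a PR breaches the SLA when it has been open for at
// least `slaDays` full days. `openedAt` and `now` are ISO timestamps.
function isPastSla(openedAt, now, slaDays = 7) {
  const msPerDay = 24 * 60 * 60 * 1000;
  const openDays = (new Date(now) - new Date(openedAt)) / msPerDay;
  return slaDays > 0 && openDays >= slaDays; // slaDays: 0 disables the check
}
```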
  ## Using in another repo (published package)
 
  Once published to npm, any repo can use the CLI locally or in GitHub Actions.
@@ -116,14 +166,14 @@ Once published to npm, any repo can use the CLI locally or in GitHub Actions.
  1. **In the consuming repo** add a `docdrift.yaml` at the root (see this repo’s `docdrift.yaml` and `docdrift-yml.md`).
  2. **CLI**
  ```bash
- npx docdrift@latest validate
- npx docdrift@latest detect --base <base-sha> --head <head-sha>
+ npx @devinnn/docdrift validate
+ npx @devinnn/docdrift detect --base <base-sha> --head <head-sha>
  # With env for run:
- DEVIN_API_KEY=... GITHUB_TOKEN=... GITHUB_REPOSITORY=owner/repo GITHUB_SHA=<sha> npx docdrift@latest run --base <base-sha> --head <head-sha>
+ DEVIN_API_KEY=... GITHUB_TOKEN=... GITHUB_REPOSITORY=owner/repo GITHUB_SHA=<sha> npx @devinnn/docdrift run --base <base-sha> --head <head-sha>
  ```
  3. **GitHub Actions** — add a step that runs the CLI (e.g. after checkout and setting base/head):
  ```yaml
- - run: npx docdrift@latest run --base ${{ steps.shas.outputs.base }} --head ${{ steps.shas.outputs.head }}
+ - run: npx @devinnn/docdrift run --base ${{ steps.shas.outputs.base }} --head ${{ steps.shas.outputs.head }}
  env:
  DEVIN_API_KEY: ${{ secrets.DEVIN_API_KEY }}
  GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
@@ -140,8 +190,10 @@ Once published to npm, any repo can use the CLI locally or in GitHub Actions.
  - Only the `dist/` directory is included (`files` in `package.json`). Consumers get the built CLI; they provide their own `docdrift.yaml` in their repo.
 
  ## Demo scenario
+
  - Autogen drift: rename a field in `apps/api/src/model.ts`, merge to `main`, observe the docs PR path.
  - Conceptual drift: change auth behavior under `apps/api/src/auth/**`, merge to `main`, observe a single escalation issue.
 
  ## Loom
+
  See `loom.md` for the minute-by-minute recording script.
package/dist/src/cli.js CHANGED
@@ -1,6 +1,11 @@
  #!/usr/bin/env node
  "use strict";
+ var __importDefault = (this && this.__importDefault) || function (mod) {
+ return (mod && mod.__esModule) ? mod : { "default": mod };
+ };
  Object.defineProperty(exports, "__esModule", { value: true });
+ const node_fs_1 = __importDefault(require("node:fs"));
+ const node_path_1 = __importDefault(require("node:path"));
  const index_1 = require("./index");
  function getArg(args, flag) {
  const index = args.indexOf(flag);
@@ -12,7 +17,7 @@ function getArg(args, flag) {
  async function main() {
  const [, , command, ...args] = process.argv;
  if (!command) {
- throw new Error("Usage: docdrift <validate|detect|run|status> [options]");
+ throw new Error("Usage: docdrift <validate|detect|run|status|sla-check> [options]");
  }
  switch (command) {
  case "validate": {
@@ -32,6 +37,9 @@ async function main() {
  const headSha = (0, index_1.requireSha)(getArg(args, "--head"), "--head");
  const trigger = (0, index_1.resolveTrigger)(process.env.GITHUB_EVENT_NAME);
  const results = await (0, index_1.runDocDrift)({ baseSha, headSha, trigger });
+ const outPath = node_path_1.default.resolve(".docdrift", "run-output.json");
+ node_fs_1.default.mkdirSync(node_path_1.default.dirname(outPath), { recursive: true });
+ node_fs_1.default.writeFileSync(outPath, JSON.stringify(results, null, 2), "utf-8");
  console.log(JSON.stringify(results, null, 2));
  return;
  }
@@ -41,6 +49,10 @@ async function main() {
  await (0, index_1.runStatus)(sinceHours);
  return;
  }
+ case "sla-check": {
+ await (0, index_1.runSlaCheck)();
+ return;
+ }
  default:
  throw new Error(`Unknown command: ${command}`);
  }
@@ -4,9 +4,11 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
  };
  Object.defineProperty(exports, "__esModule", { value: true });
  exports.loadConfig = loadConfig;
+ exports.loadNormalizedConfig = loadNormalizedConfig;
  const node_fs_1 = __importDefault(require("node:fs"));
  const node_path_1 = __importDefault(require("node:path"));
  const js_yaml_1 = __importDefault(require("js-yaml"));
+ const normalize_1 = require("./normalize");
  const schema_1 = require("./schema");
  function loadConfig(configPath = "docdrift.yaml") {
  const resolved = node_path_1.default.resolve(configPath);
@@ -21,5 +23,23 @@ function loadConfig(configPath = "docdrift.yaml") {
  .join("\n");
  throw new Error(`Invalid config:\n${message}`);
  }
- return result.data;
+ const data = result.data;
+ if (data.devin.customInstructions?.length) {
+ const configDir = node_path_1.default.dirname(resolved);
+ const contents = [];
+ for (const p of data.devin.customInstructions) {
+ const fullPath = node_path_1.default.resolve(configDir, p);
+ if (!node_fs_1.default.existsSync(fullPath)) {
+ throw new Error(`Custom instructions file not found: ${fullPath}`);
+ }
+ contents.push(node_fs_1.default.readFileSync(fullPath, "utf8"));
+ }
+ data.devin.customInstructionContent = contents.join("\n\n");
+ }
+ return data;
+ }
+ /** Load and normalize config for use by detection/run (always has openapi, docsite, etc.) */
+ function loadNormalizedConfig(configPath = "docdrift.yaml") {
+ const config = loadConfig(configPath);
+ return (0, normalize_1.normalizeConfig)(config);
  }
@@ -0,0 +1,68 @@
+ "use strict";
+ var __importDefault = (this && this.__importDefault) || function (mod) {
+ return (mod && mod.__esModule) ? mod : { "default": mod };
+ };
+ Object.defineProperty(exports, "__esModule", { value: true });
+ exports.normalizeConfig = normalizeConfig;
+ const node_path_1 = __importDefault(require("node:path"));
+ /**
+ * Produce a normalized config that the rest of the app consumes.
+ * Derives openapi/docsite/exclude/requireHumanReview from docAreas when using legacy config.
+ */
+ function normalizeConfig(config) {
+ let openapi;
+ let docsite;
+ let exclude = config.exclude ?? [];
+ let requireHumanReview = config.requireHumanReview ?? [];
+ if (config.openapi && config.docsite) {
+ // Simple config
+ openapi = config.openapi;
+ docsite = Array.isArray(config.docsite) ? config.docsite : [config.docsite];
+ }
+ else if (config.docAreas && config.docAreas.length > 0) {
+ // Legacy: derive from docAreas
+ const firstOpenApiArea = config.docAreas.find((a) => a.detect.openapi);
+ if (!firstOpenApiArea?.detect.openapi) {
+ throw new Error("Legacy config requires at least one docArea with detect.openapi");
+ }
+ const o = firstOpenApiArea.detect.openapi;
+ openapi = {
+ export: o.exportCmd,
+ generated: o.generatedPath,
+ published: o.publishedPath,
+ };
+ const allPaths = [o.publishedPath];
+ for (const area of config.docAreas) {
+ area.patch.targets?.forEach((t) => allPaths.push(t));
+ area.detect.paths?.forEach((p) => p.impacts.forEach((i) => allPaths.push(i)));
+ }
+ const roots = new Set();
+ for (const p of allPaths) {
+ const parts = p.split("/").filter(Boolean);
+ if (parts.length >= 2)
+ roots.add(parts[0] + "/" + parts[1]);
+ else if (parts.length === 1)
+ roots.add(parts[0]);
+ }
+ docsite = roots.size > 0 ? [...roots] : [node_path_1.default.dirname(o.publishedPath) || "."];
+ // Derive requireHumanReview from areas with requireHumanConfirmation or conceptual mode
+ const reviewPaths = new Set();
+ for (const area of config.docAreas) {
+ if (area.patch.requireHumanConfirmation || area.mode === "conceptual") {
+ area.patch.targets?.forEach((t) => reviewPaths.add(t));
+ area.detect.paths?.forEach((p) => p.impacts.forEach((i) => reviewPaths.add(i)));
+ }
+ }
+ requireHumanReview = [...reviewPaths];
+ }
+ else {
+ throw new Error("Config must include (openapi + docsite) or docAreas");
+ }
+ return {
+ ...config,
+ openapi,
+ docsite,
+ exclude,
+ requireHumanReview,
+ };
+ }
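The legacy-config branch above guesses docsite roots by collapsing every known doc path to its first one or two segments; that step restated standalone:

```javascript
// Restatement of the root-derivation step from normalizeConfig: each path is
// reduced to its first one or two segments and deduplicated via a Set.
function deriveDocsiteRoots(allPaths) {
  const roots = new Set();
  for (const p of allPaths) {
    const parts = p.split("/").filter(Boolean);
    if (parts.length >= 2) roots.add(parts[0] + "/" + parts[1]);
    else if (parts.length === 1) roots.add(parts[0]);
  }
  return [...roots];
}
```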
@@ -1,55 +1,85 @@
  "use strict";
  Object.defineProperty(exports, "__esModule", { value: true });
- exports.docDriftConfigSchema = void 0;
+ exports.docDriftConfigSchema = exports.openApiSimpleSchema = void 0;
  const zod_1 = require("zod");
  const pathRuleSchema = zod_1.z.object({
  match: zod_1.z.string().min(1),
- impacts: zod_1.z.array(zod_1.z.string().min(1)).min(1)
+ impacts: zod_1.z.array(zod_1.z.string().min(1)).min(1),
  });
  const openApiDetectSchema = zod_1.z.object({
  exportCmd: zod_1.z.string().min(1),
  generatedPath: zod_1.z.string().min(1),
- publishedPath: zod_1.z.string().min(1)
+ publishedPath: zod_1.z.string().min(1),
+ });
+ /** Simple config: short field names for openapi block */
+ exports.openApiSimpleSchema = zod_1.z.object({
+ export: zod_1.z.string().min(1),
+ generated: zod_1.z.string().min(1),
+ published: zod_1.z.string().min(1),
  });
  const docAreaSchema = zod_1.z.object({
  name: zod_1.z.string().min(1),
  mode: zod_1.z.enum(["autogen", "conceptual"]),
  owners: zod_1.z.object({
- reviewers: zod_1.z.array(zod_1.z.string().min(1)).min(1)
+ reviewers: zod_1.z.array(zod_1.z.string().min(1)).min(1),
  }),
  detect: zod_1.z
  .object({
  openapi: openApiDetectSchema.optional(),
- paths: zod_1.z.array(pathRuleSchema).optional()
+ paths: zod_1.z.array(pathRuleSchema).optional(),
  })
  .refine((v) => Boolean(v.openapi) || Boolean(v.paths?.length), {
- message: "docArea.detect must include openapi or paths"
+ message: "docArea.detect must include openapi or paths",
  }),
  patch: zod_1.z.object({
  targets: zod_1.z.array(zod_1.z.string().min(1)).optional(),
- requireHumanConfirmation: zod_1.z.boolean().optional().default(false)
- })
+ requireHumanConfirmation: zod_1.z.boolean().optional().default(false),
+ }),
  });
- exports.docDriftConfigSchema = zod_1.z.object({
+ const policySchema = zod_1.z.object({
+ prCaps: zod_1.z.object({
+ maxPrsPerDay: zod_1.z.number().int().positive().default(1),
+ maxFilesTouched: zod_1.z.number().int().positive().default(12),
+ }),
+ confidence: zod_1.z.object({
+ autopatchThreshold: zod_1.z.number().min(0).max(1).default(0.8),
+ }),
+ allowlist: zod_1.z.array(zod_1.z.string().min(1)).min(1),
+ verification: zod_1.z.object({
+ commands: zod_1.z.array(zod_1.z.string().min(1)).min(1),
+ }),
+ /** Days before opening SLA issue for unmerged doc-drift PRs. 0 = disabled. */
+ slaDays: zod_1.z.number().int().min(0).optional().default(7),
+ /** Label to identify doc-drift PRs for SLA check (only these PRs count). */
+ slaLabel: zod_1.z.string().min(1).optional().default("docdrift"),
+ /**
+ * If false (default): Devin may only edit existing files. No new articles, no new folders.
+ * If true: Devin may add new articles, create folders, change information architecture.
+ * Gives teams control to prevent doc sprawl; mainly applies to conceptual/guides.
+ */
+ allowNewFiles: zod_1.z.boolean().optional().default(false),
+ });
+ exports.docDriftConfigSchema = zod_1.z
+ .object({
  version: zod_1.z.literal(1),
+ /** Simple config: openapi block (API spec = gate for run) */
+ openapi: exports.openApiSimpleSchema.optional(),
+ /** Simple config: docsite root path(s) */
+ docsite: zod_1.z.union([zod_1.z.string().min(1), zod_1.z.array(zod_1.z.string().min(1))]).optional(),
+ /** Paths we never touch (glob patterns) */
+ exclude: zod_1.z.array(zod_1.z.string().min(1)).optional().default([]),
+ /** Paths that require human review when touched (we create issue post-PR) */
+ requireHumanReview: zod_1.z.array(zod_1.z.string().min(1)).optional().default([]),
  devin: zod_1.z.object({
  apiVersion: zod_1.z.literal("v1"),
  unlisted: zod_1.z.boolean().default(true),
  maxAcuLimit: zod_1.z.number().int().positive().default(2),
- tags: zod_1.z.array(zod_1.z.string().min(1)).default(["docdrift"])
+ tags: zod_1.z.array(zod_1.z.string().min(1)).default(["docdrift"]),
+ customInstructions: zod_1.z.array(zod_1.z.string().min(1)).optional(),
+ customInstructionContent: zod_1.z.string().optional(),
  }),
- policy: zod_1.z.object({
- prCaps: zod_1.z.object({
- maxPrsPerDay: zod_1.z.number().int().positive().default(1),
- maxFilesTouched: zod_1.z.number().int().positive().default(12)
- }),
- confidence: zod_1.z.object({
- autopatchThreshold: zod_1.z.number().min(0).max(1).default(0.8)
- }),
- allowlist: zod_1.z.array(zod_1.z.string().min(1)).min(1),
- verification: zod_1.z.object({
- commands: zod_1.z.array(zod_1.z.string().min(1)).min(1)
- })
- }),
- docAreas: zod_1.z.array(docAreaSchema).min(1)
- });
+ policy: policySchema,
+ /** Legacy: doc areas (optional when openapi+docsite present) */
+ docAreas: zod_1.z.array(docAreaSchema).optional().default([]),
+ })
+ .refine((v) => (v.openapi && v.docsite) || v.docAreas.length >= 1, { message: "Config must include (openapi + docsite) or docAreas" });
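Pieced together from the schema above, a simple-style `docdrift.yaml` might look like the following sketch; the paths and commands are illustrative (taken from this repo's README), and defaults such as `slaDays: 7` and `allowNewFiles: false` are shown explicitly:

```yaml
version: 1
openapi:
  export: npm run openapi:export      # command that regenerates the spec
  generated: openapi/generated.json   # spec derived from code
  published: apps/docs-site/openapi/openapi.json
docsite: apps/docs-site
exclude: []
requireHumanReview:
  - apps/docs-site/docs/guides/**
policy:
  prCaps:
    maxPrsPerDay: 1
    maxFilesTouched: 12
  confidence:
    autopatchThreshold: 0.8
  allowlist:
    - openapi/**
    - apps/docs-site/**
  verification:
    commands:
      - npm run docs:gen
      - npm run docs:build
  slaDays: 7
  slaLabel: docdrift
  allowNewFiles: false
devin:
  apiVersion: v1
  unlisted: true
  maxAcuLimit: 2
  tags: [docdrift]
```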
@@ -13,9 +13,8 @@ async function validateRuntimeConfig(config) {
  }
  const commandSet = new Set([
  ...config.policy.verification.commands,
- ...config.docAreas
- .map((area) => area.detect.openapi?.exportCmd)
- .filter((value) => Boolean(value))
+ ...(config.openapi ? [config.openapi.export] : []),
+ ...(config.docAreas ?? []).map((area) => area.detect.openapi?.exportCmd).filter((value) => Boolean(value)),
  ]);
  for (const command of commandSet) {
  const binary = commandBinary(command);
@@ -24,7 +23,7 @@ async function validateRuntimeConfig(config) {
  errors.push(`Command not found for '${command}' (binary: ${binary})`);
  }
  }
- for (const area of config.docAreas) {
+ for (const area of config.docAreas ?? []) {
  if (area.mode === "autogen" && !area.patch.targets?.length) {
  warnings.push(`docArea '${area.name}' is autogen but has no patch.targets`);
  }
@@ -21,7 +21,7 @@ async function runDocsChecks(commands, evidenceDir) {
  "\n--- stdout ---",
  result.stdout,
  "\n--- stderr ---",
- result.stderr
+ result.stderr,
  ].join("\n"), "utf8");
  logs.push(logPath);
  commandResults.push({ command, exitCode: result.exitCode, logPath });
@@ -31,7 +31,7 @@ async function runDocsChecks(commands, evidenceDir) {
  return {
  logs,
  commandResults,
- summary: "Docs checks passed"
+ summary: "Docs checks passed",
  };
  }
  return {
@@ -42,7 +42,7 @@ async function runDocsChecks(commands, evidenceDir) {
  kind: "docs_check_failed",
  tier: 0,
  confidence: 0.99,
- evidence: failed.map((result) => result.logPath)
- }
+ evidence: failed.map((result) => result.logPath),
+ },
  };
  }
@@ -38,7 +38,7 @@ function detectHeuristicImpacts(docArea, changedPaths, evidenceDir) {
  kind: "heuristic_path_impact",
  tier: 2,
  confidence: 0.67,
- evidence: [evidencePath]
- }
+ evidence: [evidencePath],
+ },
  };
  }
@@ -7,18 +7,8 @@ exports.buildDriftReport = buildDriftReport;
  const node_path_1 = __importDefault(require("node:path"));
  const fs_1 = require("../utils/fs");
  const git_1 = require("../utils/git");
- const docsCheck_1 = require("./docsCheck");
  const heuristics_1 = require("./heuristics");
  const openapi_1 = require("./openapi");
- function defaultRecommendation(mode, signals) {
- if (!signals.length) {
- return "NOOP";
- }
- if (mode === "autogen") {
- return signals.some((s) => s.tier <= 1) ? "OPEN_PR" : "OPEN_ISSUE";
- }
- return "OPEN_ISSUE";
- }
  async function buildDriftReport(input) {
  const runInfo = {
  runId: `${Date.now()}`,
@@ -26,32 +16,47 @@ async function buildDriftReport(input) {
  baseSha: input.baseSha,
  headSha: input.headSha,
  trigger: input.trigger,
- timestamp: new Date().toISOString()
+ timestamp: new Date().toISOString(),
  };
  const evidenceRoot = node_path_1.default.resolve(".docdrift", "evidence", runInfo.runId);
  (0, fs_1.ensureDir)(evidenceRoot);
  const changedPaths = await (0, git_1.gitChangedPaths)(input.baseSha, input.headSha);
  const diffSummary = await (0, git_1.gitDiffSummary)(input.baseSha, input.headSha);
  const commits = await (0, git_1.gitCommitList)(input.baseSha, input.headSha);
- const docsCheck = await (0, docsCheck_1.runDocsChecks)(input.config.policy.verification.commands, evidenceRoot);
- const items = [];
- const checkSummaries = [docsCheck.summary];
+ (0, fs_1.writeJsonFile)(node_path_1.default.join(evidenceRoot, "changeset.json"), {
+ changedPaths,
+ diffSummary,
+ commits,
+ });
+ // Gate: run OpenAPI detection first. If no OpenAPI drift, exit (no session).
+ const openapiResult = await (0, openapi_1.detectOpenApiDriftFromNormalized)(input.config, evidenceRoot);
+ if (!openapiResult.signal) {
+ // No OpenAPI drift — gate closed. Return empty.
+ const report = {
+ run: {
+ repo: input.repo,
+ baseSha: input.baseSha,
+ headSha: input.headSha,
+ trigger: input.trigger,
+ timestamp: runInfo.timestamp,
+ },
+ items: [],
+ };
+ (0, fs_1.writeJsonFile)(node_path_1.default.resolve(".docdrift", "drift_report.json"), report);
+ return {
+ report,
+ aggregated: null,
+ changedPaths,
+ evidenceRoot,
+ runInfo,
+ hasOpenApiDrift: false,
+ };
+ }
+ // Gate passed: aggregate signals and impacted docs.
+ const signals = [openapiResult.signal];
+ const impactedDocs = new Set(openapiResult.impactedDocs);
+ const summaries = [openapiResult.summary];
  for (const docArea of input.config.docAreas) {
- const signals = [];
- const impactedDocs = new Set(docArea.patch.targets ?? []);
- const summaries = [];
- if (docsCheck.signal) {
- signals.push(docsCheck.signal);
- summaries.push(docsCheck.summary);
- }
- if (docArea.detect.openapi) {
- const openapiResult = await (0, openapi_1.detectOpenApiDrift)(docArea, evidenceRoot);
- if (openapiResult.signal) {
- signals.push(openapiResult.signal);
- }
- openapiResult.impactedDocs.forEach((doc) => impactedDocs.add(doc));
- summaries.push(openapiResult.summary);
- }
  if (docArea.detect.paths?.length) {
  const heuristicResult = (0, heuristics_1.detectHeuristicImpacts)(docArea, changedPaths, evidenceRoot);
  if (heuristicResult.signal) {
@@ -60,33 +65,37 @@ async function buildDriftReport(input) {
  heuristicResult.impactedDocs.forEach((doc) => impactedDocs.add(doc));
  summaries.push(heuristicResult.summary);
  }
- if (!signals.length) {
- continue;
- }
- items.push({
- docArea: docArea.name,
- mode: docArea.mode,
- signals,
- impactedDocs: [...impactedDocs],
- recommendedAction: defaultRecommendation(docArea.mode, signals),
- summary: summaries.filter(Boolean).join(" | ")
- });
  }
+ const aggregated = {
+ signals,
+ impactedDocs: [...impactedDocs],
+ summary: summaries.filter(Boolean).join(" | "),
+ };
+ const item = {
+ docArea: "docsite",
+ mode: "autogen",
+ signals: aggregated.signals,
+ impactedDocs: aggregated.impactedDocs,
+ recommendedAction: aggregated.signals.some((s) => s.tier <= 1) ? "OPEN_PR" : "OPEN_ISSUE",
+ summary: aggregated.summary,
+ };
  const report = {
  run: {
  repo: input.repo,
  baseSha: input.baseSha,
  headSha: input.headSha,
  trigger: input.trigger,
- timestamp: runInfo.timestamp
+ timestamp: runInfo.timestamp,
  },
- items
+ items: [item],
  };
  (0, fs_1.writeJsonFile)(node_path_1.default.resolve(".docdrift", "drift_report.json"), report);
- (0, fs_1.writeJsonFile)(node_path_1.default.join(evidenceRoot, "changeset.json"), {
+ return {
+ report,
+ aggregated,
  changedPaths,
- diffSummary,
- commits
- });
- return { report, changedPaths, evidenceRoot, runInfo, checkSummaries };
+ evidenceRoot,
+ runInfo,
+ hasOpenApiDrift: true,
+ };
  }
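The `recommendedAction` expression in the aggregated item above encodes the policy that spec-level drift (tier 0 or 1) earns a PR while heuristic-only drift (tier 2) escalates to an issue; restated standalone:

```javascript
// Restatement of the recommendedAction expression from the drift report:
// strong signals (tier <= 1) open a PR; heuristic-only signals open an issue.
function recommendedAction(signals) {
  return signals.some((s) => s.tier <= 1) ? "OPEN_PR" : "OPEN_ISSUE";
}
```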