@devinnn/docdrift 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +147 -0
- package/dist/src/cli.js +51 -0
- package/dist/src/config/load.js +25 -0
- package/dist/src/config/schema.js +55 -0
- package/dist/src/config/validate.js +36 -0
- package/dist/src/detect/docsCheck.js +48 -0
- package/dist/src/detect/heuristics.js +44 -0
- package/dist/src/detect/index.js +92 -0
- package/dist/src/detect/openapi.js +123 -0
- package/dist/src/devin/prompts.js +55 -0
- package/dist/src/devin/schemas.js +99 -0
- package/dist/src/devin/v1.js +105 -0
- package/dist/src/evidence/bundle.js +81 -0
- package/dist/src/github/client.js +86 -0
- package/dist/src/index.js +375 -0
- package/dist/src/model/state.js +10 -0
- package/dist/src/model/types.js +2 -0
- package/dist/src/policy/confidence.js +31 -0
- package/dist/src/policy/engine.js +108 -0
- package/dist/src/policy/state.js +17 -0
- package/dist/src/utils/exec.js +24 -0
- package/dist/src/utils/fs.js +39 -0
- package/dist/src/utils/git.js +33 -0
- package/dist/src/utils/glob.js +21 -0
- package/dist/src/utils/hash.js +10 -0
- package/dist/src/utils/json.js +19 -0
- package/dist/src/utils/log.js +26 -0
- package/package.json +42 -0
package/README.md
ADDED
@@ -0,0 +1,147 @@
# Devin Doc Drift (Client E: DataStack)

Docs that never lie: detect drift between merged code and docs, then open low-noise, evidence-grounded remediation via Devin sessions.

## Deliverables
- TypeScript CLI package (`docdrift`)
  - `validate`
  - `detect --base <sha> --head <sha>`
  - `run --base <sha> --head <sha>`
  - `status --since 24h`
- GitHub Action: `.github/workflows/devin-doc-drift.yml`
- Repo-local config: `docdrift.yaml`
- Demo API + OpenAPI exporter + driftable docs
- PR template + Loom script
## Why this is low-noise
- One PR per doc area per day (bundling rule).
- Global PR/day cap.
- Confidence gating and allowlist enforcement.
- Conceptual docs default to issue escalation with targeted questions.
- An idempotency key prevents duplicate actions for the same repo/SHAs/action.
## Detection tiers
- Tier 0: docs checks (`npm run docs:check`)
- Tier 1: OpenAPI drift (`openapi/generated.json` vs `docs/reference/openapi.json`)
- Tier 2: heuristic path impacts (e.g. `apps/api/src/auth/**` -> `docs/guides/auth.md`)

Output artifacts (under `.docdrift/`):
- `drift_report.json`
- `metrics.json` (after `run`)

When you run docdrift as a package (e.g. `npx docdrift` or from another repo), all of this is written to **that repo's** `.docdrift/` — i.e. the current working directory where the CLI is invoked, not inside the package. Add `.docdrift/` to the consuming repo's `.gitignore` if you don't want to commit run artifacts.
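Tier 2 works by mapping changed source paths to impacted docs through glob rules like the `apps/api/src/auth/**` example above. A minimal self-contained sketch of that mapping, with a simplified glob-to-RegExp conversion (the package's own matcher in `dist/src/utils/glob.js` may support more syntax; `globToRegExp` and `impactedDocs` are illustrative names):

```javascript
// Simplified sketch of Tier-2 heuristics: translate a glob ("**" = any
// path segment run, "*" = within one segment) into a RegExp and collect
// the docs impacted by each changed path.
function globToRegExp(glob) {
  const escaped = glob.replace(/[.+^${}()|[\]\\]/g, "\\$&");
  return new RegExp(
    "^" +
      escaped
        .replace(/\*\*/g, "\u0000")
        .replace(/\*/g, "[^/]*")
        .replace(/\u0000/g, ".*") +
      "$"
  );
}

function impactedDocs(rules, changedPaths) {
  const docs = new Set();
  for (const rule of rules) {
    const re = globToRegExp(rule.match);
    for (const p of changedPaths) {
      if (re.test(p)) rule.impacts.forEach((d) => docs.add(d));
    }
  }
  return [...docs];
}

const rules = [{ match: "apps/api/src/auth/**", impacts: ["docs/guides/auth.md"] }];
console.log(JSON.stringify(impactedDocs(rules, ["apps/api/src/auth/session.ts"])));
// ["docs/guides/auth.md"]
```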
## Core flow (`docdrift run`)
1. Validate config and command availability.
2. Build drift report.
3. Policy decision (`OPEN_PR | UPDATE_EXISTING_PR | OPEN_ISSUE | NOOP`).
4. Build evidence bundle (`.docdrift/evidence/<runId>/<docArea>` + tarball).
5. Upload attachments to Devin v1 and create session.
6. Poll session to terminal status.
7. Surface result via GitHub commit comment; open issue on blocked/low-confidence paths.
8. Persist state in `.docdrift/state.json` and write `.docdrift/metrics.json`.
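Step 3's decision can be sketched, in simplified form, from rules stated elsewhere in this README: conceptual areas escalate to issues, while autogen areas are gated on the confidence threshold. The real engine in `dist/src/policy/engine.js` also applies PR caps, allowlists, and existing-PR state, which this sketch omits:

```javascript
// Simplified policy sketch (caps, allowlists, and UPDATE_EXISTING_PR
// handling are omitted). `decide` is an illustrative name.
function decide(mode, maxConfidence, threshold) {
  if (maxConfidence === undefined) return "NOOP"; // no drift signals
  if (mode === "conceptual") return "OPEN_ISSUE"; // escalate with questions
  return maxConfidence >= threshold ? "OPEN_PR" : "OPEN_ISSUE"; // autogen gate
}

console.log(decide("autogen", 0.95, 0.8));    // OPEN_PR
console.log(decide("autogen", 0.67, 0.8));    // OPEN_ISSUE
console.log(decide("conceptual", 0.95, 0.8)); // OPEN_ISSUE
```

The 0.95 and 0.67 values mirror the confidences the Tier-1 and Tier-2 detectors emit, and 0.8 is the schema's default `autopatchThreshold`.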
## Local usage
```bash
npm install
npx tsx src/cli.ts validate
npm run openapi:export
npx tsx src/cli.ts detect --base <sha> --head <sha>
DEVIN_API_KEY=... GITHUB_TOKEN=... GITHUB_REPOSITORY=owner/repo GITHUB_SHA=<sha> npx tsx src/cli.ts run --base <sha> --head <sha>
```
## Local demo (no GitHub)

You can run a full end-to-end demo locally with no remote repo. Ensure `.env` has `DEVIN_API_KEY` (and optionally `GITHUB_TOKEN` only when you have a real repo).

1. **One-time setup (already done if you have two commits with drift)**
   - Git is initialized; the baseline commit has docs in sync with the API.
   - A later commit changes `apps/api/src/model.ts` (e.g. `name` → `fullName`) and runs `npm run openapi:export`, so `openapi/generated.json` drifts from `docs/reference/openapi.json`.

2. **Run the pipeline**
   ```bash
   npm install
   npx tsx src/cli.ts validate
   npx tsx src/cli.ts detect --base b0f624f --head 6030902
   ```
   - Use your own `git log --oneline -3` to get `base` (older) and `head` (newer) SHAs if you recreated the demo.

3. **Run with Devin (no GitHub calls)**
   Omit `GITHUB_TOKEN` so the CLI does not post comments or create issues. The Devin session still runs; results are printed to stdout and written to `.docdrift/state.json` and `metrics.json`.
   ```bash
   export $(grep -v '^#' .env | xargs)
   unset GITHUB_TOKEN GITHUB_REPOSITORY GITHUB_SHA
   npx tsx src/cli.ts run --base b0f624f --head 6030902
   ```
   - `run` can take 1–3 minutes while the Devin session runs.

4. **What you'll see**
   - `.docdrift/drift_report.json` — drift items (e.g. OpenAPI `name` → `fullName`).
   - `.docdrift/evidence/<runId>/` — evidence bundles and OpenAPI diff.
   - Stdout — per-doc-area outcome (e.g. PR opened by Devin or blocked).
   - `.docdrift/metrics.json` — counts and timing.
## CI usage

- Add secret: `DEVIN_API_KEY`
- Push to `main` or run `workflow_dispatch`
- Action uploads:
  - `.docdrift/drift_report.json`
  - `.docdrift/evidence/**`
  - `.docdrift/metrics.json`
## Run on GitHub

1. **Create a repo** on GitHub (e.g. `your-org/docdrift`), then add the remote and push:
   ```bash
   git remote add origin https://github.com/your-org/docdrift.git
   git push -u origin main
   ```

2. **Add secret**
   Repo → **Settings** → **Secrets and variables** → **Actions** → **New repository secret**
   - Name: `DEVIN_API_KEY`
   - Value: your Devin API key (same as in `.env` locally)

   `GITHUB_TOKEN` is provided automatically; the workflow uses it for commit comments and issues.

3. **Trigger the workflow**
   - **Push to `main`** — runs on every push (compares previous commit vs current).
   - **Manual run** — **Actions** tab → **devin-doc-drift** → **Run workflow** (uses `HEAD` and `HEAD^` as head/base).
## Using in another repo (published package)

Once published to npm, any repo can use the CLI locally or in GitHub Actions.

1. **In the consuming repo** add a `docdrift.yaml` at the root (see this repo's `docdrift.yaml` and `docdrift-yml.md`).
2. **CLI**
   ```bash
   npx docdrift@latest validate
   npx docdrift@latest detect --base <base-sha> --head <head-sha>
   # With env for run:
   DEVIN_API_KEY=... GITHUB_TOKEN=... GITHUB_REPOSITORY=owner/repo GITHUB_SHA=<sha> npx docdrift@latest run --base <base-sha> --head <head-sha>
   ```
3. **GitHub Actions** — add a step that runs the CLI (e.g. after checkout and setting base/head):
   ```yaml
   - run: npx docdrift@latest run --base ${{ steps.shas.outputs.base }} --head ${{ steps.shas.outputs.head }}
     env:
       DEVIN_API_KEY: ${{ secrets.DEVIN_API_KEY }}
       GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
       GITHUB_REPOSITORY: ${{ github.repository }}
       GITHUB_SHA: ${{ github.sha }}
   ```
   Add repo secret `DEVIN_API_KEY`; `GITHUB_TOKEN` is provided by the runner.
## Publishing the package

- Set `"private": false` in `package.json` (or omit it).
- Set `"repository": { "type": "git", "url": "https://github.com/your-org/docdrift.git" }`.
- Run `pnpm build` (or `npm run build`), then `npm publish` (for a scoped package use `npm publish --access public`).
- Only the `dist/` directory is included (`files` in `package.json`). Consumers get the built CLI; they provide their own `docdrift.yaml` in their repo.
## Demo scenario
- Autogen drift: rename a field in `apps/api/src/model.ts`, merge to `main`, observe the docs PR path.
- Conceptual drift: change auth behavior under `apps/api/src/auth/**`, merge to `main`, observe a single escalation issue.

## Loom
See `loom.md` for the minute-by-minute recording script.
package/dist/src/cli.js
ADDED
@@ -0,0 +1,51 @@
#!/usr/bin/env node
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const index_1 = require("./index");
function getArg(args, flag) {
    const index = args.indexOf(flag);
    if (index === -1) {
        return undefined;
    }
    return args[index + 1];
}
async function main() {
    const [, , command, ...args] = process.argv;
    if (!command) {
        throw new Error("Usage: docdrift <validate|detect|run|status> [options]");
    }
    switch (command) {
        case "validate": {
            await (0, index_1.runValidate)();
            return;
        }
        case "detect": {
            const baseSha = (0, index_1.requireSha)(getArg(args, "--base"), "--base");
            const headSha = (0, index_1.requireSha)(getArg(args, "--head"), "--head");
            const trigger = (0, index_1.resolveTrigger)(process.env.GITHUB_EVENT_NAME);
            const result = await (0, index_1.runDetect)({ baseSha, headSha, trigger });
            process.exitCode = result.hasDrift ? 1 : 0;
            return;
        }
        case "run": {
            const baseSha = (0, index_1.requireSha)(getArg(args, "--base"), "--base");
            const headSha = (0, index_1.requireSha)(getArg(args, "--head"), "--head");
            const trigger = (0, index_1.resolveTrigger)(process.env.GITHUB_EVENT_NAME);
            const results = await (0, index_1.runDocDrift)({ baseSha, headSha, trigger });
            console.log(JSON.stringify(results, null, 2));
            return;
        }
        case "status": {
            const since = getArg(args, "--since");
            const sinceHours = (0, index_1.parseDurationHours)(since);
            await (0, index_1.runStatus)(sinceHours);
            return;
        }
        default:
            throw new Error(`Unknown command: ${command}`);
    }
}
main().catch((error) => {
    console.error(error instanceof Error ? error.message : String(error));
    process.exitCode = 1;
});
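`status --since 24h` passes through `parseDurationHours`, whose source is not part of this excerpt. A hypothetical sketch of such a parser (the real one in `dist/src/index.js` may accept different units or defaults):

```javascript
// Hypothetical sketch of parsing "--since" values like "24h" or "7d" into
// hours, falling back to 24h when the flag is omitted.
function parseDurationHours(value, fallback = 24) {
  if (!value) return fallback;
  const match = /^(\d+)([hd])$/.exec(value.trim());
  if (!match) throw new Error(`Invalid duration: ${value}`);
  const amount = Number(match[1]);
  return match[2] === "d" ? amount * 24 : amount;
}

console.log(parseDurationHours("24h")); // 24
console.log(parseDurationHours("7d"));  // 168
console.log(parseDurationHours(undefined)); // 24
```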
package/dist/src/config/load.js
ADDED
@@ -0,0 +1,25 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.loadConfig = loadConfig;
const node_fs_1 = __importDefault(require("node:fs"));
const node_path_1 = __importDefault(require("node:path"));
const js_yaml_1 = __importDefault(require("js-yaml"));
const schema_1 = require("./schema");
function loadConfig(configPath = "docdrift.yaml") {
    const resolved = node_path_1.default.resolve(configPath);
    if (!node_fs_1.default.existsSync(resolved)) {
        throw new Error(`Config file not found: ${resolved}`);
    }
    const parsed = js_yaml_1.default.load(node_fs_1.default.readFileSync(resolved, "utf8"));
    const result = schema_1.docDriftConfigSchema.safeParse(parsed);
    if (!result.success) {
        const message = result.error.errors
            .map((e) => `${e.path.join(".") || "root"}: ${e.message}`)
            .join("\n");
        throw new Error(`Invalid config:\n${message}`);
    }
    return result.data;
}
package/dist/src/config/schema.js
ADDED
@@ -0,0 +1,55 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.docDriftConfigSchema = void 0;
const zod_1 = require("zod");
const pathRuleSchema = zod_1.z.object({
    match: zod_1.z.string().min(1),
    impacts: zod_1.z.array(zod_1.z.string().min(1)).min(1)
});
const openApiDetectSchema = zod_1.z.object({
    exportCmd: zod_1.z.string().min(1),
    generatedPath: zod_1.z.string().min(1),
    publishedPath: zod_1.z.string().min(1)
});
const docAreaSchema = zod_1.z.object({
    name: zod_1.z.string().min(1),
    mode: zod_1.z.enum(["autogen", "conceptual"]),
    owners: zod_1.z.object({
        reviewers: zod_1.z.array(zod_1.z.string().min(1)).min(1)
    }),
    detect: zod_1.z
        .object({
        openapi: openApiDetectSchema.optional(),
        paths: zod_1.z.array(pathRuleSchema).optional()
    })
        .refine((v) => Boolean(v.openapi) || Boolean(v.paths?.length), {
        message: "docArea.detect must include openapi or paths"
    }),
    patch: zod_1.z.object({
        targets: zod_1.z.array(zod_1.z.string().min(1)).optional(),
        requireHumanConfirmation: zod_1.z.boolean().optional().default(false)
    })
});
exports.docDriftConfigSchema = zod_1.z.object({
    version: zod_1.z.literal(1),
    devin: zod_1.z.object({
        apiVersion: zod_1.z.literal("v1"),
        unlisted: zod_1.z.boolean().default(true),
        maxAcuLimit: zod_1.z.number().int().positive().default(2),
        tags: zod_1.z.array(zod_1.z.string().min(1)).default(["docdrift"])
    }),
    policy: zod_1.z.object({
        prCaps: zod_1.z.object({
            maxPrsPerDay: zod_1.z.number().int().positive().default(1),
            maxFilesTouched: zod_1.z.number().int().positive().default(12)
        }),
        confidence: zod_1.z.object({
            autopatchThreshold: zod_1.z.number().min(0).max(1).default(0.8)
        }),
        allowlist: zod_1.z.array(zod_1.z.string().min(1)).min(1),
        verification: zod_1.z.object({
            commands: zod_1.z.array(zod_1.z.string().min(1)).min(1)
        })
    }),
    docAreas: zod_1.z.array(docAreaSchema).min(1)
});
package/dist/src/config/validate.js
ADDED
@@ -0,0 +1,36 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.validateRuntimeConfig = validateRuntimeConfig;
const exec_1 = require("../utils/exec");
function commandBinary(command) {
    return command.trim().split(/\s+/)[0] ?? "";
}
async function validateRuntimeConfig(config) {
    const errors = [];
    const warnings = [];
    if (config.policy.prCaps.maxFilesTouched < 1) {
        errors.push("policy.prCaps.maxFilesTouched must be >= 1");
    }
    const commandSet = new Set([
        ...config.policy.verification.commands,
        ...config.docAreas
            .map((area) => area.detect.openapi?.exportCmd)
            .filter((value) => Boolean(value))
    ]);
    for (const command of commandSet) {
        const binary = commandBinary(command);
        const result = await (0, exec_1.execCommand)(`command -v ${binary}`);
        if (result.exitCode !== 0) {
            errors.push(`Command not found for '${command}' (binary: ${binary})`);
        }
    }
    for (const area of config.docAreas) {
        if (area.mode === "autogen" && !area.patch.targets?.length) {
            warnings.push(`docArea '${area.name}' is autogen but has no patch.targets`);
        }
        if (area.mode === "conceptual" && !area.detect.paths?.length) {
            warnings.push(`docArea '${area.name}' is conceptual but has no detect.paths rules`);
        }
    }
    return { errors, warnings };
}
package/dist/src/detect/docsCheck.js
ADDED
@@ -0,0 +1,48 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.runDocsChecks = runDocsChecks;
const node_fs_1 = __importDefault(require("node:fs"));
const node_path_1 = __importDefault(require("node:path"));
const exec_1 = require("../utils/exec");
const fs_1 = require("../utils/fs");
async function runDocsChecks(commands, evidenceDir) {
    (0, fs_1.ensureDir)(evidenceDir);
    const logs = [];
    const commandResults = [];
    for (const [index, command] of commands.entries()) {
        const result = await (0, exec_1.execCommand)(command);
        const logPath = node_path_1.default.join(evidenceDir, `docs-check.${index + 1}.log`);
        node_fs_1.default.writeFileSync(logPath, [
            `$ ${command}`,
            `exitCode: ${result.exitCode}`,
            "\n--- stdout ---",
            result.stdout,
            "\n--- stderr ---",
            result.stderr
        ].join("\n"), "utf8");
        logs.push(logPath);
        commandResults.push({ command, exitCode: result.exitCode, logPath });
    }
    const failed = commandResults.filter((result) => result.exitCode !== 0);
    if (!failed.length) {
        return {
            logs,
            commandResults,
            summary: "Docs checks passed"
        };
    }
    return {
        logs,
        commandResults,
        summary: `Docs checks failed (${failed.length}/${commandResults.length})`,
        signal: {
            kind: "docs_check_failed",
            tier: 0,
            confidence: 0.99,
            evidence: failed.map((result) => result.logPath)
        }
    };
}
package/dist/src/detect/heuristics.js
ADDED
@@ -0,0 +1,44 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.detectHeuristicImpacts = detectHeuristicImpacts;
const node_fs_1 = __importDefault(require("node:fs"));
const node_path_1 = __importDefault(require("node:path"));
const glob_1 = require("../utils/glob");
function detectHeuristicImpacts(docArea, changedPaths, evidenceDir) {
    const rules = docArea.detect.paths ?? [];
    if (!rules.length) {
        return { impactedDocs: [], summary: "No heuristic rules configured", evidenceFiles: [] };
    }
    const matched = [];
    const impactedDocs = new Set();
    for (const rule of rules) {
        for (const changedPath of changedPaths) {
            if ((0, glob_1.matchesGlob)(rule.match, changedPath)) {
                matched.push({ rule: rule.match, path: changedPath, impacts: rule.impacts });
                rule.impacts.forEach((doc) => impactedDocs.add(doc));
            }
        }
    }
    if (!matched.length) {
        return { impactedDocs: [], summary: "No heuristic conceptual impacts", evidenceFiles: [] };
    }
    const evidencePath = node_path_1.default.join(evidenceDir, `${docArea.name}.heuristics.txt`);
    const body = matched
        .map((entry) => `${entry.path} matched ${entry.rule} -> ${entry.impacts.join(", ")}`)
        .join("\n");
    node_fs_1.default.writeFileSync(evidencePath, body, "utf8");
    return {
        impactedDocs: [...impactedDocs],
        evidenceFiles: [evidencePath],
        summary: `Heuristic impacts detected (${matched.length} matches)`,
        signal: {
            kind: "heuristic_path_impact",
            tier: 2,
            confidence: 0.67,
            evidence: [evidencePath]
        }
    };
}
package/dist/src/detect/index.js
ADDED
@@ -0,0 +1,92 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.buildDriftReport = buildDriftReport;
const node_path_1 = __importDefault(require("node:path"));
const fs_1 = require("../utils/fs");
const git_1 = require("../utils/git");
const docsCheck_1 = require("./docsCheck");
const heuristics_1 = require("./heuristics");
const openapi_1 = require("./openapi");
function defaultRecommendation(mode, signals) {
    if (!signals.length) {
        return "NOOP";
    }
    if (mode === "autogen") {
        return signals.some((s) => s.tier <= 1) ? "OPEN_PR" : "OPEN_ISSUE";
    }
    return "OPEN_ISSUE";
}
async function buildDriftReport(input) {
    const runInfo = {
        runId: `${Date.now()}`,
        repo: input.repo,
        baseSha: input.baseSha,
        headSha: input.headSha,
        trigger: input.trigger,
        timestamp: new Date().toISOString()
    };
    const evidenceRoot = node_path_1.default.resolve(".docdrift", "evidence", runInfo.runId);
    (0, fs_1.ensureDir)(evidenceRoot);
    const changedPaths = await (0, git_1.gitChangedPaths)(input.baseSha, input.headSha);
    const diffSummary = await (0, git_1.gitDiffSummary)(input.baseSha, input.headSha);
    const commits = await (0, git_1.gitCommitList)(input.baseSha, input.headSha);
    const docsCheck = await (0, docsCheck_1.runDocsChecks)(input.config.policy.verification.commands, evidenceRoot);
    const items = [];
    const checkSummaries = [docsCheck.summary];
    for (const docArea of input.config.docAreas) {
        const signals = [];
        const impactedDocs = new Set(docArea.patch.targets ?? []);
        const summaries = [];
        if (docsCheck.signal) {
            signals.push(docsCheck.signal);
            summaries.push(docsCheck.summary);
        }
        if (docArea.detect.openapi) {
            const openapiResult = await (0, openapi_1.detectOpenApiDrift)(docArea, evidenceRoot);
            if (openapiResult.signal) {
                signals.push(openapiResult.signal);
            }
            openapiResult.impactedDocs.forEach((doc) => impactedDocs.add(doc));
            summaries.push(openapiResult.summary);
        }
        if (docArea.detect.paths?.length) {
            const heuristicResult = (0, heuristics_1.detectHeuristicImpacts)(docArea, changedPaths, evidenceRoot);
            if (heuristicResult.signal) {
                signals.push(heuristicResult.signal);
            }
            heuristicResult.impactedDocs.forEach((doc) => impactedDocs.add(doc));
            summaries.push(heuristicResult.summary);
        }
        if (!signals.length) {
            continue;
        }
        items.push({
            docArea: docArea.name,
            mode: docArea.mode,
            signals,
            impactedDocs: [...impactedDocs],
            recommendedAction: defaultRecommendation(docArea.mode, signals),
            summary: summaries.filter(Boolean).join(" | ")
        });
    }
    const report = {
        run: {
            repo: input.repo,
            baseSha: input.baseSha,
            headSha: input.headSha,
            trigger: input.trigger,
            timestamp: runInfo.timestamp
        },
        items
    };
    (0, fs_1.writeJsonFile)(node_path_1.default.resolve(".docdrift", "drift_report.json"), report);
    (0, fs_1.writeJsonFile)(node_path_1.default.join(evidenceRoot, "changeset.json"), {
        changedPaths,
        diffSummary,
        commits
    });
    return { report, changedPaths, evidenceRoot, runInfo, checkSummaries };
}
package/dist/src/detect/openapi.js
ADDED
@@ -0,0 +1,123 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.detectOpenApiDrift = detectOpenApiDrift;
const node_fs_1 = __importDefault(require("node:fs"));
const node_path_1 = __importDefault(require("node:path"));
const exec_1 = require("../utils/exec");
const fs_1 = require("../utils/fs");
const json_1 = require("../utils/json");
function responseFields(spec) {
    const fields = new Set();
    const paths = spec?.paths ?? {};
    for (const [pathName, methods] of Object.entries(paths)) {
        for (const [method, methodDef] of Object.entries(methods)) {
            const schema = methodDef?.responses?.["200"]?.content?.["application/json"]?.schema;
            const properties = schema?.properties ?? {};
            for (const key of Object.keys(properties)) {
                fields.add(`${String(method).toUpperCase()} ${pathName}: ${key}`);
            }
        }
    }
    return fields;
}
function summarizeSpecDelta(previousSpec, currentSpec) {
    const previous = responseFields(previousSpec);
    const current = responseFields(currentSpec);
    const added = [...current].filter((item) => !previous.has(item)).sort();
    const removed = [...previous].filter((item) => !current.has(item)).sort();
    const lines = [];
    if (added.length) {
        lines.push(`Added response fields (${added.length}):`);
        lines.push(...added.map((value) => `+ ${value}`));
    }
    if (removed.length) {
        lines.push(`Removed response fields (${removed.length}):`);
        lines.push(...removed.map((value) => `- ${value}`));
    }
    if (!lines.length) {
        return "OpenAPI changed, but no top-level response field changes were detected in 200 responses.";
    }
    return lines.join("\n");
}
async function detectOpenApiDrift(docArea, evidenceDir) {
    if (!docArea.detect.openapi) {
        return { impactedDocs: [], evidenceFiles: [], summary: "No OpenAPI detector configured" };
    }
    (0, fs_1.ensureDir)(evidenceDir);
    const openapi = docArea.detect.openapi;
    const exportLogPath = node_path_1.default.join(evidenceDir, `${docArea.name}.openapi-export.log`);
    const exportResult = await (0, exec_1.execCommand)(openapi.exportCmd);
    node_fs_1.default.writeFileSync(exportLogPath, [
        `$ ${openapi.exportCmd}`,
        `exitCode: ${exportResult.exitCode}`,
        "\n--- stdout ---",
        exportResult.stdout,
        "\n--- stderr ---",
        exportResult.stderr
    ].join("\n"), "utf8");
    if (exportResult.exitCode !== 0) {
        return {
            impactedDocs: [openapi.publishedPath],
            evidenceFiles: [exportLogPath],
            summary: "OpenAPI export command failed",
            signal: {
                kind: "weak_evidence",
                tier: 2,
                confidence: 0.35,
                evidence: [exportLogPath]
            }
        };
    }
    if (!node_fs_1.default.existsSync(openapi.generatedPath) || !node_fs_1.default.existsSync(openapi.publishedPath)) {
        return {
            impactedDocs: [openapi.generatedPath, openapi.publishedPath],
            evidenceFiles: [exportLogPath],
            summary: "OpenAPI file(s) missing",
            signal: {
                kind: "weak_evidence",
                tier: 2,
                confidence: 0.35,
                evidence: [exportLogPath]
            }
        };
    }
    const generatedRaw = node_fs_1.default.readFileSync(openapi.generatedPath, "utf8");
    const publishedRaw = node_fs_1.default.readFileSync(openapi.publishedPath, "utf8");
    const generatedJson = JSON.parse(generatedRaw);
    const publishedJson = JSON.parse(publishedRaw);
    const normalizedGenerated = (0, json_1.stableStringify)(generatedJson);
    const normalizedPublished = (0, json_1.stableStringify)(publishedJson);
    if (normalizedGenerated === normalizedPublished) {
        return {
            impactedDocs: [openapi.publishedPath],
            evidenceFiles: [exportLogPath],
            summary: "No OpenAPI drift detected"
        };
    }
    const summary = summarizeSpecDelta(publishedJson, generatedJson);
    const diffPath = node_path_1.default.join(evidenceDir, `${docArea.name}.openapi.diff.txt`);
    node_fs_1.default.writeFileSync(diffPath, [
        "# OpenAPI Drift Summary",
        summary,
        "",
        "# Published (normalized)",
        normalizedPublished,
        "",
        "# Generated (normalized)",
        normalizedGenerated
    ].join("\n"), "utf8");
    return {
        impactedDocs: [...new Set([openapi.publishedPath, ...(docArea.patch.targets ?? [])])],
        evidenceFiles: [exportLogPath, diffPath],
        summary,
        signal: {
            kind: "openapi_diff",
            tier: 1,
            confidence: 0.95,
            evidence: [diffPath]
        }
    };
}
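`detectOpenApiDrift` compares specs via `stableStringify` from `../utils/json`, whose source is not in this excerpt. A minimal sketch of a key-sorted stringify with the property the comparison relies on (two semantically equal objects serialize identically); the actual `dist/src/utils/json.js` implementation may differ in details:

```javascript
// Minimal sketch of a stable stringify: recursively sort object keys so
// key order in the source files cannot cause false drift.
function stableStringify(value) {
  if (Array.isArray(value)) {
    return "[" + value.map(stableStringify).join(",") + "]";
  }
  if (value && typeof value === "object") {
    const keys = Object.keys(value).sort();
    return (
      "{" +
      keys.map((k) => JSON.stringify(k) + ":" + stableStringify(value[k])).join(",") +
      "}"
    );
  }
  return JSON.stringify(value);
}

console.log(stableStringify({ b: 1, a: 2 }) === stableStringify({ a: 2, b: 1 })); // true
```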