openapi-ai-generator 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 openapi-ai-generator contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,579 @@
# openapi-ai-generator

Automatically generate an up-to-date **OpenAPI 3.1 spec** for your Next.js App Router API routes using an LLM. Runs at build time — no manual spec maintenance required.

[![npm version](https://img.shields.io/npm/v/openapi-ai-generator)](https://www.npmjs.com/package/openapi-ai-generator)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Node.js](https://img.shields.io/badge/node-%3E%3D18-brightgreen)](https://nodejs.org)
[![TypeScript](https://img.shields.io/badge/TypeScript-5.x-blue)](https://www.typescriptlang.org/)

---

## How It Works

1. **Scans** your project for Next.js `route.ts` files matching your `include` glob patterns
2. **Analyzes** each route by sending its source code and JSDoc to your chosen LLM provider
3. **Caches** results by SHA-256 hash — unchanged routes are never re-analyzed
4. **Generates** a valid OpenAPI 3.1 `spec.json` and a Next.js `route.ts` that serves it at `/api/openapi.json`
5. Optionally generates a [Scalar](https://scalar.com) interactive docs UI at `/api/docs`

```
your routes ──► LLM analysis ──► OpenAPI 3.1 spec ──► /api/openapi.json
                      ▲
                      │
                SHA-256 cache
           (skip unchanged files)
```

---

## Table of Contents

- [Requirements](#requirements)
- [Installation](#installation)
- [Quick Start](#quick-start)
- [Configuration](#configuration)
- [Providers](#providers)
- [CLI Usage](#cli-usage)
- [Next.js Plugin](#nextjs-plugin)
- [Programmatic API](#programmatic-api)
- [JSDoc Integration](#jsdoc-integration)
- [Caching](#caching)
- [Output Files](#output-files)
- [Scalar Docs UI](#scalar-docs-ui)
- [Path Conversion](#path-conversion)
- [Troubleshooting](#troubleshooting)
- [Contributing](#contributing)
- [Security](#security)
- [License](#license)

---

## Requirements

- Node.js 18+
- Next.js 13+ (App Router)
- An API key for one of: Azure OpenAI, OpenAI, or Anthropic

---

## Installation

```bash
npm install --save-dev openapi-ai-generator
# or
pnpm add -D openapi-ai-generator
# or
yarn add -D openapi-ai-generator
```

---

## Quick Start

### 1. Create a config file

```typescript
// openapi-gen.config.ts
import type { OpenAPIGenConfig } from 'openapi-ai-generator';

export default {
  provider: 'openai',
  output: {
    specPath: 'src/app/api/openapi.json/route.ts',
  },
  openapi: {
    title: 'My API',
    version: '1.0.0',
  },
} satisfies OpenAPIGenConfig;
```

### 2. Set your API key

```bash
export OPENAI_API_KEY=sk-...
```

### 3. Run the generator

```bash
npx openapi-ai-generator generate
```

### 4. Start your Next.js app and visit `/api/openapi.json`

Your live OpenAPI spec will be served automatically.

---

## Configuration

Create `openapi-gen.config.ts` (or `.js`, `.mjs`, `.cjs`) at your project root.

```typescript
import type { OpenAPIGenConfig } from 'openapi-ai-generator';

export default {
  // Required: which LLM provider to use
  provider: 'azure', // 'azure' | 'openai' | 'anthropic'

  // Output file paths (relative to project root)
  output: {
    specPath: 'src/app/api/openapi.json/route.ts', // serves GET /api/openapi.json
    scalarDocs: false, // true = also generate Scalar UI
    scalarPath: 'src/app/api/docs/route.ts', // serves GET /api/docs
  },

  // OpenAPI info object
  openapi: {
    title: 'My API',
    version: '1.0.0',
    description: 'Auto-generated API documentation',
    servers: [
      { url: 'https://api.example.com', description: 'Production' },
      { url: 'http://localhost:3000', description: 'Development' },
    ],
  },

  // How JSDoc comments are used during analysis
  jsdocMode: 'context', // 'context' | 'exact' — see JSDoc Integration below

  // SHA-256 content-based caching
  cache: true,
  cacheDir: '.openapi-cache',

  // Which route files to include/exclude
  include: ['src/app/api/**/route.ts'],
  exclude: [
    'src/app/api/openapi.json/route.ts',
    'src/app/api/docs/route.ts',
  ],
} satisfies OpenAPIGenConfig;
```

### Config Reference

| Option | Type | Default | Description |
|--------|------|---------|-------------|
| `provider` | `'azure' \| 'openai' \| 'anthropic'` | — | **Required.** LLM provider to use |
| `output.specPath` | `string` | — | **Required.** Path to the generated spec route file |
| `output.scalarDocs` | `boolean` | `false` | Generate Scalar interactive docs UI |
| `output.scalarPath` | `string` | `'src/app/api/docs/route.ts'` | Path for the Scalar docs route |
| `openapi.title` | `string` | — | **Required.** API title in the spec |
| `openapi.version` | `string` | — | **Required.** API version in the spec |
| `openapi.description` | `string` | `''` | API description |
| `openapi.servers` | `array` | `[]` | Server objects for the spec |
| `jsdocMode` | `'context' \| 'exact'` | `'context'` | How to use JSDoc (see below) |
| `cache` | `boolean` | `true` | Enable/disable content-hash caching |
| `cacheDir` | `string` | `'.openapi-cache'` | Directory for cache files |
| `include` | `string[]` | `['src/app/api/**/route.ts']` | Glob patterns for routes to analyze |
| `exclude` | `string[]` | `[]` | Glob patterns for routes to skip |

---

## Providers

### OpenAI

```bash
export OPENAI_API_KEY=sk-...
export OPENAI_MODEL=gpt-4o   # optional, default: gpt-4o
```

```typescript
// openapi-gen.config.ts
export default { provider: 'openai', ... } satisfies OpenAPIGenConfig;
```

### Anthropic

```bash
export ANTHROPIC_API_KEY=sk-ant-...
export ANTHROPIC_MODEL=claude-sonnet-4-6   # optional, default: claude-sonnet-4-6
```

```typescript
export default { provider: 'anthropic', ... } satisfies OpenAPIGenConfig;
```

### Azure OpenAI

```bash
export AZURE_OPENAI_ENDPOINT=https://my-resource.openai.azure.com
export AZURE_OPENAI_API_KEY=...
export AZURE_OPENAI_DEPLOYMENT=gpt-4o
```

```typescript
export default { provider: 'azure', ... } satisfies OpenAPIGenConfig;
```

### Environment Variable Reference

| Variable | Provider | Required | Default |
|----------|----------|----------|---------|
| `OPENAI_API_KEY` | openai | Yes | — |
| `OPENAI_MODEL` | openai | No | `gpt-4o` |
| `ANTHROPIC_API_KEY` | anthropic | Yes | — |
| `ANTHROPIC_MODEL` | anthropic | No | `claude-sonnet-4-6` |
| `AZURE_OPENAI_ENDPOINT` | azure | Yes | — |
| `AZURE_OPENAI_API_KEY` | azure | Yes | — |
| `AZURE_OPENAI_DEPLOYMENT` | azure | Yes | — |

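
As a sketch of what credential resolution against this table could look like, the helper below reads the variables and applies the documented defaults. Note that `requireEnv` and `resolveProviderEnv` are illustrative names, not part of this package's API.

```typescript
// Illustrative only: fail fast on a missing required variable.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Resolve credentials and model per provider, with the documented defaults.
function resolveProviderEnv(provider: 'azure' | 'openai' | 'anthropic') {
  switch (provider) {
    case 'openai':
      return {
        apiKey: requireEnv('OPENAI_API_KEY'),
        model: process.env.OPENAI_MODEL ?? 'gpt-4o',
      };
    case 'anthropic':
      return {
        apiKey: requireEnv('ANTHROPIC_API_KEY'),
        model: process.env.ANTHROPIC_MODEL ?? 'claude-sonnet-4-6',
      };
    case 'azure':
      return {
        endpoint: requireEnv('AZURE_OPENAI_ENDPOINT'),
        apiKey: requireEnv('AZURE_OPENAI_API_KEY'),
        deployment: requireEnv('AZURE_OPENAI_DEPLOYMENT'),
      };
  }
}
```

Failing early like this surfaces a misconfigured CI environment before any LLM call is attempted.
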

---

## CLI Usage

```bash
# Basic — reads openapi-gen.config.ts from project root
npx openapi-ai-generator generate

# Custom config path
npx openapi-ai-generator generate --config ./config/openapi-gen.config.ts

# Override provider without editing config
npx openapi-ai-generator generate --provider anthropic

# Bypass cache — force re-analyze every route
npx openapi-ai-generator generate --no-cache
```

### CLI Options

| Flag | Short | Description |
|------|-------|-------------|
| `--config <path>` | `-c` | Path to config file |
| `--provider <name>` | `-p` | Override provider (`azure`, `openai`, `anthropic`) |
| `--no-cache` | | Force re-analysis of all routes, ignoring cache |
| `--version` | | Print version |
| `--help` | `-h` | Show help |

### Example Output

```
[openapi-ai-generator] Scanning routes...
[openapi-ai-generator] Found 8 route(s)
[openapi-ai-generator] Analyzing routes with provider: openai
[openapi-ai-generator] 8 routes analyzed (6 from cache, 0 exact JSDoc)

✓ OpenAPI spec generated successfully
  Routes analyzed: 8
  From cache: 6
  LLM calls made: 2
  Spec written to: src/app/api/openapi.json/route.ts
```

---

## Next.js Plugin

The plugin automatically runs the generator before every `next build`, keeping your spec in sync without a separate CI step.

```typescript
// next.config.ts
import type { NextConfig } from 'next';
import { withOpenAPIGen } from 'openapi-ai-generator/plugin';

const nextConfig: NextConfig = {
  // ... your existing Next.js config
};

export default withOpenAPIGen(nextConfig);
```

You can pass generator options as a second argument to override config file settings:

```typescript
export default withOpenAPIGen(nextConfig, {
  provider: 'openai',
  cache: true,
});
```

> **Note:** The plugin hooks into webpack's `beforeRun` hook. It runs once per build, not once per compilation chunk.

---

## Programmatic API

```typescript
import { generate, loadConfig, scanRoutes, analyzeRoutes, assembleSpec } from 'openapi-ai-generator';

// High-level: run the full pipeline
const result = await generate({
  config: './openapi-gen.config.ts', // optional config path
  provider: 'openai',                // optional override
  cache: true,                       // optional override
  cwd: process.cwd(),                // optional working directory
});

console.log(result.routesAnalyzed);   // number of routes processed
console.log(result.routesFromCache);  // routes served from cache
console.log(result.routesSkippedLLM); // routes that skipped LLM (cache + exact JSDoc)
console.log(result.specPath);         // output file path

// Low-level: use individual pipeline stages
const config = await loadConfig('./openapi-gen.config.ts');
const routes = await scanRoutes(config.include, config.exclude);
const analyzed = await analyzeRoutes(routes, {
  provider: config.provider,
  jsdocMode: config.jsdocMode,
  cache: config.cache,
  cacheDir: config.cacheDir,
});
const spec = assembleSpec(config, analyzed);
```

### Exported Types

```typescript
import type {
  OpenAPIGenConfig, // raw config shape (for your config file)
  ResolvedConfig,   // fully resolved config with all defaults applied
  Provider,         // 'azure' | 'openai' | 'anthropic'
  JSDocMode,        // 'context' | 'exact'
  GenerateOptions,  // options for generate()
  GenerateResult,   // return value of generate()
} from 'openapi-ai-generator';
```

---

## JSDoc Integration

JSDoc comments in your route files are automatically extracted and passed to the LLM as context. You can also use them to skip the LLM entirely.

### `jsdocMode: 'context'` (default)

JSDoc comments are included in the LLM prompt as supplementary context. The LLM uses them alongside the source code to produce better descriptions, parameter docs, and response schemas.

```typescript
/**
 * Returns a paginated list of users.
 * Supports filtering by role and status.
 *
 * @param role - Filter users by role: 'admin' | 'user'
 * @param status - Filter by account status: 'active' | 'suspended'
 */
export async function GET(request: NextRequest) {
  // ...
}
```

### `jsdocMode: 'exact'` (config-level)

When set globally, all routes with a valid `@openapi` JSON tag will use it verbatim — no LLM call is made for those routes.

### `@openapi-exact` (per-route override)

Add `@openapi-exact` to a specific JSDoc block to skip LLM analysis for that route only, regardless of the global `jsdocMode`. The `@openapi` tag value must be a valid OpenAPI 3.1 `PathItem` JSON object.

```typescript
/**
 * @openapi-exact
 * @openapi {
 *   "get": {
 *     "operationId": "getUsers",
 *     "summary": "List users",
 *     "tags": ["Users"],
 *     "parameters": [
 *       {
 *         "name": "page",
 *         "in": "query",
 *         "schema": { "type": "integer", "default": 1 }
 *       }
 *     ],
 *     "responses": {
 *       "200": {
 *         "description": "Successful response",
 *         "content": {
 *           "application/json": {
 *             "schema": {
 *               "type": "array",
 *               "items": { "$ref": "#/components/schemas/User" }
 *             }
 *           }
 *         }
 *       }
 *     }
 *   }
 * }
 */
export async function GET(request: NextRequest) {
  // ...
}
```

> If the `@openapi` JSON fails to parse, the route falls back to LLM analysis automatically.
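
The parse-or-fallback step can be sketched as follows. This is a simplified illustration that assumes the leading `*` markers have already been stripped from the JSDoc text; `tryParseOpenAPITag` is a hypothetical helper, not a package export.

```typescript
// Illustrative sketch: extract the JSON payload of an @openapi tag and
// parse it. A null return signals "fall back to LLM analysis".
// Assumes the JSDoc text has already had its leading `*` markers stripped.
function tryParseOpenAPITag(jsdocText: string): Record<string, unknown> | null {
  const match = jsdocText.match(/@openapi\s+(\{[\s\S]*\})/);
  if (!match) return null; // no @openapi tag present
  try {
    return JSON.parse(match[1]) as Record<string, unknown>;
  } catch {
    return null; // invalid JSON: caller falls back to the LLM
  }
}
```

Returning `null` rather than throwing keeps a single malformed tag from failing the whole generation run.
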

---

## Caching

The cache stores LLM responses in `{cacheDir}/{hash}.json`. The hash is computed from:

```
SHA-256(fileContent + providerName + modelId)
```

This means the cache is automatically invalidated when:

- The route file's source code changes
- You switch to a different provider
- You switch to a different model
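
The documented key derivation can be reproduced with Node's built-in crypto module. This is an illustrative sketch; the exact concatenation order and any internal delimiters are assumptions.

```typescript
import { createHash } from 'node:crypto';

// Sketch of the documented cache key: SHA-256 over file content,
// provider name, and model id. Any change to any input yields a new key.
function cacheKey(fileContent: string, providerName: string, modelId: string): string {
  return createHash('sha256')
    .update(fileContent)
    .update(providerName)
    .update(modelId)
    .digest('hex');
}
```

The resulting 64-character hex digest becomes the `{hash}` part of the `{cacheDir}/{hash}.json` filename, so identical inputs always hit the same cache file.
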

### Cache location

By default: `.openapi-cache/` in your project root.

Add it to `.gitignore` (see below) or commit it to speed up CI builds.

### Clearing the cache

```bash
rm -rf .openapi-cache
```

Or run with `--no-cache` to bypass it for a single run without deleting any files.

---

## Output Files

### `spec.json` + `route.ts`

The generator creates two co-located files in the directory containing `output.specPath`:

**`spec.json`** — the full OpenAPI 3.1 spec:

```json
{
  "openapi": "3.1.0",
  "info": {
    "title": "My API",
    "version": "1.0.0"
  },
  "paths": {
    "/api/users": { ... },
    "/api/users/{id}": { ... }
  }
}
```

**`route.ts`** — a static Next.js route that serves it:

```typescript
import spec from './spec.json';

export const dynamic = 'force-static';

export function GET() {
  return Response.json(spec);
}
```

Both files are **generated artifacts** — do not edit them manually. Add them to `.gitignore` or commit them, depending on your workflow.

---

473
+
474
+ ## Scalar Docs UI
475
+
476
+ Set `output.scalarDocs: true` to also generate an interactive API docs page powered by [Scalar](https://scalar.com).
477
+
478
+ ```typescript
479
+ // openapi-gen.config.ts
480
+ export default {
481
+ output: {
482
+ specPath: 'src/app/api/openapi.json/route.ts',
483
+ scalarDocs: true,
484
+ scalarPath: 'src/app/api/docs/route.ts', // serves GET /api/docs
485
+ },
486
+ ...
487
+ } satisfies OpenAPIGenConfig;
488
+ ```
489
+
490
+ Once generated, visit `/api/docs` in your running Next.js app to see the full interactive documentation.
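
As a rough idea of what such a docs route can look like, the sketch below serves an HTML shell that loads Scalar's standalone browser bundle from a CDN and points it at the spec URL. This is an assumption about the generated output, not the package's actual emitted code.

```typescript
// Sketch only, not the exact code this package emits: an HTML page that
// loads Scalar's standalone bundle and points it at the generated spec.
export function renderDocsHtml(specUrl: string): string {
  return `<!doctype html>
<html>
  <head>
    <meta charset="utf-8" />
    <title>API Reference</title>
  </head>
  <body>
    <script id="api-reference" data-url="${specUrl}"></script>
    <script src="https://cdn.jsdelivr.net/npm/@scalar/api-reference"></script>
  </body>
</html>`;
}

// A Next.js route handler serving that page (Response is global in Node 18+).
export function GET(): Response {
  return new Response(renderDocsHtml('/api/openapi.json'), {
    headers: { 'Content-Type': 'text/html' },
  });
}
```

Because the page fetches the spec from `/api/openapi.json` at load time, the docs UI always reflects the most recently generated spec.
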

---

## Path Conversion

Next.js file-system routing is automatically converted to OpenAPI path syntax:

| File path | OpenAPI path |
|-----------|--------------|
| `src/app/api/users/route.ts` | `/api/users` |
| `src/app/api/users/[id]/route.ts` | `/api/users/{id}` |
| `src/app/api/posts/[postId]/comments/[commentId]/route.ts` | `/api/posts/{postId}/comments/{commentId}` |
| `src/app/api/files/[...path]/route.ts` | `/api/files/{path}` |
| `app/api/health/route.ts` | `/api/health` |
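
The table's mapping can be reproduced with a few regex passes. This is an illustrative sketch, not the package's actual implementation.

```typescript
// Sketch: convert a Next.js App Router file path to an OpenAPI path.
// Handles dynamic segments [id] and catch-alls [...path] as in the table above.
function toOpenAPIPath(filePath: string): string {
  return filePath
    .replace(/^(src\/)?app/, '')            // drop the src/app (or app) prefix
    .replace(/\/route\.tsx?$/, '')          // drop the trailing /route.ts
    .replace(/\[\.\.\.([^\]]+)\]/g, '{$1}') // catch-all [...path] -> {path}
    .replace(/\[([^\]]+)\]/g, '{$1}');      // dynamic [id] -> {id}
}
```

Note that the catch-all replacement must run before the plain dynamic one, since `[...path]` would otherwise be captured (dots and all) by the `[id]` pattern.
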

---

## Recommended `.gitignore` Additions

```gitignore
# openapi-ai-generator cache (or remove this line to commit cache for faster CI)
.openapi-cache/

# Generated spec files (optional — depends on your workflow)
# src/app/api/openapi.json/
# src/app/api/docs/
```

---

## Recommended `package.json` Scripts

```json
{
  "scripts": {
    "generate:openapi": "openapi-ai-generator generate",
    "build": "openapi-ai-generator generate && next build"
  }
}
```

---

## Troubleshooting

### "No openapi-gen.config.ts found"

Ensure your config file is at the project root (where you run the CLI) and is named `openapi-gen.config.ts`, `.js`, `.mjs`, or `.cjs`.

### "Cannot load TypeScript config file"

The CLI loads `.ts` config files using `tsx` or `ts-node`. Install one of them:

```bash
npm install --save-dev tsx
```

### LLM returns invalid JSON

The generator logs a warning and skips the route (producing an empty PathItem). Try:

- Using `@openapi-exact` with hand-written JSON for that route
- Running with `--no-cache` and retrying
- Switching to a more capable model

### Routes appear in the spec but have no operations

The LLM likely couldn't infer enough from the source. Add JSDoc comments to the route to give it more context.

### Cache not working as expected

The cache is keyed on `fileContent + provider + modelId`. If you change the relevant env vars (e.g. `OPENAI_MODEL`), the cache is invalidated automatically. If you suspect corruption, delete `.openapi-cache/` and rerun.

---

## Contributing

We welcome contributions! See [CONTRIBUTING.md](./CONTRIBUTING.md) for guidelines.

---

## Security

Please review our [Security Policy](./SECURITY.md) before reporting vulnerabilities.

---

## License

MIT — see [LICENSE](./LICENSE).