nlang-cli 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md ADDED
@@ -0,0 +1,219 @@
+ # nlang — Executable Extensions
+
+ A build system where **file extensions define the build pipeline**. Files with double extensions (`.html.md`, `.json.js`, `.css.md`) are executed automatically: markdown files are sent to an LLM, and JavaScript/TypeScript files are run in Node.js.
+
+ ## Install
+
+ ```bash
+ npm install -g nlang
+ ```
+
+ ## Quick Start
+
+ 1. Create a file with a double extension:
+
+ ```markdown
+ <!-- index.html.md -->
+
+ ---
+ model: gpt-4o-mini
+ ---
+
+ Create a simple landing page for a developer portfolio.
+ Include a hero section, about section, and contact form.
+ Use modern CSS with a dark theme.
+ ```
+
+ 2. Generate the GitHub Action:
+
+ ```bash
+ nlang init
+ ```
+
+ 3. Set your API key in GitHub repo Settings → Secrets → `OPENAI_API_KEY`.
+
+ 4. Push — your files will be built automatically!
+
+ ## How It Works
+
+ ### Double Extensions
+
+ The **last extension** determines the executor; everything before it is the output format:
+
+ | Source File      | Executor             | Output        |
+ | ---------------- | -------------------- | ------------- |
+ | `index.html.md`  | LLM prompt           | `index.html`  |
+ | `styles.css.md`  | LLM prompt           | `styles.css`  |
+ | `data.json.js`   | Node.js              | `data.json`   |
+ | `readme.md.md`   | LLM prompt           | `readme.md`   |
+ | `sitemap.xml.ts` | Node.js (TypeScript) | `sitemap.xml` |
+
+ ### Markdown Executor (LLM)
+
+ Markdown files are sent as prompts to an LLM. Configure with **frontmatter**:
+
+ ```markdown
+ ---
+ model: gpt-4o
+ temperature: 0.7
+ max_tokens: 8192
+ system: "You are an expert web developer."
+ cacheTtl: 86400
+ ---
+
+ Your prompt here...
+ ```
+
+ ### JavaScript/TypeScript Executor
+
+ JS/TS files export a function that returns the output:
+
+ ```javascript
+ // data.json.js
+ export default async function (ctx) {
+   const response = await fetch("https://api.example.com/data");
+   const data = await response.json();
+   return JSON.stringify(data, null, 2);
+ }
+ ```
+
+ The `ctx` object contains:
+
+ - `deps` — resolved dependency contents
+ - `variables` — variable values for this variant
+ - `config` — merged nlang.json configuration
+ - `rootDir` — project root path
+ - `env` — environment variables
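As a concrete sketch of this contract (the file and dependency names here are hypothetical; in a real `.json.js` file the function would be the default export):

```javascript
// Sketch of an executor function consuming the ctx fields listed above.
// In a real feed.json.js this would be `export default async function (ctx) {...}`.
async function buildFeed(ctx) {
  // ctx.deps holds the resolved text of @{...} dependencies
  const config = JSON.parse(ctx.deps["site-config.json"] ?? "{}");
  return JSON.stringify(
    {
      site: config.title ?? "untitled",
      variant: ctx.variables.name ?? null // set when built from a [name] template
    },
    null,
    2
  );
}
```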
+
+ ### Dependencies with `@{path}`
+
+ Reference other files in your prompts:
+
+ ```markdown
+ <!-- components.html.md -->
+
+ Create HTML components following this design system:
+
+ @{design-tokens.json}
+
+ And matching these TypeScript types:
+
+ @{src/types.ts}
+ ```
+
+ Files are built in dependency order. If `design-tokens.json` is itself generated (e.g., from `design-tokens.json.md`), it will be built first.
+
+ You can also reference URLs:
+
+ ```markdown
+ @{https://raw.githubusercontent.com/user/repo/main/schema.json}
+ ```
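The reference syntax is simple to tokenize; a sketch (not nlang's actual parser) of pulling `@{...}` targets out of a prompt:

```javascript
// Collect @{...} reference targets from a prompt string (illustrative only)
function extractRefs(text) {
  return [...text.matchAll(/@\{([^}]+)\}/g)].map((m) => m[1]);
}
```

File paths and URLs come back from the same pattern; the builder then decides whether to read or fetch each one.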
+
+ ### Variables with `[name]`
+
+ Use bracket syntax in paths for templated builds:
+
+ ```
+ blog/
+   name.json        # ["hello-world", "getting-started", "advanced-tips"]
+   [name].html.md   # Template that uses [name] in the prompt
+ ```
+
+ The `[name].html.md` file will be executed once for each value in `name.json`, producing:
+
+ - `blog/hello-world.html`
+ - `blog/getting-started.html`
+ - `blog/advanced-tips.html`
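The expansion itself is just string substitution over the template path; a minimal sketch:

```javascript
// Expand a [name] template path once per value (illustrative only)
function expandPath(template, varName, values) {
  return values.map((v) => template.replaceAll(`[${varName}]`, v));
}
```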
+
+ ### Cron Schedules
+
+ Add a `trigger` to frontmatter for scheduled rebuilds:
+
+ ```markdown
+ ---
+ trigger: "0 */6 * * *"
+ ---
+
+ Fetch the latest news and generate an HTML summary...
+ ```
+
+ When you run `nlang init`, this is picked up and added to the GitHub Action schedule.
+
+ ### Configuration: `nlang.json`
+
+ Place `nlang.json` in any directory. More specific configs override parent configs:
+
+ ```json
+ {
+   "model": "gpt-4o",
+   "temperature": 0,
+   "cacheTtl": 3600,
+   "baseURL": "https://api.openai.com/v1",
+   "system": "You are a helpful assistant."
+ }
+ ```
+
+ Config resolution order (later wins):
+
+ 1. `~/.nlang` (global)
+ 2. `./nlang.json` (project root)
+ 3. `./subdir/nlang.json` (closer to file)
+ 4. File frontmatter (highest priority)
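The merge is effectively a shallow override in that order; a sketch (assuming a shallow key-by-key merge):

```javascript
// Later layers win key-by-key (illustrative only)
function resolveConfig(...layers) {
  return Object.assign({}, ...layers);
}
```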
+
+ ### Caching
+
+ LLM responses are cached by content hash with configurable TTL:
+
+ - Default TTL: **1 hour** (3600s)
+ - When MCP is enabled: **no cache** by default
+ - Override with `cacheTtl` in frontmatter or `nlang.json`
+ - Set `cacheTtl: 0` to disable caching
+ - Set `cacheTtl: -1` for infinite cache (only invalidated by content changes)
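The TTL rules above can be summarized in a few lines (a sketch of the semantics, not the package's internals):

```javascript
// Freshness per the TTL rules: 0 disables caching, negative means never expire
function isFresh(entry, now) {
  if (entry.ttl === 0) return false; // caching disabled
  if (entry.ttl < 0) return true;    // infinite cache
  return now - entry.timestamp <= entry.ttl * 1000;
}
```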
+
+ ## CLI
+
+ ```bash
+ # Generate GitHub Action workflow
+ nlang init
+
+ # Build all executable files
+ nlang build
+
+ # Build a specific file and its dependency chain
+ nlang build --file blog/[name].html.md
+
+ # Dry run — show execution plan without running
+ nlang build --dry-run
+
+ # Specify project directory
+ nlang build -d /path/to/project
+ ```
+
+ ## Environment Variables
+
+ | Variable         | Description                                      |
+ | ---------------- | ------------------------------------------------ |
+ | `OPENAI_API_KEY` | OpenAI API key                                   |
+ | `LLM_API_KEY`    | Alternative API key (for OpenAI-compatible APIs) |
+
+ ## Example Project
+
+ ```
+ my-site/
+ ├── nlang.json            # {"model": "gpt-4o-mini"}
+ ├── index.html.md         # Landing page prompt
+ ├── styles.css.md         # CSS prompt (references index.html.md output)
+ ├── blog/
+ │   ├── name.json         # ["intro", "tutorial"]
+ │   ├── [name].html.md    # Blog post template
+ │   └── index.html.js     # Blog index (reads generated posts)
+ ├── data/
+ │   └── api-data.json.ts  # Fetches and transforms API data
+ └── dist/                 # ← Build output (auto-generated)
+     ├── index.html
+     ├── styles.css
+     ├── blog/
+     │   ├── intro.html
+     │   ├── tutorial.html
+     │   └── index.html
+     └── data/
+         └── api-data.json
+ ```
package/bin/nlang.js ADDED
@@ -0,0 +1,46 @@
+ #!/usr/bin/env node
+
+ import { parseArgs } from "node:util";
+ import { resolve } from "node:path";
+ import { initWorkflow } from "../src/init.js";
+ import { build } from "../src/build.js";
+
+ const { values, positionals } = parseArgs({
+   allowPositionals: true,
+   options: {
+     dir: { type: "string", short: "d", default: "." },
+     help: { type: "boolean", short: "h", default: false },
+     "dry-run": { type: "boolean", default: false },
+     file: { type: "string", short: "f" }
+   }
+ });
+
+ const command = positionals[0];
+
+ if (values.help || !command) {
+   console.log(`
+ nlang - Executable Extensions
+
+ Commands:
+   init     Scan repo and generate .github/workflows/nlang.yml
+   build    Execute all double-extension files in dependency order
+
+ Options:
+   -d, --dir <path>    Root directory (default: .)
+   -f, --file <path>   Build only a specific file (and its dependencies)
+   --dry-run           Show what would be executed without running
+   -h, --help          Show this help
+ `);
+   process.exit(0);
+ }
+
+ const rootDir = resolve(values.dir);
+
+ if (command === "init") {
+   await initWorkflow(rootDir);
+ } else if (command === "build") {
+   await build(rootDir, { dryRun: values["dry-run"], file: values.file });
+ } else {
+   console.error(`Unknown command: ${command}`);
+   process.exit(1);
+ }
package/package.json ADDED
@@ -0,0 +1,22 @@
+ {
+   "name": "nlang-cli",
+   "version": "0.1.0",
+   "description": "Executable Extensions - Build system for double-extension files",
+   "type": "module",
+   "bin": {
+     "nlang": "./bin/nlang.js"
+   },
+   "scripts": {
+     "build": "node src/build.js",
+     "init": "node bin/nlang.js init",
+     "test": "node --test src/__tests__/"
+   },
+   "dependencies": {
+     "gray-matter": "^4.0.3",
+     "glob": "^10.3.10",
+     "openai": "^4.73.0"
+   },
+   "engines": {
+     "node": ">=18"
+   }
+ }
package/src/build.js ADDED
@@ -0,0 +1,286 @@
+ import { scanFiles, loadConfig } from "./scanner.js";
+ import { buildExecutionGraph, getSubgraph } from "./graph.js";
+ import { loadCache, saveCache, computeHash, isCacheValid } from "./cache.js";
+ import { executeMarkdown } from "./executors/markdown.js";
+ import { executeJavaScript } from "./executors/javascript.js";
+ import { readFile, writeFile, mkdir } from "node:fs/promises";
+ import { join, dirname } from "node:path";
+
+ /**
+  * @param {string} rootDir
+  * @param {{ dryRun?: boolean, file?: string }} opts
+  */
+ export async function build(rootDir, opts = {}) {
+   console.log(`\n🔨 nlang build starting in ${rootDir}\n`);
+
+   // 1. Scan for executable files
+   const allFiles = await scanFiles(rootDir);
+   console.log(`📂 Found ${allFiles.length} executable file(s)\n`);
+
+   if (allFiles.length === 0) {
+     console.log("Nothing to build.");
+     return;
+   }
+
+   // 2. If targeting a specific file, get its subgraph
+   let files = allFiles;
+   if (opts.file) {
+     files = getSubgraph(allFiles, opts.file);
+     console.log(
+       `🎯 Building subgraph for ${opts.file}: ${files.length} file(s)\n`
+     );
+   }
+
+   // 3. Expand variables — creates virtual file entries for each variable value
+   files = await expandVariables(files, rootDir);
+
+   // 4. Build execution graph
+   const { layers } = buildExecutionGraph(files);
+   console.log(`📊 Execution plan: ${layers.length} layer(s)\n`);
+   for (let i = 0; i < layers.length; i++) {
+     console.log(
+       `  Layer ${i + 1}: ${layers[i].map((f) => f.relativePath).join(", ")}`
+     );
+   }
+   console.log();
+
+   if (opts.dryRun) {
+     console.log("🏃 Dry run — not executing.");
+     return;
+   }
+
+   // 5. Load cache
+   const cache = await loadCache(rootDir);
+
+   // 6. Execute layer by layer
+   /** @type {Map<string, string>} - output path -> content */
+   const outputs = new Map();
+
+   for (let i = 0; i < layers.length; i++) {
+     const layer = layers[i];
+     console.log(`\n⚡ Executing layer ${i + 1}/${layers.length}...`);
+
+     await Promise.all(
+       layer.map(async (file) => {
+         try {
+           const result = await executeFile(file, rootDir, cache, outputs);
+           const outputPath = file.relativePath.slice(
+             0,
+             -file.executorExtension.length
+           );
+           outputs.set(outputPath, result);
+           outputs.set(file.relativePath, result);
+         } catch (err) {
+           console.error(`  ❌ ${file.relativePath}: ${err.message}`);
+         }
+       })
+     );
+   }
+
+   // 7. Write output files (outputs holds two entries per file: source + output path)
+   console.log(`\n📝 Writing ${outputs.size / 2} output file(s)...\n`);
+   const distDir = join(rootDir, "dist");
+   await mkdir(distDir, { recursive: true });
+
+   for (const [relPath, content] of outputs) {
+     // Skip the source paths (double extension), only write the output paths
+     if (
+       relPath.endsWith(".md") ||
+       relPath.endsWith(".js") ||
+       relPath.endsWith(".ts")
+     ) {
+       // Check if this is a source path (has double extension)
+       const parts = relPath.split(".");
+       if (parts.length > 2) continue;
+     }
+
+     const outPath = join(distDir, relPath);
+     await mkdir(dirname(outPath), { recursive: true });
+     await writeFile(outPath, content);
+     console.log(`  ✅ dist/${relPath}`);
+   }
+
+   // 8. Save cache
+   await saveCache(rootDir, cache);
+   console.log(`\n✨ Build complete!\n`);
+ }
+
+ /**
+  * Execute a single file.
+  * @param {import('./scanner.js').ExecutableFile} file
+  * @param {string} rootDir
+  * @param {Map<string, import('./cache.js').CacheEntry>} cache
+  * @param {Map<string, string>} outputs - already-built outputs
+  * @returns {Promise<string>}
+  */
+ async function executeFile(file, rootDir, cache, outputs) {
+   const config = await loadConfig(file.relativePath, rootDir);
+
+   // Resolve dependencies from outputs or filesystem
+   /** @type {Record<string, string>} */
+   const resolvedDeps = {};
+   for (const dep of file.dependencies) {
+     if (outputs.has(dep)) {
+       resolvedDeps[dep] = outputs.get(dep);
+     } else {
+       // Try reading from filesystem
+       try {
+         resolvedDeps[dep] = await readFile(join(rootDir, dep), "utf-8");
+       } catch {
+         // Try from dist
+         try {
+           resolvedDeps[dep] = await readFile(
+             join(rootDir, "dist", dep),
+             "utf-8"
+           );
+         } catch {
+           console.warn(
+             `  ⚠️ Dependency ${dep} not found for ${file.relativePath}`
+           );
+         }
+       }
+     }
+   }
+
+   // Compute content hash for caching
+   const fullContent =
+     file.content +
+     JSON.stringify(resolvedDeps) +
+     JSON.stringify(file._variables || {});
+   const hash = computeHash(fullContent, config);
+
+   // Determine TTL: if MCP is used, default to 0 (no cache) unless explicitly set
+   const usesMcp = file.frontmatter.mcp || config.mcp;
+   const defaultTtl = usesMcp ? 0 : 3600; // 1 hour default
+   const ttl = file.frontmatter.cacheTtl ?? config.cacheTtl ?? defaultTtl;
+
+   // Check cache
+   const cacheKey = file.relativePath + (file._variantKey || "");
+   const cached = cache.get(cacheKey);
+   if (isCacheValid(cached, hash)) {
+     console.log(`  💾 ${file.relativePath} (cached)`);
+     return cached.result;
+   }
+
+   console.log(`  🔧 ${file.relativePath}`);
+
+   let result;
+
+   if (file.executorExtension === ".md") {
+     result = await executeMarkdown({
+       content: file.content,
+       frontmatter: file.frontmatter,
+       config,
+       rootDir,
+       resolvedDeps,
+       variables: file._variables || {}
+     });
+   } else if (
+     file.executorExtension === ".js" ||
+     file.executorExtension === ".ts"
+   ) {
+     result = await executeJavaScript({
+       filePath: file.path,
+       content: file.content,
+       resolvedDeps,
+       variables: file._variables || {},
+       config,
+       rootDir
+     });
+   } else {
+     throw new Error(`Unknown executor: ${file.executorExtension}`);
+   }
+
+   // Update cache
+   cache.set(cacheKey, {
+     hash,
+     result,
+     timestamp: Date.now(),
+     ttl
+   });
+
+   return result;
+ }
+
+ /**
+  * Expand [variable] files into multiple virtual file entries.
+  * @param {import('./scanner.js').ExecutableFile[]} files
+  * @param {string} rootDir
+  * @returns {Promise<import('./scanner.js').ExecutableFile[]>}
+  */
+ async function expandVariables(files, rootDir) {
+   /** @type {import('./scanner.js').ExecutableFile[]} */
+   const expanded = [];
+
+   for (const file of files) {
+     if (file.variables.length === 0) {
+       expanded.push(file);
+       continue;
+     }
+
+     // Load variable values
+     /** @type {Record<string, string[]>} */
+     const varValues = {};
+     let hasAllValues = true;
+
+     for (const v of file.variables) {
+       const valuesPath = join(rootDir, v.valuesFile);
+       try {
+         const raw = await readFile(valuesPath, "utf-8");
+         const values = JSON.parse(raw);
+         if (Array.isArray(values)) {
+           varValues[v.name] = values;
+         } else {
+           console.warn(`  ⚠️ ${v.valuesFile} should contain a JSON array`);
+           hasAllValues = false;
+         }
+       } catch {
+         console.warn(
+           `  ⚠️ Variable file ${v.valuesFile} not found for ${file.relativePath}`
+         );
+         hasAllValues = false;
+       }
+     }
+
+     if (!hasAllValues) {
+       expanded.push(file);
+       continue;
+     }
+
+     // Generate combinations (a single variable is the common case;
+     // multiple variables form a cartesian product)
+     const varNames = Object.keys(varValues);
+     const combos = cartesian(varNames.map((n) => varValues[n]));
+
+     for (const combo of combos) {
+       const vars = {};
+       let path = file.relativePath;
+       for (let j = 0; j < varNames.length; j++) {
+         vars[varNames[j]] = combo[j];
+         path = path.replaceAll(`[${varNames[j]}]`, combo[j]);
+       }
+
+       expanded.push({
+         ...file,
+         relativePath: path,
+         _variables: vars,
+         _variantKey: JSON.stringify(vars),
+         variables: [] // already expanded
+       });
+     }
+   }
+
+   return expanded;
+ }
+
+ /**
+  * Cartesian product of arrays.
+  * @param {string[][]} arrays
+  * @returns {string[][]}
+  */
+ function cartesian(arrays) {
+   if (arrays.length === 0) return [[]];
+   const [first, ...rest] = arrays;
+   const restCombos = cartesian(rest);
+   return first.flatMap((val) => restCombos.map((combo) => [val, ...combo]));
+ }
package/src/cache.js ADDED
@@ -0,0 +1,69 @@
+ import { readFile, writeFile, mkdir } from "node:fs/promises";
+ import { join, dirname } from "node:path";
+ import { createHash } from "node:crypto";
+
+ const CACHE_DIR = ".nlang-cache";
+
+ /**
+  * @typedef {{
+  *   hash: string,
+  *   result: string,
+  *   timestamp: number,
+  *   ttl: number,
+  * }} CacheEntry
+  */
+
+ /**
+  * @param {string} rootDir
+  * @returns {Promise<Map<string, CacheEntry>>}
+  */
+ export async function loadCache(rootDir) {
+   const cachePath = join(rootDir, CACHE_DIR, "cache.json");
+   try {
+     const raw = await readFile(cachePath, "utf-8");
+     const entries = JSON.parse(raw);
+     return new Map(Object.entries(entries));
+   } catch {
+     return new Map();
+   }
+ }
+
+ /**
+  * @param {string} rootDir
+  * @param {Map<string, CacheEntry>} cache
+  */
+ export async function saveCache(rootDir, cache) {
+   const cachePath = join(rootDir, CACHE_DIR, "cache.json");
+   await mkdir(dirname(cachePath), { recursive: true });
+   const obj = Object.fromEntries(cache);
+   await writeFile(cachePath, JSON.stringify(obj, null, 2));
+ }
+
+ /**
+  * Get a cache key hash for a file's content + dependencies.
+  * @param {string} content
+  * @param {Record<string, any>} config
+  * @returns {string}
+  */
+ export function computeHash(content, config = {}) {
+   const hash = createHash("sha256");
+   hash.update(content);
+   hash.update(JSON.stringify(config));
+   return hash.digest("hex").slice(0, 16);
+ }
+
+ /**
+  * Check if a cache entry is still valid.
+  * @param {CacheEntry | undefined} entry
+  * @param {string} currentHash
+  * @returns {boolean}
+  */
+ export function isCacheValid(entry, currentHash) {
+   if (!entry) return false;
+   if (entry.hash !== currentHash) return false;
+   if (entry.ttl === 0) return false; // cacheTtl: 0 disables caching
+   if (entry.ttl > 0) {
+     const elapsed = Date.now() - entry.timestamp;
+     if (elapsed > entry.ttl * 1000) return false;
+   }
+   return true;
+ }
package/src/executors/javascript.js ADDED
@@ -0,0 +1,122 @@
+ import { writeFile } from "node:fs/promises";
+ import { join, dirname } from "node:path";
+ import { pathToFileURL } from "node:url";
+ import { tmpdir } from "node:os";
+ import { randomBytes } from "node:crypto";
+
+ /**
+  * Execute a .js or .ts file and capture its output.
+  *
+  * The file should export a default function or be a script that returns/logs content.
+  * Convention: export default async function(context) { return "file content" }
+  *
+  * @param {object} opts
+  * @param {string} opts.filePath - absolute path to the .js/.ts file
+  * @param {string} opts.content - raw file content
+  * @param {Record<string, string>} opts.resolvedDeps - map of dep path -> content
+  * @param {Record<string, string>} opts.variables - map of variable name -> value
+  * @param {Record<string, any>} opts.config
+  * @param {string} opts.rootDir
+  * @returns {Promise<string>}
+  */
+ export async function executeJavaScript({
+   filePath,
+   content,
+   resolvedDeps,
+   variables,
+   config,
+   rootDir
+ }) {
+   // For .ts files, we rely on Node 22+ --experimental-strip-types or tsx.
+   // Try a direct dynamic import first.
+
+   const context = {
+     deps: resolvedDeps,
+     variables,
+     config,
+     rootDir,
+     env: process.env
+   };
+
+   try {
+     // Dynamic import
+     const fileUrl = pathToFileURL(filePath).href;
+     const mod = await import(fileUrl);
+
+     if (typeof mod.default === "function") {
+       const result = await mod.default(context);
+       if (typeof result === "string") return result;
+       if (typeof result === "object") return JSON.stringify(result, null, 2);
+       return String(result);
+     }
+
+     if (typeof mod.default === "string") {
+       return mod.default;
+     }
+
+     // If module has a named export 'output'
+     if (typeof mod.output === "function") {
+       const result = await mod.output(context);
+       return typeof result === "string"
+         ? result
+         : JSON.stringify(result, null, 2);
+     }
+
+     if (typeof mod.output === "string") {
+       return mod.output;
+     }
+
+     throw new Error(
+       `${filePath}: No default export or 'output' export found. ` +
+         `Export a function or string: export default async function(ctx) { return "..." }`
+     );
+   } catch (err) {
+     // If import fails for .ts, try with tsx if available
+     if (filePath.endsWith(".ts")) {
+       return executeWithTsx(filePath, context);
+     }
+     throw err;
+   }
+ }
+
+ /**
+  * Fallback: execute a .ts file via npx tsx.
+  * @param {string} filePath
+  * @param {object} context
+  * @returns {Promise<string>}
+  */
+ async function executeWithTsx(filePath, context) {
+   const { execSync } = await import("node:child_process");
+
+   // Write a wrapper that imports the file and prints the result
+   const tmpFile = join(tmpdir(), `nlang-${randomBytes(4).toString("hex")}.mjs`);
+   const wrapper = `
+ import { pathToFileURL } from 'node:url';
+ const mod = await import(pathToFileURL(${JSON.stringify(filePath)}).href);
+ const ctx = ${JSON.stringify(context)};
+ let result;
+ if (typeof mod.default === 'function') result = await mod.default(ctx);
+ else if (typeof mod.default === 'string') result = mod.default;
+ else if (typeof mod.output === 'function') result = await mod.output(ctx);
+ else if (typeof mod.output === 'string') result = mod.output;
+ else throw new Error('No default or output export');
+ if (typeof result !== 'string') result = JSON.stringify(result, null, 2);
+ process.stdout.write(result);
+ `;
+
+   await writeFile(tmpFile, wrapper);
+
+   try {
+     const result = execSync(`npx tsx ${tmpFile}`, {
+       encoding: "utf-8",
+       cwd: dirname(filePath),
+       timeout: 60000
+     });
+     return result;
+   } finally {
+     try {
+       const { unlink } = await import("node:fs/promises");
+       await unlink(tmpFile);
+     } catch {}
+   }
+ }
package/src/executors/markdown.js ADDED
@@ -0,0 +1,104 @@
+ import OpenAI from "openai";
+
+ /**
+  * Execute a markdown file by sending it to an LLM.
+  *
+  * @param {object} opts
+  * @param {string} opts.content - The markdown prompt (frontmatter stripped)
+  * @param {Record<string, any>} opts.frontmatter
+  * @param {Record<string, any>} opts.config - merged nlang.json config
+  * @param {string} opts.rootDir
+  * @param {Record<string, string>} opts.resolvedDeps - map of dep path -> content
+  * @param {Record<string, string>} opts.variables - map of variable name -> value
+  * @returns {Promise<string>}
+  */
+ export async function executeMarkdown({
+   content,
+   frontmatter,
+   config,
+   rootDir,
+   resolvedDeps,
+   variables
+ }) {
+   // Replace @{path} references with resolved content
+   let prompt = content;
+   for (const [depPath, depContent] of Object.entries(resolvedDeps)) {
+     prompt = prompt.replaceAll(`@{${depPath}}`, depContent);
+   }
+
+   // Replace [variable] references
+   for (const [varName, varValue] of Object.entries(variables)) {
+     prompt = prompt.replaceAll(`[${varName}]`, varValue);
+   }
+
+   // Also handle @{URL} references by fetching them. Collect the matches
+   // first: replacing while stepping a live regex with exec() would skip
+   // matches as the string changes length underneath it.
+   const urlMatches = [...prompt.matchAll(/@\{(https?:\/\/[^}]+)\}/g)];
+   for (const match of urlMatches) {
+     const url = match[1];
+     try {
+       const resp = await fetch(url);
+       const text = await resp.text();
+       prompt = prompt.replaceAll(`@{${url}}`, text);
+     } catch (err) {
+       console.warn(`⚠️ Failed to fetch ${url}: ${err.message}`);
+     }
+   }
+
+   // Determine model and API settings
+   const model = frontmatter.model || config.model || "gpt-4o-mini";
+   const baseURL = frontmatter.baseURL || config.baseURL || undefined;
+   const apiKey =
+     frontmatter.apiKey ||
+     config.apiKey ||
+     process.env.OPENAI_API_KEY ||
+     process.env.LLM_API_KEY;
+
+   if (!apiKey) {
+     throw new Error(
+       "No API key found. Set OPENAI_API_KEY or LLM_API_KEY env var, or configure in nlang.json"
+     );
+   }
+
+   const client = new OpenAI({
+     apiKey,
+     ...(baseURL ? { baseURL } : {})
+   });
+
+   const systemPrompt =
+     frontmatter.system ||
+     config.system ||
+     "You are a build tool. Output ONLY the requested file content, no explanations, no markdown fences unless the output format is markdown.";
+
+   console.log(`  🤖 Calling ${model}...`);
+
+   const response = await client.chat.completions.create({
+     model,
+     messages: [
+       { role: "system", content: systemPrompt },
+       { role: "user", content: prompt }
+     ],
+     temperature: frontmatter.temperature ?? config.temperature ?? 0,
+     max_tokens: frontmatter.max_tokens ?? config.max_tokens ?? 4096
+   });
+
+   const result = response.choices[0]?.message?.content || "";
+
+   // Strip wrapping code fences if present
+   return stripCodeFences(result);
+ }
+
+ /**
+  * Strip leading/trailing code fences from LLM output.
+  * e.g. ```html\n...\n``` -> ...
+  * @param {string} text
+  * @returns {string}
+  */
+ function stripCodeFences(text) {
+   const trimmed = text.trim();
+   const match = trimmed.match(/^```\w*\n([\s\S]*?)\n```$/);
+   if (match) return match[1];
+   return trimmed;
+ }
package/src/graph.js ADDED
@@ -0,0 +1,149 @@
+ /**
+  * Build a dependency graph and return execution order.
+  * Returns layers of files that can be executed in parallel.
+  *
+  * @typedef {import('./scanner.js').ExecutableFile} ExecutableFile
+  */
+
+ /**
+  * @param {ExecutableFile[]} files
+  * @returns {{ layers: ExecutableFile[][], order: ExecutableFile[] }}
+  */
+ export function buildExecutionGraph(files) {
+   // Map relative paths to files (including output paths)
+   /** @type {Map<string, ExecutableFile>} */
+   const fileMap = new Map();
+   for (const file of files) {
+     fileMap.set(file.relativePath, file);
+     // Also map by output path (without executor extension)
+     const outputPath = file.relativePath.slice(
+       0,
+       -file.executorExtension.length
+     );
+     fileMap.set(outputPath, file);
+   }
+
+   // Build adjacency list: file -> files that depend on it
+   /** @type {Map<string, Set<string>>} */
+   const adj = new Map();
+   /** @type {Map<string, number>} */
+   const inDegree = new Map();
+
+   for (const file of files) {
+     const key = file.relativePath;
+     if (!adj.has(key)) adj.set(key, new Set());
+     if (!inDegree.has(key)) inDegree.set(key, 0);
+
+     for (const dep of file.dependencies) {
+       // Resolve dep to a known file
+       const depFile = fileMap.get(dep);
+       if (depFile && depFile.relativePath !== key) {
+         const depKey = depFile.relativePath;
+         if (!adj.has(depKey)) adj.set(depKey, new Set());
+         if (!inDegree.has(depKey)) inDegree.set(depKey, 0);
+
+         // dep must run before this file
+         adj.get(depKey).add(key);
+         inDegree.set(key, (inDegree.get(key) || 0) + 1);
+       }
+     }
+   }
+
+   // Kahn's algorithm — topological sort with layers for parallelism
+   /** @type {ExecutableFile[][]} */
+   const layers = [];
+   let queue = [];
+
+   for (const file of files) {
+     if ((inDegree.get(file.relativePath) || 0) === 0) {
+       queue.push(file.relativePath);
+     }
+   }
+
+   const visited = new Set();
+
+   while (queue.length > 0) {
+     const layer = [];
+     const nextQueue = [];
+
+     for (const key of queue) {
+       if (visited.has(key)) continue;
+       visited.add(key);
+       const file = files.find((f) => f.relativePath === key);
+       if (file) layer.push(file);
+
+       for (const neighbor of adj.get(key) || []) {
+         const newDegree = (inDegree.get(neighbor) || 1) - 1;
+         inDegree.set(neighbor, newDegree);
+         if (newDegree === 0) {
+           nextQueue.push(neighbor);
+         }
+       }
+     }
+
+     if (layer.length > 0) layers.push(layer);
+     queue = nextQueue;
+   }
+
+   // Check for cycles
+   if (visited.size < files.length) {
+     const unvisited = files.filter((f) => !visited.has(f.relativePath));
+     console.error(
+       "⚠️ Circular dependencies detected in:",
+       unvisited.map((f) => f.relativePath)
+     );
+   }
+
+   const order = layers.flat();
+   return { layers, order };
+ }
+
+ /**
+  * Given a specific file, find all its transitive dependencies and dependants.
+  * @param {ExecutableFile[]} allFiles
+  * @param {string} targetPath - relative path
+  * @returns {ExecutableFile[]}
+  */
+ export function getSubgraph(allFiles, targetPath) {
+   /** @type {Map<string, ExecutableFile>} */
+   const fileMap = new Map();
+   for (const f of allFiles) {
+     fileMap.set(f.relativePath, f);
+     const outputPath = f.relativePath.slice(0, -f.executorExtension.length);
+     fileMap.set(outputPath, f);
+   }
+
+   const needed = new Set();
+
+   // Collect transitive dependencies (upstream)
+   function collectDeps(path) {
+     if (needed.has(path)) return;
+     needed.add(path);
+     const file = fileMap.get(path);
+     if (!file) return;
+     for (const dep of file.dependencies) {
+       const depFile = fileMap.get(dep);
+       if (depFile) collectDeps(depFile.relativePath);
+     }
+   }
+
+   // Collect transitive dependants (downstream). No early-exit on `needed`
+   // here: the target is already in `needed` after collectDeps, which would
+   // otherwise stop the downstream walk before it starts.
+   function collectDependants(path) {
+     needed.add(path);
+     const ext = fileMap.get(path)?.executorExtension || "";
+     const outputPath = ext ? path.slice(0, -ext.length) : path;
+     for (const f of allFiles) {
+       if (
+         (f.dependencies.includes(path) ||
+           f.dependencies.includes(outputPath)) &&
+         !needed.has(f.relativePath)
+       ) {
+         collectDependants(f.relativePath);
+       }
+     }
+   }
+
+   collectDeps(targetPath);
+   collectDependants(targetPath);
+
+   return allFiles.filter((f) => needed.has(f.relativePath));
+ }
package/src/init.js ADDED
@@ -0,0 +1,162 @@
+ import { scanFiles } from "./scanner.js";
+ import { writeFile, mkdir } from "node:fs/promises";
+ import { join } from "node:path";
+
+ /**
+  * Generate .github/workflows/nlang.yml based on discovered files.
+  * @param {string} rootDir
+  */
+ export async function initWorkflow(rootDir) {
+   console.log(`\n🔍 Scanning ${rootDir} for executable files...\n`);
+
+   const files = await scanFiles(rootDir);
+   console.log(`📂 Found ${files.length} executable file(s)`);
+
+   // Collect cron schedules
+   /** @type {Map<string, string[]>} cron expression -> file paths */
+   const cronMap = new Map();
+
+   for (const file of files) {
+     if (file.cron) {
+       const existing = cronMap.get(file.cron) || [];
+       existing.push(file.relativePath);
+       cronMap.set(file.cron, existing);
+     }
+   }
+
+   console.log(`⏰ Found ${cronMap.size} cron schedule(s)`);
+   for (const [cron, paths] of cronMap) {
+     console.log(`   ${cron}: ${paths.join(", ")}`);
+   }
+
+   // Generate the workflow
+   const workflow = generateWorkflow(cronMap, files);
+
+   // Write it
+   const workflowDir = join(rootDir, ".github", "workflows");
+   await mkdir(workflowDir, { recursive: true });
+   const workflowPath = join(workflowDir, "nlang.yml");
+   await writeFile(workflowPath, workflow);
+
+   console.log(`\n✅ Generated ${workflowPath}\n`);
+   console.log(`Next steps:`);
+   console.log(
+     `  1. Set OPENAI_API_KEY (or LLM_API_KEY) in your repo's Settings → Secrets`
+   );
+   console.log(`  2. Commit and push the workflow file`);
+   console.log(
+     `  3. Your files will be built on push and on the configured schedules\n`
+   );
+ }
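The cron grouping above can be sketched standalone (hedged; `groupByCron` is an illustrative helper, not an export of this package): files whose frontmatter declares the same `trigger` cron expression share one schedule entry in the generated workflow.

```javascript
// Illustrative stand-in for the grouping loop in initWorkflow.
function groupByCron(files) {
  const cronMap = new Map();
  for (const f of files) {
    if (!f.cron) continue; // files without a trigger are push-built only
    const existing = cronMap.get(f.cron) || [];
    existing.push(f.relativePath);
    cronMap.set(f.cron, existing);
  }
  return cronMap;
}

const map = groupByCron([
  { relativePath: "news.html.md", cron: "0 6 * * *" },
  { relativePath: "stats.json.js", cron: "0 6 * * *" },
  { relativePath: "about.html.md", cron: null },
]);
console.log(map.get("0 6 * * *")); // → [ 'news.html.md', 'stats.json.js' ]
```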
+
+ /**
+  * Generate the YAML workflow content.
+  * @param {Map<string, string[]>} cronMap
+  * @param {import('./scanner.js').ExecutableFile[]} files
+  * @returns {string}
+  */
+ function generateWorkflow(cronMap, files) {
+   // Build schedule entries
+   const schedules = [];
+   const cronEntries = [...cronMap.entries()];
+
+   for (const [cron] of cronEntries) {
+     schedules.push(`    - cron: '${cron}'`);
+   }
+
+   const hasSchedules = schedules.length > 0;
+   const hasMdFiles = files.some((f) => f.executorExtension === ".md");
+   const hasTsFiles = files.some((f) => f.executorExtension === ".ts");
+
+   // Build the dispatch logic for cron triggers. Each schedule becomes an
+   // `elif` branch of the Build step's if/elif/else chain, so cron-targeted
+   // builds only run when no explicit workflow_dispatch file input was given.
+   const cronDispatchLogic = cronEntries
+     .map(([cron, paths]) => {
+       const condition = `elif [ "\${{ github.event.schedule }}" = "${cron}" ]; then`;
+       // Build each file individually via --file flag
+       const buildCmds = paths
+         .map((p) => `            npx nlang build --file "${p}"`)
+         .join("\n");
+       return `          ${condition}\n${buildCmds}`;
+     })
+     .join("\n");
+
+   const yaml = `# Generated by nlang - Executable Extensions
+ # https://github.com/nicely-defined/nlang
+ #
+ # This workflow builds double-extension files:
+ #   - .xyz.md files are sent to an LLM API
+ #   - .xyz.js/.ts files are executed in Node.js
+ # Output is written to the dist/ directory.
+
+ name: nlang build
+
+ on:
+   push:
+     branches: [main, master]
+     paths-ignore:
+       - 'dist/**'
+       - '.nlang-cache/**'
+ ${hasSchedules ? `  schedule:\n${schedules.join("\n")}` : ""}
+   workflow_dispatch:
+     inputs:
+       file:
+         description: 'Specific file to build (optional)'
+         required: false
+         type: string
+
+ permissions:
+   contents: write
+
+ jobs:
+   build:
+     runs-on: ubuntu-latest
+     steps:
+       - name: Checkout
+         uses: actions/checkout@v4
+
+       - name: Setup Node.js
+         uses: actions/setup-node@v4
+         with:
+           node-version: '22'
+
+       - name: Install nlang
+         run: npm install -g nlang
+
+ ${hasTsFiles ? `      - name: Install tsx (for TypeScript support)\n        run: npm install -g tsx\n` : ""}
+       - name: Restore cache
+         uses: actions/cache@v4
+         with:
+           path: .nlang-cache
+           key: nlang-cache-\${{ github.sha }}
+           restore-keys: |
+             nlang-cache-
+
+       - name: Build
+         env:
+           OPENAI_API_KEY: \${{ secrets.OPENAI_API_KEY }}
+           LLM_API_KEY: \${{ secrets.LLM_API_KEY }}
+         run: |
+           if [ -n "\${{ github.event.inputs.file }}" ]; then
+             npx nlang build --file "\${{ github.event.inputs.file }}"
+ ${cronDispatchLogic ? `${cronDispatchLogic}\n` : ""}          else
+             npx nlang build
+           fi
+
+       - name: Commit and push output
+         run: |
+           git config user.name "nlang[bot]"
+           git config user.email "nlang[bot]@users.noreply.github.com"
+           git add dist/ .nlang-cache/
+           if git diff --staged --quiet; then
+             echo "No changes to commit"
+           else
+             git commit -m "nlang: build output [skip ci]"
+             git push
+           fi
+ `;
+
+   return yaml;
+ }
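The shape of the generated if/elif chain can be previewed in isolation (a hedged sketch with plain strings standing in for the template's `${{ ... }}` expressions; `buildDispatchBranches` is illustrative, not part of the package):

```javascript
// One `elif` branch per cron schedule, followed by one build command per
// file registered under that schedule.
function buildDispatchBranches(cronEntries) {
  return cronEntries
    .map(([cron, paths]) =>
      [
        `elif [ "$SCHEDULE" = "${cron}" ]; then`,
        ...paths.map((p) => `  npx nlang build --file "${p}"`),
      ].join("\n")
    )
    .join("\n");
}

const branches = buildDispatchBranches([
  ["0 6 * * *", ["news.html.md"]],
  ["0 0 * * 0", ["digest.html.md"]],
]);
console.log(branches);
// → elif [ "$SCHEDULE" = "0 6 * * *" ]; then
//     npx nlang build --file "news.html.md"
//   elif [ "$SCHEDULE" = "0 0 * * 0" ]; then
//     npx nlang build --file "digest.html.md"
```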
package/src/scanner.js ADDED
@@ -0,0 +1,192 @@
+ import { glob } from "glob";
+ import { readFile } from "node:fs/promises";
+ import { join, basename, dirname, extname } from "node:path";
+ import matter from "gray-matter";
+
+ /**
+  * Represents a discovered executable file.
+  * @typedef {{
+  *   path: string,
+  *   relativePath: string,
+  *   outputExtension: string,
+  *   executorExtension: string,
+  *   frontmatter: Record<string, any>,
+  *   content: string,
+  *   rawContent: string,
+  *   dependencies: string[],
+  *   variables: { name: string, valuesFile: string | null }[],
+  *   cron: string | null,
+  * }} ExecutableFile
+  */
+
+ // Match files with 2+ extensions: e.g. foo.html.md, bar.json.js
+ const DOUBLE_EXT_PATTERN = "**/*.*.*";
+
+ // Executor extensions we know how to run
+ const EXECUTOR_EXTENSIONS = new Set([".md", ".js", ".ts"]);
+
+ /**
+  * Scan a directory for double-extension files.
+  * @param {string} rootDir
+  * @returns {Promise<ExecutableFile[]>}
+  */
+ export async function scanFiles(rootDir) {
+   const files = await glob(DOUBLE_EXT_PATTERN, {
+     cwd: rootDir,
+     nodir: true,
+     ignore: ["node_modules/**", ".git/**", ".github/**", "dist/**", "build/**"],
+   });
+
+   /** @type {ExecutableFile[]} */
+   const executables = [];
+
+   for (const relPath of files) {
+     const fullPath = join(rootDir, relPath);
+     const name = basename(relPath);
+
+     // Get the last extension (executor) and everything before it (output)
+     const executorExt = extname(name);
+     if (!EXECUTOR_EXTENSIONS.has(executorExt)) continue;
+
+     const withoutExecutor = name.slice(0, -executorExt.length);
+     const outputExt = extname(withoutExecutor);
+     if (!outputExt) continue; // need at least 2 extensions
+
+     const rawContent = await readFile(fullPath, "utf-8");
+
+     // Parse frontmatter for .md files; other executors use the raw content
+     let frontmatterData = {};
+     let content = rawContent;
+     if (executorExt === ".md") {
+       const parsed = matter(rawContent);
+       frontmatterData = parsed.data;
+       content = parsed.content;
+     }
+
+     // Extract @{path} dependencies
+     const dependencies = extractDependencies(content);
+
+     // Extract [variable] patterns from the path
+     const variables = extractVariables(relPath);
+
+     // Cron schedule, if declared via `trigger` frontmatter
+     const cron = frontmatterData.trigger || null;
+
+     executables.push({
+       path: fullPath,
+       relativePath: relPath,
+       outputExtension: outputExt,
+       executorExtension: executorExt,
+       frontmatter: frontmatterData,
+       content,
+       rawContent,
+       dependencies,
+       variables,
+       cron,
+     });
+   }
+
+   return executables;
+ }
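The extension splitting above can be sketched standalone (hedged; `splitExtensions` is an illustrative helper that assumes a double-extension name, not an export of the scanner): the last extension selects the executor, and stripping it yields the output file name, whose own extension is the output format.

```javascript
// Illustrative stand-in for the extname-based logic in scanFiles.
function splitExtensions(name) {
  const executorExt = name.slice(name.lastIndexOf("."));
  const outputName = name.slice(0, -executorExt.length);
  const dot = outputName.lastIndexOf(".");
  const outputExt = dot === -1 ? "" : outputName.slice(dot);
  return { executorExt, outputExt, outputName };
}

console.log(splitExtensions("index.html.md"));
// → { executorExt: '.md', outputExt: '.html', outputName: 'index.html' }
```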
+
+ /**
+  * Extract @{path} and @{URL} references from content.
+  * @param {string} content
+  * @returns {string[]}
+  */
+ function extractDependencies(content) {
+   const regex = /@\{([^}]+)\}/g;
+   const deps = [];
+   let match;
+   while ((match = regex.exec(content)) !== null) {
+     const ref = match[1];
+     // Skip URLs — only keep local paths
+     if (!ref.startsWith("http://") && !ref.startsWith("https://")) {
+       deps.push(ref);
+     }
+   }
+   return deps;
+ }
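The `@{...}` reference syntax handled above can be exercised in isolation (hedged; `listLocalDeps` is an illustrative stand-in for `extractDependencies`, and the prompt text is invented): URL references are skipped, so only local paths enter the build graph.

```javascript
// Same pattern as the scanner: @{ref}, with http(s) refs filtered out.
const DEP_REGEX = /@\{([^}]+)\}/g;

function listLocalDeps(content) {
  const deps = [];
  for (const match of content.matchAll(DEP_REGEX)) {
    const ref = match[1];
    if (!ref.startsWith("http://") && !ref.startsWith("https://")) {
      deps.push(ref);
    }
  }
  return deps;
}

const prompt =
  "Match the palette in @{styles.css.md}; pull posts from @{https://example.com/feed}.";
console.log(listLocalDeps(prompt)); // → [ 'styles.css.md' ]
```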
+
+ /**
+  * Extract [variable] patterns from a file path and point each at its
+  * companion JSON values file.
+  * @param {string} relPath
+  * @returns {{ name: string, valuesFile: string | null }[]}
+  */
+ function extractVariables(relPath) {
+   const regex = /\[([^\]]+)\]/g;
+   const vars = [];
+   let match;
+   while ((match = regex.exec(relPath)) !== null) {
+     const varName = match[1];
+     // Companion JSON lives in the same directory, named <varName>.json
+     const dir = dirname(relPath);
+     const valuesFile = join(dir, `${varName}.json`);
+     vars.push({ name: varName, valuesFile });
+   }
+   return vars;
+ }
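The `[variable]` convention can be illustrated without the path module (a hedged sketch; `findPathVariables` is an illustrative helper, not the package's `extractVariables`): each bracketed path segment names a variable whose values are expected in a sibling `<name>.json` file.

```javascript
// Same bracket pattern as the scanner, with a minimal dirname substitute.
function findPathVariables(relPath) {
  const dir = relPath.includes("/")
    ? relPath.slice(0, relPath.lastIndexOf("/"))
    : ".";
  const vars = [];
  for (const m of relPath.matchAll(/\[([^\]]+)\]/g)) {
    vars.push({ name: m[1], valuesFile: `${dir}/${m[1]}.json` });
  }
  return vars;
}

console.log(findPathVariables("posts/[slug].html.md"));
// → [ { name: 'slug', valuesFile: 'posts/slug.json' } ]
```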
+
+ /**
+  * Load nlang.json config, walking up from the file's directory to rootDir.
+  * More specific configs override less specific ones.
+  * @param {string} filePath - relative path of the file
+  * @param {string} rootDir
+  * @returns {Promise<Record<string, any>>}
+  */
+ export async function loadConfig(filePath, rootDir) {
+   const parts = dirname(filePath).split("/").filter(Boolean);
+   const homeDir = process.env.HOME || process.env.USERPROFILE || "";
+
+   // Build nlang.json paths from the root down to the file's directory
+   const dirConfigs = [];
+   let current = "";
+   for (const part of parts) {
+     current = current ? join(current, part) : part;
+     dirConfigs.push(join(rootDir, current, "nlang.json"));
+   }
+
+   // Merge order: global < root < dir1 < dir2 < ... < most specific dir
+   let merged = {};
+
+   // Global config (lowest priority): ~/.nlang
+   if (homeDir) {
+     merged = await tryLoadJson(join(homeDir, ".nlang"), merged);
+   }
+
+   // Root config
+   merged = await tryLoadJson(join(rootDir, "nlang.json"), merged);
+
+   // Directory configs (most specific wins)
+   for (const configPath of dirConfigs) {
+     merged = await tryLoadJson(configPath, merged);
+   }
+
+   return merged;
+ }
+
+ /**
+  * Read a JSON file and shallow-merge it over `base`; on any error
+  * (missing file, invalid JSON) return `base` unchanged.
+  * @param {string} path
+  * @param {Record<string, any>} base
+  * @returns {Promise<Record<string, any>>}
+  */
+ async function tryLoadJson(path, base) {
+   try {
+     const raw = await readFile(path, "utf-8");
+     return { ...base, ...JSON.parse(raw) };
+   } catch {
+     return base;
+   }
+ }
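The precedence implemented by `loadConfig` can be demonstrated without touching the filesystem (a hedged sketch with invented config values; `mergeConfigs` is not a package export): later, more specific configs shallow-merge over earlier ones, so the most specific directory wins per key.

```javascript
// Shallow merge, left to right, mirroring the loadConfig merge order.
function mergeConfigs(...configs) {
  return configs.reduce((acc, c) => ({ ...acc, ...c }), {});
}

const merged = mergeConfigs(
  { model: "gpt-4o-mini", temperature: 0 }, // ~/.nlang (global)
  { model: "gpt-4o" },                      // <root>/nlang.json
  { temperature: 0.7 }                      // blog/nlang.json (most specific)
);
console.log(merged); // → { model: 'gpt-4o', temperature: 0.7 }
```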