@mono-labs/cli 0.0.204 → 0.0.206

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,247 +1,123 @@
1
- <div align="center">
1
+ # mono-labs
2
2
 
3
- # mono-labs CLI
4
-
5
- Declarative, token-aware task runner for JavaScript/TypeScript monorepos.
6
- Configure commands with simple JSON – no custom scripting required.
7
-
8
- </div>
9
-
10
- ## Why This Exists
11
-
12
- You often need a repeatable set of steps to bootstrap or run your full stack
13
- (web, mobile, backend, infra). Traditional npm scripts become tangled. This CLI
14
- lets you:
15
-
16
- - Describe commands in `.mono/*.json` files
17
- - Emit dynamic values from scripts (`{out:token value}`)
18
- - Inject those values into later commands & environment variables
19
- - Run multiple background services + one attached foreground process
20
-
21
- No publishing needed: you can link and iterate locally.
3
+ Declarative monorepo orchestration, project tooling, and infrastructure
4
+ integration — built to scale real systems, not just scripts.
22
5
 
23
6
  ---
24
7
 
25
- ## Quick Start (Beginner Friendly)
8
+ ## What This Is
26
9
 
27
- 1. Install dependencies:
10
+ mono-labs is a monorepo control plane.
28
11
 
29
- ```bash
30
- yarn install
31
- ```
12
+ It combines:
32
13
 
33
- 2. Create a `.mono` directory in your project root.
34
- 3. Add a file `.mono/hello.json`:
35
-
36
- ```json
37
- { "actions": ["echo Hello World"] }
38
- ```
14
+ - a declarative, token-aware CLI runtime
15
+ - project-level orchestration utilities
16
+ - infrastructure and CI integration primitives
39
17
 
40
- 4. Run the command:
18
+ The goal is to make a monorepo behave like a single, coordinated system across:
41
19
 
42
- ```bash
43
- yarn mono hello
44
- ```
45
-
46
- You should see `Hello World`.
47
-
48
- ### Adding a Command with an Argument
49
-
50
- ```json
51
- // .mono/greet.json
52
- {
53
- "actions": ["echo Hi ${arg}"],
54
- "argument": { "description": "Name to greet", "default": "friend" }
55
- }
56
- ```
57
-
58
- ```bash
59
- yarn mono greet # Hi friend
60
- yarn mono greet Alice # Hi Alice
61
- ```
62
-
63
- ### Adding an Option
64
-
65
- ```json
66
- // .mono/build.json
67
- {
68
- "actions": ["echo Building for ${platform} debug=${debug}"],
69
- "options": {
70
- "platform": { "type": "string", "default": "ios" },
71
- "debug": { "description": "Enable debug mode" }
72
- }
73
- }
74
- ```
75
-
76
- ```bash
77
- yarn mono build --platform android --debug
78
- ```
20
+ - local development
21
+ - CI pipelines
22
+ - deployments
23
+ - infrastructure management
79
24
 
80
25
  ---
81
26
 
82
- ## Core Concepts
83
-
84
- | Concept | Summary |
85
- | -------------- | --------------------------------------------------------------------------------------------- |
86
- | `.mono/*.json` | Each file (except `config.json`) becomes a command. `dev.json` -> `yarn mono dev`. |
87
- | `preactions` | Sequential setup commands whose output can define tokens. |
88
- | `actions` | Main workload commands. All but last run detached; last is attached (interactive). |
89
- | Tokens | Printed from preactions as `{out:key value}` and later substituted as `${key}`. |
90
- | Environments | `environments.dev` / `environments.stage` provide token-aware env vars. Use `--stage` switch. |
91
- | Data Layer | Merges defaults, user flags, argument, and emitted tokens. |
27
+ ## What Problems It Solves
92
28
 
93
- Full schemas & rules: see `docs/configuration.md`.
29
+ Most monorepos suffer from:
94
30
 
95
- ---
31
+ - duplicated scripts across packages
32
+ - environment drift between dev and CI
33
+ - infrastructure logic isolated in pipelines
34
+ - brittle bash scripts
35
+ - slow onboarding
96
36
 
97
- ## Documentation Index
37
+ mono-labs addresses these problems by providing:
98
38
 
99
- | Topic | File |
100
- | ------------------------ | ------------------------- |
101
- | Architecture / internals | `docs/architecture.md` |
102
- | Configuration schema | `docs/configuration.md` |
103
- | Practical examples | `docs/examples.md` |
104
- | Troubleshooting | `docs/troubleshooting.md` |
39
+ - declarative command definitions
40
+ - shared runtime state via tokens
41
+ - reusable project utilities
42
+ - programmatic CDK helpers
43
+ - one mental model for dev, CI, and deploy
105
44
 
106
45
  ---
107
46
 
108
- ## How It Works (Short Version)
47
+ ## High-Level Architecture
109
48
 
110
- 1. CLI scans `.mono/` at startup.
111
- 2. Builds Commander commands for each JSON file.
112
- 3. When invoked: merges defaults + flags + argument into data layer.
113
- 4. Runs `preactions` (foreground) capturing `{out:key value}` tokens.
114
- 5. Spawns each action (background except last). Performs `${token}`
115
- substitution.
116
- 6. Cleans background processes on exit or Ctrl+C.
49
+ mono-labs is intentionally layered:
117
50
 
118
- Details: `docs/architecture.md`.
51
+ 1. `.mono/`: declarative command definitions (JSON).
119
52
 
120
- ---
121
-
122
- ## Local Development / Linking
53
+ 2. CLI Runtime (`bin` + `lib`): loads `.mono`, builds commands, executes
54
+ workflows, manages processes.
123
55
 
124
- From this repo root:
56
+ 3. Project Orchestration (`src/project`): environment merging, configuration
57
+ management, monorepo utilities.
125
58
 
126
- ```bash
127
- yarn link
128
- ```
59
+ 4. Infrastructure Integration (`src/cdk`): CDK helpers, stack orchestration,
60
+ CI-friendly deployment primitives.
129
61
 
130
- In a target project:
62
+ Each layer can be used independently, but they are designed to work together.
131
63
 
132
- ```bash
133
- yarn link "@mono-labs/cli"
134
- ```
64
+ ---
135
65
 
136
- Then use:
66
+ ## Quick Start
137
67
 
138
- ```bash
139
- yarn mono <command>
140
- ```
68
+ Create a `.mono` directory and add:
141
69
 
142
- To unlink later:
70
+ .mono/hello.json
143
71
 
144
- ```bash
145
- yarn unlink "@mono-labs/cli"
146
- ```
72
+ { "actions": ["echo Hello World"] }
147
73
 
148
- Alternative (direct file install):
74
+ Run:
149
75
 
150
- ```bash
151
- yarn add file:/absolute/path/to/mono-labs-cli
152
- ```
76
+ yarn mono hello
153
77
 
154
78
  ---
155
79
 
156
- ## Emitting Dynamic Values
80
+ ## Typical Developer Workflow
157
81
 
158
- Inside a `preactions` script output lines like:
82
+ yarn mono dev
+ yarn mono serve
+ yarn mono mobile
159
83
 
160
- ```
161
- {out:ngrok_api https://1234.ngrok.dev}
162
- {out:region us-east-1}
163
- ```
84
+ If unsure:
164
85
 
165
- Then reference in actions or environments as `${ngrok_api}` or `${region}`.
86
+ yarn mono help
166
87
 
167
88
  ---
168
89
 
169
- ## Example Advanced Command
170
-
171
- ```json
172
- // .mono/dev.json
173
- {
174
- "preactions": ["docker compose up -d", "node scripts/ngrok_setup"],
175
- "actions": [
176
- "yarn backend dynamodb-admin -p 8082 --dynamo-endpoint=http://localhost:8000",
177
- "yarn mono backend server"
178
- ],
179
- "argument": { "type": "string", "default": "dev" },
180
- "options": {
181
- "stage": { "description": "Use stage env" },
182
- "profile": { "type": "string", "description": "Profile name" }
183
- },
184
- "environments": {
185
- "dev": { "API_URL": "${ngrok_api}", "MODE": "dev" },
186
- "stage": { "API_URL": "${ngrok_api}", "MODE": "stage" }
187
- }
188
- }
189
- ```
190
-
191
- Run:
192
-
193
- ```bash
194
- yarn mono dev --profile alpha
195
- ```
90
+ ## Documentation Index
196
91
 
197
- ---
92
+ Start here:
198
93
 
199
- ## Design Decisions
94
+ - docs/README.txt
200
95
 
201
- - JSON over JS: simpler, toolable, safer for newcomers.
202
- - Single positional argument: keeps mental model small.
203
- - Token system: decouples script output from later steps.
204
- - Background/foreground split: stable dev server orchestration.
96
+ Key docs:
205
97
 
206
- ---
98
+ - docs/architecture.md
99
+ - docs/configuration.md
100
+ - docs/examples.md
101
+ - docs/troubleshooting.md
207
102
 
208
- ## Extending
103
+ Advanced:
209
104
 
210
- | Need | Approach |
211
- | ---------------------- | --------------------------------------------- |
212
- | Multiple arguments | Extend `cliFactory.js` to parse more. |
213
- | JSON schema validation | Add Ajv in `boot()` loader. |
214
- | Parallel preactions | Modify `runHasteCommand.js` to `Promise.all`. |
215
- | Different token syntax | Adjust regex in `runForeground.js`. |
105
+ - docs/project-orchestration.md
106
+ - docs/infrastructure-integration.md
216
107
 
217
108
  ---
218
109
 
219
- ## Contributing
110
+ ## Who This Is For
220
111
 
221
- 1. Fork & clone
222
- 2. Create a feature branch
223
- 3. Add/adjust tests (future roadmap)
224
- 4. Submit PR with clear description
112
+ mono-labs is designed for teams that:
225
113
 
226
- ---
227
-
228
- ## FAQ (Fast Answers)
229
-
230
- | Question | Answer |
231
- | -------------------------------- | ------------------------------------------------------------------------------------------------------------------------- |
232
- | How do I list commands? | Look at filenames in `.mono/` or run `yarn mono --help`. |
233
- | How do I pass env vars manually? | `MY_VAR=1 yarn mono dev` (POSIX) or `set MY_VAR=1 && yarn mono dev` (CMD) or `$env:MY_VAR=1; yarn mono dev` (PowerShell). |
234
- | Does it support Windows? | Yes; process cleanup uses `taskkill`. |
235
- | What if a token is missing? | It stays literal (`${token}`); no crash. |
114
+ - run full-stack systems
115
+ - manage real infrastructure
116
+ - care about reproducibility
117
+ - want dev and CI to behave the same
236
118
 
237
119
  ---
238
120
 
239
121
  ## License
240
122
 
241
123
  MIT © Contributors
242
-
243
- ---
244
-
245
- ## Next Steps
246
-
247
- Jump to: `docs/examples.md` for hands-on learning.
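The token workflow documented in the removed README sections (and still referred to in the new README as "shared runtime state via tokens") works as described above: a `preactions` script prints `{out:key value}` and later actions or environment entries reference `${key}`. A minimal sketch of such a command file, based only on that documented behavior; the file name, helper script, and token values are illustrative:

```json
// .mono/tunnel.json (illustrative; scripts/ngrok_setup is assumed to print
// a line such as: {out:ngrok_api https://1234.ngrok.dev})
{
  "preactions": ["node scripts/ngrok_setup"],
  "actions": ["echo API available at ${ngrok_api}"],
  "environments": {
    "dev": { "API_URL": "${ngrok_api}", "MODE": "dev" }
  }
}
```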
package/dist/cdk/index.js CHANGED
@@ -1,4 +1,8 @@
1
- export function replaceTokens(str, env) {
1
+ "use strict";
2
+ Object.defineProperty(exports, "__esModule", { value: true });
3
+ exports.replaceTokens = replaceTokens;
4
+ exports.setUpConfig = setUpConfig;
5
+ function replaceTokens(str, env) {
2
6
  if (typeof str !== 'string')
3
7
  return str;
4
8
  return str.replace(/\$\{([^}]+)\}|\$([A-Z0-9_]+)/g, (m, k1, k2) => {
@@ -26,7 +30,7 @@ function filterEnvByPrefix(env, prefix) {
26
30
  }
27
31
  return filtered;
28
32
  }
29
- export function setUpConfig(config) {
33
+ function setUpConfig(config) {
30
34
  const { extra = {}, ...other } = config.expo || {};
31
35
  const router = extra['router'] ?
32
36
  { origin: false, ...extra['router'] }
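The change above is a module-format change only: `export function` becomes CommonJS `exports.*`, and the `${VAR}` / `$VAR` substitution regex is untouched. A hedged sketch of how `replaceTokens` is expected to behave given that regex; the replacement callback body is outside this hunk, so the handling of unresolved tokens is assumed (the old README's FAQ states that missing tokens stay literal):

```js
// The require path mirrors the dist file shown above; the package's public
// export map may expose it under a different subpath.
const { replaceTokens } = require('@mono-labs/cli/dist/cdk/index.js');

const env = { API_URL: 'https://api.example.test', REGION: 'us-east-1' };

// The regex matches ${API_URL} as well as bare $REGION (uppercase names only).
console.log(replaceTokens('curl ${API_URL}/health --region $REGION', env));
// expected: curl https://api.example.test/health --region us-east-1

// Assumption: a key with no value in env is left as-is rather than throwing.
console.log(replaceTokens('${not_defined}', env));
```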
package/dist/expo.js CHANGED
@@ -1,4 +1,8 @@
1
- export function replaceTokens(str, env) {
1
+ "use strict";
2
+ Object.defineProperty(exports, "__esModule", { value: true });
3
+ exports.replaceTokens = replaceTokens;
4
+ exports.setUpConfig = setUpConfig;
5
+ function replaceTokens(str, env) {
2
6
  if (typeof str !== 'string')
3
7
  return str;
4
8
  return str.replace(/\$\{([^}]+)\}|\$([A-Z0-9_]+)/g, (m, k1, k2) => {
@@ -26,7 +30,7 @@ function filterEnvByPrefix(env, prefix) {
26
30
  }
27
31
  return filtered;
28
32
  }
29
- export function setUpConfig(config) {
33
+ function setUpConfig(config) {
30
34
  const { extra = {}, ...other } = config.expo || {};
31
35
  const router = extra['router'] ?
32
36
  { origin: false, ...extra['router'] }
package/dist/merge-env.js CHANGED
@@ -1,14 +1,20 @@
1
- import fs from 'fs';
2
- import path from 'path';
3
- import dotenv from 'dotenv';
4
- export function loadMergedEnv() {
5
- const ENV_PATH = path.resolve(process.cwd(), '.env');
6
- const ENV_LOCAL_PATH = path.resolve(process.cwd(), '.env.local');
1
+ "use strict";
2
+ var __importDefault = (this && this.__importDefault) || function (mod) {
3
+ return (mod && mod.__esModule) ? mod : { "default": mod };
4
+ };
5
+ Object.defineProperty(exports, "__esModule", { value: true });
6
+ exports.loadMergedEnv = loadMergedEnv;
7
+ const fs_1 = __importDefault(require("fs"));
8
+ const path_1 = __importDefault(require("path"));
9
+ const dotenv_1 = __importDefault(require("dotenv"));
10
+ function loadMergedEnv() {
11
+ const ENV_PATH = path_1.default.resolve(process.cwd(), '.env');
12
+ const ENV_LOCAL_PATH = path_1.default.resolve(process.cwd(), '.env.local');
7
13
  // Load base .env
8
- const base = fs.existsSync(ENV_PATH) ? dotenv.parse(fs.readFileSync(ENV_PATH)) : {};
14
+ const base = fs_1.default.existsSync(ENV_PATH) ? dotenv_1.default.parse(fs_1.default.readFileSync(ENV_PATH)) : {};
9
15
  // Load overrides .env.local
10
- const local = fs.existsSync(ENV_LOCAL_PATH) ?
11
- dotenv.parse(fs.readFileSync(ENV_LOCAL_PATH))
16
+ const local = fs_1.default.existsSync(ENV_LOCAL_PATH) ?
17
+ dotenv_1.default.parse(fs_1.default.readFileSync(ENV_LOCAL_PATH))
12
18
  : {};
13
19
  // Merge: local overrides base
14
20
  const merged = {
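Here too, only the compilation target changes (ESM imports become `require` calls); the merge order is unchanged: `.env` is parsed first and `.env.local` overrides it, both resolved from `process.cwd()`. A usage sketch, assuming the function returns the merged key/value map (the tail of the function falls outside this hunk) and that the dist path below is reachable from the package:

```js
const { loadMergedEnv } = require('@mono-labs/cli/dist/merge-env.js');

// Parses .env and .env.local from the current working directory;
// keys defined in .env.local win over .env.
const env = loadMergedEnv();
console.log(env.API_URL); // from .env.local if present, otherwise from .env
```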
@@ -0,0 +1,277 @@
1
+ "use strict";
2
+ // scripts/generate-readme.ts
3
+ // Node >= 18 recommended
4
+ var __importDefault = (this && this.__importDefault) || function (mod) {
5
+ return (mod && mod.__esModule) ? mod : { "default": mod };
6
+ };
7
+ Object.defineProperty(exports, "__esModule", { value: true });
8
+ const node_fs_1 = require("node:fs");
9
+ const node_path_1 = __importDefault(require("node:path"));
10
+ const generate_docs_js_1 = require("./generate-docs.js");
11
+ /* -------------------------------------------------------------------------- */
12
+ /* Path helpers */
13
+ /* -------------------------------------------------------------------------- */
14
+ // Always use the working directory as the root for all file actions
15
+ const REPO_ROOT = node_path_1.default.resolve(process.cwd());
16
+ const MONO_DIR = node_path_1.default.join(REPO_ROOT, '.mono');
17
+ const ROOT_PKG_JSON = node_path_1.default.join(REPO_ROOT, 'package.json');
18
+ const OUTPUT_PATH = node_path_1.default.join(REPO_ROOT, 'docs');
19
+ const OUTPUT_README = node_path_1.default.join(OUTPUT_PATH, 'command-line.md');
20
+ /* -------------------------------------------------------------------------- */
21
+ /* Utils */
22
+ /* -------------------------------------------------------------------------- */
23
+ async function ensureParentDir(filePath) {
24
+ // Always resolve parent dir relative to working directory
25
+ const dir = node_path_1.default.resolve(process.cwd(), node_path_1.default.dirname(filePath));
26
+ await node_fs_1.promises.mkdir(dir, { recursive: true });
27
+ }
28
+ async function exists(p) {
29
+ try {
30
+ await node_fs_1.promises.access(p);
31
+ return true;
32
+ }
33
+ catch {
34
+ return false;
35
+ }
36
+ }
37
+ function isObject(v) {
38
+ return v !== null && typeof v === 'object' && !Array.isArray(v);
39
+ }
40
+ function toPosix(p) {
41
+ return p.split(node_path_1.default.sep).join('/');
42
+ }
43
+ async function readJson(filePath) {
44
+ // Always resolve filePath relative to working directory
45
+ const absPath = node_path_1.default.resolve(process.cwd(), filePath);
46
+ const raw = await node_fs_1.promises.readFile(absPath, 'utf8');
47
+ return JSON.parse(raw);
48
+ }
49
+ async function listDir(dir) {
50
+ // Always resolve dir relative to working directory
51
+ const absDir = node_path_1.default.resolve(process.cwd(), dir);
52
+ return node_fs_1.promises.readdir(absDir, { withFileTypes: true });
53
+ }
54
+ function normalizeWorkspacePatterns(workspacesField) {
55
+ if (Array.isArray(workspacesField))
56
+ return workspacesField;
57
+ if (isObject(workspacesField) &&
58
+ Array.isArray(workspacesField.packages)) {
59
+ return workspacesField.packages;
60
+ }
61
+ return [];
62
+ }
63
+ function mdEscapeInline(value) {
64
+ return String(value ?? '').replaceAll('`', '\\`');
65
+ }
66
+ function indentLines(s, spaces = 2) {
67
+ const pad = ' '.repeat(spaces);
68
+ return s
69
+ .split('\n')
70
+ .map((line) => pad + line)
71
+ .join('\n');
72
+ }
73
+ /* -------------------------------------------------------------------------- */
74
+ /* Workspace glob pattern expansion */
75
+ /* -------------------------------------------------------------------------- */
76
+ function matchSegment(patternSeg, name) {
77
+ if (patternSeg === '*')
78
+ return true;
79
+ if (!patternSeg.includes('*'))
80
+ return patternSeg === name;
81
+ const escaped = patternSeg.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
82
+ const regex = new RegExp(`^${escaped.replaceAll('*', '.*')}$`);
83
+ return regex.test(name);
84
+ }
85
+ async function expandWorkspacePattern(root, pattern) {
86
+ const segments = toPosix(pattern).split('/').filter(Boolean);
87
+ async function expandFrom(dir, index) {
88
+ // Always resolve dir relative to working directory
89
+ const absDir = node_path_1.default.resolve(process.cwd(), dir);
90
+ if (index >= segments.length)
91
+ return [absDir];
92
+ const seg = segments[index];
93
+ if (seg === '**') {
94
+ const results = [];
95
+ results.push(...(await expandFrom(absDir, index + 1)));
96
+ const entries = await node_fs_1.promises
97
+ .readdir(absDir, { withFileTypes: true })
98
+ .catch(() => []);
99
+ for (const entry of entries) {
100
+ if (!entry.isDirectory())
101
+ continue;
102
+ results.push(...(await expandFrom(node_path_1.default.join(absDir, entry.name), index)));
103
+ }
104
+ return results;
105
+ }
106
+ const entries = await node_fs_1.promises
107
+ .readdir(absDir, { withFileTypes: true })
108
+ .catch(() => []);
109
+ const results = [];
110
+ for (const entry of entries) {
111
+ if (!entry.isDirectory())
112
+ continue;
113
+ if (!matchSegment(seg, entry.name))
114
+ continue;
115
+ results.push(...(await expandFrom(node_path_1.default.join(absDir, entry.name), index + 1)));
116
+ }
117
+ return results;
118
+ }
119
+ const dirs = await expandFrom(root, 0);
120
+ const pkgDirs = [];
121
+ for (const d of dirs) {
122
+ if (await exists(node_path_1.default.join(d, 'package.json'))) {
123
+ pkgDirs.push(d);
124
+ }
125
+ }
126
+ return Array.from(new Set(pkgDirs));
127
+ }
128
+ async function findWorkspacePackageDirs(repoRoot, patterns) {
129
+ const dirs = [];
130
+ for (const pat of patterns) {
131
+ dirs.push(...(await expandWorkspacePattern(repoRoot, pat)));
132
+ }
133
+ return Array.from(new Set(dirs));
134
+ }
135
+ /* -------------------------------------------------------------------------- */
136
+ /* .mono configuration */
137
+ /* -------------------------------------------------------------------------- */
138
+ async function readMonoConfig() {
139
+ // Always resolve configPath relative to working directory
140
+ const configPath = node_path_1.default.resolve(process.cwd(), node_path_1.default.join(MONO_DIR, 'config.json'));
141
+ if (!(await exists(configPath)))
142
+ return null;
143
+ try {
144
+ const config = await readJson(configPath);
145
+ return { path: configPath, config };
146
+ }
147
+ catch {
148
+ return null;
149
+ }
150
+ }
151
+ function commandNameFromFile(filePath) {
152
+ return node_path_1.default.basename(filePath).replace(/\.json$/i, '');
153
+ }
154
+ async function readMonoCommands() {
155
+ // Always resolve MONO_DIR relative to working directory
156
+ const monoDirAbs = node_path_1.default.resolve(process.cwd(), MONO_DIR);
157
+ if (!(await exists(monoDirAbs)))
158
+ return [];
159
+ const entries = await listDir(monoDirAbs);
160
+ const jsonFiles = entries
161
+ .filter((e) => e.isFile() && e.name.endsWith('.json'))
162
+ .map((e) => node_path_1.default.join(monoDirAbs, e.name))
163
+ .filter((p) => node_path_1.default.basename(p) !== 'config.json');
164
+ const commands = [];
165
+ for (const file of jsonFiles) {
166
+ try {
167
+ const json = await readJson(file);
168
+ commands.push({
169
+ name: commandNameFromFile(file),
170
+ file,
171
+ json,
172
+ });
173
+ }
174
+ catch {
175
+ /* ignore invalid JSON */
176
+ }
177
+ }
178
+ return commands.sort((a, b) => a.name.localeCompare(b.name));
179
+ }
180
+ /* -------------------------------------------------------------------------- */
181
+ /* Options schema parsing */
182
+ /* -------------------------------------------------------------------------- */
183
+ function parseOptionsSchema(optionsObj) {
184
+ if (!isObject(optionsObj))
185
+ return [];
186
+ const entries = Object.entries(optionsObj).map(([key, raw]) => {
187
+ const o = isObject(raw) ? raw : {};
188
+ const hasType = typeof o.type === 'string' && o.type.length > 0;
189
+ return {
190
+ key,
191
+ kind: hasType ? 'value' : 'boolean',
192
+ type: hasType ? o.type : 'boolean',
193
+ description: typeof o.description === 'string' ? o.description : '',
194
+ shortcut: typeof o.shortcut === 'string' ? o.shortcut : '',
195
+ default: o.default,
196
+ allowed: Array.isArray(o.options) ? o.options : null,
197
+ allowAll: o.allowAll === true,
198
+ };
199
+ });
200
+ return entries.sort((a, b) => a.key.localeCompare(b.key));
201
+ }
202
+ /* -------------------------------------------------------------------------- */
203
+ /* Formatting */
204
+ /* -------------------------------------------------------------------------- */
205
+ function buildUsageExample(commandName, cmdJson, options) {
206
+ const arg = cmdJson.argument;
207
+ const hasArg = isObject(arg);
208
+ const parts = [`yarn mono ${commandName}`];
209
+ if (hasArg)
210
+ parts.push(`<${commandName}-arg>`);
211
+ const valueOpts = options.filter((o) => o.kind === 'value');
212
+ const boolOpts = options.filter((o) => o.kind === 'boolean');
213
+ for (const o of valueOpts.slice(0, 2)) {
214
+ const value = o.default !== undefined ?
215
+ String(o.default)
216
+ : (o.allowed?.[0] ?? '<value>');
217
+ parts.push(`--${o.key} ${value}`);
218
+ }
219
+ if (boolOpts[0]) {
220
+ parts.push(`--${boolOpts[0].key}`);
221
+ }
222
+ return parts.join(' ');
223
+ }
224
+ /* -------------------------------------------------------------------------- */
225
+ /* Main */
226
+ /* -------------------------------------------------------------------------- */
227
+ async function main() {
228
+ // Always resolve all paths relative to working directory
229
+ if (!(await exists(ROOT_PKG_JSON))) {
230
+ throw new Error(`Missing ${ROOT_PKG_JSON}`);
231
+ }
232
+ await ensureParentDir(OUTPUT_PATH);
233
+ const rootPkg = await readJson(ROOT_PKG_JSON);
234
+ const workspacePatterns = normalizeWorkspacePatterns(rootPkg.workspaces);
235
+ const monoConfig = await readMonoConfig();
236
+ const monoCommands = await readMonoCommands();
237
+ const pkgDirs = await findWorkspacePackageDirs(REPO_ROOT, workspacePatterns);
238
+ const packages = [];
239
+ for (const dir of pkgDirs) {
240
+ try {
241
+ const pkg = await readJson(node_path_1.default.join(dir, 'package.json'));
242
+ packages.push({
243
+ name: pkg.name ??
244
+ toPosix(node_path_1.default.relative(REPO_ROOT, dir)) ??
245
+ node_path_1.default.basename(dir),
246
+ dir,
247
+ scripts: pkg.scripts ?? {},
248
+ });
249
+ }
250
+ catch {
251
+ /* ignore */
252
+ }
253
+ }
254
+ const parts = [];
255
+ parts.push(`# Mono Command-Line Reference
256
+
257
+ > Generated by \`scripts/generate-readme.ts\`.
258
+
259
+ `);
260
+ // Reuse your existing formatters here
261
+ // (unchanged logic, now fully typed)
262
+ const docsIndex = await (0, generate_docs_js_1.generateDocsIndex)({
263
+ docsDir: node_path_1.default.join(REPO_ROOT, 'docs'),
264
+ excludeFile: 'command-line.md',
265
+ });
266
+ parts.push(docsIndex);
267
+ await ensureParentDir(OUTPUT_README);
268
+ await node_fs_1.promises.writeFile(OUTPUT_README, parts.join('\n'), 'utf8');
269
+ console.log(`Generated: ${OUTPUT_README}`);
270
+ console.log(`- mono config: ${monoConfig ? 'yes' : 'no'}`);
271
+ console.log(`- mono commands: ${monoCommands.length}`);
272
+ console.log(`- workspace packages: ${packages.length}`);
273
+ }
274
+ main().catch((err) => {
275
+ console.error(err instanceof Error ? err.stack : err);
276
+ process.exit(1);
277
+ });
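This hunk adds a README/docs generator (its file header is not shown in the diff). The `parseOptionsSchema` helper above reads each option's `type`, `description`, `shortcut`, `default`, `options` (allowed values), and `allowAll`, and `buildUsageExample` turns those into a sample invocation. A command file exercising those fields; names and values are illustrative, not taken from the package:

```json
// .mono/deploy.json (illustrative)
{
  "actions": ["echo Deploying to ${region}"],
  "options": {
    "region": {
      "type": "string",
      "description": "Target region",
      "shortcut": "r",
      "default": "us-east-1",
      "options": ["us-east-1", "eu-west-1"]
    },
    "verbose": { "description": "Verbose logging" }
  }
}
```

For this file, `buildUsageExample` would emit `yarn mono deploy --region us-east-1 --verbose`: value options with defaults come first, followed by the first boolean flag.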
@@ -0,0 +1,4 @@
1
+ "use strict";
2
+ Object.defineProperty(exports, "__esModule", { value: true });
3
+ require("./build-mono-readme");
4
+ require("./generate-readme");