@turboforge/cli-kit 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 turboforge-dev
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,130 @@
+ # @turboforge/cli-kit
+
+ Build monorepo-aware CLIs without rewriting config loading, workspace detection, and logging every time.
+
+ <p>
+ <a href="https://github.com/turboforge-dev/turboforge/actions/workflows/ci.yml" rel="noopener noreferrer">
+ <img alt="CI" src="https://github.com/turboforge-dev/turboforge/actions/workflows/ci.yml/badge.svg" />
+ </a>
+ <a href="https://codecov.io/gh/turboforge-dev/turboforge/tree/main/packages/@turboforge/cli-kit" rel="noopener noreferrer">
+ <img alt="codecov" src="https://codecov.io/gh/turboforge-dev/turboforge/graph/badge.svg?flag=@turboforge/cli-kit" />
+ </a>
+ <a href="https://npmjs.com/package/@turboforge/cli-kit" rel="noopener noreferrer">
+ <img alt="npm version" src="https://img.shields.io/npm/v/@turboforge/cli-kit" />
+ </a>
+ <a href="https://npmjs.com/package/@turboforge/cli-kit" rel="noopener noreferrer">
+ <img alt="npm downloads" src="https://img.shields.io/npm/d18m/@turboforge/cli-kit" />
+ </a>
+ <a href="https://npmjs.com/package/@turboforge/cli-kit" rel="noopener noreferrer">
+ <img alt="npm bundle size" src="https://img.shields.io/bundlephobia/minzip/@turboforge/cli-kit" />
+ </a>
+ <img alt="license" src="https://img.shields.io/npm/l/@turboforge/cli-kit" />
+ </p>
+
+ `@turboforge/cli-kit` exists because internal tooling usually starts as one script, then turns into five slightly different scripts with different root-detection rules, different config formats, and no shared mental model. This package gives Turboforge and downstream tools a common foundation.
+
+ Part of the Turboforge system:
+
+ - use [`@turboforge/sync`](https://npmjs.com/package/@turboforge/sync) when the problem is keeping a repo aligned with its upstream shape
+ - use `@turboforge/cli-kit` when the problem is building the repo-aware tools that operate inside that shape
+
+ ## Highlights
+
+ - Resolve layered config from defaults, files, env, and CLI input.
+ - Detect project roots and workspace packages without repo-specific glue code.
+ - Share one logging and runtime foundation across multiple repo tools.
+
+ ## Why It Exists
+
+ Most CLI helpers are either too generic to understand monorepos or too entangled with one app to reuse cleanly.
+
+ `@turboforge/cli-kit` focuses on the boring parts every serious repo tool needs:
+
+ - find the real project root
+ - discover workspace packages
+ - load layered config from the right place
+ - log in a way that works for both humans and automation
+
+ ## Real Example
+
+ You are building `repo doctor`, `release check`, and `docs sync` commands for the same workspace.
+
+ Without a shared kit, each command re-implements:
+
+ - "where is the repo root?"
+ - "which packages belong to this workspace?"
+ - "which config wins: default, file, env, or CLI flag?"
+
+ With `@turboforge/cli-kit`, those decisions become shared infrastructure instead of repeated code.
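The "which config wins" question has a single answer throughout the kit: later layers override earlier ones. A minimal sketch of that precedence rule (illustrative names only, not the package's internals):

```typescript
// Illustrative sketch of layered config precedence: merge layers left to
// right, so later layers override earlier ones. Not the package's actual code.
type Layer = Record<string, unknown>;

const resolveLayers = (...layers: Layer[]): Layer =>
  layers.reduce((acc, layer) => ({ ...acc, ...layer }), {});

const resolved = resolveLayers(
  { fix: false, format: "text" }, // built-in defaults (lowest priority)
  { format: "json" },             // config file
  { fix: true },                  // environment variables
  {},                             // CLI flags (highest priority)
);
// resolved → { fix: true, format: "json" }
```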
+
+ ## When To Use It
+
+ - You are building internal or OSS CLIs that need monorepo awareness.
+ - You want layered config resolution without writing a config loader from scratch.
+ - You need a small foundation, not a full CLI framework.
+
+ ## When Not To Use It
+
+ - You only need argument parsing.
+ - Your tool does not care about workspaces, repo roots, or shared config.
+ - You want a batteries-included command framework with prompts, subcommands, and plugin loading out of the box.
+
+ ## 📦 Installation
+
+ ```bash
+ pnpm add @turboforge/cli-kit
+ ```
+
+ **_or_**
+
+ ```bash
+ npm install @turboforge/cli-kit
+ ```
+
+ **_or_**
+
+ ```bash
+ yarn add @turboforge/cli-kit
+ ```
+
+ ### Optional Peer Dependencies
+
+ ```bash
+ pnpm add -D jiti defu
+ ```
+
+ - `jiti` lets you load TypeScript config files at runtime.
+ - `defu` gives you richer object merge behavior.
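The `defu` note deserves a concrete illustration: a shallow spread drops sibling keys inside nested objects, while a defaults-aware deep merge preserves them. The sketch below mimics the behavior `defu` provides; it is not `defu`'s implementation:

```typescript
// Deep defaults merge: values win over defaults, but nested default keys the
// value does not override are kept. Sketch only; install defu for the real thing.
const withDefaults = (
  value: Record<string, any>,
  defaults: Record<string, any>,
): Record<string, any> => {
  const out: Record<string, any> = { ...defaults, ...value };
  for (const key of Object.keys(value)) {
    const v = value[key];
    const d = defaults[key];
    if (
      v !== null && d !== null &&
      typeof v === "object" && typeof d === "object" &&
      !Array.isArray(v) && !Array.isArray(d)
    ) {
      out[key] = withDefaults(v, d); // recurse into nested objects
    }
  }
  return out;
};

const defaults = { logger: { level: "info", format: "text" } };
const shallow = { ...defaults, logger: { level: "debug" } };
// shallow.logger → { level: "debug" } — `format` is lost
const deep = withDefaults({ logger: { level: "debug" } }, defaults);
// deep.logger → { level: "debug", format: "text" } — `format` survives
```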
+
+ ## Example
+
+ ```ts
+ import {
+   createLogger,
+   findProjectRoot,
+   getWorkspacePackages,
+   resolveConfig,
+ } from "@turboforge/cli-kit";
+
+ const logger = createLogger({ level: "info", name: "repo-doctor" });
+ const root = await findProjectRoot();
+ const packages = await getWorkspacePackages(root);
+
+ const config = await resolveConfig({
+   name: "repo-doctor",
+   defaults: { fix: false },
+ });
+
+ logger.info(`checking ${packages.length} packages`, config.fix);
+ ```
+
+ ## What You Get
+
+ - `resolveConfig`: layered config for repo tools
+ - `findProjectRoot`: stable root detection
+ - `getWorkspacePackages`: workspace discovery
+ - `createLogger`: structured output for local use and automation
+
+ ## Ecosystem Fit
+
+ If Turboforge is about keeping a monorepo coherent, `@turboforge/cli-kit` is the layer you build that coherence on top of.
@@ -0,0 +1,251 @@
+ import { exec, execFile } from 'node:child_process';
+
+ interface ResolveConfigOptions<T> {
+   name: string;
+   cwd?: string;
+   defaults?: T;
+   envVars?: Partial<T>;
+   cliArgs?: Partial<T>;
+   configFile?: string;
+   /** If true, throws on configuration parsing errors. */
+   strict?: boolean;
+ }
+ /**
+  * Resolves configuration by walking up the tree and merging files.
+  */
+ declare const resolveConfig: <T>({ name, cwd, defaults, envVars, cliArgs, configFile, strict, }: ResolveConfigOptions<T>) => Promise<T>;
+ /**
+  * Type helper for defining config.
+  */
+ declare const defineConfig: <T>(config: T) => T;
+
+ /**
+  * Log severity levels.
+  *
+  * Ordered from lowest → highest severity.
+  * This ordering allows efficient numeric filtering.
+  */
+ type LogLevel = "debug" | "info" | "warn" | "error";
+ /**
+  * Logger configuration.
+  */
+ interface LoggerConfig {
+   /**
+    * Minimum severity level to output.
+    *
+    * Logs below this level will be ignored.
+    */
+   level: LogLevel;
+   /**
+    * Optional file path to append logs to.
+    *
+    * If provided, logs will be written to both terminal and file.
+    */
+   logFile?: string;
+   /**
+    * Log format.
+    *
+    * `text` - Human-readable format
+    * `json` - JSON format for machine parsing
+    *
+    * @default "text"
+    */
+   logFormat?: "text" | "json";
+   /**
+    * Optional name to prepend to all log messages.
+    * Useful for identifying the source of logs in multi-process environments.
+    */
+   name?: string;
+ }
+ /**
+  * Logger interface.
+  */
+ interface Logger {
+   debug: (...args: unknown[]) => void;
+   info: (...args: unknown[]) => void;
+   warn: (...args: unknown[]) => void;
+   error: (...args: unknown[]) => void;
+   /**
+    * Flushes and closes the file stream if active.
+    *
+    * Safe to call multiple times.
+    */
+   close: () => void;
+ }
+ /**
+  * Creates a minimal structured logger.
+  *
+  * Features:
+  * - Level-based filtering
+  * - Colored terminal output
+  * - Optional file logging
+  * - Automatic stream cleanup on process exit
+  *
+  * Designed for:
+  * - CLI tools
+  * - developer utilities
+  * - small Node services
+  *
+  * Example:
+  *
+  * ```ts
+  * const logger = createLogger({ level: "info", logFile: "./app.log" })
+  *
+  * logger.info("Server started", port)
+  * logger.warn("Cache miss")
+  * logger.error("Unhandled error", err)
+  * ```
+  */
+ declare const createLogger: (config: LoggerConfig) => Logger;
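The "numeric filtering" the `LogLevel` comment refers to can be made concrete. The bundled output maps levels to the weights below; a message is emitted only when its weight meets the configured threshold, so filtering is one integer comparison:

```typescript
// Level weights as they appear in the bundled dist output.
type Level = "debug" | "info" | "warn" | "error";
const WEIGHTS: Record<Level, number> = { debug: 20, info: 30, warn: 40, error: 50 };

// One numeric comparison decides whether a message passes the filter.
const shouldLog = (threshold: Level, message: Level): boolean =>
  WEIGHTS[message] >= WEIGHTS[threshold];
// shouldLog("info", "debug") → false; shouldLog("info", "error") → true
```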
+
+ /**
+  * Finds the project root directory based on common markers.
+  * Priority: .git > .changeset > pnpm-workspace.yaml > package.json (with workspaces)
+  */
+ declare const findProjectRoot: (cwd?: string) => Promise<string>;
+
+ /**
+  * Minimal YAML parser with fallback.
+  * Uses `yaml` package if available, else falls back to regex-based parsing
+  * for common monorepo config patterns (like pnpm-workspace.yaml).
+  */
+ declare function parseYaml<T>(content: string): Promise<T>;
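The regex fallback is worth seeing in full, since it explains what the parser can and cannot recover. This sketch mirrors the `packages:` block scraper visible in the bundled source; it is deliberately not a general YAML parser:

```typescript
// Recover the `packages:` list from pnpm-workspace.yaml-style content
// without a YAML dependency. Comments are stripped first.
const parsePackagesList = (content: string): string[] => {
  const stripped = content.replace(/#.*$/gm, "");
  const block = stripped.match(/packages:\s*\n((?:\s*-\s*.*\n?)*)/);
  if (!block?.[1]) return [];
  const items: string[] = [];
  const itemRe = /^\s*-\s*["']?([^"'\s]+)["']?/gm;
  let m: RegExpExecArray | null;
  while ((m = itemRe.exec(block[1])) !== null) items.push(m[1]);
  return items;
};

const names = parsePackagesList('packages:\n  - "packages/*"\n  - tools/cli # internal\n');
// names → ["packages/*", "tools/cli"]
```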
+
+ /**
+  * Promisified version of `child_process.exec`.
+  * Executes a command in a shell and buffers the output.
+  *
+  * @param command - The shell command to execute.
+  * @returns Promise resolving with `{ stdout, stderr }`.
+  */
+ declare const execAsync: typeof exec.__promisify__;
+ /**
+  * Promisified version of `child_process.execFile`.
+  * Executes an executable directly without spawning a shell.
+  *
+  * @param file - Path to executable.
+  * @param args - CLI arguments.
+  * @returns Promise resolving with `{ stdout, stderr }`.
+  */
+ declare const execFileAsync: typeof execFile.__promisify__;
+ /**
+  * Checks if a file or directory exists.
+  *
+  * Uses `fs.promises.access` for non-blocking I/O.
+  *
+  * Only `ENOENT` is interpreted as "does not exist".
+  * Other filesystem errors are rethrown.
+  *
+  * @param p - File system path to check.
+  */
+ declare const existsAsync: (p: string) => Promise<boolean>;
+ /**
+  * Recursively walks up the directory tree from `startDir`
+  * searching for a directory containing any of the `markers`.
+  *
+  * Results are cached to avoid redundant filesystem traversal.
+  *
+  * @param startDir - Directory to begin searching from.
+  * @param markers - File or directory names to detect.
+  *
+  * @returns Directory containing the marker, or `null`.
+  */
+ declare const findUp: (startDir: string, markers: string[]) => Promise<string | null>;
+ /**
+  * Options for parsing operations.
+  */
+ interface ParsingOptions {
+   /**
+    * Throw on syntax/import errors.
+    *
+    * @default false
+    */
+   strict?: boolean;
+ }
+ /**
+  * Reads and parses a JSON file safely.
+  *
+  * Returns `null` if the file does not exist or parsing fails
+  * (unless `strict` mode is enabled).
+  *
+  * @param p - JSON file path.
+  */
+ declare const readJson: <T = unknown>(p: string, options?: ParsingOptions) => Promise<T | null>;
+ /**
+  * Attempts to import a module.
+  *
+  * Supports:
+  * - Native dynamic `import()`
+  * - TypeScript configs via `jiti` (if available)
+  *
+  * @param p - Module path to import.
+  */
+ declare const tryImport: <T = unknown>(p: string, options?: ParsingOptions) => Promise<T | null>;
+ /**
+  * Deep merges two objects.
+  *
+  * Arrays are overwritten rather than concatenated.
+  * Guards against prototype pollution.
+  */
+ declare const deepMerge: (target: any, source: any) => any;
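The pollution guard visible in the bundled source works by skipping three dangerous keys before assignment. A behavioral sketch of the documented semantics:

```typescript
// Plain objects merge recursively; arrays and scalars are overwritten.
// The blocked keys keep crafted input from polluting Object.prototype.
const BLOCKED = new Set(["__proto__", "constructor", "prototype"]);

const deepMergeSketch = (target: any, source: any): any => {
  const isPlain = (v: any) =>
    typeof v === "object" && v !== null && !Array.isArray(v);
  if (!isPlain(target) || !isPlain(source)) return source;
  const out: Record<string, any> = { ...target };
  for (const key of Object.keys(source)) {
    if (BLOCKED.has(key)) continue; // prototype-pollution guard
    out[key] = key in target ? deepMergeSketch(target[key], source[key]) : source[key];
  }
  return out;
};
```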
+ /**
+  * Atomically writes a file using a temp-file + rename strategy.
+  *
+  * Guarantees readers never observe partially written files.
+  *
+  * Strategy:
+  * 1. Write data to a temporary file in the same directory.
+  * 2. Rename temp file to target (atomic on same filesystem).
+  *
+  * Why this matters:
+  * - Prevents partially written files.
+  * - Ensures readers never observe truncated JSON.
+  * - Safe under concurrent writers (last-writer-wins).
+  *
+  * Concurrency Model:
+  * - Each invocation uses a `randomUUID()` temp filename
+  *   to avoid cross-process collisions.
+  * - Rename is atomic on the same filesystem.
+  * - No locking is performed here.
+  *
+  * Limitations:
+  * - Does not prevent logical race conditions.
+  * - If two processes write simultaneously, the last rename wins.
+  * - Both temp and target must reside on the same filesystem
+  *   for atomic guarantees.
+  *
+  * @param path Target file path.
+  * @param data UTF-8 string content.
+  */
+ declare const atomicWrite: (path: string, data: string) => Promise<void>;
+ /**
+  * Safely renames a file or directory.
+  *
+  * Handles common Windows rename issues where the target
+  * already exists.
+  *
+  * @param from Source path.
+  * @param to Target path.
+  */
+ declare const safeRename: (from: string, to: string) => Promise<void>;
+ /**
+  * Creates a minimal promise concurrency limiter.
+  *
+  * Ensures no more than `concurrency` async tasks run simultaneously.
+  * Additional tasks are queued FIFO.
+  *
+  * @param concurrency - Maximum concurrent tasks (must be ≥1).
+  */
+ declare const createLimiter: (concurrency: number) => <T>(task: () => Promise<T>) => Promise<T>;
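The limiter the declaration describes is small enough to sketch in full: a counter plus a FIFO queue of wake-up callbacks, matching the shape visible in the bundled source:

```typescript
// Minimal promise concurrency limiter: at most `concurrency` tasks run at
// once; additional tasks wait in a FIFO queue for a slot to free up.
const createLimiterSketch = (concurrency: number) => {
  if (concurrency < 1) throw new Error("concurrency must be >= 1");
  let active = 0;
  const queue: Array<() => void> = [];
  const release = () => {
    active--;
    queue.shift()?.(); // wake the oldest waiter, if any
  };
  return async <T>(task: () => Promise<T>): Promise<T> => {
    if (active >= concurrency) {
      await new Promise<void>((wake) => queue.push(wake)); // wait for a slot
    }
    active++;
    try {
      return await task();
    } finally {
      release();
    }
  };
};
```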
+
+ /**
+  * Detects workspace packages in a monorepo.
+  * Supports `package.json` workspaces and `pnpm-workspace.yaml`.
+  */
+ declare const getWorkspacePackages: (root: string) => Promise<string[]>;
+ /**
+  * Checks if the current directory is inside a monorepo.
+  */
+ declare const isMonorepo: (cwd?: string) => Promise<boolean>;
+
+ export { type LogLevel, type Logger, type LoggerConfig, type ParsingOptions, type ResolveConfigOptions, atomicWrite, createLimiter, createLogger, deepMerge, defineConfig, execAsync, execFileAsync, existsAsync, findProjectRoot, findUp, getWorkspacePackages, isMonorepo, parseYaml, readJson, resolveConfig, safeRename, tryImport };
package/dist/index.js ADDED
@@ -0,0 +1,3 @@
+ "use strict";var X=Object.create;var v=Object.defineProperty;var H=Object.getOwnPropertyDescriptor;var K=Object.getOwnPropertyNames;var Q=Object.getPrototypeOf,Z=Object.prototype.hasOwnProperty;var rr=(r,t)=>{for(var e in t)v(r,e,{get:t[e],enumerable:!0})},_=(r,t,e,o)=>{if(t&&typeof t=="object"||typeof t=="function")for(let n of K(t))!Z.call(r,n)&&n!==e&&v(r,n,{get:()=>t[n],enumerable:!(o=H(t,n))||o.enumerable});return r};var T=(r,t,e)=>(e=r!=null?X(Q(r)):{},_(t||!r||!r.__esModule?v(e,"default",{value:r,enumerable:!0}):e,r)),tr=r=>_(v({},"__esModule",{value:!0}),r);var lr={};rr(lr,{atomicWrite:()=>nr,createLimiter:()=>J,createLogger:()=>ar,deepMerge:()=>R,defineConfig:()=>ir,execAsync:()=>er,execFileAsync:()=>or,existsAsync:()=>m,findProjectRoot:()=>M,findUp:()=>D,getWorkspacePackages:()=>cr,isMonorepo:()=>pr,parseYaml:()=>O,readJson:()=>h,resolveConfig:()=>sr,safeRename:()=>Y,tryImport:()=>I});module.exports=tr(lr);var G=require("fs/promises"),b=T(require("path"));var P=T(require("path"));var L=require("child_process"),U=require("crypto"),f=require("fs/promises"),u=require("path"),N=require("util");async function O(r){try{let n=await import("yaml").then(i=>i.default||i).catch(()=>null);if(n&&typeof n.parse=="function")return n.parse(r)}catch{}let t={},o=r.replace(/#.*$/gm,"").match(/packages:\s*\n((?:\s*-\s*.*\n?)*)/);if(o?.[1]){let n=[],i=/^\s*-\s*["']?([^"'\s]+)["']?/gm,a;for(;(a=i.exec(o[1]))!==null;)n.push(a[1]);t.packages=n}return t}var er=(0,N.promisify)(L.exec),or=(0,N.promisify)(L.execFile),m=async r=>{try{return await(0,f.access)(r),!0}catch(t){if(t.code==="ENOENT")return!1;throw t}},y=new Map,W=200,D=async(r,t)=>{let e=(0,u.resolve)(r),o=`${e}:${[...t].sort().join(",")}`;if(y.has(o))return y.get(o);let n=e,{root:i}=(0,u.parse)(n);for(;;){for(let a of t)if(await m((0,u.join)(n,a)))return y.set(o,n),y.size>W&&y.clear(),n;if(n===i)break;n=(0,u.dirname)(n)}return y.set(o,null),y.size>W&&y.clear(),null},h=async(r,t={})=>{try{let 
e=await(0,f.readFile)(r,"utf-8");return JSON.parse(e)}catch(e){if(e.code==="ENOENT")return null;if(t.strict)throw e;return null}},I=async(r,t={})=>{if(!await m(r))return null;try{let e=await import("jiti"),n=(e.createJiti?e.createJiti(process.cwd()):e.default(process.cwd()))(r);return n.default??n}catch{try{let e=await import(r);return e.default??e}catch(e){if(t.strict)throw/\.(ts|mts)$/.test(r)?new Error(`Failed to load TypeScript config at ${r}. Install 'jiti' to load TS configs. Original error: ${e}`):e;return null}}},R=(r,t)=>{if(typeof r!="object"||r===null||typeof t!="object"||t===null||Array.isArray(r)&&Array.isArray(t))return t;let e={...r};for(let o of Object.keys(t))o==="__proto__"||o==="constructor"||o==="prototype"||Object.hasOwn(t,o)&&(e[o]=o in r?R(r[o],t[o]):t[o]);return e},nr=async(r,t)=>{let e=(0,u.join)((0,u.dirname)(r),`${(0,U.randomUUID)()}.tmp`);try{await(0,f.writeFile)(e,t,"utf-8"),await Y(e,r)}catch(o){throw await(0,f.rm)(e,{force:!0}).catch(()=>{}),o}},Y=async(r,t)=>{if(await m(r))try{await(0,f.rename)(r,t)}catch(e){let o=e;if(o.code==="EEXIST"||o.code==="EPERM"){await(0,f.rm)(t,{recursive:!0,force:!0}),await(0,f.rename)(r,t);return}throw e}},J=r=>{if(r<1)throw new Error("createLimiter: concurrency must be >= 1");let t=0,e=[],o=()=>{t--,e.shift()?.()};return async n=>{t>=r&&await new Promise(i=>e.push(i)),t++;try{return await n()}finally{o()}}};var $=new Map,M=async(r=process.cwd())=>{let t=P.default.resolve(r);if($.has(t))return $.get(t);let e=await D(t,[".git",".changeset","pnpm-workspace.yaml"]);if(e)return $.set(t,e),e;let o=P.default.resolve(r),{root:n}=P.default.parse(o);for(;;){let i=P.default.join(o,"package.json"),a=await h(i);if(a?.workspaces&&Array.isArray(a.workspaces))return $.set(t,o),o;if(o===n)break;o=P.default.dirname(o)}return $.set(t,t),t};var sr=async({name:r,cwd:t=process.cwd(),defaults:e={},envVars:o={},cliArgs:n={},configFile:i,strict:a=!1})=>{let E=await M(t),l=[];if(i)l.push(b.default.resolve(t,i));else{let 
s=b.default.resolve(t),g=new RegExp(`^${r}\\.config\\.(ts|mts|js|mjs|json)$`);for(;;){try{let j=(await(0,G.readdir)(s)).find(F=>g.test(F));j&&l.push(b.default.join(s,j))}catch{}if(s===E)break;let x=b.default.dirname(s);if(x===s)break;s=x}}let c=[];for(let s of l.reverse())if(s.endsWith(".json")){let g=await h(s,{strict:a});g&&c.push(g)}else{let g=await I(s,{strict:a});g&&c.push(g)}let d=R;try{let s=await import("defu");s?.defu&&(d=(g,x)=>s.defu(x,g))}catch{}let p=e;for(let s of c)p=d(p,s);return p=d(p,o),p=d(p,n),p},ir=r=>r;var B=require("fs"),V=T(require("os")),k="\x1B[",A={gray:r=>`${k}90m${r}${k}39m`,blue:r=>`${k}34m${r}${k}39m`,yellow:r=>`${k}33m${r}${k}39m`,red:r=>`${k}31m${r}${k}39m`},q={debug:20,info:30,warn:40,error:50},z={debug:A.gray,info:A.blue,warn:A.yellow,error:A.red},ar=r=>{let t=q[r.level],e=r.logFormat??"text",o=r.name,n=o?{debug:`${o}:DEBUG`,info:`${o}:INFO`,warn:`${o}:WARN`,error:`${r.name}:ERROR`}:{debug:"DEBUG",info:"INFO",warn:"WARN",error:"ERROR"},i=null,a=!1,E=process.stdout.isTTY&&!process.env.NO_COLOR||!!process.env.FORCE_COLOR;if(r.logFile)try{i=(0,B.createWriteStream)(r.logFile,{flags:"a"}),i.on("error",()=>{i=null})}catch{i=null}let l=()=>{if(!a&&(a=!0,i)){try{i.end()}catch{}i=null}};process.once("exit",l),process.once("SIGINT",()=>{l(),process.exit(0)}),process.once("SIGTERM",()=>{l(),process.exit(0)});let c=process.pid,d=V.default.hostname(),p=(s,...g)=>{if(q[s]<t)return;let x=new Date().toISOString(),C=g.map(String).join(" "),j=e==="json"?JSON.stringify({ts:x,level:s,message:C,pid:c,hostname:d,name:o}):`[${x}] [${n[s]}] ${C}`,F=E&&z[s]?z[s](j):j;if((s==="warn"||s==="error"?process.stderr:process.stdout).write(`${F}
+ `),i)try{i.write(`${j}
+ `)}catch{i=null}};return{debug:(...s)=>p("debug",...s),info:(...s)=>p("info",...s),warn:(...s)=>p("warn",...s),error:(...s)=>p("error",...s),close:l}};var S=require("fs/promises"),w=T(require("path"));var cr=async r=>{let t=[],e=w.default.join(r,"package.json"),o=await h(e);o?.workspaces&&(Array.isArray(o.workspaces)?t.push(...o.workspaces):Array.isArray(o.workspaces.packages)&&t.push(...o.workspaces.packages));let n=w.default.join(r,"pnpm-workspace.yaml");if(await m(n))try{let l=await(0,S.readFile)(n,"utf-8"),c=await O(l);c?.packages&&t.push(...c.packages)}catch{}let i=J(10),a=[],E=[...new Set(t)];return await Promise.all(E.map(l=>i(async()=>{if(l.endsWith("/*")){let c=w.default.join(r,l.slice(0,-2));if(await m(c))try{let d=await(0,S.readdir)(c,{withFileTypes:!0});for(let p of d)p.isDirectory()&&await m(w.default.join(c,p.name,"package.json"))&&a.push(w.default.join(c,p.name))}catch{}}else{let c=w.default.join(r,l);await m(w.default.join(c,"package.json"))&&a.push(c)}}))),[...new Set(a)]},pr=async(r=process.cwd())=>{let t=w.default.join(r,"package.json");return!!((await h(t))?.workspaces||await m(w.default.join(r,"pnpm-workspace.yaml")))};0&&(module.exports={atomicWrite,createLimiter,createLogger,deepMerge,defineConfig,execAsync,execFileAsync,existsAsync,findProjectRoot,findUp,getWorkspacePackages,isMonorepo,parseYaml,readJson,resolveConfig,safeRename,tryImport});
package/dist/index.mjs ADDED
@@ -0,0 +1,3 @@
+ import{readdir as H}from"fs/promises";import P from"path";import T from"path";import{exec as W,execFile as U}from"child_process";import{randomUUID as Y}from"crypto";import{access as G,readFile as q,rename as L,rm as A,writeFile as z}from"fs/promises";import{dirname as S,join as C,parse as B,resolve as V}from"path";import{promisify as F}from"util";async function v(r){try{let s=await import("yaml").then(i=>i.default||i).catch(()=>null);if(s&&typeof s.parse=="function")return s.parse(r)}catch{}let t={},e=r.replace(/#.*$/gm,"").match(/packages:\s*\n((?:\s*-\s*.*\n?)*)/);if(e?.[1]){let s=[],i=/^\s*-\s*["']?([^"'\s]+)["']?/gm,a;for(;(a=i.exec(e[1]))!==null;)s.push(a[1]);t.packages=s}return t}var cr=F(W),pr=F(U),g=async r=>{try{return await G(r),!0}catch(t){if(t.code==="ENOENT")return!1;throw t}},w=new Map,R=200,N=async(r,t)=>{let o=V(r),e=`${o}:${[...t].sort().join(",")}`;if(w.has(e))return w.get(e);let s=o,{root:i}=B(s);for(;;){for(let a of t)if(await g(C(s,a)))return w.set(e,s),w.size>R&&w.clear(),s;if(s===i)break;s=S(s)}return w.set(e,null),w.size>R&&w.clear(),null},h=async(r,t={})=>{try{let o=await q(r,"utf-8");return JSON.parse(o)}catch(o){if(o.code==="ENOENT")return null;if(t.strict)throw o;return null}},D=async(r,t={})=>{if(!await g(r))return null;try{let o=await import("jiti"),s=(o.createJiti?o.createJiti(process.cwd()):o.default(process.cwd()))(r);return s.default??s}catch{try{let o=await import(r);return o.default??o}catch(o){if(t.strict)throw/\.(ts|mts)$/.test(r)?new Error(`Failed to load TypeScript config at ${r}. Install 'jiti' to load TS configs. 
Original error: ${o}`):o;return null}}},O=(r,t)=>{if(typeof r!="object"||r===null||typeof t!="object"||t===null||Array.isArray(r)&&Array.isArray(t))return t;let o={...r};for(let e of Object.keys(t))e==="__proto__"||e==="constructor"||e==="prototype"||Object.hasOwn(t,e)&&(o[e]=e in r?O(r[e],t[e]):t[e]);return o},lr=async(r,t)=>{let o=C(S(r),`${Y()}.tmp`);try{await z(o,t,"utf-8"),await X(o,r)}catch(e){throw await A(o,{force:!0}).catch(()=>{}),e}},X=async(r,t)=>{if(await g(r))try{await L(r,t)}catch(o){let e=o;if(e.code==="EEXIST"||e.code==="EPERM"){await A(t,{recursive:!0,force:!0}),await L(r,t);return}throw o}},I=r=>{if(r<1)throw new Error("createLimiter: concurrency must be >= 1");let t=0,o=[],e=()=>{t--,o.shift()?.()};return async s=>{t>=r&&await new Promise(i=>o.push(i)),t++;try{return await s()}finally{e()}}};var j=new Map,J=async(r=process.cwd())=>{let t=T.resolve(r);if(j.has(t))return j.get(t);let o=await N(t,[".git",".changeset","pnpm-workspace.yaml"]);if(o)return j.set(t,o),o;let e=T.resolve(r),{root:s}=T.parse(e);for(;;){let i=T.join(e,"package.json"),a=await h(i);if(a?.workspaces&&Array.isArray(a.workspaces))return j.set(t,e),e;if(e===s)break;e=T.dirname(e)}return j.set(t,t),t};var xr=async({name:r,cwd:t=process.cwd(),defaults:o={},envVars:e={},cliArgs:s={},configFile:i,strict:a=!1})=>{let x=await J(t),l=[];if(i)l.push(P.resolve(t,i));else{let n=P.resolve(t),f=new RegExp(`^${r}\\.config\\.(ts|mts|js|mjs|json)$`);for(;;){try{let k=(await H(n)).find(b=>f.test(b));k&&l.push(P.join(n,k))}catch{}if(n===x)break;let y=P.dirname(n);if(y===n)break;n=y}}let c=[];for(let n of l.reverse())if(n.endsWith(".json")){let f=await h(n,{strict:a});f&&c.push(f)}else{let f=await D(n,{strict:a});f&&c.push(f)}let m=O;try{let n=await import("defu");n?.defu&&(m=(f,y)=>n.defu(y,f))}catch{}let p=o;for(let n of c)p=m(p,n);return p=m(p,e),p=m(p,s),p},Tr=r=>r;import{createWriteStream as K}from"fs";import Q from"os";var 
d="\x1B[",E={gray:r=>`${d}90m${r}${d}39m`,blue:r=>`${d}34m${r}${d}39m`,yellow:r=>`${d}33m${r}${d}39m`,red:r=>`${d}31m${r}${d}39m`},M={debug:20,info:30,warn:40,error:50},_={debug:E.gray,info:E.blue,warn:E.yellow,error:E.red},$r=r=>{let t=M[r.level],o=r.logFormat??"text",e=r.name,s=e?{debug:`${e}:DEBUG`,info:`${e}:INFO`,warn:`${e}:WARN`,error:`${r.name}:ERROR`}:{debug:"DEBUG",info:"INFO",warn:"WARN",error:"ERROR"},i=null,a=!1,x=process.stdout.isTTY&&!process.env.NO_COLOR||!!process.env.FORCE_COLOR;if(r.logFile)try{i=K(r.logFile,{flags:"a"}),i.on("error",()=>{i=null})}catch{i=null}let l=()=>{if(!a&&(a=!0,i)){try{i.end()}catch{}i=null}};process.once("exit",l),process.once("SIGINT",()=>{l(),process.exit(0)}),process.once("SIGTERM",()=>{l(),process.exit(0)});let c=process.pid,m=Q.hostname(),p=(n,...f)=>{if(M[n]<t)return;let y=new Date().toISOString(),$=f.map(String).join(" "),k=o==="json"?JSON.stringify({ts:y,level:n,message:$,pid:c,hostname:m,name:e}):`[${y}] [${s[n]}] ${$}`,b=x&&_[n]?_[n](k):k;if((n==="warn"||n==="error"?process.stderr:process.stdout).write(`${b}
2
+ `),i)try{i.write(`${k}
3
+ `)}catch{i=null}};return{debug:(...n)=>p("debug",...n),info:(...n)=>p("info",...n),warn:(...n)=>p("warn",...n),error:(...n)=>p("error",...n),close:l}};import{readdir as Z,readFile as rr}from"fs/promises";import u from"path";var Rr=async r=>{let t=[],o=u.join(r,"package.json"),e=await h(o);e?.workspaces&&(Array.isArray(e.workspaces)?t.push(...e.workspaces):Array.isArray(e.workspaces.packages)&&t.push(...e.workspaces.packages));let s=u.join(r,"pnpm-workspace.yaml");if(await g(s))try{let l=await rr(s,"utf-8"),c=await v(l);c?.packages&&t.push(...c.packages)}catch{}let i=I(10),a=[],x=[...new Set(t)];return await Promise.all(x.map(l=>i(async()=>{if(l.endsWith("/*")){let c=u.join(r,l.slice(0,-2));if(await g(c))try{let m=await Z(c,{withFileTypes:!0});for(let p of m)p.isDirectory()&&await g(u.join(c,p.name,"package.json"))&&a.push(u.join(c,p.name))}catch{}}else{let c=u.join(r,l);await g(u.join(c,"package.json"))&&a.push(c)}}))),[...new Set(a)]},Ar=async(r=process.cwd())=>{let t=u.join(r,"package.json");return!!((await h(t))?.workspaces||await g(u.join(r,"pnpm-workspace.yaml")))};export{lr as atomicWrite,I as createLimiter,$r as createLogger,O as deepMerge,Tr as defineConfig,cr as execAsync,pr as execFileAsync,g as existsAsync,J as findProjectRoot,N as findUp,Rr as getWorkspacePackages,Ar as isMonorepo,v as parseYaml,h as readJson,xr as resolveConfig,X as safeRename,D as tryImport};
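Both bundles share a `deepMerge` helper (mangled as `O` above) that the config resolver uses as its fallback when the optional `defu` peer dependency is not installed. Deminified, it merges plain objects recursively, lets arrays and primitives from the override replace the base outright, and skips `__proto__`, `constructor`, and `prototype` keys to guard against prototype pollution. A sketch with descriptive stand-in names:

```javascript
// Readable sketch of the bundled deepMerge (`O`): recursive object merge
// with override-wins semantics for arrays/primitives and a prototype-
// pollution guard on dangerous keys.
const deepMerge = (base, override) => {
  if (
    typeof base !== "object" || base === null ||
    typeof override !== "object" || override === null ||
    (Array.isArray(base) && Array.isArray(override))
  ) {
    return override; // non-objects and array pairs: override replaces base
  }
  const out = { ...base };
  for (const key of Object.keys(override)) {
    // Skip keys that could pollute Object.prototype when merging
    // untrusted config files.
    if (key === "__proto__" || key === "constructor" || key === "prototype") continue;
    out[key] = key in base ? deepMerge(base[key], override[key]) : override[key];
  }
  return out;
};
```

The resolver applies this left to right over defaults, discovered config files, environment variables, and CLI args, so later layers win key by key.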
package/package.json ADDED
@@ -0,0 +1,91 @@
1
+ {
2
+ "name": "@turboforge/cli-kit",
3
+ "author": "Mayank Kumar Chaudhari <https://mayankchaudhari.com>",
4
+ "private": false,
5
+ "version": "1.0.0",
6
+ "description": "Low-level utilities and configuration resolution for building high-performance CLI tools in monorepos.",
7
+ "license": "MIT",
8
+ "main": "./dist/index.js",
9
+ "module": "./dist/index.mjs",
10
+ "types": "./dist/index.d.ts",
11
+ "exports": {
12
+ ".": {
13
+ "import": {
14
+ "types": "./dist/index.d.mts",
15
+ "default": "./dist/index.mjs"
16
+ },
17
+ "require": {
18
+ "types": "./dist/index.d.ts",
19
+ "default": "./dist/index.js"
20
+ }
21
+ },
22
+ "./package.json": "./package.json"
23
+ },
24
+ "forge": {
25
+ "icon": "Terminal",
26
+ "description": "CLI & Config Utilities",
27
+ "aliases": [
28
+ "@turbo-forge/cli-kit"
29
+ ]
30
+ },
31
+ "repository": {
32
+ "type": "git",
33
+ "url": "https://github.com/turboforge-dev/turboforge",
34
+ "directory": "packages/cli-kit"
35
+ },
36
+ "bugs": "https://github.com/turboforge-dev/turboforge/issues",
37
+ "homepage": "https://github.com/turboforge-dev/turboforge/blob/main/packages/cli-kit/README.md",
38
+ "sideEffects": false,
39
+ "files": [
40
+ "dist/**"
41
+ ],
42
+ "devDependencies": {
43
+ "@types/node": "^25.5.0",
44
+ "defu": "^6.1.4",
45
+ "jiti": "^2.6.1",
46
+ "tsup": "^8.5.1",
47
+ "typescript": "^5.9.3"
48
+ },
49
+ "funding": [
50
+ {
51
+ "type": "github",
52
+ "url": "https://github.com/sponsors/turboforge-dev"
53
+ },
54
+ {
55
+ "type": "github",
56
+ "url": "https://github.com/sponsors/mayank1513"
57
+ }
58
+ ],
59
+ "peerDependencies": {
60
+ "defu": "^6.0.0",
61
+ "jiti": "^2.0.0",
62
+ "yaml": "^2.0.0"
63
+ },
64
+ "peerDependenciesMeta": {
65
+ "defu": {
66
+ "optional": true
67
+ },
68
+ "jiti": {
69
+ "optional": true
70
+ },
71
+ "yaml": {
72
+ "optional": true
73
+ }
74
+ },
75
+ "keywords": [
76
+ "turboforge",
77
+ "cli",
78
+ "monorepo",
79
+ "config-loader",
80
+ "workspace-detection",
81
+ "logger",
82
+ "typescript",
83
+ "node-js",
84
+ "tooling"
85
+ ],
86
+ "scripts": {
87
+ "build": "tsup && gzip -c dist/index.js | wc -c",
88
+ "clean": "rm -rf dist",
89
+ "dev": "tsup --watch"
90
+ }
91
+ }