@grainulation/grainulation 1.0.0 → 1.0.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,85 +1,71 @@
-# grainulation
+<p align="center">
+  <img src="site/wordmark.svg" alt="grainulation" width="400">
+</p>
 
-**Structured research for decisions that satisfice.**
+<p align="center">
+  <a href="https://www.npmjs.com/package/@grainulation/grainulation"><img src="https://img.shields.io/npm/v/@grainulation/grainulation" alt="npm version"></a>
+  <a href="https://www.npmjs.com/package/@grainulation/grainulation"><img src="https://img.shields.io/npm/dm/@grainulation/grainulation" alt="npm downloads"></a>
+  <a href="https://github.com/grainulation/grainulation/blob/main/LICENSE"><img src="https://img.shields.io/badge/license-MIT-green" alt="license"></a>
+  <a href="https://nodejs.org"><img src="https://img.shields.io/node/v/@grainulation/grainulation" alt="node"></a>
+  <a href="https://github.com/grainulation/grainulation/actions"><img src="https://github.com/grainulation/grainulation/actions/workflows/ci.yml/badge.svg" alt="CI"></a>
+  <a href="https://deepwiki.com/grainulation/grainulation"><img src="https://deepwiki.com/badge.svg" alt="Explore on DeepWiki"></a>
+</p>
 
----
+<p align="center"><strong>Structured research for decisions that satisfice.</strong></p>
 
-## Slow down and trust the process
+Most decisions fail not because the team lacked data, but because they lacked a process for turning data into evidence and evidence into conviction. Grainulation is that process.
 
-Most decisions fail not because the team lacked data, but because they lacked a process for turning data into evidence and evidence into conviction.
+You start with a question. You grow evidence: claims with types, confidence levels, and evidence tiers. You challenge what you find. You look for blind spots. And only when the evidence compiles -- when conflicts are resolved and gaps are acknowledged -- do you write the brief.
 
-Grainulation is that process.
+## Install
 
-You start with a question. Not an answer, not a hypothesis -- a question. Then you grow evidence: claims with types, confidence levels, and evidence tiers. You challenge what you find. You look for blind spots. You corroborate with external sources. And only when the evidence compiles -- when conflicts are resolved and gaps are acknowledged -- do you write the brief.
-
-The brief is not the goal. The brief is the receipt. The goal is the thinking that got you there.
-
-## The journey
-
-```mermaid
-flowchart LR
-    Q["Question"] -->|"/init"| S["Seed Claims"]
-    S -->|"/research"| C["Grow Evidence"]
-    C -->|"/compile"| B["Compile Brief"]
-    C -->|"/challenge /blind-spot"| A["Adversarial Pressure"]
-    A -->|"/witness /feedback"| C
+```bash
+npm install -g @grainulation/grainulation
 ```
 
-Every step is tracked. Every claim has provenance. Every decision is reproducible.
-
-## The ecosystem
-
-Eight tools. Each does one thing. Use what you need.
-
-| Tool | What it does | Install |
-|------|-------------|---------|
-| **wheat** | Grows evidence. Research sprint engine. | `npx @grainulation/wheat init` |
-| **farmer** | Permission dashboard. Approve AI actions in real time. | `npm i -g @grainulation/farmer` |
-| **barn** | Shared tools. Claim schemas, templates, validators. | `npm i -g @grainulation/barn` |
-| **mill** | Processes output. Export to PDF, slides, wiki. | `npm i -g @grainulation/mill` |
-| **silo** | Stores knowledge. Reusable claim libraries. | `npm i -g @grainulation/silo` |
-| **harvest** | Analytics. Cross-sprint learning and prediction scoring. | `npm i -g @grainulation/harvest` |
-| **orchard** | Orchestration. Multi-sprint coordination. | `npm i -g @grainulation/orchard` |
-| **grainulation** | The machine. Unified CLI and brand. | `npm i -g grainulation` |
-
-**You don't need all eight.** Start with wheat. That's it. One command:
+Or start a research sprint directly:
 
 ```bash
 npx @grainulation/wheat init
 ```
 
-Everything else is optional. Add tools when you feel the friction.
-
 ## Quick start
 
 ```bash
-# Start a research sprint
-npx @grainulation/wheat init
+grainulation                 # Ecosystem overview
+grainulation doctor          # Health check: which tools, which versions
+grainulation setup           # Install the right tools for your role
+grainulation wheat init      # Delegate to any tool
+grainulation farmer start
+```
 
-# Or install the unified CLI first
-npm install -g @grainulation/grainulation
+## The ecosystem
 
-# See what's installed
-grainulation doctor
+Eight tools. Each does one thing. Use what you need.
 
-# Interactive setup based on your role
-grainulation setup
+| Tool | What it does | Install |
+| ------------------------------------------------------------ | ----------------------------------------------------------------------------- | ------------------------------------- |
+| [wheat](https://github.com/grainulation/wheat) | Research engine. Grow structured evidence. | `npx @grainulation/wheat init` |
+| [farmer](https://github.com/grainulation/farmer) | Permission dashboard. Approve AI actions in real time (admin + viewer roles). | `npm i -g @grainulation/farmer` |
+| [barn](https://github.com/grainulation/barn) | Shared tools. Templates, validators, sprint detection. | `npm i -g @grainulation/barn` |
+| [mill](https://github.com/grainulation/mill) | Format conversion. Export to PDF, CSV, slides, 24 formats. | `npm i -g @grainulation/mill` |
+| [silo](https://github.com/grainulation/silo) | Knowledge storage. Reusable claim libraries and packs. | `npm i -g @grainulation/silo` |
+| [harvest](https://github.com/grainulation/harvest) | Analytics. Cross-sprint patterns and prediction scoring. | `npm i -g @grainulation/harvest` |
+| [orchard](https://github.com/grainulation/orchard) | Orchestration. Multi-sprint coordination and dependencies. | `npm i -g @grainulation/orchard` |
+| [grainulation](https://github.com/grainulation/grainulation) | Unified CLI. Single entry point to the ecosystem. | `npm i -g @grainulation/grainulation` |
 
-# Delegate to any tool
-grainulation wheat init
-grainulation farmer start
-```
+**You don't need all eight.** Start with wheat. That's it. One command. Everything else is optional -- add tools when you feel the friction.
 
-## The unified CLI
+## The journey
 
-```bash
-grainulation                 # Ecosystem overview
-grainulation doctor          # Health check: which tools, which versions
-grainulation setup           # Install the right tools for your role
-grainulation <tool> ...      # Delegate to any grainulation tool
+```
+Question --> Seed Claims --> Grow Evidence --> Compile Brief
+    /init        /research      /challenge        /brief
+                                /blind-spot
+                                /witness
 ```
 
-The CLI is the wayfinder. It doesn't do the work -- it points you to the tool that does.
+Every step is tracked. Every claim has provenance. Every decision is reproducible.
 
 ## Philosophy
 
@@ -87,18 +73,16 @@ The CLI is the wayfinder. It doesn't do the work -- it points you to the tool th
 
 **Claims over opinions.** Every finding is a typed claim with an evidence tier. "I think" becomes "r003: factual, tested -- measured 340ms p95 latency under load."
 
-**Adversarial pressure over consensus.** The `/challenge` command exists because comfortable agreement is the enemy of good decisions. If nobody is stress-testing the claims, the research isn't done.
+**Adversarial pressure over consensus.** The `/challenge` command exists because comfortable agreement is the enemy of good decisions.
 
 **Process over heroics.** A reproducible sprint that anyone can pick up beats a brilliant analysis that lives in one person's head.
 
 ## Zero dependencies
 
-Every grainulation tool runs on Node built-ins only. No npm install waterfall. No left-pad. No supply chain anxiety. Just `node`, `fs`, `http`, and `crypto`.
+Every grainulation tool runs on Node built-ins only. No npm install waterfall. No left-pad. No supply chain anxiety.
 
 ## The name
 
-The name comes last.
-
 You build the crop (wheat), the steward (farmer), the barn, the mill, the silo, the harvest, the orchard -- and only then do you name the machine that connects them all.
 
 Grainulation: the machine that processes the grain.
@@ -1,7 +1,5 @@
 #!/usr/bin/env node
 
-'use strict';
-
 /**
  * grainulation
  *
@@ -16,37 +14,38 @@
  * grainulation <tool> [args]    Delegate to a grainulation tool
  */
 
-const verbose = process.argv.includes('--verbose') || process.argv.includes('-v');
-const jsonMode = process.argv.includes('--json');
+const verbose =
+  process.argv.includes("--verbose") || process.argv.includes("-v");
+const jsonMode = process.argv.includes("--json");
 
 function vlog(...a) {
   if (!verbose) return;
   const ts = new Date().toISOString();
-  process.stderr.write(`[${ts}] grainulation: ${a.join(' ')}\n`);
+  process.stderr.write(`[${ts}] grainulation: ${a.join(" ")}\n`);
 }
 
 const command = process.argv[2];
-vlog('startup', `command=${command || '(none)'}`, `cwd=${process.cwd()}`);
+vlog("startup", `command=${command || "(none)"}`, `cwd=${process.cwd()}`);
 
 // Serve command — start the HTTP server (ESM module)
-if (command === 'serve') {
-  const path = require('node:path');
-  const { spawn } = require('node:child_process');
-  const serverPath = path.join(__dirname, '..', 'lib', 'server.mjs');
+if (command === "serve") {
+  const path = require("node:path");
+  const { spawn } = require("node:child_process");
+  const serverPath = path.join(__dirname, "..", "lib", "server.mjs");
 
   // Forward remaining args to the server
   const serverArgs = process.argv.slice(3);
   const child = spawn(process.execPath, [serverPath, ...serverArgs], {
-    stdio: 'inherit',
+    stdio: "inherit",
   });
 
-  child.on('close', (code) => process.exit(code ?? 0));
-  child.on('error', (err) => {
+  child.on("close", (code) => process.exit(code ?? 0));
+  child.on("error", (err) => {
    console.error(`grainulation: failed to start server: ${err.message}`);
    process.exit(1);
  });
 } else {
-  const { route } = require('../lib/router');
+  const { route } = require("../lib/router");
   // Strip --json from args before routing (it's handled as a mode flag)
-  const args = process.argv.slice(2).filter((a) => a !== '--json');
+  const args = process.argv.slice(2).filter((a) => a !== "--json");
   route(args, { json: jsonMode });
 }
package/lib/doctor.js CHANGED
@@ -1,9 +1,7 @@
-'use strict';
-
-const { execSync } = require('node:child_process');
-const { existsSync } = require('node:fs');
-const path = require('node:path');
-const { getInstallable } = require('./ecosystem');
+const { execSync } = require("node:child_process");
+const { existsSync } = require("node:fs");
+const path = require("node:path");
+const { getInstallable } = require("./ecosystem");
 
 /**
  * Health check.
@@ -20,11 +18,12 @@ const { getInstallable } = require('./ecosystem');
 function checkGlobal(packageName) {
   try {
     const out = execSync(`npm list -g ${packageName} --depth=0 2>/dev/null`, {
-      stdio: 'pipe',
-      encoding: 'utf-8',
+      stdio: "pipe",
+      encoding: "utf-8",
+      timeout: 5000,
     });
     const match = out.match(new RegExp(`${escapeRegex(packageName)}@(\\S+)`));
-    return match ? { version: match[1], method: 'global' } : null;
+    return match ? { version: match[1], method: "global" } : null;
   } catch {
     return null;
   }
@@ -36,25 +35,34 @@ function checkGlobal(packageName) {
  */
 function checkNpxCache(packageName) {
   try {
-    const prefix = execSync('npm config get cache', {
-      stdio: 'pipe',
-      encoding: 'utf-8',
+    const prefix = execSync("npm config get cache", {
+      stdio: "pipe",
+      encoding: "utf-8",
+      timeout: 5000,
    }).trim();
-    const npxDir = path.join(prefix, '_npx');
+    const npxDir = path.join(prefix, "_npx");
     if (!existsSync(npxDir)) return null;
 
     // npx cache has hash-named directories, each with node_modules
-    const { readdirSync } = require('node:fs');
+    const { readdirSync } = require("node:fs");
     const entries = readdirSync(npxDir, { withFileTypes: true });
     for (const entry of entries) {
       if (!entry.isDirectory()) continue;
-      const pkgJson = path.join(npxDir, entry.name, 'node_modules', packageName, 'package.json');
+      const pkgJson = path.join(
+        npxDir,
+        entry.name,
+        "node_modules",
+        packageName,
+        "package.json",
+      );
       if (existsSync(pkgJson)) {
         try {
-          const pkg = JSON.parse(require('node:fs').readFileSync(pkgJson, 'utf-8'));
-          return { version: pkg.version || 'installed', method: 'npx cache' };
+          const pkg = JSON.parse(
+            require("node:fs").readFileSync(pkgJson, "utf-8"),
+          );
+          return { version: pkg.version || "installed", method: "npx cache" };
         } catch {
-          return { version: 'installed', method: 'npx cache' };
+          return { version: "installed", method: "npx cache" };
         }
       }
     }
@@ -70,10 +78,15 @@ function checkNpxCache(packageName) {
  */
 function checkLocal(packageName) {
   try {
-    const pkgJson = path.join(process.cwd(), 'node_modules', packageName, 'package.json');
+    const pkgJson = path.join(
+      process.cwd(),
+      "node_modules",
+      packageName,
+      "package.json",
+    );
     if (existsSync(pkgJson)) {
-      const pkg = JSON.parse(require('node:fs').readFileSync(pkgJson, 'utf-8'));
-      return { version: pkg.version || 'installed', method: 'local' };
+      const pkg = JSON.parse(require("node:fs").readFileSync(pkgJson, "utf-8"));
+      return { version: pkg.version || "installed", method: "local" };
     }
     return null;
   } catch {
@@ -89,20 +102,27 @@ function checkLocal(packageName) {
 function checkSource(packageName) {
   const candidates = [
     // Sibling directory (monorepo or co-located checkouts)
-    path.join(process.cwd(), '..', packageName.replace(/^@[^/]+\//, '')),
-    path.join(process.cwd(), '..', packageName.replace(/^@[^/]+\//, ''), 'package.json'),
+    path.join(process.cwd(), "..", packageName.replace(/^@[^/]+\//, "")),
+    path.join(
+      process.cwd(),
+      "..",
+      packageName.replace(/^@[^/]+\//, ""),
+      "package.json",
+    ),
     // Packages dir (monorepo)
-    path.join(process.cwd(), 'packages', packageName.replace(/^@[^/]+\//, '')),
+    path.join(process.cwd(), "packages", packageName.replace(/^@[^/]+\//, "")),
   ];
   for (const candidate of candidates) {
-    const pkgJson = candidate.endsWith('package.json')
+    const pkgJson = candidate.endsWith("package.json")
       ? candidate
-      : path.join(candidate, 'package.json');
+      : path.join(candidate, "package.json");
     if (existsSync(pkgJson)) {
       try {
-        const pkg = JSON.parse(require('node:fs').readFileSync(pkgJson, 'utf-8'));
+        const pkg = JSON.parse(
+          require("node:fs").readFileSync(pkgJson, "utf-8"),
+        );
         if (pkg.name === packageName) {
-          return { version: pkg.version || 'installed', method: 'source' };
+          return { version: pkg.version || "installed", method: "source" };
         }
       } catch {
         // not a match, continue
@@ -119,21 +139,24 @@ function checkSource(packageName) {
  */
 function checkNpxNoInstall(packageName) {
   try {
-    const out = execSync(`npx --no-install ${packageName} --version 2>/dev/null`, {
-      stdio: 'pipe',
-      encoding: 'utf-8',
-      timeout: 5000,
-    }).trim();
+    const out = execSync(
+      `npx --no-install ${packageName} --version 2>/dev/null`,
+      {
+        stdio: "pipe",
+        encoding: "utf-8",
+        timeout: 5000,
+      },
+    ).trim();
     // Expect a version-like string
     const match = out.match(/v?(\d+\.\d+\.\d+\S*)/);
-    return match ? { version: match[1], method: 'npx' } : null;
+    return match ? { version: match[1], method: "npx" } : null;
   } catch {
     return null;
   }
 }
 
 function escapeRegex(str) {
-  return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
+  return str.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
 }
 
 /**
@@ -165,9 +188,51 @@ function getNodeVersion() {
 
 function getNpmVersion() {
   try {
-    return execSync('npm --version', { stdio: 'pipe', encoding: 'utf-8' }).trim();
+    return execSync("npm --version", {
+      stdio: "pipe",
+      encoding: "utf-8",
+      timeout: 5000,
+    }).trim();
+  } catch {
+    return "not found";
+  }
+}
+
+function getPnpmVersion() {
+  try {
+    return execSync("pnpm --version", {
+      stdio: "pipe",
+      encoding: "utf-8",
+      timeout: 5000,
+    }).trim();
+  } catch {
+    return null;
+  }
+}
+
+function getBiomeVersion() {
+  try {
+    const out = execSync("npx biome --version", {
+      stdio: "pipe",
+      encoding: "utf-8",
+      timeout: 5000,
+    }).trim();
+    const match = out.match(/(\d+\.\d+\.\d+\S*)/);
+    return match ? match[1] : out;
   } catch {
-    return 'not found';
+    return null;
+  }
+}
+
+function getHooksPath() {
+  try {
+    return execSync("git config core.hooksPath", {
+      stdio: "pipe",
+      encoding: "utf-8",
+      timeout: 5000,
+    }).trim();
+  } catch {
+    return null;
   }
 }
 
@@ -175,6 +240,10 @@ function run(opts) {
   const json = opts && opts.json;
   const tools = getInstallable();
 
+  const pnpmVersion = getPnpmVersion();
+  const biomeVersion = getBiomeVersion();
+  const hooksPath = getHooksPath();
+
   if (json) {
     const toolResults = [];
     for (const tool of tools) {
@@ -187,59 +256,103 @@ function run(opts) {
         method: result ? result.method : null,
       });
     }
-    console.log(JSON.stringify({
-      environment: { node: getNodeVersion(), npm: getNpmVersion() },
-      tools: toolResults,
-      installed: toolResults.filter((t) => t.installed).length,
-      missing: toolResults.filter((t) => !t.installed).length,
-    }));
+    console.log(
+      JSON.stringify({
+        environment: {
+          node: getNodeVersion(),
+          npm: getNpmVersion(),
+          pnpm: pnpmVersion,
+          biome: biomeVersion,
+          hooksPath: hooksPath,
+        },
+        tools: toolResults,
+        installed: toolResults.filter((t) => t.installed).length,
+        missing: toolResults.filter((t) => !t.installed).length,
+      }),
+    );
     return;
   }
 
   let installed = 0;
   let missing = 0;
 
-  console.log('');
-  console.log('  \x1b[1;33mgrainulation doctor\x1b[0m');
-  console.log('  Checking ecosystem health...');
-  console.log('');
+  console.log("");
+  console.log("  \x1b[1;33mgrainulation doctor\x1b[0m");
+  console.log("  Checking ecosystem health...");
+  console.log("");
 
   // Environment
-  console.log('  \x1b[2mEnvironment:\x1b[0m');
+  console.log("  \x1b[2mEnvironment:\x1b[0m");
   console.log(`    Node ${getNodeVersion()}`);
   console.log(`    npm v${getNpmVersion()}`);
-  console.log('');
+  if (pnpmVersion) {
+    console.log(`    pnpm v${pnpmVersion}`);
+  } else {
+    console.log("    pnpm \x1b[2mnot found\x1b[0m");
+  }
+  console.log("");
+
+  // DX tooling
+  console.log("  \x1b[2mDX tooling:\x1b[0m");
+  if (biomeVersion) {
+    console.log(`    \x1b[32m\u2713\x1b[0m Biome v${biomeVersion}`);
+  } else {
+    console.log(
+      "    \x1b[2m\u2717 Biome not found (pnpm install to set up)\x1b[0m",
+    );
+  }
+  if (hooksPath) {
+    console.log(`    \x1b[32m\u2713\x1b[0m Git hooks ${hooksPath}`);
+  } else {
+    console.log(
+      "    \x1b[2m\u2717 Git hooks not configured (run: git config core.hooksPath .githooks)\x1b[0m",
    );
+  }
+  console.log("");
 
   // Tools
-  console.log('  \x1b[2mTools:\x1b[0m');
+  console.log("  \x1b[2mTools:\x1b[0m");
   for (const tool of tools) {
     const result = detect(tool.package);
     if (result) {
       installed++;
       const ver = `v${result.version}`.padEnd(10);
       console.log(
-        `    \x1b[32m\u2713\x1b[0m ${tool.name.padEnd(12)} ${ver} \x1b[2m(${result.method})\x1b[0m`
+        `    \x1b[32m\u2713\x1b[0m ${tool.name.padEnd(12)} ${ver} \x1b[2m(${result.method})\x1b[0m`,
       );
     } else {
       missing++;
-      console.log(`    \x1b[2m\u2717 ${tool.name.padEnd(12)} -- (not found)\x1b[0m`);
+      console.log(
+        `    \x1b[2m\u2717 ${tool.name.padEnd(12)} -- (not found)\x1b[0m`,
+      );
     }
   }
 
-  console.log('');
+  console.log("");
 
   // Summary
   if (missing === tools.length) {
-    console.log('  \x1b[33mNo grainulation tools found.\x1b[0m');
-    console.log('  Start with: npx @grainulation/wheat init');
+    console.log("  \x1b[33mNo grainulation tools found.\x1b[0m");
+    console.log("  Start with: npx @grainulation/wheat init");
   } else if (missing > 0) {
-    console.log(`  \x1b[32m${installed} found\x1b[0m, \x1b[2m${missing} not found\x1b[0m`);
-    console.log('  Run \x1b[1mgrainulation setup\x1b[0m to install what you need.');
+    console.log(
+      `  \x1b[32m${installed} found\x1b[0m, \x1b[2m${missing} not found\x1b[0m`,
+    );
+    console.log(
+      "  Run \x1b[1mgrainulation setup\x1b[0m to install what you need.",
    );
   } else {
-    console.log('  \x1b[32mAll tools found. Full ecosystem ready.\x1b[0m');
+    console.log("  \x1b[32mAll tools found. Full ecosystem ready.\x1b[0m");
  }
 
-  console.log('');
+  console.log("");
 }
 
-module.exports = { run, getVersion, detect };
+module.exports = {
+  run,
+  getVersion,
+  detect,
+  getPnpmVersion,
+  getBiomeVersion,
+  getHooksPath,
+};
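
The `detect` flow in doctor.js tries several probes in order (global install, npx cache, local node_modules, source checkout, npx) and keeps the first hit, treating a throwing probe the same as a miss. A minimal standalone sketch of that fallback chain (`firstHit` and the inline probes are illustrative, not the package's actual API):

```javascript
// Return the result of the first probe that yields a truthy value.
// Probes that throw are treated the same as "not found".
function firstHit(probes) {
  for (const probe of probes) {
    let result = null;
    try {
      result = probe();
    } catch {
      // a failing probe just falls through to the next one
    }
    if (result) return result;
  }
  return null;
}

// Probes ordered cheapest-first, mirroring global -> npx cache -> local -> source.
const found = firstHit([
  () => null,                                       // not installed globally
  () => { throw new Error("cache unreadable"); },   // cache probe errors, skipped
  () => ({ version: "1.0.1", method: "local" }),    // local hit wins
  () => ({ version: "1.0.0", method: "source" }),   // never reached
]);
console.log(found.method); // "local"
```

Ordering the probes cheapest-first and swallowing probe errors keeps `doctor` fast and resilient: one broken detection path (a corrupt npx cache, a missing npm) degrades to "not found" instead of crashing the health check.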