agentnetes 0.1.0 → 0.1.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +147 -0
  2. package/dist/index.js +103 -32
  3. package/package.json +14 -4
package/README.md ADDED
@@ -0,0 +1,147 @@
+ <div align="center">
+
+ <img src="https://shashikant86.github.io/agentnetes/logo.png" alt="Agentnetes" width="80" height="80" />
+
+ # agentnetes
+
+ **Zero to a Self-Organizing AI Agency. On Demand.**
+
+ *Self-Organizing AI Agent Swarms. On Demand. · Kubernetes-inspired orchestration for AI agents.*
+
+ [![npm version](https://img.shields.io/npm/v/agentnetes?style=for-the-badge&logo=npm&color=fb923c)](https://www.npmjs.com/package/agentnetes)
+ [![npm downloads](https://img.shields.io/npm/dm/agentnetes?style=for-the-badge&color=22c55e)](https://www.npmjs.com/package/agentnetes)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-white?style=for-the-badge)](https://github.com/Shashikant86/agentnetes/blob/main/LICENSE)
+
+ [Live Demo](https://shashikant86.github.io/agentnetes/) · [Full Docs](https://shashikant86.github.io/agentnetes/docs) · [GitHub](https://github.com/Shashikant86/agentnetes)
+
+ </div>
+
+ ---
+
+ ## Prerequisites
+
+ | Requirement | Version | Notes |
+ |-------------|---------|-------|
+ | Node.js | 18+ | Required |
+ | Docker | any recent | Required for default sandbox |
+ | Google API key | free tier | Get at [aistudio.google.com](https://aistudio.google.com) |
+ | Git | any | Repo must have a remote `origin` |
+
+ ---
+
+ ## Quick start
+
+ ```bash
+ # 1. Pull the Docker base image (one-time)
+ docker pull node:20-alpine
+
+ # 2. Run on any git repo
+ cd your-project
+ GOOGLE_API_KEY=your_key npx agentnetes run "add comprehensive test coverage"
+ ```
+
+ No global install needed. Works on any git repository.
+
+ ---
+
+ ## How it works
+
+ 1. You type a goal inside any git repo
+ 2. A root agent (Tech Lead) explores your codebase and invents a specialist team
+ 3. Specialists run in parallel · each in their own isolated Docker container with the repo cloned inside
+ 4. They use two tools: `search()` (grep) and `execute()` (bash) · no file contents stuffed into prompts
+ 5. Agents write code, run tests, fix failures, and deliver together
+ 6. A final synthesis summarises everything found and built
+
+ Roles are fully emergent · nothing is hardcoded. A feature task spawns a Scout, Engineer, and Tester. A security audit spawns a completely different team.
+
+ ---
+
+ ## Usage
+
+ ```bash
+ # Run agents on the current git repo
+ GOOGLE_API_KEY=your_key npx agentnetes run "your goal here"
+
+ # Install globally to skip npx
+ npm install -g agentnetes
+ GOOGLE_API_KEY=your_key agentnetes run "your goal here"
+
+ # Pre-warm a sandbox snapshot for faster runs (requires Vercel token)
+ VERCEL_TOKEN=your_token agentnetes snapshot create
+
+ # List available snapshots
+ VERCEL_TOKEN=your_token agentnetes snapshot list
+ ```
+
+ ---
+
+ ## Environment variables
+
+ ```bash
+ # Required · get a free key at aistudio.google.com
+ GOOGLE_API_KEY=
+
+ # Sandbox provider (default: docker)
+ SANDBOX_PROVIDER=docker # docker | local | vercel | e2b | daytona
+
+ # Optional · override default models
+ PLANNER_MODEL=google/gemini-2.5-pro
+ WORKER_MODEL=google/gemini-2.5-flash
+
+ # Optional · Vercel sandbox only
+ VERCEL_TOKEN=
+ ```
+
+ > **Note:** Do not set `AI_GATEWAY_BASE_URL` when using `GOOGLE_API_KEY` directly · it will route through Vercel AI Gateway and fail with an authentication error.
+
+ ---
+
+ ## Sandbox providers
+
+ | Provider | Requirement | Speed | Notes |
+ |----------|-------------|-------|-------|
+ | `docker` | Docker running | Fast | Default. One `node:20-alpine` container per agent. |
+ | `local` | Nothing | Fastest | Runs on host machine. No isolation. |
+ | `vercel` | `VERCEL_TOKEN` | Fastest | Firecracker microVMs with snapshot support. |
+ | `e2b` | `E2B_API_KEY` | Fast | E2B cloud sandboxes. |
+ | `daytona` | `DAYTONA_API_KEY` | Fast | Daytona workspaces. |
+
+ ### Docker setup
+
+ Pull the base image once to avoid download delay on first run:
+
+ ```bash
+ docker pull node:20-alpine
+ ```
+
+ Each agent gets its own container with `bash` and `git` installed, and the target repo cloned to `/workspace`. Containers are removed automatically when agents finish.
+
+ To watch containers spin up in real time:
+
+ ```bash
+ # In a separate terminal
+ watch -n 1 docker ps
+ ```
+
+ ### Local sandbox (no Docker)
+
+ If you don't have Docker, use the local provider · agents run directly on your machine in a temp directory:
+
+ ```bash
+ SANDBOX_PROVIDER=local GOOGLE_API_KEY=your_key agentnetes run "your goal"
+ ```
+
+ ---
+
+ ## Tips
+
+ - Use **Gemini 2.5 Flash** as the worker model for faster runs · Flash completes tasks in seconds vs minutes for Pro
+ - Be specific in your goals: `"add vitest tests for all functions in src/utils/"` works better than `"add tests"`
+ - For large repos, agents focus on the most relevant files via `search()` · you don't need to pre-filter
+
+ ---
+
+ ## License
+
+ MIT · [github.com/Shashikant86/agentnetes](https://github.com/Shashikant86/agentnetes)
package/dist/index.js CHANGED
@@ -25,7 +25,7 @@ var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__ge
  ));
 
  // src/index.ts
- var import_child_process4 = require("child_process");
+ var import_child_process5 = require("child_process");
 
  // src/commands/run.ts
  var import_fs2 = require("fs");
@@ -37,23 +37,13 @@ var import_ai2 = require("ai");
  var import_zod2 = require("zod");
 
  // ../../lib/gateway.ts
- var import_gateway = require("@ai-sdk/gateway");
  var import_google = require("@ai-sdk/google");
- function makeProvider() {
- if (process.env.AI_GATEWAY_BASE_URL) {
- return (0, import_gateway.createGatewayProvider)({ baseURL: process.env.AI_GATEWAY_BASE_URL });
- }
- return (0, import_google.createGoogleGenerativeAI)({
- apiKey: process.env.GOOGLE_API_KEY ?? process.env.GOOGLE_GENERATIVE_AI_API_KEY
+ function gateway(model, apiKeyOverride) {
+ const google = (0, import_google.createGoogleGenerativeAI)({
+ apiKey: apiKeyOverride ?? process.env.GOOGLE_API_KEY ?? process.env.GOOGLE_GENERATIVE_AI_API_KEY
  });
- }
- var provider = makeProvider();
- function gateway(model) {
- if (process.env.AI_GATEWAY_BASE_URL) {
- return provider(model);
- }
  const modelId = model.includes("/") ? model.split("/").slice(1).join("/") : model;
- return provider(modelId);
+ return google(modelId);
  }
 
  // ../../lib/vrlm/tools.ts
@@ -157,11 +147,12 @@ var DockerSandbox = class {
  this.containerId = containerId;
  this.id = `docker-${containerId.slice(0, 12)}`;
  }
- async runCommand(_shell, args2) {
- const cmd = args2.join(" ");
+ async runCommand(shell, args2) {
+ const cIdx = args2.indexOf("-c");
+ const actualCmd = cIdx >= 0 && args2[cIdx + 1] ? args2[cIdx + 1] : args2.join(" ");
  try {
  const output = (0, import_child_process2.execSync)(
- `docker exec ${this.containerId} sh -c ${JSON.stringify(cmd)}`,
+ `docker exec -w /workspace ${this.containerId} ${shell} -c ${JSON.stringify(actualCmd)}`,
  { encoding: "utf-8", timeout: 6e4 }
  );
  return { stdout: async () => output ?? "", exitCode: 0 };
@@ -196,7 +187,7 @@ async function createDockerSandbox(repoUrl) {
  { encoding: "utf-8" }
  ).trim();
  (0, import_child_process2.execSync)(
- `docker exec ${containerId} sh -c "apk add --no-cache git 2>/dev/null"`,
+ `docker exec ${containerId} sh -c "apk add --no-cache git bash 2>/dev/null"`,
  { timeout: 3e4, stdio: "ignore" }
  );
  (0, import_child_process2.execSync)(
@@ -219,22 +210,22 @@ function detectProvider() {
  if (process.env.DAYTONA_API_KEY) return "daytona";
  return "docker";
  }
- async function createWorkerSandbox(repoUrl, snapshotId) {
- const provider2 = detectProvider();
- if (provider2 === "vercel") {
+ async function createWorkerSandbox(repoUrl, snapshotId, providerOverride) {
+ const provider = providerOverride ?? detectProvider();
+ if (provider === "vercel") {
  const { Sandbox: Sandbox2 } = await import("@vercel/sandbox");
  if (snapshotId) {
  return Sandbox2.create({ source: { type: "snapshot", snapshotId }, timeout: 10 * 60 * 1e3 });
  }
  return Sandbox2.create({ source: { type: "git", url: repoUrl, depth: 1 }, timeout: 10 * 60 * 1e3 });
  }
- if (provider2 === "e2b") {
+ if (provider === "e2b") {
  console.warn('[sandbox] e2b: install the "e2b" package to use this provider. Falling back to local.');
  }
- if (provider2 === "daytona") {
+ if (provider === "daytona") {
  console.warn('[sandbox] daytona: install the "@daytonaio/sdk" package to use this provider. Falling back to local.');
  }
- if (provider2 === "docker") {
+ if (provider === "docker") {
  return createDockerSandbox(repoUrl);
  }
  return createLocalSandbox(repoUrl);
@@ -584,7 +575,7 @@ var VrlmRuntime = class {
  // ── Planner ─────────────────────────────────────────────────────────────────
  async runPlanner(goal) {
  const { text } = await (0, import_ai2.generateText)({
- model: gateway(this.config.plannerModel),
+ model: gateway(this.config.plannerModel, this.config.googleApiKey),
  system: PLANNER_SYSTEM_PROMPT,
  prompt: buildPlannerPrompt(goal),
  maxOutputTokens: 2e3
@@ -595,7 +586,7 @@ var VrlmRuntime = class {
  }
  // ── Worker ──────────────────────────────────────────────────────────────────
  async runWorker(task, repoUrl, snapshotId) {
- let sandbox = await createWorkerSandbox(repoUrl, snapshotId);
+ let sandbox = await createWorkerSandbox(repoUrl, snapshotId, this.config.sandboxProvider);
  task.sandboxId = sandbox.id ?? "local";
  this.emitTaskUpdate(task.id, "running", "Sandbox ready");
  this.emitter.emit({
@@ -611,7 +602,7 @@ var VrlmRuntime = class {
  const findings = [];
  const terminalLines = [];
  const agent = new import_ai2.ToolLoopAgent({
- model: gateway(this.config.workerModel),
+ model: gateway(this.config.workerModel, this.config.googleApiKey),
  tools,
  stopWhen: (0, import_ai2.stepCountIs)(this.config.maxStepsPerAgent),
  instructions: buildWorkerPrompt(task, findings),
@@ -688,7 +679,7 @@ var VrlmRuntime = class {
  // ── Synthesis ────────────────────────────────────────────────────────────────
  async runSynthesis(goal, summaries) {
  const { text } = await (0, import_ai2.generateText)({
- model: gateway(this.config.plannerModel),
+ model: gateway(this.config.plannerModel, this.config.googleApiKey),
  prompt: buildSynthesisPrompt(goal, summaries),
  maxOutputTokens: 1500
  });
@@ -970,6 +961,80 @@ async function snapshotList() {
  }
  }
 
+ // src/commands/serve.ts
+ var import_child_process4 = require("child_process");
+ var import_fs4 = require("fs");
+ var import_path4 = require("path");
+ var import_url = require("url");
+ var import_picocolors4 = __toESM(require("picocolors"));
+ var import_meta = {};
+ var __dirname = (0, import_path4.dirname)((0, import_url.fileURLToPath)(import_meta.url));
+ async function serve(port) {
+ const webDir = (0, import_path4.join)(__dirname, "..", "web");
+ const serverJs = (0, import_path4.join)(webDir, "server.js");
+ if (!(0, import_fs4.existsSync)(serverJs)) {
+ console.error(import_picocolors4.default.red("Error: web UI not found."));
+ console.error("");
+ console.error("The standalone web UI is not bundled in this installation.");
+ console.error("Build it from the repo root with:");
+ console.error("");
+ console.error(import_picocolors4.default.cyan(" bash scripts/build-serve.sh"));
+ console.error(" cd packages/cli && npm run build && npm publish");
+ console.error("");
+ process.exit(1);
+ }
+ console.log("");
+ console.log(import_picocolors4.default.bold(import_picocolors4.default.white("Agentnetes")));
+ console.log(import_picocolors4.default.dim("Zero to a self-organizing AI agency. On demand."));
+ console.log("");
+ console.log(import_picocolors4.default.dim("Starting web UI..."));
+ console.log("");
+ const env = {
+ ...process.env,
+ PORT: String(port),
+ HOSTNAME: "0.0.0.0"
+ };
+ const child = (0, import_child_process4.spawn)("node", [serverJs], {
+ cwd: webDir,
+ env,
+ stdio: "pipe"
+ });
+ let started = false;
+ child.stdout.on("data", (data) => {
+ const line = data.toString().trim();
+ if (!started && (line.includes("ready") || line.includes("started") || line.includes("listening") || line.includes(String(port)))) {
+ started = true;
+ console.log(import_picocolors4.default.green("Ready") + " " + import_picocolors4.default.bold(`http://localhost:${port}`));
+ console.log("");
+ console.log(import_picocolors4.default.dim("Set GOOGLE_API_KEY in your environment to run real agents."));
+ console.log(import_picocolors4.default.dim("Press Ctrl+C to stop."));
+ console.log("");
+ } else {
+ process.stdout.write(data);
+ }
+ });
+ child.stderr.on("data", (data) => {
+ process.stderr.write(data);
+ });
+ child.on("exit", (code) => {
+ if (code !== 0 && code !== null) {
+ console.error(import_picocolors4.default.red(`Server exited with code ${code}`));
+ process.exit(code);
+ }
+ });
+ await new Promise((resolve) => {
+ child.on("exit", () => resolve());
+ process.on("SIGINT", () => {
+ child.kill("SIGTERM");
+ resolve();
+ });
+ process.on("SIGTERM", () => {
+ child.kill("SIGTERM");
+ resolve();
+ });
+ });
+ }
+
  // src/index.ts
  var args = process.argv.slice(2);
  var command = args[0];
@@ -982,7 +1047,7 @@ async function main() {
  }
  let repoUrl;
  try {
- repoUrl = (0, import_child_process4.execSync)("git remote get-url origin", { encoding: "utf-8" }).trim();
+ repoUrl = (0, import_child_process5.execSync)("git remote get-url origin", { encoding: "utf-8" }).trim();
  if (repoUrl.startsWith("git@github.com:")) {
  repoUrl = repoUrl.replace("git@github.com:", "https://github.com/").replace(".git", "");
  }
@@ -999,17 +1064,23 @@ async function main() {
  await snapshotCreate();
  } else if (command === "snapshot" && args[1] === "list") {
  await snapshotList();
+ } else if (command === "serve") {
+ const portArg = args.indexOf("--port");
+ const port = portArg !== -1 ? parseInt(args[portArg + 1], 10) : 3e3;
+ await serve(port);
  } else {
  console.log("agentnetes - zero to a self-organizing AI agency. On demand.");
  console.log("");
  console.log("Usage:");
  console.log(' agentnetes run "goal" Run agents on the current git repo');
+ console.log(" agentnetes serve Start the web UI on localhost:3000");
+ console.log(" agentnetes serve --port 8080");
  console.log(" agentnetes snapshot create Pre-warm a sandbox snapshot for faster runs");
  console.log(" agentnetes snapshot list List available snapshots");
  console.log("");
  console.log("Environment variables:");
- console.log(" AI_GATEWAY_BASE_URL Vercel AI Gateway endpoint");
- console.log(" VERCEL_TOKEN Vercel API token (for sandbox)");
+ console.log(" GOOGLE_API_KEY Google Gemini API key (aistudio.google.com)");
+ console.log(" SANDBOX_PROVIDER docker | local | vercel | e2b (default: docker)");
  }
  }
  main().catch((err) => {
package/package.json CHANGED
@@ -1,8 +1,15 @@
  {
  "name": "agentnetes",
- "version": "0.1.0",
+ "version": "0.1.3",
  "description": "Zero to a self-organizing AI agency. On demand.",
- "keywords": ["ai", "agents", "cli", "gemini", "vercel", "autonomous"],
+ "keywords": [
+ "ai",
+ "agents",
+ "cli",
+ "gemini",
+ "vercel",
+ "autonomous"
+ ],
  "license": "MIT",
  "repository": {
  "type": "git",
@@ -12,14 +19,17 @@
  "bin": {
  "agentnetes": "./dist/index.js"
  },
- "files": ["dist"],
+ "files": [
+ "dist",
+ "web"
+ ],
  "scripts": {
  "build": "tsup",
+ "build:serve": "cd ../.. && bash scripts/build-serve.sh && cd packages/cli && tsup",
  "dev": "tsx src/index.ts",
  "prepublishOnly": "npm run build"
  },
  "dependencies": {
- "@ai-sdk/gateway": "beta",
  "@ai-sdk/google": "^3.0.52",
  "@vercel/sandbox": "latest",
  "ai": "beta",