llm-party-cli 0.1.1 → 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -4,8 +4,8 @@
   <strong>Bring your models. We'll bring the party.</strong>
   </p>
   <p align="center">
- <a href="https://llm-party.party">Website</a> &middot;
- <a href="https://www.npmjs.com/package/llm-party-cli">npm</a> &middot;
+ <a href="https://llm-party.party">Website</a> ·
+ <a href="https://www.npmjs.com/package/llm-party-cli">npm</a> ·
   <a href="https://github.com/aalasolutions/llm-party">GitHub</a>
   </p>
   <p align="center">
@@ -36,13 +36,13 @@ No MCP. No master/servant. No window juggling. Just peers at a terminal table.

  ## Why llm-party?

- | | Traditional multi-agent | llm-party |
- |---|---|---|
- | **Architecture** | MCP (master controls servants) | Peer orchestration (you control all) |
- | **Integration** | CLI wrapping, output scraping | Direct SDK adapters |
- | **Sessions** | Fresh each time | Persistent per provider |
- | **Context** | Agents are siloed | Every agent sees the full conversation |
- | **API tokens** | Separate keys per tool | Uses your existing CLI auth |
+ |                  | Traditional multi-agent        | llm-party                              |
+ | ---------------- | ------------------------------ | -------------------------------------- |
+ | **Architecture** | MCP (master controls servants) | Peer orchestration (you control all)   |
+ | **Integration**  | CLI wrapping, output scraping  | Direct SDK adapters                    |
+ | **Sessions**     | Fresh each time                | Persistent per provider                |
+ | **Context**      | Agents are siloed              | Every agent sees the full conversation |
+ | **API tokens**   | Separate keys per tool         | Uses your existing CLI auth            |

  <br/>

@@ -50,7 +50,7 @@ No MCP. No master/servant. No window juggling. Just peers at a terminal table.

  ### Prerequisites

- Node.js 20+ is required (22+ if using the Copilot provider, which depends on `node:sqlite`). Make sure at least one AI CLI is installed and authenticated:
+ Bun runtime and Node.js 20+ are required (22+ if using the Copilot provider, which depends on `node:sqlite`). Make sure at least one AI CLI is installed and authenticated:

  ```bash
  claude --version # Claude Code CLI
@@ -72,13 +72,7 @@ npm install -g llm-party-cli
  llm-party
  ```

- On first run, llm-party creates `~/.llm-party/` with a default config and base prompt. Your system username is detected automatically. A single Claude agent is configured out of the box.
-
- To re-run setup or reset your config:
-
- ```bash
- llm-party --init
- ```
+ On first run, llm-party automatically creates `~/.llm-party/` with a default config and global memory structure. Your system username is detected automatically. No setup commands are needed.

  ### Add more agents

@@ -126,11 +120,13 @@ That's it. No paths, no prompts, no usernames to configure. Just name, tag, prov

  Agents can pass the conversation to each other by ending their response with `@next:<tag>`. The orchestrator picks it up and dispatches automatically. Max 15 hops per cycle to prevent loops.

- <br/>
+ ## **WARNING: FULL AUTONOMY.**
+
+ All agents run with full permissions. They can read, write, and edit files and execute shell commands with zero approval gates. There is no confirmation step before any action. Run in a disposable environment. You are responsible for any changes, data loss, costs, or side effects. Do not run against production systems.

  ## Important notes

- **No extra API tokens.** llm-party uses the original CLIs and SDKs under the hood. Your existing authentication and subscriptions are used directly. Sessions created by agents appear in each tool's native session history (Claude Code sessions, Codex threads, etc.).
+ **Uses your existing CLIs.** llm-party uses official SDKs that delegate to each provider's CLI binary. If the `claude`, `codex`, or `copilot` commands work on your machine, llm-party works. Authentication is handled entirely by the provider's own tools.

  **Run in isolation.** Always run llm-party inside a disposable environment: a Docker container, a VM, or at minimum a throwaway git branch. Agents have full filesystem and shell access with zero approval gates.
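The handoff rule in the hunk above (a reply ending in `@next:<tag>`, capped at 15 hops per cycle) can be sketched in a few lines. This is a hypothetical reconstruction, not llm-party's actual code; `parseHandoff` and `nextHop` are illustrative names.

```javascript
// Hypothetical sketch of @next:<tag> handoff detection. A reply ending with
// "@next:codex" hands off to the agent tagged "codex"; the hop counter
// enforces the default cap of 15 to prevent loops.
const MAX_AUTO_HOPS = 15;

function parseHandoff(responseText) {
  // Only a trailing @next:<tag> token counts as a handoff.
  const match = responseText.trimEnd().match(/@next:([\w-]+)$/);
  return match ? match[1] : null;
}

function nextHop(responseText, hopCount, maxHops = MAX_AUTO_HOPS) {
  if (hopCount >= maxHops) return null; // loop guard: stop the cycle
  return parseHandoff(responseText);
}
```

A mid-sentence `@next:` mention is ignored on purpose; only the terminal token routes.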
 
@@ -142,26 +138,24 @@ Agents can pass the conversation to each other by ending their response with `@n

  llm-party uses **official, publicly available SDKs and CLIs** published by each provider. Nothing is reverse-engineered, patched, or bypassed.

- | Provider | Official SDK | Published by |
- |----------|-------------|-------------|
- | Claude | [`@anthropic-ai/claude-agent-sdk`](https://www.npmjs.com/package/@anthropic-ai/claude-agent-sdk) | Anthropic |
- | Codex | [`@openai/codex-sdk`](https://www.npmjs.com/package/@openai/codex-sdk) | OpenAI |
- | Copilot | [`@github/copilot-sdk`](https://www.npmjs.com/package/@github/copilot-sdk) | GitHub |
-
- All authentication flows through the provider's own CLI login. llm-party does not store, proxy, or intercept credentials.
+ | Provider | Official SDK                                                                                     | Published by |
+ | -------- | ------------------------------------------------------------------------------------------------ | ------------ |
+ | Claude   | [`@anthropic-ai/claude-agent-sdk`](https://www.npmjs.com/package/@anthropic-ai/claude-agent-sdk) | Anthropic    |
+ | Codex    | [`@openai/codex-sdk`](https://www.npmjs.com/package/@openai/codex-sdk)                           | OpenAI       |
+ | Copilot  | [`@github/copilot-sdk`](https://www.npmjs.com/package/@github/copilot-sdk)                       | GitHub       |

- If any provider believes this project violates their terms of service, please [open an issue](https://github.com/aalasolutions/llm-party/issues) and we will address it immediately.
+ All authentication flows through the provider's own CLI. llm-party does not implement its own auth flow, store credentials, or intercept authentication traffic.

  <br/>

  ## Supported providers

- | Provider | SDK | Session | Prompt Support |
- |----------|-----|---------|---------------|
- | **Claude** | `@anthropic-ai/claude-agent-sdk` | Persistent via session ID resume | Full control |
- | **Codex** | `@openai/codex-sdk` | Persistent thread with `run()` turns | Via `developer_instructions` (limitations below) |
- | **Copilot** | `@github/copilot-sdk` | Persistent via `sendAndWait()` | Full control |
- | **GLM** | Claude SDK + env proxy | Same as Claude | Full control |
+ | Provider    | SDK                              | Session                              | Prompt Support                                   |
+ | ----------- | -------------------------------- | ------------------------------------ | ------------------------------------------------ |
+ | **Claude**  | `@anthropic-ai/claude-agent-sdk` | Persistent via session ID resume     | Full control                                     |
+ | **Codex**   | `@openai/codex-sdk`              | Persistent thread with `run()` turns | Via `developer_instructions` (limitations below) |
+ | **Copilot** | `@github/copilot-sdk`            | Persistent via `sendAndWait()`       | Full control                                     |
+ | **GLM**     | Claude SDK + env proxy           | Same as Claude                       | Full control                                     |

  <br/>

@@ -190,9 +184,9 @@ Orchestrator
  +-- Transcript Writer (JSONL, append-only, per session)
  ```

- Each agent receives a rolling window of recent messages (default 16) plus any unseen messages since its last turn. Messages from other agents are included so everyone sees the full multi-party conversation.
+ Each agent receives a rolling window of recent messages (configurable, default 16) plus any unseen messages since its last turn. Messages from other agents are included so everyone sees the full multi-party conversation.

- `~/.llm-party/config.json` is your global config. `base.md` is always loaded first for every agent. The `prompts` field in config adds extra prompt files on top of it.
+ `~/.llm-party/config.json` is your global config. Every agent receives a base system prompt automatically. The `prompts` field in config adds extra prompt files on top of it.

  <br/>
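The context rule described in the hunk above (a recent-message window widened by anything unseen since the agent's last turn) can be reconstructed as a short selection function. This is an illustrative sketch under that stated rule, not the orchestrator's real implementation.

```javascript
// Illustrative reconstruction: each agent sees the last `windowSize` messages
// (default 16), widened backwards so no message since its last turn is missed.
function buildContext(messages, lastSeenIndex, windowSize = 16) {
  const windowStart = Math.max(0, messages.length - windowSize);
  // If unseen messages predate the window, start from the first unseen one.
  const start = Math.min(windowStart, lastSeenIndex + 1);
  return messages.slice(start);
}
```

An agent that just spoke gets exactly the window; an agent that has been idle gets the window plus its backlog.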
 
@@ -200,41 +194,41 @@ Each agent receives a rolling window of recent messages (default 16) plus any un

  ### Claude

- | | |
- |---|---|
- | SDK | `@anthropic-ai/claude-agent-sdk` |
+ |         |                                                                                               |
+ | ------- | --------------------------------------------------------------------------------------------- |
+ | SDK     | `@anthropic-ai/claude-agent-sdk`                                                              |
  | Session | Persistent via `resume: sessionId`. First call creates a session, subsequent calls resume it. |
- | Prompt | Passed directly to the SDK. Full control over personality, behavior, and workflow rules. |
- | Tools | Read, Write, Edit, Bash, Glob, Grep |
+ | Prompt  | Passed directly to the SDK. Full control over personality, behavior, and workflow rules.      |
+ | Tools   | Read, Write, Edit, Bash, Glob, Grep                                                           |

  ### Codex

- | | |
- |---|---|
- | SDK | `@openai/codex-sdk` |
- | Session | Persistent thread. `startThread()` creates it, `thread.run()` adds turns to the same conversation. |
- | Prompt | Injected via `developer_instructions` config key. Appended alongside Codex's built-in 13k token system prompt. |
- | Tools | exec_command, apply_patch, file operations |
+ |         |                                                                                                                |
+ | ------- | -------------------------------------------------------------------------------------------------------------- |
+ | SDK     | `@openai/codex-sdk`                                                                                            |
+ | Session | Persistent thread. `startThread()` creates it, `thread.run()` adds turns to the same conversation.             |
+ | Prompt  | Injected via `developer_instructions` config key. Appended alongside Codex's built-in 13k token system prompt. |
+ | Tools   | exec_command, apply_patch, file operations                                                                     |

  **Known limitation:** Codex's built-in system prompt cannot be overridden. Your instructions are appended alongside it. Action instructions (naming, formatting, workflow rules) work. Personality overrides do not.

  ### Copilot

- | | |
- |---|---|
- | SDK | `@github/copilot-sdk` |
- | Session | Persistent via `CopilotClient.createSession()`. |
- | Prompt | Set as `systemMessage` on session creation. Full control. |
- | Tools | Copilot built-in toolset |
+ |         |                                                           |
+ | ------- | --------------------------------------------------------- |
+ | SDK     | `@github/copilot-sdk`                                     |
+ | Session | Persistent via `CopilotClient.createSession()`.           |
+ | Prompt  | Set as `systemMessage` on session creation. Full control. |
+ | Tools   | Copilot built-in toolset                                  |

  ### GLM

- | | |
- |---|---|
- | SDK | `@anthropic-ai/claude-agent-sdk` (same as Claude) |
+ |         |                                                           |
+ | ------- | --------------------------------------------------------- |
+ | SDK     | `@anthropic-ai/claude-agent-sdk` (same as Claude)         |
  | Session | Same as Claude, routed through a proxy via env overrides. |
- | Prompt | Same as Claude. Full control. |
- | Tools | Same as Claude |
+ | Prompt  | Same as Claude. Full control.                             |
+ | Tools   | Same as Claude                                            |

  GLM uses the Claude SDK as a transport layer. The adapter routes API calls through a proxy by setting `ANTHROPIC_BASE_URL` and model aliases via the `env` config field.
 
@@ -242,38 +236,36 @@ GLM uses the Claude SDK as a transport layer. The adapter routes API calls throu

  ## Config reference

- Config file: `~/.llm-party/config.json` (created on first run).
+ Config file: `~/.llm-party/config.json` (created automatically on first run).

  Override with the `LLM_PARTY_CONFIG` env var to point to a different file.

  ### Top-level fields

- | Field | Required | Default | Description |
- |-------|----------|---------|-------------|
- | `humanName` | No | Your system username | Display name in the terminal prompt and passed to agents |
- | `humanTag` | No | derived from `humanName` | Tag for human handoff detection (`@next:you`) |
- | `maxAutoHops` | No | `15` | Max agent-to-agent handoffs per cycle. Use `"unlimited"` to remove the cap |
- | `timeout` | No | `600` | Default timeout in seconds for all agents |
- | `agents` | Yes | | Array of agent definitions |
+ | Field         | Required | Default                  | Description                                                                |
+ | ------------- | -------- | ------------------------ | -------------------------------------------------------------------------- |
+ | `humanName`   | No       | Your system username     | Display name in the terminal prompt and passed to agents                   |
+ | `humanTag`    | No       | derived from `humanName` | Tag for human handoff detection (`@next:you`)                              |
+ | `maxAutoHops` | No       | `15`                     | Max agent-to-agent handoffs per cycle. Use `"unlimited"` to remove the cap |
+ | `timeout`     | No       | `600`                    | Default timeout in seconds for all agents                                  |
+ | `agents`      | Yes      |                          | Array of agent definitions                                                 |

  ### Agent fields

- | Field | Required | Default | Description |
- |-------|----------|---------|-------------|
- | `name` | Yes | | Display name shown in responses as `[AGENT NAME]` |
- | `tag` | No | derived from `name` | Routing tag for `@tag` targeting |
- | `provider` | Yes | | SDK adapter: `claude`, `codex`, `copilot`, or `glm` |
- | `model` | Yes | | Model ID passed to the provider. Examples: `opus`, `sonnet`, `gpt-5.2`, `gpt-4.1`, `glm-5` |
- | `prompts` | No | none | Array of extra prompt file paths, concatenated after `base.md`. Relative to project root |
- | `executablePath` | No | PATH lookup | Path to the CLI binary. Supports `~/`. Only needed if the CLI is not in your PATH |
- | `env` | No | inherits `process.env` | Environment variable overrides for this agent |
- | `timeout` | No | top-level value | Per-agent timeout override in seconds |
+ | Field            | Required | Default                | Description                                                                                |
+ | ---------------- | -------- | ---------------------- | ------------------------------------------------------------------------------------------ |
+ | `name`           | Yes      |                        | Display name shown in responses as `[AGENT NAME]`. Must be unique.                         |
+ | `tag`            | No       | derived from `name`    | Routing tag for `@tag` targeting                                                           |
+ | `provider`       | Yes      |                        | SDK adapter: `claude`, `codex`, `copilot`, or `glm`                                        |
+ | `model`          | Yes      |                        | Model ID passed to the provider. Examples: `opus`, `sonnet`, `gpt-5.2`, `gpt-4.1`, `glm-5` |
+ | `prompts`        | No       | none                   | Array of extra prompt file paths, concatenated after `base.md`. Relative to project root   |
+ | `executablePath` | No       | PATH lookup            | Path to the CLI binary. Supports `~/`. Only needed if the CLI is not in your PATH          |
+ | `env`            | No       | inherits `process.env` | Environment variable overrides for this agent                                              |
+ | `timeout`        | No       | top-level value        | Per-agent timeout override in seconds                                                      |

  ### Prompts

- `base.md` is always loaded first for every agent. It defines orchestrator rules, routing syntax, handoff protocol, and team context. It lives at `~/.llm-party/base.md` (copied from the package on first run).
-
- To add extra instructions per agent, use the `prompts` field:
+ Every agent receives a base system prompt automatically. To add extra instructions per agent, use the `prompts` field:

  ```json
  {
@@ -285,23 +277,23 @@ To add extra instructions per agent, use the `prompts` field:
  }
  ```

- The final prompt sent to the agent is: `base.md` + `prompts[0]` + `prompts[1]` + ... joined with `---` separators. Template variables are rendered after concatenation:
+ Template variables available in prompt files:

- | Variable | Description |
- |----------|-------------|
- | `{{agentName}}` | This agent's display name |
- | `{{agentTag}}` | This agent's routing tag |
- | `{{humanName}}` | Your display name |
- | `{{humanTag}}` | Your routing tag |
- | `{{agentCount}}` | Total number of active agents |
- | `{{allAgentNames}}` | All agent names |
- | `{{allAgentTags}}` | All agent tags as `@tag` |
- | `{{otherAgentList}}` | Other agents with their tags |
- | `{{validHandoffTargets}}` | Valid `@next:tag` targets |
+ | Variable                  | Description                   |
+ | ------------------------- | ----------------------------- |
+ | `{{agentName}}`           | This agent's display name     |
+ | `{{agentTag}}`            | This agent's routing tag      |
+ | `{{humanName}}`           | Your display name             |
+ | `{{humanTag}}`            | Your routing tag              |
+ | `{{agentCount}}`          | Total number of active agents |
+ | `{{allAgentNames}}`       | All agent names               |
+ | `{{allAgentTags}}`        | All agent tags as `@tag`      |
+ | `{{otherAgentList}}`      | Other agents with their tags  |
+ | `{{validHandoffTargets}}` | Valid `@next:tag` targets     |

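The `{{variable}}` substitution behind the table above can be sketched with a single regex replace. This is a hedged reconstruction; the package's real renderer may treat unknown keys differently.

```javascript
// Sketch of {{variable}} substitution for prompt files. Known keys are
// replaced with their values; unknown placeholders are left untouched.
function renderTemplate(text, vars) {
  return text.replace(/\{\{(\w+)\}\}/g, (whole, key) =>
    Object.hasOwn(vars, key) ? String(vars[key]) : whole
  );
}
```

For example, `"You are {{agentName}}"` with `{ agentName: "Claude" }` renders the agent's display name into the prompt.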
  ### GLM environment setup

- GLM requires environment overrides to route through a proxy. The adapter first tries to load env variables from your shell `glm` alias. Without the alias, provide everything in the `env` block:
+ GLM requires environment overrides to route through a proxy. The adapter tries to load env variables from your shell `glm` alias automatically. Without the alias, provide everything in the `env` block:

  ```json
  {
@@ -330,14 +322,15 @@ File changes made by agents are detected via `git status` after each response. N

  ## Terminal commands

- | Command | What it does |
- |---------|-------------|
- | `/agents` | List active agents with tag, provider, model |
- | `/history` | Print full conversation history |
- | `/save <path>` | Export conversation as JSON |
- | `/session` | Show session ID and transcript path |
- | `/changes` | Show git-modified files |
- | `/exit` | Quit |
+ | Command        | What it does                                          |
+ | -------------- | ----------------------------------------------------- |
+ | `/agents`      | List active agents with tag, provider, model          |
+ | `/save <path>` | Export conversation as JSON                           |
+ | `/session`     | Show session ID and transcript path                   |
+ | `/changes`     | Show git-modified files                               |
+ | `/clear`       | Clear the chat display (Ctrl+L also works)            |
+ | `/exit`        | Quit (graceful shutdown; all adapters cleaned up)     |
+ | `Ctrl+C`       | Exit (or copy selected text if a selection is active) |

  <br/>

@@ -346,21 +339,21 @@ File changes made by agents are detected via `git status` after each response. N

  ```bash
  git clone https://github.com/aalasolutions/llm-party.git
  cd llm-party
- npm install
- npm run dev
+ bun install
+ bun run dev
  ```

  Build and run:

  ```bash
- npm run build
- npm start
+ bun run build
+ bun start
  ```

  Override config:

  ```bash
- LLM_PARTY_CONFIG=/path/to/config.json npm run dev
+ LLM_PARTY_CONFIG=/path/to/config.json bun run dev
  ```

  <br/>
@@ -373,6 +366,9 @@ Run `/agents` to see available tags. Tags match against agent `tag`, `name`, and

  **"Unsupported provider"**
  Valid providers: `claude`, `codex`, `copilot`, `glm`.

+ **"Duplicate agent name"**
+ Agent names must be unique (case-insensitive). Rename one of the duplicates in your config.
+
  **Agent modifies source code unexpectedly**
  Expected with full permissions. Use git to review and revert.

@@ -388,6 +384,6 @@ Default is 600 seconds (10 minutes). Adjust with `timeout` in config (top-level

  <br/>

  <p align="center">
- <a href="https://llm-party.party">llm-party.party</a> &middot;
+ <a href="https://llm-party.party">llm-party.party</a> ·
  Built by <a href="https://aalasolutions.com">AALA Solutions</a>
  </p>
@@ -1,10 +1,37 @@
  {
+   "humanName": "AAMIR",
+   "humanTag": "aamir",
+   "maxAutoHops": 15,
    "agents": [
+     {
+       "name": "GLM",
+       "tag": "glm",
+       "provider": "glm",
+       "model": "glm-5",
+       "env": {
+         "ANTHROPIC_BASE_URL": "https://api.z.ai/api/anthropic",
+         "ANTHROPIC_DEFAULT_HAIKU_MODEL": "glm-4.5-air",
+         "ANTHROPIC_DEFAULT_SONNET_MODEL": "glm-4.5",
+         "ANTHROPIC_DEFAULT_OPUS_MODEL": "glm-5"
+       }
+     },
      {
        "name": "Claude",
        "tag": "claude",
        "provider": "claude",
-       "model": "sonnet"
+       "model": "opus"
+     },
+     {
+       "name": "Copilot",
+       "tag": "copilot",
+       "provider": "copilot",
+       "model": "gpt-4.1"
+     },
+     {
+       "name": "Codex",
+       "tag": "codex",
+       "provider": "codex",
+       "model": "gpt-5.2"
      }
    ]
  }
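The config above is what the README's troubleshooting entries ("Unsupported provider", "Duplicate agent name") validate against. A hedged sketch of such a check, using the `VALID_PROVIDERS` list that appears in the source diff below; `validateAgents` is an illustrative name, not the package's actual validator.

```javascript
// Illustrative validation of the agents array: provider must be one of the
// supported adapters, and agent names must be unique case-insensitively.
const VALID_PROVIDERS = ["claude", "codex", "copilot", "glm"];

function validateAgents(agents) {
  const errors = [];
  const seen = new Set();
  for (const agent of agents) {
    if (!VALID_PROVIDERS.includes(agent.provider)) {
      errors.push(`Unsupported provider: ${agent.provider}`);
    }
    const key = agent.name.toLowerCase();
    if (seen.has(key)) errors.push(`Duplicate agent name: ${agent.name}`);
    seen.add(key);
  }
  return errors;
}
```

Running this over the four-agent config above would return an empty error list.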
@@ -1,4 +1,4 @@
- import { readFile, access, mkdir, copyFile } from "node:fs/promises";
+ import { readFile, writeFile, access, mkdir } from "node:fs/promises";
  import { homedir, userInfo } from "node:os";
  import path from "node:path";
  const VALID_PROVIDERS = ["claude", "codex", "copilot", "glm"];
@@ -60,50 +60,68 @@ export async function resolveConfigPath(appRoot) {
    return path.join(appRoot, "configs", "default.json");
  }
  export async function resolveBasePrompt(appRoot) {
-   const globalBase = path.join(LLM_PARTY_HOME, "base.md");
+   const bundledBase = path.join(appRoot, "prompts", "base.md");
+   return await readFile(bundledBase, "utf8");
+ }
+ export async function resolveArtifactsPrompt(appRoot) {
+   const bundledArtifacts = path.join(appRoot, "prompts", "artifacts.md");
+   return await readFile(bundledArtifacts, "utf8");
+ }
+ export async function initProjectFolder(cwd) {
+   const projectHome = path.join(cwd, ".llm-party");
+   const memoryDir = path.join(projectHome, "memory");
+   const skillsDir = path.join(projectHome, "skills");
+   await mkdir(memoryDir, { recursive: true });
+   await mkdir(skillsDir, { recursive: true });
+   const tasksFile = path.join(projectHome, "TASKS.md");
    try {
-     await access(globalBase);
-     return await readFile(globalBase, "utf8");
+     await access(tasksFile);
    }
    catch {
-     // fall through to bundled
+     await writeFile(tasksFile, "# Tasks\n", "utf8");
+   }
+   const projectMd = path.join(memoryDir, "project.md");
+   try {
+     await access(projectMd);
+   }
+   catch {
+     await writeFile(projectMd, "# Project Memory\n\n## Current State\n\nLast Updated:\nActive:\nBlockers:\nNext:\n\n---\n\n## Log\n", "utf8");
+   }
+   const decisionsMd = path.join(memoryDir, "decisions.md");
+   try {
+     await access(decisionsMd);
+   }
+   catch {
+     await writeFile(decisionsMd, "# Decisions\n", "utf8");
    }
-   const bundledBase = path.join(appRoot, "prompts", "base.md");
-   return await readFile(bundledBase, "utf8");
  }
  export async function initLlmPartyHome(appRoot) {
    await mkdir(LLM_PARTY_HOME, { recursive: true });
    await mkdir(path.join(LLM_PARTY_HOME, "sessions"), { recursive: true });
-   const globalBase = path.join(LLM_PARTY_HOME, "base.md");
+   await mkdir(path.join(LLM_PARTY_HOME, "network"), { recursive: true });
+   await mkdir(path.join(LLM_PARTY_HOME, "agents"), { recursive: true });
+   const projectsYml = path.join(LLM_PARTY_HOME, "network", "projects.yml");
    try {
-     await access(globalBase);
+     await access(projectsYml);
    }
    catch {
-     const bundledBase = path.join(appRoot, "prompts", "base.md");
-     await copyFile(bundledBase, globalBase);
+     await writeFile(projectsYml, "projects: []\n", "utf8");
+   }
+   const librariesYml = path.join(LLM_PARTY_HOME, "network", "libraries.yml");
+   try {
+     await access(librariesYml);
+   }
+   catch {
+     await writeFile(librariesYml, "libraries: []\n", "utf8");
    }
    const globalConfig = path.join(LLM_PARTY_HOME, "config.json");
    try {
      await access(globalConfig);
    }
    catch {
-     const username = userInfo().username || "USER";
-     const defaultConfig = {
-       humanName: username,
-       agents: [
-         {
-           name: "Claude",
-           tag: "claude",
-           provider: "claude",
-           model: "sonnet"
-         }
-       ]
-     };
-     const { writeFile } = await import("node:fs/promises");
-     await writeFile(globalConfig, JSON.stringify(defaultConfig, null, 2) + "\n", "utf8");
+     const bundledConfig = await readFile(path.join(appRoot, "configs", "default.json"), "utf8");
+     await writeFile(globalConfig, bundledConfig, "utf8");
    }
-   console.log(`Initialized ~/.llm-party/`);
-   console.log(`  config: ${globalConfig}`);
  }
  export async function loadConfig(configPath) {
    const raw = await readFile(configPath, "utf8");