llm-party-cli 0.7.0 → 0.9.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +54 -22
- package/dist/index.js +686 -382
- package/package.json +1 -1
- package/prompts/artifacts.md +61 -4
- package/prompts/base.md +69 -7
package/README.md
CHANGED
@@ -17,7 +17,7 @@
 
 <br/>
 
-A peer orchestrator that puts **Claude**, **Codex**, **Copilot**, and
+A peer orchestrator that puts **Claude**, **Codex**, **Copilot**, and any Claude-compatible API (GLM, Ollama, etc.) in the same terminal. You talk, they listen. They talk to each other. Nobody is the boss except you.
 
 ```
 YOU > @claude review this function
@@ -145,18 +145,31 @@ llm-party uses **official, publicly available SDKs and CLIs** published by each
 | Codex | [`@openai/codex-sdk`](https://www.npmjs.com/package/@openai/codex-sdk) | OpenAI |
 | Copilot | [`@github/copilot-sdk`](https://www.npmjs.com/package/@github/copilot-sdk) | GitHub |
 
+Custom providers (GLM, Ollama, etc.) route through a native CLI's SDK with environment overrides. No additional SDKs are required.
+
 All authentication flows through the provider's own CLI. llm-party does not implement its own auth flow, store credentials, or intercept authentication traffic.
 
 <br/>
 
 ## Supported providers
 
+### Native providers (detected automatically)
+
 | Provider | SDK | Session | Prompt Support |
 | ----------------- | ---------------------------------- | -------------------------------------- | -------------------------------------------------- |
 | **Claude** | `@anthropic-ai/claude-agent-sdk` | Persistent via session ID resume | Full control |
 | **Codex** | `@openai/codex-sdk` | Persistent thread with `run()` turns | Via `developer_instructions` (limitations below) |
 | **Copilot** | `@github/copilot-sdk` | Persistent via `sendAndWait()` | Full control |
-
+
+### Custom providers (config-driven)
+
+Any AI that exposes a Claude-compatible API can be added as a custom provider. Custom providers route through a native CLI (currently Claude) with environment overrides.
+
+| Provider | API | Notes |
+| ----------------- | ---------------------------------- | -------------------------------------------------- |
+| **GLM** | Zhipu AI (`api.z.ai`) | Full Claude SDK compatibility via proxy |
+| **Ollama** | Local (`localhost:11434`) | Any model Ollama supports |
+| **Any** | Any Claude-compatible endpoint | Just set `AUTH_URL` and `AUTH_TOKEN` |
 
 <br/>
 
@@ -178,7 +191,7 @@ Orchestrator
 | +-- Claude -> ClaudeAdapter (SDK session, resume by ID)
 | +-- Codex -> CodexAdapter (SDK thread, persistent turns)
 | +-- Copilot -> CopilotAdapter (SDK session, sendAndWait)
-| +--
+| +-- Custom -> CustomAdapter (routes through native CLI + env override)
 |
 +-- Conversation Log (ordered, all messages, agent-prefixed)
 
@@ -222,16 +235,16 @@ Each agent receives a rolling window of recent messages (configurable, default 1
 | Prompt | Set as `systemMessage` on session creation. Full control. |
 | Tools | Copilot built-in toolset |
 
-### GLM
+### Custom (GLM, Ollama, etc.)
 
-| |
-| ------- |
-| SDK | `@anthropic-ai/claude-agent-sdk`
-| Session | Same as
-| Prompt | Same as
-| Tools | Same as
+| | |
+| ------- | -------------------------------------------------------------------------------- |
+| SDK | Uses the native CLI's SDK (currently Claude's `@anthropic-ai/claude-agent-sdk`) |
+| Session | Same as the underlying CLI |
+| Prompt | Same as the underlying CLI. Full control. |
+| Tools | Same as the underlying CLI |
 
-
+Custom providers route API calls through a native CLI by overriding `AUTH_URL` and `AUTH_TOKEN` in the agent's `env` block. The `cli` field selects which native CLI to use (defaults to `claude`).
 
 <br/>
 
@@ -247,7 +260,7 @@ Override with `LLM_PARTY_CONFIG` env var to point to a different file.
 | --------------- | -------- | -------------------------- | ---------------------------------------------------------------------------- |
 | `humanName` | No | Your system username | Display name in the terminal prompt and passed to agents |
 | `humanTag` | No | derived from `humanName` | Tag for human handoff detection (`@next:you`) |
-| `maxAutoHops` | No | `15` | Max agent-to-agent handoffs per cycle.
+| `maxAutoHops` | No | `15` | Max agent-to-agent handoffs per cycle. `0` = unlimited |
 | `timeout` | No | `600` | Default timeout in seconds for all agents |
 | `agents` | Yes | | Array of agent definitions |
 
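The hunk above documents the `maxAutoHops` semantics: a cap on agent-to-agent handoffs per cycle, with `0` meaning unlimited. A minimal sketch of that guard, assuming a hypothetical `canHandOff` helper (illustrative only, not the package's actual code):

```typescript
// Hypothetical sketch of the documented hop limit: agents keep handing off
// until maxAutoHops is reached; a value of 0 disables the limit entirely.
function canHandOff(hopsSoFar: number, maxAutoHops: number): boolean {
  if (maxAutoHops === 0) return true; // 0 = unlimited handoffs
  return hopsSoFar < maxAutoHops;
}
```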
@@ -257,7 +270,9 @@ Override with `LLM_PARTY_CONFIG` env var to point to a different file.
 | ------------------ | -------- | ------------------------ | --------------------------------------------------------------------------------------------------- |
 | `name` | Yes | | Display name shown in responses as `[AGENT NAME]`. Must be unique. |
 | `tag` | Yes | | Routing tag for `@tag` targeting. Letters, numbers, hyphens, underscores only. No spaces. |
-| `provider` | Yes | | SDK adapter
+| `provider` | Yes | | SDK adapter: `claude`, `codex`, `copilot`, or `custom` |
+| `cli` | No | `"claude"` | For custom providers: which native CLI to route through |
+| `active` | No | `true` | Set to `false` to disable an agent without removing its config |
 | `model` | Yes | | Model ID passed to the provider. Examples: `opus`, `sonnet`, `gpt-5.2`, `gpt-4.1`, `glm-5` |
 | `prompts` | No | none | Array of extra prompt file paths, concatenated after `base.md`. Relative to project root |
 | `executablePath` | No | PATH lookup | Path to the CLI binary. Supports `~/`. Only needed if the CLI is not in your PATH |
@@ -294,25 +309,42 @@ Template variables available in prompt files:
 | `{{validHandoffTargets}}` | Valid `@next:tag` targets |
 | `{{preloadedSkills}}` | Skills assigned to this agent via `preloadSkills` |
 
-###
+### Custom provider setup
+
+Custom providers use `AUTH_URL` and `AUTH_TOKEN` in the `env` block. The adapter maps these to the correct environment variables for the underlying CLI.
 
-GLM
+**GLM (Zhipu AI):**
 
 ```json
 {
   "name": "GLM",
-  "provider": "
+  "provider": "custom",
+  "cli": "claude",
   "model": "glm-5",
   "env": {
-    "
-    "
-    "ANTHROPIC_DEFAULT_HAIKU_MODEL": "glm-4.5-air",
-    "ANTHROPIC_DEFAULT_SONNET_MODEL": "glm-4.5",
-    "ANTHROPIC_DEFAULT_OPUS_MODEL": "glm-5"
+    "AUTH_URL": "https://api.z.ai/api/anthropic",
+    "AUTH_TOKEN": "your-glm-api-key"
   }
 }
 ```
 
+**Ollama (local):**
+
+```json
+{
+  "name": "Ollama",
+  "provider": "custom",
+  "cli": "claude",
+  "model": "llama3",
+  "env": {
+    "AUTH_URL": "http://localhost:11434/v1",
+    "AUTH_TOKEN": "ollama"
+  }
+}
+```
+
+Any endpoint that speaks the Anthropic API protocol works. Set `AUTH_URL` to the base URL and `AUTH_TOKEN` to the API key.
+
 <br/>
 
 ## Skills
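The README text above says the adapter maps `AUTH_URL`/`AUTH_TOKEN` onto the environment variables the underlying CLI reads. A plausible sketch of that mapping for the Claude CLI, assuming it reads `ANTHROPIC_BASE_URL` and `ANTHROPIC_AUTH_TOKEN` — both the function name and that assumption are mine, not taken from the package source:

```typescript
// Hypothetical sketch: translate the generic AUTH_URL / AUTH_TOKEN overrides
// from an agent's `env` block into the variables the Claude CLI is assumed
// to read (ANTHROPIC_BASE_URL / ANTHROPIC_AUTH_TOKEN). Other keys pass through.
function mapCustomEnv(env: Record<string, string>): Record<string, string> {
  const { AUTH_URL, AUTH_TOKEN, ...rest } = env;
  const mapped: Record<string, string> = { ...rest };
  if (AUTH_URL !== undefined) mapped["ANTHROPIC_BASE_URL"] = AUTH_URL;
  if (AUTH_TOKEN !== undefined) mapped["ANTHROPIC_AUTH_TOKEN"] = AUTH_TOKEN;
  return mapped;
}
```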
@@ -409,7 +441,7 @@ LLM_PARTY_CONFIG=/path/to/config.json bun run dev
 Run `/agents` to see available tags. Tags match against agent `tag`, `name`, and `provider`.
 
 **"Unsupported provider"**
-Valid providers: `claude`, `codex`, `copilot`, `
+Valid providers: `claude`, `codex`, `copilot`, `custom`.
 
 **"Duplicate agent name"**
 Agent names must be unique (case-insensitive). Rename one of the duplicates in config.