@draht/pods 2026.3.2-7 → 2026.3.2-9

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +18 -18
  2. package/package.json +3 -3
package/README.md CHANGED
@@ -5,7 +5,7 @@ Deploy and manage LLMs on GPU pods with automatic vLLM configuration for agentic
 ## Installation
 
 ```bash
-npm install -g @mariozechner/pi
+bun add -g @draht/coding-agent
 ```
 
 ## What is pi?
@@ -22,7 +22,7 @@ npm install -g @mariozechner/pi
 ```bash
 # Set required environment variables
 export HF_TOKEN=your_huggingface_token # Get from https://huggingface.co/settings/tokens
-export PI_API_KEY=your_api_key # Any string you want for API authentication
+export DRAHT_API_KEY=your_api_key # Any string you want for API authentication
 
 # Setup a DataCrunch pod with NFS storage (models path auto-extracted)
 pi pods setup dc1 "ssh root@1.2.3.4" \
@@ -39,7 +39,7 @@ pi agent qwen -i
 
 # Use with any OpenAI-compatible client
 export OPENAI_BASE_URL='http://1.2.3.4:8001/v1'
-export OPENAI_API_KEY=$PI_API_KEY
+export OPENAI_API_KEY=$DRAHT_API_KEY
 ```
 
 ## Prerequisites
@@ -121,10 +121,10 @@ pi agent <name> -i # Interactive chat mode
 pi agent <name> -i -c # Continue previous session
 
 # Standalone OpenAI-compatible agent (works with any API)
-pi-agent --base-url http://localhost:8000/v1 --model llama-3.1 "Hello"
-pi-agent --api-key sk-... "What is 2+2?" # Uses OpenAI by default
-pi-agent --json "What is 2+2?" # Output event stream as JSONL
-pi-agent -i # Interactive mode
+draht --base-url http://localhost:8000/v1 --model llama-3.1 "Hello"
+draht --api-key sk-... "What is 2+2?" # Uses OpenAI by default
+draht --json "What is 2+2?" # Output event stream as JSONL
+draht -i # Interactive mode
 ```
 
 The agent includes tools for file operations (read, list, bash, glob, rg) to test agentic capabilities, particularly useful for code navigation and analysis tasks.
@@ -311,29 +311,29 @@ response = client.chat.completions.create(
 `pi` includes a standalone OpenAI-compatible agent that can work with any API:
 
 ```bash
-# Install globally to get pi-agent command
-npm install -g @mariozechner/pi
+# Install globally to get draht command
+bun add -g @draht/coding-agent
 
 # Use with OpenAI
-pi-agent --api-key sk-... "What is machine learning?"
+draht --api-key sk-... "What is machine learning?"
 
 # Use with local vLLM
-pi-agent --base-url http://localhost:8000/v1 \
+draht --base-url http://localhost:8000/v1 \
   --model meta-llama/Llama-3.1-8B-Instruct \
   --api-key dummy \
   "Explain quantum computing"
 
 # Interactive mode
-pi-agent -i
+draht -i
 
 # Continue previous session
-pi-agent --continue "Follow up question"
+draht --continue "Follow up question"
 
 # Custom system prompt
-pi-agent --system-prompt "You are a Python expert" "Write a web scraper"
+draht --system-prompt "You are a Python expert" "Write a web scraper"
 
 # Use responses API (for GPT-OSS models)
-pi-agent --api responses --model openai/gpt-oss-20b "Hello"
+draht --api responses --model openai/gpt-oss-20b "Hello"
 ```
 
 The agent supports:
@@ -411,7 +411,7 @@ Events are automatically converted to the appropriate API format (Chat Completio
 
 Use `--json` flag to output the event stream as JSONL (JSON Lines) for programmatic consumption:
 ```bash
-pi-agent --api-key sk-... --json "What is 2+2?"
+draht --api-key sk-... --json "What is 2+2?"
 ```
 
 Each line is a complete JSON object representing an event:
@@ -502,8 +502,8 @@ ls -la ~/.pi/sessions/
 ## Environment Variables
 
 - `HF_TOKEN` - HuggingFace token for model downloads
-- `PI_API_KEY` - API key for vLLM endpoints
-- `PI_CONFIG_DIR` - Config directory (default: `~/.pi`)
+- `DRAHT_API_KEY` - API key for vLLM endpoints
+- `DRAHT_CONFIG_DIR` - Config directory (default: `~/.pi`)
 - `OPENAI_API_KEY` - Used by `pi-agent` when no `--api-key` provided
 
 ## License
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@draht/pods",
-  "version": "2026.3.2-7",
+  "version": "2026.3.2-9",
   "description": "CLI tool for managing vLLM deployments on GPU pods",
   "type": "module",
   "bin": {
@@ -26,14 +26,14 @@
   "license": "MIT",
   "repository": {
     "type": "git",
-    "url": "git+https://github.com/badlogic/pi-mono.git",
+    "url": "git+https://github.com/draht-dev/draht.git",
     "directory": "packages/pods"
   },
   "engines": {
     "node": ">=20.0.0"
   },
   "dependencies": {
-    "@draht/agent-core": "2026.3.2-7",
+    "@draht/agent-core": "2026.3.2-9",
    "chalk": "^5.5.0"
  },
  "devDependencies": {}
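
Taken together, the README changes in this release rename the install target (`@mariozechner/pi` → `@draht/coding-agent`), the standalone agent command (`pi-agent` → `draht`), and the `PI_*` environment variables (→ `DRAHT_*`). A rough migration sketch for existing users follows; it is assembled only from the diff above and assumes the renamed package and command behave equivalently:

```shell
# Hypothetical migration sketch based on the README diff above.
# Package names and flags are taken from the diff; behavior is assumed unchanged.

# 1. Swap the global install (shown as comments; requires network/registry access):
#    npm uninstall -g @mariozechner/pi
#    bun add -g @draht/coding-agent

# 2. Rename the environment variables; values carry over unchanged.
export PI_API_KEY="${PI_API_KEY:-your_api_key}"        # old name
export DRAHT_API_KEY="$PI_API_KEY"                     # new name, same value
export DRAHT_CONFIG_DIR="${PI_CONFIG_DIR:-$HOME/.pi}"  # default is still ~/.pi

# 3. Invoke the standalone agent via its new name (shown as a comment), e.g.:
#    draht --api-key "$DRAHT_API_KEY" "What is 2+2?"

echo "DRAHT_API_KEY=$DRAHT_API_KEY"
```

Note that the new README still documents `OPENAI_API_KEY` as "used by `pi-agent`", so the old command name may linger in docs or shims.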