ai-pipe 1.0.0 → 1.0.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,70 +1,124 @@
  # ai-pipe
 
- A lean CLI for calling LLMs from the terminal. Text in, text out.
+ <div align="center">
 
- Built on the [Vercel AI SDK](https://sdk.vercel.ai/) with [Bun](https://bun.sh/).
+ [![License: GPL-3.0](https://img.shields.io/badge/License-GPL--3.0-yellow.svg)](https://www.gnu.org/licenses/gpl-3.0)
+ [![TypeScript](https://img.shields.io/badge/TypeScript-5.0-blue.svg)](https://www.typescriptlang.org/)
+ [![Bun](https://img.shields.io/badge/Bun-1.3.5-black.svg)](https://bun.sh/)
 
- ## Features
+ </div>
 
- - **16 providers** OpenAI, Anthropic, Google, Perplexity, xAI, Mistral, Groq, DeepSeek, Cohere, OpenRouter, Azure AI, Together AI, Amazon Bedrock, Google Vertex AI, Ollama, Hugging Face
- - **Streaming by default** — tokens print as they arrive
- - **Pipe-friendly** — reads from stdin, writes to stdout, errors to stderr
- - **JSON output** — structured response with usage and finish reason
- - **Config directory** — set defaults in `~/.ai-pipe/`
- - **Shell completions** — bash, zsh, fish
- - **Standalone binary** — compile to a single executable with `bun build --compile`
+ A powerful CLI for calling LLMs from the terminal. Text in, text out. Built on the [Vercel AI SDK](https://sdk.vercel.ai/) with [Bun](https://bun.sh/).
 
- ## Install
+ ## ✨ Features
+
+ - 🤖 **16+ AI Providers** — OpenAI, Anthropic, Google, Perplexity, xAI, Mistral, Groq, DeepSeek, Cohere, OpenRouter, Azure AI, Together AI, Amazon Bedrock, Google Vertex AI, Ollama, Hugging Face
+ - 📡 **Streaming by Default** — tokens print as they arrive
+ - 🔄 **Pipe-friendly** — reads from stdin, writes to stdout, errors to stderr
+ - 📎 **File Attachments** — include file contents in prompts with `-f`
+ - 📋 **JSON Output** — structured response with usage and finish reason
+ - ⚙️ **Config Directory** — set defaults in `~/.ai-pipe/`
+ - 🐚 **Shell Completions** — bash, zsh, fish
+ - 📦 **Standalone Binary** — compile to a single executable with `bun build --compile`
+
+ ## 📦 Installation
+
+ ```sh
+ bun install -g ai-pipe
+ ```
+
+ This installs both `ai-pipe` and `ai` as CLI commands.
+
+ Or run without installing:
 
  ```sh
+ bunx ai-pipe "explain monads in one sentence"
+ ```
+
+ Also available via npm (requires [Bun](https://bun.sh/) at runtime):
+
+ ```sh
+ npm install -g ai-pipe
+ npx ai-pipe "explain monads in one sentence"
+ ```
+
+ ### From source
+
+ ```bash
  git clone https://github.com/andrew-bierman/ai-pipe.git
  cd ai-pipe
  bun install
  bun link
  ```
 
- This installs both `ai-pipe` and `ai` as CLI commands.
-
- ## Setup
+ ## 🔑 Setup
 
  Set an API key for at least one provider:
 
- ```sh
+ ```bash
+ # OpenAI
  export OPENAI_API_KEY="sk-..."
+
+ # Anthropic
  export ANTHROPIC_API_KEY="sk-ant-..."
+
+ # OpenRouter
  export OPENROUTER_API_KEY="sk-or-..."
+
+ # Azure AI
  export AZURE_AI_API_KEY="..."
+
+ # Together AI
  export TOGETHERAI_API_KEY="..."
+
+ # AWS (for Bedrock)
  export AWS_ACCESS_KEY_ID="..."
  export AWS_SECRET_ACCESS_KEY="..."
+
+ # Google Vertex AI
  export GOOGLE_VERTEX_PROJECT="my-project"
  export GOOGLE_VERTEX_LOCATION="us-central1"
+
+ # Hugging Face
  export HF_TOKEN="hf_..."
 
  # Ollama (local)
  export OLLAMA_HOST="http://localhost:11434"
  ```
 
- Run `ai-pipe --providers` to see which keys are configured.
+ > 💡 **Tip:** Run `ai-pipe --providers` to see which keys are configured.
 
- ## Usage
+ ## 🚀 Usage
 
- ```sh
+ ### Basic Usage
+
+ ```bash
  # Ask a question
- ai-pipe "explain monads in one sentence"
+ ai-pipe "What is TypeScript?"
+
+ # Streaming response
+ ai-pipe "Write a poem about coding"
 
  # Pipe content
  cat main.go | ai-pipe "review this code"
  echo "hello world" | ai-pipe "translate to French"
+ ```
 
- # Pick a provider and model
+ ### Advanced Options
+
+ ```bash
+ # Specify provider and model
  ai-pipe -m anthropic/claude-sonnet-4-5 "write a haiku"
  ai-pipe -m google/gemini-2.5-flash "summarize this" < article.txt
 
- # Set a system prompt
+ # Include file contents
+ ai-pipe -f main.go "review this code"
+ ai-pipe -f src/app.ts -f src/utils.ts "find bugs"
+
+ # Set system prompt
  ai-pipe -s "you are a senior Go developer" "review this PR" < diff.txt
 
- # Get JSON output
+ # JSON output
  ai-pipe --json "what is 2+2"
 
  # Disable streaming
@@ -77,30 +131,32 @@ ai-pipe -t 1.5 "write a creative story"
  ai-pipe --max-output-tokens 100 "explain quantum computing"
  ```
 
- If no `provider/` prefix is given, the model defaults to `openai`. If no `-m` flag is given, it defaults to `openai/gpt-4o`.
+ > 📌 **Note:** If no `provider/` prefix is given, the model defaults to `openai`. If no `-m` flag is given, it defaults to `openai/gpt-4o`.
 
- ## Providers
+ ### Available Providers
 
  | Provider | Env Variable | Example Model |
  |---|---|---|
- | openai | `OPENAI_API_KEY` | `openai/gpt-4o` |
- | anthropic | `ANTHROPIC_API_KEY` | `anthropic/claude-sonnet-4-5` |
- | google | `GOOGLE_GENERATIVE_AI_API_KEY` | `google/gemini-2.5-flash` |
- | perplexity | `PERPLEXITY_API_KEY` | `perplexity/sonar` |
- | xai | `XAI_API_KEY` | `xai/grok-3` |
- | mistral | `MISTRAL_API_KEY` | `mistral/mistral-large-latest` |
- | groq | `GROQ_API_KEY` | `groq/llama-3.3-70b-versatile` |
- | deepseek | `DEEPSEEK_API_KEY` | `deepseek/deepseek-chat` |
- | cohere | `COHERE_API_KEY` | `cohere/command-r-plus` |
- | openrouter | `OPENROUTER_API_KEY` | `openrouter/openrouter` |
- | azure | `AZURE_AI_API_KEY` | `azure/azure-model-id` |
- | togetherai | `TOGETHERAI_API_KEY` | `togetherai/meta-llama/Llama-3.3-70b-Instruct` |
- | bedrock | `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY` | `bedrock/anthropic.claude-sonnet-4-2025-02-19` |
- | vertex | `GOOGLE_VERTEX_PROJECT`, `GOOGLE_VERTEX_LOCATION` | `vertex/google/cloud/llama-3.1` |
- | ollama | `OLLAMA_HOST` | `ollama/llama3` |
- | huggingface | `HF_TOKEN` | `huggingface/meta-llama/Llama-3.3-70b-Instruct` |
-
- ## Config Directory
+ | OpenAI | `OPENAI_API_KEY` | `openai/gpt-4o` |
+ | Anthropic | `ANTHROPIC_API_KEY` | `anthropic/claude-sonnet-4-5` |
+ | Google | `GOOGLE_GENERATIVE_AI_API_KEY` | `google/gemini-2.5-flash` |
+ | Perplexity | `PERPLEXITY_API_KEY` | `perplexity/sonar` |
+ | xAI | `XAI_API_KEY` | `xai/grok-3` |
+ | Mistral | `MISTRAL_API_KEY` | `mistral/mistral-large-latest` |
+ | Groq | `GROQ_API_KEY` | `groq/llama-3.3-70b-versatile` |
+ | DeepSeek | `DEEPSEEK_API_KEY` | `deepseek/deepseek-chat` |
+ | Cohere | `COHERE_API_KEY` | `cohere/command-r-plus` |
+ | OpenRouter | `OPENROUTER_API_KEY` | `openrouter/openrouter` |
+ | Azure | `AZURE_AI_API_KEY` | `azure/azure-model-id` |
+ | TogetherAI | `TOGETHERAI_API_KEY` | `togetherai/meta-llama/Llama-3.3-70b-Instruct` |
+ | Bedrock | `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY` | `bedrock/anthropic.claude-sonnet-4-2025-02-19` |
+ | Vertex | `GOOGLE_VERTEX_PROJECT`, `GOOGLE_VERTEX_LOCATION` | `vertex/google/cloud/llama-3.1` |
+ | Ollama | `OLLAMA_HOST` | `ollama/llama3` |
+ | HuggingFace | `HF_TOKEN` | `huggingface/meta-llama/Llama-3.3-70b-Instruct` |
+
+ ## ⚙️ Configuration
+
+ ### Config Directory
 
  Create `~/.ai-pipe/` with two optional files:
 
@@ -128,15 +184,15 @@ API keys in `apiKeys.json` work as an alternative to environment variables. Envi
 
  Use a custom config directory with `-c`:
 
- ```sh
+ ```bash
  ai-pipe -c ./my-config-dir "hello"
  ```
 
- CLI flags always override config values.
+ > 🔧 **Note:** CLI flags always override config values.
 
- ## Shell Completions
+ ### Shell Completions
 
- ```sh
+ ```bash
  # bash — add to ~/.bashrc
  eval "$(ai-pipe --completions bash)"
 
@@ -147,7 +203,7 @@ eval "$(ai-pipe --completions zsh)"
  ai-pipe --completions fish > ~/.config/fish/completions/ai-pipe.fish
  ```
 
- ## JSON Output
+ ## 📊 JSON Output
 
  Use `--json` to get structured output:
 
@@ -166,11 +222,11 @@ Use `--json` to get structured output:
 
  Pipe into `jq` for further processing:
 
- ```sh
+ ```bash
  ai-pipe --json "list 3 colors" | jq -r '.text'
  ```
 
- ## Flags
+ ## 🛠️ Command Options
 
  ```
  Usage: ai-pipe [options] [prompt...]
@@ -178,6 +234,7 @@ Usage: ai-pipe [options] [prompt...]
  Options:
  -m, --model <model> Model in provider/model-id format
  -s, --system <prompt> System prompt
+ -f, --file <path> Include file contents in prompt (repeatable)
  -j, --json Output full JSON response object
  --no-stream Wait for full response, then print
  -t, --temperature <n> Sampling temperature (0-2)
@@ -189,15 +246,15 @@
  -h, --help Print help
  ```
 
- ## Build
+ ## 📦 Building & Distribution
 
  Compile to a standalone binary:
 
- ```sh
+ ```bash
  # Current platform
  bun run build
 
- # Cross-platform
+ # Cross-platform builds
  bun run build:mac # macOS ARM
  bun run build:mac-x64 # macOS Intel
  bun run build:linux # Linux x64
@@ -207,14 +264,76 @@ bun run build:all # All targets
 
  Binaries are output to `dist/`.
 
- ## Development
+ ## 🧪 Development
 
- ```sh
+ ```bash
+ # Install dependencies
  bun install
- bun test # 210 tests across 7 files
+
+ # Run tests
+ bun test # 227 tests across 7 files
+
+ # Type checking
  bun run typecheck # TypeScript type checking
  ```
 
- ## License
+ ## 🚀 Releasing
+
+ 1. `bun pm version patch` (or `minor` / `major`)
+ 2. `git push --follow-tags`
+
+ The release workflow handles `bun publish`, binary builds, and GitHub release.
+
+ ## 🗺️ Roadmap
+
+ - [x] **Streaming by default** — tokens print as they arrive
+ - [x] **Pipe-friendly** — reads from stdin, writes to stdout, errors to stderr
+ - [x] **JSON output** — structured response with usage and finish reason
+ - [x] **Config directory** — set defaults in `~/.ai-pipe/`
+ - [x] **Shell completions** — bash, zsh, fish
+ - [x] **Standalone binary** — compile to a single executable with `bun build --compile`
+ - [x] **16 providers** — OpenAI, Anthropic, Google, and 13 more
+ - [x] **npm publishing** — `npm install -g ai-pipe` / `bun install -g ai-pipe`
+ - [x] **File attachments** — include file contents in prompts with `-f`
+ - [ ] **Conversation history** — continue previous conversations with `-C`, named sessions with `--session`
+ - [ ] **Image input** — attach images for vision models with `--image`
+ - [ ] **Roles** — saved system prompts in `~/.ai-pipe/roles/` (e.g. `ai-pipe --role reviewer`)
+ - [ ] **Markdown rendering** — syntax-highlighted, formatted output in the terminal
+ - [ ] **Cost tracking** — show estimated token costs per request
+ - [ ] **Response caching** — skip duplicate API calls for identical prompts
+ - [ ] **Tool use** — function calling and MCP support
+
+ ## 📚 Documentation
+
+ - [API Reference](docs/api.md)
+ - [Provider Configuration](docs/providers.md)
+ - [Examples](examples/)
+ - [Contributing Guide](CONTRIBUTING.md)
+
+ ## 🤝 Contributing
+
+ Contributions are welcome! Please read our [Contributing Guide](CONTRIBUTING.md) for details.
+
+ 1. Fork the repository
+ 2. Create your feature branch (`git checkout -b feature/amazing-feature`)
+ 3. Commit your changes (`git commit -m 'Add amazing feature'`)
+ 4. Push to the branch (`git push origin feature/amazing-feature`)
+ 5. Open a Pull Request
+
+ ## 📄 License
+
+ This project is licensed under the GNU General Public License v3.0 - see the [LICENSE](LICENSE) file for details.
+
+ ## 🙏 Acknowledgments
+
+ - [Vercel](https://vercel.com/) for the amazing AI SDK
+ - [Bun](https://bun.sh/) for the fast JavaScript runtime
+ - All our amazing contributors and users!
+
+ ---
+
+ <div align="center">
+
+ **Built with ❤️ using ai-pipe**
 
- GPL-3.0 — see [LICENSE](LICENSE) for details.
+ </div>
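The README's JSON Output section describes a structured response with the text, usage, and finish reason; its `jq` example confirms a `.text` field. As a minimal sketch of consuming that output downstream — note the `usage` and `finishReason` key names and the sample values here are assumptions for illustration, not confirmed by the diff:

```python
import json

# Hypothetical ai-pipe --json response. Only the "text" field is confirmed
# by the README's jq example; "usage" and "finishReason" are assumed names.
raw = '{"text": "2 + 2 = 4", "usage": {"inputTokens": 10, "outputTokens": 7}, "finishReason": "stop"}'

resp = json.loads(raw)
print(resp["text"])          # the model's reply
print(resp["finishReason"])  # why generation stopped
```

In practice `raw` would arrive on stdin from a pipeline such as `ai-pipe --json "what is 2+2" | python consume.py`.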
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "ai-pipe",
3
- "version": "1.0.0",
3
+ "version": "1.0.2",
4
4
  "description": "A lean CLI for calling LLMs from the terminal. Text in, text out.",
5
5
  "module": "src/index.ts",
6
6
  "type": "module",
@@ -22,7 +22,9 @@
22
22
  "build:mac": "bun build src/index.ts --compile --target=bun-darwin-arm64 --outfile dist/ai-pipe-darwin-arm64",
23
23
  "build:mac-x64": "bun build src/index.ts --compile --target=bun-darwin-x64 --outfile dist/ai-pipe-darwin-x64",
24
24
  "build:all": "bun run build:mac && bun run build:mac-x64 && bun run build:linux && bun run build:linux-arm",
25
- "typecheck": "tsc --noEmit"
25
+ "typecheck": "tsc --noEmit",
26
+ "lint": "bunx biome check --write .",
27
+ "lint:check": "bunx biome check ."
26
28
  },
27
29
  "keywords": [
28
30
  "ai",
@@ -68,7 +70,9 @@
68
70
  "zod": "^4.3.6"
69
71
  },
70
72
  "devDependencies": {
71
- "@types/bun": "latest"
73
+ "@biomejs/biome": "^2.3.14",
74
+ "@types/bun": "latest",
75
+ "lefthook": "^2.1.0"
72
76
  },
73
77
  "peerDependencies": {
74
78
  "typescript": "^5"
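The new `lint` and `lint:check` scripts run Biome, which reads an optional `biome.json` at the project root. That file is not part of this diff; the fragment below is a hypothetical minimal configuration shown only for context (see Biome's configuration docs for the full schema):

```json
{
  "linter": { "enabled": true },
  "formatter": { "enabled": true }
}
```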