airvo 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
airvo-0.1.0/LICENSE ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 Airvo

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
airvo-0.1.0/MANIFEST.in ADDED
@@ -0,0 +1,6 @@
# Include compiled React dashboard in the package
recursive-include airvo/dashboard/dist *

# Include README and license
include README.md
include LICENSE
airvo-0.1.0/PKG-INFO ADDED
@@ -0,0 +1,420 @@
Metadata-Version: 2.4
Name: airvo
Version: 0.1.0
Summary: Your local AI coding copilot — any model, any provider, zero cloud.
License: MIT
Keywords: ai,copilot,llm,coding,developer-tools,local-ai,litellm,continue,vscode
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.11
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: fastapi>=0.109.0
Requires-Dist: uvicorn>=0.27.0
Requires-Dist: pydantic>=2.6.0
Requires-Dist: pydantic-settings>=2.2.0
Requires-Dist: litellm>=1.30.0
Requires-Dist: python-dotenv>=1.0.0
Requires-Dist: aiofiles>=23.0.0
Requires-Dist: typer>=0.9.0
Dynamic: license-file

<picture>
  <source media="(prefers-color-scheme: dark)" srcset="airvo/docs/assets/airvo-logo-dark.svg">
  <source media="(prefers-color-scheme: light)" srcset="airvo/docs/assets/airvo-logo-light.svg">
  <img src="airvo/docs/assets/airvo-logo-light.svg" alt="Airvo" height="80"/>
</picture>

<br/>

[![PyPI version](https://img.shields.io/badge/pypi-v0.1.0-7c6dfa?style=flat-square&logo=pypi&logoColor=white)](https://pypi.org/project/airvo)
[![Python](https://img.shields.io/badge/python-3.11+-7c6dfa?style=flat-square&logo=python&logoColor=white)](https://python.org)
[![License](https://img.shields.io/badge/license-MIT-fa6d8f?style=flat-square)](LICENSE)
[![LiteLLM](https://img.shields.io/badge/powered%20by-LiteLLM-4ade80?style=flat-square)](https://litellm.ai)
[![Continue.dev](https://img.shields.io/badge/works%20with-continue.dev-4ade80?style=flat-square)](https://continue.dev)

**Your local AI coding assistant — any model, any provider. Your AI. Your Rules.**

Airvo runs on your machine, connects to multiple AI models simultaneously, and integrates directly into VS Code via continue.dev. No cloud lock-in. No subscriptions. Your API keys stay local.

---

## Table of Contents

- [What is Airvo?](#what-is-airvo)
- [Quick Start](#quick-start)
- [Features](#features)
- [Supported Models](#supported-models)
- [Dashboard](#dashboard)
- [Multi-Model Modes](#multi-model-modes)
- [VS Code Integration](#vs-code-integration)
- [Configuration](#configuration)
- [Use Cases](#use-cases)
- [Roadmap](#roadmap)
- [Security](#security)
- [FAQ](#faq)
- [Community](#community)
- [License](#license)

---

## What is Airvo?

Airvo is a local server that sits between your editor and any AI model. Install it once, configure your API keys in the dashboard, and start coding with AI — without changing your workflow.

```
Your Editor (VS Code)
        │
        │  OpenAI-compatible API
        ▼
Airvo Server  ←── runs on localhost:8765
        │
        ├── Groq (Llama 3.1, Llama 3.3)
        ├── OpenAI (GPT-4o, GPT-4o mini)
        ├── Anthropic (Claude Sonnet, Haiku)
        ├── Ollama (100% local, no API key)
        ├── LM Studio (100% local)
        └── Any LiteLLM-compatible provider
```

**Why Airvo?**

- ✅ Any model, any provider — no lock-in
- ✅ Two models responding in parallel — see both, choose the best
- ✅ Your API keys stored locally — never shared
- ✅ 100% local option — zero internet, zero cost
- ✅ Works with free tiers — Groq, Ollama, LM Studio
- ✅ No subscription required
- ✅ Works natively inside VS Code

---

## Quick Start

**1. Install Airvo**

```bash
pip install airvo
```

**2. Start the server**

```bash
airvo start
```

That's it. Airvo will:
- Create your config at `~/.airvo/models.json`
- Auto-configure continue.dev at `~/.continue/config.yaml`
- Open the dashboard at `http://localhost:8765`

**3. Add your first model**

Open the dashboard → Add Model → fill in the model details → Save.

Not sure where to start? Add Groq — it's free and fast:
- **Model ID:** `groq/llama-3.3-70b-versatile`
- **Provider:** `groq`
- **API Key:** get one free at [console.groq.com](https://console.groq.com) — no credit card required

**4. Install continue.dev in VS Code**

Install the [Continue extension](https://marketplace.visualstudio.com/items?itemName=Continue.continue) from the VS Code marketplace. Airvo already configured it for you.

**5. Start coding**

Open VS Code → press `Ctrl+L` → ask anything.

---

## Features

**🤖 Any Model, Any Provider**
Add any model supported by LiteLLM — over 100 providers. Groq, OpenAI, Anthropic, Ollama, LM Studio, DeepSeek, Mistral, Gemini, and more.

**⚡ Multi-Model Parallel**
Run two models simultaneously. See both responses in VS Code and choose the best one.

**🔒 100% Local Option**
Use Ollama or LM Studio with no API key, no internet, no cost. Your code never leaves your machine.

**🎛️ Visual Dashboard**
Manage models, configure API keys, toggle models on/off — all from a clean dark UI at `localhost:8765`.

**🧠 Project Context**
Write your stack, preferences and constraints once. Airvo injects it into every request so the model always knows your project — without you repeating yourself.

**🌡️ Tunable Behavior**
Adjust temperature (0.0 → 1.0) and max tokens per request directly from the dashboard. Low temperatures give precise, deterministic output for code; higher values allow more creative answers for brainstorming.

**📊 Usage Stats**
See requests and tokens used per model — all stored locally. Know exactly what you're using and reset anytime.

**🌍 7 Languages**
Dashboard available in English, Español, Français, Deutsch, 中文, 日本語, Português.

**🔌 VS Code Native**
Works through continue.dev — chat, edit, and apply code changes without leaving your editor.

---

## Supported Models

| Provider | Model ID | Free | Notes |
|----------|----------|------|-------|
| Groq | `groq/llama-3.1-8b-instant` | ✅ | Fast, free tier |
| Groq | `groq/llama-3.3-70b-versatile` | ✅ | Powerful, free tier |
| OpenAI | `openai/gpt-4o` | ❌ | Requires API key |
| OpenAI | `openai/gpt-4o-mini` | ❌ | Cheaper option |
| Anthropic | `anthropic/claude-sonnet-4-5` | ❌ | Requires API key |
| Anthropic | `anthropic/claude-haiku-4-5` | ❌ | Fastest Claude |
| Ollama | `ollama/llama3` | ✅ | 100% local |
| Ollama | `ollama/codellama` | ✅ | Code-optimized |
| LM Studio | `lmstudio/local` | ✅ | 100% local |
| DeepSeek | `deepseek/deepseek-chat` | ❌ | Very affordable |
| Mistral | `mistral/mistral-large-latest` | ❌ | Requires API key |
| Gemini | `gemini/gemini-1.5-pro` | ❌ | Requires API key |

Any model supported by [LiteLLM](https://docs.litellm.ai/docs/providers) works with Airvo.

---

## Dashboard

The Airvo dashboard runs at `http://localhost:8765` and lets you manage everything visually.

**Models page** — activate/deactivate models, save API keys, see requests and tokens per model.

![Airvo Dashboard - Models](airvo/docs/assets/screenshot-models.png)

**Configuration page** — set multi-model mode, adjust temperature and max tokens, enable project context, view usage stats.

**Add Model page** — add any model with contextual tooltips on every field.

![Airvo Dashboard - Add Model](airvo/docs/assets/screenshot-add-model.png)

**Help page** — full reference guide, field-by-field documentation, FAQ.

![Airvo Dashboard - Help](airvo/docs/assets/screenshot-help.png)

---

## Multi-Model Modes

Airvo supports running multiple models at once. Configure the mode in the Configuration page.

**Parallel** *(default)* — All active models respond to every message. See all answers side by side. Best for comparing outputs.

**Race** — All models receive the message simultaneously. The first to finish wins. Best for speed.

**Vote** — Models generate responses and the consensus answer is shown. Best for accuracy.

**Review** — One model generates a response, another critiques it. Best for quality.
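
The Parallel and Race modes above can be sketched with `asyncio` (a toy illustration using dummy model calls; this is not Airvo's actual dispatch code):

```python
import asyncio

# fake_model stands in for a real provider request, with delay simulating latency.

async def fake_model(name: str, delay: float) -> str:
    await asyncio.sleep(delay)
    return f"{name}: answer"

async def parallel(models: list[tuple[str, float]]) -> list[str]:
    # Parallel: every active model answers; all responses come back,
    # in the same order the models were listed.
    return list(await asyncio.gather(*(fake_model(n, d) for n, d in models)))

async def race(models: list[tuple[str, float]]) -> str:
    # Race: the first model to finish wins; the rest are cancelled.
    tasks = [asyncio.create_task(fake_model(n, d)) for n, d in models]
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()
    return done.pop().result()

models = [("fast-model", 0.01), ("slow-model", 0.05)]
print(asyncio.run(parallel(models)))  # ['fast-model: answer', 'slow-model: answer']
print(asyncio.run(race(models)))      # fast-model: answer
```

Vote and Review would layer an extra aggregation or critique step on top of the same fan-out.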

---

## VS Code Integration

Airvo works through [continue.dev](https://continue.dev) — a VS Code extension for AI-assisted coding.

![Continue.dev with Airvo](airvo/docs/assets/screenshot-vscode.png)

**What you can do:**

```
Chat  → ask questions, get explanations, generate code
Edit  → select code and ask Airvo to modify it
Apply → apply suggested changes directly in your file
```

**The continue.dev config** is created automatically by `airvo start`:

```yaml
models:
  - name: Airvo
    provider: openai
    model: airvo-auto
    apiBase: http://localhost:8765/v1
    apiKey: local
    roles:
      - chat
      - edit
      - apply
```
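
Because the endpoint is OpenAI-compatible, any OpenAI-style client can talk to it, not just continue.dev. A stdlib-only sketch (assumes `airvo start` is running on the default port; the `ask` helper and its names are illustrative):

```python
import json
import urllib.request

# Build a standard OpenAI-style chat-completions payload for Airvo's
# local endpoint. Actually sending it requires a running Airvo server.

AIRVO_URL = "http://localhost:8765/v1/chat/completions"

def build_request(prompt: str, model: str = "airvo-auto") -> dict:
    """Return the JSON body for a chat-completions call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask(prompt: str) -> str:
    """POST the prompt to the local Airvo server and return the reply text."""
    req = urllib.request.Request(
        AIRVO_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer local",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Explain list comprehensions in one line"))
```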

---

## Configuration

**CLI options**

```bash
airvo start                  # default: localhost:8765, opens browser
airvo start --port 9000      # custom port
airvo start --host 0.0.0.0   # accessible from local network
airvo start --no-browser     # don't open browser automatically
airvo start --reload         # hot reload (development)

airvo config --show          # show current config
airvo version                # show version
```

**Models config** — stored at `~/.airvo/models.json`

```json
[
  {
    "id": "groq/llama-3.3-70b-versatile",
    "name": "Llama 3.3 70B (Groq)",
    "provider": "groq",
    "api_key": "your-api-key",
    "base_url": null,
    "active": true,
    "notes": "More powerful, still free"
  }
]
```
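
Given that layout, reading the config programmatically is straightforward. A small sketch (`load_active_models` is an illustrative helper, not part of Airvo's API; the path is a parameter so it can point anywhere):

```python
import json
from pathlib import Path

def load_active_models(config_path: Path) -> list[str]:
    """Return the IDs of models marked active in a models.json file."""
    models = json.loads(config_path.read_text())
    return [m["id"] for m in models if m.get("active")]

if __name__ == "__main__":
    # Default location as documented above.
    print(load_active_models(Path.home() / ".airvo" / "models.json"))
```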

**Adding a local model (Ollama)**

```bash
# 1. Install Ollama from ollama.com
# 2. Pull a model
ollama pull llama3

# 3. Add it in the Airvo dashboard
#    Model ID:  ollama/llama3
#    Provider:  ollama
#    Base URL:  http://localhost:11434
#    API Key:   (leave empty)
```

---

## Use Cases

**Generate a function**
```
"create a Python function that validates an email address with regex"
→ Airvo generates the function with tests
→ click Apply to add it directly to your file
```

**Explain legacy code**
```
"explain what this function does and why it might be slow"
→ if you have two models active, both analyze in parallel
→ you see both perspectives and choose the best explanation
```

**Refactor on the fly**
```
select code → "refactor this to use async/await"
→ Airvo rewrites it
→ Apply the change with one click
```

**100% offline workflow**
```
add ollama/llama3 → no API key, no internet, no cost
→ full AI coding experience with zero data leaving your machine
```

---

## Roadmap

**v0.1.0 — Available now**
- ✅ Local FastAPI server with OpenAI-compatible API
- ✅ LiteLLM integration — any provider, any model
- ✅ Multi-model parallel mode
- ✅ Continue.dev integration for VS Code
- ✅ Visual dashboard with 7 languages
- ✅ CLI — `pip install airvo && airvo start`
- ✅ Temperature and max tokens — configurable from dashboard
- ✅ Project Context — inject your stack into every request
- ✅ Usage stats — requests and tokens per model, stored locally

**What's next**

We're working on the next phase of Airvo. If you want to be the first to know:

→ ⭐ Star this repo to follow updates
→ 💬 Open an issue and tell us what you'd like to see

---

## Security

Airvo is designed with privacy and security in mind:

- **API keys stay local** — stored in `~/.airvo/models.json` on your machine, never sent to Airvo servers
- **Localhost only** — the server listens on `localhost:8765` by default, not accessible from the internet
- **Restricted CORS** — only the dashboard and VS Code extensions can make requests to the server
- **No telemetry** — Airvo collects no usage data, no analytics, no crash reports
- **Open source** — the full source code is on GitHub, you can audit everything

---

## FAQ

**How do I add a model?**
Open the dashboard → Add Model → fill in the Model ID, Provider, and API Key → Save. Any model supported by [LiteLLM](https://docs.litellm.ai/docs/providers) works. Check the Supported Models table for examples.

**How do I run two models in parallel?**
Add two models in the dashboard and activate both. Airvo will call them simultaneously on every request. You can have up to 2 active models in v0.1.0.

**What is the Model ID format?**
It follows LiteLLM's format: `provider/model-name`. For example: `groq/llama-3.3-70b-versatile`, `openai/gpt-4o`, `ollama/llama3`. Check the [LiteLLM docs](https://docs.litellm.ai/docs/providers) for the full list.
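
Splitting such an ID is a one-liner; the only subtlety is splitting on the first `/`, since some model names contain slashes themselves. A sketch (`parse_model_id` is an illustrative helper, not an Airvo function):

```python
def parse_model_id(model_id: str) -> tuple[str, str]:
    """Split a LiteLLM-style ID into (provider, model name).

    Splits on the first "/" only, so model names may contain slashes.
    """
    provider, _, model = model_id.partition("/")
    if not model:
        raise ValueError(f"expected 'provider/model-name', got {model_id!r}")
    return provider, model

print(parse_model_id("groq/llama-3.3-70b-versatile"))  # ('groq', 'llama-3.3-70b-versatile')
```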

**Do I need to pay for anything?**
Airvo itself is free. You only pay for the AI models you use. Groq, Ollama, and LM Studio all have free options.

**Does my code get sent to the cloud?**
Only when you use cloud models (OpenAI, Anthropic, etc.) — and only the specific code you include in your message. When using local models (Ollama, LM Studio), nothing leaves your machine.

**Where are my API keys stored?**
Locally in `~/.airvo/models.json` on your machine. They are never sent anywhere except to the model provider when making requests.

**Can I use any model?**
Yes — any model supported by [LiteLLM](https://docs.litellm.ai/docs/providers) works with Airvo. Over 100 providers.

**How do I get a free Groq API key?**
Go to [console.groq.com](https://console.groq.com), sign up, and create an API key. No credit card required.

**How do I add a local model like Ollama?**
Install [Ollama](https://ollama.com), pull a model with `ollama pull llama3`, then add it in the dashboard with Model ID `ollama/llama3`, Provider `ollama`, Base URL `http://localhost:11434`, and leave the API Key empty.

**Can I run Airvo on a local network?**
Yes — run `airvo start --host 0.0.0.0` and it will be accessible from any device on your network.

**Airvo is not connecting to VS Code — what do I do?**
Make sure continue.dev is installed in VS Code and that `airvo start` has run at least once to create the config. You can verify that the config exists at `~/.continue/config.yaml`.

---

## Community

Airvo is early. Your feedback shapes what comes next.

- 🐛 **Found a bug?** [Open an issue](https://github.com/airvo-dev/airvo/issues)
- 💡 **Have an idea?** [Start a discussion](https://github.com/airvo-dev/airvo/discussions)
- ⭐ **Liked Airvo?** Star the repo — it helps a lot

---

## License

MIT — see [LICENSE](LICENSE) for details.

---

<p align="center">
  Built for developers who want AI that works for them — not the other way around.
  <br/><br/>
  <strong>Your AI. Your Rules.</strong>
</p>