@infrarix/locopilot 1.1.0 → 1.2.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +136 -170
- package/package.json +1 -1
package/README.md
CHANGED
<p align="center">
  <img src="docs/static/img/logo.svg" alt="LocoPilot" width="96" height="96" />
</p>

<h1 align="center">LocoPilot</h1>

<p align="center">
  <strong>A local-first, OpenAI-compatible AI runtime in a single CLI.</strong><br>
  Run open models on your laptop, fall back to cloud GPUs when you need to, fine-tune on your data, all behind one OpenAI-compatible API.
</p>

<p align="center">
  <a href="https://www.npmjs.com/package/@infrarix/locopilot"><img src="https://img.shields.io/npm/v/@infrarix/locopilot.svg?style=flat-square" alt="npm version" /></a>
  <a href="https://www.npmjs.com/package/@infrarix/locopilot"><img src="https://img.shields.io/npm/dm/@infrarix/locopilot.svg?style=flat-square" alt="npm downloads" /></a>
  <a href="https://github.com/Infrarix/locopilot/blob/main/LICENSE"><img src="https://img.shields.io/npm/l/@infrarix/locopilot.svg?style=flat-square" alt="License" /></a>
  <a href="https://github.com/Infrarix/locopilot/actions"><img src="https://img.shields.io/github/actions/workflow/status/Infrarix/locopilot/release.yml?branch=main&style=flat-square" alt="CI status" /></a>
  <img src="https://img.shields.io/node/v/@infrarix/locopilot.svg?style=flat-square" alt="Node version" />
</p>

<p align="center">
  <a href="https://infrarix.github.io/locopilot/">Documentation</a> ·
  <a href="https://infrarix.github.io/locopilot/docs/getting-started/quickstart">Quickstart</a> ·
  <a href="https://infrarix.github.io/locopilot/docs/cli/init">CLI reference</a> ·
  <a href="CONTRIBUTING.md">Contributing</a>
</p>

---

## Why LocoPilot

Most AI tooling forces a choice: ship to a hosted provider and pay per token, or stitch together your own runtime, gateway, and trainer. LocoPilot collapses both into one CLI.

- **Zero config.** `locopilot init` detects your platform, installs Ollama if missing, and leaves you with a working API in under a minute.
- **OpenAI-compatible.** Point any client (OpenAI SDKs, LangChain, LlamaIndex, plain `curl`) at `http://localhost:8080/v1` and it just works: `chat/completions`, `models`, streaming.
- **Local first, cloud when you need it.** The Free tier serves models from your own Ollama. Sign in with `locopilot login`, and requests for models that aren't local fall through to a remote GPU automatically.
- **Fine-tune from the same CLI.** MLX on Apple Silicon; Unsloth or Axolotl on Linux/Windows. Run locally for free, or pass `--cloud` for managed GPU.
- **No vendor lock-in.** MIT-licensed. The Pro features are pure HTTP clients: no private packages, no kernel modules, no telemetry.

## Install

```bash
npm install -g @infrarix/locopilot
```

Requires **Node.js 20+**. Ollama is installed automatically by `locopilot init` if it isn't already on `$PATH`.

## Quickstart

```bash
# 1. Bootstrap (writes ~/.locopilot, installs Ollama if missing)
locopilot init

# 2. Pull a model
locopilot models pull llama3

# 3. Start the local API
locopilot start

# 4. Use it
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"llama3","messages":[{"role":"user","content":"Hello"}]}'
```

No Docker, no account, works offline.

### From the OpenAI SDK

```ts
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'http://localhost:8080/v1',
  apiKey: 'not-required-for-local',
});

const reply = await client.chat.completions.create({
  model: 'llama3',
  messages: [{ role: 'user', content: 'Hello' }],
});
```
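
Because the response follows the OpenAI chat-completion schema, reading the assistant's reply looks the same as it would against the hosted API. A minimal sketch, with the `ChatCompletion` shape trimmed to just the fields used here:

```typescript
// Minimal slice of the OpenAI chat-completion response schema.
interface ChatCompletion {
  choices: { message: { role: string; content: string | null } }[];
}

// Extract the assistant text from a completion, falling back to ''
// when there are no choices or the content is null.
function replyText(res: ChatCompletion): string {
  return res.choices[0]?.message.content ?? '';
}
```

With the `reply` from the snippet above, `replyText(reply)` yields the model's answer.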

## Commands

| Command                                   | Description                                             |
| ----------------------------------------- | ------------------------------------------------------- |
| `locopilot init`                          | Bootstrap config, install Ollama, run pre-flight checks |
| `locopilot doctor`                        | Diagnose the local environment                          |
| `locopilot start [--port <n>]`            | Start the OpenAI-compatible API on `localhost:8080`     |
| `locopilot models pull <model>`           | Download a model via Ollama                             |
| `locopilot models list`                   | List installed models (and the cloud catalog on Pro)    |
| `locopilot models rm <model>`             | Remove a local model                                    |
| `locopilot train --config <file>`         | Run a fine-tuning job locally (in-process)              |
| `locopilot train --config <file> --cloud` | Submit a training job to LocoPilot Cloud *(Pro)*        |
| `locopilot logs --job <id>`               | Stream logs from a training job                         |
| `locopilot expose`                        | Publish the local API on a Cloudflare tunnel *(Pro)*    |
| `locopilot login`                         | Sign in for Pro features                                |
| `locopilot logout`                        | Remove the stored Pro token                             |
| `locopilot whoami`                        | Show the current Pro account                            |
| `locopilot usage`                         | Show token usage and billing summary *(Pro)*            |

Full reference at <https://infrarix.github.io/locopilot/docs/cli/init>.

## Configuration

All local state lives under `~/.locopilot/`:

```
~/.locopilot/
├── .env         # API_PORT, OLLAMA_HOST, etc.
├── db.sqlite    # Free-tier inference and training history
└── config.json  # Pro-tier token (created by `locopilot login`)
```

Tier detection is **purely client-side**: a valid `qs_…` token in `~/.locopilot/config.json` enables Pro features. The CLI never reads cloud credentials or database URLs directly.
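
That check can be sketched as a pure, local string test. `isProToken` is a hypothetical helper written for illustration, not the CLI's actual internals:

```typescript
// Hypothetical sketch of client-side tier detection: Pro is enabled purely
// by a "qs_"-prefixed token in ~/.locopilot/config.json; no network call.
interface LocalConfig {
  token?: string;
}

function isProToken(config: LocalConfig): boolean {
  return typeof config.token === 'string' && config.token.startsWith('qs_');
}
```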

## Architecture

LocoPilot ships as a single npm package. Source lives under [`src/`](src):

| Path            | Purpose                                                       |
| --------------- | ------------------------------------------------------------- |
| `src/api/`      | Fastify 5 gateway exposing OpenAI-compatible endpoints        |
| `src/cli/`      | Commander.js CLI (`init`, `start`, `train`, `expose`, …)      |
| `src/worker/`   | Basic in-process training worker (Free tier)                  |
| `src/training/` | `TrainingAdapter` interface + MLX / Unsloth / Axolotl runners |
| `src/cloud/`    | Single point of egress to LocoPilot Cloud (Pro)               |
| `src/shared/`   | Shared utilities, types, DB pool, Ollama client               |

### Free tier

```
client ──▶ local Fastify API ──▶ Ollama runtime
                             └─▶ SQLite (~/.locopilot/db.sqlite)
```

### Pro tier

```
client ──▶ local Fastify API ──▶ Ollama (local first)
                             └─▶ LocoPilot Cloud (model missing locally)
                                   ├── RunPod GPU
                                   ├── Cloudflare Tunnel (`expose`)
                                   └── Auth + usage metering
```
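
The local-first rule in the diagram can be sketched as a routing decision. `routeRequest` is a hypothetical name for illustration; the real gateway's logic may differ:

```typescript
// Illustrative sketch of local-first routing: serve from Ollama when the
// model is installed, fall through to LocoPilot Cloud on Pro, else fail.
type Route = 'ollama' | 'cloud';

function routeRequest(
  model: string,
  localModels: string[],
  isPro: boolean,
): Route {
  if (localModels.includes(model)) return 'ollama'; // local first
  if (isPro) return 'cloud'; // remote GPU fallback (Pro)
  throw new Error(`Model "${model}" is not installed locally`);
}
```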

## Fine-tuning

Free-tier training runs on your machine. The adapter is selected per OS:

- **macOS arm64**: MLX (`mlx-lm`, Metal-optimized)
- **Linux / Windows + NVIDIA**: Unsloth (default) or Axolotl

```bash
locopilot train --config train.json
```

For multi-GPU runs, faster wall-clock time, and managed checkpoints:

```bash
locopilot train --config train.json --cloud
```

See the [training docs](https://infrarix.github.io/locopilot/docs/training/configuration) for the full config schema and supported dataset formats (Alpaca, ShareGPT).
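
For reference, an Alpaca-style dataset is a JSON array of instruction/input/output records. One illustrative record follows; the field values are invented, and the exact schema `locopilot train` accepts is defined in the training docs:

```json
[
  {
    "instruction": "Summarize the following text in one sentence.",
    "input": "LocoPilot is a local-first AI runtime with an OpenAI-compatible API.",
    "output": "LocoPilot runs open models locally behind an OpenAI-compatible API."
  }
]
```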

## Pro tier

LocoPilot is open source and free for local use. The optional **LocoPilot Cloud** adds:

- Remote GPU inference fallback (RunPod-backed)
- Managed training (10–50× faster than a laptop)
- A public HTTPS endpoint via Cloudflare Tunnel
- Token-level usage analytics and billing

Pro is purely an HTTP client: no private packages are ever installed locally.

## Documentation

The full Docusaurus site lives at <https://infrarix.github.io/locopilot/>. Highlights:

- [Getting started](https://infrarix.github.io/locopilot/docs/getting-started/quickstart)
- [API reference](https://infrarix.github.io/locopilot/docs/api/chat-completions)
- [CLI reference](https://infrarix.github.io/locopilot/docs/cli/init)
- [Architecture overview](https://infrarix.github.io/locopilot/docs/architecture/overview)
- [Training guide](https://infrarix.github.io/locopilot/docs/training/configuration)

## Contributing

Bug reports, feature requests, and PRs are welcome. See [CONTRIBUTING.md](CONTRIBUTING.md) for development setup, branching, commit conventions, and the release process.

```bash
git clone https://github.com/Infrarix/locopilot.git
cd locopilot
npm install
npm run build
npm run dev
```

## Security

Found a vulnerability? Please don't open a public issue; see [SECURITY.md](SECURITY.md) for the responsible disclosure process.

## License

[MIT](LICENSE) © 2025–2026 LocoPilot