@infrarix/locopilot 1.1.0 → 1.2.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +136 -170
  2. package/package.json +1 -1
package/README.md CHANGED
@@ -1,239 +1,205 @@
- # 🚀 LocoPilot
+ <p align="center">
+ <img src="docs/static/img/logo.svg" alt="LocoPilot" width="96" height="96" />
+ </p>

- > Local-first, OpenAI-compatible AI platform.
- > Run models locally in seconds, optionally scale with cloud GPUs — all through one CLI + one API.
+ <h1 align="center">LocoPilot</h1>

- [![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](LICENSE)
+ <p align="center">
+ <strong>A local-first, OpenAI-compatible AI runtime in a single CLI.</strong><br>
+ Run open models on your laptop, fall back to cloud GPUs when you need to, fine-tune on your data — all behind one OpenAI-compatible API.
+ </p>

- ---
-
- # ✨ What is LocoPilot?
-
- LocoPilot lets you:
-
- - ⚡ Run LLMs locally via Ollama (zero config)
- - 🔄 Auto-fallback to remote GPU (Pro)
- - 🧠 Fine-tune models locally or in the cloud
- - 🌐 Expose APIs publicly in one command
+ <p align="center">
+ <a href="https://www.npmjs.com/package/@infrarix/locopilot"><img src="https://img.shields.io/npm/v/@infrarix/locopilot.svg?style=flat-square" alt="npm version" /></a>
+ <a href="https://www.npmjs.com/package/@infrarix/locopilot"><img src="https://img.shields.io/npm/dm/@infrarix/locopilot.svg?style=flat-square" alt="npm downloads" /></a>
+ <a href="https://github.com/Infrarix/locopilot/blob/main/LICENSE"><img src="https://img.shields.io/npm/l/@infrarix/locopilot.svg?style=flat-square" alt="License" /></a>
+ <a href="https://github.com/Infrarix/locopilot/actions"><img src="https://img.shields.io/github/actions/workflow/status/Infrarix/locopilot/release.yml?branch=main&style=flat-square" alt="CI status" /></a>
+ <img src="https://img.shields.io/node/v/@infrarix/locopilot.svg?style=flat-square" alt="Node version" />
+ </p>

- 👉 Works like OpenAI — just change your base URL.
+ <p align="center">
+ <a href="https://infrarix.github.io/locopilot/">Documentation</a> ·
+ <a href="https://infrarix.github.io/locopilot/docs/getting-started/quickstart">Quickstart</a> ·
+ <a href="https://infrarix.github.io/locopilot/docs/cli/init">CLI reference</a> ·
+ <a href="CONTRIBUTING.md">Contributing</a>
+ </p>

 ---

- # 🧩 Open Source vs Pro
-
- ## 🟢 Open Source (this repo)
-
- | Component             | Description                                                                  |
- | --------------------- | ---------------------------------------------------------------------------- |
- | `apps/cli`            | CLI — `init`, `start`, `models`, `train`, `expose`, `login`                  |
- | `apps/api`            | Fastify API — OpenAI-compatible endpoints                                    |
- | `core/runtime/ollama` | Local inference wrapper                                                      |
- | `core/training`       | Basic local training: MLX on Apple Silicon, Unsloth/Axolotl on Linux/Windows |
- | `db/`                 | SQLite schema (free tier)                                                    |
- | `utils/`              | Shared helpers                                                               |
-
- ---
+ ## Why LocoPilot

- ## 🔵 Pro (Cloud — not shipped as code)
+ Most AI tooling forces a choice: ship to a hosted provider and pay per token, or stitch together your own runtime, gateway, and trainer. LocoPilot collapses both into one CLI.

- Powered by **LocoPilot Cloud**:
+ - **Zero config.** `locopilot init` detects your platform, installs Ollama if missing, and writes a working API in under a minute.
+ - **OpenAI-compatible.** Point any client (OpenAI SDKs, LangChain, LlamaIndex, plain `curl`) at `http://localhost:8080/v1` and it just works — `chat/completions`, `models`, streaming.
+ - **Local first, cloud when you need it.** The Free tier serves models from your own Ollama. Sign in with `locopilot login` and requests for models that aren't local fall through to remote GPU automatically.
+ - **Fine-tune from the same CLI.** MLX on Apple Silicon, Unsloth or Axolotl on Linux/Windows. Run locally for free, or pass `--cloud` for managed GPU.
+ - **No vendor lock-in.** MIT-licensed. The Pro features are pure HTTP clients — no private packages, no kernel modules, no telemetry.

- - ☁️ Remote GPU (via RunPod)
- - ⚡ Faster training (10–50x)
- - 🌐 Public API (Cloudflare tunnel)
- - 📊 Usage tracking + billing
- - 🔐 Auth + API key management
+ ## Install

- 👉 No private packages. Everything runs via cloud APIs.
+ ```bash
+ npm install -g @infrarix/locopilot
+ ```

- ---
+ Requires **Node.js 20+**. Ollama is installed automatically by `locopilot init` if it isn't already on `$PATH`.

- # ⚡ Quickstart (Free Tier — 60 seconds)
+ ## Quickstart

 ```bash
- # 1. Install CLI
- npm install -g @infrarix/locopilot
-
- # 2. Initialize (auto-installs Ollama if missing)
+ # 1. Bootstrap (writes ~/.locopilot, installs Ollama if missing)
 locopilot init

- # 3. Start local API
- locopilot start
-
- # 4. Pull a model
+ # 2. Pull a model
 locopilot models pull llama3

- # 5. Test inference
+ # 3. Start the local API
+ locopilot start
+
+ # 4. Use it
 curl http://localhost:8080/v1/chat/completions \
   -H "Content-Type: application/json" \
   -d '{"model":"llama3","messages":[{"role":"user","content":"Hello"}]}'
 ```

- 👉 No Docker. No setup. Works offline.
-
- ---
+ No Docker, no account, works offline.
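For `"stream": true` requests the endpoint returns OpenAI-style server-sent events. A minimal sketch of turning those `data:` lines into text (the `parseSSEChunk` helper is illustrative, invented for this example, and not an export of the package):

```typescript
// Illustrative only: parse OpenAI-style SSE lines ("data: {...}") into
// the content deltas a streaming chat/completions response carries.
// parseSSEChunk is a hypothetical helper, not part of @infrarix/locopilot.
function parseSSEChunk(raw: string): string[] {
  const deltas: string[] = [];
  for (const line of raw.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue;        // skip blanks / keep-alives
    const payload = trimmed.slice('data:'.length).trim();
    if (payload === '[DONE]') break;                   // end-of-stream sentinel
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (typeof delta === 'string') deltas.push(delta);
  }
  return deltas;
}

const chunk = [
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  'data: [DONE]',
].join('\n');

const text = parseSSEChunk(chunk).join('');
```

The same parsing works whether the tokens come from local Ollama or the Pro GPU fallback, since both sit behind the one OpenAI-compatible API.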

- # 🔓 Unlock Pro Features
+ ### From the OpenAI SDK

- ```bash
- # Login (OAuth or API key)
- locopilot login
+ ```ts
+ import OpenAI from 'openai';

- # Expose your API publicly
- locopilot expose
+ const client = new OpenAI({
+   baseURL: 'http://localhost:8080/v1',
+   apiKey: 'not-required-for-local',
+ });

- # Use cloud GPU automatically when needed
+ const reply = await client.chat.completions.create({
+   model: 'llama3',
+   messages: [{ role: 'user', content: 'Hello' }],
+ });
 ```

- ---
-
- # 🏗 Architecture
-
- ## 🟢 Free Tier (Local Mode)
+ ## Commands

- ```
- CLI → Local API → Ollama
-               → SQLite
- ```
+ | Command                                   | Description                                             |
+ | ----------------------------------------- | ------------------------------------------------------- |
+ | `locopilot init`                          | Bootstrap config, install Ollama, run pre-flight checks |
+ | `locopilot doctor`                        | Diagnose the local environment                          |
+ | `locopilot start [--port <n>]`            | Start the OpenAI-compatible API on `localhost:8080`     |
+ | `locopilot models pull <model>`           | Download a model via Ollama                             |
+ | `locopilot models list`                   | List installed models (and cloud catalog when Pro)      |
+ | `locopilot models rm <model>`             | Remove a local model                                    |
+ | `locopilot train --config <file>`         | Run a fine-tuning job locally (in-process)              |
+ | `locopilot train --config <file> --cloud` | Submit a training job to LocoPilot Cloud *(Pro)*        |
+ | `locopilot logs --job <id>`               | Stream logs from a training job                         |
+ | `locopilot expose`                        | Publish the local API on a Cloudflare tunnel *(Pro)*    |
+ | `locopilot login`                         | Sign in for Pro features                                |
+ | `locopilot logout`                        | Remove the stored Pro token                             |
+ | `locopilot whoami`                        | Show the current Pro account                            |
+ | `locopilot usage`                         | Show token usage and billing summary *(Pro)*            |

- - Runs fully offline
- - No account required
- - No external dependencies
+ Full reference at <https://infrarix.github.io/locopilot/docs/cli/init>.

- ---
+ ## Configuration

- ## 🔵 Pro Tier (Hybrid Mode)
+ All local state lives under `~/.locopilot/`:

 ```
- CLI → Local API
-        │
-        └──→ LocoPilot Cloud
-              ├── GPU (RunPod)
-              ├── Tunnel (Cloudflare)
-              ├── Auth
-              └── Usage tracking
+ ~/.locopilot/
+ ├── .env          # API_PORT, OLLAMA_HOST, etc.
+ ├── db.sqlite     # Free-tier inference and training history
+ └── config.json   # Pro-tier token (created by `locopilot login`)
 ```

- ---
-
- # 🧠 Training
+ Tier detection is **purely client-side**: a valid `qs_…` token in `~/.locopilot/config.json` enables Pro features. The CLI never reads cloud credentials or database URLs directly.
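The check described above amounts to a predicate over the stored config; a hypothetical sketch (`isProConfig` and `LocalConfig` are names invented here, and the package's real logic may differ):

```typescript
// Hypothetical sketch of client-side tier detection: Pro is enabled when
// the stored token is shaped like a `qs_` token. Not the package's code.
interface LocalConfig {
  token?: string;
}

function isProConfig(config: LocalConfig): boolean {
  return typeof config.token === 'string' && config.token.startsWith('qs_');
}

const proEnabled = isProConfig({ token: 'qs_example123' });
```

Because the check is local, deleting `config.json` (or running `locopilot logout`) is enough to drop back to the Free tier.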
 
122
- ## Free (Local)
117
+ ## Architecture
123
118
 
124
- ```bash
125
- locopilot train --config config.json
126
- ```
119
+ LocoPilot ships as a single npm package. Source lives under [`src/`](src):
127
120
 
128
- - Runs locally
129
- - Basic configs
130
- - Limited by your hardware
131
- - On Apple Silicon Macs, free-tier training automatically uses MLX (`mlx-lm`, Metal-optimized). On Linux/Windows, Unsloth/Axolotl are used.
121
+ | Path | Purpose |
122
+ | --------------- | ---------------------------------------------------------------- |
123
+ | `src/api/` | Fastify 5 gateway exposing OpenAI-compatible endpoints |
124
+ | `src/cli/` | Commander.js CLI (`init`, `start`, `train`, `expose`, โ€ฆ) |
125
+ | `src/worker/` | Basic in-process training worker (Free tier) |
126
+ | `src/training/` | `TrainingAdapter` interface + MLX / Unsloth / Axolotl runners |
127
+ | `src/cloud/` | Single point of egress to LocoPilot Cloud (Pro) |
128
+ | `src/shared/` | Shared utilities, types, DB pool, Ollama client |
132
129
 
133
- ---
130
+ ### Free tier
134
131
 
135
- ## Pro (Cloud)
136
-
137
- ```bash
138
- locopilot train --cloud
132
+ ```
133
+ client โ”€โ”€โ–บ local Fastify API โ”€โ”€โ–บ Ollama runtime
134
+ โ””โ”€โ–บ SQLite (~/.locopilot/db.sqlite)
139
135
  ```
140
136
 
141
- - Runs on GPU
142
- - Faster + better results
143
- - No setup required
144
-
145
- ---
146
-
147
- # ๐Ÿงฐ CLI Reference
137
+ ### Pro tier
148
138
 
149
- | Command | Description |
150
- | --------------------------------- | -------------------------------------------- |
151
- | `locopilot init` | Setup environment, install Ollama if missing |
152
- | `locopilot start` | Start local API server |
153
- | `locopilot login` | Authenticate for Pro features |
154
- | `locopilot models pull <model>` | Pull model locally |
155
- | `locopilot models list` | List available models |
156
- | `locopilot expose` | Get public API URL |
157
- | `locopilot train --config <file>` | Local training |
158
- | `locopilot train --cloud` | Cloud training (Pro) |
159
- | `locopilot logs` | View logs |
139
+ ```
140
+ client โ”€โ”€โ–บ local Fastify API โ”€โ”€โ–บ Ollama (local first)
141
+ โ””โ”€โ–บ LocoPilot Cloud (model missing locally)
142
+ โ”œโ”€โ”€ RunPod GPU
143
+ โ”œโ”€โ”€ Cloudflare Tunnel (`expose`)
144
+ โ””โ”€โ”€ Auth + usage metering
145
+ ```
160
146
 
- ---
+ ## Fine-tuning

- # 📦 Requirements
+ Free-tier training runs on your machine. The adapter is selected per OS:

- ### Free Tier
+ - **macOS arm64** → MLX (`mlx-lm`, Metal-optimized)
+ - **Linux / Windows + NVIDIA** → Unsloth (default) or Axolotl

- - Node.js 20+
- - Internet (for install only)
- - No Docker required
+ ```bash
+ locopilot train --config train.json
+ ```
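The per-OS selection above is a small dispatch on platform and architecture; an illustrative sketch (`pickAdapter` is invented for this example, not the package's real selection code):

```typescript
// Illustrative adapter dispatch mirroring the list above; not the
// package's actual implementation.
type Adapter = 'mlx' | 'unsloth' | 'axolotl';

function pickAdapter(platform: string, arch: string, prefer?: Adapter): Adapter {
  if (platform === 'darwin' && arch === 'arm64') return 'mlx'; // Apple Silicon → MLX
  return prefer === 'axolotl' ? 'axolotl' : 'unsloth';         // Linux/Windows default
}
```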

- ### Pro Features
+ For multi-GPU, faster wall time, and managed checkpoints:

- - LocoPilot account
- - Internet connection
+ ```bash
+ locopilot train --config train.json --cloud
+ ```

- ---
+ See [training docs](https://infrarix.github.io/locopilot/docs/training/configuration) for the full config schema and supported dataset formats (Alpaca, ShareGPT).


- # 🔐 Security
+ ## Pro tier

- - No API keys required for local mode
- - Pro tokens stored in:
+ LocoPilot is open-source and free for local use. The optional **LocoPilot Cloud** adds:

- ```
- ~/.locopilot/config.json
- ```
+ - Remote GPU inference fallback (RunPod-backed)
+ - Managed training (10–50× faster than a laptop)
+ - Public HTTPS endpoint via Cloudflare Tunnel
+ - Token-level usage analytics and billing

- - No cloud secrets stored locally
+ Pro is purely an HTTP client — no private packages are ever installed locally.

- ---
+ ## Documentation

- # 💡 Philosophy
+ The full Docusaurus site lives at <https://infrarix.github.io/locopilot/>. Highlights:

- > **Free = capability**
- > **Pro = speed, scale, convenience**
+ - [Getting started](https://infrarix.github.io/locopilot/docs/getting-started/quickstart)
+ - [API reference](https://infrarix.github.io/locopilot/docs/api/chat-completions)
+ - [CLI reference](https://infrarix.github.io/locopilot/docs/cli/init)
+ - [Architecture overview](https://infrarix.github.io/locopilot/docs/architecture/overview)
+ - [Training guide](https://infrarix.github.io/locopilot/docs/training/configuration)

- ---
+ ## Contributing

- # 🧑‍💻 Contributing
-
- We welcome contributions!
-
- - Improve CLI / API
- - Add integrations
- - Improve docs
+ Bug reports, feature requests, and PRs are welcome. See [CONTRIBUTING.md](CONTRIBUTING.md) for development setup, branching, commit conventions, and the release process.

 ```bash
- git clone https://github.com/locopilot/locopilot
+ git clone https://github.com/Infrarix/locopilot.git
+ cd locopilot
+ npm install
+ npm run build
+ npm run dev
 ```

- ---
-
- # 📄 License
-
- MIT — see [LICENSE](LICENSE)
-
- ---
-
- # 🚀 Roadmap
-
- - [ ] Plugin SDK
- - [ ] GUI dashboard (Pro)
- - [ ] Multi-model routing
- - [ ] Enterprise features
-
- ---
-
- # 🌐 Learn More
-
- 👉 [https://locopilot.dev](https://locopilot.dev)
-
- ---
+ ## Security

- If you want next level:
+ Found a vulnerability? Please don't open a public issue — see [SECURITY.md](SECURITY.md) for the responsible disclosure process.

- - I can also generate **landing page copy**
- - or **GitHub repo structure + badges**
- - or **launch strategy (HN/Product Hunt)**
+ ## License

- Just tell me 👍
+ [MIT](LICENSE) © 2025–2026 LocoPilot
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@infrarix/locopilot",
-  "version": "1.1.0",
+  "version": "1.2.1",
   "description": "Local-first OpenAI-compatible AI platform",
   "license": "MIT",
   "private": false,