loclaude 0.0.1-alpha.3 → 0.0.2
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +2 -2
- package/README.md +150 -64
- package/libs/cli/dist/commands/doctor.d.ts.map +1 -1
- package/libs/cli/dist/index.bun.js +108 -27
- package/libs/cli/dist/index.bun.js.map +5 -5
- package/libs/cli/dist/index.js +108 -27
- package/libs/cli/dist/index.js.map +5 -5
- package/libs/cli/package.json +2 -2
- package/package.json +17 -7
package/CHANGELOG.md
CHANGED

@@ -63,5 +63,5 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

 This is an alpha release. The API and command structure may change before 1.0.

-[Unreleased]: https://github.com/nicholasgalante1997/
-[0.0.1-alpha.1]: https://github.com/nicholasgalante1997/
+[Unreleased]: https://github.com/nicholasgalante1997/loclaude/compare/v0.0.1-rc.1...HEAD
+[0.0.1-alpha.1]: https://github.com/nicholasgalante1997/loclaude/releases/tag/v0.0.1-alpha.1
package/README.md
CHANGED

@@ -1,76 +1,108 @@
+<div align="center">
+
 # loclaude

-
+**Claude Code with Local LLMs**

-
-- Launch Claude Code sessions connected to your local Ollama instance
-- Manage Ollama + Open WebUI Docker containers
-- Pull and manage Ollama models
-- Scaffold new projects with opinionated Docker configs
-- **Supports both GPU and CPU-only modes**
+Stop burning through Claude API usage limits. Run Claude Code's powerful agentic workflow with local Ollama models on your own hardware.

-
+> **Requires ollama v0.14.2 or higher**

-
-# With npm (requires Node.js 18+)
-npm install -g loclaude
+**Zero API costs. No rate limits. Complete privacy.**

-
-
-```
+[](https://www.npmjs.com/package/loclaude)
+[](https://opensource.org/licenses/MIT)

-
+[Quick Start](#quick-start-5-minutes) • [Why loclaude?](#why-loclaude) • [Installation](#installation) • [FAQ](#faq)

-
-- [Claude Code CLI](https://docs.anthropic.com/en/docs/claude-code) installed (`npm install -g @anthropic-ai/claude-code`)
+</div>

-
+---

-
-
+## Why loclaude?
+
+### Real Value
+
+- **No Rate Limits**: Use Claude Code as much as you want
+- **Privacy**: Your code never leaves your machine
+- **Cost Control**: Use your own hardware, pay for electricity, not tokens
+- **Offline Capable**: Work without internet (after model download)
+- **GPU or CPU**: Works with NVIDIA GPUs or CPU-only systems
+
+### What to Expect

-
+loclaude provides:

-
+- One-command setup for Ollama + Open WebUI containers
+- Smart model management with auto-loading
+- GPU auto-detection with CPU fallback
+- Project scaffolding with Docker configs

-
+## Installation

 ```bash
-
+# With npm (requires Node.js 18+)
+npm install -g loclaude
+
+# With bun (faster, recommended)
+bun install -g loclaude # use bun-loclaude for commands
 ```

-
+### vs. Other Solutions
+
+| Solution | Cost | Speed | Privacy | Limits |
+|----------|------|-------|---------|--------|
+| **loclaude** | Free after setup | Fast (GPU) | 100% local | None |
+| Claude API/Web | $20-200+/month | Fast | Cloud-based | Rate limited |
+| GitHub Copilot | $10-20/month | Fast | Cloud-based | Context limited |
+| Cursor/Codeium | $20+/month | Fast | Cloud-based | Usage limits |

-
+loclaude gives you the utility of Ollama with the convenience of a managed solution for Claude Code integration.
+
+## Quick Start (5 Minutes)

 ```bash
-#
+# 1. Install loclaude
+npm install -g loclaude
+
+# 2. Install Claude Code (if you haven't already)
+npm install -g @anthropic-ai/claude-code
+
+# 3. Setup your project (auto-detects GPU)
 loclaude init

-# Start Ollama
+# 4. Start Ollama container
 loclaude docker-up

-# Pull a model
-loclaude models-pull qwen3-coder:30b
+# 5. Pull a model (choose based on your hardware)
+loclaude models-pull qwen3-coder:30b # GPU with 16GB+ VRAM
+# OR
+loclaude models-pull qwen2.5-coder:7b # CPU or limited VRAM

-# Run Claude Code with local LLM
+# 6. Run Claude Code with unlimited local LLM
 loclaude run
 ```

-
+That's it! You now have unlimited Claude Code sessions with local models.

-
-# Initialize without GPU support
-loclaude init --no-gpu
+## Prerequisites

-
-
+**Required:**
+
+- [Docker](https://docs.docker.com/get-docker/) with Docker Compose v2
+- [Claude Code CLI](https://docs.anthropic.com/en/docs/claude-code) (`npm install -g @anthropic-ai/claude-code`)

-
-loclaude models-pull qwen2.5-coder:7b
+**Optional (for GPU acceleration):**

-
-
+- NVIDIA GPU with 16GB+ VRAM (RTX 3090, 4090, A5000, etc.)
+- [NVIDIA Container Toolkit](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html)
+
+**CPU-only systems work fine!** Use `--no-gpu` flag during init and smaller models.
+
+**Check your setup:**
+
+```bash
+loclaude doctor
 ```

 ## Features
@@ -78,20 +110,15 @@ loclaude run
 ### Automatic Model Loading

 When you run `loclaude run`, it automatically:
+
 1. Checks if your selected model is loaded in Ollama
-2. If not loaded, warms up the model with a 10-minute keep-alive
+2. If not loaded, warms up the model with a 10-minute keep-alive (configurable through env vars)
 3. Shows `[loaded]` indicator in model selection for running models

-### Colorful CLI Output
-
-All commands feature colorful, themed output for better readability:
-- Status indicators with colors (green/yellow/red)
-- Model sizes color-coded by magnitude
-- Clear headers and structured output
-
 ### GPU Auto-Detection

 `loclaude init` automatically detects NVIDIA GPUs and configures the appropriate Docker setup:
+
 - **GPU detected**: Uses `runtime: nvidia` and CUDA-enabled images
 - **No GPU**: Uses CPU-only configuration with smaller default models

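
The 10-minute warm-up in step 2 corresponds to Ollama's `keep_alive` mechanism: a `/api/generate` request that omits the prompt loads the model into memory and keeps it resident for the given duration. A minimal TypeScript sketch of such a call, assuming the default port; the `warmUpModel` helper and the `OLLAMA_URL` env-var fallback are illustrative, not loclaude's actual internals:

```ts
// Illustrative warm-up call (not loclaude's code). Ollama loads a model when
// /api/generate is hit, and `keep_alive` controls how long it stays resident.
const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434"; // assumed default

async function warmUpModel(model: string, keepAlive = "10m"): Promise<void> {
  // Omitting the prompt makes Ollama load the model without generating tokens.
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, keep_alive: keepAlive }),
  });
  if (!res.ok) throw new Error(`Warm-up failed: HTTP ${res.status}`);
}

// Example: await warmUpModel("qwen3-coder:30b");
```
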
@@ -147,21 +174,22 @@ loclaude config-paths # Show config file search paths

 ## Recommended Models

-### For GPU (16GB+ VRAM)
+### For GPU (16GB+ VRAM) - Best Experience
+
+| Model | Size | Speed | Quality | Best For |
+|-------|------|-------|---------|----------|
+| `qwen3-coder:30b` | ~17 GB | ~50-100 tok/s | Excellent | **Most coding tasks, refactoring, debugging** |
+| `deepseek-coder:33b` | ~18 GB | ~40-80 tok/s | Excellent | Code understanding, complex logic |

-
-|-------|------|----------|
-| `qwen3-coder:30b` | ~17 GB | Best coding performance |
-| `deepseek-coder:33b` | ~18 GB | Code understanding |
-| `gpt-oss:20b` | ~13 GB | General purpose |
+**Recommendation:** Start with `qwen3-coder:30b` for the best balance of speed and quality.

-### For CPU or Limited VRAM
+### For CPU or Limited VRAM (<16GB) - Still Productive

-| Model | Size |
-
-| `qwen2.5-coder:7b` | ~4 GB |
-| `
-| `
+| Model | Size | Speed | Quality | Best For |
+|-------|------|-------|---------|----------|
+| `qwen2.5-coder:7b` | ~4 GB | ~10-20 tok/s | Good | **Code completion, simple refactoring** |
+| `deepseek-coder:6.7b` | ~4 GB | ~10-20 tok/s | Good | Understanding existing code |
+| `llama3.2:3b` | ~2 GB | ~15-30 tok/s | Fair | Quick edits, file operations |

 ## Configuration

@@ -217,8 +245,8 @@ When containers are running:

 | Service | URL | Description |
 |---------|-----|-------------|
-| Ollama API | http://localhost:11434 | LLM inference API |
-| Open WebUI | http://localhost:3000 | Chat interface |
+| Ollama API | <http://localhost:11434> | LLM inference API |
+| Open WebUI | <http://localhost:3000> | Chat interface |

 ## Project Structure

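
Once the containers are up, the Ollama endpoint from the table above can be probed directly. A quick TypeScript sketch against the same `/api/version` route the new doctor check later in this diff uses, assuming the default port binding shown in the table:

```ts
// Reachability probe for the Ollama API listed above (illustrative).
const res = await fetch("http://localhost:11434/api/version", {
  signal: AbortSignal.timeout(5000), // don't hang if the container is down
});
const body = res.ok ? await res.json() : null;
console.log(body ? `Ollama ${body.version} is up` : "Ollama is not responding");
```
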
@@ -248,6 +276,30 @@ mise run pull <model> # loclaude models-pull <model>
 mise run doctor # loclaude doctor
 ```

+## FAQ
+
+### Is this really unlimited?
+
+Yes! Once you have models downloaded, you can run as many sessions as you want with zero additional cost.
+
+### How does the quality compare to Claude API?
+
+30B parameter models (qwen3-coder:30b) are comparable to GPT-3.5 and work okay for most coding tasks. Larger models have a bit more success. Claude API is still better, but this allows for continuing work when you have hit that pesky usage limit.
+
+### Do I need a GPU?
+
+No, but highly recommended. CPU-only mode works with smaller models at ~10-20 tokens/sec. A GPU (16GB+ VRAM) gives you 50-100 tokens/sec with larger, better models.
+
+### What's the catch?
+
+- Initial setup takes 5-10 minutes
+- Model downloads are large (4-20GB)
+- GPU hardware investment if you don't have one (~$500-1500 used)
+
+### Can I use this with the Claude API too?
+
+Absolutely! Keep using Claude API for critical tasks, use loclaude for everything else to save money and avoid limits.
+
 ## Troubleshooting

 ### Check System Requirements
@@ -257,6 +309,7 @@ loclaude doctor
 ```

 This verifies:
+
 - Docker and Docker Compose installation
 - NVIDIA GPU detection (optional)
 - NVIDIA Container Toolkit (optional)
@@ -301,12 +354,45 @@ If inference is slow on CPU:
 2. Expect ~10-20 tokens/sec on modern CPUs
 3. Consider cloud models via Ollama: `glm-4.7:cloud`

+## Contributing
+
+loclaude is open source and welcomes contributions! Here's how you can help:
+
+### Share Your Experience
+
+- Star the repo if loclaude saves you money or spares you rate limits
+- Share your setup and model recommendations
+- Write about your experience on dev.to, Twitter, or your blog
+- Report bugs and request features via GitHub Issues
+
+### Code Contributions
+
+- Fix bugs or add features (see open issues)
+- Improve documentation or examples
+- Add support for new model providers
+- Optimize model loading and performance
+
+### Spread the Word
+
+- Post on r/LocalLLaMA, r/selfhosted, r/ClaudeAI
+- Share in Discord/Slack dev communities
+- Help others troubleshoot in GitHub Discussions
+
+Every star, issue report, and shared experience helps more developers discover unlimited local Claude Code.
+
+## Getting Help
+
+- **Issues/Bugs**: [GitHub Issues](https://github.com/nicholasgalante1997/loclaude/issues)
+- **Questions**: [GitHub Discussions](https://github.com/nicholasgalante1997/loclaude/discussions)
+- **Documentation**: Run `loclaude --help` or check this README
+- **System Check**: Run `loclaude doctor` to diagnose problems
+
 ## Development

 ### Building from Source

 ```bash
-git clone https://github.com/nicholasgalante1997/
+git clone https://github.com/nicholasgalante1997/loclaude.git loclaude
 cd loclaude
 bun install
 bun run build
package/libs/cli/dist/commands/doctor.d.ts.map
CHANGED

@@ -1 +1 @@
-{"version":3,"file":"doctor.d.ts","sourceRoot":"","sources":["../../lib/commands/doctor.ts"],"names":[],"mappings":"AAAA;;GAEG;
+{"version":3,"file":"doctor.d.ts","sourceRoot":"","sources":["../../lib/commands/doctor.ts"],"names":[],"mappings":"AAAA;;GAEG;AAmRH,wBAAsB,MAAM,IAAI,OAAO,CAAC,IAAI,CAAC,CA+B5C;AAED;;GAEG;AACH,wBAAsB,YAAY,IAAI,OAAO,CAAC,OAAO,CAAC,CAMrD"}
package/libs/cli/dist/index.bun.js
CHANGED

@@ -1082,13 +1082,13 @@ function getClaudeExtraArgs() {
 var OLLAMA_URL = getOllamaUrl();
 var DEFAULT_MODEL = getDefaultModel();

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/key.js
 var isUpKey = (key, keybindings = []) => key.name === "up" || keybindings.includes("vim") && key.name === "k" || keybindings.includes("emacs") && key.ctrl && key.name === "p";
 var isDownKey = (key, keybindings = []) => key.name === "down" || keybindings.includes("vim") && key.name === "j" || keybindings.includes("emacs") && key.ctrl && key.name === "n";
 var isBackspaceKey = (key) => key.name === "backspace";
 var isNumberKey = (key) => "1234567890".includes(key.name);
 var isEnterKey = (key) => key.name === "enter" || key.name === "return";
-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/errors.js
 class AbortPromptError extends Error {
   name = "AbortPromptError";
   message = "Prompt was aborted";
@@ -1114,10 +1114,10 @@ class HookError extends Error {
 class ValidationError extends Error {
   name = "ValidationError";
 }
-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/use-state.js
 import { AsyncResource as AsyncResource2 } from "async_hooks";

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/hook-engine.js
 import { AsyncLocalStorage, AsyncResource } from "async_hooks";
 var hookStorage = new AsyncLocalStorage;
 function createStore(rl) {
@@ -1222,7 +1222,7 @@ var effectScheduler = {
   }
 };

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/use-state.js
 function useState(defaultValue) {
   return withPointer((pointer) => {
     const setState = AsyncResource2.bind(function setState(newValue) {
@@ -1240,7 +1240,7 @@ function useState(defaultValue) {
   });
 }

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/use-effect.js
 function useEffect(cb, depArray) {
   withPointer((pointer) => {
     const oldDeps = pointer.get();
@@ -1252,7 +1252,7 @@ function useEffect(cb, depArray) {
   });
 }

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/theme.js
 import { styleText } from "util";

 // ../../node_modules/.bun/@inquirer+figures@2.0.3/node_modules/@inquirer/figures/dist/index.js
@@ -1544,7 +1544,7 @@ var figures = shouldUseMain ? mainSymbols : fallbackSymbols;
 var dist_default2 = figures;
 var replacements = Object.entries(specialMainSymbols);

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/theme.js
 var defaultTheme = {
   prefix: {
     idle: styleText("blue", "?"),
@@ -1565,7 +1565,7 @@ var defaultTheme = {
   }
 };

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/make-theme.js
 function isPlainObject(value) {
   if (typeof value !== "object" || value === null)
     return false;
@@ -1593,7 +1593,7 @@ function makeTheme(...themes) {
   return deepMerge2(...themesToMerge);
 }

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/use-prefix.js
 function usePrefix({ status = "idle", theme }) {
   const [showLoader, setShowLoader] = useState(false);
   const [tick, setTick] = useState(0);
@@ -1623,7 +1623,7 @@ function usePrefix({ status = "idle", theme }) {
   const iconName = status === "loading" ? "idle" : status;
   return typeof prefix === "string" ? prefix : prefix[iconName] ?? prefix["idle"];
 }
-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/use-memo.js
 function useMemo(fn, dependencies) {
   return withPointer((pointer) => {
     const prev = pointer.get();
@@ -1635,11 +1635,11 @@ function useMemo(fn, dependencies) {
     return prev.value;
   });
 }
-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/use-ref.js
 function useRef(val) {
   return useState({ current: val })[0];
 }
-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/use-keypress.js
 function useKeypress(userHandler) {
   const signal = useRef(userHandler);
   signal.current = userHandler;
@@ -1657,7 +1657,7 @@ function useKeypress(userHandler) {
     };
   }, []);
 }
-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/utils.js
 var import_cli_width = __toESM(require_cli_width(), 1);

 // ../../node_modules/.bun/ansi-regex@6.2.2/node_modules/ansi-regex/index.js
@@ -2095,7 +2095,7 @@ function wrapAnsi(string, columns, options) {
 `);
 }

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/utils.js
 function breakLines(content, width) {
   return content.split(`
 `).flatMap((line) => wrapAnsi(line, width, { trim: false, hard: true }).split(`
@@ -2106,7 +2106,7 @@ function readlineWidth() {
   return import_cli_width.default({ defaultWidth: 80, output: readline().output });
 }

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/pagination/use-pagination.js
 function usePointerPosition({ active, renderedItems, pageSize, loop }) {
   const state = useRef({
     lastPointer: active,
@@ -2172,7 +2172,7 @@ function usePagination({ items, active, renderItem, pageSize, loop = true }) {
   return pageBuffer.filter((line) => typeof line === "string").join(`
 `);
 }
-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/create-prompt.js
 var import_mute_stream = __toESM(require_lib(), 1);
 import * as readline2 from "readline";
 import { AsyncResource as AsyncResource3 } from "async_hooks";
@@ -2385,7 +2385,7 @@ var {
   unload
 } = signalExitWrap(processOk(process3) ? new SignalExit(process3) : new SignalExitFallback);

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/screen-manager.js
 import { stripVTControlCharacters } from "util";

 // ../../node_modules/.bun/@inquirer+ansi@2.0.3/node_modules/@inquirer/ansi/dist/index.js
@@ -2404,7 +2404,7 @@ var cursorTo = (x, y) => {
 var eraseLine = ESC + "2K";
 var eraseLines = (lines) => lines > 0 ? (eraseLine + cursorUp(1)).repeat(lines - 1) + eraseLine + cursorLeft : "";

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/screen-manager.js
 var height = (content) => content.split(`
 `).length;
 var lastLine = (content) => content.split(`
@@ -2469,7 +2469,7 @@ class ScreenManager {
   }
 }

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/promise-polyfill.js
 class PromisePolyfill extends Promise {
   static withResolver() {
     let resolve;
@@ -2482,7 +2482,7 @@ class PromisePolyfill extends Promise {
   }
 }

-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/create-prompt.js
 function getCallSites() {
   const _prepareStackTrace = Error.prepareStackTrace;
   let result = [];
@@ -2568,7 +2568,7 @@ function createPrompt(view) {
   };
   return prompt;
 }
-// ../../node_modules/.bun/@inquirer+core@11.1.1+
+// ../../node_modules/.bun/@inquirer+core@11.1.1+b219b5910764fa5c/node_modules/@inquirer/core/dist/lib/Separator.js
 import { styleText as styleText2 } from "util";
 class Separator {
   separator = styleText2("dim", Array.from({ length: 15 }).join(dist_default2.line));
@@ -2582,7 +2582,7 @@ class Separator {
     return Boolean(choice && typeof choice === "object" && "type" in choice && choice.type === "separator");
   }
 }
-// ../../node_modules/.bun/@inquirer+select@5.0.4+
+// ../../node_modules/.bun/@inquirer+select@5.0.4+b219b5910764fa5c/node_modules/@inquirer/select/dist/index.js
 import { styleText as styleText3 } from "util";
 var selectTheme = {
   icon: { cursor: dist_default2.pointer },
@@ -3144,6 +3144,86 @@ async function checkOllamaConnection() {
     };
   }
 }
+var MIN_OLLAMA_VERSION = "0.14.2";
+function parseVersion(version) {
+  const match = version.match(/(\d+)\.(\d+)\.(\d+)/);
+  if (!match || !match[1] || !match[2] || !match[3])
+    return null;
+  return {
+    major: parseInt(match[1], 10),
+    minor: parseInt(match[2], 10),
+    patch: parseInt(match[3], 10)
+  };
+}
+function compareVersions(a, b) {
+  const parsedA = parseVersion(a);
+  const parsedB = parseVersion(b);
+  if (!parsedA || !parsedB)
+    return 0;
+  if (parsedA.major !== parsedB.major)
+    return parsedA.major - parsedB.major;
+  if (parsedA.minor !== parsedB.minor)
+    return parsedA.minor - parsedB.minor;
+  return parsedA.patch - parsedB.patch;
+}
+async function checkOllamaVersion() {
+  const ollamaUrl = getOllamaUrl();
+  try {
+    const response = await fetch(`${ollamaUrl}/api/version`, {
+      signal: AbortSignal.timeout(5000)
+    });
+    if (!response.ok) {
+      return {
+        name: "Ollama Version",
+        status: "warning",
+        message: "Could not determine version",
+        hint: "Ollama may not be running. Try: loclaude docker-up"
+      };
+    }
+    const data = await response.json();
+    const version = data.version;
+    if (!version) {
+      return {
+        name: "Ollama Version",
+        status: "warning",
+        message: "Unknown version",
+        hint: "Could not parse version from Ollama API"
+      };
+    }
+    const comparison = compareVersions(version, MIN_OLLAMA_VERSION);
+    if (comparison > 0) {
+      return {
+        name: "Ollama Version",
+        status: "ok",
+        message: "Compatible",
+        version
+      };
+    } else if (comparison === 0) {
+      return {
+        name: "Ollama Version",
+        status: "ok",
+        message: "Compatible",
+        version,
+        hint: `Version ${version} is the minimum. Consider upgrading for best compatibility.`
+      };
+    } else {
+      return {
+        name: "Ollama Version",
+        status: "error",
+        message: `Version too old (requires > ${MIN_OLLAMA_VERSION})`,
+        version,
+        hint: `Upgrade Ollama to a version greater than ${MIN_OLLAMA_VERSION}`
+      };
+    }
+  } catch (error3) {
+    return {
+      name: "Ollama Version",
+      status: "warning",
+      message: "Could not check version",
+      hint: `Cannot connect to ${ollamaUrl}. Start Ollama: loclaude docker-up`
+    };
+  }
+}
 function formatCheck(check) {
   let line = statusLine(check.status, check.name, check.message, check.version);
   if (check.hint) {
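
The version gate added above is a field-by-field numeric comparison: negative means older than the minimum, zero means exactly the minimum (accepted, with an upgrade hint), positive means newer. A condensed TypeScript restatement for illustration only; the shipped bundle uses the longer if-chains shown in the hunk:

```ts
// Condensed restatement of the comparison semantics above (illustrative).
type Semver = { major: number; minor: number; patch: number };

function parseSemver(version: string): Semver | null {
  const m = version.match(/(\d+)\.(\d+)\.(\d+)/);
  return m ? { major: +m[1], minor: +m[2], patch: +m[3] } : null;
}

function compareSemver(a: string, b: string): number {
  const pa = parseSemver(a);
  const pb = parseSemver(b);
  if (!pa || !pb) return 0; // unparseable versions compare as equal
  return pa.major - pb.major || pa.minor - pb.minor || pa.patch - pb.patch;
}

console.log(compareSemver("0.15.0", "0.14.2") > 0);   // true  -> "ok"
console.log(compareSemver("0.14.2", "0.14.2") === 0); // true  -> "ok" with upgrade hint
console.log(compareSemver("0.13.9", "0.14.2") < 0);   // true  -> "error", too old
```
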
@@ -3161,7 +3241,8 @@ async function doctor() {
     checkNvidiaSmi(),
     checkNvidiaContainerToolkit(),
     checkClaude(),
-    checkOllamaConnection()
+    checkOllamaConnection(),
+    checkOllamaVersion()
   ]);
   for (const check of checks) {
     console.log(formatCheck(check));
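
Every check gathered into the `checks` array above resolves to the same result shape that `formatCheck` renders. The shape below is inferred from the fields used in these hunks; the interface name is illustrative, since the bundle itself is plain JavaScript:

```ts
// Doctor check result shape, inferred from the bundled code above.
interface CheckResult {
  name: string;                        // e.g. "Ollama Version"
  status: "ok" | "warning" | "error";  // drives the colored status line
  message: string;                     // short summary shown next to the name
  version?: string;                    // included when a version was detected
  hint?: string;                       // remediation tip, printed when present
}
```
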
@@ -3592,7 +3673,7 @@ run = "loclaude config"
 `;
 var README_TEMPLATE = `# Project Name

-> Powered by [loclaude](https://github.com/nicholasgalante1997/
+> Powered by [loclaude](https://github.com/nicholasgalante1997/loclaude) - Run Claude Code with local Ollama LLMs

 ## Prerequisites

@@ -3737,7 +3818,7 @@ Project-specific instructions for Claude Code.

 ## Project Overview

-This project uses [loclaude](https://github.com/nicholasgalante1997/
+This project uses [loclaude](https://github.com/nicholasgalante1997/loclaude) to run Claude Code with local Ollama LLMs.

 ## Quick Reference

@@ -4264,5 +4345,5 @@ export {
   cli
 };

-//# debugId=
+//# debugId=8AC1271036F3EB4864756E2164756E21
 //# sourceMappingURL=index.bun.js.map