@iamharshil/aix-cli 2.0.3 → 3.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +223 -158
- package/dist/bin/aix.js +8 -7
- package/dist/bin/aix.js.map +4 -4
- package/dist/commands/init.d.ts.map +1 -1
- package/dist/commands/run.d.ts +2 -1
- package/dist/commands/run.d.ts.map +1 -1
- package/dist/commands/status.d.ts.map +1 -1
- package/dist/services/config.d.ts +4 -1
- package/dist/services/config.d.ts.map +1 -1
- package/dist/services/ollama.d.ts +11 -0
- package/dist/services/ollama.d.ts.map +1 -0
- package/dist/src/index.js +8 -7
- package/dist/src/index.js.map +4 -4
- package/dist/types/index.d.ts +19 -0
- package/dist/types/index.d.ts.map +1 -1
- package/package.json +11 -6
package/README.md
CHANGED
@@ -1,74 +1,82 @@
+<div align="center">
+
 # AIX CLI
 
-
+**Run Claude Code & OpenCode with local AI models. No API keys. No cloud. Complete privacy.**
 
-
+[](https://www.npmjs.com/package/@iamharshil/aix-cli)
+[](https://www.npmjs.com/package/@iamharshil/aix-cli)
+[](LICENSE)
+[](https://github.com/iamharshil/aix-cli/actions)
+[](https://nodejs.org/)
 
-[
-[](https://opensource.org/licenses/MIT)
-[](https://nodejs.org/)
-[](https://github.com/iamharshil/aix-cli/releases)
-[](https://github.com/iamharshil/aix-cli/actions)
-[](https://www.npmjs.com/package/@iamharshil/aix-cli)
+[Getting Started](#getting-started) · [Documentation](#usage) · [Contributing](CONTRIBUTING.md) · [Changelog](CHANGELOG.md)
 
 </div>
 
 ---
 
-##
+## What is AIX?
 
-
+AIX CLI is a bridge between local model servers and AI coding assistants. It connects [LM Studio](https://lmstudio.ai) or [Ollama](https://ollama.com) to [Claude Code](https://docs.anthropic.com/en/docs/claude-code) or [OpenCode](https://opencode.ai) — letting you use **locally-running language models** as the backend for your favorite AI dev tools.
 
-
+No API keys. No cloud calls. No data leaving your machine.
 
-
-
-
-
-
-
-
+```
+┌──────────────────────────────────────────────────┐
+│  $ aix-cli run                                   │
+│                                                  │
+│  ? Select model backend: Ollama                  │
+│  ✔ Connected to Ollama                           │
+│  ✔ Model selected: qwen2.5-coder:14b             │
+│  ✔ Launching Claude Code...                      │
+│                                                  │
+│  Your code stays local. Always.                  │
+└──────────────────────────────────────────────────┘
+```
 
-
+### Why AIX?
 
--
--
--
+- 🔒 **Privacy-first** — All inference runs locally on your hardware. Your code never leaves your machine.
+- 🔑 **No API keys** — No subscriptions, no usage limits, no cloud dependencies.
+- 🚀 **GPU-accelerated** — Take advantage of your local GPU for fast inference.
+- 🔀 **Multi-backend** — Use LM Studio or Ollama as your model server.
+- 🛠️ **Multi-provider** — Switch between Claude Code and OpenCode with a single flag.
+- ⚡ **Zero config** — Just run `aix-cli run` and start coding.
 
-
+---
 
-
-# Install
-npm install -g @iamharshil/aix-cli
+## Getting Started
 
-
-aix-cli doctor
+### Prerequisites
 
-
-
-
-
-
+| Requirement | Description |
+|---|---|
+| [Node.js](https://nodejs.org/) ≥ 18 | JavaScript runtime |
+| [LM Studio](https://lmstudio.ai) **or** [Ollama](https://ollama.com) | Local model server (at least one required) |
+| [Claude Code](https://docs.anthropic.com/en/docs/claude-code) **or** [OpenCode](https://opencode.ai) | AI coding assistant (at least one required) |
 
-###
+### Install
 
 ```bash
 npm install -g @iamharshil/aix-cli
 ```
 
-
+<details>
+<summary>Other package managers</summary>
 
 ```bash
+# Yarn
 yarn global add @iamharshil/aix-cli
-```
-
-### Using pnpm
 
-
+# pnpm
 pnpm add -g @iamharshil/aix-cli
 ```
 
-
+</details>
+
+<details>
+<summary>Build from source</summary>
 
 ```bash
 git clone https://github.com/iamharshil/aix-cli.git
@@ -78,69 +86,96 @@ npm run build
 npm link
 ```
 
-
+</details>
 
-###
-
-Verify your environment is properly configured:
+### Verify
 
 ```bash
 aix-cli doctor
 ```
 
-
+This checks that LM Studio / Ollama, Claude Code / OpenCode, and your environment are properly configured.
 
-
+---
+
+## Usage
+
+### `aix-cli run` — Start a coding session
+
+The primary command. Launches Claude Code or OpenCode backed by a local model.
 
 ```bash
-# Interactive
-aix-cli
+# Interactive — prompts for backend, model, and provider
+aix-cli run
 
-#
-aix-cli
+# Specify backend and model
+aix-cli run -b ollama -m qwen2.5-coder:14b
+aix-cli run -b lmstudio -m llama-3-8b
 
-#
-aix-cli
+# Use OpenCode instead of Claude Code
+aix-cli run --provider opencode
+
+# Pass a prompt directly
+aix-cli run -b ollama -m qwen2.5-coder:14b -- "Refactor auth middleware"
 ```
 
-###
+### `aix-cli init` — Set up backend and model
 
-
+Configure your preferred backend and load/select a model.
 
 ```bash
-# Interactive
-aix-cli
+aix-cli init                                      # Interactive setup
+aix-cli init -b ollama -m qwen2.5-coder:14b       # Ollama with specific model
+aix-cli init -b lmstudio -m llama-3-8b -p claude  # Full config in one command
+```
 
-
-aix-cli run -m llama-3-8b
+### `aix-cli status` — Check what's running
 
-
-aix-cli run --provider opencode
-aix-cli run -p opencode -m llama-3-8b
+Shows status for both LM Studio and Ollama, including available and running models.
 
-
-aix-cli
+```bash
+aix-cli status
 ```
 
-###
+### `aix-cli doctor` — System diagnostics
 
-
+Verifies your environment is ready to go.
 
 ```bash
-aix-cli
+aix-cli doctor
 ```
 
-
+### Command Reference
+
+| Command | Aliases | Description |
+|---|---|---|
+| `run` | `r` | Run Claude Code / OpenCode with a local model |
+| `init` | `i`, `load` | Set up backend, select model, configure provider |
+| `status` | `s`, `stats` | Show LM Studio & Ollama status |
+| `doctor` | `d`, `check` | Run system diagnostics |
+
+### Global Options
+
+| Flag | Description |
+|---|---|
+| `-b, --backend <name>` | Model backend: `lmstudio` or `ollama` |
+| `-m, --model <name>` | Model name or ID to use |
+| `-p, --provider <name>` | Coding tool: `claude` (default) or `opencode` |
+| `-v, --verbose` | Show verbose output |
+| `-h, --help` | Show help |
+| `-V, --version` | Show version |
+
+---
 
-
+## Configuration
 
-AIX
+AIX stores its configuration in the OS-appropriate config directory:
 
-| Platform | Path
-
-| macOS
-| Linux
-| Windows
+| Platform | Path |
+|---|---|
+| macOS | `~/Library/Application Support/aix-cli/` |
+| Linux | `~/.config/aix-cli/` |
+| Windows | `%APPDATA%\aix-cli\` |
 
 ### Config File
 
@@ -148,131 +183,161 @@ AIX CLI stores configuration in your system's app data directory:
 {
   "lmStudioUrl": "http://localhost",
   "lmStudioPort": 1234,
+  "ollamaUrl": "http://localhost",
+  "ollamaPort": 11434,
   "defaultTimeout": 30000,
-  "
-  "defaultProvider": "claude"
+  "defaultBackend": "ollama",
+  "defaultProvider": "claude",
+  "model": "qwen2.5-coder:14b"
 }
 ```
 
 ### Environment Variables
 
-| Variable
-
-| `LM_STUDIO_PORT` | LM Studio server port | `1234`
+| Variable | Description | Default |
+|---|---|---|
+| `LM_STUDIO_PORT` | Override the LM Studio server port | `1234` |
+
+---
 
-##
+## How It Works
 
 ```
-
-│
-
-
-│
-
-│
-
-│
-
-│
-│
-│
-│
-
-
-
-
-
-│
-│
-
-│ │
-└─────────────────────────────────────────────────────┘
+┌───────────────────┐        ┌───────────────────┐
+│     LM Studio     │        │      Ollama       │
+│    (port 1234)    │        │   (port 11434)    │
+└────────┬──────────┘        └────────┬──────────┘
+         │                            │
+      REST API                    REST API
+         │                            │
+         └─────────────┬──────────────┘
+                       │
+             ┌─────────┴─────────┐
+             │      AIX CLI      │
+             │  backend routing  │
+             │  model selection  │
+             │    config mgmt    │
+             └───┬──────────┬────┘
+                 │          │
+         ┌───────┘          └───────┐
+         ▼                          ▼
+┌──────────────┐            ┌──────────────┐
+│  Claude Code │            │   OpenCode   │
+│  --model X   │            │  --model X   │
+└──────────────┘            └──────────────┘
 ```
 
+1. **LM Studio** or **Ollama** runs a local inference server with an OpenAI-compatible API.
+2. **AIX CLI** discovers available models, manages configuration, and orchestrates the connection.
+3. **Claude Code** or **OpenCode** receives the model endpoint and runs as it normally would — except fully local.
+
+---
+
 ## Troubleshooting
 
-
+<details>
+<summary><strong>LM Studio server not running</strong></summary>
 
-
-
-
-
-
-
+1. Open LM Studio
+2. Navigate to the **Server** tab (left sidebar)
+3. Click **Start Server**
+4. Confirm with `aix-cli status`
+
+</details>
+
+<details>
+<summary><strong>Ollama not running</strong></summary>
+
+1. Install Ollama from [ollama.com](https://ollama.com)
+2. Start the server: `ollama serve`
+3. Pull a model: `ollama pull qwen2.5-coder:14b`
+4. Confirm with `aix-cli status`
+
+</details>
+
+<details>
+<summary><strong>No models found</strong></summary>
+
+**LM Studio:** Open LM Studio → **Search** tab → download a model.
+
+**Ollama:** Run `ollama pull <model>` to download a model (e.g., `ollama pull llama3.2`).
+
+Then run `aix-cli init` to select and configure.
+
+</details>
+
+<details>
+<summary><strong>Connection refused</strong></summary>
+
+Check that the correct port is being used:
+- LM Studio defaults to port `1234`
+- Ollama defaults to port `11434`
+
+You can configure custom ports in your AIX config file (path shown by `aix-cli doctor`).
+
+</details>
 
-
+<details>
+<summary><strong>Claude Code / OpenCode not detected</strong></summary>
+
+Install the missing provider globally:
 
 ```bash
-#
-
-
-#
-
+# Claude Code
+npm install -g @anthropic-ai/claude-code
+
+# OpenCode
+npm install -g opencode
 ```
 
-
+Then re-run `aix-cli doctor` to confirm.
 
-
+</details>
+
+---
 
 ## Security & Privacy
 
-
-- ✅ No data sent to external servers
-- ✅ No telemetry or analytics
-- ✅ No API keys required
-- ✅ Your code stays on your machine
+AIX is designed around a simple principle: **your code never leaves your machine.**
 
-
+- ✅ All AI inference runs locally via LM Studio or Ollama
+- ✅ No telemetry, analytics, or tracking of any kind
+- ✅ No outbound network calls (except to `localhost`)
+- ✅ No API keys or accounts required
+- ✅ Fully open-source — audit the code yourself
 
-
+Found a vulnerability? Please report it responsibly via our [Security Policy](SECURITY.md).
 
-
+---
+
+## Contributing
+
+Contributions are welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines on how to get started.
 
 ```bash
-# Clone and setup
 git clone https://github.com/iamharshil/aix-cli.git
 cd aix-cli
 npm install
-
-#
-npm run
-
-# Run tests
-npm test
-
-# Lint
-npm run lint
-
-# Build
-npm run build
+npm run dev     # Run in development mode
+npm test        # Run tests
+npm run lint    # Lint
 ```
 
-
-
-- TypeScript with strict mode
-- ESLint + Prettier configured
-- 2-space indentation
-- Single quotes, semicolons
+---
 
 ## Related Projects
 
-- [LM Studio](https://lmstudio.ai)
-- [
-- [
+- [LM Studio](https://lmstudio.ai) — Run local AI models with a visual interface
+- [Ollama](https://ollama.com) — Run large language models locally
+- [Claude Code](https://docs.anthropic.com/en/docs/claude-code) — Anthropic's AI coding assistant
+- [OpenCode](https://opencode.ai) — Open-source AI coding assistant
 
 ## License
 
-[MIT](LICENSE)
-
-## Support
-
-- 📋 [Issues](https://github.com/iamharshil/aix-cli/issues) - Report bugs
-- 💬 [Discussions](https://github.com/iamharshil/aix-cli/discussions) - Ask questions
+[MIT](LICENSE) © [Harshil](https://github.com/iamharshil)
 
 ---
 
 <div align="center">
-
-Built with ❤️ for privacy-conscious developers
-
+<sub>Built for developers who care about privacy.</sub>
 </div>
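The config keys introduced in 3.0.0 (`ollamaUrl` + `ollamaPort`, alongside the existing `lmStudioUrl` + `lmStudioPort`) are composed into backend base URLs by the bundled `ConfigService` in `dist/bin/aix.js` below, whose `getLMStudioUrl`/`getOllamaUrl` simply return `` `${url}:${port}` ``. A minimal TypeScript sketch of that behavior — the interface and function names here are illustrative, not the package's public API:

```typescript
// Illustrative sketch only: mirrors how the bundled ConfigService builds
// backend base URLs from the config keys documented in the README diff.
interface AixConfig {
  lmStudioUrl: string;
  lmStudioPort: number;
  ollamaUrl: string;
  ollamaPort: number;
}

// Defaults as they appear in both the README's config example and the bundle.
const defaults: AixConfig = {
  lmStudioUrl: "http://localhost",
  lmStudioPort: 1234,
  ollamaUrl: "http://localhost",
  ollamaPort: 11434,
};

// Same shape as getLMStudioUrl()/getOllamaUrl() in the bundle: `${url}:${port}`.
function baseUrl(url: string, port: number): string {
  return `${url}:${port}`;
}

console.log(baseUrl(defaults.ollamaUrl, defaults.ollamaPort)); // http://localhost:11434
```

Overriding `ollamaPort` in the config file would change every Ollama request URL the CLI issues, which is why the troubleshooting section points at custom ports for "connection refused" errors.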
package/dist/bin/aix.js
CHANGED
|
@@ -1,10 +1,11 @@
|
|
|
1
1
|
#!/usr/bin/env node
|
|
2
|
-
var
|
|
3
|
-
Running: claude ${
|
|
4
|
-
`));try{await
|
|
2
|
+
var ge=Object.defineProperty;var pe=(t=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(t,{get:(e,o)=>(typeof require<"u"?require:e)[o]}):t)(function(t){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+t+'" is not supported')});var C=(t,e)=>()=>(t&&(e=t(t=0)),e);var O=(t,e)=>{for(var o in e)ge(t,o,{get:e[o],enumerable:!0})};var Y={};O(Y,{ConfigService:()=>I,configService:()=>c});import fe from"conf";var I,c,M=C(()=>{"use strict";I=class{store;constructor(){this.store=new fe({projectName:"aix",defaults:{lmStudioUrl:"http://localhost",lmStudioPort:1234,ollamaUrl:"http://localhost",ollamaPort:11434,defaultTimeout:3e4,autoStartServer:!1},clearInvalidConfig:!0})}get(e){return this.store.get(e)}set(e,o){this.store.set(e,o)}setModel(e){this.store.set("model",e)}getLastUsedModel(){return this.store.get("model")}setDefaultProvider(e){this.store.set("defaultProvider",e)}getDefaultProvider(){return this.store.get("defaultProvider")}setDefaultBackend(e){this.store.set("defaultBackend",e)}getDefaultBackend(){return this.store.get("defaultBackend")}getLMStudioUrl(){let e=this.store.get("lmStudioUrl"),o=this.store.get("lmStudioPort");return`${e}:${o}`}getOllamaUrl(){let e=this.store.get("ollamaUrl"),o=this.store.get("ollamaPort");return`${e}:${o}`}reset(){this.store.clear()}},c=new I});var te={};O(te,{LMStudioService:()=>z,lmStudioService:()=>h});import{execa as U}from"execa";import Z from"ora";import ee from"chalk";var oe,z,h,L=C(()=>{"use strict";M();oe=[1234,1235,1236,1237],z=class{baseUrl;constructor(){this.baseUrl=c.getLMStudioUrl()}getApiUrl(e){return`${this.baseUrl}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){let e=["/api/v1/models","/api/models","/v1/models","/api/ls-model/list"];for(let o of e)try{let r=await 
fetch(this.getApiUrl(o),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!r.ok)continue;let n=await r.json(),i=[];return Array.isArray(n)?i=n:n.models&&Array.isArray(n.models)?i=n.models:n.data&&Array.isArray(n.data)&&(i=n.data),i.map(l=>{let s=l;return{id:String(s.key||s.id||s.model||""),name:String(s.display_name||s.name||s.id||s.model||""),size:Number(s.size_bytes||s.size||s.file_size||0),quantization:String(s.quantization?typeof s.quantization=="object"?s.quantization.name:s.quantization:"")}}).filter(l=>l.id&&l.name)}catch{continue}return[]}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("lmStudioPort"),models:[]};try{let o=await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!o.ok)return{running:!1,port:c.get("lmStudioPort"),models:[]};let r=await o.json();return{running:!0,port:c.get("lmStudioPort"),models:r.models??[],activeModel:r.active_model}}catch{return{running:!1,port:c.get("lmStudioPort"),models:[]}}}async loadModel(e,o){let r=o??Z({text:`Loading model: ${ee.cyan(e)}`,color:"cyan"}).start();try{let n=await fetch(this.getApiUrl("/api/model/load"),{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({model:e}),signal:AbortSignal.timeout(3e5)});if(!n.ok)throw new Error(`Failed to load model: ${n.statusText}`);return r.succeed(`Model ${ee.green(e)} loaded successfully`),c.setModel(e),{loadSpinner:r}}catch(n){throw r.fail(`Failed to load model: ${n instanceof Error?n.message:"Unknown error"}`),n}}async startServer(e){let o=e??Z({text:"Starting LM Studio server...",color:"cyan"}).start();try{let r=process.platform==="darwin",n=process.platform==="linux",i=process.platform==="win32",l;if(r){let s=["/Applications/LM Studio.app",`${process.env.HOME}/Applications/LM Studio.app`];for(let d of s)try{let{existsSync:u}=await import("fs");if(u(d)){l=`open "${d}" --args --server`;break}}catch{}if(l?.startsWith("open")){await 
U("open",[s.find(d=>{try{let{existsSync:u}=pe("fs");return u(d)}catch{return!1}})||"/Applications/LM Studio.app","--args","--server"],{detached:!0,stdio:"ignore"}),o.succeed("LM Studio server started"),await this.waitForServer(6e4);return}}else n?l=await this.findLinuxBinary():i&&(l=await this.findWindowsExecutable());if(!l)throw o.fail("LM Studio not found. Please install it from https://lmstudio.ai"),new Error("LM Studio not installed");await U(l,["--server"],{detached:!0,stdio:"ignore",env:{...process.env,LM_STUDIO_SERVER_PORT:String(c.get("lmStudioPort"))}}),o.succeed("LM Studio server started"),await this.waitForServer(6e4)}catch(r){throw o.fail(`Failed to start LM Studio: ${r instanceof Error?r.message:"Unknown error"}`),r}}async findLinuxBinary(){let e=["/usr/bin/lm-studio","/usr/local/bin/lm-studio",`${process.env.HOME}/.local/bin/lm-studio`];for(let o of e)try{return await U("test",["-x",o]),o}catch{continue}}async findWindowsExecutable(){let e=process.env.LOCALAPPDATA,o=process.env.PROGRAMFILES,r=[e?`${e}\\Programs\\LM Studio\\lm-studio.exe`:"",o?`${o}\\LM Studio\\lm-studio.exe`:""].filter(Boolean);for(let n of r)try{return await U("cmd",["/c","if exist",`"${n}"`,"echo","yes"]),n}catch{continue}}async waitForServer(e=6e4){let o=Date.now();for(;Date.now()-o<e;){if(await this.checkStatus())return!0;await this.sleep(2e3)}return!1}sleep(e){return new Promise(o=>setTimeout(o,e))}async findAvailablePort(){for(let e of oe)try{if((await fetch(`http://localhost:${e}/api/status`,{method:"GET",signal:AbortSignal.timeout(1e3)})).ok)return e}catch{return c.set("lmStudioPort",e),e}return oe[0]??1234}},h=new z});var ne={};O(ne,{OllamaService:()=>j,ollamaService:()=>S});var j,S,A=C(()=>{"use strict";M();j=class{getBaseUrl(){return c.getOllamaUrl()}getApiUrl(e){return`${this.getBaseUrl()}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/tags"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){try{let e=await 
fetch(this.getApiUrl("/api/tags"),{method:"GET",signal:AbortSignal.timeout(1e4)});return e.ok?((await e.json()).models??[]).map(n=>{let i=n.details??{};return{id:String(n.name??n.model??""),name:String(n.name??n.model??""),size:Number(n.size??0),quantization:String(i.quantization_level??""),family:String(i.family??""),parameterSize:String(i.parameter_size??"")}}).filter(n=>n.id&&n.name):[]}catch{return[]}}async getRunningModels(){try{let e=await fetch(this.getApiUrl("/api/ps"),{method:"GET",signal:AbortSignal.timeout(5e3)});return e.ok?((await e.json()).models??[]).map(n=>String(n.name??n.model??"")).filter(Boolean):[]}catch{return[]}}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("ollamaPort"),models:[],runningModels:[]};let[o,r]=await Promise.all([this.getAvailableModels(),this.getRunningModels()]);return{running:!0,port:c.get("ollamaPort"),models:o,runningModels:r}}},S=new j});var se={};O(se,{ClaudeService:()=>q,claudeService:()=>R});import{execa as K}from"execa";import Me from"chalk";var q,R,H=C(()=>{"use strict";q=class{async isClaudeCodeInstalled(){try{return await K("claude",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:r=[],verbose:n=!1}=e,i=this.extractProvider(o),l=this.extractModelName(o);if(!i||!l)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=`${i}/${l}`,d=["--model",s,...r];n&&console.log(Me.dim(`
|
|
3
|
+
Running: claude ${d.join(" ")}
|
|
4
|
+
`));try{await K("claude",d,{stdio:"inherit",env:{...process.env,ANTHROPIC_MODEL:s}})}catch(u){if(u instanceof Error&&"exitCode"in u){let v=u.exitCode;process.exit(v??1)}throw u}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await K("claude",["--version"])).stdout}catch{return}}},R=new q});var ae={};O(ae,{OpenCodeService:()=>G,openCodeService:()=>E});import{execa as V}from"execa";import $e from"chalk";var G,E,Q=C(()=>{"use strict";G=class{async isOpenCodeInstalled(){try{return await V("opencode",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:r=[],verbose:n=!1}=e,i=this.extractProvider(o),l=this.extractModelName(o);if(!i||!l)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=["--model",o,...r];n&&console.log($e.dim(`
|
|
5
5
|
Running: opencode ${s.join(" ")}
|
|
6
|
-
`));try{await
|
|
7
|
-
Model ready: ${
|
|
8
|
-
|
|
9
|
-
|
|
6
|
+
`));try{await V("opencode",s,{stdio:"inherit",env:{...process.env,OPENCODE_MODEL_NAME:l,OPENCODE_MODEL_PROVIDER:i}})}catch(d){if(d instanceof Error&&"exitCode"in d){let u=d.exitCode;process.exit(u??1)}throw d}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await V("opencode",["--version"])).stdout}catch{return}}},E=new G});import{Command as Ce}from"commander";import g from"chalk";L();A();M();import F from"ora";import p from"chalk";import _ from"inquirer";import ie from"inquirer";async function $(t,e){let o=t.map(i=>({name:`${i.name} (${i.id})`,value:i,short:i.name})),r=e?o.findIndex(i=>i.value.id===e):0;return(await ie.prompt([{type:"list",name:"model",message:"Select a model to load:",choices:o,default:Math.max(0,r),pageSize:Math.min(t.length,15)}])).model}async function T(t,e=!0){return(await ie.prompt([{type:"confirm",name:"confirm",message:t,default:e}])).confirm}import re from"chalk";function b(t){if(t===0)return"0 B";let e=1024,o=["B","KB","MB","GB","TB"],r=Math.floor(Math.log(t)/Math.log(e));return`${parseFloat((t/Math.pow(e,r)).toFixed(2))} ${o[r]}`}function y(t){console.log(re.green("\u2713")+" "+t)}function ve(t){console.error(re.red("\u2717")+" "+t)}function f(t,e=1){ve(t),process.exit(e)}async function he(){let t=c.getDefaultBackend(),{backendSelection:e}=await _.prompt([{type:"list",name:"backendSelection",message:"Select model backend:",default:t??"lmstudio",choices:[{name:"\u{1F5A5}\uFE0F LM Studio",value:"lmstudio"},{name:"\u{1F999} Ollama",value:"ollama"}]}]),{saveDefault:o}=await _.prompt([{type:"confirm",name:"saveDefault",message:"Save as default backend?",default:!1}]);return o&&(c.setDefaultBackend(e),y(`Default backend set to ${p.cyan(e)}`)),e}async function Se(){let t=c.getDefaultProvider(),{providerSelection:e}=await _.prompt([{type:"list",name:"providerSelection",message:"Select coding 
tool:",default:t??"claude",choices:[{name:"Claude Code",value:"claude"},{name:"OpenCode",value:"opencode"}]}]),{saveDefault:o}=await _.prompt([{type:"confirm",name:"saveDefault",message:"Save as default coding tool?",default:!1}]);return o&&(c.setDefaultProvider(e),y(`Default coding tool set to ${p.cyan(e)}`)),e}async function we(t,e){let o=F({text:"Checking LM Studio status...",color:"cyan"}).start(),r=await h.checkStatus();r||(o.info("LM Studio server not running"),o.stop(),await T("Would you like to start the LM Studio server?")||f("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await h.startServer(),r=!0),o.succeed("Connected to LM Studio");let n=F({text:"Fetching available models...",color:"cyan"}).start(),i=await h.getAvailableModels();i.length===0&&(n.fail("No models found. Download some models in LM Studio first."),f("No models available")),n.succeed(`Found ${p.bold(i.length)} model${i.length===1?"":"s"}`),console.log(),console.log(p.bold("Available Models:")),console.log(p.dim("\u2500".repeat(process.stdout.columns||80))),i.forEach((m,x)=>{let w=b(m.size),N=m.loaded?p.green(" [LOADED]"):"";console.log(` ${p.dim(String(x+1).padStart(2))}. ${m.name} ${p.dim(`(${w})`)}${N}`)}),console.log();let l=c.getLastUsedModel(),s=t.model,d=s?i.find(m=>m.id===s||m.name.includes(s)):await $(i,l);d||f("No model selected"),await h.loadModel(d.id,o);let u=d.id.replace("/","--"),v=e==="opencode"?"OpenCode":"Claude Code";y(p.bold(`
|
|
7
|
+
Model ready: ${d.name}`)),console.log(),console.log(`Run ${v} with this model:`),console.log(` ${p.cyan((e==="opencode"?"opencode":"claude")+" --model lmstudio/"+u)}`),console.log(),console.log(`Or use ${p.cyan("aix-cli run")} to start an interactive session`)}async function ye(t,e){let o=F({text:"Checking Ollama status...",color:"cyan"}).start();await S.checkStatus()||(o.fail("Ollama is not running"),f("Ollama must be running. Start it with: ollama serve")),o.succeed("Connected to Ollama");let n=F({text:"Fetching available models...",color:"cyan"}).start(),i=await S.getAvailableModels();i.length===0&&(n.fail("No models found. Pull a model first: ollama pull <model>"),f("No models available")),n.succeed(`Found ${p.bold(i.length)} model${i.length===1?"":"s"}`);let l=await S.getRunningModels();console.log(),console.log(p.bold("Available Models:")),console.log(p.dim("\u2500".repeat(process.stdout.columns||80))),i.forEach((m,x)=>{let w=b(m.size),ue=l.includes(m.id)?p.green(" [RUNNING]"):"",me=m.parameterSize?p.dim(` ${m.parameterSize}`):"";console.log(` ${p.dim(String(x+1).padStart(2))}. ${m.name}${me} ${p.dim(`(${w})`)}${ue}`)}),console.log();let s=c.getLastUsedModel(),d=t.model,u=d?i.find(m=>m.id===d||m.name.includes(d)):await $(i,s);u||f("No model selected"),c.setModel(u.id);let v=e==="opencode"?"OpenCode":"Claude Code";y(p.bold(`
|
|
8
|
+
Model selected: ${u.name}`)),console.log(),console.log(`Run ${v} with this model:`),console.log(` ${p.cyan((e==="opencode"?"opencode":"claude")+" --model ollama/"+u.id)}`),console.log(),console.log(`Or use ${p.cyan("aix-cli run")} to start an interactive session`)}async function W(t={}){let e=t.backend??await he(),o=t.provider??await Se();e==="ollama"?await ye(t,o):await we(t,o)}L();A();H();Q();M();import P from"ora";import le from"chalk";import de from"inquirer";async function be(){let t=c.getDefaultBackend();if(t)return t;let{backendSelection:e}=await de.prompt([{type:"list",name:"backendSelection",message:"Select model backend:",choices:[{name:"\u{1F5A5}\uFE0F LM Studio",value:"lmstudio"},{name:"\u{1F999} Ollama",value:"ollama"}]}]);return e}async function Pe(){let t=await R.isClaudeCodeInstalled(),e=await E.isOpenCodeInstalled(),o=[];if(t&&o.push({name:"Claude Code",value:"claude"}),e&&o.push({name:"OpenCode",value:"opencode"}),o.length===0&&f("Neither Claude Code nor OpenCode is installed."),o.length===1)return o[0].value;let{providerSelection:r}=await de.prompt([{type:"list",name:"providerSelection",message:"Select coding tool:",choices:o}]);return r}function B(t){return t==="opencode"?"OpenCode":"Claude Code"}async function ke(t,e){let o=P({text:"Checking LM Studio status...",color:"cyan"}).start(),r=await h.checkStatus();r||(o.info("LM Studio server not running"),o.stop(),await T("Would you like to start the LM Studio server?")||f("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await h.startServer(),r=!0),o.succeed("Connected to LM Studio");let n=P({text:"Fetching available models...",color:"cyan"}).start(),i=await h.getAvailableModels();i.length===0&&(n.fail("No models found. Download some models in LM Studio first."),f("No models available")),n.stop();let l;if(t.model){let v=i.find(m=>m.id===t.model||m.name.toLowerCase().includes(t.model.toLowerCase()));v||f(`Model "${t.model}" not found. Available models: ${i.map(m=>m.name).join(", ")}`),l=v.id}else{let v=c.getLastUsedModel();l=(await $(i,v)).id}let s=P({text:`Loading model: ${le.cyan(l)}`,color:"cyan"}).start();await h.loadModel(l,s);let u=`lmstudio/${l.replace("/","--")}`;await ce(e,u,t)}async function xe(t,e){let o=P({text:"Checking Ollama status...",color:"cyan"}).start();await S.checkStatus()||(o.fail("Ollama is not running"),f("Ollama must be running. Start it with: ollama serve")),o.succeed("Connected to Ollama");let n=P({text:"Fetching available models...",color:"cyan"}).start(),i=await S.getAvailableModels();i.length===0&&(n.fail("No models found. Pull a model first: ollama pull <model>"),f("No models available")),n.stop();let l;if(t.model){let d=i.find(u=>u.id===t.model||u.name.toLowerCase().includes(t.model.toLowerCase()));d||f(`Model "${t.model}" not found. Available models: ${i.map(u=>u.name).join(", ")}`),l=d.id}else{let d=c.getLastUsedModel();l=(await $(i,d)).id}c.setModel(l);let s=`ollama/${l}`;await ce(e,s,t)}async function ce(t,e,o){let r=B(t);y(le.green(`
Starting ${r} with model: ${e}
`));try{t==="opencode"?await E.run({model:e,args:o.args??[],verbose:o.verbose}):await R.run({model:e,args:o.args??[],verbose:o.verbose})}catch(n){f(`Failed to run ${r}: ${n instanceof Error?n.message:"Unknown error"}`)}}async function J(t={}){let e;if(t.provider)e=t.provider;else{let i=c.getDefaultProvider();i?e=i:e=await Pe()}let o=P({text:`Checking ${B(e)} installation...`,color:"cyan"}).start();(e==="opencode"?await E.isOpenCodeInstalled():await R.isClaudeCodeInstalled())||(o.fail(`${B(e)} is not installed.`),f(`Please install ${B(e)} first.`)),o.succeed(`${B(e)} is installed`),(t.backend??await be())==="ollama"?await xe(t,e):await ke(t,e)}L();A();import a from"chalk";async function X(){let[t,e]=await Promise.all([h.getStatus(),S.getStatus()]);console.log(),console.log(a.bold("LM Studio")),console.log(a.dim("\u2500".repeat(50))),console.log(` ${t.running?a.green("\u25CF"):a.red("\u25CB")} Server: ${t.running?a.green("Running"):a.red("Stopped")}`),console.log(` ${a.dim("\u25B8")} Port: ${a.cyan(String(t.port))}`),console.log(` ${a.dim("\u25B8")} URL: ${a.cyan(`http://localhost:${t.port}`)}`),t.activeModel&&console.log(` ${a.dim("\u25B8")} Active Model: ${a.green(t.activeModel)}`),t.running&&t.models.length>0?(console.log(),console.log(a.bold(" Models")),t.models.forEach((o,r)=>{let n=b(o.size),i=o.id===t.activeModel?` ${a.green("[LOADED]")}`:"";console.log(` ${a.dim(String(r+1)+".")} ${o.name}${i}`),console.log(` ${a.dim("ID:")} ${o.id}`),console.log(` ${a.dim("Size:")} ${n}`),o.quantization&&console.log(` ${a.dim("Quantization:")} ${o.quantization}`)})):t.running&&console.log(` ${a.dim("No models available")}`),console.log(),console.log(a.bold("Ollama")),console.log(a.dim("\u2500".repeat(50))),console.log(` ${e.running?a.green("\u25CF"):a.red("\u25CB")} Server: ${e.running?a.green("Running"):a.red("Stopped")}`),console.log(` ${a.dim("\u25B8")} Port: ${a.cyan(String(e.port))}`),console.log(` ${a.dim("\u25B8")} URL: ${a.cyan(`http://localhost:${e.port}`)}`),e.running&&e.runningModels.length>0&&console.log(` ${a.dim("\u25B8")} Running: ${a.green(e.runningModels.join(", "))}`),e.running&&e.models.length>0?(console.log(),console.log(a.bold(" Models")),e.models.forEach((o,r)=>{let n=b(o.size),l=e.runningModels.includes(o.id)?` ${a.green("[RUNNING]")}`:"",s=o.parameterSize?` ${a.dim(o.parameterSize)}`:"";console.log(` ${a.dim(String(r+1)+".")} ${o.name}${s}${l}`),console.log(` ${a.dim("Size:")} ${n}`),o.family&&console.log(` ${a.dim("Family:")} ${o.family}`),o.quantization&&console.log(` ${a.dim("Quantization:")} ${o.quantization}`)})):e.running&&console.log(` ${a.dim("No models available")}`),console.log()}var k=new Ce;k.name("aix-cli").description("Run Claude Code or OpenCode with local AI models from LM Studio or Ollama").version("3.0.0").showHelpAfterError();function D(t=0){console.log(),console.log(g.dim(t===0?"\u{1F44B} Goodbye!":"\u274C Cancelled.")),process.exit(t)}process.on("SIGINT",()=>D(0));process.on("SIGTERM",()=>D(0));process.on("uncaughtException",t=>{t.message?.includes("ExitPromptError")||t.message?.includes("User force closed")||t.message?.includes("prompt")?D(0):(console.error(g.red("Error:"),t.message),process.exit(1))});process.on("unhandledRejection",t=>{let e=String(t);(e.includes("ExitPromptError")||e.includes("User force closed")||e.includes("prompt"))&&D(0)});k.command("init",{isDefault:!1}).aliases(["i","load"]).description("Select a backend, load a model, and configure your provider").option("-m, --model <name>","Model name or ID to load","").option("-p, --provider <provider>","Coding tool to use (claude or opencode)","").option("-b, --backend <backend>","Model backend to use (lmstudio or ollama)","").action(W);k.command("run",{isDefault:!1}).aliases(["r"]).description("Run Claude Code or OpenCode with a model from LM Studio or Ollama").option("-m, --model <name>","Model name or ID to use","").option("-p, --provider <provider>","Coding tool to use (claude or opencode)","").option("-b, --backend <backend>","Model backend to use (lmstudio or ollama)","").option("-v, --verbose","Show verbose output").argument("[args...]","Additional arguments for the provider").action(async(t,e)=>{await J({...e,args:t})});k.command("status",{isDefault:!1}).aliases(["s","stats"]).description("Show LM Studio and Ollama status and available models").action(X);k.command("doctor",{isDefault:!1}).aliases(["d","check"]).description("Check system requirements and configuration").action(async()=>{let{lmStudioService:t}=await Promise.resolve().then(()=>(L(),te)),{ollamaService:e}=await Promise.resolve().then(()=>(A(),ne)),{claudeService:o}=await Promise.resolve().then(()=>(H(),se)),{openCodeService:r}=await Promise.resolve().then(()=>(Q(),ae)),{configService:n}=await Promise.resolve().then(()=>(M(),Y));console.log(g.bold.cyan("\u{1F527} AIX CLI System Check")),console.log(g.dim("\u2500".repeat(40)));let[i,l,s,d]=await Promise.all([t.checkStatus(),e.checkStatus(),o.isClaudeCodeInstalled(),r.isOpenCodeInstalled()]),u=n.getDefaultProvider(),v=n.getDefaultBackend(),m=n.get("lmStudioPort"),x=n.get("ollamaPort");console.log(),console.log(g.bold("Backends")),console.log(` ${i?"\u2705":"\u26A0\uFE0F"} LM Studio: ${i?g.green("Running"):g.yellow("Not running")} ${g.dim(`(port ${m})`)}`),console.log(` ${l?"\u2705":"\u26A0\uFE0F"} Ollama: ${l?g.green("Running"):g.yellow("Not running")} ${g.dim(`(port ${x})`)}`),console.log(),console.log(g.bold("Coding Tools")),console.log(` ${s?"\u2705":"\u274C"} Claude Code: ${s?g.green("Installed"):g.red("Not installed")}`),console.log(` ${d?"\u2705":"\u274C"} OpenCode: ${d?g.green("Installed"):g.red("Not installed")}`),console.log(),console.log(g.bold("Defaults")),console.log(` \u{1F4CC} Backend: ${g.cyan(v??"not set")}`),console.log(` \u{1F4CC} Coding tool: ${g.cyan(u??"not set")}`);let w=[];s||w.push(` \u2192 ${g.cyan("npm install -g @anthropic-ai/claude-code")}`),d||w.push(` \u2192 ${g.cyan("npm install -g opencode")}`),!i&&!l&&w.push(` \u2192 Start LM Studio or run ${g.cyan("ollama serve")}`),w.length>0&&(console.log(),console.log(g.bold("\u{1F4CB} Next Steps:")),w.forEach(N=>console.log(N))),console.log()});k.parse();
//# sourceMappingURL=aix.js.map