@iamharshil/aix-cli 2.0.2 → 2.0.4
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +199 -156
- package/dist/bin/aix.js +6 -6
- package/dist/bin/aix.js.map +3 -3
- package/dist/commands/run.d.ts.map +1 -1
- package/dist/services/config.d.ts +1 -1
- package/dist/services/config.d.ts.map +1 -1
- package/dist/src/index.js +6 -6
- package/dist/src/index.js.map +3 -3
- package/package.json +10 -6
package/README.md
CHANGED
@@ -1,74 +1,78 @@
+ <div align="center">
+
# AIX CLI

-
+ **Run Claude Code & OpenCode with local AI models. No API keys. No cloud. Complete privacy.**

-
+ [](https://www.npmjs.com/package/@iamharshil/aix-cli)
+ [](https://www.npmjs.com/package/@iamharshil/aix-cli)
+ [](LICENSE)
+ [](https://github.com/iamharshil/aix-cli/actions)
+ [](https://nodejs.org/)

- [
- [](https://opensource.org/licenses/MIT)
- [](https://nodejs.org/)
- [](https://github.com/iamharshil/aix-cli/releases)
- [](https://github.com/iamharshil/aix-cli/actions)
- [](https://www.npmjs.com/package/@iamharshil/aix-cli)
+ [Getting Started](#getting-started) · [Documentation](#usage) · [Contributing](CONTRIBUTING.md) · [Changelog](CHANGELOG.md)

</div>

---

- ##
-
- **AIX CLI** enables you to use locally-running AI models from [LM Studio](https://lmstudio.ai) directly with [Claude Code](https://docs.anthropic.com/en/docs/claude-code) or [OpenCode](https://opencode.ai). No API keys, no cloud dependencies, complete privacy.
-
- ### Why AIX?
+ ## What is AIX?

-
- | -------------------- | ---------------------------------------------- |
- | 🔒 **Privacy-First** | All processing happens locally on your machine |
- | 🔑 **No API Keys** | No external services or subscriptions |
- | 🚀 **Fast** | Local inference with your GPU |
- | 🛡️ **Secure** | Your code never leaves your machine |
- | 🔧 **Simple** | Just run `aix-cli run` and start coding |
+ AIX CLI is a bridge between [LM Studio](https://lmstudio.ai) and AI coding assistants like [Claude Code](https://docs.anthropic.com/en/docs/claude-code) and [OpenCode](https://opencode.ai). It lets you use **locally-running language models** as the backend for your favorite AI dev tools — no API keys, no cloud calls, no data leaving your machine.

-
+ ```
+ ┌──────────────────────────────────────────────────┐
+ │ $ aix-cli run │
+ │ │
+ │ ✔ LM Studio running on http://localhost:1234 │
+ │ ✔ Model loaded: qwen2.5-coder-14b │
+ │ ✔ Launching Claude Code... │
+ │ │
+ │ Your code stays local. Always. │
+ └──────────────────────────────────────────────────┘
+ ```

-
- - [LM Studio](https://lmstudio.ai) - Download and run local AI models
- - [Claude Code](https://docs.anthropic.com/en/docs/claude-code) **or** [OpenCode](https://opencode.ai) - AI coding assistant
+ ### Why AIX?

-
+ - 🔒 **Privacy-first** — All inference runs locally on your hardware. Your code never leaves your machine.
+ - 🔑 **No API keys** — No subscriptions, no usage limits, no cloud dependencies.
+ - 🚀 **GPU-accelerated** — Take advantage of your local GPU for fast inference via LM Studio.
+ - 🔀 **Multi-provider** — Switch between Claude Code and OpenCode with a single flag.
+ - ⚡ **Zero config** — Just run `aix-cli run` and start coding.

-
- # Install
- npm install -g @iamharshil/aix-cli
+ ---

-
- aix-cli doctor
+ ## Getting Started

-
- aix-cli run
- ```
+ ### Prerequisites

-
+ | Requirement | Description |
+ |---|---|
+ | [Node.js](https://nodejs.org/) ≥ 18 | JavaScript runtime |
+ | [LM Studio](https://lmstudio.ai) | Run local AI models and expose them via API |
+ | [Claude Code](https://docs.anthropic.com/en/docs/claude-code) **or** [OpenCode](https://opencode.ai) | AI coding assistant (at least one required) |

- ###
+ ### Install

```bash
npm install -g @iamharshil/aix-cli
```

-
+ <details>
+ <summary>Other package managers</summary>

```bash
+ # Yarn
yarn global add @iamharshil/aix-cli
- ```
-
- ### Using pnpm

-
+ # pnpm
pnpm add -g @iamharshil/aix-cli
```

-
+ </details>
+
+ <details>
+ <summary>Build from source</summary>

```bash
git clone https://github.com/iamharshil/aix-cli.git
@@ -78,69 +82,94 @@ npm run build
npm link
```

-
-
- ### System Check
+ </details>

- Verify
+ ### Verify

```bash
aix-cli doctor
```

-
+ This checks that LM Studio, Claude Code / OpenCode, and your environment are properly configured.
+
+ ---
+
+ ## Usage
+
+ ### `aix-cli run` — Start a coding session

-
+ The primary command. Launches Claude Code or OpenCode backed by a local LM Studio model.

```bash
- # Interactive
- aix-cli
+ # Interactive — picks a running model automatically
+ aix-cli run

- #
- aix-cli
+ # Specify a model
+ aix-cli run -m qwen2.5-coder-14b
+
+ # Use OpenCode instead of Claude Code
+ aix-cli run --provider opencode

- #
- aix-cli
+ # Pass a prompt directly
+ aix-cli run -m qwen2.5-coder-14b -- "Refactor auth middleware to use JWTs"
```

- ###
+ ### `aix-cli init` — Load a model

-
+ Loads a model into LM Studio. If no model is specified, you'll get an interactive picker.

```bash
- # Interactive
- aix-cli
+ aix-cli init # Interactive model selection
+ aix-cli init -m llama-3-8b # Load a specific model
+ aix-cli init -m llama-3-8b -p opencode # Set default provider at the same time
+ ```

-
- aix-cli run -m llama-3-8b
+ ### `aix-cli status` — Check what's running

-
- aix-cli run --provider opencode
- aix-cli run -p opencode -m llama-3-8b
+ Shows the current LM Studio server status and which models are loaded.

-
- aix-cli
+ ```bash
+ aix-cli status
```

- ###
+ ### `aix-cli doctor` — System diagnostics

-
+ Verifies your environment is ready to go.

```bash
- aix-cli
+ aix-cli doctor
```

-
+ ### Command Reference
+
+ | Command | Aliases | Description |
+ |---|---|---|
+ | `run` | `r` | Run Claude Code / OpenCode with a local model |
+ | `init` | `i`, `load` | Load a model into LM Studio |
+ | `status` | `s`, `stats` | Show LM Studio server status |
+ | `doctor` | `d`, `check` | Run system diagnostics |

- ###
+ ### Global Options

-
+ | Flag | Description |
+ |---|---|
+ | `-m, --model <name>` | Model name or ID to use |
+ | `-p, --provider <name>` | Provider: `claude` (default) or `opencode` |
+ | `-v, --verbose` | Show verbose output |
+ | `-h, --help` | Show help |
+ | `-V, --version` | Show version |

-
-
-
-
-
+ ---
+
+ ## Configuration
+
+ AIX stores its configuration in the OS-appropriate config directory:
+
+ | Platform | Path |
+ |---|---|
+ | macOS | `~/Library/Application Support/aix-cli/` |
+ | Linux | `~/.config/aix-cli/` |
+ | Windows | `%APPDATA%\aix-cli\` |

### Config File

@@ -149,130 +178,144 @@ AIX CLI stores configuration in your system's app data directory:
"lmStudioUrl": "http://localhost",
"lmStudioPort": 1234,
"defaultTimeout": 30000,
- "model": "lmstudio/
+ "model": "lmstudio/qwen2.5-coder-14b",
"defaultProvider": "claude"
}
```

### Environment Variables

- | Variable
-
- | `LM_STUDIO_PORT` | LM Studio server port | `1234`
+ | Variable | Description | Default |
+ |---|---|---|
+ | `LM_STUDIO_PORT` | Override the LM Studio server port | `1234` |

-
+ ---
+
+ ## How It Works

```
- (previous architecture diagram; most of its lines were not preserved in this rendering)
+ ┌───────────────────┐
+ │ LM Studio │
+ │ (local server) │
+ │ port 1234 │
+ └────────┬──────────┘
+ │
+ REST API
+ │
+ ┌────────┴──────────┐
+ │ AIX CLI │
+ │ model routing │
+ │ config mgmt │
+ └───┬──────────┬────┘
+ │ │
+ ┌────────┘ └────────┐
+ ▼ ▼
+ ┌──────────────┐ ┌──────────────┐
+ │ Claude Code │ │ OpenCode │
+ │ --model X │ │ --model X │
+ └──────────────┘ └──────────────┘
```

+ 1. **LM Studio** runs a local inference server exposing an OpenAI-compatible API.
+ 2. **AIX CLI** discovers available models, manages configuration, and orchestrates the connection.
+ 3. **Claude Code** or **OpenCode** receives the model endpoint and runs as it normally would — except fully local.
+
+ ---
+
## Troubleshooting

-
+ <details>
+ <summary><strong>LM Studio server not running</strong></summary>
+
+ 1. Open LM Studio
+ 2. Navigate to the **Server** tab (left sidebar)
+ 3. Click **Start Server**
+ 4. Confirm with `aix-cli status`
+
+ </details>
+
+ <details>
+ <summary><strong>No models found</strong></summary>
+
+ 1. Open LM Studio → **Search** tab
+ 2. Download a model (e.g., Qwen 2.5 Coder, Llama 3, Mistral)
+ 3. Wait for the download to complete
+ 4. Run `aix-cli init` to load it
+
+ </details>
+
+ <details>
+ <summary><strong>Connection refused on port 1234</strong></summary>
+
+ Check the LM Studio server tab for the actual port it's running on. If it differs from `1234`, update your config:

```bash
- #
- #
- # 3. Click Start Server
- # 4. Run: aix-cli run
+ # The config file path is shown by `aix-cli doctor`
+ # Edit the config file and set "lmStudioPort" to the correct port
```

-
+ </details>
+
+ <details>
+ <summary><strong>Claude Code / OpenCode not detected</strong></summary>
+
+ Install the missing provider globally:

```bash
- #
-
-
- #
-
+ # Claude Code
+ npm install -g @anthropic-ai/claude-code
+
+ # OpenCode
+ npm install -g opencode
```

-
+ Then re-run `aix-cli doctor` to confirm.
+
+ </details>

-
+ ---

## Security & Privacy

-
- - ✅ No data sent to external servers
- - ✅ No telemetry or analytics
- - ✅ No API keys required
- - ✅ Your code stays on your machine
+ AIX is designed around a simple principle: **your code never leaves your machine.**

-
+ - ✅ All AI inference runs locally via LM Studio
+ - ✅ No telemetry, analytics, or tracking of any kind
+ - ✅ No outbound network calls (except to `localhost`)
+ - ✅ No API keys or accounts required
+ - ✅ Fully open-source — audit the code yourself
+
+ Found a vulnerability? Please report it responsibly via our [Security Policy](SECURITY.md).

-
+ ---
+
+ ## Contributing

-
+ Contributions are welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines on how to get started.

```bash
- # Clone and setup
git clone https://github.com/iamharshil/aix-cli.git
cd aix-cli
npm install
-
- #
- npm run
-
- # Run tests
- npm test
-
- # Lint
- npm run lint
-
- # Build
- npm run build
+ npm run dev # Run in development mode
+ npm test # Run tests
+ npm run lint # Lint
```

-
-
- - TypeScript with strict mode
- - ESLint + Prettier configured
- - 2-space indentation
- - Single quotes, semicolons
+ ---

## Related Projects

- - [LM Studio](https://lmstudio.ai)
- - [Claude Code](https://docs.anthropic.com/en/docs/claude-code)
- - [OpenCode](https://opencode.ai)
+ - [LM Studio](https://lmstudio.ai) — Run local AI models with a visual interface
+ - [Claude Code](https://docs.anthropic.com/en/docs/claude-code) — Anthropic's AI coding assistant
+ - [OpenCode](https://opencode.ai) — Open-source AI coding assistant

## License

- [MIT](LICENSE)
-
- ## Support
-
- - 📋 [Issues](https://github.com/iamharshil/aix-cli/issues) - Report bugs
- - 💬 [Discussions](https://github.com/iamharshil/aix-cli/discussions) - Ask questions
+ [MIT](LICENSE) © [Harshil](https://github.com/iamharshil)

---

<div align="center">
-
- Built with ❤️ for privacy-conscious developers
-
+ <sub>Built for developers who care about privacy.</sub>
</div>
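The README's new "How It Works" section describes AIX probing a local LM Studio server before launching a provider. A readable sketch of that probe, reconstructed from the defaults visible in the minified bundle in the next section (`lmStudioUrl: "http://localhost"`, `lmStudioPort: 1234`, a 3-second `/api/status` check); the function names here are approximations, not the package's public API:

```javascript
// Illustrative sketch only. Reconstructed from the ConfigService defaults and
// status probe visible in the minified bundle; names are approximations.
const defaults = { lmStudioUrl: "http://localhost", lmStudioPort: 1234 };

// The bundle builds the base URL as `${lmStudioUrl}:${lmStudioPort}`.
function getLMStudioUrl(config = defaults) {
  return `${config.lmStudioUrl}:${config.lmStudioPort}`;
}

// The bundle probes GET /api/status with a 3-second timeout and treats any
// network error as "server not running". Requires Node.js 18+ (global fetch).
async function checkStatus(config = defaults) {
  try {
    const res = await fetch(`${getLMStudioUrl(config)}/api/status`, {
      method: "GET",
      signal: AbortSignal.timeout(3000),
    });
    return res.ok;
  } catch {
    return false;
  }
}
```

With no server listening, `checkStatus()` resolves to `false` instead of throwing, which is how the CLI can report a stopped server and offer to start it.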
package/dist/bin/aix.js
CHANGED
@@ -1,10 +1,10 @@
#!/usr/bin/env node
- var ne=Object.defineProperty;var ie=(
+
var ne=Object.defineProperty;var ie=(t=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(t,{get:(e,o)=>(typeof require<"u"?require:e)[o]}):t)(function(t){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+t+'" is not supported')});var O=(t,e)=>()=>(t&&(e=t(t=0)),e);var A=(t,e)=>{for(var o in e)ne(t,o,{get:e[o],enumerable:!0})};var W={};A(W,{ConfigService:()=>E,configService:()=>c});import se from"conf";var E,c,M=O(()=>{"use strict";E=class{store;constructor(){this.store=new se({projectName:"aix",defaults:{lmStudioUrl:"http://localhost",lmStudioPort:1234,defaultTimeout:3e4,autoStartServer:!1},clearInvalidConfig:!0})}get(e){return this.store.get(e)}set(e,o){this.store.set(e,o)}setModel(e){this.store.set("model",e)}getLastUsedModel(){return this.store.get("model")}setDefaultProvider(e){this.store.set("defaultProvider",e)}getDefaultProvider(){return this.store.get("defaultProvider")}getLMStudioUrl(){let e=this.store.get("lmStudioUrl"),o=this.store.get("lmStudioPort");return`${e}:${o}`}reset(){this.store.clear()}},c=new E});var Q={};A(Q,{LMStudioService:()=>I,lmStudioService:()=>g});import{execa as k}from"execa";import H from"ora";import V from"chalk";var J,I,g,C=O(()=>{"use strict";M();J=[1234,1235,1236,1237],I=class{baseUrl;constructor(){this.baseUrl=c.getLMStudioUrl()}getApiUrl(e){return`${this.baseUrl}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){let e=["/api/v1/models","/api/models","/v1/models","/api/ls-model/list"];for(let o of e)try{let n=await fetch(this.getApiUrl(o),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!n.ok)continue;let r=await n.json(),i=[];return Array.isArray(r)?i=r:r.models&&Array.isArray(r.models)?i=r.models:r.data&&Array.isArray(r.data)&&(i=r.data),i.map(a=>{let 
s=a;return{id:String(s.key||s.id||s.model||""),name:String(s.display_name||s.name||s.id||s.model||""),size:Number(s.size_bytes||s.size||s.file_size||0),quantization:String(s.quantization?typeof s.quantization=="object"?s.quantization.name:s.quantization:"")}}).filter(a=>a.id&&a.name)}catch{continue}return[]}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("lmStudioPort"),models:[]};try{let o=await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!o.ok)return{running:!1,port:c.get("lmStudioPort"),models:[]};let n=await o.json();return{running:!0,port:c.get("lmStudioPort"),models:n.models??[],activeModel:n.active_model}}catch{return{running:!1,port:c.get("lmStudioPort"),models:[]}}}async loadModel(e,o){let n=o??H({text:`Loading model: ${V.cyan(e)}`,color:"cyan"}).start();try{let r=await fetch(this.getApiUrl("/api/model/load"),{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({model:e}),signal:AbortSignal.timeout(3e5)});if(!r.ok)throw new Error(`Failed to load model: ${r.statusText}`);return n.succeed(`Model ${V.green(e)} loaded successfully`),c.setModel(e),{loadSpinner:n}}catch(r){throw n.fail(`Failed to load model: ${r instanceof Error?r.message:"Unknown error"}`),r}}async startServer(e){let o=e??H({text:"Starting LM Studio server...",color:"cyan"}).start();try{let n=process.platform==="darwin",r=process.platform==="linux",i=process.platform==="win32",a;if(n){let s=["/Applications/LM Studio.app",`${process.env.HOME}/Applications/LM Studio.app`];for(let l of s)try{let{existsSync:p}=await import("fs");if(p(l)){a=`open "${l}" --args --server`;break}}catch{}if(a?.startsWith("open")){await k("open",[s.find(l=>{try{let{existsSync:p}=ie("fs");return p(l)}catch{return!1}})||"/Applications/LM Studio.app","--args","--server"],{detached:!0,stdio:"ignore"}),o.succeed("LM Studio server started"),await this.waitForServer(6e4);return}}else r?a=await this.findLinuxBinary():i&&(a=await 
this.findWindowsExecutable());if(!a)throw o.fail("LM Studio not found. Please install it from https://lmstudio.ai"),new Error("LM Studio not installed");await k(a,["--server"],{detached:!0,stdio:"ignore",env:{...process.env,LM_STUDIO_SERVER_PORT:String(c.get("lmStudioPort"))}}),o.succeed("LM Studio server started"),await this.waitForServer(6e4)}catch(n){throw o.fail(`Failed to start LM Studio: ${n instanceof Error?n.message:"Unknown error"}`),n}}async findLinuxBinary(){let e=["/usr/bin/lm-studio","/usr/local/bin/lm-studio",`${process.env.HOME}/.local/bin/lm-studio`];for(let o of e)try{return await k("test",["-x",o]),o}catch{continue}}async findWindowsExecutable(){let e=process.env.LOCALAPPDATA,o=process.env.PROGRAMFILES,n=[e?`${e}\\Programs\\LM Studio\\lm-studio.exe`:"",o?`${o}\\LM Studio\\lm-studio.exe`:""].filter(Boolean);for(let r of n)try{return await k("cmd",["/c","if exist",`"${r}"`,"echo","yes"]),r}catch{continue}}async waitForServer(e=6e4){let o=Date.now();for(;Date.now()-o<e;){if(await this.checkStatus())return!0;await this.sleep(2e3)}return!1}sleep(e){return new Promise(o=>setTimeout(o,e))}async findAvailablePort(){for(let e of J)try{if((await fetch(`http://localhost:${e}/api/status`,{method:"GET",signal:AbortSignal.timeout(1e3)})).ok)return e}catch{return c.set("lmStudioPort",e),e}return J[0]??1234}},g=new I});var oe={};A(oe,{ClaudeService:()=>U,claudeService:()=>P});import{execa as _}from"execa";import le from"chalk";var U,P,F=O(()=>{"use strict";U=class{async isClaudeCodeInstalled(){try{return await _("claude",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:n=[],verbose:r=!1}=e,i=this.extractProvider(o),a=this.extractModelName(o);if(!i||!a)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=`${i}/${a}`,l=["--model",s,...n];r&&console.log(le.dim(`
Running: claude ${l.join(" ")}
- `));try{await _("claude",l,{stdio:"inherit",env:{...process.env,ANTHROPIC_MODEL:s}})}catch(
+
`));try{await _("claude",l,{stdio:"inherit",env:{...process.env,ANTHROPIC_MODEL:s}})}catch(p){if(p instanceof Error&&"exitCode"in p){let L=p.exitCode;process.exit(L??1)}throw p}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await _("claude",["--version"])).stdout}catch{return}}},P=new U});var te={};A(te,{OpenCodeService:()=>z,openCodeService:()=>$});import{execa as q}from"execa";import de from"chalk";var z,$,B=O(()=>{"use strict";z=class{async isOpenCodeInstalled(){try{return await q("opencode",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:n=[],verbose:r=!1}=e,i=this.extractProvider(o),a=this.extractModelName(o);if(!i||!a)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=["--model",o,...n];r&&console.log(de.dim(`
Running: opencode ${s.join(" ")}
- `));try{await
- Model ready: ${l.name}`)),console.log(),console.log(`Run ${
- Starting ${w} with model: ${
- `));try{e==="opencode"?await
+
`));try{await q("opencode",s,{stdio:"inherit",env:{...process.env,OPENCODE_MODEL_NAME:a,OPENCODE_MODEL_PROVIDER:i}})}catch(l){if(l instanceof Error&&"exitCode"in l){let p=l.exitCode;process.exit(p??1)}throw l}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await q("opencode",["--version"])).stdout}catch{return}}},$=new z});import{Command as me}from"commander";import u from"chalk";C();M();import Z from"ora";import h from"chalk";import ee from"inquirer";import X from"inquirer";async function D(t,e){let o=t.map(i=>({name:`${i.name} (${i.id})`,value:i,short:i.name})),n=e?o.findIndex(i=>i.value.id===e):0;return(await X.prompt([{type:"list",name:"model",message:"Select a model to load:",choices:o,default:Math.max(0,n),pageSize:Math.min(t.length,15)}])).model}async function N(t,e=!0){return(await X.prompt([{type:"confirm",name:"confirm",message:t,default:e}])).confirm}import Y from"chalk";function R(t){if(t===0)return"0 B";let e=1024,o=["B","KB","MB","GB","TB"],n=Math.floor(Math.log(t)/Math.log(e));return`${parseFloat((t/Math.pow(e,n)).toFixed(2))} ${o[n]}`}function x(t){console.log(Y.green("\u2713")+" "+t)}function ae(t){console.error(Y.red("\u2717")+" "+t)}function v(t,e=1){ae(t),process.exit(e)}async function j(t={}){let e;if(t.provider)e=t.provider;else{let f=c.getDefaultProvider(),{providerSelection:w}=await ee.prompt([{type:"list",name:"providerSelection",message:"Select provider:",default:f,choices:[{name:"Claude Code",value:"claude"},{name:"OpenCode",value:"opencode"}]}]);e=w;let{saveDefault:m}=await ee.prompt([{type:"confirm",name:"saveDefault",message:"Save as default provider?",default:!1}]);m&&(c.setDefaultProvider(e),x(`Default provider set to ${h.cyan(e)}`))}let o=Z({text:"Checking LM Studio status...",color:"cyan"}).start(),n=await g.checkStatus();n||(o.info("LM Studio server not running"),o.stop(),await N("Would you like to start the LM Studio 
server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await g.startServer(),n=!0),o.succeed("Connected to LM Studio");let r=Z({text:"Fetching available models...",color:"cyan"}).start(),i=await g.getAvailableModels();i.length===0&&(r.fail("No models found. Download some models in LM Studio first."),v("No models available")),r.succeed(`Found ${h.bold(i.length)} model${i.length===1?"":"s"}`),console.log(),console.log(h.bold("Available Models:")),console.log(h.dim("\u2500".repeat(process.stdout.columns||80))),i.forEach((f,w)=>{let m=R(f.size),S=f.loaded?h.green(" [LOADED]"):"";console.log(` ${h.dim(String(w+1).padStart(2))}. ${f.name} ${h.dim(`(${m})`)}${S}`)}),console.log();let a=c.getLastUsedModel(),s=t.model,l=s?i.find(f=>f.id===s||f.name.includes(s)):await D(i,a);l||v("No model selected"),await g.loadModel(l.id,o);let p=l.id.replace("/","--"),L=e==="opencode"?"OpenCode":"Claude Code";x(h.bold(`
+
Model ready: ${l.name}`)),console.log(),console.log(`Run ${L} with this model:`),console.log(` ${h.cyan((e==="opencode"?"opencode":"claude")+" --model lmstudio/"+p)}`),console.log(),console.log(`Or use ${h.cyan("aix-cli run")} to start an interactive session`)}C();F();B();M();import T from"ora";import re from"chalk";import ce from"inquirer";async function ue(){let t=await P.isClaudeCodeInstalled(),e=await $.isOpenCodeInstalled(),o=[];if(t&&o.push({name:"Claude Code",value:"claude"}),e&&o.push({name:"OpenCode",value:"opencode"}),o.length===0&&v("Neither Claude Code nor OpenCode is installed."),o.length===1)return o[0].value;let{providerSelection:n}=await ce.prompt([{type:"list",name:"providerSelection",message:"Select coding tool:",choices:o}]);return n}async function G(t={}){let e;if(t.provider)e=t.provider;else{let m=c.getDefaultProvider();m?e=m:e=await ue()}let o=T({text:`Checking ${e==="opencode"?"OpenCode":"Claude Code"} installation...`,color:"cyan"}).start();(e==="opencode"?await $.isOpenCodeInstalled():await P.isClaudeCodeInstalled())||(o.fail(`${e==="opencode"?"OpenCode":"Claude Code"} is not installed.`),v(`Please install ${e==="opencode"?"OpenCode":"Claude Code"} first.`)),o.succeed(`${e==="opencode"?"OpenCode":"Claude Code"} is installed`);let r=T({text:"Checking LM Studio status...",color:"cyan"}).start(),i=await g.checkStatus();i||(r.info("LM Studio server not running"),r.stop(),await N("Would you like to start the LM Studio server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await g.startServer(),i=!0),r.succeed("Connected to LM Studio");let a=T({text:"Fetching available models...",color:"cyan"}).start(),s=await g.getAvailableModels();s.length===0&&(a.fail("No models found. Download some models in LM Studio first."),v("No models available")),a.stop();let l;if(t.model){let m=s.find(S=>S.id===t.model||S.name.toLowerCase().includes(t.model.toLowerCase()));m||v(`Model "${t.model}" not found. 
Available models: ${s.map(S=>S.name).join(", ")}`),l=m.id}else{let m=c.getLastUsedModel();l=(await D(s,m)).id}let p=T({text:`Loading model: ${re.cyan(l)}`,color:"cyan"}).start();await g.loadModel(l,p);let f=`lmstudio/${l.replace("/","--")}`,w=e==="opencode"?"OpenCode":"Claude Code";x(re.green(`
+
Starting ${w} with model: ${f}
+
`));try{e==="opencode"?await $.run({model:f,args:t.args??[],verbose:t.verbose}):await P.run({model:f,args:t.args??[],verbose:t.verbose})}catch(m){v(`Failed to run ${w}: ${m instanceof Error?m.message:"Unknown error"}`)}}C();import d from"chalk";async function K(){let t=await g.getStatus();console.log(),console.log(d.bold("LM Studio Status")),console.log(d.dim("\u2500".repeat(50))),console.log(` ${t.running?d.green("\u25CF"):d.red("\u25CB")} Server: ${t.running?d.green("Running"):d.red("Stopped")}`),console.log(` ${d.dim("\u25B8")} Port: ${d.cyan(String(t.port))}`),console.log(` ${d.dim("\u25B8")} URL: ${d.cyan(`http://localhost:${t.port}`)}`),t.activeModel?console.log(` ${d.dim("\u25B8")} Active Model: ${d.green(t.activeModel)}`):console.log(` ${d.dim("\u25B8")} Active Model: ${d.dim("None")}`),console.log(),console.log(d.bold("Models")),console.log(d.dim("\u2500".repeat(50))),t.models.length===0?console.log(` ${d.dim("No models available")}`):t.models.forEach((e,o)=>{let n=R(e.size),r=e.id===t.activeModel?` ${d.green("[LOADED]")}`:"";console.log(` ${d.dim(String(o+1)+".")} ${e.name}${r}`),console.log(` ${d.dim("ID:")} ${e.id}`),console.log(` ${d.dim("Size:")} ${n}`),e.quantization&&console.log(` ${d.dim("Quantization:")} ${e.quantization}`),console.log()})}var y=new me;y.name("aix-cli").description("AI CLI tool that integrates LM Studio with Claude Code or OpenCode for local AI-powered development").version("2.0.0").showHelpAfterError();function b(t=0){console.log(),console.log(u.dim(t===0?"\u{1F44B} Goodbye!":"\u274C Cancelled.")),process.exit(t)}process.on("SIGINT",()=>b(0));process.on("SIGTERM",()=>b(0));process.on("uncaughtException",t=>{t.message?.includes("ExitPromptError")||t.message?.includes("User force closed")||t.message?.includes("prompt")?b(0):(console.error(u.red("Error:"),t.message),process.exit(1))});process.on("unhandledRejection",t=>{let e=String(t);(e.includes("ExitPromptError")||e.includes("User force 
closed")||e.includes("prompt"))&&b(0)});y.command("init",{isDefault:!1}).aliases(["i","load"]).description("Initialize and load a model into LM Studio").option("-m, --model <name>","Model name or ID to load","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").action(j);y.command("run",{isDefault:!1}).aliases(["r"]).description("Run Claude Code or OpenCode with a model from LM Studio").option("-m, --model <name>","Model name or ID to use","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").option("-v, --verbose","Show verbose output").argument("[args...]","Additional arguments for the provider").action(async(t,e)=>{await G({...e,args:t})});y.command("status",{isDefault:!1}).aliases(["s","stats"]).description("Show LM Studio status and available models").action(K);y.command("doctor",{isDefault:!1}).aliases(["d","check"]).description("Check system requirements and configuration").action(async()=>{let{lmStudioService:t}=await Promise.resolve().then(()=>(C(),Q)),{claudeService:e}=await Promise.resolve().then(()=>(F(),oe)),{openCodeService:o}=await Promise.resolve().then(()=>(B(),te)),{configService:n}=await Promise.resolve().then(()=>(M(),W));console.log(u.bold.cyan("\u{1F527} AIX CLI System Check")),console.log(u.dim("\u2500".repeat(40)));let r=await t.checkStatus(),i=await e.isClaudeCodeInstalled(),a=await o.isOpenCodeInstalled(),s=n.getDefaultProvider(),l=n.get("lmStudioPort");console.log(),console.log(`${r?"\u2705":"\u26A0\uFE0F"} LM Studio: ${r?u.green("Running"):u.yellow("Not running")}`),console.log(`${i?"\u2705":"\u274C"} Claude Code: ${i?u.green("Installed"):u.red("Not installed")}`),console.log(`${a?"\u2705":"\u274C"} OpenCode: ${a?u.green("Installed"):u.red("Not installed")}`),console.log(`\u{1F310} Server: ${u.cyan(`http://localhost:${l}`)}`),console.log(`\u{1F4CC} Default provider: ${u.cyan(s)}`),(!i||!a||!r)&&(console.log(),console.log(u.bold("\u{1F4CB} Next Steps:")),i||console.log(` 1. 
${u.cyan("npm install -g @anthropic-ai/claude-code")}`),a||console.log(` 2. ${u.cyan("npm install -g opencode")}`),r||console.log(" 3. Open LM Studio and start the server")),console.log(),console.log(u.dim("\u{1F4D6} Docs: ")+u.cyan("https://lmstudio.ai"))});y.parse();
//# sourceMappingURL=aix.js.map
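Two helpers are still legible in the minified bundle above: the byte formatter used when listing models, and the rewrite of an LM Studio model ID into the `lmstudio/` model string handed to Claude Code or OpenCode. A readable reconstruction (the chosen names are mine; the bundle uses minified identifiers):

```javascript
// Readable reconstruction of the bundle's byte formatter (minified name: R).
function formatBytes(bytes) {
  if (bytes === 0) return "0 B";
  const k = 1024;
  const units = ["B", "KB", "MB", "GB", "TB"];
  const i = Math.floor(Math.log(bytes) / Math.log(k));
  return `${parseFloat((bytes / Math.pow(k, i)).toFixed(2))} ${units[i]}`;
}

// The run command rewrites "vendor/model" IDs so the provider CLI sees a
// single path segment, e.g. "qwen/qwen2.5-coder-14b" becomes
// "lmstudio/qwen--qwen2.5-coder-14b" (the bundle does `id.replace("/", "--")`).
function toProviderModelId(lmStudioId) {
  return `lmstudio/${lmStudioId.replace("/", "--")}`;
}
```

For example, `formatBytes(1536)` yields `"1.5 KB"`, matching the sizes printed next to each model by `aix-cli status`.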