@iamharshil/aix-cli 2.0.3 → 3.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,74 +1,82 @@
1
+ <div align="center">
2
+
1
3
  # AIX CLI
2
4
 
3
- > AI-powered CLI tool that integrates LM Studio with Claude Code or OpenCode for enhanced local development assistance
5
+ **Run Claude Code & OpenCode with local AI models. No API keys. No cloud. Complete privacy.**
4
6
 
5
- <div align="center">
7
+ [![npm version](https://img.shields.io/npm/v/@iamharshil/aix-cli.svg?style=flat-square&color=cb3837)](https://www.npmjs.com/package/@iamharshil/aix-cli)
8
+ [![Downloads](https://img.shields.io/npm/dm/@iamharshil/aix-cli?style=flat-square&color=blue)](https://www.npmjs.com/package/@iamharshil/aix-cli)
9
+ [![License](https://img.shields.io/badge/license-MIT-green?style=flat-square)](LICENSE)
10
+ [![CI](https://img.shields.io/github/actions/workflow/status/iamharshil/aix-cli/ci.yml?style=flat-square&label=CI)](https://github.com/iamharshil/aix-cli/actions)
11
+ [![Node](https://img.shields.io/badge/node-%E2%89%A518-417e38?style=flat-square)](https://nodejs.org/)
6
12
 
7
- [![npm version](https://img.shields.io/npm/v/@iamharshil/aix-cli.svg)](https://www.npmjs.com/package/@iamharshil/aix-cli)
8
- [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
9
- [![Node.js Version](https://img.shields.io/badge/node-%3E%3D18.0.0-brightgreen)](https://nodejs.org/)
10
- [![GitHub Release](https://img.shields.io/github/v/release/iamharshil/aix-cli)](https://github.com/iamharshil/aix-cli/releases)
11
- [![Build Status](https://img.shields.io/github/actions/workflow/status/iamharshil/aix-cli/ci)](https://github.com/iamharshil/aix-cli/actions)
12
- [![Total Downloads](https://img.shields.io/npm/dt/@iamharshil/aix-cli)](https://www.npmjs.com/package/@iamharshil/aix-cli)
13
+ [Getting Started](#getting-started) · [Documentation](#usage) · [Contributing](CONTRIBUTING.md) · [Changelog](CHANGELOG.md)
13
14
 
14
15
  </div>
15
16
 
16
17
  ---
17
18
 
18
- ## Overview
19
+ ## What is AIX?
19
20
 
20
- **AIX CLI** enables you to use locally-running AI models from [LM Studio](https://lmstudio.ai) directly with [Claude Code](https://docs.anthropic.com/en/docs/claude-code) or [OpenCode](https://opencode.ai). No API keys, no cloud dependencies, complete privacy.
21
+ AIX CLI is a bridge between local model servers and AI coding assistants. It connects [LM Studio](https://lmstudio.ai) or [Ollama](https://ollama.com) to [Claude Code](https://docs.anthropic.com/en/docs/claude-code) or [OpenCode](https://opencode.ai), letting you use **locally-running language models** as the backend for your favorite AI dev tools.
21
22
 
22
- ### Why AIX?
23
+ No API keys. No cloud calls. No data leaving your machine.
23
24
 
24
- | Feature | Description |
25
- | -------------------- | ---------------------------------------------- |
26
- | 🔒 **Privacy-First** | All processing happens locally on your machine |
27
- | 🔑 **No API Keys** | No external services or subscriptions |
28
- | 🚀 **Fast** | Local inference with your GPU |
29
- | 🛡️ **Secure** | Your code never leaves your machine |
30
- | 🔧 **Simple** | Just run `aix-cli run` and start coding |
25
+ ```
26
+ ┌──────────────────────────────────────────────────┐
27
+ │ $ aix-cli run                                    │
28
+ │                                                  │
29
+ │ ? Select model backend: Ollama                   │
30
+ │ ✔ Connected to Ollama                            │
31
+ │ ✔ Model selected: qwen2.5-coder:14b              │
32
+ │ ✔ Launching Claude Code...                       │
33
+ │                                                  │
34
+ │ Your code stays local. Always.                   │
35
+ └──────────────────────────────────────────────────┘
36
+ ```
31
37
 
32
- ## Prerequisites
38
+ ### Why AIX?
33
39
 
34
- - [Node.js](https://nodejs.org/) 18.0.0 or higher
35
- - [LM Studio](https://lmstudio.ai) - Download and run local AI models
36
- - [Claude Code](https://docs.anthropic.com/en/docs/claude-code) **or** [OpenCode](https://opencode.ai) - AI coding assistant
40
+ - 🔒 **Privacy-first** — All inference runs locally on your hardware. Your code never leaves your machine.
41
+ - 🔑 **No API keys** — No subscriptions, no usage limits, no cloud dependencies.
42
+ - 🚀 **GPU-accelerated** — Take advantage of your local GPU for fast inference.
43
+ - 🔀 **Multi-backend** — Use LM Studio or Ollama as your model server.
44
+ - 🛠️ **Multi-provider** — Switch between Claude Code and OpenCode with a single flag.
45
+ - ⚡ **Zero config** — Just run `aix-cli run` and start coding.
37
46
 
38
- ## Quick Start
47
+ ---
39
48
 
40
- ```bash
41
- # Install
42
- npm install -g @iamharshil/aix-cli
49
+ ## Getting Started
43
50
 
44
- # Verify setup
45
- aix-cli doctor
51
+ ### Prerequisites
46
52
 
47
- # Run with interactive model selection
48
- aix-cli run
49
- ```
50
-
51
- ## Installation
53
+ | Requirement | Description |
54
+ |---|---|
55
+ | [Node.js](https://nodejs.org/) ≥ 18 | JavaScript runtime |
56
+ | [LM Studio](https://lmstudio.ai) **or** [Ollama](https://ollama.com) | Local model server (at least one required) |
57
+ | [Claude Code](https://docs.anthropic.com/en/docs/claude-code) **or** [OpenCode](https://opencode.ai) | AI coding assistant (at least one required) |
52
58
 
53
- ### Using npm (Recommended)
59
+ ### Install
54
60
 
55
61
  ```bash
56
62
  npm install -g @iamharshil/aix-cli
57
63
  ```
58
64
 
59
- ### Using Yarn
65
+ <details>
66
+ <summary>Other package managers</summary>
60
67
 
61
68
  ```bash
69
+ # Yarn
62
70
  yarn global add @iamharshil/aix-cli
63
- ```
64
-
65
- ### Using pnpm
66
71
 
67
- ```bash
72
+ # pnpm
68
73
  pnpm add -g @iamharshil/aix-cli
69
74
  ```
70
75
 
71
- ### From Source
76
+ </details>
77
+
78
+ <details>
79
+ <summary>Build from source</summary>
72
80
 
73
81
  ```bash
74
82
  git clone https://github.com/iamharshil/aix-cli.git
@@ -78,69 +86,96 @@ npm run build
78
86
  npm link
79
87
  ```
80
88
 
81
- ## Usage
89
+ </details>
82
90
 
83
- ### System Check
84
-
85
- Verify your environment is properly configured:
91
+ ### Verify
86
92
 
87
93
  ```bash
88
94
  aix-cli doctor
89
95
  ```
90
96
 
91
- ### Initialize Model
97
+ This checks that LM Studio / Ollama, Claude Code / OpenCode, and your environment are properly configured.
92
98
 
93
- Load a model from your local LM Studio models:
99
+ ---
100
+
101
+ ## Usage
102
+
103
+ ### `aix-cli run` — Start a coding session
104
+
105
+ The primary command. Launches Claude Code or OpenCode backed by a local model.
94
106
 
95
107
  ```bash
96
- # Interactive selection (prompts for provider)
97
- aix-cli init
108
+ # Interactive prompts for backend, model, and provider
109
+ aix-cli run
98
110
 
99
- # Specific model
100
- aix-cli init -m llama-3-8b
111
+ # Specify backend and model
112
+ aix-cli run -b ollama -m qwen2.5-coder:14b
113
+ aix-cli run -b lmstudio -m llama-3-8b
101
114
 
102
- # With provider
103
- aix-cli init -m llama-3-8b --provider opencode
115
+ # Use OpenCode instead of Claude Code
116
+ aix-cli run --provider opencode
117
+
118
+ # Pass a prompt directly
119
+ aix-cli run -b ollama -m qwen2.5-coder:14b -- "Refactor auth middleware"
104
120
  ```
105
121
 
106
- ### Run with Provider
122
+ ### `aix-cli init` — Set up backend and model
107
123
 
108
- Start coding with your local model using Claude Code or OpenCode:
124
+ Configure your preferred backend and load/select a model.
109
125
 
110
126
  ```bash
111
- # Interactive session (uses default provider)
112
- aix-cli run
127
+ aix-cli init # Interactive setup
128
+ aix-cli init -b ollama -m qwen2.5-coder:14b # Ollama with specific model
129
+ aix-cli init -b lmstudio -m llama-3-8b -p claude # Full config in one command
130
+ ```
113
131
 
114
- # With specific model
115
- aix-cli run -m llama-3-8b
132
+ ### `aix-cli status` — Check what's running
116
133
 
117
- # With OpenCode instead of Claude Code
118
- aix-cli run --provider opencode
119
- aix-cli run -p opencode -m llama-3-8b
134
+ Shows status for both LM Studio and Ollama, including available and running models.
120
135
 
121
- # With arguments
122
- aix-cli run -m llama-3-8b -- "Write a hello world in Python"
136
+ ```bash
137
+ aix-cli status
123
138
  ```
124
139
 
125
- ### Check Status
140
+ ### `aix-cli doctor` — System diagnostics
126
141
 
127
- View LM Studio status and loaded models:
142
+ Verifies your environment is ready to go.
128
143
 
129
144
  ```bash
130
- aix-cli status
145
+ aix-cli doctor
131
146
  ```
132
147
 
133
- ## Configuration
148
+ ### Command Reference
149
+
150
+ | Command | Aliases | Description |
151
+ |---|---|---|
152
+ | `run` | `r` | Run Claude Code / OpenCode with a local model |
153
+ | `init` | `i`, `load` | Set up backend, select model, configure provider |
154
+ | `status` | `s`, `stats` | Show LM Studio & Ollama status |
155
+ | `doctor` | `d`, `check` | Run system diagnostics |
156
+
157
+ ### Global Options
158
+
159
+ | Flag | Description |
160
+ |---|---|
161
+ | `-b, --backend <name>` | Model backend: `lmstudio` or `ollama` |
162
+ | `-m, --model <name>` | Model name or ID to use |
163
+ | `-p, --provider <name>` | Coding tool: `claude` (default) or `opencode` |
164
+ | `-v, --verbose` | Show verbose output |
165
+ | `-h, --help` | Show help |
166
+ | `-V, --version` | Show version |
167
+
168
+ ---
134
169
 
135
- ### Config Location
170
+ ## Configuration
136
171
 
137
- AIX CLI stores configuration in your system's app data directory:
172
+ AIX stores its configuration in the OS-appropriate config directory:
138
173
 
139
- | Platform | Path |
140
- | -------- | ---------------------------------------- |
141
- | macOS | `~/Library/Application Support/aix-cli/` |
142
- | Linux | `~/.config/aix-cli/` |
143
- | Windows | `%APPDATA%\aix-cli\` |
174
+ | Platform | Path |
175
+ |---|---|
176
+ | macOS | `~/Library/Application Support/aix-cli/` |
177
+ | Linux | `~/.config/aix-cli/` |
178
+ | Windows | `%APPDATA%\aix-cli\` |
144
179
 
145
180
  ### Config File
146
181
 
@@ -148,131 +183,161 @@ AIX CLI stores configuration in your system's app data directory:
148
183
  {
149
184
  "lmStudioUrl": "http://localhost",
150
185
  "lmStudioPort": 1234,
186
+ "ollamaUrl": "http://localhost",
187
+ "ollamaPort": 11434,
151
188
  "defaultTimeout": 30000,
152
- "model": "lmstudio/llama-3-8b",
153
- "defaultProvider": "claude"
189
+ "defaultBackend": "ollama",
190
+ "defaultProvider": "claude",
191
+ "model": "qwen2.5-coder:14b"
154
192
  }
155
193
  ```
156
194
 
157
195
  ### Environment Variables
158
196
 
159
- | Variable | Description | Default |
160
- | ---------------- | --------------------- | ------- |
161
- | `LM_STUDIO_PORT` | LM Studio server port | `1234` |
197
+ | Variable | Description | Default |
198
+ |---|---|---|
199
+ | `LM_STUDIO_PORT` | Override the LM Studio server port | `1234` |
200
+
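+ 
+ For example, the documented variable can be set inline for a single session (a usage sketch; assumes an LM Studio server is already listening on the overridden port):
+ 
+ ```bash
+ # Override the LM Studio port just for this invocation
+ LM_STUDIO_PORT=1235 aix-cli run
+ ```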
201
+ ---
162
202
 
163
- ## Architecture
203
+ ## How It Works
164
204
 
165
205
  ```
166
- ┌─────────────────────────────────────────────────────┐
167
- │                      AIX Flow                       │
168
- ├─────────────────────────────────────────────────────┤
169
- │                                                     │
170
- │   ┌──────────┐        ┌──────────────┐              │
171
- │   │   User   │───────▶│  LM Studio   │              │
172
- │   └──────────┘        └──────────────┘              │
173
- │                              │                      │
174
- │                              ▼                      │
175
- │                       ┌──────────────┐              │
176
- │                       │  Local API   │              │
177
- │                       │ (port 1234)  │              │
178
- │                       └──────────────┘              │
179
- │                              │                      │
180
- │             ┌────────────────┴─────────────┐        │
181
- │             │                              │        │
182
- │             ▼                              ▼        │
183
- │       ┌──────────┐                    ┌──────────┐  │
184
- │       │  Claude  │                    │ OpenCode │  │
185
- │       │   Code   │                    │          │  │
186
- │       │ --model  │                    │ --model  │  │
187
- │       └──────────┘                    └──────────┘  │
188
- │                                                     │
189
- └─────────────────────────────────────────────────────┘
206
+       ┌───────────────────┐        ┌───────────────────┐
207
+       │     LM Studio     │        │      Ollama       │
208
+       │    (port 1234)    │        │   (port 11434)    │
209
+       └─────────┬─────────┘        └─────────┬─────────┘
210
+                 │                            │
211
+              REST API                     REST API
212
+                 │                            │
213
+                 └─────────────┬──────────────┘
214
+                               │
215
+                     ┌─────────┴─────────┐
216
+                     │      AIX CLI      │
217
+                     │  backend routing  │
218
+                     │  model selection  │
219
+                     │    config mgmt    │
220
+                     └────┬─────────┬────┘
221
+                          │         │
222
+               ┌──────────┘         └──────────┐
223
+               │                               │
224
+       ┌──────┴───────┐                ┌──────┴───────┐
225
+       │ Claude Code  │                │   OpenCode   │
226
+       │  --model X   │                │  --model X   │
227
+       └──────────────┘                └──────────────┘
190
228
  ```
191
229
 
230
+ 1. **LM Studio** or **Ollama** runs a local inference server with an OpenAI-compatible API.
231
+ 2. **AIX CLI** discovers available models, manages configuration, and orchestrates the connection.
232
+ 3. **Claude Code** or **OpenCode** receives the model endpoint and runs as it normally would — except fully local.
233
+
234
+ ---
235
+
192
236
  ## Troubleshooting
193
237
 
194
- ### LM Studio server not running
238
+ <details>
239
+ <summary><strong>LM Studio server not running</strong></summary>
195
240
 
196
- ```bash
197
- # 1. Open LM Studio
198
- # 2. Go to Server tab (left sidebar)
199
- # 3. Click Start Server
200
- # 4. Run: aix-cli run
201
- ```
241
+ 1. Open LM Studio
242
+ 2. Navigate to the **Server** tab (left sidebar)
243
+ 3. Click **Start Server**
244
+ 4. Confirm with `aix-cli status`
245
+
246
+ </details>
247
+
248
+ <details>
249
+ <summary><strong>Ollama not running</strong></summary>
250
+
251
+ 1. Install Ollama from [ollama.com](https://ollama.com)
252
+ 2. Start the server: `ollama serve`
253
+ 3. Pull a model: `ollama pull qwen2.5-coder:14b`
254
+ 4. Confirm with `aix-cli status`
255
+
256
+ </details>
257
+
258
+ <details>
259
+ <summary><strong>No models found</strong></summary>
260
+
261
+ **LM Studio:** Open LM Studio → **Search** tab → download a model.
262
+
263
+ **Ollama:** Run `ollama pull <model>` to download a model (e.g., `ollama pull llama3.2`).
264
+
265
+ Then run `aix-cli init` to select and configure.
266
+
267
+ </details>
268
+
269
+ <details>
270
+ <summary><strong>Connection refused</strong></summary>
271
+
272
+ Check that the correct port is being used:
273
+ - LM Studio defaults to port `1234`
274
+ - Ollama defaults to port `11434`
275
+
276
+ You can configure custom ports in your AIX config file (path shown by `aix-cli doctor`).
277
+
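+ You can also probe the ports directly; the endpoints below are the ones the bundled source polls (adjust the ports if you changed them):
+ 
+ ```bash
+ # LM Studio health check
+ curl http://localhost:1234/api/status
+ 
+ # Ollama model list (also confirms the server is up)
+ curl http://localhost:11434/api/tags
+ ```
+ 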
278
+ </details>
202
279
 
203
- ### No models found
280
+ <details>
281
+ <summary><strong>Claude Code / OpenCode not detected</strong></summary>
282
+
283
+ Install the missing provider globally:
204
284
 
205
285
  ```bash
206
- # 1. Open LM Studio
207
- # 2. Go to Search tab
208
- # 3. Download a model (e.g., Llama 3, Mistral)
209
- # 4. Wait for download to complete
210
- # 5. Run: aix-cli init
286
+ # Claude Code
287
+ npm install -g @anthropic-ai/claude-code
288
+
289
+ # OpenCode
290
+ npm install -g opencode
211
291
  ```
212
292
 
213
- ### Connection refused on port 1234
293
+ Then re-run `aix-cli doctor` to confirm.
214
294
 
215
- Check LM Studio's server tab for the correct port and update your config.
295
+ </details>
296
+
297
+ ---
216
298
 
217
299
  ## Security & Privacy
218
300
 
219
- - All AI processing happens locally
220
- - ✅ No data sent to external servers
221
- - ✅ No telemetry or analytics
222
- - ✅ No API keys required
223
- - ✅ Your code stays on your machine
301
+ AIX is designed around a simple principle: **your code never leaves your machine.**
224
302
 
225
- ## Contributing
303
+ - ✅ All AI inference runs locally via LM Studio or Ollama
304
+ - ✅ No telemetry, analytics, or tracking of any kind
305
+ - ✅ No outbound network calls (except to `localhost`)
306
+ - ✅ No API keys or accounts required
307
+ - ✅ Fully open-source — audit the code yourself
226
308
 
227
- Contributions are welcome! Please read our [Contributing Guide](CONTRIBUTING.md).
309
+ Found a vulnerability? Please report it responsibly via our [Security Policy](SECURITY.md).
228
310
 
229
- ### Development
311
+ ---
312
+
313
+ ## Contributing
314
+
315
+ Contributions are welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines on how to get started.
230
316
 
231
317
  ```bash
232
- # Clone and setup
233
318
  git clone https://github.com/iamharshil/aix-cli.git
234
319
  cd aix-cli
235
320
  npm install
236
-
237
- # Development mode
238
- npm run dev
239
-
240
- # Run tests
241
- npm test
242
-
243
- # Lint
244
- npm run lint
245
-
246
- # Build
247
- npm run build
321
+ npm run dev # Run in development mode
322
+ npm test # Run tests
323
+ npm run lint # Lint
248
324
  ```
249
325
 
250
- ### Code Style
251
-
252
- - TypeScript with strict mode
253
- - ESLint + Prettier configured
254
- - 2-space indentation
255
- - Single quotes, semicolons
326
+ ---
256
327
 
257
328
  ## Related Projects
258
329
 
259
- - [LM Studio](https://lmstudio.ai) - Run local AI models
260
- - [Claude Code](https://docs.anthropic.com/en/docs/claude-code) - AI coding assistant
261
- - [OpenCode](https://opencode.ai) - Open-source AI coding assistant
330
+ - [LM Studio](https://lmstudio.ai) — Run local AI models with a visual interface
331
+ - [Ollama](https://ollama.com) — Run large language models locally
332
+ - [Claude Code](https://docs.anthropic.com/en/docs/claude-code) — Anthropic's AI coding assistant
333
+ - [OpenCode](https://opencode.ai) — Open-source AI coding assistant
262
334
 
263
335
  ## License
264
336
 
265
- [MIT](LICENSE) - © 2024 Harshil
266
-
267
- ## Support
268
-
269
- - 📋 [Issues](https://github.com/iamharshil/aix-cli/issues) - Report bugs
270
- - 💬 [Discussions](https://github.com/iamharshil/aix-cli/discussions) - Ask questions
337
+ [MIT](LICENSE) © [Harshil](https://github.com/iamharshil)
271
338
 
272
339
  ---
273
340
 
274
341
  <div align="center">
275
-
276
- Built with ❤️ for privacy-conscious developers
277
-
342
+ <sub>Built for developers who care about privacy.</sub>
278
343
  </div>
package/dist/bin/aix.js CHANGED
@@ -1,10 +1,11 @@
1
1
  #!/usr/bin/env node
2
- var ne=Object.defineProperty;var ie=(t=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(t,{get:(e,o)=>(typeof require<"u"?require:e)[o]}):t)(function(t){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+t+'" is not supported')});var O=(t,e)=>()=>(t&&(e=t(t=0)),e);var A=(t,e)=>{for(var o in e)ne(t,o,{get:e[o],enumerable:!0})};var W={};A(W,{ConfigService:()=>E,configService:()=>c});import se from"conf";var E,c,M=O(()=>{"use strict";E=class{store;constructor(){this.store=new se({projectName:"aix",defaults:{lmStudioUrl:"http://localhost",lmStudioPort:1234,defaultTimeout:3e4,autoStartServer:!1},clearInvalidConfig:!0})}get(e){return this.store.get(e)}set(e,o){this.store.set(e,o)}setModel(e){this.store.set("model",e)}getLastUsedModel(){return this.store.get("model")}setDefaultProvider(e){this.store.set("defaultProvider",e)}getDefaultProvider(){return this.store.get("defaultProvider")}getLMStudioUrl(){let e=this.store.get("lmStudioUrl"),o=this.store.get("lmStudioPort");return`${e}:${o}`}reset(){this.store.clear()}},c=new E});var Q={};A(Q,{LMStudioService:()=>I,lmStudioService:()=>g});import{execa as k}from"execa";import H from"ora";import V from"chalk";var J,I,g,C=O(()=>{"use strict";M();J=[1234,1235,1236,1237],I=class{baseUrl;constructor(){this.baseUrl=c.getLMStudioUrl()}getApiUrl(e){return`${this.baseUrl}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){let e=["/api/v1/models","/api/models","/v1/models","/api/ls-model/list"];for(let o of e)try{let n=await fetch(this.getApiUrl(o),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!n.ok)continue;let r=await n.json(),i=[];return Array.isArray(r)?i=r:r.models&&Array.isArray(r.models)?i=r.models:r.data&&Array.isArray(r.data)&&(i=r.data),i.map(a=>{let 
s=a;return{id:String(s.key||s.id||s.model||""),name:String(s.display_name||s.name||s.id||s.model||""),size:Number(s.size_bytes||s.size||s.file_size||0),quantization:String(s.quantization?typeof s.quantization=="object"?s.quantization.name:s.quantization:"")}}).filter(a=>a.id&&a.name)}catch{continue}return[]}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("lmStudioPort"),models:[]};try{let o=await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!o.ok)return{running:!1,port:c.get("lmStudioPort"),models:[]};let n=await o.json();return{running:!0,port:c.get("lmStudioPort"),models:n.models??[],activeModel:n.active_model}}catch{return{running:!1,port:c.get("lmStudioPort"),models:[]}}}async loadModel(e,o){let n=o??H({text:`Loading model: ${V.cyan(e)}`,color:"cyan"}).start();try{let r=await fetch(this.getApiUrl("/api/model/load"),{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({model:e}),signal:AbortSignal.timeout(3e5)});if(!r.ok)throw new Error(`Failed to load model: ${r.statusText}`);return n.succeed(`Model ${V.green(e)} loaded successfully`),c.setModel(e),{loadSpinner:n}}catch(r){throw n.fail(`Failed to load model: ${r instanceof Error?r.message:"Unknown error"}`),r}}async startServer(e){let o=e??H({text:"Starting LM Studio server...",color:"cyan"}).start();try{let n=process.platform==="darwin",r=process.platform==="linux",i=process.platform==="win32",a;if(n){let s=["/Applications/LM Studio.app",`${process.env.HOME}/Applications/LM Studio.app`];for(let l of s)try{let{existsSync:p}=await import("fs");if(p(l)){a=`open "${l}" --args --server`;break}}catch{}if(a?.startsWith("open")){await k("open",[s.find(l=>{try{let{existsSync:p}=ie("fs");return p(l)}catch{return!1}})||"/Applications/LM Studio.app","--args","--server"],{detached:!0,stdio:"ignore"}),o.succeed("LM Studio server started"),await this.waitForServer(6e4);return}}else r?a=await this.findLinuxBinary():i&&(a=await 
this.findWindowsExecutable());if(!a)throw o.fail("LM Studio not found. Please install it from https://lmstudio.ai"),new Error("LM Studio not installed");await k(a,["--server"],{detached:!0,stdio:"ignore",env:{...process.env,LM_STUDIO_SERVER_PORT:String(c.get("lmStudioPort"))}}),o.succeed("LM Studio server started"),await this.waitForServer(6e4)}catch(n){throw o.fail(`Failed to start LM Studio: ${n instanceof Error?n.message:"Unknown error"}`),n}}async findLinuxBinary(){let e=["/usr/bin/lm-studio","/usr/local/bin/lm-studio",`${process.env.HOME}/.local/bin/lm-studio`];for(let o of e)try{return await k("test",["-x",o]),o}catch{continue}}async findWindowsExecutable(){let e=process.env.LOCALAPPDATA,o=process.env.PROGRAMFILES,n=[e?`${e}\\Programs\\LM Studio\\lm-studio.exe`:"",o?`${o}\\LM Studio\\lm-studio.exe`:""].filter(Boolean);for(let r of n)try{return await k("cmd",["/c","if exist",`"${r}"`,"echo","yes"]),r}catch{continue}}async waitForServer(e=6e4){let o=Date.now();for(;Date.now()-o<e;){if(await this.checkStatus())return!0;await this.sleep(2e3)}return!1}sleep(e){return new Promise(o=>setTimeout(o,e))}async findAvailablePort(){for(let e of J)try{if((await fetch(`http://localhost:${e}/api/status`,{method:"GET",signal:AbortSignal.timeout(1e3)})).ok)return e}catch{return c.set("lmStudioPort",e),e}return J[0]??1234}},g=new I});var oe={};A(oe,{ClaudeService:()=>U,claudeService:()=>P});import{execa as _}from"execa";import le from"chalk";var U,P,F=O(()=>{"use strict";U=class{async isClaudeCodeInstalled(){try{return await _("claude",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:n=[],verbose:r=!1}=e,i=this.extractProvider(o),a=this.extractModelName(o);if(!i||!a)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=`${i}/${a}`,l=["--model",s,...n];r&&console.log(le.dim(`
3
- Running: claude ${l.join(" ")}
4
- `));try{await _("claude",l,{stdio:"inherit",env:{...process.env,ANTHROPIC_MODEL:s}})}catch(p){if(p instanceof Error&&"exitCode"in p){let L=p.exitCode;process.exit(L??1)}throw p}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await _("claude",["--version"])).stdout}catch{return}}},P=new U});var te={};A(te,{OpenCodeService:()=>z,openCodeService:()=>$});import{execa as q}from"execa";import de from"chalk";var z,$,B=O(()=>{"use strict";z=class{async isOpenCodeInstalled(){try{return await q("opencode",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:n=[],verbose:r=!1}=e,i=this.extractProvider(o),a=this.extractModelName(o);if(!i||!a)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=["--model",o,...n];r&&console.log(de.dim(`
2
+ var ge=Object.defineProperty;var pe=(t=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(t,{get:(e,o)=>(typeof require<"u"?require:e)[o]}):t)(function(t){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+t+'" is not supported')});var C=(t,e)=>()=>(t&&(e=t(t=0)),e);var O=(t,e)=>{for(var o in e)ge(t,o,{get:e[o],enumerable:!0})};var Y={};O(Y,{ConfigService:()=>I,configService:()=>c});import fe from"conf";var I,c,M=C(()=>{"use strict";I=class{store;constructor(){this.store=new fe({projectName:"aix",defaults:{lmStudioUrl:"http://localhost",lmStudioPort:1234,ollamaUrl:"http://localhost",ollamaPort:11434,defaultTimeout:3e4,autoStartServer:!1},clearInvalidConfig:!0})}get(e){return this.store.get(e)}set(e,o){this.store.set(e,o)}setModel(e){this.store.set("model",e)}getLastUsedModel(){return this.store.get("model")}setDefaultProvider(e){this.store.set("defaultProvider",e)}getDefaultProvider(){return this.store.get("defaultProvider")}setDefaultBackend(e){this.store.set("defaultBackend",e)}getDefaultBackend(){return this.store.get("defaultBackend")}getLMStudioUrl(){let e=this.store.get("lmStudioUrl"),o=this.store.get("lmStudioPort");return`${e}:${o}`}getOllamaUrl(){let e=this.store.get("ollamaUrl"),o=this.store.get("ollamaPort");return`${e}:${o}`}reset(){this.store.clear()}},c=new I});var te={};O(te,{LMStudioService:()=>z,lmStudioService:()=>h});import{execa as U}from"execa";import Z from"ora";import ee from"chalk";var oe,z,h,L=C(()=>{"use strict";M();oe=[1234,1235,1236,1237],z=class{baseUrl;constructor(){this.baseUrl=c.getLMStudioUrl()}getApiUrl(e){return`${this.baseUrl}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){let e=["/api/v1/models","/api/models","/v1/models","/api/ls-model/list"];for(let o of e)try{let r=await 
fetch(this.getApiUrl(o),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!r.ok)continue;let n=await r.json(),i=[];return Array.isArray(n)?i=n:n.models&&Array.isArray(n.models)?i=n.models:n.data&&Array.isArray(n.data)&&(i=n.data),i.map(l=>{let s=l;return{id:String(s.key||s.id||s.model||""),name:String(s.display_name||s.name||s.id||s.model||""),size:Number(s.size_bytes||s.size||s.file_size||0),quantization:String(s.quantization?typeof s.quantization=="object"?s.quantization.name:s.quantization:"")}}).filter(l=>l.id&&l.name)}catch{continue}return[]}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("lmStudioPort"),models:[]};try{let o=await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!o.ok)return{running:!1,port:c.get("lmStudioPort"),models:[]};let r=await o.json();return{running:!0,port:c.get("lmStudioPort"),models:r.models??[],activeModel:r.active_model}}catch{return{running:!1,port:c.get("lmStudioPort"),models:[]}}}async loadModel(e,o){let r=o??Z({text:`Loading model: ${ee.cyan(e)}`,color:"cyan"}).start();try{let n=await fetch(this.getApiUrl("/api/model/load"),{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({model:e}),signal:AbortSignal.timeout(3e5)});if(!n.ok)throw new Error(`Failed to load model: ${n.statusText}`);return r.succeed(`Model ${ee.green(e)} loaded successfully`),c.setModel(e),{loadSpinner:r}}catch(n){throw r.fail(`Failed to load model: ${n instanceof Error?n.message:"Unknown error"}`),n}}async startServer(e){let o=e??Z({text:"Starting LM Studio server...",color:"cyan"}).start();try{let r=process.platform==="darwin",n=process.platform==="linux",i=process.platform==="win32",l;if(r){let s=["/Applications/LM Studio.app",`${process.env.HOME}/Applications/LM Studio.app`];for(let d of s)try{let{existsSync:u}=await import("fs");if(u(d)){l=`open "${d}" --args --server`;break}}catch{}if(l?.startsWith("open")){await 
U("open",[s.find(d=>{try{let{existsSync:u}=pe("fs");return u(d)}catch{return!1}})||"/Applications/LM Studio.app","--args","--server"],{detached:!0,stdio:"ignore"}),o.succeed("LM Studio server started"),await this.waitForServer(6e4);return}}else n?l=await this.findLinuxBinary():i&&(l=await this.findWindowsExecutable());if(!l)throw o.fail("LM Studio not found. Please install it from https://lmstudio.ai"),new Error("LM Studio not installed");await U(l,["--server"],{detached:!0,stdio:"ignore",env:{...process.env,LM_STUDIO_SERVER_PORT:String(c.get("lmStudioPort"))}}),o.succeed("LM Studio server started"),await this.waitForServer(6e4)}catch(r){throw o.fail(`Failed to start LM Studio: ${r instanceof Error?r.message:"Unknown error"}`),r}}async findLinuxBinary(){let e=["/usr/bin/lm-studio","/usr/local/bin/lm-studio",`${process.env.HOME}/.local/bin/lm-studio`];for(let o of e)try{return await U("test",["-x",o]),o}catch{continue}}async findWindowsExecutable(){let e=process.env.LOCALAPPDATA,o=process.env.PROGRAMFILES,r=[e?`${e}\\Programs\\LM Studio\\lm-studio.exe`:"",o?`${o}\\LM Studio\\lm-studio.exe`:""].filter(Boolean);for(let n of r)try{return await U("cmd",["/c","if exist",`"${n}"`,"echo","yes"]),n}catch{continue}}async waitForServer(e=6e4){let o=Date.now();for(;Date.now()-o<e;){if(await this.checkStatus())return!0;await this.sleep(2e3)}return!1}sleep(e){return new Promise(o=>setTimeout(o,e))}async findAvailablePort(){for(let e of oe)try{if((await fetch(`http://localhost:${e}/api/status`,{method:"GET",signal:AbortSignal.timeout(1e3)})).ok)return e}catch{return c.set("lmStudioPort",e),e}return oe[0]??1234}},h=new z});var ne={};O(ne,{OllamaService:()=>j,ollamaService:()=>S});var j,S,A=C(()=>{"use strict";M();j=class{getBaseUrl(){return c.getOllamaUrl()}getApiUrl(e){return`${this.getBaseUrl()}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/tags"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){try{let e=await 
fetch(this.getApiUrl("/api/tags"),{method:"GET",signal:AbortSignal.timeout(1e4)});return e.ok?((await e.json()).models??[]).map(n=>{let i=n.details??{};return{id:String(n.name??n.model??""),name:String(n.name??n.model??""),size:Number(n.size??0),quantization:String(i.quantization_level??""),family:String(i.family??""),parameterSize:String(i.parameter_size??"")}}).filter(n=>n.id&&n.name):[]}catch{return[]}}async getRunningModels(){try{let e=await fetch(this.getApiUrl("/api/ps"),{method:"GET",signal:AbortSignal.timeout(5e3)});return e.ok?((await e.json()).models??[]).map(n=>String(n.name??n.model??"")).filter(Boolean):[]}catch{return[]}}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("ollamaPort"),models:[],runningModels:[]};let[o,r]=await Promise.all([this.getAvailableModels(),this.getRunningModels()]);return{running:!0,port:c.get("ollamaPort"),models:o,runningModels:r}}},S=new j});var se={};O(se,{ClaudeService:()=>q,claudeService:()=>R});import{execa as K}from"execa";import Me from"chalk";var q,R,H=C(()=>{"use strict";q=class{async isClaudeCodeInstalled(){try{return await K("claude",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:r=[],verbose:n=!1}=e,i=this.extractProvider(o),l=this.extractModelName(o);if(!i||!l)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=`${i}/${l}`,d=["--model",s,...r];n&&console.log(Me.dim(`
3
+ Running: claude ${d.join(" ")}
4
+ `));try{await K("claude",d,{stdio:"inherit",env:{...process.env,ANTHROPIC_MODEL:s}})}catch(u){if(u instanceof Error&&"exitCode"in u){let v=u.exitCode;process.exit(v??1)}throw u}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await K("claude",["--version"])).stdout}catch{return}}},R=new q});var ae={};O(ae,{OpenCodeService:()=>G,openCodeService:()=>E});import{execa as V}from"execa";import $e from"chalk";var G,E,Q=C(()=>{"use strict";G=class{async isOpenCodeInstalled(){try{return await V("opencode",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:r=[],verbose:n=!1}=e,i=this.extractProvider(o),l=this.extractModelName(o);if(!i||!l)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=["--model",o,...r];n&&console.log($e.dim(`
  Running: opencode ${s.join(" ")}
- `));try{await q("opencode",s,{stdio:"inherit",env:{...process.env,OPENCODE_MODEL_NAME:a,OPENCODE_MODEL_PROVIDER:i}})}catch(l){if(l instanceof Error&&"exitCode"in l){let p=l.exitCode;process.exit(p??1)}throw l}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await q("opencode",["--version"])).stdout}catch{return}}},$=new z});import{Command as me}from"commander";import u from"chalk";C();M();import Z from"ora";import h from"chalk";import ee from"inquirer";import X from"inquirer";async function D(t,e){let o=t.map(i=>({name:`${i.name} (${i.id})`,value:i,short:i.name})),n=e?o.findIndex(i=>i.value.id===e):0;return(await X.prompt([{type:"list",name:"model",message:"Select a model to load:",choices:o,default:Math.max(0,n),pageSize:Math.min(t.length,15)}])).model}async function N(t,e=!0){return(await X.prompt([{type:"confirm",name:"confirm",message:t,default:e}])).confirm}import Y from"chalk";function R(t){if(t===0)return"0 B";let e=1024,o=["B","KB","MB","GB","TB"],n=Math.floor(Math.log(t)/Math.log(e));return`${parseFloat((t/Math.pow(e,n)).toFixed(2))} ${o[n]}`}function x(t){console.log(Y.green("\u2713")+" "+t)}function ae(t){console.error(Y.red("\u2717")+" "+t)}function v(t,e=1){ae(t),process.exit(e)}async function j(t={}){let e;if(t.provider)e=t.provider;else{let f=c.getDefaultProvider(),{providerSelection:w}=await ee.prompt([{type:"list",name:"providerSelection",message:"Select provider:",default:f,choices:[{name:"Claude Code",value:"claude"},{name:"OpenCode",value:"opencode"}]}]);e=w;let{saveDefault:m}=await ee.prompt([{type:"confirm",name:"saveDefault",message:"Save as default provider?",default:!1}]);m&&(c.setDefaultProvider(e),x(`Default provider set to ${h.cyan(e)}`))}let o=Z({text:"Checking LM Studio status...",color:"cyan"}).start(),n=await g.checkStatus();n||(o.info("LM Studio server not running"),o.stop(),await N("Would you like to start the LM Studio 
server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await g.startServer(),n=!0),o.succeed("Connected to LM Studio");let r=Z({text:"Fetching available models...",color:"cyan"}).start(),i=await g.getAvailableModels();i.length===0&&(r.fail("No models found. Download some models in LM Studio first."),v("No models available")),r.succeed(`Found ${h.bold(i.length)} model${i.length===1?"":"s"}`),console.log(),console.log(h.bold("Available Models:")),console.log(h.dim("\u2500".repeat(process.stdout.columns||80))),i.forEach((f,w)=>{let m=R(f.size),S=f.loaded?h.green(" [LOADED]"):"";console.log(` ${h.dim(String(w+1).padStart(2))}. ${f.name} ${h.dim(`(${m})`)}${S}`)}),console.log();let a=c.getLastUsedModel(),s=t.model,l=s?i.find(f=>f.id===s||f.name.includes(s)):await D(i,a);l||v("No model selected"),await g.loadModel(l.id,o);let p=l.id.replace("/","--"),L=e==="opencode"?"OpenCode":"Claude Code";x(h.bold(`
- Model ready: ${l.name}`)),console.log(),console.log(`Run ${L} with this model:`),console.log(` ${h.cyan((e==="opencode"?"opencode":"claude")+" --model lmstudio/"+p)}`),console.log(),console.log(`Or use ${h.cyan("aix-cli run")} to start an interactive session`)}C();F();B();M();import T from"ora";import re from"chalk";import ce from"inquirer";async function ue(){let t=await P.isClaudeCodeInstalled(),e=await $.isOpenCodeInstalled(),o=[];if(t&&o.push({name:"Claude Code",value:"claude"}),e&&o.push({name:"OpenCode",value:"opencode"}),o.length===0&&v("Neither Claude Code nor OpenCode is installed."),o.length===1)return o[0].value;let{providerSelection:n}=await ce.prompt([{type:"list",name:"providerSelection",message:"Select coding tool:",choices:o}]);return n}async function G(t={}){let e;if(t.provider)e=t.provider;else{let m=c.getDefaultProvider();m?e=m:e=await ue()}let o=T({text:`Checking ${e==="opencode"?"OpenCode":"Claude Code"} installation...`,color:"cyan"}).start();(e==="opencode"?await $.isOpenCodeInstalled():await P.isClaudeCodeInstalled())||(o.fail(`${e==="opencode"?"OpenCode":"Claude Code"} is not installed.`),v(`Please install ${e==="opencode"?"OpenCode":"Claude Code"} first.`)),o.succeed(`${e==="opencode"?"OpenCode":"Claude Code"} is installed`);let r=T({text:"Checking LM Studio status...",color:"cyan"}).start(),i=await g.checkStatus();i||(r.info("LM Studio server not running"),r.stop(),await N("Would you like to start the LM Studio server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await g.startServer(),i=!0),r.succeed("Connected to LM Studio");let a=T({text:"Fetching available models...",color:"cyan"}).start(),s=await g.getAvailableModels();s.length===0&&(a.fail("No models found. Download some models in LM Studio first."),v("No models available")),a.stop();let l;if(t.model){let m=s.find(S=>S.id===t.model||S.name.toLowerCase().includes(t.model.toLowerCase()));m||v(`Model "${t.model}" not found. 
Available models: ${s.map(S=>S.name).join(", ")}`),l=m.id}else{let m=c.getLastUsedModel();l=(await D(s,m)).id}let p=T({text:`Loading model: ${re.cyan(l)}`,color:"cyan"}).start();await g.loadModel(l,p);let f=`lmstudio/${l.replace("/","--")}`,w=e==="opencode"?"OpenCode":"Claude Code";x(re.green(`
- Starting ${w} with model: ${f}
- `));try{e==="opencode"?await $.run({model:f,args:t.args??[],verbose:t.verbose}):await P.run({model:f,args:t.args??[],verbose:t.verbose})}catch(m){v(`Failed to run ${w}: ${m instanceof Error?m.message:"Unknown error"}`)}}C();import d from"chalk";async function K(){let t=await g.getStatus();console.log(),console.log(d.bold("LM Studio Status")),console.log(d.dim("\u2500".repeat(50))),console.log(` ${t.running?d.green("\u25CF"):d.red("\u25CB")} Server: ${t.running?d.green("Running"):d.red("Stopped")}`),console.log(` ${d.dim("\u25B8")} Port: ${d.cyan(String(t.port))}`),console.log(` ${d.dim("\u25B8")} URL: ${d.cyan(`http://localhost:${t.port}`)}`),t.activeModel?console.log(` ${d.dim("\u25B8")} Active Model: ${d.green(t.activeModel)}`):console.log(` ${d.dim("\u25B8")} Active Model: ${d.dim("None")}`),console.log(),console.log(d.bold("Models")),console.log(d.dim("\u2500".repeat(50))),t.models.length===0?console.log(` ${d.dim("No models available")}`):t.models.forEach((e,o)=>{let n=R(e.size),r=e.id===t.activeModel?` ${d.green("[LOADED]")}`:"";console.log(` ${d.dim(String(o+1)+".")} ${e.name}${r}`),console.log(` ${d.dim("ID:")} ${e.id}`),console.log(` ${d.dim("Size:")} ${n}`),e.quantization&&console.log(` ${d.dim("Quantization:")} ${e.quantization}`),console.log()})}var y=new me;y.name("aix-cli").description("AI CLI tool that integrates LM Studio with Claude Code or OpenCode for local AI-powered development").version("2.0.0").showHelpAfterError();function b(t=0){console.log(),console.log(u.dim(t===0?"\u{1F44B} Goodbye!":"\u274C Cancelled.")),process.exit(t)}process.on("SIGINT",()=>b(0));process.on("SIGTERM",()=>b(0));process.on("uncaughtException",t=>{t.message?.includes("ExitPromptError")||t.message?.includes("User force closed")||t.message?.includes("prompt")?b(0):(console.error(u.red("Error:"),t.message),process.exit(1))});process.on("unhandledRejection",t=>{let e=String(t);(e.includes("ExitPromptError")||e.includes("User force 
closed")||e.includes("prompt"))&&b(0)});y.command("init",{isDefault:!1}).aliases(["i","load"]).description("Initialize and load a model into LM Studio").option("-m, --model <name>","Model name or ID to load","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").action(j);y.command("run",{isDefault:!1}).aliases(["r"]).description("Run Claude Code or OpenCode with a model from LM Studio").option("-m, --model <name>","Model name or ID to use","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").option("-v, --verbose","Show verbose output").argument("[args...]","Additional arguments for the provider").action(async(t,e)=>{await G({...e,args:t})});y.command("status",{isDefault:!1}).aliases(["s","stats"]).description("Show LM Studio status and available models").action(K);y.command("doctor",{isDefault:!1}).aliases(["d","check"]).description("Check system requirements and configuration").action(async()=>{let{lmStudioService:t}=await Promise.resolve().then(()=>(C(),Q)),{claudeService:e}=await Promise.resolve().then(()=>(F(),oe)),{openCodeService:o}=await Promise.resolve().then(()=>(B(),te)),{configService:n}=await Promise.resolve().then(()=>(M(),W));console.log(u.bold.cyan("\u{1F527} AIX CLI System Check")),console.log(u.dim("\u2500".repeat(40)));let r=await t.checkStatus(),i=await e.isClaudeCodeInstalled(),a=await o.isOpenCodeInstalled(),s=n.getDefaultProvider(),l=n.get("lmStudioPort");console.log(),console.log(`${r?"\u2705":"\u26A0\uFE0F"} LM Studio: ${r?u.green("Running"):u.yellow("Not running")}`),console.log(`${i?"\u2705":"\u274C"} Claude Code: ${i?u.green("Installed"):u.red("Not installed")}`),console.log(`${a?"\u2705":"\u274C"} OpenCode: ${a?u.green("Installed"):u.red("Not installed")}`),console.log(`\u{1F310} Server: ${u.cyan(`http://localhost:${l}`)}`),console.log(`\u{1F4CC} Default provider: ${u.cyan(s)}`),(!i||!a||!r)&&(console.log(),console.log(u.bold("\u{1F4CB} Next Steps:")),i||console.log(` 1. 
${u.cyan("npm install -g @anthropic-ai/claude-code")}`),a||console.log(` 2. ${u.cyan("npm install -g opencode")}`),r||console.log(" 3. Open LM Studio and start the server")),console.log(),console.log(u.dim("\u{1F4D6} Docs: ")+u.cyan("https://lmstudio.ai"))});y.parse();
+ `));try{await V("opencode",s,{stdio:"inherit",env:{...process.env,OPENCODE_MODEL_NAME:l,OPENCODE_MODEL_PROVIDER:i}})}catch(d){if(d instanceof Error&&"exitCode"in d){let u=d.exitCode;process.exit(u??1)}throw d}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await V("opencode",["--version"])).stdout}catch{return}}},E=new G});import{Command as Ce}from"commander";import g from"chalk";L();A();M();import F from"ora";import p from"chalk";import _ from"inquirer";import ie from"inquirer";async function $(t,e){let o=t.map(i=>({name:`${i.name} (${i.id})`,value:i,short:i.name})),r=e?o.findIndex(i=>i.value.id===e):0;return(await ie.prompt([{type:"list",name:"model",message:"Select a model to load:",choices:o,default:Math.max(0,r),pageSize:Math.min(t.length,15)}])).model}async function T(t,e=!0){return(await ie.prompt([{type:"confirm",name:"confirm",message:t,default:e}])).confirm}import re from"chalk";function b(t){if(t===0)return"0 B";let e=1024,o=["B","KB","MB","GB","TB"],r=Math.floor(Math.log(t)/Math.log(e));return`${parseFloat((t/Math.pow(e,r)).toFixed(2))} ${o[r]}`}function y(t){console.log(re.green("\u2713")+" "+t)}function ve(t){console.error(re.red("\u2717")+" "+t)}function f(t,e=1){ve(t),process.exit(e)}async function he(){let t=c.getDefaultBackend(),{backendSelection:e}=await _.prompt([{type:"list",name:"backendSelection",message:"Select model backend:",default:t??"lmstudio",choices:[{name:"\u{1F5A5}\uFE0F LM Studio",value:"lmstudio"},{name:"\u{1F999} Ollama",value:"ollama"}]}]),{saveDefault:o}=await _.prompt([{type:"confirm",name:"saveDefault",message:"Save as default backend?",default:!1}]);return o&&(c.setDefaultBackend(e),y(`Default backend set to ${p.cyan(e)}`)),e}async function Se(){let t=c.getDefaultProvider(),{providerSelection:e}=await _.prompt([{type:"list",name:"providerSelection",message:"Select coding 
tool:",default:t??"claude",choices:[{name:"Claude Code",value:"claude"},{name:"OpenCode",value:"opencode"}]}]),{saveDefault:o}=await _.prompt([{type:"confirm",name:"saveDefault",message:"Save as default coding tool?",default:!1}]);return o&&(c.setDefaultProvider(e),y(`Default coding tool set to ${p.cyan(e)}`)),e}async function we(t,e){let o=F({text:"Checking LM Studio status...",color:"cyan"}).start(),r=await h.checkStatus();r||(o.info("LM Studio server not running"),o.stop(),await T("Would you like to start the LM Studio server?")||f("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await h.startServer(),r=!0),o.succeed("Connected to LM Studio");let n=F({text:"Fetching available models...",color:"cyan"}).start(),i=await h.getAvailableModels();i.length===0&&(n.fail("No models found. Download some models in LM Studio first."),f("No models available")),n.succeed(`Found ${p.bold(i.length)} model${i.length===1?"":"s"}`),console.log(),console.log(p.bold("Available Models:")),console.log(p.dim("\u2500".repeat(process.stdout.columns||80))),i.forEach((m,x)=>{let w=b(m.size),N=m.loaded?p.green(" [LOADED]"):"";console.log(` ${p.dim(String(x+1).padStart(2))}. ${m.name} ${p.dim(`(${w})`)}${N}`)}),console.log();let l=c.getLastUsedModel(),s=t.model,d=s?i.find(m=>m.id===s||m.name.includes(s)):await $(i,l);d||f("No model selected"),await h.loadModel(d.id,o);let u=d.id.replace("/","--"),v=e==="opencode"?"OpenCode":"Claude Code";y(p.bold(`
+ Model ready: ${d.name}`)),console.log(),console.log(`Run ${v} with this model:`),console.log(` ${p.cyan((e==="opencode"?"opencode":"claude")+" --model lmstudio/"+u)}`),console.log(),console.log(`Or use ${p.cyan("aix-cli run")} to start an interactive session`)}async function ye(t,e){let o=F({text:"Checking Ollama status...",color:"cyan"}).start();await S.checkStatus()||(o.fail("Ollama is not running"),f("Ollama must be running. Start it with: ollama serve")),o.succeed("Connected to Ollama");let n=F({text:"Fetching available models...",color:"cyan"}).start(),i=await S.getAvailableModels();i.length===0&&(n.fail("No models found. Pull a model first: ollama pull <model>"),f("No models available")),n.succeed(`Found ${p.bold(i.length)} model${i.length===1?"":"s"}`);let l=await S.getRunningModels();console.log(),console.log(p.bold("Available Models:")),console.log(p.dim("\u2500".repeat(process.stdout.columns||80))),i.forEach((m,x)=>{let w=b(m.size),ue=l.includes(m.id)?p.green(" [RUNNING]"):"",me=m.parameterSize?p.dim(` ${m.parameterSize}`):"";console.log(` ${p.dim(String(x+1).padStart(2))}. ${m.name}${me} ${p.dim(`(${w})`)}${ue}`)}),console.log();let s=c.getLastUsedModel(),d=t.model,u=d?i.find(m=>m.id===d||m.name.includes(d)):await $(i,s);u||f("No model selected"),c.setModel(u.id);let v=e==="opencode"?"OpenCode":"Claude Code";y(p.bold(`
+ Model selected: ${u.name}`)),console.log(),console.log(`Run ${v} with this model:`),console.log(` ${p.cyan((e==="opencode"?"opencode":"claude")+" --model ollama/"+u.id)}`),console.log(),console.log(`Or use ${p.cyan("aix-cli run")} to start an interactive session`)}async function W(t={}){let e=t.backend??await he(),o=t.provider??await Se();e==="ollama"?await ye(t,o):await we(t,o)}L();A();H();Q();M();import P from"ora";import le from"chalk";import de from"inquirer";async function be(){let t=c.getDefaultBackend();if(t)return t;let{backendSelection:e}=await de.prompt([{type:"list",name:"backendSelection",message:"Select model backend:",choices:[{name:"\u{1F5A5}\uFE0F LM Studio",value:"lmstudio"},{name:"\u{1F999} Ollama",value:"ollama"}]}]);return e}async function Pe(){let t=await R.isClaudeCodeInstalled(),e=await E.isOpenCodeInstalled(),o=[];if(t&&o.push({name:"Claude Code",value:"claude"}),e&&o.push({name:"OpenCode",value:"opencode"}),o.length===0&&f("Neither Claude Code nor OpenCode is installed."),o.length===1)return o[0].value;let{providerSelection:r}=await de.prompt([{type:"list",name:"providerSelection",message:"Select coding tool:",choices:o}]);return r}function B(t){return t==="opencode"?"OpenCode":"Claude Code"}async function ke(t,e){let o=P({text:"Checking LM Studio status...",color:"cyan"}).start(),r=await h.checkStatus();r||(o.info("LM Studio server not running"),o.stop(),await T("Would you like to start the LM Studio server?")||f("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await h.startServer(),r=!0),o.succeed("Connected to LM Studio");let n=P({text:"Fetching available models...",color:"cyan"}).start(),i=await h.getAvailableModels();i.length===0&&(n.fail("No models found. Download some models in LM Studio first."),f("No models available")),n.stop();let l;if(t.model){let v=i.find(m=>m.id===t.model||m.name.toLowerCase().includes(t.model.toLowerCase()));v||f(`Model "${t.model}" not found. 
Available models: ${i.map(m=>m.name).join(", ")}`),l=v.id}else{let v=c.getLastUsedModel();l=(await $(i,v)).id}let s=P({text:`Loading model: ${le.cyan(l)}`,color:"cyan"}).start();await h.loadModel(l,s);let u=`lmstudio/${l.replace("/","--")}`;await ce(e,u,t)}async function xe(t,e){let o=P({text:"Checking Ollama status...",color:"cyan"}).start();await S.checkStatus()||(o.fail("Ollama is not running"),f("Ollama must be running. Start it with: ollama serve")),o.succeed("Connected to Ollama");let n=P({text:"Fetching available models...",color:"cyan"}).start(),i=await S.getAvailableModels();i.length===0&&(n.fail("No models found. Pull a model first: ollama pull <model>"),f("No models available")),n.stop();let l;if(t.model){let d=i.find(u=>u.id===t.model||u.name.toLowerCase().includes(t.model.toLowerCase()));d||f(`Model "${t.model}" not found. Available models: ${i.map(u=>u.name).join(", ")}`),l=d.id}else{let d=c.getLastUsedModel();l=(await $(i,d)).id}c.setModel(l);let s=`ollama/${l}`;await ce(e,s,t)}async function ce(t,e,o){let r=B(t);y(le.green(`
+ Starting ${r} with model: ${e}
+ `));try{t==="opencode"?await E.run({model:e,args:o.args??[],verbose:o.verbose}):await R.run({model:e,args:o.args??[],verbose:o.verbose})}catch(n){f(`Failed to run ${r}: ${n instanceof Error?n.message:"Unknown error"}`)}}async function J(t={}){let e;if(t.provider)e=t.provider;else{let i=c.getDefaultProvider();i?e=i:e=await Pe()}let o=P({text:`Checking ${B(e)} installation...`,color:"cyan"}).start();(e==="opencode"?await E.isOpenCodeInstalled():await R.isClaudeCodeInstalled())||(o.fail(`${B(e)} is not installed.`),f(`Please install ${B(e)} first.`)),o.succeed(`${B(e)} is installed`),(t.backend??await be())==="ollama"?await xe(t,e):await ke(t,e)}L();A();import a from"chalk";async function X(){let[t,e]=await Promise.all([h.getStatus(),S.getStatus()]);console.log(),console.log(a.bold("LM Studio")),console.log(a.dim("\u2500".repeat(50))),console.log(` ${t.running?a.green("\u25CF"):a.red("\u25CB")} Server: ${t.running?a.green("Running"):a.red("Stopped")}`),console.log(` ${a.dim("\u25B8")} Port: ${a.cyan(String(t.port))}`),console.log(` ${a.dim("\u25B8")} URL: ${a.cyan(`http://localhost:${t.port}`)}`),t.activeModel&&console.log(` ${a.dim("\u25B8")} Active Model: ${a.green(t.activeModel)}`),t.running&&t.models.length>0?(console.log(),console.log(a.bold(" Models")),t.models.forEach((o,r)=>{let n=b(o.size),i=o.id===t.activeModel?` ${a.green("[LOADED]")}`:"";console.log(` ${a.dim(String(r+1)+".")} ${o.name}${i}`),console.log(` ${a.dim("ID:")} ${o.id}`),console.log(` ${a.dim("Size:")} ${n}`),o.quantization&&console.log(` ${a.dim("Quantization:")} ${o.quantization}`)})):t.running&&console.log(` ${a.dim("No models available")}`),console.log(),console.log(a.bold("Ollama")),console.log(a.dim("\u2500".repeat(50))),console.log(` ${e.running?a.green("\u25CF"):a.red("\u25CB")} Server: ${e.running?a.green("Running"):a.red("Stopped")}`),console.log(` ${a.dim("\u25B8")} Port: ${a.cyan(String(e.port))}`),console.log(` ${a.dim("\u25B8")} URL: 
${a.cyan(`http://localhost:${e.port}`)}`),e.running&&e.runningModels.length>0&&console.log(` ${a.dim("\u25B8")} Running: ${a.green(e.runningModels.join(", "))}`),e.running&&e.models.length>0?(console.log(),console.log(a.bold(" Models")),e.models.forEach((o,r)=>{let n=b(o.size),l=e.runningModels.includes(o.id)?` ${a.green("[RUNNING]")}`:"",s=o.parameterSize?` ${a.dim(o.parameterSize)}`:"";console.log(` ${a.dim(String(r+1)+".")} ${o.name}${s}${l}`),console.log(` ${a.dim("Size:")} ${n}`),o.family&&console.log(` ${a.dim("Family:")} ${o.family}`),o.quantization&&console.log(` ${a.dim("Quantization:")} ${o.quantization}`)})):e.running&&console.log(` ${a.dim("No models available")}`),console.log()}var k=new Ce;k.name("aix-cli").description("Run Claude Code or OpenCode with local AI models from LM Studio or Ollama").version("3.0.0").showHelpAfterError();function D(t=0){console.log(),console.log(g.dim(t===0?"\u{1F44B} Goodbye!":"\u274C Cancelled.")),process.exit(t)}process.on("SIGINT",()=>D(0));process.on("SIGTERM",()=>D(0));process.on("uncaughtException",t=>{t.message?.includes("ExitPromptError")||t.message?.includes("User force closed")||t.message?.includes("prompt")?D(0):(console.error(g.red("Error:"),t.message),process.exit(1))});process.on("unhandledRejection",t=>{let e=String(t);(e.includes("ExitPromptError")||e.includes("User force closed")||e.includes("prompt"))&&D(0)});k.command("init",{isDefault:!1}).aliases(["i","load"]).description("Select a backend, load a model, and configure your provider").option("-m, --model <name>","Model name or ID to load","").option("-p, --provider <provider>","Coding tool to use (claude or opencode)","").option("-b, --backend <backend>","Model backend to use (lmstudio or ollama)","").action(W);k.command("run",{isDefault:!1}).aliases(["r"]).description("Run Claude Code or OpenCode with a model from LM Studio or Ollama").option("-m, --model <name>","Model name or ID to use","").option("-p, --provider <provider>","Coding tool to use 
(claude or opencode)","").option("-b, --backend <backend>","Model backend to use (lmstudio or ollama)","").option("-v, --verbose","Show verbose output").argument("[args...]","Additional arguments for the provider").action(async(t,e)=>{await J({...e,args:t})});k.command("status",{isDefault:!1}).aliases(["s","stats"]).description("Show LM Studio and Ollama status and available models").action(X);k.command("doctor",{isDefault:!1}).aliases(["d","check"]).description("Check system requirements and configuration").action(async()=>{let{lmStudioService:t}=await Promise.resolve().then(()=>(L(),te)),{ollamaService:e}=await Promise.resolve().then(()=>(A(),ne)),{claudeService:o}=await Promise.resolve().then(()=>(H(),se)),{openCodeService:r}=await Promise.resolve().then(()=>(Q(),ae)),{configService:n}=await Promise.resolve().then(()=>(M(),Y));console.log(g.bold.cyan("\u{1F527} AIX CLI System Check")),console.log(g.dim("\u2500".repeat(40)));let[i,l,s,d]=await Promise.all([t.checkStatus(),e.checkStatus(),o.isClaudeCodeInstalled(),r.isOpenCodeInstalled()]),u=n.getDefaultProvider(),v=n.getDefaultBackend(),m=n.get("lmStudioPort"),x=n.get("ollamaPort");console.log(),console.log(g.bold("Backends")),console.log(` ${i?"\u2705":"\u26A0\uFE0F"} LM Studio: ${i?g.green("Running"):g.yellow("Not running")} ${g.dim(`(port ${m})`)}`),console.log(` ${l?"\u2705":"\u26A0\uFE0F"} Ollama: ${l?g.green("Running"):g.yellow("Not running")} ${g.dim(`(port ${x})`)}`),console.log(),console.log(g.bold("Coding Tools")),console.log(` ${s?"\u2705":"\u274C"} Claude Code: ${s?g.green("Installed"):g.red("Not installed")}`),console.log(` ${d?"\u2705":"\u274C"} OpenCode: ${d?g.green("Installed"):g.red("Not installed")}`),console.log(),console.log(g.bold("Defaults")),console.log(` \u{1F4CC} Backend: ${g.cyan(v??"not set")}`),console.log(` \u{1F4CC} Coding tool: ${g.cyan(u??"not set")}`);let w=[];s||w.push(` \u2192 ${g.cyan("npm install -g @anthropic-ai/claude-code")}`),d||w.push(` \u2192 ${g.cyan("npm install -g 
opencode")}`),!i&&!l&&w.push(` \u2192 Start LM Studio or run ${g.cyan("ollama serve")}`),w.length>0&&(console.log(),console.log(g.bold("\u{1F4CB} Next Steps:")),w.forEach(N=>console.log(N))),console.log()});k.parse();
  //# sourceMappingURL=aix.js.map