@iamharshil/aix-cli 2.0.4 → 3.1.0

package/README.md CHANGED
@@ -18,14 +18,17 @@
 
  ## What is AIX?
 
- AIX CLI is a bridge between [LM Studio](https://lmstudio.ai) and AI coding assistants like [Claude Code](https://docs.anthropic.com/en/docs/claude-code) and [OpenCode](https://opencode.ai). It lets you use **locally-running language models** as the backend for your favorite AI dev tools — no API keys, no cloud calls, no data leaving your machine.
+ AIX CLI is a bridge between local model servers and AI coding assistants. It connects [LM Studio](https://lmstudio.ai) or [Ollama](https://ollama.com) to [Claude Code](https://docs.anthropic.com/en/docs/claude-code) or [OpenCode](https://opencode.ai), letting you use **locally-running language models** as the backend for your favorite AI dev tools.
+
+ No API keys. No cloud calls. No data leaving your machine.
 
  ```
  ┌──────────────────────────────────────────────────┐
  │ $ aix-cli run                                    │
  │                                                  │
- │ ✔ LM Studio running on http://localhost:1234     │
- │ ✔ Model loaded: qwen2.5-coder-14b                │
+ │ ? Select model backend: Ollama                   │
+ │ ✔ Connected to Ollama                            │
+ │ ✔ Model selected: qwen2.5-coder:14b              │
  │ ✔ Launching Claude Code...                       │
  │                                                  │
  │ Your code stays local. Always.                   │
@@ -36,8 +39,9 @@ AIX CLI is a bridge between [LM Studio](https://lmstudio.ai) and AI coding assis
 
  - 🔒 **Privacy-first** — All inference runs locally on your hardware. Your code never leaves your machine.
  - 🔑 **No API keys** — No subscriptions, no usage limits, no cloud dependencies.
- - 🚀 **GPU-accelerated** — Take advantage of your local GPU for fast inference via LM Studio.
- - 🔀 **Multi-provider** — Switch between Claude Code and OpenCode with a single flag.
+ - 🚀 **GPU-accelerated** — Take advantage of your local GPU for fast inference.
+ - 🔀 **Multi-backend** — Use LM Studio or Ollama as your model server.
+ - 🛠️ **Multi-provider** — Switch between Claude Code and OpenCode with a single flag.
  - ⚡ **Zero config** — Just run `aix-cli run` and start coding.
 
  ---
@@ -49,7 +53,7 @@ AIX CLI is a bridge between [LM Studio](https://lmstudio.ai) and AI coding assis
  | Requirement | Description |
  |---|---|
  | [Node.js](https://nodejs.org/) ≥ 18 | JavaScript runtime |
- | [LM Studio](https://lmstudio.ai) | Run local AI models and expose them via API |
+ | [LM Studio](https://lmstudio.ai) **or** [Ollama](https://ollama.com) | Local model server (at least one required) |
  | [Claude Code](https://docs.anthropic.com/en/docs/claude-code) **or** [OpenCode](https://opencode.ai) | AI coding assistant (at least one required) |
 
  ### Install
@@ -90,7 +94,7 @@ npm link
  aix-cli doctor
  ```
 
- This checks that LM Studio, Claude Code / OpenCode, and your environment are properly configured.
+ This checks that LM Studio / Ollama, Claude Code / OpenCode, and your environment are properly configured.
 
  ---
 
@@ -98,35 +102,36 @@ This checks that LM Studio, Claude Code / OpenCode, and your environment are pro
 
  ### `aix-cli run` — Start a coding session
 
- The primary command. Launches Claude Code or OpenCode backed by a local LM Studio model.
+ The primary command. Launches Claude Code or OpenCode backed by a local model.
 
  ```bash
- # Interactive — picks a running model automatically
+ # Interactive — prompts for backend, model, and provider
  aix-cli run
 
- # Specify a model
- aix-cli run -m qwen2.5-coder-14b
+ # Specify backend and model
+ aix-cli run -b ollama -m qwen2.5-coder:14b
+ aix-cli run -b lmstudio -m llama-3-8b
 
  # Use OpenCode instead of Claude Code
  aix-cli run --provider opencode
 
  # Pass a prompt directly
- aix-cli run -m qwen2.5-coder-14b -- "Refactor auth middleware to use JWTs"
+ aix-cli run -b ollama -m qwen2.5-coder:14b -- "Refactor auth middleware"
  ```
 
- ### `aix-cli init` — Load a model
+ ### `aix-cli init` — Set up backend and model
 
- Loads a model into LM Studio. If no model is specified, you'll get an interactive picker.
+ Configure your preferred backend and load/select a model.
 
  ```bash
- aix-cli init                                     # Interactive model selection
- aix-cli init -m llama-3-8b                       # Load a specific model
- aix-cli init -m llama-3-8b -p opencode           # Set default provider at the same time
+ aix-cli init                                     # Interactive setup
+ aix-cli init -b ollama -m qwen2.5-coder:14b      # Ollama with specific model
+ aix-cli init -b lmstudio -m llama-3-8b -p claude # Full config in one command
  ```
 
  ### `aix-cli status` — Check what's running
 
- Shows the current LM Studio server status and which models are loaded.
+ Shows status for both LM Studio and Ollama, including available and running models.
 
  ```bash
  aix-cli status
@@ -145,16 +150,18 @@ aix-cli doctor
  | Command | Aliases | Description |
  |---|---|---|
  | `run` | `r` | Run Claude Code / OpenCode with a local model |
- | `init` | `i`, `load` | Load a model into LM Studio |
- | `status` | `s`, `stats` | Show LM Studio server status |
+ | `init` | `i`, `load` | Set up backend, select model, configure provider |
+ | `status` | `s`, `stats` | Show LM Studio & Ollama status |
  | `doctor` | `d`, `check` | Run system diagnostics |
+ | `update` | `upgrade`, `u` | Update AIX CLI to the latest version |
 
  ### Global Options
 
  | Flag | Description |
  |---|---|
+ | `-b, --backend <name>` | Model backend: `lmstudio` or `ollama` |
  | `-m, --model <name>` | Model name or ID to use |
- | `-p, --provider <name>` | Provider: `claude` (default) or `opencode` |
+ | `-p, --provider <name>` | Coding tool: `claude` (default) or `opencode` |
  | `-v, --verbose` | Show verbose output |
  | `-h, --help` | Show help |
  | `-V, --version` | Show version |
@@ -177,9 +184,12 @@ AIX stores its configuration in the OS-appropriate config directory:
  {
    "lmStudioUrl": "http://localhost",
    "lmStudioPort": 1234,
+   "ollamaUrl": "http://localhost",
+   "ollamaPort": 11434,
    "defaultTimeout": 30000,
-   "model": "lmstudio/qwen2.5-coder-14b",
-   "defaultProvider": "claude"
+   "defaultBackend": "ollama",
+   "defaultProvider": "claude",
+   "model": "qwen2.5-coder:14b"
  }
  ```
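The URL-shaped keys pair with their port keys to form each backend's base URL. Here is a minimal sketch of that derivation, mirroring the `getLMStudioUrl` / `getOllamaUrl` helpers visible in the bundled `dist/bin/aix.js` and using the defaults shown above (a simplification, not the exact packaged code):

```javascript
// Sketch: deriving backend base URLs from the AIX config defaults above.
// Mirrors the getLMStudioUrl/getOllamaUrl helpers in the bundled dist code.
const defaults = {
  lmStudioUrl: "http://localhost",
  lmStudioPort: 1234,
  ollamaUrl: "http://localhost",
  ollamaPort: 11434,
};

function baseUrl(backend, cfg = defaults) {
  return backend === "ollama"
    ? `${cfg.ollamaUrl}:${cfg.ollamaPort}`
    : `${cfg.lmStudioUrl}:${cfg.lmStudioPort}`;
}

console.log(baseUrl("ollama"));   // → http://localhost:11434
console.log(baseUrl("lmstudio")); // → http://localhost:1234
```

Editing `ollamaPort` (or `lmStudioPort`) in the config file is therefore enough to point AIX at a server on a non-default port.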
 
@@ -194,29 +204,31 @@ AIX stores its configuration in the OS-appropriate config directory:
  ## How It Works
 
  ```
-      ┌───────────────────┐
-      │     LM Studio     │
-      │  (local server)   │
-      │     port 1234     │
-      └────────┬──────────┘
-               │
-            REST API
-               │
-      ┌────────┴──────────┐
-      │      AIX CLI      │
-      │   model routing   │
-      │    config mgmt    │
-      └───┬──────────┬────┘
-          │          │
-   ┌──────┘          └──────┐
-   ▼                        ▼
- ┌──────────────┐   ┌──────────────┐
- │ Claude Code  │   │   OpenCode   │
- │  --model X   │   │  --model X   │
- └──────────────┘   └──────────────┘
+ ┌───────────────────┐    ┌───────────────────┐
+ │     LM Studio     │    │      Ollama       │
+ │    (port 1234)    │    │   (port 11434)    │
+ └────────┬──────────┘    └────────┬──────────┘
+          │                        │
+       REST API                 REST API
+          │                        │
+          └───────────┬────────────┘
+                      │
+           ┌─────────┴─────────┐
+           │      AIX CLI      │
+           │  backend routing  │
+           │  model selection  │
+           │    config mgmt    │
+           └───┬──────────┬────┘
+               │          │
+        ┌──────┘          └──────┐
+        ▼                        ▼
+ ┌──────────────┐        ┌──────────────┐
+ │ Claude Code  │        │   OpenCode   │
+ │  --model X   │        │  --model X   │
+ └──────────────┘        └──────────────┘
  ```
 
- 1. **LM Studio** runs a local inference server exposing an OpenAI-compatible API.
+ 1. **LM Studio** or **Ollama** runs a local inference server with an OpenAI-compatible API.
  2. **AIX CLI** discovers available models, manages configuration, and orchestrates the connection.
  3. **Claude Code** or **OpenCode** receives the model endpoint and runs as it normally would — except fully local.
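Step 2's model discovery can be pictured with a small sketch. The field names below are inferred from how the bundled `dist` code reads Ollama's `/api/tags` response; treat them as an illustrative assumption, not a documented contract:

```javascript
// Illustrative: flatten an Ollama /api/tags payload into the list shape
// AIX displays. Field names inferred from the bundled dist code.
function normalizeOllamaTags(payload) {
  return (payload.models ?? [])
    .map((m) => ({
      id: String(m.name ?? m.model ?? ""),
      name: String(m.name ?? m.model ?? ""),
      size: Number(m.size ?? 0),
      quantization: String(m.details?.quantization_level ?? ""),
    }))
    .filter((m) => m.id);
}

// Example payload shaped like an /api/tags response:
const sample = {
  models: [
    { name: "qwen2.5-coder:14b", size: 8988124069,
      details: { quantization_level: "Q4_K_M" } },
  ],
};
console.log(normalizeOllamaTags(sample)[0].id); // → qwen2.5-coder:14b
```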
 
@@ -234,25 +246,35 @@ AIX stores its configuration in the OS-appropriate config directory:
 
  </details>
 
+ <details>
+ <summary><strong>Ollama not running</strong></summary>
+
+ 1. Install Ollama from [ollama.com](https://ollama.com)
+ 2. Start the server: `ollama serve`
+ 3. Pull a model: `ollama pull qwen2.5-coder:14b`
+ 4. Confirm with `aix-cli status`
+
+ </details>
+
  <details>
  <summary><strong>No models found</strong></summary>
 
- 1. Open LM Studio → **Search** tab
- 2. Download a model (e.g., Qwen 2.5 Coder, Llama 3, Mistral)
- 3. Wait for the download to complete
- 4. Run `aix-cli init` to load it
+ **LM Studio:** Open LM Studio → **Search** tab → download a model.
+
+ **Ollama:** Run `ollama pull <model>` to download a model (e.g., `ollama pull llama3.2`).
+
+ Then run `aix-cli init` to select and configure.
 
  </details>
 
  <details>
- <summary><strong>Connection refused on port 1234</strong></summary>
+ <summary><strong>Connection refused</strong></summary>
 
- Check the LM Studio server tab for the actual port it's running on. If it differs from `1234`, update your config:
+ Check that the correct port is being used:
+ - LM Studio defaults to port `1234`
+ - Ollama defaults to port `11434`
 
- ```bash
- # The config file path is shown by `aix-cli doctor`
- # Edit the config file and set "lmStudioPort" to the correct port
- ```
+ You can configure custom ports in your AIX config file (path shown by `aix-cli doctor`).
 
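For instance, a hypothetical override for an Ollama instance listening on a non-default port would use the same keys as the Configuration section (the port value here is purely illustrative):

```json
{
  "ollamaUrl": "http://localhost",
  "ollamaPort": 11435
}
```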
  </details>
 
@@ -279,7 +301,7 @@ Then re-run `aix-cli doctor` to confirm.
 
  AIX is designed around a simple principle: **your code never leaves your machine.**
 
- - ✅ All AI inference runs locally via LM Studio
+ - ✅ All AI inference runs locally via LM Studio or Ollama
  - ✅ No telemetry, analytics, or tracking of any kind
  - ✅ No outbound network calls (except to `localhost`)
  - ✅ No API keys or accounts required
@@ -307,6 +329,7 @@ npm run lint # Lint
  ## Related Projects
 
  - [LM Studio](https://lmstudio.ai) — Run local AI models with a visual interface
+ - [Ollama](https://ollama.com) — Run large language models locally
  - [Claude Code](https://docs.anthropic.com/en/docs/claude-code) — Anthropic's AI coding assistant
  - [OpenCode](https://opencode.ai) — Open-source AI coding assistant
 
package/dist/bin/aix.js CHANGED
@@ -1,10 +1,11 @@
  #!/usr/bin/env node
- var ne=Object.defineProperty;var ie=(t=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(t,{get:(e,o)=>(typeof require<"u"?require:e)[o]}):t)(function(t){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+t+'" is not supported')});var O=(t,e)=>()=>(t&&(e=t(t=0)),e);var A=(t,e)=>{for(var o in e)ne(t,o,{get:e[o],enumerable:!0})};var W={};A(W,{ConfigService:()=>E,configService:()=>c});import se from"conf";var E,c,M=O(()=>{"use strict";E=class{store;constructor(){this.store=new se({projectName:"aix",defaults:{lmStudioUrl:"http://localhost",lmStudioPort:1234,defaultTimeout:3e4,autoStartServer:!1},clearInvalidConfig:!0})}get(e){return this.store.get(e)}set(e,o){this.store.set(e,o)}setModel(e){this.store.set("model",e)}getLastUsedModel(){return this.store.get("model")}setDefaultProvider(e){this.store.set("defaultProvider",e)}getDefaultProvider(){return this.store.get("defaultProvider")}getLMStudioUrl(){let e=this.store.get("lmStudioUrl"),o=this.store.get("lmStudioPort");return`${e}:${o}`}reset(){this.store.clear()}},c=new E});var Q={};A(Q,{LMStudioService:()=>I,lmStudioService:()=>g});import{execa as k}from"execa";import H from"ora";import V from"chalk";var J,I,g,C=O(()=>{"use strict";M();J=[1234,1235,1236,1237],I=class{baseUrl;constructor(){this.baseUrl=c.getLMStudioUrl()}getApiUrl(e){return`${this.baseUrl}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){let e=["/api/v1/models","/api/models","/v1/models","/api/ls-model/list"];for(let o of e)try{let n=await fetch(this.getApiUrl(o),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!n.ok)continue;let r=await n.json(),i=[];return Array.isArray(r)?i=r:r.models&&Array.isArray(r.models)?i=r.models:r.data&&Array.isArray(r.data)&&(i=r.data),i.map(a=>{let 
s=a;return{id:String(s.key||s.id||s.model||""),name:String(s.display_name||s.name||s.id||s.model||""),size:Number(s.size_bytes||s.size||s.file_size||0),quantization:String(s.quantization?typeof s.quantization=="object"?s.quantization.name:s.quantization:"")}}).filter(a=>a.id&&a.name)}catch{continue}return[]}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("lmStudioPort"),models:[]};try{let o=await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!o.ok)return{running:!1,port:c.get("lmStudioPort"),models:[]};let n=await o.json();return{running:!0,port:c.get("lmStudioPort"),models:n.models??[],activeModel:n.active_model}}catch{return{running:!1,port:c.get("lmStudioPort"),models:[]}}}async loadModel(e,o){let n=o??H({text:`Loading model: ${V.cyan(e)}`,color:"cyan"}).start();try{let r=await fetch(this.getApiUrl("/api/model/load"),{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({model:e}),signal:AbortSignal.timeout(3e5)});if(!r.ok)throw new Error(`Failed to load model: ${r.statusText}`);return n.succeed(`Model ${V.green(e)} loaded successfully`),c.setModel(e),{loadSpinner:n}}catch(r){throw n.fail(`Failed to load model: ${r instanceof Error?r.message:"Unknown error"}`),r}}async startServer(e){let o=e??H({text:"Starting LM Studio server...",color:"cyan"}).start();try{let n=process.platform==="darwin",r=process.platform==="linux",i=process.platform==="win32",a;if(n){let s=["/Applications/LM Studio.app",`${process.env.HOME}/Applications/LM Studio.app`];for(let l of s)try{let{existsSync:p}=await import("fs");if(p(l)){a=`open "${l}" --args --server`;break}}catch{}if(a?.startsWith("open")){await k("open",[s.find(l=>{try{let{existsSync:p}=ie("fs");return p(l)}catch{return!1}})||"/Applications/LM Studio.app","--args","--server"],{detached:!0,stdio:"ignore"}),o.succeed("LM Studio server started"),await this.waitForServer(6e4);return}}else r?a=await this.findLinuxBinary():i&&(a=await 
this.findWindowsExecutable());if(!a)throw o.fail("LM Studio not found. Please install it from https://lmstudio.ai"),new Error("LM Studio not installed");await k(a,["--server"],{detached:!0,stdio:"ignore",env:{...process.env,LM_STUDIO_SERVER_PORT:String(c.get("lmStudioPort"))}}),o.succeed("LM Studio server started"),await this.waitForServer(6e4)}catch(n){throw o.fail(`Failed to start LM Studio: ${n instanceof Error?n.message:"Unknown error"}`),n}}async findLinuxBinary(){let e=["/usr/bin/lm-studio","/usr/local/bin/lm-studio",`${process.env.HOME}/.local/bin/lm-studio`];for(let o of e)try{return await k("test",["-x",o]),o}catch{continue}}async findWindowsExecutable(){let e=process.env.LOCALAPPDATA,o=process.env.PROGRAMFILES,n=[e?`${e}\\Programs\\LM Studio\\lm-studio.exe`:"",o?`${o}\\LM Studio\\lm-studio.exe`:""].filter(Boolean);for(let r of n)try{return await k("cmd",["/c","if exist",`"${r}"`,"echo","yes"]),r}catch{continue}}async waitForServer(e=6e4){let o=Date.now();for(;Date.now()-o<e;){if(await this.checkStatus())return!0;await this.sleep(2e3)}return!1}sleep(e){return new Promise(o=>setTimeout(o,e))}async findAvailablePort(){for(let e of J)try{if((await fetch(`http://localhost:${e}/api/status`,{method:"GET",signal:AbortSignal.timeout(1e3)})).ok)return e}catch{return c.set("lmStudioPort",e),e}return J[0]??1234}},g=new I});var oe={};A(oe,{ClaudeService:()=>U,claudeService:()=>P});import{execa as _}from"execa";import le from"chalk";var U,P,F=O(()=>{"use strict";U=class{async isClaudeCodeInstalled(){try{return await _("claude",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:n=[],verbose:r=!1}=e,i=this.extractProvider(o),a=this.extractModelName(o);if(!i||!a)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=`${i}/${a}`,l=["--model",s,...n];r&&console.log(le.dim(`
- Running: claude ${l.join(" ")}
- `));try{await _("claude",l,{stdio:"inherit",env:{...process.env,ANTHROPIC_MODEL:s}})}catch(p){if(p instanceof Error&&"exitCode"in p){let L=p.exitCode;process.exit(L??1)}throw p}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await _("claude",["--version"])).stdout}catch{return}}},P=new U});var te={};A(te,{OpenCodeService:()=>z,openCodeService:()=>$});import{execa as q}from"execa";import de from"chalk";var z,$,B=O(()=>{"use strict";z=class{async isOpenCodeInstalled(){try{return await q("opencode",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:n=[],verbose:r=!1}=e,i=this.extractProvider(o),a=this.extractModelName(o);if(!i||!a)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=["--model",o,...n];r&&console.log(de.dim(`
+ var we=Object.defineProperty;var ye=(t=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(t,{get:(e,o)=>(typeof require<"u"?require:e)[o]}):t)(function(t){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+t+'" is not supported')});var C=(t,e)=>()=>(t&&(e=t(t=0)),e);var O=(t,e)=>{for(var o in e)we(t,o,{get:e[o],enumerable:!0})};var ne={};O(ne,{ConfigService:()=>N,configService:()=>c});import Me from"conf";var N,c,$=C(()=>{"use strict";N=class{store;constructor(){this.store=new Me({projectName:"aix",defaults:{lmStudioUrl:"http://localhost",lmStudioPort:1234,ollamaUrl:"http://localhost",ollamaPort:11434,defaultTimeout:3e4,autoStartServer:!1},clearInvalidConfig:!0})}get(e){return this.store.get(e)}set(e,o){this.store.set(e,o)}setModel(e){this.store.set("model",e)}getLastUsedModel(){return this.store.get("model")}setDefaultProvider(e){this.store.set("defaultProvider",e)}getDefaultProvider(){return this.store.get("defaultProvider")}setDefaultBackend(e){this.store.set("defaultBackend",e)}getDefaultBackend(){return this.store.get("defaultBackend")}getLMStudioUrl(){let e=this.store.get("lmStudioUrl"),o=this.store.get("lmStudioPort");return`${e}:${o}`}getOllamaUrl(){let e=this.store.get("ollamaUrl"),o=this.store.get("ollamaPort");return`${e}:${o}`}reset(){this.store.clear()}},c=new N});var ae={};O(ae,{LMStudioService:()=>j,lmStudioService:()=>h});import{execa as z}from"execa";import re from"ora";import ie from"chalk";var se,j,h,L=C(()=>{"use strict";$();se=[1234,1235,1236,1237],j=class{baseUrl;constructor(){this.baseUrl=c.getLMStudioUrl()}getApiUrl(e){return`${this.baseUrl}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){let e=["/api/v1/models","/api/models","/v1/models","/api/ls-model/list"];for(let o of e)try{let i=await 
fetch(this.getApiUrl(o),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!i.ok)continue;let n=await i.json(),r=[];return Array.isArray(n)?r=n:n.models&&Array.isArray(n.models)?r=n.models:n.data&&Array.isArray(n.data)&&(r=n.data),r.map(l=>{let s=l;return{id:String(s.key||s.id||s.model||""),name:String(s.display_name||s.name||s.id||s.model||""),size:Number(s.size_bytes||s.size||s.file_size||0),quantization:String(s.quantization?typeof s.quantization=="object"?s.quantization.name:s.quantization:"")}}).filter(l=>l.id&&l.name)}catch{continue}return[]}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("lmStudioPort"),models:[]};try{let o=await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!o.ok)return{running:!1,port:c.get("lmStudioPort"),models:[]};let i=await o.json();return{running:!0,port:c.get("lmStudioPort"),models:i.models??[],activeModel:i.active_model}}catch{return{running:!1,port:c.get("lmStudioPort"),models:[]}}}async loadModel(e,o){let i=o??re({text:`Loading model: ${ie.cyan(e)}`,color:"cyan"}).start();try{let n=await fetch(this.getApiUrl("/api/model/load"),{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({model:e}),signal:AbortSignal.timeout(3e5)});if(!n.ok)throw new Error(`Failed to load model: ${n.statusText}`);return i.succeed(`Model ${ie.green(e)} loaded successfully`),c.setModel(e),{loadSpinner:i}}catch(n){throw i.fail(`Failed to load model: ${n instanceof Error?n.message:"Unknown error"}`),n}}async startServer(e){let o=e??re({text:"Starting LM Studio server...",color:"cyan"}).start();try{let i=process.platform==="darwin",n=process.platform==="linux",r=process.platform==="win32",l;if(i){let s=["/Applications/LM Studio.app",`${process.env.HOME}/Applications/LM Studio.app`];for(let d of s)try{let{existsSync:u}=await import("fs");if(u(d)){l=`open "${d}" --args --server`;break}}catch{}if(l?.startsWith("open")){await 
z("open",[s.find(d=>{try{let{existsSync:u}=ye("fs");return u(d)}catch{return!1}})||"/Applications/LM Studio.app","--args","--server"],{detached:!0,stdio:"ignore"}),o.succeed("LM Studio server started"),await this.waitForServer(6e4);return}}else n?l=await this.findLinuxBinary():r&&(l=await this.findWindowsExecutable());if(!l)throw o.fail("LM Studio not found. Please install it from https://lmstudio.ai"),new Error("LM Studio not installed");await z(l,["--server"],{detached:!0,stdio:"ignore",env:{...process.env,LM_STUDIO_SERVER_PORT:String(c.get("lmStudioPort"))}}),o.succeed("LM Studio server started"),await this.waitForServer(6e4)}catch(i){throw o.fail(`Failed to start LM Studio: ${i instanceof Error?i.message:"Unknown error"}`),i}}async findLinuxBinary(){let e=["/usr/bin/lm-studio","/usr/local/bin/lm-studio",`${process.env.HOME}/.local/bin/lm-studio`];for(let o of e)try{return await z("test",["-x",o]),o}catch{continue}}async findWindowsExecutable(){let e=process.env.LOCALAPPDATA,o=process.env.PROGRAMFILES,i=[e?`${e}\\Programs\\LM Studio\\lm-studio.exe`:"",o?`${o}\\LM Studio\\lm-studio.exe`:""].filter(Boolean);for(let n of i)try{return await z("cmd",["/c","if exist",`"${n}"`,"echo","yes"]),n}catch{continue}}async waitForServer(e=6e4){let o=Date.now();for(;Date.now()-o<e;){if(await this.checkStatus())return!0;await this.sleep(2e3)}return!1}sleep(e){return new Promise(o=>setTimeout(o,e))}async findAvailablePort(){for(let e of se)try{if((await fetch(`http://localhost:${e}/api/status`,{method:"GET",signal:AbortSignal.timeout(1e3)})).ok)return e}catch{return c.set("lmStudioPort",e),e}return se[0]??1234}},h=new j});var le={};O(le,{OllamaService:()=>T,ollamaService:()=>S});var T,S,A=C(()=>{"use strict";$();T=class{getBaseUrl(){return c.getOllamaUrl()}getApiUrl(e){return`${this.getBaseUrl()}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/tags"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){try{let e=await 
fetch(this.getApiUrl("/api/tags"),{method:"GET",signal:AbortSignal.timeout(1e4)});return e.ok?((await e.json()).models??[]).map(n=>{let r=n.details??{};return{id:String(n.name??n.model??""),name:String(n.name??n.model??""),size:Number(n.size??0),quantization:String(r.quantization_level??""),family:String(r.family??""),parameterSize:String(r.parameter_size??"")}}).filter(n=>n.id&&n.name):[]}catch{return[]}}async getRunningModels(){try{let e=await fetch(this.getApiUrl("/api/ps"),{method:"GET",signal:AbortSignal.timeout(5e3)});return e.ok?((await e.json()).models??[]).map(n=>String(n.name??n.model??"")).filter(Boolean):[]}catch{return[]}}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("ollamaPort"),models:[],runningModels:[]};let[o,i]=await Promise.all([this.getAvailableModels(),this.getRunningModels()]);return{running:!0,port:c.get("ollamaPort"),models:o,runningModels:i}}},S=new T});var ue={};O(ue,{ClaudeService:()=>G,claudeService:()=>R});import{execa as J}from"execa";import xe from"chalk";var G,R,Q=C(()=>{"use strict";G=class{async isClaudeCodeInstalled(){try{return await J("claude",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:i=[],verbose:n=!1}=e,r=this.extractProvider(o),l=this.extractModelName(o);if(!r||!l)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=`${r}/${l}`,d=["--model",s,...i];n&&console.log(xe.dim(`
+ Running: claude ${d.join(" ")}
+ `));try{await J("claude",d,{stdio:"inherit",env:{...process.env,ANTHROPIC_MODEL:s}})}catch(u){if(u instanceof Error&&"exitCode"in u){let v=u.exitCode;process.exit(v??1)}throw u}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await J("claude",["--version"])).stdout}catch{return}}},R=new G});var me={};O(me,{OpenCodeService:()=>W,openCodeService:()=>E});import{execa as X}from"execa";import Ce from"chalk";var W,E,Y=C(()=>{"use strict";W=class{async isOpenCodeInstalled(){try{return await X("opencode",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:i=[],verbose:n=!1}=e,r=this.extractProvider(o),l=this.extractModelName(o);if(!r||!l)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=["--model",o,...i];n&&console.log(Ce.dim(`
  Running: opencode ${s.join(" ")}
- `));try{await q("opencode",s,{stdio:"inherit",env:{...process.env,OPENCODE_MODEL_NAME:a,OPENCODE_MODEL_PROVIDER:i}})}catch(l){if(l instanceof Error&&"exitCode"in l){let p=l.exitCode;process.exit(p??1)}throw l}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await q("opencode",["--version"])).stdout}catch{return}}},$=new z});import{Command as me}from"commander";import u from"chalk";C();M();import Z from"ora";import h from"chalk";import ee from"inquirer";import X from"inquirer";async function D(t,e){let o=t.map(i=>({name:`${i.name} (${i.id})`,value:i,short:i.name})),n=e?o.findIndex(i=>i.value.id===e):0;return(await X.prompt([{type:"list",name:"model",message:"Select a model to load:",choices:o,default:Math.max(0,n),pageSize:Math.min(t.length,15)}])).model}async function N(t,e=!0){return(await X.prompt([{type:"confirm",name:"confirm",message:t,default:e}])).confirm}import Y from"chalk";function R(t){if(t===0)return"0 B";let e=1024,o=["B","KB","MB","GB","TB"],n=Math.floor(Math.log(t)/Math.log(e));return`${parseFloat((t/Math.pow(e,n)).toFixed(2))} ${o[n]}`}function x(t){console.log(Y.green("\u2713")+" "+t)}function ae(t){console.error(Y.red("\u2717")+" "+t)}function v(t,e=1){ae(t),process.exit(e)}async function j(t={}){let e;if(t.provider)e=t.provider;else{let f=c.getDefaultProvider(),{providerSelection:w}=await ee.prompt([{type:"list",name:"providerSelection",message:"Select provider:",default:f,choices:[{name:"Claude Code",value:"claude"},{name:"OpenCode",value:"opencode"}]}]);e=w;let{saveDefault:m}=await ee.prompt([{type:"confirm",name:"saveDefault",message:"Save as default provider?",default:!1}]);m&&(c.setDefaultProvider(e),x(`Default provider set to ${h.cyan(e)}`))}let o=Z({text:"Checking LM Studio status...",color:"cyan"}).start(),n=await g.checkStatus();n||(o.info("LM Studio server not running"),o.stop(),await N("Would you like to start the LM Studio 
server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await g.startServer(),n=!0),o.succeed("Connected to LM Studio");let r=Z({text:"Fetching available models...",color:"cyan"}).start(),i=await g.getAvailableModels();i.length===0&&(r.fail("No models found. Download some models in LM Studio first."),v("No models available")),r.succeed(`Found ${h.bold(i.length)} model${i.length===1?"":"s"}`),console.log(),console.log(h.bold("Available Models:")),console.log(h.dim("\u2500".repeat(process.stdout.columns||80))),i.forEach((f,w)=>{let m=R(f.size),S=f.loaded?h.green(" [LOADED]"):"";console.log(` ${h.dim(String(w+1).padStart(2))}. ${f.name} ${h.dim(`(${m})`)}${S}`)}),console.log();let a=c.getLastUsedModel(),s=t.model,l=s?i.find(f=>f.id===s||f.name.includes(s)):await D(i,a);l||v("No model selected"),await g.loadModel(l.id,o);let p=l.id.replace("/","--"),L=e==="opencode"?"OpenCode":"Claude Code";x(h.bold(`
- Model ready: ${l.name}`)),console.log(),console.log(`Run ${L} with this model:`),console.log(` ${h.cyan((e==="opencode"?"opencode":"claude")+" --model lmstudio/"+p)}`),console.log(),console.log(`Or use ${h.cyan("aix-cli run")} to start an interactive session`)}C();F();B();M();import T from"ora";import re from"chalk";import ce from"inquirer";async function ue(){let t=await P.isClaudeCodeInstalled(),e=await $.isOpenCodeInstalled(),o=[];if(t&&o.push({name:"Claude Code",value:"claude"}),e&&o.push({name:"OpenCode",value:"opencode"}),o.length===0&&v("Neither Claude Code nor OpenCode is installed."),o.length===1)return o[0].value;let{providerSelection:n}=await ce.prompt([{type:"list",name:"providerSelection",message:"Select coding tool:",choices:o}]);return n}async function G(t={}){let e;if(t.provider)e=t.provider;else{let m=c.getDefaultProvider();m?e=m:e=await ue()}let o=T({text:`Checking ${e==="opencode"?"OpenCode":"Claude Code"} installation...`,color:"cyan"}).start();(e==="opencode"?await $.isOpenCodeInstalled():await P.isClaudeCodeInstalled())||(o.fail(`${e==="opencode"?"OpenCode":"Claude Code"} is not installed.`),v(`Please install ${e==="opencode"?"OpenCode":"Claude Code"} first.`)),o.succeed(`${e==="opencode"?"OpenCode":"Claude Code"} is installed`);let r=T({text:"Checking LM Studio status...",color:"cyan"}).start(),i=await g.checkStatus();i||(r.info("LM Studio server not running"),r.stop(),await N("Would you like to start the LM Studio server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await g.startServer(),i=!0),r.succeed("Connected to LM Studio");let a=T({text:"Fetching available models...",color:"cyan"}).start(),s=await g.getAvailableModels();s.length===0&&(a.fail("No models found. Download some models in LM Studio first."),v("No models available")),a.stop();let l;if(t.model){let m=s.find(S=>S.id===t.model||S.name.toLowerCase().includes(t.model.toLowerCase()));m||v(`Model "${t.model}" not found. 
Available models: ${s.map(S=>S.name).join(", ")}`),l=m.id}else{let m=c.getLastUsedModel();l=(await D(s,m)).id}let p=T({text:`Loading model: ${re.cyan(l)}`,color:"cyan"}).start();await g.loadModel(l,p);let f=`lmstudio/${l.replace("/","--")}`,w=e==="opencode"?"OpenCode":"Claude Code";x(re.green(`
- Starting ${w} with model: ${f}
- `));try{e==="opencode"?await $.run({model:f,args:t.args??[],verbose:t.verbose}):await P.run({model:f,args:t.args??[],verbose:t.verbose})}catch(m){v(`Failed to run ${w}: ${m instanceof Error?m.message:"Unknown error"}`)}}C();import d from"chalk";async function K(){let t=await g.getStatus();console.log(),console.log(d.bold("LM Studio Status")),console.log(d.dim("\u2500".repeat(50))),console.log(` ${t.running?d.green("\u25CF"):d.red("\u25CB")} Server: ${t.running?d.green("Running"):d.red("Stopped")}`),console.log(` ${d.dim("\u25B8")} Port: ${d.cyan(String(t.port))}`),console.log(` ${d.dim("\u25B8")} URL: ${d.cyan(`http://localhost:${t.port}`)}`),t.activeModel?console.log(` ${d.dim("\u25B8")} Active Model: ${d.green(t.activeModel)}`):console.log(` ${d.dim("\u25B8")} Active Model: ${d.dim("None")}`),console.log(),console.log(d.bold("Models")),console.log(d.dim("\u2500".repeat(50))),t.models.length===0?console.log(` ${d.dim("No models available")}`):t.models.forEach((e,o)=>{let n=R(e.size),r=e.id===t.activeModel?` ${d.green("[LOADED]")}`:"";console.log(` ${d.dim(String(o+1)+".")} ${e.name}${r}`),console.log(` ${d.dim("ID:")} ${e.id}`),console.log(` ${d.dim("Size:")} ${n}`),e.quantization&&console.log(` ${d.dim("Quantization:")} ${e.quantization}`),console.log()})}var y=new me;y.name("aix-cli").description("AI CLI tool that integrates LM Studio with Claude Code or OpenCode for local AI-powered development").version("2.0.0").showHelpAfterError();function b(t=0){console.log(),console.log(u.dim(t===0?"\u{1F44B} Goodbye!":"\u274C Cancelled.")),process.exit(t)}process.on("SIGINT",()=>b(0));process.on("SIGTERM",()=>b(0));process.on("uncaughtException",t=>{t.message?.includes("ExitPromptError")||t.message?.includes("User force closed")||t.message?.includes("prompt")?b(0):(console.error(u.red("Error:"),t.message),process.exit(1))});process.on("unhandledRejection",t=>{let e=String(t);(e.includes("ExitPromptError")||e.includes("User force closed")||e.includes("prompt"))&&b(0)});y.command("init",{isDefault:!1}).aliases(["i","load"]).description("Initialize and load a model into LM Studio").option("-m, --model <name>","Model name or ID to load","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").action(j);y.command("run",{isDefault:!1}).aliases(["r"]).description("Run Claude Code or OpenCode with a model from LM Studio").option("-m, --model <name>","Model name or ID to use","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").option("-v, --verbose","Show verbose output").argument("[args...]","Additional arguments for the provider").action(async(t,e)=>{await G({...e,args:t})});y.command("status",{isDefault:!1}).aliases(["s","stats"]).description("Show LM Studio status and available models").action(K);y.command("doctor",{isDefault:!1}).aliases(["d","check"]).description("Check system requirements and configuration").action(async()=>{let{lmStudioService:t}=await Promise.resolve().then(()=>(C(),Q)),{claudeService:e}=await Promise.resolve().then(()=>(F(),oe)),{openCodeService:o}=await Promise.resolve().then(()=>(B(),te)),{configService:n}=await Promise.resolve().then(()=>(M(),W));console.log(u.bold.cyan("\u{1F527} AIX CLI System Check")),console.log(u.dim("\u2500".repeat(40)));let r=await t.checkStatus(),i=await e.isClaudeCodeInstalled(),a=await o.isOpenCodeInstalled(),s=n.getDefaultProvider(),l=n.get("lmStudioPort");console.log(),console.log(`${r?"\u2705":"\u26A0\uFE0F"} LM Studio: ${r?u.green("Running"):u.yellow("Not running")}`),console.log(`${i?"\u2705":"\u274C"} Claude Code: ${i?u.green("Installed"):u.red("Not installed")}`),console.log(`${a?"\u2705":"\u274C"} OpenCode: ${a?u.green("Installed"):u.red("Not installed")}`),console.log(`\u{1F310} Server: ${u.cyan(`http://localhost:${l}`)}`),console.log(`\u{1F4CC} Default provider: ${u.cyan(s)}`),(!i||!a||!r)&&(console.log(),console.log(u.bold("\u{1F4CB} Next Steps:")),i||console.log(` 1. ${u.cyan("npm install -g @anthropic-ai/claude-code")}`),a||console.log(` 2. ${u.cyan("npm install -g opencode")}`),r||console.log(" 3. Open LM Studio and start the server")),console.log(),console.log(u.dim("\u{1F4D6} Docs: ")+u.cyan("https://lmstudio.ai"))});y.parse();
6
+ `));try{await X("opencode",s,{stdio:"inherit",env:{...process.env,OPENCODE_MODEL_NAME:l,OPENCODE_MODEL_PROVIDER:r}})}catch(d){if(d instanceof Error&&"exitCode"in d){let u=d.exitCode;process.exit(u??1)}throw d}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await X("opencode",["--version"])).stdout}catch{return}}},E=new W});import{Command as Be}from"commander";import p from"chalk";L();A();$();import _ from"ora";import g from"chalk";import q from"inquirer";import de from"inquirer";async function b(t,e){let o=t.map(r=>({name:`${r.name} (${r.id})`,value:r,short:r.name})),i=e?o.findIndex(r=>r.value.id===e):0;return(await de.prompt([{type:"list",name:"model",message:"Select a model to load:",choices:o,default:Math.max(0,i),pageSize:Math.min(t.length,15)}])).model}async function F(t,e=!0){return(await de.prompt([{type:"confirm",name:"confirm",message:t,default:e}])).confirm}import K from"chalk";function P(t){if(t===0)return"0 B";let e=1024,o=["B","KB","MB","GB","TB"],i=Math.floor(Math.log(t)/Math.log(e));return`${parseFloat((t/Math.pow(e,i)).toFixed(2))} ${o[i]}`}function y(t){console.log(K.green("\u2713")+" "+t)}function V(t){console.error(K.red("\u2717")+" "+t)}function ce(t){console.log(K.blue("\u2139")+" "+t)}function f(t,e=1){V(t),process.exit(e)}async function $e(){let t=c.getDefaultBackend(),{backendSelection:e}=await q.prompt([{type:"list",name:"backendSelection",message:"Select model backend:",default:t??"lmstudio",choices:[{name:"\u{1F5A5}\uFE0F LM Studio",value:"lmstudio"},{name:"\u{1F999} Ollama",value:"ollama"}]}]),{saveDefault:o}=await q.prompt([{type:"confirm",name:"saveDefault",message:"Save as default backend?",default:!1}]);return o&&(c.setDefaultBackend(e),y(`Default backend set to ${g.cyan(e)}`)),e}async function be(){let t=c.getDefaultProvider(),{providerSelection:e}=await q.prompt([{type:"list",name:"providerSelection",message:"Select coding tool:",default:t??"claude",choices:[{name:"Claude Code",value:"claude"},{name:"OpenCode",value:"opencode"}]}]),{saveDefault:o}=await q.prompt([{type:"confirm",name:"saveDefault",message:"Save as default coding tool?",default:!1}]);return o&&(c.setDefaultProvider(e),y(`Default coding tool set to ${g.cyan(e)}`)),e}async function Pe(t,e){let o=_({text:"Checking LM Studio status...",color:"cyan"}).start(),i=await h.checkStatus();i||(o.info("LM Studio server not running"),o.stop(),await F("Would you like to start the LM Studio server?")||f("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await h.startServer(),i=!0),o.succeed("Connected to LM Studio");let n=_({text:"Fetching available models...",color:"cyan"}).start(),r=await h.getAvailableModels();r.length===0&&(n.fail("No models found. Download some models in LM Studio first."),f("No models available")),n.succeed(`Found ${g.bold(r.length)} model${r.length===1?"":"s"}`),console.log(),console.log(g.bold("Available Models:")),console.log(g.dim("\u2500".repeat(process.stdout.columns||80))),r.forEach((m,x)=>{let w=P(m.size),I=m.loaded?g.green(" [LOADED]"):"";console.log(` ${g.dim(String(x+1).padStart(2))}. ${m.name} ${g.dim(`(${w})`)}${I}`)}),console.log();let l=c.getLastUsedModel(),s=t.model,d=s?r.find(m=>m.id===s||m.name.includes(s)):await b(r,l);d||f("No model selected"),await h.loadModel(d.id,o);let u=d.id.replace("/","--"),v=e==="opencode"?"OpenCode":"Claude Code";y(g.bold(`
7
+ Model ready: ${d.name}`)),console.log(),console.log(`Run ${v} with this model:`),console.log(` ${g.cyan((e==="opencode"?"opencode":"claude")+" --model lmstudio/"+u)}`),console.log(),console.log(`Or use ${g.cyan("aix-cli run")} to start an interactive session`)}async function ke(t,e){let o=_({text:"Checking Ollama status...",color:"cyan"}).start();await S.checkStatus()||(o.fail("Ollama is not running"),f("Ollama must be running. Start it with: ollama serve")),o.succeed("Connected to Ollama");let n=_({text:"Fetching available models...",color:"cyan"}).start(),r=await S.getAvailableModels();r.length===0&&(n.fail("No models found. Pull a model first: ollama pull <model>"),f("No models available")),n.succeed(`Found ${g.bold(r.length)} model${r.length===1?"":"s"}`);let l=await S.getRunningModels();console.log(),console.log(g.bold("Available Models:")),console.log(g.dim("\u2500".repeat(process.stdout.columns||80))),r.forEach((m,x)=>{let w=P(m.size),he=l.includes(m.id)?g.green(" [RUNNING]"):"",Se=m.parameterSize?g.dim(` ${m.parameterSize}`):"";console.log(` ${g.dim(String(x+1).padStart(2))}. ${m.name}${Se} ${g.dim(`(${w})`)}${he}`)}),console.log();let s=c.getLastUsedModel(),d=t.model,u=d?r.find(m=>m.id===d||m.name.includes(d)):await b(r,s);u||f("No model selected"),c.setModel(u.id);let v=e==="opencode"?"OpenCode":"Claude Code";y(g.bold(`
8
+ Model selected: ${u.name}`)),console.log(),console.log(`Run ${v} with this model:`),console.log(` ${g.cyan((e==="opencode"?"opencode":"claude")+" --model ollama/"+u.id)}`),console.log(),console.log(`Or use ${g.cyan("aix-cli run")} to start an interactive session`)}async function H(t={}){let e=t.backend??await $e(),o=t.provider??await be();e==="ollama"?await ke(t,o):await Pe(t,o)}L();A();Q();Y();$();import k from"ora";import pe from"chalk";import ge from"inquirer";async function Oe(){let t=c.getDefaultBackend();if(t)return t;let{backendSelection:e}=await ge.prompt([{type:"list",name:"backendSelection",message:"Select model backend:",choices:[{name:"\u{1F5A5}\uFE0F LM Studio",value:"lmstudio"},{name:"\u{1F999} Ollama",value:"ollama"}]}]);return e}async function Le(){let t=await R.isClaudeCodeInstalled(),e=await E.isOpenCodeInstalled(),o=[];if(t&&o.push({name:"Claude Code",value:"claude"}),e&&o.push({name:"OpenCode",value:"opencode"}),o.length===0&&f("Neither Claude Code nor OpenCode is installed."),o.length===1)return o[0].value;let{providerSelection:i}=await ge.prompt([{type:"list",name:"providerSelection",message:"Select coding tool:",choices:o}]);return i}function U(t){return t==="opencode"?"OpenCode":"Claude Code"}async function Ae(t,e){let o=k({text:"Checking LM Studio status...",color:"cyan"}).start(),i=await h.checkStatus();i||(o.info("LM Studio server not running"),o.stop(),await F("Would you like to start the LM Studio server?")||f("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await h.startServer(),i=!0),o.succeed("Connected to LM Studio");let n=k({text:"Fetching available models...",color:"cyan"}).start(),r=await h.getAvailableModels();r.length===0&&(n.fail("No models found. Download some models in LM Studio first."),f("No models available")),n.stop();let l;if(t.model){let v=r.find(m=>m.id===t.model||m.name.toLowerCase().includes(t.model.toLowerCase()));v||f(`Model "${t.model}" not found. Available models: ${r.map(m=>m.name).join(", ")}`),l=v.id}else{let v=c.getLastUsedModel();l=(await b(r,v)).id}let s=k({text:`Loading model: ${pe.cyan(l)}`,color:"cyan"}).start();await h.loadModel(l,s);let u=`lmstudio/${l.replace("/","--")}`;await fe(e,u,t)}async function Re(t,e){let o=k({text:"Checking Ollama status...",color:"cyan"}).start();await S.checkStatus()||(o.fail("Ollama is not running"),f("Ollama must be running. Start it with: ollama serve")),o.succeed("Connected to Ollama");let n=k({text:"Fetching available models...",color:"cyan"}).start(),r=await S.getAvailableModels();r.length===0&&(n.fail("No models found. Pull a model first: ollama pull <model>"),f("No models available")),n.stop();let l;if(t.model){let d=r.find(u=>u.id===t.model||u.name.toLowerCase().includes(t.model.toLowerCase()));d||f(`Model "${t.model}" not found. Available models: ${r.map(u=>u.name).join(", ")}`),l=d.id}else{let d=c.getLastUsedModel();l=(await b(r,d)).id}c.setModel(l);let s=`ollama/${l}`;await fe(e,s,t)}async function fe(t,e,o){let i=U(t);y(pe.green(`
9
+ Starting ${i} with model: ${e}
10
+ `));try{t==="opencode"?await E.run({model:e,args:o.args??[],verbose:o.verbose}):await R.run({model:e,args:o.args??[],verbose:o.verbose})}catch(n){f(`Failed to run ${i}: ${n instanceof Error?n.message:"Unknown error"}`)}}async function Z(t={}){let e;if(t.provider)e=t.provider;else{let r=c.getDefaultProvider();r?e=r:e=await Le()}let o=k({text:`Checking ${U(e)} installation...`,color:"cyan"}).start();(e==="opencode"?await E.isOpenCodeInstalled():await R.isClaudeCodeInstalled())||(o.fail(`${U(e)} is not installed.`),f(`Please install ${U(e)} first.`)),o.succeed(`${U(e)} is installed`),(t.backend??await Oe())==="ollama"?await Re(t,e):await Ae(t,e)}L();A();import a from"chalk";async function ee(){let[t,e]=await Promise.all([h.getStatus(),S.getStatus()]);console.log(),console.log(a.bold("LM Studio")),console.log(a.dim("\u2500".repeat(50))),console.log(` ${t.running?a.green("\u25CF"):a.red("\u25CB")} Server: ${t.running?a.green("Running"):a.red("Stopped")}`),console.log(` ${a.dim("\u25B8")} Port: ${a.cyan(String(t.port))}`),console.log(` ${a.dim("\u25B8")} URL: ${a.cyan(`http://localhost:${t.port}`)}`),t.activeModel&&console.log(` ${a.dim("\u25B8")} Active Model: ${a.green(t.activeModel)}`),t.running&&t.models.length>0?(console.log(),console.log(a.bold(" Models")),t.models.forEach((o,i)=>{let n=P(o.size),r=o.id===t.activeModel?` ${a.green("[LOADED]")}`:"";console.log(` ${a.dim(String(i+1)+".")} ${o.name}${r}`),console.log(` ${a.dim("ID:")} ${o.id}`),console.log(` ${a.dim("Size:")} ${n}`),o.quantization&&console.log(` ${a.dim("Quantization:")} ${o.quantization}`)})):t.running&&console.log(` ${a.dim("No models available")}`),console.log(),console.log(a.bold("Ollama")),console.log(a.dim("\u2500".repeat(50))),console.log(` ${e.running?a.green("\u25CF"):a.red("\u25CB")} Server: ${e.running?a.green("Running"):a.red("Stopped")}`),console.log(` ${a.dim("\u25B8")} Port: ${a.cyan(String(e.port))}`),console.log(` ${a.dim("\u25B8")} URL: ${a.cyan(`http://localhost:${e.port}`)}`),e.running&&e.runningModels.length>0&&console.log(` ${a.dim("\u25B8")} Running: ${a.green(e.runningModels.join(", "))}`),e.running&&e.models.length>0?(console.log(),console.log(a.bold(" Models")),e.models.forEach((o,i)=>{let n=P(o.size),l=e.runningModels.includes(o.id)?` ${a.green("[RUNNING]")}`:"",s=o.parameterSize?` ${a.dim(o.parameterSize)}`:"";console.log(` ${a.dim(String(i+1)+".")} ${o.name}${s}${l}`),console.log(` ${a.dim("Size:")} ${n}`),o.family&&console.log(` ${a.dim("Family:")} ${o.family}`),o.quantization&&console.log(` ${a.dim("Quantization:")} ${o.quantization}`)})):e.running&&console.log(` ${a.dim("No models available")}`),console.log()}import Ee from"ora";import D from"chalk";import{execa as ve}from"execa";import{readFileSync as Ue}from"fs";import{fileURLToPath as De}from"url";function oe(){try{let t=De(new URL("../../package.json",import.meta.url)),e=JSON.parse(Ue(t,"utf8"));return String(e.version)}catch{return"unknown"}}async function te(){let t=Ee({text:"Checking for updates...",color:"cyan"}).start();try{let e=oe();if(e==="unknown"){t.fail("Could not determine current version.");return}let{stdout:o}=await ve("npm",["view","@iamharshil/aix-cli","version"]),i=o.trim();if(e===i){t.succeed(`You're already on the latest version: ${D.green(`v${e}`)}`);return}t.text=`Updating: ${D.yellow(`v${e}`)} \u2192 ${D.green(`v${i}`)}...`,await ve("npm",["install","-g","@iamharshil/aix-cli@latest"]),t.succeed(`Successfully updated to ${D.green(`v${i}`)}! \u{1F680}`),ce(`Restart your terminal or run ${D.cyan("aix-cli --help")} to see what's new.`)}catch(e){t.fail("Failed to update."),V(e instanceof Error?e.message:String(e))}}var M=new Be;M.name("aix-cli").description("Run Claude Code or OpenCode with local AI models from LM Studio or Ollama").version(oe()).showHelpAfterError();function B(t=0){console.log(),console.log(p.dim(t===0?"\u{1F44B} Goodbye!":"\u274C Cancelled.")),process.exit(t)}process.on("SIGINT",()=>B(0));process.on("SIGTERM",()=>B(0));process.on("uncaughtException",t=>{t.message?.includes("ExitPromptError")||t.message?.includes("User force closed")||t.message?.includes("prompt")?B(0):(console.error(p.red("Error:"),t.message),process.exit(1))});process.on("unhandledRejection",t=>{let e=String(t);(e.includes("ExitPromptError")||e.includes("User force closed")||e.includes("prompt"))&&B(0)});M.command("init",{isDefault:!1}).aliases(["i","load"]).description("Select a backend, load a model, and configure your provider").option("-m, --model <name>","Model name or ID to load","").option("-p, --provider <provider>","Coding tool to use (claude or opencode)","").option("-b, --backend <backend>","Model backend to use (lmstudio or ollama)","").action(H);M.command("run",{isDefault:!1}).aliases(["r"]).description("Run Claude Code or OpenCode with a model from LM Studio or Ollama").option("-m, --model <name>","Model name or ID to use","").option("-p, --provider <provider>","Coding tool to use (claude or opencode)","").option("-b, --backend <backend>","Model backend to use (lmstudio or ollama)","").option("-v, --verbose","Show verbose output").argument("[args...]","Additional arguments for the provider").action(async(t,e)=>{await Z({...e,args:t})});M.command("status",{isDefault:!1}).aliases(["s","stats"]).description("Show LM Studio and Ollama status and available models").action(ee);M.command("doctor",{isDefault:!1}).aliases(["d","check"]).description("Check system requirements and configuration").action(async()=>{let{lmStudioService:t}=await Promise.resolve().then(()=>(L(),ae)),{ollamaService:e}=await Promise.resolve().then(()=>(A(),le)),{claudeService:o}=await Promise.resolve().then(()=>(Q(),ue)),{openCodeService:i}=await Promise.resolve().then(()=>(Y(),me)),{configService:n}=await Promise.resolve().then(()=>($(),ne));console.log(p.bold.cyan("\u{1F527} AIX CLI System Check")),console.log(p.dim("\u2500".repeat(40)));let[r,l,s,d]=await Promise.all([t.checkStatus(),e.checkStatus(),o.isClaudeCodeInstalled(),i.isOpenCodeInstalled()]),u=n.getDefaultProvider(),v=n.getDefaultBackend(),m=n.get("lmStudioPort"),x=n.get("ollamaPort");console.log(),console.log(p.bold("Backends")),console.log(` ${r?"\u2705":"\u26A0\uFE0F"} LM Studio: ${r?p.green("Running"):p.yellow("Not running")} ${p.dim(`(port ${m})`)}`),console.log(` ${l?"\u2705":"\u26A0\uFE0F"} Ollama: ${l?p.green("Running"):p.yellow("Not running")} ${p.dim(`(port ${x})`)}`),console.log(),console.log(p.bold("Coding Tools")),console.log(` ${s?"\u2705":"\u274C"} Claude Code: ${s?p.green("Installed"):p.red("Not installed")}`),console.log(` ${d?"\u2705":"\u274C"} OpenCode: ${d?p.green("Installed"):p.red("Not installed")}`),console.log(),console.log(p.bold("Defaults")),console.log(` \u{1F4CC} Backend: ${p.cyan(v??"not set")}`),console.log(` \u{1F4CC} Coding tool: ${p.cyan(u??"not set")}`);let w=[];s||w.push(` \u2192 ${p.cyan("npm install -g @anthropic-ai/claude-code")}`),d||w.push(` \u2192 ${p.cyan("npm install -g opencode")}`),!r&&!l&&w.push(` \u2192 Start LM Studio or run ${p.cyan("ollama serve")}`),w.length>0&&(console.log(),console.log(p.bold("\u{1F4CB} Next Steps:")),w.forEach(I=>console.log(I))),console.log()});M.command("update",{isDefault:!1}).aliases(["upgrade","u"]).description("Update AIX CLI to the latest version").action(te);M.parse();
10
11
  //# sourceMappingURL=aix.js.map
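Both the 2.x and 3.x bundles above ship the same small byte-formatting helper (minified as `P` and `R`) for printing model sizes in the status and model lists. As a reading aid, here is a de-minified sketch of that logic; the name `formatBytes` is illustrative and not taken from the published bundle.

```javascript
// De-minified sketch of the size formatter that appears minified above
// as `P`/`R`. Converts a raw byte count into a human-readable string.
function formatBytes(bytes) {
  if (bytes === 0) return "0 B";
  const base = 1024;
  const units = ["B", "KB", "MB", "GB", "TB"];
  // The largest power of 1024 that fits into `bytes` selects the unit.
  const exponent = Math.floor(Math.log(bytes) / Math.log(base));
  // parseFloat drops trailing zeros left by toFixed ("1.50" -> 1.5).
  const value = parseFloat((bytes / Math.pow(base, exponent)).toFixed(2));
  return `${value} ${units[exponent]}`;
}

console.log(formatBytes(1536)); // "1.5 KB"
```

The log-based exponent pick avoids a loop over the units array; `toFixed(2)` caps the precision at two decimals, and the `parseFloat` round-trip trims the padding so whole numbers print as `1 GB` rather than `1.00 GB`.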