@iamharshil/aix-cli 2.0.2 → 2.0.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,74 +1,78 @@
1
+ <div align="center">
2
+
1
3
  # AIX CLI
2
4
 
3
- > AI-powered CLI tool that integrates LM Studio with Claude Code or OpenCode for enhanced local development assistance
5
+ **Run Claude Code & OpenCode with local AI models. No API keys. No cloud. Complete privacy.**
4
6
 
5
- <div align="center">
7
+ [![npm version](https://img.shields.io/npm/v/@iamharshil/aix-cli.svg?style=flat-square&color=cb3837)](https://www.npmjs.com/package/@iamharshil/aix-cli)
8
+ [![Downloads](https://img.shields.io/npm/dm/@iamharshil/aix-cli?style=flat-square&color=blue)](https://www.npmjs.com/package/@iamharshil/aix-cli)
9
+ [![License](https://img.shields.io/badge/license-MIT-green?style=flat-square)](LICENSE)
10
+ [![CI](https://img.shields.io/github/actions/workflow/status/iamharshil/aix-cli/ci.yml?style=flat-square&label=CI)](https://github.com/iamharshil/aix-cli/actions)
11
+ [![Node](https://img.shields.io/badge/node-%E2%89%A518-417e38?style=flat-square)](https://nodejs.org/)
6
12
 
7
- [![npm version](https://img.shields.io/npm/v/@iamharshil/aix-cli.svg)](https://www.npmjs.com/package/@iamharshil/aix-cli)
8
- [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
9
- [![Node.js Version](https://img.shields.io/badge/node-%3E%3D18.0.0-brightgreen)](https://nodejs.org/)
10
- [![GitHub Release](https://img.shields.io/github/v/release/iamharshil/aix-cli)](https://github.com/iamharshil/aix-cli/releases)
11
- [![Build Status](https://img.shields.io/github/actions/workflow/status/iamharshil/aix-cli/ci)](https://github.com/iamharshil/aix-cli/actions)
12
- [![Total Downloads](https://img.shields.io/npm/dt/@iamharshil/aix-cli)](https://www.npmjs.com/package/@iamharshil/aix-cli)
13
+ [Getting Started](#getting-started) · [Documentation](#usage) · [Contributing](CONTRIBUTING.md) · [Changelog](CHANGELOG.md)
13
14
 
14
15
  </div>
15
16
 
16
17
  ---
17
18
 
18
- ## Overview
19
-
20
- **AIX CLI** enables you to use locally-running AI models from [LM Studio](https://lmstudio.ai) directly with [Claude Code](https://docs.anthropic.com/en/docs/claude-code) or [OpenCode](https://opencode.ai). No API keys, no cloud dependencies, complete privacy.
21
-
22
- ### Why AIX?
19
+ ## What is AIX?
23
20
 
24
- | Feature | Description |
25
- | -------------------- | ---------------------------------------------- |
26
- | 🔒 **Privacy-First** | All processing happens locally on your machine |
27
- | 🔑 **No API Keys** | No external services or subscriptions |
28
- | 🚀 **Fast** | Local inference with your GPU |
29
- | 🛡️ **Secure** | Your code never leaves your machine |
30
- | 🔧 **Simple** | Just run `aix-cli run` and start coding |
21
+ AIX CLI is a bridge between [LM Studio](https://lmstudio.ai) and AI coding assistants like [Claude Code](https://docs.anthropic.com/en/docs/claude-code) and [OpenCode](https://opencode.ai). It lets you use **locally-running language models** as the backend for your favorite AI dev tools — no API keys, no cloud calls, no data leaving your machine.
31
22
 
32
- ## Prerequisites
23
+ ```
24
+ ┌──────────────────────────────────────────────────┐
25
+ │ $ aix-cli run                                    │
26
+ │                                                  │
27
+ │ ✔ LM Studio running on http://localhost:1234    │
28
+ │ ✔ Model loaded: qwen2.5-coder-14b               │
29
+ │ ✔ Launching Claude Code...                      │
30
+ │                                                  │
31
+ │ Your code stays local. Always.                   │
32
+ └──────────────────────────────────────────────────┘
33
+ ```
33
34
 
34
- - [Node.js](https://nodejs.org/) 18.0.0 or higher
35
- - [LM Studio](https://lmstudio.ai) - Download and run local AI models
36
- - [Claude Code](https://docs.anthropic.com/en/docs/claude-code) **or** [OpenCode](https://opencode.ai) - AI coding assistant
35
+ ### Why AIX?
37
36
 
38
- ## Quick Start
37
+ - 🔒 **Privacy-first** — All inference runs locally on your hardware. Your code never leaves your machine.
38
+ - 🔑 **No API keys** — No subscriptions, no usage limits, no cloud dependencies.
39
+ - 🚀 **GPU-accelerated** — Take advantage of your local GPU for fast inference via LM Studio.
40
+ - 🔀 **Multi-provider** — Switch between Claude Code and OpenCode with a single flag.
41
+ - ⚡ **Zero config** — Just run `aix-cli run` and start coding.
39
42
 
40
- ```bash
41
- # Install
42
- npm install -g @iamharshil/aix-cli
43
+ ---
43
44
 
44
- # Verify setup
45
- aix-cli doctor
45
+ ## Getting Started
46
46
 
47
- # Run with interactive model selection
48
- aix-cli run
49
- ```
47
+ ### Prerequisites
50
48
 
51
- ## Installation
49
+ | Requirement | Description |
50
+ |---|---|
51
+ | [Node.js](https://nodejs.org/) ≥ 18 | JavaScript runtime |
52
+ | [LM Studio](https://lmstudio.ai) | Run local AI models and expose them via API |
53
+ | [Claude Code](https://docs.anthropic.com/en/docs/claude-code) **or** [OpenCode](https://opencode.ai) | AI coding assistant (at least one required) |
52
54
 
53
- ### Using npm (Recommended)
55
+ ### Install
54
56
 
55
57
  ```bash
56
58
  npm install -g @iamharshil/aix-cli
57
59
  ```
58
60
 
59
- ### Using Yarn
61
+ <details>
62
+ <summary>Other package managers</summary>
60
63
 
61
64
  ```bash
65
+ # Yarn
62
66
  yarn global add @iamharshil/aix-cli
63
- ```
64
-
65
- ### Using pnpm
66
67
 
67
- ```bash
68
+ # pnpm
68
69
  pnpm add -g @iamharshil/aix-cli
69
70
  ```
70
71
 
71
- ### From Source
72
+ </details>
73
+
74
+ <details>
75
+ <summary>Build from source</summary>
72
76
 
73
77
  ```bash
74
78
  git clone https://github.com/iamharshil/aix-cli.git
@@ -78,69 +82,94 @@ npm run build
78
82
  npm link
79
83
  ```
80
84
 
81
- ## Usage
82
-
83
- ### System Check
85
+ </details>
84
86
 
85
- Verify your environment is properly configured:
87
+ ### Verify
86
88
 
87
89
  ```bash
88
90
  aix-cli doctor
89
91
  ```
90
92
 
91
- ### Initialize Model
93
+ This checks that LM Studio, Claude Code / OpenCode, and your environment are properly configured.
94
+
95
+ ---
96
+
97
+ ## Usage
98
+
99
+ ### `aix-cli run` — Start a coding session
92
100
 
93
- Load a model from your local LM Studio models:
101
+ The primary command. Launches Claude Code or OpenCode backed by a local LM Studio model.
94
102
 
95
103
  ```bash
96
- # Interactive selection (prompts for provider)
97
- aix-cli init
104
+ # Interactive mode: picks a running model automatically
105
+ aix-cli run
98
106
 
99
- # Specific model
100
- aix-cli init -m llama-3-8b
107
+ # Specify a model
108
+ aix-cli run -m qwen2.5-coder-14b
109
+
110
+ # Use OpenCode instead of Claude Code
111
+ aix-cli run --provider opencode
101
112
 
102
- # With provider
103
- aix-cli init -m llama-3-8b --provider opencode
113
+ # Pass a prompt directly
114
+ aix-cli run -m qwen2.5-coder-14b -- "Refactor auth middleware to use JWTs"
104
115
  ```
105
116
 
106
- ### Run with Provider
117
+ ### `aix-cli init` — Load a model
107
118
 
108
- Start coding with your local model using Claude Code or OpenCode:
119
+ Loads a model into LM Studio. If no model is specified, you'll get an interactive picker.
109
120
 
110
121
  ```bash
111
- # Interactive session (uses default provider)
112
- aix-cli run
122
+ aix-cli init # Interactive model selection
123
+ aix-cli init -m llama-3-8b # Load a specific model
124
+ aix-cli init -m llama-3-8b -p opencode # Set default provider at the same time
125
+ ```
113
126
 
114
- # With specific model
115
- aix-cli run -m llama-3-8b
127
+ ### `aix-cli status` — Check what's running
116
128
 
117
- # With OpenCode instead of Claude Code
118
- aix-cli run --provider opencode
119
- aix-cli run -p opencode -m llama-3-8b
129
+ Shows the current LM Studio server status and which models are loaded.
120
130
 
121
- # With arguments
122
- aix-cli run -m llama-3-8b -- "Write a hello world in Python"
131
+ ```bash
132
+ aix-cli status
123
133
  ```
124
134
 
125
- ### Check Status
135
+ ### `aix-cli doctor` — System diagnostics
126
136
 
127
- View LM Studio status and loaded models:
137
+ Verifies your environment is ready to go.
128
138
 
129
139
  ```bash
130
- aix-cli status
140
+ aix-cli doctor
131
141
  ```
132
142
 
133
- ## Configuration
143
+ ### Command Reference
144
+
145
+ | Command | Aliases | Description |
146
+ |---|---|---|
147
+ | `run` | `r` | Run Claude Code / OpenCode with a local model |
148
+ | `init` | `i`, `load` | Load a model into LM Studio |
149
+ | `status` | `s`, `stats` | Show LM Studio server status |
150
+ | `doctor` | `d`, `check` | Run system diagnostics |
134
151
 
135
- ### Config Location
152
+ ### Global Options
136
153
 
137
- AIX CLI stores configuration in your system's app data directory:
154
+ | Flag | Description |
155
+ |---|---|
156
+ | `-m, --model <name>` | Model name or ID to use |
157
+ | `-p, --provider <name>` | Provider: `claude` (default) or `opencode` |
158
+ | `-v, --verbose` | Show verbose output |
159
+ | `-h, --help` | Show help |
160
+ | `-V, --version` | Show version |
138
161
 
139
- | Platform | Path |
140
- | -------- | ---------------------------------------- |
141
- | macOS | `~/Library/Application Support/aix-cli/` |
142
- | Linux | `~/.config/aix-cli/` |
143
- | Windows | `%APPDATA%\aix-cli\` |
162
+ ---
163
+
164
+ ## Configuration
165
+
166
+ AIX stores its configuration in the OS-appropriate config directory:
167
+
168
+ | Platform | Path |
169
+ |---|---|
170
+ | macOS | `~/Library/Application Support/aix-cli/` |
171
+ | Linux | `~/.config/aix-cli/` |
172
+ | Windows | `%APPDATA%\aix-cli\` |
144
173
 
145
174
  ### Config File
146
175
 
@@ -149,130 +178,144 @@ AIX CLI stores configuration in your system's app data directory:
149
178
  "lmStudioUrl": "http://localhost",
150
179
  "lmStudioPort": 1234,
151
180
  "defaultTimeout": 30000,
152
- "model": "lmstudio/llama-3-8b",
181
+ "model": "lmstudio/qwen2.5-coder-14b",
153
182
  "defaultProvider": "claude"
154
183
  }
155
184
  ```
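Note that `lmStudioUrl` carries no port: the published `dist/bin/aix.js` builds the server base URL by joining the two fields with a colon. A de-minified sketch of that helper (`getLMStudioUrl`):

```javascript
// De-minified sketch of ConfigService.getLMStudioUrl from dist/bin/aix.js:
// the base URL is lmStudioUrl and lmStudioPort joined by ":".
function getLMStudioUrl(config) {
  return `${config.lmStudioUrl}:${config.lmStudioPort}`;
}

console.log(getLMStudioUrl({ lmStudioUrl: "http://localhost", lmStudioPort: 1234 }));
// -> http://localhost:1234
```

So `lmStudioUrl` should be scheme plus host only, with no trailing port or path.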
156
185
 
157
186
  ### Environment Variables
158
187
 
159
- | Variable | Description | Default |
160
- | ---------------- | --------------------- | ------- |
161
- | `LM_STUDIO_PORT` | LM Studio server port | `1234` |
188
+ | Variable | Description | Default |
189
+ |---|---|---|
190
+ | `LM_STUDIO_PORT` | Override the LM Studio server port | `1234` |
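This diff does not show where `LM_STUDIO_PORT` is read (the dist only visibly sets `LM_STUDIO_SERVER_PORT` when it spawns LM Studio itself), so the following resolution order (environment variable over config file over the built-in default) is an illustrative assumption, not the package's verified behavior:

```javascript
// ASSUMED precedence: LM_STUDIO_PORT env var > config lmStudioPort > 1234.
// Sketch only; the real lookup is not visible in this diff.
function resolvePort(env, config) {
  if (env.LM_STUDIO_PORT !== undefined) return Number(env.LM_STUDIO_PORT);
  return config.lmStudioPort ?? 1234;
}

console.log(resolvePort({ LM_STUDIO_PORT: "1235" }, { lmStudioPort: 4891 })); // -> 1235
console.log(resolvePort({}, { lmStudioPort: 4891 }));                         // -> 4891
console.log(resolvePort({}, {}));                                             // -> 1234
```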
162
191
 
163
- ## Architecture
192
+ ---
193
+
194
+ ## How It Works
164
195
 
165
196
  ```
166
- ┌───────────────────────────────────────────────────┐
167
- │                      AIX Flow                     │
168
- ├───────────────────────────────────────────────────┤
169
- │                                                   │
170
- │   ┌──────────┐      ┌──────────────┐              │
171
- │   │   User   │─────▶│  LM Studio   │              │
172
- │   └──────────┘      └──────────────┘              │
173
- │                            │                      │
174
- │                            ▼                      │
175
- │                     ┌──────────────┐              │
176
- │                     │  Local API   │              │
177
- │                     │ (port 1234)  │              │
178
- │                     └──────────────┘              │
179
- │                            │                      │
180
- │                            │                      │
181
- │              ┌─────────────┴───────┐              │
182
- │              │                     │              │
183
- │              ▼                     ▼              │
184
- │        ┌───────────┐         ┌───────────┐        │
185
- │        │Claude Code│         │ OpenCode  │        │
186
- │        │ --model   │         │ --model   │        │
187
- │        └───────────┘         └───────────┘        │
188
- │                                                   │
189
- └───────────────────────────────────────────────────┘
197
+ ┌───────────────────┐
198
+ │     LM Studio     │
199
+ │  (local server)   │
200
+ │     port 1234     │
201
+ └────────┬──────────┘
202
+          │
203
+          │  REST API
204
+          ▼
205
+ ┌────────┴──────────┐
206
+ │      AIX CLI      │
207
+ │   model routing   │
208
+ │    config mgmt    │
209
+ └───┬──────────┬────┘
210
+     │          │
211
+ ┌───┘          └────────┐
212
+ ▼                       ▼
213
+ ┌──────────────┐ ┌──────────────┐
214
+ │ Claude Code  │ │   OpenCode   │
215
+ │  --model X   │ │  --model X   │
216
+ └──────────────┘ └──────────────┘
190
217
  ```
191
218
 
219
+ 1. **LM Studio** runs a local inference server exposing an OpenAI-compatible API.
220
+ 2. **AIX CLI** discovers available models, manages configuration, and orchestrates the connection.
221
+ 3. **Claude Code** or **OpenCode** receives the model endpoint and runs as it normally would — except fully local.
222
+
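The hand-off in step 3 hinges on the `provider/model-name` string (e.g. `lmstudio/qwen2.5-coder-14b`). A de-minified sketch of how `dist/bin/aix.js` splits it (`extractProvider` / `extractModelName`):

```javascript
// De-minified from dist/bin/aix.js: the provider is everything before the
// first "/"; the model name is everything after it (further slashes kept).
function extractProvider(model) {
  return model.split("/")[0];
}

function extractModelName(model) {
  const parts = model.split("/");
  if (parts.length < 2) return undefined; // no "/" -> invalid model string
  return parts.slice(1).join("/");
}

console.log(extractProvider("lmstudio/qwen2.5-coder-14b"));  // -> lmstudio
console.log(extractModelName("lmstudio/qwen2.5-coder-14b")); // -> qwen2.5-coder-14b
```

If either half is missing, the dist throws `Invalid model format: ... Expected format: provider/model-name` before launching a provider.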
223
+ ---
224
+
192
225
  ## Troubleshooting
193
226
 
194
- ### LM Studio server not running
227
+ <details>
228
+ <summary><strong>LM Studio server not running</strong></summary>
229
+
230
+ 1. Open LM Studio
231
+ 2. Navigate to the **Server** tab (left sidebar)
232
+ 3. Click **Start Server**
233
+ 4. Confirm with `aix-cli status`
234
+
235
+ </details>
236
+
237
+ <details>
238
+ <summary><strong>No models found</strong></summary>
239
+
240
+ 1. Open LM Studio → **Search** tab
241
+ 2. Download a model (e.g., Qwen 2.5 Coder, Llama 3, Mistral)
242
+ 3. Wait for the download to complete
243
+ 4. Run `aix-cli init` to load it
244
+
245
+ </details>
246
+
247
+ <details>
248
+ <summary><strong>Connection refused on port 1234</strong></summary>
249
+
250
+ Check the LM Studio server tab for the actual port it's running on. If it differs from `1234`, update your config:
195
251
 
196
252
  ```bash
197
- # 1. Open LM Studio
198
- # 2. Go to Server tab (left sidebar)
199
- # 3. Click Start Server
200
- # 4. Run: aix-cli run
253
+ # The config file path is shown by `aix-cli doctor`
254
+ # Edit the config file and set "lmStudioPort" to the correct port
201
255
  ```
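For example, if LM Studio reports it is serving on port 1235, the relevant entry in the config file would look like this (other fields unchanged; a minimal illustrative fragment):

```json
{
  "lmStudioPort": 1235
}
```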
202
256
 
203
- ### No models found
257
+ </details>
258
+
259
+ <details>
260
+ <summary><strong>Claude Code / OpenCode not detected</strong></summary>
261
+
262
+ Install the missing provider globally:
204
263
 
205
264
  ```bash
206
- # 1. Open LM Studio
207
- # 2. Go to Search tab
208
- # 3. Download a model (e.g., Llama 3, Mistral)
209
- # 4. Wait for download to complete
210
- # 5. Run: aix-cli init
265
+ # Claude Code
266
+ npm install -g @anthropic-ai/claude-code
267
+
268
+ # OpenCode
269
+ npm install -g opencode
211
270
  ```
212
271
 
213
- ### Connection refused on port 1234
272
+ Then re-run `aix-cli doctor` to confirm.
273
+
274
+ </details>
214
275
 
215
- Check LM Studio's server tab for the correct port and update your config.
276
+ ---
216
277
 
217
278
  ## Security & Privacy
218
279
 
219
- - All AI processing happens locally
220
- - ✅ No data sent to external servers
221
- - ✅ No telemetry or analytics
222
- - ✅ No API keys required
223
- - ✅ Your code stays on your machine
280
+ AIX is designed around a simple principle: **your code never leaves your machine.**
224
281
 
225
- ## Contributing
282
+ - ✅ All AI inference runs locally via LM Studio
283
+ - ✅ No telemetry, analytics, or tracking of any kind
284
+ - ✅ No outbound network calls (except to `localhost`)
285
+ - ✅ No API keys or accounts required
286
+ - ✅ Fully open-source — audit the code yourself
287
+
288
+ Found a vulnerability? Please report it responsibly via our [Security Policy](SECURITY.md).
226
289
 
227
- Contributions are welcome! Please read our [Contributing Guide](CONTRIBUTING.md).
290
+ ---
291
+
292
+ ## Contributing
228
293
 
229
- ### Development
294
+ Contributions are welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines on how to get started.
230
295
 
231
296
  ```bash
232
- # Clone and setup
233
297
  git clone https://github.com/iamharshil/aix-cli.git
234
298
  cd aix-cli
235
299
  npm install
236
-
237
- # Development mode
238
- npm run dev
239
-
240
- # Run tests
241
- npm test
242
-
243
- # Lint
244
- npm run lint
245
-
246
- # Build
247
- npm run build
300
+ npm run dev # Run in development mode
301
+ npm test # Run tests
302
+ npm run lint # Lint
248
303
  ```
249
304
 
250
- ### Code Style
251
-
252
- - TypeScript with strict mode
253
- - ESLint + Prettier configured
254
- - 2-space indentation
255
- - Single quotes, semicolons
305
+ ---
256
306
 
257
307
  ## Related Projects
258
308
 
259
- - [LM Studio](https://lmstudio.ai) - Run local AI models
260
- - [Claude Code](https://docs.anthropic.com/en/docs/claude-code) - AI coding assistant
261
- - [OpenCode](https://opencode.ai) - Open-source AI coding assistant
309
+ - [LM Studio](https://lmstudio.ai) — Run local AI models with a visual interface
310
+ - [Claude Code](https://docs.anthropic.com/en/docs/claude-code) — Anthropic's AI coding assistant
311
+ - [OpenCode](https://opencode.ai) — Open-source AI coding assistant
262
312
 
263
313
  ## License
264
314
 
265
- [MIT](LICENSE) - © 2024 Harshil
266
-
267
- ## Support
268
-
269
- - 📋 [Issues](https://github.com/iamharshil/aix-cli/issues) - Report bugs
270
- - 💬 [Discussions](https://github.com/iamharshil/aix-cli/discussions) - Ask questions
315
+ [MIT](LICENSE) © [Harshil](https://github.com/iamharshil)
271
316
 
272
317
  ---
273
318
 
274
319
  <div align="center">
275
-
276
- Built with ❤️ for privacy-conscious developers
277
-
320
+ <sub>Built for developers who care about privacy.</sub>
278
321
  </div>
package/dist/bin/aix.js CHANGED
@@ -1,10 +1,10 @@
1
1
  #!/usr/bin/env node
2
- var ne=Object.defineProperty;var ie=(o=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(o,{get:(e,t)=>(typeof require<"u"?require:e)[t]}):o)(function(o){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+o+'" is not supported')});var b=(o,e)=>()=>(o&&(e=o(o=0)),e);var L=(o,e)=>{for(var t in e)ne(o,t,{get:e[t],enumerable:!0})};var W={};L(W,{ConfigService:()=>O,configService:()=>c});import se from"conf";var O,c,M=b(()=>{"use strict";O=class{store;constructor(){this.store=new se({projectName:"aix",defaults:{lmStudioUrl:"http://localhost",lmStudioPort:1234,defaultTimeout:3e4,autoStartServer:!1,defaultProvider:"claude"},clearInvalidConfig:!0})}get(e){return this.store.get(e)}set(e,t){this.store.set(e,t)}setModel(e){this.store.set("model",e)}getLastUsedModel(){return this.store.get("model")}setDefaultProvider(e){this.store.set("defaultProvider",e)}getDefaultProvider(){return this.store.get("defaultProvider")??"claude"}getLMStudioUrl(){let e=this.store.get("lmStudioUrl"),t=this.store.get("lmStudioPort");return`${e}:${t}`}reset(){this.store.clear()}},c=new O});var Q={};L(Q,{LMStudioService:()=>E,lmStudioService:()=>f});import{execa as A}from"execa";import H from"ora";import V from"chalk";var J,E,f,x=b(()=>{"use strict";M();J=[1234,1235,1236,1237],E=class{baseUrl;constructor(){this.baseUrl=c.getLMStudioUrl()}getApiUrl(e){return`${this.baseUrl}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){let e=["/api/v1/models","/api/models","/v1/models","/api/ls-model/list"];for(let t of e)try{let n=await fetch(this.getApiUrl(t),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!n.ok)continue;let r=await n.json(),i=[];return Array.isArray(r)?i=r:r.models&&Array.isArray(r.models)?i=r.models:r.data&&Array.isArray(r.data)&&(i=r.data),i.map(a=>{let 
s=a;return{id:String(s.key||s.id||s.model||""),name:String(s.display_name||s.name||s.id||s.model||""),size:Number(s.size_bytes||s.size||s.file_size||0),quantization:String(s.quantization?typeof s.quantization=="object"?s.quantization.name:s.quantization:"")}}).filter(a=>a.id&&a.name)}catch{continue}return[]}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("lmStudioPort"),models:[]};try{let t=await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!t.ok)return{running:!1,port:c.get("lmStudioPort"),models:[]};let n=await t.json();return{running:!0,port:c.get("lmStudioPort"),models:n.models??[],activeModel:n.active_model}}catch{return{running:!1,port:c.get("lmStudioPort"),models:[]}}}async loadModel(e,t){let n=t??H({text:`Loading model: ${V.cyan(e)}`,color:"cyan"}).start();try{let r=await fetch(this.getApiUrl("/api/model/load"),{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({model:e}),signal:AbortSignal.timeout(3e5)});if(!r.ok)throw new Error(`Failed to load model: ${r.statusText}`);return n.succeed(`Model ${V.green(e)} loaded successfully`),c.setModel(e),{loadSpinner:n}}catch(r){throw n.fail(`Failed to load model: ${r instanceof Error?r.message:"Unknown error"}`),r}}async startServer(e){let t=e??H({text:"Starting LM Studio server...",color:"cyan"}).start();try{let n=process.platform==="darwin",r=process.platform==="linux",i=process.platform==="win32",a;if(n){let s=["/Applications/LM Studio.app",`${process.env.HOME}/Applications/LM Studio.app`];for(let l of s)try{let{existsSync:m}=await import("fs");if(m(l)){a=`open "${l}" --args --server`;break}}catch{}if(a?.startsWith("open")){await A("open",[s.find(l=>{try{let{existsSync:m}=ie("fs");return m(l)}catch{return!1}})||"/Applications/LM Studio.app","--args","--server"],{detached:!0,stdio:"ignore"}),t.succeed("LM Studio server started"),await this.waitForServer(6e4);return}}else r?a=await this.findLinuxBinary():i&&(a=await 
this.findWindowsExecutable());if(!a)throw t.fail("LM Studio not found. Please install it from https://lmstudio.ai"),new Error("LM Studio not installed");await A(a,["--server"],{detached:!0,stdio:"ignore",env:{...process.env,LM_STUDIO_SERVER_PORT:String(c.get("lmStudioPort"))}}),t.succeed("LM Studio server started"),await this.waitForServer(6e4)}catch(n){throw t.fail(`Failed to start LM Studio: ${n instanceof Error?n.message:"Unknown error"}`),n}}async findLinuxBinary(){let e=["/usr/bin/lm-studio","/usr/local/bin/lm-studio",`${process.env.HOME}/.local/bin/lm-studio`];for(let t of e)try{return await A("test",["-x",t]),t}catch{continue}}async findWindowsExecutable(){let e=process.env.LOCALAPPDATA,t=process.env.PROGRAMFILES,n=[e?`${e}\\Programs\\LM Studio\\lm-studio.exe`:"",t?`${t}\\LM Studio\\lm-studio.exe`:""].filter(Boolean);for(let r of n)try{return await A("cmd",["/c","if exist",`"${r}"`,"echo","yes"]),r}catch{continue}}async waitForServer(e=6e4){let t=Date.now();for(;Date.now()-t<e;){if(await this.checkStatus())return!0;await this.sleep(2e3)}return!1}sleep(e){return new Promise(t=>setTimeout(t,e))}async findAvailablePort(){for(let e of J)try{if((await fetch(`http://localhost:${e}/api/status`,{method:"GET",signal:AbortSignal.timeout(1e3)})).ok)return e}catch{return c.set("lmStudioPort",e),e}return J[0]??1234}},f=new E});var oe={};L(oe,{ClaudeService:()=>N,claudeService:()=>R});import{execa as _}from"execa";import le from"chalk";var N,R,F=b(()=>{"use strict";N=class{async isClaudeCodeInstalled(){try{return await _("claude",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:t,args:n=[],verbose:r=!1}=e,i=this.extractProvider(t),a=this.extractModelName(t);if(!i||!a)throw new Error(`Invalid model format: ${t}. Expected format: provider/model-name`);let s=`${i}/${a}`,l=["--model",s,...n];r&&console.log(le.dim(`
2
+ var ne=Object.defineProperty;var ie=(t=>typeof require<"u"?require:typeof Proxy<"u"?new Proxy(t,{get:(e,o)=>(typeof require<"u"?require:e)[o]}):t)(function(t){if(typeof require<"u")return require.apply(this,arguments);throw Error('Dynamic require of "'+t+'" is not supported')});var O=(t,e)=>()=>(t&&(e=t(t=0)),e);var A=(t,e)=>{for(var o in e)ne(t,o,{get:e[o],enumerable:!0})};var W={};A(W,{ConfigService:()=>E,configService:()=>c});import se from"conf";var E,c,M=O(()=>{"use strict";E=class{store;constructor(){this.store=new se({projectName:"aix",defaults:{lmStudioUrl:"http://localhost",lmStudioPort:1234,defaultTimeout:3e4,autoStartServer:!1},clearInvalidConfig:!0})}get(e){return this.store.get(e)}set(e,o){this.store.set(e,o)}setModel(e){this.store.set("model",e)}getLastUsedModel(){return this.store.get("model")}setDefaultProvider(e){this.store.set("defaultProvider",e)}getDefaultProvider(){return this.store.get("defaultProvider")}getLMStudioUrl(){let e=this.store.get("lmStudioUrl"),o=this.store.get("lmStudioPort");return`${e}:${o}`}reset(){this.store.clear()}},c=new E});var Q={};A(Q,{LMStudioService:()=>I,lmStudioService:()=>g});import{execa as k}from"execa";import H from"ora";import V from"chalk";var J,I,g,C=O(()=>{"use strict";M();J=[1234,1235,1236,1237],I=class{baseUrl;constructor(){this.baseUrl=c.getLMStudioUrl()}getApiUrl(e){return`${this.baseUrl}${e}`}async checkStatus(){try{return(await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(3e3)})).ok}catch{return!1}}async getAvailableModels(){let e=["/api/v1/models","/api/models","/v1/models","/api/ls-model/list"];for(let o of e)try{let n=await fetch(this.getApiUrl(o),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!n.ok)continue;let r=await n.json(),i=[];return Array.isArray(r)?i=r:r.models&&Array.isArray(r.models)?i=r.models:r.data&&Array.isArray(r.data)&&(i=r.data),i.map(a=>{let 
s=a;return{id:String(s.key||s.id||s.model||""),name:String(s.display_name||s.name||s.id||s.model||""),size:Number(s.size_bytes||s.size||s.file_size||0),quantization:String(s.quantization?typeof s.quantization=="object"?s.quantization.name:s.quantization:"")}}).filter(a=>a.id&&a.name)}catch{continue}return[]}async getStatus(){if(!await this.checkStatus())return{running:!1,port:c.get("lmStudioPort"),models:[]};try{let o=await fetch(this.getApiUrl("/api/status"),{method:"GET",signal:AbortSignal.timeout(1e4)});if(!o.ok)return{running:!1,port:c.get("lmStudioPort"),models:[]};let n=await o.json();return{running:!0,port:c.get("lmStudioPort"),models:n.models??[],activeModel:n.active_model}}catch{return{running:!1,port:c.get("lmStudioPort"),models:[]}}}async loadModel(e,o){let n=o??H({text:`Loading model: ${V.cyan(e)}`,color:"cyan"}).start();try{let r=await fetch(this.getApiUrl("/api/model/load"),{method:"POST",headers:{"Content-Type":"application/json"},body:JSON.stringify({model:e}),signal:AbortSignal.timeout(3e5)});if(!r.ok)throw new Error(`Failed to load model: ${r.statusText}`);return n.succeed(`Model ${V.green(e)} loaded successfully`),c.setModel(e),{loadSpinner:n}}catch(r){throw n.fail(`Failed to load model: ${r instanceof Error?r.message:"Unknown error"}`),r}}async startServer(e){let o=e??H({text:"Starting LM Studio server...",color:"cyan"}).start();try{let n=process.platform==="darwin",r=process.platform==="linux",i=process.platform==="win32",a;if(n){let s=["/Applications/LM Studio.app",`${process.env.HOME}/Applications/LM Studio.app`];for(let l of s)try{let{existsSync:p}=await import("fs");if(p(l)){a=`open "${l}" --args --server`;break}}catch{}if(a?.startsWith("open")){await k("open",[s.find(l=>{try{let{existsSync:p}=ie("fs");return p(l)}catch{return!1}})||"/Applications/LM Studio.app","--args","--server"],{detached:!0,stdio:"ignore"}),o.succeed("LM Studio server started"),await this.waitForServer(6e4);return}}else r?a=await this.findLinuxBinary():i&&(a=await 
this.findWindowsExecutable());if(!a)throw o.fail("LM Studio not found. Please install it from https://lmstudio.ai"),new Error("LM Studio not installed");await k(a,["--server"],{detached:!0,stdio:"ignore",env:{...process.env,LM_STUDIO_SERVER_PORT:String(c.get("lmStudioPort"))}}),o.succeed("LM Studio server started"),await this.waitForServer(6e4)}catch(n){throw o.fail(`Failed to start LM Studio: ${n instanceof Error?n.message:"Unknown error"}`),n}}async findLinuxBinary(){let e=["/usr/bin/lm-studio","/usr/local/bin/lm-studio",`${process.env.HOME}/.local/bin/lm-studio`];for(let o of e)try{return await k("test",["-x",o]),o}catch{continue}}async findWindowsExecutable(){let e=process.env.LOCALAPPDATA,o=process.env.PROGRAMFILES,n=[e?`${e}\\Programs\\LM Studio\\lm-studio.exe`:"",o?`${o}\\LM Studio\\lm-studio.exe`:""].filter(Boolean);for(let r of n)try{return await k("cmd",["/c","if exist",`"${r}"`,"echo","yes"]),r}catch{continue}}async waitForServer(e=6e4){let o=Date.now();for(;Date.now()-o<e;){if(await this.checkStatus())return!0;await this.sleep(2e3)}return!1}sleep(e){return new Promise(o=>setTimeout(o,e))}async findAvailablePort(){for(let e of J)try{if((await fetch(`http://localhost:${e}/api/status`,{method:"GET",signal:AbortSignal.timeout(1e3)})).ok)return e}catch{return c.set("lmStudioPort",e),e}return J[0]??1234}},g=new I});var oe={};A(oe,{ClaudeService:()=>U,claudeService:()=>P});import{execa as _}from"execa";import le from"chalk";var U,P,F=O(()=>{"use strict";U=class{async isClaudeCodeInstalled(){try{return await _("claude",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:n=[],verbose:r=!1}=e,i=this.extractProvider(o),a=this.extractModelName(o);if(!i||!a)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=`${i}/${a}`,l=["--model",s,...n];r&&console.log(le.dim(`
3
3
  Running: claude ${l.join(" ")}
4
- `));try{await _("claude",l,{stdio:"inherit",env:{...process.env,ANTHROPIC_MODEL:s}})}catch(m){if(m instanceof Error&&"exitCode"in m){let P=m.exitCode;process.exit(P??1)}throw m}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let t=e.split("/");if(!(t.length<2))return t.slice(1).join("/")}async getVersion(){try{return(await _("claude",["--version"])).stdout}catch{return}}},R=new N});var te={};L(te,{OpenCodeService:()=>U,openCodeService:()=>z});import{execa as B}from"execa";import de from"chalk";var U,z,q=b(()=>{"use strict";U=class{async isOpenCodeInstalled(){try{return await B("opencode",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:t,args:n=[],verbose:r=!1}=e,i=this.extractProvider(t),a=this.extractModelName(t);if(!i||!a)throw new Error(`Invalid model format: ${t}. Expected format: provider/model-name`);let s=["--model",t,...n];r&&console.log(de.dim(`
4
+ `));try{await _("claude",l,{stdio:"inherit",env:{...process.env,ANTHROPIC_MODEL:s}})}catch(p){if(p instanceof Error&&"exitCode"in p){let L=p.exitCode;process.exit(L??1)}throw p}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await _("claude",["--version"])).stdout}catch{return}}},P=new U});var te={};A(te,{OpenCodeService:()=>z,openCodeService:()=>$});import{execa as q}from"execa";import de from"chalk";var z,$,B=O(()=>{"use strict";z=class{async isOpenCodeInstalled(){try{return await q("opencode",["--version"],{stdio:"ignore"}),!0}catch{return!1}}async run(e){let{model:o,args:n=[],verbose:r=!1}=e,i=this.extractProvider(o),a=this.extractModelName(o);if(!i||!a)throw new Error(`Invalid model format: ${o}. Expected format: provider/model-name`);let s=["--model",o,...n];r&&console.log(de.dim(`
5
5
  Running: opencode ${s.join(" ")}
6
- `));try{await B("opencode",s,{stdio:"inherit",env:{...process.env,OPENCODE_MODEL_NAME:a,OPENCODE_MODEL_PROVIDER:i}})}catch(l){if(l instanceof Error&&"exitCode"in l){let m=l.exitCode;process.exit(m??1)}throw l}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let t=e.split("/");if(!(t.length<2))return t.slice(1).join("/")}async getVersion(){try{return(await B("opencode",["--version"])).stdout}catch{return}}},z=new U});import{Command as ce}from"commander";import u from"chalk";x();M();import Z from"ora";import h from"chalk";import ee from"inquirer";import X from"inquirer";async function k(o,e){let t=o.map(i=>({name:`${i.name} (${i.id})`,value:i,short:i.name})),n=e?t.findIndex(i=>i.value.id===e):0;return(await X.prompt([{type:"list",name:"model",message:"Select a model to load:",choices:t,default:Math.max(0,n),pageSize:Math.min(o.length,15)}])).model}async function I(o,e=!0){return(await X.prompt([{type:"confirm",name:"confirm",message:o,default:e}])).confirm}import Y from"chalk";function D(o){if(o===0)return"0 B";let e=1024,t=["B","KB","MB","GB","TB"],n=Math.floor(Math.log(o)/Math.log(e));return`${parseFloat((o/Math.pow(e,n)).toFixed(2))} ${t[n]}`}function C(o){console.log(Y.green("\u2713")+" "+o)}function ae(o){console.error(Y.red("\u2717")+" "+o)}function v(o,e=1){ae(o),process.exit(e)}async function j(o={}){let e;if(o.provider)e=o.provider;else{let p=c.getDefaultProvider(),{providerSelection:w}=await ee.prompt([{type:"list",name:"providerSelection",message:"Select provider:",default:p,choices:[{name:"Claude Code",value:"claude"},{name:"OpenCode",value:"opencode"}]}]);e=w;let{saveDefault:g}=await ee.prompt([{type:"confirm",name:"saveDefault",message:"Save as default provider?",default:!1}]);g&&(c.setDefaultProvider(e),C(`Default provider set to ${h.cyan(e)}`))}let t=Z({text:"Checking LM Studio status...",color:"cyan"}).start(),n=await f.checkStatus();n||(t.info("LM Studio server not running"),t.stop(),await I("Would you like to start the LM Studio server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await f.startServer(),n=!0),t.succeed("Connected to LM Studio");let r=Z({text:"Fetching available models...",color:"cyan"}).start(),i=await f.getAvailableModels();i.length===0&&(r.fail("No models found. Download some models in LM Studio first."),v("No models available")),r.succeed(`Found ${h.bold(i.length)} model${i.length===1?"":"s"}`),console.log(),console.log(h.bold("Available Models:")),console.log(h.dim("\u2500".repeat(process.stdout.columns||80))),i.forEach((p,w)=>{let g=D(p.size),S=p.loaded?h.green(" [LOADED]"):"";console.log(` ${h.dim(String(w+1).padStart(2))}. ${p.name} ${h.dim(`(${g})`)}${S}`)}),console.log();let a=c.getLastUsedModel(),s=o.model,l=s?i.find(p=>p.id===s||p.name.includes(s)):await k(i,a);l||v("No model selected"),await f.loadModel(l.id,t);let m=l.id.replace("/","--"),P=e==="opencode"?"OpenCode":"Claude Code";C(h.bold(`
7
- Model ready: ${l.name}`)),console.log(),console.log(`Run ${P} with this model:`),console.log(` ${h.cyan((e==="opencode"?"opencode":"claude")+" --model lmstudio/"+m)}`),console.log(),console.log(`Or use ${h.cyan("aix-cli run")} to start an interactive session`)}x();F();q();M();import T from"ora";import re from"chalk";async function G(o={}){let e=o.provider??c.getDefaultProvider(),t=T({text:`Checking ${e==="opencode"?"OpenCode":"Claude Code"} installation...`,color:"cyan"}).start();(e==="opencode"?await z.isOpenCodeInstalled():await R.isClaudeCodeInstalled())||(t.fail(`${e==="opencode"?"OpenCode":"Claude Code"} is not installed.`),v(`Please install ${e==="opencode"?"OpenCode":"Claude Code"} first.`)),t.succeed(`${e==="opencode"?"OpenCode":"Claude Code"} is installed`);let r=T({text:"Checking LM Studio status...",color:"cyan"}).start(),i=await f.checkStatus();i||(r.info("LM Studio server not running"),r.stop(),await I("Would you like to start the LM Studio server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await f.startServer(),i=!0),r.succeed("Connected to LM Studio");let a=T({text:"Fetching available models...",color:"cyan"}).start(),s=await f.getAvailableModels();s.length===0&&(a.fail("No models found. Download some models in LM Studio first."),v("No models available")),a.stop();let l;if(o.model){let g=s.find(S=>S.id===o.model||S.name.toLowerCase().includes(o.model.toLowerCase()));g||v(`Model "${o.model}" not found. Available models: ${s.map(S=>S.name).join(", ")}`),l=g.id}else{let g=c.getLastUsedModel();l=(await k(s,g)).id}let m=T({text:`Loading model: ${re.cyan(l)}`,color:"cyan"}).start();await f.loadModel(l,m);let p=`lmstudio/${l.replace("/","--")}`,w=e==="opencode"?"OpenCode":"Claude Code";C(re.green(`
8
- Starting ${w} with model: ${p}
9
- `));try{e==="opencode"?await z.run({model:p,args:o.args??[],verbose:o.verbose}):await R.run({model:p,args:o.args??[],verbose:o.verbose})}catch(g){v(`Failed to run ${w}: ${g instanceof Error?g.message:"Unknown error"}`)}}x();import d from"chalk";async function K(){let o=await f.getStatus();console.log(),console.log(d.bold("LM Studio Status")),console.log(d.dim("\u2500".repeat(50))),console.log(` ${o.running?d.green("\u25CF"):d.red("\u25CB")} Server: ${o.running?d.green("Running"):d.red("Stopped")}`),console.log(` ${d.dim("\u25B8")} Port: ${d.cyan(String(o.port))}`),console.log(` ${d.dim("\u25B8")} URL: ${d.cyan(`http://localhost:${o.port}`)}`),o.activeModel?console.log(` ${d.dim("\u25B8")} Active Model: ${d.green(o.activeModel)}`):console.log(` ${d.dim("\u25B8")} Active Model: ${d.dim("None")}`),console.log(),console.log(d.bold("Models")),console.log(d.dim("\u2500".repeat(50))),o.models.length===0?console.log(` ${d.dim("No models available")}`):o.models.forEach((e,t)=>{let n=D(e.size),r=e.id===o.activeModel?` ${d.green("[LOADED]")}`:"";console.log(` ${d.dim(String(t+1)+".")} ${e.name}${r}`),console.log(` ${d.dim("ID:")} ${e.id}`),console.log(` ${d.dim("Size:")} ${n}`),e.quantization&&console.log(` ${d.dim("Quantization:")} ${e.quantization}`),console.log()})}var y=new ce;y.name("aix-cli").description("AI CLI tool that integrates LM Studio with Claude Code or OpenCode for local AI-powered development").version("2.0.0").showHelpAfterError();function $(o=0){console.log(),console.log(u.dim(o===0?"\u{1F44B} Goodbye!":"\u274C Cancelled.")),process.exit(o)}process.on("SIGINT",()=>$(0));process.on("SIGTERM",()=>$(0));process.on("uncaughtException",o=>{o.message?.includes("ExitPromptError")||o.message?.includes("User force closed")||o.message?.includes("prompt")?$(0):(console.error(u.red("Error:"),o.message),process.exit(1))});process.on("unhandledRejection",o=>{let e=String(o);(e.includes("ExitPromptError")||e.includes("User force closed")||e.includes("prompt"))&&$(0)});y.command("init",{isDefault:!1}).aliases(["i","load"]).description("Initialize and load a model into LM Studio").option("-m, --model <name>","Model name or ID to load","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").action(j);y.command("run",{isDefault:!1}).aliases(["r"]).description("Run Claude Code or OpenCode with a model from LM Studio").option("-m, --model <name>","Model name or ID to use","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").option("-v, --verbose","Show verbose output").argument("[args...]","Additional arguments for the provider").action(async(o,e)=>{await G({...e,args:o})});y.command("status",{isDefault:!1}).aliases(["s","stats"]).description("Show LM Studio status and available models").action(K);y.command("doctor",{isDefault:!1}).aliases(["d","check"]).description("Check system requirements and configuration").action(async()=>{let{lmStudioService:o}=await Promise.resolve().then(()=>(x(),Q)),{claudeService:e}=await Promise.resolve().then(()=>(F(),oe)),{openCodeService:t}=await Promise.resolve().then(()=>(q(),te)),{configService:n}=await Promise.resolve().then(()=>(M(),W));console.log(u.bold.cyan("\u{1F527} AIX CLI System Check")),console.log(u.dim("\u2500".repeat(40)));let r=await o.checkStatus(),i=await e.isClaudeCodeInstalled(),a=await t.isOpenCodeInstalled(),s=n.getDefaultProvider(),l=n.get("lmStudioPort");console.log(),console.log(`${r?"\u2705":"\u26A0\uFE0F"} LM Studio: ${r?u.green("Running"):u.yellow("Not running")}`),console.log(`${i?"\u2705":"\u274C"} Claude Code: ${i?u.green("Installed"):u.red("Not installed")}`),console.log(`${a?"\u2705":"\u274C"} OpenCode: ${a?u.green("Installed"):u.red("Not installed")}`),console.log(`\u{1F310} Server: ${u.cyan(`http://localhost:${l}`)}`),console.log(`\u{1F4CC} Default provider: ${u.cyan(s)}`),(!i||!a||!r)&&(console.log(),console.log(u.bold("\u{1F4CB} Next Steps:")),i||console.log(` 1. ${u.cyan("npm install -g @anthropic-ai/claude-code")}`),a||console.log(` 2. ${u.cyan("npm install -g opencode")}`),r||console.log(" 3. Open LM Studio and start the server")),console.log(),console.log(u.dim("\u{1F4D6} Docs: ")+u.cyan("https://lmstudio.ai"))});y.parse();
6
+ `));try{await q("opencode",s,{stdio:"inherit",env:{...process.env,OPENCODE_MODEL_NAME:a,OPENCODE_MODEL_PROVIDER:i}})}catch(l){if(l instanceof Error&&"exitCode"in l){let p=l.exitCode;process.exit(p??1)}throw l}}extractProvider(e){return e.split("/")[0]}extractModelName(e){let o=e.split("/");if(!(o.length<2))return o.slice(1).join("/")}async getVersion(){try{return(await q("opencode",["--version"])).stdout}catch{return}}},$=new z});import{Command as me}from"commander";import u from"chalk";C();M();import Z from"ora";import h from"chalk";import ee from"inquirer";import X from"inquirer";async function D(t,e){let o=t.map(i=>({name:`${i.name} (${i.id})`,value:i,short:i.name})),n=e?o.findIndex(i=>i.value.id===e):0;return(await X.prompt([{type:"list",name:"model",message:"Select a model to load:",choices:o,default:Math.max(0,n),pageSize:Math.min(t.length,15)}])).model}async function N(t,e=!0){return(await X.prompt([{type:"confirm",name:"confirm",message:t,default:e}])).confirm}import Y from"chalk";function R(t){if(t===0)return"0 B";let e=1024,o=["B","KB","MB","GB","TB"],n=Math.floor(Math.log(t)/Math.log(e));return`${parseFloat((t/Math.pow(e,n)).toFixed(2))} ${o[n]}`}function x(t){console.log(Y.green("\u2713")+" "+t)}function ae(t){console.error(Y.red("\u2717")+" "+t)}function v(t,e=1){ae(t),process.exit(e)}async function j(t={}){let e;if(t.provider)e=t.provider;else{let f=c.getDefaultProvider(),{providerSelection:w}=await ee.prompt([{type:"list",name:"providerSelection",message:"Select provider:",default:f,choices:[{name:"Claude Code",value:"claude"},{name:"OpenCode",value:"opencode"}]}]);e=w;let{saveDefault:m}=await ee.prompt([{type:"confirm",name:"saveDefault",message:"Save as default provider?",default:!1}]);m&&(c.setDefaultProvider(e),x(`Default provider set to ${h.cyan(e)}`))}let o=Z({text:"Checking LM Studio status...",color:"cyan"}).start(),n=await g.checkStatus();n||(o.info("LM Studio server not running"),o.stop(),await N("Would you like to start the LM Studio server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await g.startServer(),n=!0),o.succeed("Connected to LM Studio");let r=Z({text:"Fetching available models...",color:"cyan"}).start(),i=await g.getAvailableModels();i.length===0&&(r.fail("No models found. Download some models in LM Studio first."),v("No models available")),r.succeed(`Found ${h.bold(i.length)} model${i.length===1?"":"s"}`),console.log(),console.log(h.bold("Available Models:")),console.log(h.dim("\u2500".repeat(process.stdout.columns||80))),i.forEach((f,w)=>{let m=R(f.size),S=f.loaded?h.green(" [LOADED]"):"";console.log(` ${h.dim(String(w+1).padStart(2))}. ${f.name} ${h.dim(`(${m})`)}${S}`)}),console.log();let a=c.getLastUsedModel(),s=t.model,l=s?i.find(f=>f.id===s||f.name.includes(s)):await D(i,a);l||v("No model selected"),await g.loadModel(l.id,o);let p=l.id.replace("/","--"),L=e==="opencode"?"OpenCode":"Claude Code";x(h.bold(`
7
+ Model ready: ${l.name}`)),console.log(),console.log(`Run ${L} with this model:`),console.log(` ${h.cyan((e==="opencode"?"opencode":"claude")+" --model lmstudio/"+p)}`),console.log(),console.log(`Or use ${h.cyan("aix-cli run")} to start an interactive session`)}C();F();B();M();import T from"ora";import re from"chalk";import ce from"inquirer";async function ue(){let t=await P.isClaudeCodeInstalled(),e=await $.isOpenCodeInstalled(),o=[];if(t&&o.push({name:"Claude Code",value:"claude"}),e&&o.push({name:"OpenCode",value:"opencode"}),o.length===0&&v("Neither Claude Code nor OpenCode is installed."),o.length===1)return o[0].value;let{providerSelection:n}=await ce.prompt([{type:"list",name:"providerSelection",message:"Select coding tool:",choices:o}]);return n}async function G(t={}){let e;if(t.provider)e=t.provider;else{let m=c.getDefaultProvider();m?e=m:e=await ue()}let o=T({text:`Checking ${e==="opencode"?"OpenCode":"Claude Code"} installation...`,color:"cyan"}).start();(e==="opencode"?await $.isOpenCodeInstalled():await P.isClaudeCodeInstalled())||(o.fail(`${e==="opencode"?"OpenCode":"Claude Code"} is not installed.`),v(`Please install ${e==="opencode"?"OpenCode":"Claude Code"} first.`)),o.succeed(`${e==="opencode"?"OpenCode":"Claude Code"} is installed`);let r=T({text:"Checking LM Studio status...",color:"cyan"}).start(),i=await g.checkStatus();i||(r.info("LM Studio server not running"),r.stop(),await N("Would you like to start the LM Studio server?")||v("LM Studio server must be running. Start it manually or use the Server tab in LM Studio."),await g.startServer(),i=!0),r.succeed("Connected to LM Studio");let a=T({text:"Fetching available models...",color:"cyan"}).start(),s=await g.getAvailableModels();s.length===0&&(a.fail("No models found. Download some models in LM Studio first."),v("No models available")),a.stop();let l;if(t.model){let m=s.find(S=>S.id===t.model||S.name.toLowerCase().includes(t.model.toLowerCase()));m||v(`Model "${t.model}" not found. Available models: ${s.map(S=>S.name).join(", ")}`),l=m.id}else{let m=c.getLastUsedModel();l=(await D(s,m)).id}let p=T({text:`Loading model: ${re.cyan(l)}`,color:"cyan"}).start();await g.loadModel(l,p);let f=`lmstudio/${l.replace("/","--")}`,w=e==="opencode"?"OpenCode":"Claude Code";x(re.green(`
8
+ Starting ${w} with model: ${f}
9
+ `));try{e==="opencode"?await $.run({model:f,args:t.args??[],verbose:t.verbose}):await P.run({model:f,args:t.args??[],verbose:t.verbose})}catch(m){v(`Failed to run ${w}: ${m instanceof Error?m.message:"Unknown error"}`)}}C();import d from"chalk";async function K(){let t=await g.getStatus();console.log(),console.log(d.bold("LM Studio Status")),console.log(d.dim("\u2500".repeat(50))),console.log(` ${t.running?d.green("\u25CF"):d.red("\u25CB")} Server: ${t.running?d.green("Running"):d.red("Stopped")}`),console.log(` ${d.dim("\u25B8")} Port: ${d.cyan(String(t.port))}`),console.log(` ${d.dim("\u25B8")} URL: ${d.cyan(`http://localhost:${t.port}`)}`),t.activeModel?console.log(` ${d.dim("\u25B8")} Active Model: ${d.green(t.activeModel)}`):console.log(` ${d.dim("\u25B8")} Active Model: ${d.dim("None")}`),console.log(),console.log(d.bold("Models")),console.log(d.dim("\u2500".repeat(50))),t.models.length===0?console.log(` ${d.dim("No models available")}`):t.models.forEach((e,o)=>{let n=R(e.size),r=e.id===t.activeModel?` ${d.green("[LOADED]")}`:"";console.log(` ${d.dim(String(o+1)+".")} ${e.name}${r}`),console.log(` ${d.dim("ID:")} ${e.id}`),console.log(` ${d.dim("Size:")} ${n}`),e.quantization&&console.log(` ${d.dim("Quantization:")} ${e.quantization}`),console.log()})}var y=new me;y.name("aix-cli").description("AI CLI tool that integrates LM Studio with Claude Code or OpenCode for local AI-powered development").version("2.0.0").showHelpAfterError();function b(t=0){console.log(),console.log(u.dim(t===0?"\u{1F44B} Goodbye!":"\u274C Cancelled.")),process.exit(t)}process.on("SIGINT",()=>b(0));process.on("SIGTERM",()=>b(0));process.on("uncaughtException",t=>{t.message?.includes("ExitPromptError")||t.message?.includes("User force closed")||t.message?.includes("prompt")?b(0):(console.error(u.red("Error:"),t.message),process.exit(1))});process.on("unhandledRejection",t=>{let e=String(t);(e.includes("ExitPromptError")||e.includes("User force closed")||e.includes("prompt"))&&b(0)});y.command("init",{isDefault:!1}).aliases(["i","load"]).description("Initialize and load a model into LM Studio").option("-m, --model <name>","Model name or ID to load","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").action(j);y.command("run",{isDefault:!1}).aliases(["r"]).description("Run Claude Code or OpenCode with a model from LM Studio").option("-m, --model <name>","Model name or ID to use","").option("-p, --provider <provider>","Provider to use (claude or opencode)","").option("-v, --verbose","Show verbose output").argument("[args...]","Additional arguments for the provider").action(async(t,e)=>{await G({...e,args:t})});y.command("status",{isDefault:!1}).aliases(["s","stats"]).description("Show LM Studio status and available models").action(K);y.command("doctor",{isDefault:!1}).aliases(["d","check"]).description("Check system requirements and configuration").action(async()=>{let{lmStudioService:t}=await Promise.resolve().then(()=>(C(),Q)),{claudeService:e}=await Promise.resolve().then(()=>(F(),oe)),{openCodeService:o}=await Promise.resolve().then(()=>(B(),te)),{configService:n}=await Promise.resolve().then(()=>(M(),W));console.log(u.bold.cyan("\u{1F527} AIX CLI System Check")),console.log(u.dim("\u2500".repeat(40)));let r=await t.checkStatus(),i=await e.isClaudeCodeInstalled(),a=await o.isOpenCodeInstalled(),s=n.getDefaultProvider(),l=n.get("lmStudioPort");console.log(),console.log(`${r?"\u2705":"\u26A0\uFE0F"} LM Studio: ${r?u.green("Running"):u.yellow("Not running")}`),console.log(`${i?"\u2705":"\u274C"} Claude Code: ${i?u.green("Installed"):u.red("Not installed")}`),console.log(`${a?"\u2705":"\u274C"} OpenCode: ${a?u.green("Installed"):u.red("Not installed")}`),console.log(`\u{1F310} Server: ${u.cyan(`http://localhost:${l}`)}`),console.log(`\u{1F4CC} Default provider: ${u.cyan(s)}`),(!i||!a||!r)&&(console.log(),console.log(u.bold("\u{1F4CB} Next Steps:")),i||console.log(` 1. ${u.cyan("npm install -g @anthropic-ai/claude-code")}`),a||console.log(` 2. ${u.cyan("npm install -g opencode")}`),r||console.log(" 3. Open LM Studio and start the server")),console.log(),console.log(u.dim("\u{1F4D6} Docs: ")+u.cyan("https://lmstudio.ai"))});y.parse();
10
10
  //# sourceMappingURL=aix.js.map