codemaxxing 0.4.15 → 0.4.17

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -191,10 +191,35 @@ Conversations auto-save to SQLite. Pick up where you left off:
  - `/session delete` — remove a session
  - `/resume` — interactive session picker
 
+ ### 🔌 MCP Support (Model Context Protocol)
+ Connect to external tools via the industry-standard MCP protocol. Databases, GitHub, Slack, browsers — anything with an MCP server.
+ - Compatible with `.cursor/mcp.json` and `opencode.json` configs
+ - `/mcp` — show connected servers
+ - `/mcp add github npx -y @modelcontextprotocol/server-github` — add a server
+ - `/mcp tools` — list all available MCP tools
+
+ ### 🖥️ Zero-Setup Local LLM
+ First time with no LLM? Codemaxxing walks you through it:
+ 1. Detects your hardware (CPU, RAM, GPU)
+ 2. Recommends coding models that fit your machine
+ 3. Installs Ollama automatically
+ 4. Downloads the model with a progress bar
+ 5. Connects and drops you into coding mode
+
+ No googling, no config files, no decisions. Just run `codemaxxing`.
+
+ ### 🦙 Ollama Management
+ Full Ollama control from inside codemaxxing:
+ - `/ollama` — status, installed models, GPU usage
+ - `/ollama pull` — interactive model picker + download
+ - `/ollama delete` — pick and remove models
+ - `/ollama start` / `/ollama stop` — server management
+ - Exit warning when Ollama is using GPU memory
+
  ### 🔄 Multi-Provider
- Switch models mid-session without restarting:
- - `/model gpt-4o` — switch to a different model
- - `/models` — list available models from your provider
+ Switch models mid-session with an interactive picker:
+ - `/model` — browse and switch models
+ - `/model gpt-4o` — switch directly by name
  - Native Anthropic API support (not just OpenAI-compatible)
 
  ### 🎨 14 Themes
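The `.cursor/mcp.json` compatibility claimed above follows the common MCP client convention: an `mcpServers` map keyed by server name, each entry giving a launch command and its arguments. A hypothetical config matching the `/mcp add github` example (a sketch of the conventional shape; any fields beyond `mcpServers`/`command`/`args` are assumptions):

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}
```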
@@ -216,11 +241,15 @@ Type `/` for autocomplete suggestions. Arrow keys to navigate, Tab or Enter to s
  | `/help` | Show all commands |
  | `/connect` | Retry LLM connection |
  | `/login` | Interactive auth setup |
+ | `/model` | Browse & switch models (picker) |
  | `/architect` | Toggle architect mode / set model |
  | `/skills` | Browse, install, manage skills |
  | `/lint on/off` | Toggle auto-linting |
- | `/model <name>` | Switch model mid-session |
- | `/models` | List available models |
+ | `/mcp` | MCP server status & tools |
+ | `/ollama` | Ollama status, models & GPU |
+ | `/ollama pull` | Download a model (picker) |
+ | `/ollama delete` | Remove a model (picker) |
+ | `/ollama start/stop` | Server management |
  | `/theme` | Switch color theme |
  | `/map` | Show repository map |
  | `/sessions` | List past sessions |
@@ -295,14 +324,16 @@ Settings are stored in `~/.codemaxxing/settings.json`:
 
  ## Tools
 
- Codemaxxing gives the model these tools:
+ Built-in tools:
 
  - **read_file** — Read file contents (safe)
- - **write_file** — Write/create files (requires approval)
+ - **write_file** — Write/create files (requires approval, shows diff)
  - **list_files** — List directory contents (safe)
  - **search_files** — Search for patterns across files (safe)
  - **run_command** — Execute shell commands (requires approval)
 
+ Plus any tools from connected MCP servers (databases, APIs, GitHub, etc.)
+
  ## Project Context
 
  Drop a `CODEMAXXING.md` file in your project root to give the model extra context about your codebase, conventions, or instructions. It's automatically included in the system prompt.
@@ -311,8 +342,10 @@ Drop a `CODEMAXXING.md` file in your project root to give the model extra contex
 
  - **Runtime:** Node.js + TypeScript
  - **TUI:** [Ink](https://github.com/vadimdemedes/ink) (React for the terminal)
- - **LLM SDK:** [OpenAI SDK](https://github.com/openai/openai-node) (works with any compatible API)
+ - **LLM SDKs:** [OpenAI SDK](https://github.com/openai/openai-node) + [Anthropic SDK](https://github.com/anthropics/anthropic-sdk-typescript)
+ - **MCP:** [@modelcontextprotocol/sdk](https://github.com/modelcontextprotocol/typescript-sdk)
  - **Sessions:** [better-sqlite3](https://github.com/WiseLibs/better-sqlite3)
+ - **Local LLM:** Ollama integration (auto-install, pull, manage)
  - **Zero cloud dependencies** — everything runs locally
 
  ## Inspired By
package/dist/config.d.ts CHANGED
@@ -41,6 +41,17 @@ export declare function getConfigPath(): string;
  /**
   * Auto-detect local LLM servers
   */
+ export type DetectionResult = {
+     status: "connected";
+     provider: ProviderConfig;
+ } | {
+     status: "no-models";
+     serverName: string;
+     baseUrl: string;
+ } | {
+     status: "no-server";
+ };
+ export declare function detectLocalProviderDetailed(): Promise<DetectionResult>;
  export declare function detectLocalProvider(): Promise<ProviderConfig | null>;
  /**
   * List available models from a provider endpoint
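The three-way `DetectionResult` union is built for exhaustive narrowing on its `status` discriminant. A minimal, self-contained sketch of how a caller can branch on it (the types are reproduced locally so the snippet stands alone; `describe` is illustrative, not a package export):

```typescript
// Local copies of the declared types, so this sketch is self-contained.
type ProviderConfig = { baseUrl: string; apiKey: string; model: string };

type DetectionResult =
    | { status: "connected"; provider: ProviderConfig }
    | { status: "no-models"; serverName: string; baseUrl: string }
    | { status: "no-server" };

// Switching on the discriminant narrows each branch to its variant,
// so variant-specific fields are accessible without casts.
function describe(result: DetectionResult): string {
    switch (result.status) {
        case "connected":
            return `Connected to ${result.provider.baseUrl} → ${result.provider.model}`;
        case "no-models":
            return `${result.serverName} is running but has no models`;
        case "no-server":
            return "No local LLM server found";
    }
}
```

These three branches mirror the three startup messages the TUI prints during local-server detection.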
package/dist/config.js CHANGED
@@ -149,9 +149,49 @@ export function applyOverrides(config, args) {
  export function getConfigPath() {
      return CONFIG_FILE;
  }
- /**
-  * Auto-detect local LLM servers
-  */
+ export async function detectLocalProviderDetailed() {
+     const endpoints = [
+         { name: "LM Studio", url: "http://localhost:1234/v1" },
+         { name: "Ollama", url: "http://localhost:11434/v1" },
+         { name: "vLLM", url: "http://localhost:8000/v1" },
+     ];
+     let serverFound = null;
+     for (const endpoint of endpoints) {
+         try {
+             const controller = new AbortController();
+             const timeout = setTimeout(() => controller.abort(), 2000);
+             const res = await fetch(`${endpoint.url}/models`, {
+                 signal: controller.signal,
+             });
+             clearTimeout(timeout);
+             if (res.ok) {
+                 const data = (await res.json());
+                 const models = data.data ?? [];
+                 if (models.length === 0) {
+                     // Server is up but no models — remember it but keep looking
+                     if (!serverFound)
+                         serverFound = endpoint;
+                     continue;
+                 }
+                 return {
+                     status: "connected",
+                     provider: {
+                         baseUrl: endpoint.url,
+                         apiKey: "not-needed",
+                         model: models[0].id,
+                     },
+                 };
+             }
+         }
+         catch {
+             // Server not running, try next
+         }
+     }
+     if (serverFound) {
+         return { status: "no-models", serverName: serverFound.name, baseUrl: serverFound.url };
+     }
+     return { status: "no-server" };
+ }
  export async function detectLocalProvider() {
      const endpoints = [
          { name: "LM Studio", url: "http://localhost:1234/v1" },
@@ -169,7 +209,11 @@ export async function detectLocalProvider() {
              if (res.ok) {
                  const data = (await res.json());
                  const models = data.data ?? [];
-                 const model = models[0]?.id ?? "auto";
+                 if (models.length === 0) {
+                     // Server is up but no models available — don't fake a connection
+                     continue;
+                 }
+                 const model = models[0].id;
                  return {
                      baseUrl: endpoint.url,
                      apiKey: "not-needed",
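Both detection functions probe each endpoint with the same AbortController idiom: start a 2-second timer, abort the fetch if it fires, clear the timer once the request settles. Generalized as a helper (a hypothetical sketch; codemaxxing inlines the pattern rather than exporting such a function):

```typescript
// Run any signal-aware async operation with a hard timeout.
// Hypothetical helper, shown only to isolate the pattern above.
async function withTimeout<T>(
    run: (signal: AbortSignal) => Promise<T>,
    ms: number,
): Promise<T> {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), ms);
    try {
        return await run(controller.signal);
    } finally {
        clearTimeout(timer); // always clear, even when run() throws
    }
}
```

The probe above is then `await withTimeout((signal) => fetch(url, { signal }), 2000)`, and an unreachable server rejects within the timeout instead of hanging.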
package/dist/index.js CHANGED
@@ -5,7 +5,7 @@ import { render, Box, Text, useInput, useApp, useStdout } from "ink";
  import { EventEmitter } from "events";
  import TextInput from "ink-text-input";
  import { CodingAgent } from "./agent.js";
- import { loadConfig, saveConfig, detectLocalProvider, parseCLIArgs, applyOverrides, listModels } from "./config.js";
+ import { loadConfig, saveConfig, detectLocalProvider, detectLocalProviderDetailed, parseCLIArgs, applyOverrides, listModels } from "./config.js";
  import { listSessions, getSession, loadMessages, deleteSession } from "./utils/sessions.js";
  import { execSync } from "child_process";
  import { isGitRepo, getBranch, getStatus, getDiff, undoLastCommit } from "./utils/git.js";
@@ -156,6 +156,8 @@ function App() {
      const [ollamaDeletePickerIndex, setOllamaDeletePickerIndex] = useState(0);
      const [ollamaPullPicker, setOllamaPullPicker] = useState(false);
      const [ollamaPullPickerIndex, setOllamaPullPickerIndex] = useState(0);
+     const [modelPicker, setModelPicker] = useState(null);
+     const [modelPickerIndex, setModelPickerIndex] = useState(0);
      const [wizardScreen, setWizardScreen] = useState(null);
      const [wizardIndex, setWizardIndex] = useState(0);
      const [wizardHardware, setWizardHardware] = useState(null);
@@ -218,15 +220,21 @@ function App() {
      if (provider.model === "auto" || (provider.baseUrl === "http://localhost:1234/v1" && !cliArgs.baseUrl)) {
          info.push("Detecting local LLM server...");
          setConnectionInfo([...info]);
-         const detected = await detectLocalProvider();
-         if (detected) {
+         const detection = await detectLocalProviderDetailed();
+         if (detection.status === "connected") {
              // Keep CLI model override if specified
              if (cliArgs.model)
-                 detected.model = cliArgs.model;
-             provider = detected;
+                 detection.provider.model = cliArgs.model;
+             provider = detection.provider;
              info.push(`✔ Connected to ${provider.baseUrl} → ${provider.model}`);
              setConnectionInfo([...info]);
          }
+         else if (detection.status === "no-models") {
+             info.push(`⚠ ${detection.serverName} is running but has no models. Use /ollama pull to download one.`);
+             setConnectionInfo([...info]);
+             setReady(true);
+             return;
+         }
          else {
              info.push("✗ No local LLM server found.");
              setConnectionInfo([...info]);
@@ -901,10 +909,42 @@ function App() {
          }
          return;
      }
-     if (trimmed.startsWith("/model")) {
-         const newModel = trimmed.replace("/model", "").trim();
+     if (trimmed === "/model") {
+         // Show picker of available models
+         addMsg("info", "Fetching available models...");
+         try {
+             const ollamaModels = await listInstalledModelsDetailed();
+             if (ollamaModels.length > 0) {
+                 setModelPicker(ollamaModels.map(m => m.name));
+                 setModelPickerIndex(0);
+                 return;
+             }
+         }
+         catch (err) {
+             // Ollama not available or failed, try provider
+         }
+         // Fallback: try provider's model list
+         if (providerRef.current?.baseUrl && providerRef.current.baseUrl !== "auto") {
+             try {
+                 const providerModels = await listModels(providerRef.current.baseUrl, providerRef.current.apiKey || "");
+                 if (providerModels.length > 0) {
+                     setModelPicker(providerModels);
+                     setModelPickerIndex(0);
+                     return;
+                 }
+             }
+             catch (err) {
+                 // Provider fetch failed
+             }
+         }
+         // No models found anywhere
+         addMsg("error", "No models available. Download one with /ollama pull or configure a provider.");
+         return;
+     }
+     if (trimmed.startsWith("/model ")) {
+         const newModel = trimmed.replace("/model ", "").trim();
          if (!newModel) {
-             addMsg("info", `Current model: ${agent.getModel()}\n Usage: /model <model-name>`);
+             addMsg("info", `Current model: ${modelName}\n Usage: /model <model-name>`);
              return;
          }
          agent.switchModel(newModel);
@@ -1370,6 +1410,33 @@ function App() {
      }
      return;
  }
+ // ── Model picker ──
+ if (modelPicker) {
+     if (key.upArrow) {
+         setModelPickerIndex((prev) => (prev - 1 + modelPicker.length) % modelPicker.length);
+         return;
+     }
+     if (key.downArrow) {
+         setModelPickerIndex((prev) => (prev + 1) % modelPicker.length);
+         return;
+     }
+     if (key.escape) {
+         setModelPicker(null);
+         return;
+     }
+     if (key.return) {
+         const selected = modelPicker[modelPickerIndex];
+         if (selected && agent) {
+             agent.switchModel(selected);
+             setModelName(selected);
+             addMsg("info", `✅ Switched to: ${selected}`);
+             refreshConnectionBanner();
+         }
+         setModelPicker(null);
+         return;
+     }
+     return;
+ }
  // ── Ollama delete picker ──
  if (ollamaDeletePicker) {
      if (key.upArrow) {
@@ -1960,7 +2027,7 @@ function App() {
  })(), skillsPicker === "remove" && (() => {
  const installed = listInstalledSkills();
  return (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.error, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.error, children: "Remove a skill:" }), installed.map((s, i) => (_jsxs(Text, { children: [i === skillsPickerIndex ? _jsx(Text, { color: theme.colors.suggestion, bold: true, children: "▸ " }) : _jsx(Text, { children: " " }), _jsxs(Text, { color: i === skillsPickerIndex ? theme.colors.suggestion : theme.colors.muted, children: [s.name, " \u2014 ", s.description] })] }, s.name))), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter remove · Esc back" })] }));
- })(), themePicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.border, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.secondary, children: "Choose a theme:" }), listThemes().map((key, i) => (_jsxs(Text, { children: [i === themePickerIndex ? _jsx(Text, { color: theme.colors.suggestion, bold: true, children: "▸ " }) : _jsx(Text, { children: " " }), _jsx(Text, { color: i === themePickerIndex ? theme.colors.suggestion : theme.colors.primary, bold: true, children: key }), _jsxs(Text, { color: theme.colors.muted, children: [" — ", THEMES[key].description] }), key === theme.name.toLowerCase() ? _jsx(Text, { color: theme.colors.muted, children: " (current)" }) : null] }, key))), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter select · Esc cancel" })] })), sessionPicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.secondary, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.secondary, children: "Resume a session:" }), sessionPicker.map((s, i) => (_jsxs(Text, { children: [i === sessionPickerIndex ? _jsx(Text, { color: theme.colors.suggestion, bold: true, children: "▸ " }) : _jsx(Text, { children: " " }), _jsx(Text, { color: i === sessionPickerIndex ? theme.colors.suggestion : theme.colors.muted, children: s.display })] }, s.id))), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter select · Esc cancel" })] })), deleteSessionPicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.error, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.error, children: "Delete a session:" }), deleteSessionPicker.map((s, i) => (_jsxs(Text, { children: [i === deleteSessionPickerIndex ? 
_jsx(Text, { color: theme.colors.suggestion, bold: true, children: "▸ " }) : _jsx(Text, { children: " " }), _jsx(Text, { color: i === deleteSessionPickerIndex ? theme.colors.suggestion : theme.colors.muted, children: s.display })] }, s.id))), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter select · Esc cancel" })] })), deleteSessionConfirm && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.warning, paddingX: 1, marginBottom: 0, children: [_jsxs(Text, { bold: true, color: theme.colors.warning, children: ["Delete session ", deleteSessionConfirm.id, "?"] }), _jsxs(Text, { color: theme.colors.muted, children: [" ", deleteSessionConfirm.display] }), _jsxs(Text, { children: [_jsx(Text, { color: theme.colors.error, bold: true, children: " [y]" }), _jsx(Text, { children: "es " }), _jsx(Text, { color: theme.colors.success, bold: true, children: "[n]" }), _jsx(Text, { children: "o" })] })] })), ollamaDeletePicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.border, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.secondary, children: "Delete which model?" }), _jsx(Text, { children: "" }), ollamaDeletePicker.models.map((m, i) => (_jsxs(Text, { children: [" ", i === ollamaDeletePickerIndex ? _jsx(Text, { color: theme.colors.primary, bold: true, children: "▸ " }) : " ", _jsx(Text, { color: i === ollamaDeletePickerIndex ? 
theme.colors.primary : undefined, children: m.name }), _jsxs(Text, { color: theme.colors.muted, children: [" (", (m.size / (1024 * 1024 * 1024)).toFixed(1), " GB)"] })] }, m.name))), _jsx(Text, { children: "" }), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter to delete · Esc cancel" })] })), ollamaPullPicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.border, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.secondary, children: "Download which model?" }), _jsx(Text, { children: "" }), [
+ })(), themePicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.border, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.secondary, children: "Choose a theme:" }), listThemes().map((key, i) => (_jsxs(Text, { children: [i === themePickerIndex ? _jsx(Text, { color: theme.colors.suggestion, bold: true, children: "▸ " }) : _jsx(Text, { children: " " }), _jsx(Text, { color: i === themePickerIndex ? theme.colors.suggestion : theme.colors.primary, bold: true, children: key }), _jsxs(Text, { color: theme.colors.muted, children: [" — ", THEMES[key].description] }), key === theme.name.toLowerCase() ? _jsx(Text, { color: theme.colors.muted, children: " (current)" }) : null] }, key))), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter select · Esc cancel" })] })), sessionPicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.secondary, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.secondary, children: "Resume a session:" }), sessionPicker.map((s, i) => (_jsxs(Text, { children: [i === sessionPickerIndex ? _jsx(Text, { color: theme.colors.suggestion, bold: true, children: "▸ " }) : _jsx(Text, { children: " " }), _jsx(Text, { color: i === sessionPickerIndex ? theme.colors.suggestion : theme.colors.muted, children: s.display })] }, s.id))), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter select · Esc cancel" })] })), deleteSessionPicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.error, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.error, children: "Delete a session:" }), deleteSessionPicker.map((s, i) => (_jsxs(Text, { children: [i === deleteSessionPickerIndex ? 
_jsx(Text, { color: theme.colors.suggestion, bold: true, children: "▸ " }) : _jsx(Text, { children: " " }), _jsx(Text, { color: i === deleteSessionPickerIndex ? theme.colors.suggestion : theme.colors.muted, children: s.display })] }, s.id))), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter select · Esc cancel" })] })), deleteSessionConfirm && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.warning, paddingX: 1, marginBottom: 0, children: [_jsxs(Text, { bold: true, color: theme.colors.warning, children: ["Delete session ", deleteSessionConfirm.id, "?"] }), _jsxs(Text, { color: theme.colors.muted, children: [" ", deleteSessionConfirm.display] }), _jsxs(Text, { children: [_jsx(Text, { color: theme.colors.error, bold: true, children: " [y]" }), _jsx(Text, { children: "es " }), _jsx(Text, { color: theme.colors.success, bold: true, children: "[n]" }), _jsx(Text, { children: "o" })] })] })), modelPicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.border, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.secondary, children: "Switch model:" }), _jsx(Text, { children: "" }), modelPicker.map((m, i) => (_jsxs(Text, { children: [" ", i === modelPickerIndex ? _jsx(Text, { color: theme.colors.primary, bold: true, children: "▸ " }) : " ", _jsx(Text, { color: i === modelPickerIndex ? theme.colors.primary : undefined, children: m }), m === modelName ? _jsx(Text, { color: theme.colors.success, children: " (active)" }) : null] }, m))), _jsx(Text, { children: "" }), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter to switch · Esc cancel" })] })), ollamaDeletePicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.border, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.secondary, children: "Delete which model?" 
}), _jsx(Text, { children: "" }), ollamaDeletePicker.models.map((m, i) => (_jsxs(Text, { children: [" ", i === ollamaDeletePickerIndex ? _jsx(Text, { color: theme.colors.primary, bold: true, children: "▸ " }) : " ", _jsx(Text, { color: i === ollamaDeletePickerIndex ? theme.colors.primary : undefined, children: m.name }), _jsxs(Text, { color: theme.colors.muted, children: [" (", (m.size / (1024 * 1024 * 1024)).toFixed(1), " GB)"] })] }, m.name))), _jsx(Text, { children: "" }), _jsx(Text, { dimColor: true, children: " ↑↓ navigate · Enter to delete · Esc cancel" })] })), ollamaPullPicker && (_jsxs(Box, { flexDirection: "column", borderStyle: "single", borderColor: theme.colors.border, paddingX: 1, marginBottom: 0, children: [_jsx(Text, { bold: true, color: theme.colors.secondary, children: "Download which model?" }), _jsx(Text, { children: "" }), [
  { id: "qwen2.5-coder:7b", name: "Qwen 2.5 Coder 7B", size: "5 GB", desc: "Best balance of speed & quality" },
  { id: "qwen2.5-coder:14b", name: "Qwen 2.5 Coder 14B", size: "9 GB", desc: "Higher quality, needs 16GB+ RAM" },
  { id: "qwen2.5-coder:3b", name: "Qwen 2.5 Coder 3B", size: "2 GB", desc: "\u26A0\uFE0F Basic \u2014 may struggle with tool calls" },
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
      "name": "codemaxxing",
-     "version": "0.4.15",
+     "version": "0.4.17",
      "description": "Open-source terminal coding agent. Connect any LLM. Max your code.",
      "main": "dist/index.js",
      "bin": {
package/src/config.ts CHANGED
@@ -193,6 +193,57 @@ export function getConfigPath(): string {
  /**
   * Auto-detect local LLM servers
   */
+ export type DetectionResult =
+     | { status: "connected"; provider: ProviderConfig }
+     | { status: "no-models"; serverName: string; baseUrl: string }
+     | { status: "no-server" };
+
+ export async function detectLocalProviderDetailed(): Promise<DetectionResult> {
+     const endpoints = [
+         { name: "LM Studio", url: "http://localhost:1234/v1" },
+         { name: "Ollama", url: "http://localhost:11434/v1" },
+         { name: "vLLM", url: "http://localhost:8000/v1" },
+     ];
+
+     let serverFound: { name: string; url: string } | null = null;
+
+     for (const endpoint of endpoints) {
+         try {
+             const controller = new AbortController();
+             const timeout = setTimeout(() => controller.abort(), 2000);
+             const res = await fetch(`${endpoint.url}/models`, {
+                 signal: controller.signal,
+             });
+             clearTimeout(timeout);
+
+             if (res.ok) {
+                 const data = (await res.json()) as { data?: Array<{ id: string }> };
+                 const models = data.data ?? [];
+                 if (models.length === 0) {
+                     // Server is up but no models — remember it but keep looking
+                     if (!serverFound) serverFound = endpoint;
+                     continue;
+                 }
+                 return {
+                     status: "connected",
+                     provider: {
+                         baseUrl: endpoint.url,
+                         apiKey: "not-needed",
+                         model: models[0]!.id,
+                     },
+                 };
+             }
+         } catch {
+             // Server not running, try next
+         }
+     }
+
+     if (serverFound) {
+         return { status: "no-models", serverName: serverFound.name, baseUrl: serverFound.url };
+     }
+     return { status: "no-server" };
+ }
+
  export async function detectLocalProvider(): Promise<ProviderConfig | null> {
      const endpoints = [
          { name: "LM Studio", url: "http://localhost:1234/v1" },
@@ -212,7 +263,11 @@ export async function detectLocalProvider(): Promise<ProviderConfig | null> {
          if (res.ok) {
              const data = (await res.json()) as { data?: Array<{ id: string }> };
              const models = data.data ?? [];
-             const model = models[0]?.id ?? "auto";
+             if (models.length === 0) {
+                 // Server is up but no models available — don't fake a connection
+                 continue;
+             }
+             const model = models[0]!.id;
              return {
                  baseUrl: endpoint.url,
                  apiKey: "not-needed",
package/src/index.tsx CHANGED
@@ -5,7 +5,7 @@ import { render, Box, Text, useInput, useApp, useStdout } from "ink";
  import { EventEmitter } from "events";
  import TextInput from "ink-text-input";
  import { CodingAgent } from "./agent.js";
- import { loadConfig, saveConfig, detectLocalProvider, parseCLIArgs, applyOverrides, listModels } from "./config.js";
+ import { loadConfig, saveConfig, detectLocalProvider, detectLocalProviderDetailed, parseCLIArgs, applyOverrides, listModels } from "./config.js";
  import { listSessions, getSession, loadMessages, deleteSession } from "./utils/sessions.js";
  import { execSync } from "child_process";
  import { isGitRepo, getBranch, getStatus, getDiff, undoLastCommit } from "./utils/git.js";
@@ -191,6 +191,8 @@ function App() {
      const [ollamaDeletePickerIndex, setOllamaDeletePickerIndex] = useState(0);
      const [ollamaPullPicker, setOllamaPullPicker] = useState(false);
      const [ollamaPullPickerIndex, setOllamaPullPickerIndex] = useState(0);
+     const [modelPicker, setModelPicker] = useState<string[] | null>(null);
+     const [modelPickerIndex, setModelPickerIndex] = useState(0);
 
      // ── Setup Wizard State ──
      type WizardScreen = "connection" | "models" | "install-ollama" | "pulling" | null;
@@ -262,13 +264,18 @@ function App() {
      if (provider.model === "auto" || (provider.baseUrl === "http://localhost:1234/v1" && !cliArgs.baseUrl)) {
          info.push("Detecting local LLM server...");
          setConnectionInfo([...info]);
-         const detected = await detectLocalProvider();
-         if (detected) {
+         const detection = await detectLocalProviderDetailed();
+         if (detection.status === "connected") {
              // Keep CLI model override if specified
-             if (cliArgs.model) detected.model = cliArgs.model;
-             provider = detected;
+             if (cliArgs.model) detection.provider.model = cliArgs.model;
+             provider = detection.provider;
              info.push(`✔ Connected to ${provider.baseUrl} → ${provider.model}`);
              setConnectionInfo([...info]);
+         } else if (detection.status === "no-models") {
+             info.push(`⚠ ${detection.serverName} is running but has no models. Use /ollama pull to download one.`);
+             setConnectionInfo([...info]);
+             setReady(true);
+             return;
          } else {
              info.push("✗ No local LLM server found.");
              setConnectionInfo([...info]);
@@ -939,10 +946,42 @@ function App() {
          }
          return;
      }
-     if (trimmed.startsWith("/model")) {
-         const newModel = trimmed.replace("/model", "").trim();
+     if (trimmed === "/model") {
+         // Show picker of available models
+         addMsg("info", "Fetching available models...");
+         try {
+             const ollamaModels = await listInstalledModelsDetailed();
+             if (ollamaModels.length > 0) {
+                 setModelPicker(ollamaModels.map(m => m.name));
+                 setModelPickerIndex(0);
+                 return;
+             }
+         } catch (err) {
+             // Ollama not available or failed, try provider
+         }
+
+         // Fallback: try provider's model list
+         if (providerRef.current?.baseUrl && providerRef.current.baseUrl !== "auto") {
+             try {
+                 const providerModels = await listModels(providerRef.current.baseUrl, providerRef.current.apiKey || "");
+                 if (providerModels.length > 0) {
+                     setModelPicker(providerModels);
+                     setModelPickerIndex(0);
+                     return;
+                 }
+             } catch (err) {
+                 // Provider fetch failed
+             }
+         }
+
+         // No models found anywhere
+         addMsg("error", "No models available. Download one with /ollama pull or configure a provider.");
+         return;
+     }
+     if (trimmed.startsWith("/model ")) {
+         const newModel = trimmed.replace("/model ", "").trim();
          if (!newModel) {
-             addMsg("info", `Current model: ${agent.getModel()}\n Usage: /model <model-name>`);
+             addMsg("info", `Current model: ${modelName}\n Usage: /model <model-name>`);
              return;
          }
          agent.switchModel(newModel);
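The bare `/model` handler above is a two-stage fallback: try Ollama's installed models, then the provider's `/models` endpoint, and error only when both come up empty. The chain generalizes to a small helper (hypothetical; the package inlines this logic rather than defining such a function):

```typescript
// Try each async source in order; return the first non-empty list,
// or null when every source fails or is empty. Illustrative only.
async function firstNonEmpty<T>(
    sources: Array<() => Promise<T[]>>,
): Promise<T[] | null> {
    for (const source of sources) {
        try {
            const items = await source();
            if (items.length > 0) return items;
        } catch {
            // Source unavailable: fall through to the next one.
        }
    }
    return null;
}
```

With sources `[listOllamaNames, listProviderNames]` this reproduces the handler's behavior, including swallowing a failed Ollama probe before consulting the provider.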
@@ -1390,6 +1429,34 @@ function App() {
      return;
  }
 
+ // ── Model picker ──
+ if (modelPicker) {
+     if (key.upArrow) {
+         setModelPickerIndex((prev) => (prev - 1 + modelPicker.length) % modelPicker.length);
+         return;
+     }
+     if (key.downArrow) {
+         setModelPickerIndex((prev) => (prev + 1) % modelPicker.length);
+         return;
+     }
+     if (key.escape) {
+         setModelPicker(null);
+         return;
+     }
+     if (key.return) {
+         const selected = modelPicker[modelPickerIndex];
+         if (selected && agent) {
+             agent.switchModel(selected);
+             setModelName(selected);
+             addMsg("info", `✅ Switched to: ${selected}`);
+             refreshConnectionBanner();
+         }
+         setModelPicker(null);
+         return;
+     }
+     return;
+ }
+
  // ── Ollama delete picker ──
  if (ollamaDeletePicker) {
      if (key.upArrow) {
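The `(prev - 1 + modelPicker.length) % modelPicker.length` expression above implements wraparound navigation: adding the length before taking the remainder keeps the decrement non-negative, which matters because JavaScript's `%` yields a negative result for a negative left operand. In isolation:

```typescript
// Wraparound index math used by all the pickers, extracted for illustration.
const wrapPrev = (i: number, length: number): number => (i - 1 + length) % length;
const wrapNext = (i: number, length: number): number => (i + 1) % length;

// From the first item, ↑ jumps to the last; from the last, ↓ jumps to the first.
console.log(wrapPrev(0, 5)); // → 4
console.log(wrapNext(4, 5)); // → 0
```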
@@ -2212,6 +2279,23 @@ function App() {
              </Box>
          )}
 
+         {/* ═══ MODEL PICKER ═══ */}
+         {modelPicker && (
+             <Box flexDirection="column" borderStyle="single" borderColor={theme.colors.border} paddingX={1} marginBottom={0}>
+                 <Text bold color={theme.colors.secondary}>Switch model:</Text>
+                 <Text>{""}</Text>
+                 {modelPicker.map((m, i) => (
+                     <Text key={m}>
+                         {" "}{i === modelPickerIndex ? <Text color={theme.colors.primary} bold>{"▸ "}</Text> : " "}
+                         <Text color={i === modelPickerIndex ? theme.colors.primary : undefined}>{m}</Text>
+                         {m === modelName ? <Text color={theme.colors.success}>{" (active)"}</Text> : null}
+                     </Text>
+                 ))}
+                 <Text>{""}</Text>
+                 <Text dimColor>{" ↑↓ navigate · Enter to switch · Esc cancel"}</Text>
+             </Box>
+         )}
+
          {/* ═══ OLLAMA DELETE PICKER ═══ */}
          {ollamaDeletePicker && (
              <Box flexDirection="column" borderStyle="single" borderColor={theme.colors.border} paddingX={1} marginBottom={0}>