navada-edge-cli 3.0.2 → 3.2.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/Dockerfile +24 -2
- package/README.md +25 -9
- package/docker-compose.yml +10 -4
- package/lib/agent.js +120 -16
- package/lib/cli.js +9 -0
- package/lib/commands/ai.js +34 -19
- package/lib/commands/index.js +13 -17
- package/lib/commands/nvidia.js +241 -0
- package/lib/commands/sandbox.js +323 -0
- package/lib/commands/system.js +11 -2
- package/lib/config.js +18 -3
- package/lib/serve.js +20 -7
- package/lib/ui.js +11 -5
- package/package.json +4 -4
package/Dockerfile
CHANGED

@@ -1,3 +1,25 @@
 FROM node:22-slim
-
-
+
+LABEL maintainer="Leslie Akpareva <leeakpareva@hotmail.com>"
+LABEL org.opencontainers.image.title="navada-edge-cli"
+LABEL org.opencontainers.image.description="NAVADA Edge CLI — interactive terminal agent"
+
+# Install Python for python tools
+RUN apt-get update && apt-get install -y --no-install-recommends python3 python3-pip && \
+    rm -rf /var/lib/apt/lists/*
+
+WORKDIR /app
+
+# Copy package files and install
+COPY package.json package-lock.json ./
+RUN npm ci --production
+
+# Copy source
+COPY bin/ bin/
+COPY lib/ lib/
+COPY LICENSE README.md ./
+
+# Config directory
+RUN mkdir -p /root/.navada
+
+ENTRYPOINT ["node", "bin/navada.js"]
package/README.md
CHANGED

@@ -10,34 +10,50 @@ npm install -g navada-edge-cli
 
 **NAVADA Edge CLI** is a production-grade terminal tool that gives developers and infrastructure teams an AI agent with full system access, connected to a distributed computing network.
 
+### Install
+
+```bash
+npm install -g navada-edge-cli
+navada
+```
+
+Or run with Docker:
+
+```bash
+docker run -it navada-edge-cli:3.0.2
+```
+
 **The problem:** Managing distributed infrastructure across multiple cloud providers, on-prem servers, and services requires jumping between terminals, dashboards, and APIs. AI assistants answer questions but can't execute.
 
-**The solution:** One CLI that talks naturally, executes commands locally and remotely, and unifies access to your entire infrastructure through a conversational AI agent with
+**The solution:** One CLI that talks naturally, executes commands locally and remotely, and unifies access to your entire infrastructure through a conversational AI agent with tool use, streaming, and smart model routing.
 
 **Key differentiators:**
+- **Free tier included** — Grok AI with 30 RPM, no API key needed, just install and go
+- **Multi-provider AI** — Anthropic (Claude Sonnet 4), OpenAI (GPT-4o/mini), Grok, Qwen Coder — all with streaming
+- **Smart routing** — auto-picks the best model: code queries to Qwen, complex to Claude, general to Grok
 - **Conversational AI agent** — type naturally, the agent uses tools to execute (not just answer)
-- **Full computer access** — shell, files, processes on your local machine
+- **Full computer access** — shell, files, processes, Python execution on your local machine
 - **Distributed network** — 4 physical nodes connected via Tailscale VPN, managed from one terminal
 - **Two sub-agents** — Lucas CTO (infrastructure) and Claude CoS (communications + automation)
 - **Cloud-native** — Cloudflare (R2, Flux AI, Stream, DNS), Azure (n8n), private Docker registry
-- **
-- **
+- **58 slash commands** — direct access when you need precision
+- **3 learning modes** — `/learn python`, `/learn csharp`, `/learn node` — interactive tutor mode
+- **Split-panel TUI** — session info, token usage, rate limits, provider status
 - **4 themes** — dark, crow (achromatic), matrix, light
-- **
+- **Docker-first** — runs as a container with `restart: always`, ships with Dockerfile
 - **Cross-platform** — Windows, macOS, Linux
-- **
-- **Zero config to start** — install, login with any API key, start talking
+- **Zero config to start** — install and start talking, free tier works immediately
 
 **Who it's for:** DevOps engineers, infrastructure teams, AI developers, and anyone who manages distributed systems and wants an AI copilot in their terminal.
 
-**Pricing:** The CLI is free and open source (MIT).
+**Pricing:** The CLI is free and open source (MIT). Free tier (Grok) works out of the box. Add your own API key for full agent mode (Anthropic, OpenAI, or HuggingFace).
 
 ```
 ╭─────────────────────────────────────────────────────────╮
 │ ███╗ ██╗ █████╗ ██╗ ██╗ █████╗ ██████╗ █████╗ │
 │ ██╔██╗ ██║███████║██║ ██║███████║██║ ██║███████║ │
 │ ██║ ╚████║██║ ██║ ╚████╔╝ ██║ ██║██████╔╝██║ ██║ │
-│ E D G E N E T W O R K
+│ E D G E N E T W O R K v3.0.2 │
 ╰─────────────────────────────────────────────────────────╯
 ```
 
package/docker-compose.yml
CHANGED

@@ -1,10 +1,16 @@
 services:
   cli:
     build: .
-    image: navada-edge-cli:2.0
+    image: navada-edge-cli:3.2.0
     container_name: navada-edge-cli
-
+    restart: always
     stdin_open: true
     tty: true
-
--
+    volumes:
+      - cli-config:/root/.navada
+    environment:
+      - NODE_ENV=production
+      - TERM=xterm-256color
+
+volumes:
+  cli-config:
package/lib/agent.js
CHANGED

@@ -1,6 +1,6 @@
 'use strict';
 
-const { execSync } = require('child_process');
+const { execSync, execFileSync } = require('child_process');
 const fs = require('fs');
 const path = require('path');
 const os = require('os');

@@ -17,15 +17,51 @@ const config = require('./config');
 const IDENTITY = {
   name: 'NAVADA Edge',
   role: 'AI Infrastructure Agent',
-  personality: `You are NAVADA Edge — an AI agent that operates inside the user's terminal.
+  personality: `You are NAVADA Edge — an AI agent built by Lee Akpareva (Principal AI Consultant, MBA, MA) that operates inside the user's terminal.
 You are professional, technical, concise, and helpful. You speak with authority about distributed systems, Docker, AI, and cloud infrastructure.
-You have
-
--
--
-
-
+You have FULL ACCESS to the user's computer — you CAN and SHOULD use your tools to execute tasks:
+- shell: run ANY bash, PowerShell, or system command on the user's machine
+- read_file / write_file / list_files: full filesystem access — create, read, modify any file
+- python_exec / python_pip / python_script: run Python code directly
+- sandbox_run: run code with syntax-highlighted output
+- system_info: check CPU, RAM, disk, OS
+You also connect to the NAVADA Edge Network (4 nodes via Tailscale VPN):
+- lucas_exec / lucas_ssh / lucas_docker: execute commands on remote nodes (EC2, HP, Oracle)
+- mcp_call: access 18 MCP tools on the ASUS server
+- docker_registry: manage the private Docker registry
+- send_email / generate_image: communications and AI image generation
+- founder_info: information about Lee Akpareva, the creator of NAVADA
+When users ask you to DO something — DO IT. Use write_file to create files. Use shell to run commands. Never say "I can't" when you have a tool for it.
 Keep responses short. Code blocks when needed. No fluff.`,
+  founder: {
+    name: 'Leslie (Lee) Akpareva',
+    title: 'Principal AI Consultant & Founder, NAVADA Edge Network',
+    qualifications: 'MBA (Business Administration), MA (International Relations), EF8 Alumni',
+    experience: '17+ years in enterprise IT, insurance, and AI infrastructure',
+    current_role: 'AI Project Lead at Generali UK (insurance)',
+    expertise: [
+      'AI/ML infrastructure and deployment',
+      'Distributed systems and edge computing',
+      'Docker containerisation and orchestration',
+      'Cloud architecture (AWS, Azure, Oracle, Cloudflare)',
+      'MCP (Model Context Protocol) server development',
+      'Full-stack development (Node.js, Python, React, Next.js)',
+      'Enterprise IT transformation and automation',
+    ],
+    projects: [
+      'NAVADA Edge Network — distributed AI home server (4 nodes, 25+ containers)',
+      'NAVADA Edge SDK + CLI — npm packages for network access',
+      'WorldMonitor — OSINT intelligence dashboard',
+      'Lucas CTO — autonomous infrastructure agent',
+      'OpenCode — open source coding assistant',
+      'NAVADA Robotics — robotics and hardware',
+      'Raven Terminal — security terminal',
+    ],
+    contact: {
+      github: 'github.com/leeakpareva',
+      website: 'navada-lab.space',
+    },
+  },
 };
 
 function getSystemPrompt() {

@@ -50,8 +86,30 @@ const sessionState = {
   messages: 0,
   startTime: Date.now(),
   learningMode: null, // 'python' | 'csharp' | 'node' | null
+  history: [], // conversation history for context continuity
 };
 
+// Conversation history management
+function addToHistory(role, content) {
+  sessionState.history.push({ role, content });
+  // Keep last 40 turns to avoid token overflow
+  if (sessionState.history.length > 40) {
+    sessionState.history = sessionState.history.slice(-40);
+  }
+  sessionState.messages++;
+}
+
+function getConversationHistory() {
+  return sessionState.history;
+}
+
+function clearHistory() {
+  sessionState.history = [];
+  sessionState.messages = 0;
+  sessionState.tokens = { input: 0, output: 0, total: 0 };
+  sessionState.cost = 0;
+}
+
 // ---------------------------------------------------------------------------
 // Rate limit tracking (in-memory, per session)
 // ---------------------------------------------------------------------------
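The history-management code above keeps a rolling window of the last 40 entries. As a sanity check, here is a standalone sketch of the same pattern, with the surrounding sessionState fields simplified to just what the buffer needs:

```javascript
// Minimal sketch of the rolling conversation buffer added in agent.js.
// Assumption: each turn is a { role, content } pair, capped at 40 entries.
const sessionState = { history: [], messages: 0 };

function addToHistory(role, content) {
  sessionState.history.push({ role, content });
  // Drop the oldest entries once the cap is exceeded
  if (sessionState.history.length > 40) {
    sessionState.history = sessionState.history.slice(-40);
  }
  sessionState.messages++;
}

// Simulate 25 user/assistant exchanges (50 entries total)
for (let i = 0; i < 25; i++) {
  addToHistory('user', `question ${i}`);
  addToHistory('assistant', `answer ${i}`);
}

console.log(sessionState.history.length);    // capped at 40
console.log(sessionState.history[0].content); // oldest surviving entry
```

Note that the message counter keeps growing past the cap, so the TUI can still report total turns even after old entries are trimmed.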
@@ -154,7 +212,8 @@ const localTools = {
     execute: (code) => {
       try {
         const py = process.platform === 'win32' ? 'python' : 'python3';
-
+        // Safe: use execFileSync with -c flag — no shell interpolation
+        const output = execFileSync(py, ['-c', code], {
           timeout: 30000, encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'],
         });
         return output.trim();

@@ -168,8 +227,12 @@ const localTools = {
     description: 'Install a Python package',
     execute: (pkg) => {
       try {
+        // Validate package name — alphanumeric, hyphens, underscores, dots, brackets, version specifiers
+        if (!/^[a-zA-Z0-9._\-\[\],>=<! ]+$/.test(pkg)) {
+          return 'Error: Invalid package name';
+        }
         const py = process.platform === 'win32' ? 'python' : 'python3';
-        const output =
+        const output = execFileSync(py, ['-m', 'pip', 'install', ...pkg.split(/\s+/)], {
           timeout: 60000, encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'],
         });
         return output.trim().split('\n').slice(-3).join('\n');
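The package-name guard added above can be checked in isolation. A quick exercise of the same allow-list regex against typical inputs (the test strings are illustrative, not from the package):

```javascript
// The allow-list regex from the python_pip guard above:
// letters, digits, dots, underscores, hyphens, brackets, version specifiers, spaces.
const PKG_RE = /^[a-zA-Z0-9._\-\[\],>=<! ]+$/;

console.log(PKG_RE.test('requests'));         // true
console.log(PKG_RE.test('numpy>=1.26,<2.0')); // true
console.log(PKG_RE.test('uvicorn[standard]'));// true
console.log(PKG_RE.test('foo; rm -rf /'));    // false, ';' and '/' rejected
console.log(PKG_RE.test('pkg && curl evil')); // false, '&' rejected
```

Combined with `execFileSync` (which never invokes a shell), the regex is defense in depth rather than the only barrier against command injection.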
@@ -183,10 +246,12 @@ const localTools = {
     description: 'Run a Python script file',
     execute: (scriptPath) => {
       try {
+        const resolved = path.resolve(scriptPath);
+        if (!fs.existsSync(resolved)) return `Error: File not found: ${resolved}`;
         const py = process.platform === 'win32' ? 'python' : 'python3';
-        const output =
+        const output = execFileSync(py, [resolved], {
           timeout: 60000, encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'],
-          cwd: path.dirname(
+          cwd: path.dirname(resolved),
         });
         return output.trim();
       } catch (e) {

@@ -194,6 +259,11 @@ const localTools = {
       }
     },
   },
+
+  founderInfo: {
+    description: 'Get information about the NAVADA Edge founder',
+    execute: () => JSON.stringify(IDENTITY.founder, null, 2),
+  },
 };
 
 // ---------------------------------------------------------------------------

@@ -201,7 +271,6 @@ const localTools = {
 // ---------------------------------------------------------------------------
 const FREE_TIER_ENDPOINTS = [
   'https://api.navada-edge-server.uk/api/v1/chat', // Cloudflare tunnel (public, works for all)
-  'http://100.88.118.128:7900/api/v1/chat', // Direct Tailscale (VPN users only)
 ];
 
 async function callFreeTier(messages, stream = false) {

@@ -592,11 +661,13 @@ function detectIntent(message) {
 async function chat(userMessage, conversationHistory = []) {
   const anthropicKey = config.get('anthropicKey') || process.env.ANTHROPIC_API_KEY || '';
   const openaiKey = config.get('openaiKey') || process.env.OPENAI_API_KEY || '';
+  const nvidiaKey = config.get('nvidiaKey') || process.env.NVIDIA_API_KEY || '';
   const apiKey = config.getApiKey() || '';
 
   // Determine which provider to use
   const effectiveAnthropicKey = anthropicKey || (apiKey.startsWith('sk-ant') ? apiKey : '');
   const effectiveOpenAIKey = openaiKey || (apiKey.startsWith('sk-') && !apiKey.startsWith('sk-ant') ? apiKey : '');
+  const effectiveNvidiaKey = nvidiaKey || (apiKey.startsWith('nvapi-') ? apiKey : '');
 
   const modelPref = config.getModel();
   const intent = detectIntent(userMessage);

@@ -604,10 +675,11 @@ async function chat(userMessage, conversationHistory = []) {
   // Track active provider for UI
   if (effectiveAnthropicKey) sessionState.provider = 'Anthropic';
   else if (effectiveOpenAIKey) sessionState.provider = 'OpenAI';
+  else if (effectiveNvidiaKey) sessionState.provider = 'NVIDIA';
   else sessionState.provider = 'Grok (free)';
 
   // No personal key — use free tier
-  if (!effectiveAnthropicKey && !effectiveOpenAIKey) {
+  if (!effectiveAnthropicKey && !effectiveOpenAIKey && !effectiveNvidiaKey) {
     if (intent === 'code' && navada.config.hfToken) {
       try {
         const r = await navada.ai.huggingface.qwen(userMessage);

@@ -617,6 +689,21 @@ async function chat(userMessage, conversationHistory = []) {
     return grokChat(userMessage, conversationHistory);
   }
 
+  // NVIDIA key — route to NVIDIA
+  if (effectiveNvidiaKey && (!effectiveAnthropicKey || modelPref?.startsWith('nvidia') || modelPref?.startsWith('llama') || modelPref?.startsWith('deepseek') || modelPref?.startsWith('mistral') || modelPref?.startsWith('gemma') || modelPref?.startsWith('nemotron'))) {
+    const { streamNvidia } = require('./commands/nvidia');
+    const nvidiaModel = config.get('nvidiaModel') || 'llama-3.3-70b';
+    sessionState.provider = 'NVIDIA';
+    sessionState.model = nvidiaModel;
+    const messages = [
+      ...conversationHistory.map(m => ({ role: m.role, content: typeof m.content === 'string' ? m.content : JSON.stringify(m.content) })),
+      { role: 'user', content: userMessage },
+    ];
+    process.stdout.write(ui.dim(' NAVADA > '));
+    const result = await streamNvidia(effectiveNvidiaKey, messages, nvidiaModel);
+    return result.content;
+  }
+
   // OpenAI key — route to OpenAI
   if (effectiveOpenAIKey && (!effectiveAnthropicKey || modelPref === 'gpt-4o' || modelPref === 'gpt-4o-mini')) {
     return openAIChat(effectiveOpenAIKey, userMessage, conversationHistory);

@@ -711,6 +798,16 @@ async function chat(userMessage, conversationHistory = []) {
       description: 'Run a Python script file.',
       input_schema: { type: 'object', properties: { path: { type: 'string', description: 'Path to .py file' } }, required: ['path'] },
     },
+    {
+      name: 'sandbox_run',
+      description: 'Run code in an isolated sandbox with syntax highlighting. Supports javascript, python, typescript. Returns colored output.',
+      input_schema: { type: 'object', properties: { code: { type: 'string', description: 'Code to execute' }, language: { type: 'string', description: 'javascript, python, or typescript' } }, required: ['code'] },
+    },
+    {
+      name: 'founder_info',
+      description: 'Get information about Lee Akpareva, founder of NAVADA Edge Network. Use when asked about the creator, founder, Lee, or who made NAVADA.',
+      input_schema: { type: 'object', properties: {} },
+    },
   ];
 
   const messages = [

@@ -782,6 +879,14 @@ async function executeTool(name, input) {
       case 'python_exec': return localTools.pythonExec.execute(input.code);
       case 'python_pip': return localTools.pythonPip.execute(input.package);
       case 'python_script': return localTools.pythonScript.execute(input.path);
+      case 'sandbox_run': {
+        const { runCode, displayCode, displayOutput } = require('./commands/sandbox');
+        displayCode(input.code, input.language);
+        const result = runCode(input.code, input.language);
+        displayOutput(result);
+        return result.error ? `Error (exit ${result.exitCode}): ${result.error}` : result.output;
+      }
+      case 'founder_info': return localTools.founderInfo.execute();
       default: return `Unknown tool: ${name}`;
     }
   } catch (e) {

@@ -853,7 +958,6 @@ function getUpdateInfo() { return _updateInfo; }
 async function reportTelemetry(event, data = {}) {
   // Try dashboard first, then public tunnel
   const endpoints = [
-    'http://100.88.118.128:7900',
     'https://api.navada-edge-server.uk',
   ];

@@ -880,4 +984,4 @@ async function reportTelemetry(event, data = {}) {
   }
 }
 
-module.exports = { IDENTITY, chat, localTools, reportTelemetry, fallbackChat, checkForUpdate, getUpdateInfo, rateTracker, sessionState };
+module.exports = { IDENTITY, chat, localTools, reportTelemetry, fallbackChat, checkForUpdate, getUpdateInfo, rateTracker, sessionState, addToHistory, getConversationHistory, clearHistory };
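The executeTool hunk above dispatches the model's tool-use requests through a switch over tool names. A self-contained stand-in with stubbed tools shows the shape of that dispatcher (the stubs below only echo; they are illustrative, not the package's real shell/file/python implementations):

```javascript
// Stand-in for the executeTool dispatcher in agent.js, with stubbed tools.
const localTools = {
  founderInfo: { execute: () => JSON.stringify({ name: 'Leslie (Lee) Akpareva' }) },
  pythonExec: { execute: (code) => `ran: ${code}` },
};

function executeTool(name, input) {
  try {
    switch (name) {
      case 'python_exec': return localTools.pythonExec.execute(input.code);
      case 'founder_info': return localTools.founderInfo.execute();
      // Unknown names return a string rather than throwing, so the model
      // gets feedback it can recover from
      default: return `Unknown tool: ${name}`;
    }
  } catch (e) {
    return `Tool error: ${e.message}`;
  }
}

console.log(executeTool('python_exec', { code: 'print(1)' })); // ran: print(1)
console.log(executeTool('nope', {}));                          // Unknown tool: nope
```

Returning error strings instead of throwing matches the diff's design: tool failures are fed back into the conversation as tool results rather than crashing the agent loop.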
package/lib/cli.js
CHANGED

@@ -10,6 +10,15 @@ const { execute, getCompletions } = require('./registry');
 const { loadAll } = require('./commands/index');
 const { reportTelemetry, checkForUpdate, getUpdateInfo, rateTracker } = require('./agent');
 
+// Global error safety — prevent unhandled crashes
+process.on('unhandledRejection', (err) => {
+  console.log(ui.error(`Unhandled error: ${err?.message || err}`));
+});
+process.on('uncaughtException', (err) => {
+  console.log(ui.error(`Fatal: ${err?.message || err}`));
+  process.exit(1);
+});
+
 function applyConfig() {
   const cfg = config.getAll();
   const overrides = {};
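The global handlers added above follow the standard Node.js pattern: log and continue on unhandled promise rejections, log and exit on uncaught exceptions. A minimal self-contained version, with plain console.error in place of the package's ui helpers:

```javascript
// Minimal version of the global error guards added in cli.js.
// Assumption: console.error stands in for the package's ui.error() formatting.
process.on('unhandledRejection', (err) => {
  // A rejected promise with no .catch() lands here instead of crashing
  console.error(`Unhandled error: ${err?.message || err}`);
});
process.on('uncaughtException', (err) => {
  // Synchronous throws with no try/catch land here; state may be corrupt,
  // so exit rather than continue
  console.error(`Fatal: ${err?.message || err}`);
  process.exit(1);
});

// Both handlers are now registered on the process
console.log(process.listenerCount('unhandledRejection') >= 1); // true
console.log(process.listenerCount('uncaughtException') >= 1);  // true
```

Exiting on uncaughtException but not on unhandledRejection is a deliberate asymmetry: a rejected promise usually leaves the REPL loop usable, while an uncaught throw may not.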
package/lib/commands/ai.js
CHANGED

@@ -3,13 +3,10 @@
 const navada = require('navada-edge-sdk');
 const ui = require('../ui');
 const config = require('../config');
-const { chat: agentChat, reportTelemetry, rateTracker } = require('../agent');
+const { chat: agentChat, reportTelemetry, rateTracker, addToHistory, getConversationHistory, clearHistory } = require('../agent');
 
 module.exports = function(reg) {
 
-  // Conversation history for multi-turn
-  const conversationHistory = [];
-
   reg('chat', 'Chat with NAVADA Edge AI agent', async (args) => {
     const msg = args.join(' ');
     if (!msg) { console.log(ui.dim('Just type naturally — no /command needed.')); return; }

@@ -17,24 +14,22 @@ module.exports = function(reg) {
     const hasKey = config.getApiKey() || config.get('anthropicKey') || process.env.ANTHROPIC_API_KEY;
 
     // Show a brief "thinking" indicator, then clear it when streaming starts
+    let spinner;
     if (!hasKey) {
       process.stdout.write(ui.dim(' NAVADA > '));
     } else {
       const ora = require('ora');
-
+      spinner = ora({ text: ' NAVADA thinking...', color: 'white' }).start();
     }
 
     try {
-      const response = await agentChat(msg,
+      const response = await agentChat(msg, getConversationHistory());
 
       if (spinner) spinner.stop();
 
       // Update conversation history
-
-
-
-      // Keep history manageable (last 20 turns)
-      while (conversationHistory.length > 40) conversationHistory.splice(0, 2);
+      addToHistory('user', msg);
+      addToHistory('assistant', response);
 
       // Only print if not already streamed
       if (!response._streamed) {

@@ -51,6 +46,14 @@ module.exports = function(reg) {
     }
   }, { category: 'AI', aliases: ['ask'] });
 
+  reg('clear', 'Clear conversation history and reset session', () => {
+    clearHistory();
+    console.clear();
+    const banner = require('../ui').banner;
+    console.log(banner());
+    console.log(ui.success('Conversation cleared — fresh session'));
+  }, { category: 'AI' });
+
   reg('qwen', 'Qwen Coder 32B (FREE via HuggingFace)', async (args) => {
     const prompt = args.join(' ');
     if (!prompt) { console.log(ui.dim('Usage: /qwen Write a function to validate UK postcodes')); return; }

@@ -106,24 +109,36 @@ module.exports = function(reg) {
 
   reg('model', 'Show/set default AI model', (args) => {
     if (args[0]) {
-      const valid = ['auto', 'claude', 'gpt-4o', 'gpt-4o-mini', 'qwen'];
+      const valid = ['auto', 'claude', 'gpt-4o', 'gpt-4o-mini', 'qwen', 'nvidia', 'llama-3.3-70b', 'llama-3.1-8b', 'mistral-large', 'gemma-2-27b', 'codellama-70b', 'deepseek-r1', 'phi-3-medium', 'nemotron-70b'];
       if (!valid.includes(args[0])) { console.log(ui.error(`Invalid model. Options: ${valid.join(', ')}`)); return; }
       config.setModel(args[0]);
+      // If it's an NVIDIA model name, also set it as the nvidia model
+      const nvidiaModels = ['llama-3.3-70b', 'llama-3.1-8b', 'mistral-large', 'gemma-2-27b', 'codellama-70b', 'deepseek-r1', 'phi-3-medium', 'nemotron-70b'];
+      if (nvidiaModels.includes(args[0])) config.set('nvidiaModel', args[0]);
       console.log(ui.success(`Model set to: ${args[0]}`));
     } else {
       console.log(ui.header('AI MODELS'));
       const current = config.getModel();
       console.log(ui.label('Current', current));
       console.log('');
-      console.log(ui.dim('
-      console.log(ui.label('auto', '
-      console.log(ui.label('claude', 'Claude Sonnet 4
-      console.log(ui.label('gpt-4o', '
-      console.log(ui.label('qwen', 'Qwen Coder 32B (
+      console.log(ui.dim('Core providers:'));
+      console.log(ui.label('auto', 'Smart routing — picks best provider per query'));
+      console.log(ui.label('claude', 'Claude Sonnet 4 (Anthropic) — full agent + tools'));
+      console.log(ui.label('gpt-4o', 'GPT-4o (OpenAI) — tool use + streaming'));
+      console.log(ui.label('qwen', 'Qwen Coder 32B (HuggingFace — FREE)'));
+      console.log('');
+      console.log(ui.dim('NVIDIA models (FREE via build.nvidia.com):'));
+      console.log(ui.label('llama-3.3-70b', 'Meta Llama 3.3 70B'));
+      console.log(ui.label('deepseek-r1', 'DeepSeek R1'));
+      console.log(ui.label('mistral-large', 'Mistral Large 2'));
+      console.log(ui.label('codellama-70b', 'Code Llama 70B'));
+      console.log(ui.label('gemma-2-27b', 'Google Gemma 2 27B'));
+      console.log(ui.label('nemotron-70b', 'NVIDIA Nemotron 70B'));
       console.log('');
-      console.log(ui.dim('Set
+      console.log(ui.dim('Set: /model deepseek-r1'));
+      console.log(ui.dim('NVIDIA key: /login nvapi-your-key (free at build.nvidia.com)'));
     }
-  }, { category: 'AI', subs: ['auto', 'claude', 'gpt-4o', 'gpt-4o-mini', 'qwen'] });
+  }, { category: 'AI', subs: ['auto', 'claude', 'gpt-4o', 'gpt-4o-mini', 'qwen', 'nvidia', 'llama-3.3-70b', 'deepseek-r1', 'mistral-large', 'codellama-70b', 'gemma-2-27b', 'nemotron-70b'] });
 
   reg('research', 'RAG search via MCP', async (args) => {
     const query = args.join(' ');
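The /model command above validates the name against an allow-list and mirrors NVIDIA model names into a second config key. The selection logic in isolation, with the package's config store stubbed as a plain object:

```javascript
// Sketch of the /model selection logic from ai.js, with config stubbed.
const valid = ['auto', 'claude', 'gpt-4o', 'gpt-4o-mini', 'qwen', 'nvidia',
  'llama-3.3-70b', 'llama-3.1-8b', 'mistral-large', 'gemma-2-27b',
  'codellama-70b', 'deepseek-r1', 'phi-3-medium', 'nemotron-70b'];
const nvidiaModels = ['llama-3.3-70b', 'llama-3.1-8b', 'mistral-large',
  'gemma-2-27b', 'codellama-70b', 'deepseek-r1', 'phi-3-medium', 'nemotron-70b'];

const config = {}; // stand-in for the package's config store

function setModel(name) {
  if (!valid.includes(name)) return `Invalid model. Options: ${valid.join(', ')}`;
  config.model = name;
  // NVIDIA model names are also stored under a dedicated key,
  // which the chat router reads when picking a provider
  if (nvidiaModels.includes(name)) config.nvidiaModel = name;
  return `Model set to: ${name}`;
}

console.log(setModel('deepseek-r1')); // sets both model and nvidiaModel
console.log(setModel('claude'));      // sets model only
console.log(setModel('gpt-5'));       // rejected by the allow-list
```

The two-key design lets `/model claude` switch providers without forgetting which NVIDIA model was last chosen.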
package/lib/commands/index.js
CHANGED

@@ -3,26 +3,22 @@
 const { register } = require('../registry');
 
 // Load all command modules — each exports a function(register)
-const
-
-
-
-  require('./docker'),
-  require('./database'),
-  require('./cloudflare'),
-  require('./ai'),
-  require('./azure'),
-  require('./agents'),
-  require('./tasks'),
-  require('./keys'),
-  require('./setup'),
-  require('./system'),
-  require('./learn'),
+const moduleNames = [
+  'network', 'mcp', 'lucas', 'docker', 'database', 'cloudflare',
+  'ai', 'azure', 'agents', 'tasks', 'keys', 'setup', 'system',
+  'learn', 'sandbox', 'nvidia',
 ];
 
 function loadAll() {
-  for (const
-
+  for (const name of moduleNames) {
+    try {
+      const mod = require(`./${name}`);
+      mod(register);
+    } catch (e) {
+      // Don't crash the entire CLI if one module fails to load
+      const ui = require('../ui');
+      console.log(ui.warn(`Failed to load module: ${name} — ${e.message}`));
+    }
   }
 }
 
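The loadAll refactor above keeps one broken command module from taking down the whole CLI. The same pattern, with in-memory stubs standing in for the require() calls (module names and the "boom" failure are illustrative):

```javascript
// Sketch of the fault-tolerant loader from index.js, with modules stubbed
// as plain functions instead of require() calls.
const loaded = [];
const warnings = [];

const modules = {
  ai: (register) => register('ai'),
  broken: () => { throw new Error('boom'); },  // simulates a module that fails to load
  system: (register) => register('system'),
};

function loadAll() {
  for (const name of Object.keys(modules)) {
    try {
      modules[name]((cmd) => loaded.push(cmd));
    } catch (e) {
      // One failing module is logged, not fatal; the rest still register
      warnings.push(`Failed to load module: ${name} — ${e.message}`);
    }
  }
}

loadAll();
console.log(loaded);   // [ 'ai', 'system' ]
console.log(warnings); // one warning for the broken module
```

The try/catch per iteration is what changed: the old version's top-level require() list meant a single syntax error in any command file aborted startup.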
package/lib/commands/nvidia.js
ADDED

@@ -0,0 +1,241 @@
+'use strict';
+
+const https = require('https');
+const ui = require('../ui');
+const config = require('../config');
+
+// ---------------------------------------------------------------------------
+// NVIDIA API — free tier models via build.nvidia.com
+// ---------------------------------------------------------------------------
+const NVIDIA_MODELS = {
+  'llama-3.3-70b': { id: 'meta/llama-3.3-70b-instruct', name: 'Llama 3.3 70B', free: true },
+  'llama-3.1-8b': { id: 'meta/llama-3.1-8b-instruct', name: 'Llama 3.1 8B', free: true },
+  'mistral-large': { id: 'mistralai/mistral-large-2-instruct', name: 'Mistral Large 2', free: true },
+  'gemma-2-27b': { id: 'google/gemma-2-27b-it', name: 'Gemma 2 27B', free: true },
+  'codellama-70b': { id: 'meta/codellama-70b-instruct', name: 'Code Llama 70B', free: true },
+  'deepseek-r1': { id: 'deepseek-ai/deepseek-r1', name: 'DeepSeek R1', free: true },
+  'phi-3-medium': { id: 'microsoft/phi-3-medium-128k-instruct', name: 'Phi 3 Medium 128K', free: true },
+  'nemotron-70b': { id: 'nvidia/llama-3.1-nemotron-70b-instruct', name: 'Nemotron 70B', free: true },
+};
+
+const DEFAULT_NVIDIA_MODEL = 'llama-3.3-70b';
+
+// ---------------------------------------------------------------------------
+// NVIDIA API chat (streaming)
+// ---------------------------------------------------------------------------
+function streamNvidia(apiKey, messages, modelKey = DEFAULT_NVIDIA_MODEL) {
+  const modelInfo = NVIDIA_MODELS[modelKey] || NVIDIA_MODELS[DEFAULT_NVIDIA_MODEL];
+
+  return new Promise((resolve, reject) => {
+    const body = JSON.stringify({
+      model: modelInfo.id,
+      messages: [
+        { role: 'system', content: 'You are NAVADA, an AI infrastructure agent. Keep responses concise and technical. Use code blocks with language tags when showing code.' },
+        ...messages,
+      ],
+      max_tokens: 4096,
+      stream: true,
+      temperature: 0.7,
+    });
+
+    const req = https.request('https://integrate.api.nvidia.com/v1/chat/completions', {
+      method: 'POST',
+      headers: {
+        'Authorization': `Bearer ${apiKey}`,
+        'Content-Type': 'application/json',
+        'Content-Length': Buffer.byteLength(body),
+      },
+      timeout: 120000,
+    }, (res) => {
+      if (res.statusCode !== 200) {
+        let data = '';
+        res.on('data', c => data += c);
+        res.on('end', () => {
+          if (res.statusCode === 401) reject(new Error('Invalid NVIDIA API key. Get one free at https://build.nvidia.com'));
+          else if (res.statusCode === 429) reject(new Error('NVIDIA rate limit reached. Wait a moment and try again.'));
+          else reject(new Error(`NVIDIA API error ${res.statusCode}: ${data.slice(0, 200)}`));
+        });
+        return;
+      }
+
+      let buffer = '';
+      let fullContent = '';
+
+      res.on('data', (chunk) => {
+        buffer += chunk.toString();
+        const lines = buffer.split('\n');
+        buffer = lines.pop();
+
+        for (const line of lines) {
+          if (!line.startsWith('data: ')) continue;
+          const data = line.slice(6).trim();
+          if (data === '[DONE]') continue;
+          try {
+            const event = JSON.parse(data);
+            const delta = event.choices?.[0]?.delta?.content || '';
+            if (delta) {
+              process.stdout.write(delta);
+              fullContent += delta;
+            }
+          } catch {}
+        }
+      });
+
+      res.on('end', () => {
+        if (fullContent) process.stdout.write('\n');
+        resolve({ content: fullContent, model: modelInfo.name, streamed: true });
+      });
+    });
+
+    req.on('error', reject);
+    req.on('timeout', () => { req.destroy(); reject(new Error('NVIDIA API timeout')); });
+    req.write(body);
+    req.end();
+  });
+}
+
+// Non-streaming fallback
+async function chatNvidia(apiKey, messages, modelKey = DEFAULT_NVIDIA_MODEL) {
+  const modelInfo = NVIDIA_MODELS[modelKey] || NVIDIA_MODELS[DEFAULT_NVIDIA_MODEL];
+
+  return new Promise((resolve, reject) => {
+    const body = JSON.stringify({
+      model: modelInfo.id,
+      messages: [
+        { role: 'system', content: 'You are NAVADA, an AI infrastructure agent. Keep responses concise and technical. Use code blocks with language tags when showing code.' },
+        ...messages,
+      ],
+      max_tokens: 4096,
+      temperature: 0.7,
+    });
+
+    const req = https.request('https://integrate.api.nvidia.com/v1/chat/completions', {
+      method: 'POST',
+      headers: {
+        'Authorization': `Bearer ${apiKey}`,
+        'Content-Type': 'application/json',
+        'Content-Length': Buffer.byteLength(body),
+      },
+      timeout: 120000,
+    }, (res) => {
+      let data = '';
+      res.on('data', c => data += c);
+      res.on('end', () => {
+        if (res.statusCode !== 200) {
+          reject(new Error(`NVIDIA API error ${res.statusCode}: ${data.slice(0, 200)}`));
+          return;
+        }
+        try {
+          const parsed = JSON.parse(data);
+          resolve({ content: parsed.choices?.[0]?.message?.content || '', model: modelInfo.name });
+        } catch (e) { reject(e); }
+      });
+    });
+
+    req.on('error', reject);
+    req.on('timeout', () => { req.destroy(); reject(new Error('Timeout')); });
|
|
137
|
+
req.write(body);
|
|
138
|
+
req.end();
|
|
139
|
+
});
|
|
140
|
+
}
|
|
141
|
+
|
|
142
|
+
// ---------------------------------------------------------------------------
|
|
143
|
+
// Command registration
|
|
144
|
+
// ---------------------------------------------------------------------------
|
|
145
|
+
module.exports = function(reg) {
|
|
146
|
+
|
|
147
|
+
reg('nvidia', 'NVIDIA AI models (free tier via build.nvidia.com)', async (args) => {
|
|
148
|
+
const sub = args[0];
|
|
149
|
+
|
|
150
|
+
if (!sub || sub === 'help') {
|
|
151
|
+
console.log(ui.header('NVIDIA AI'));
|
|
152
|
+
console.log(ui.dim('Free AI models via NVIDIA build.nvidia.com'));
|
|
153
|
+
console.log('');
|
|
154
|
+
console.log(ui.cmd('nvidia login <key>', 'Set your NVIDIA API key'));
|
|
155
|
+
console.log(ui.cmd('nvidia models', 'List available models'));
|
|
156
|
+
console.log(ui.cmd('nvidia model <name>', 'Set default NVIDIA model'));
|
|
157
|
+
console.log(ui.cmd('nvidia chat <msg>', 'Chat with NVIDIA model'));
|
|
158
|
+
console.log(ui.cmd('nvidia status', 'Check API connection'));
|
|
159
|
+
console.log('');
|
|
160
|
+
console.log(ui.dim('Get a free key: https://build.nvidia.com'));
|
|
161
|
+
console.log(ui.dim('Then: /nvidia login nvapi-xxxx'));
|
|
162
|
+
return;
|
|
163
|
+
}
|
|
164
|
+
|
|
165
|
+
if (sub === 'login') {
|
|
166
|
+
const key = args[1];
|
|
167
|
+
if (!key) { console.log(ui.error('Usage: /nvidia login nvapi-your-key-here')); return; }
|
|
168
|
+
if (!key.startsWith('nvapi-')) { console.log(ui.error('NVIDIA keys start with nvapi-')); return; }
|
|
169
|
+
config.set('nvidiaKey', key);
|
|
170
|
+
console.log(ui.success('NVIDIA API key saved'));
|
|
171
|
+
console.log(ui.dim('Test it: /nvidia status'));
|
|
172
|
+
return;
|
|
173
|
+
}
|
|
174
|
+
|
|
175
|
+
if (sub === 'models') {
|
|
176
|
+
console.log(ui.header('NVIDIA MODELS'));
|
|
177
|
+
const currentModel = config.get('nvidiaModel') || DEFAULT_NVIDIA_MODEL;
|
|
178
|
+
for (const [key, info] of Object.entries(NVIDIA_MODELS)) {
|
|
179
|
+
const active = key === currentModel ? ' ◄' : '';
|
|
180
|
+
console.log(ui.label(key.padEnd(18), `${info.name}${info.free ? ' (FREE)' : ''}${active}`));
|
|
181
|
+
}
|
|
182
|
+
console.log('');
|
|
183
|
+
console.log(ui.dim(`Current: ${currentModel}`));
|
|
184
|
+
console.log(ui.dim('Set: /nvidia model deepseek-r1'));
|
|
185
|
+
return;
|
|
186
|
+
}
|
|
187
|
+
|
|
188
|
+
if (sub === 'model') {
|
|
189
|
+
const model = args[1];
|
|
190
|
+
if (!model) { console.log(ui.error('Usage: /nvidia model llama-3.3-70b')); return; }
|
|
191
|
+
if (!NVIDIA_MODELS[model]) {
|
|
192
|
+
console.log(ui.error(`Unknown model: ${model}`));
|
|
193
|
+
console.log(ui.dim(`Available: ${Object.keys(NVIDIA_MODELS).join(', ')}`));
|
|
194
|
+
return;
|
|
195
|
+
}
|
|
196
|
+
config.set('nvidiaModel', model);
|
|
197
|
+
console.log(ui.success(`NVIDIA model set to: ${NVIDIA_MODELS[model].name}`));
|
|
198
|
+
return;
|
|
199
|
+
}
|
|
200
|
+
|
|
201
|
+
if (sub === 'status') {
|
|
202
|
+
const key = config.get('nvidiaKey');
|
|
203
|
+
if (!key) { console.log(ui.error('No NVIDIA key set. /nvidia login nvapi-your-key')); return; }
|
|
204
|
+
console.log(ui.dim('Testing NVIDIA API...'));
|
|
205
|
+
try {
|
|
206
|
+
const result = await chatNvidia(key, [{ role: 'user', content: 'Say "NVIDIA connected" in exactly 2 words.' }]);
|
|
207
|
+
console.log(ui.success(`NVIDIA API connected — ${result.model}`));
|
|
208
|
+
console.log(ui.dim(`Response: ${result.content}`));
|
|
209
|
+
} catch (e) {
|
|
210
|
+
console.log(ui.error(e.message));
|
|
211
|
+
}
|
|
212
|
+
return;
|
|
213
|
+
}
|
|
214
|
+
|
|
215
|
+
if (sub === 'chat') {
|
|
216
|
+
const key = config.get('nvidiaKey');
|
|
217
|
+
if (!key) { console.log(ui.error('No NVIDIA key set. /nvidia login nvapi-your-key')); return; }
|
|
218
|
+
const msg = args.slice(1).join(' ');
|
|
219
|
+
if (!msg) { console.log(ui.dim('Usage: /nvidia chat explain Docker networking')); return; }
|
|
220
|
+
|
|
221
|
+
const model = config.get('nvidiaModel') || DEFAULT_NVIDIA_MODEL;
|
|
222
|
+
process.stdout.write(ui.dim(` NAVADA [${NVIDIA_MODELS[model].name}] > `));
|
|
223
|
+
try {
|
|
224
|
+
await streamNvidia(key, [{ role: 'user', content: msg }], model);
|
|
225
|
+
} catch (e) {
|
|
226
|
+
console.log(ui.error(e.message));
|
|
227
|
+
}
|
|
228
|
+
return;
|
|
229
|
+
}
|
|
230
|
+
|
|
231
|
+
console.log(ui.dim('Unknown subcommand. Try /nvidia help'));
|
|
232
|
+
|
|
233
|
+
}, { category: 'AI', aliases: ['nv'], subs: ['login', 'models', 'model', 'chat', 'status', 'help'] });
|
|
234
|
+
|
|
235
|
+
};
|
|
236
|
+
|
|
237
|
+
// Export for agent integration
|
|
238
|
+
module.exports.streamNvidia = streamNvidia;
|
|
239
|
+
module.exports.chatNvidia = chatNvidia;
|
|
240
|
+
module.exports.NVIDIA_MODELS = NVIDIA_MODELS;
|
|
241
|
+
module.exports.DEFAULT_NVIDIA_MODEL = DEFAULT_NVIDIA_MODEL;
|
|
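The streaming handler above relies on a line-buffering trick: an SSE body can split a `data:` line across TCP chunks, so the partial tail after the last newline is carried over into the next chunk. A minimal standalone sketch of that parse, under the same event shape the NVIDIA endpoint returns (the function name `makeSseParser` is illustrative, not part of the package):

```javascript
// Sketch of the SSE chunk parser used above: feed raw chunks,
// collect the `delta` strings from each `data:` event.
function makeSseParser(onDelta) {
  let buffer = '';
  return (chunk) => {
    buffer += chunk.toString();
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the partial line for the next chunk
    for (const line of lines) {
      if (!line.startsWith('data: ')) continue;
      const data = line.slice(6).trim();
      if (data === '[DONE]') continue;
      try {
        const delta = JSON.parse(data).choices?.[0]?.delta?.content || '';
        if (delta) onDelta(delta);
      } catch {} // ignore keep-alives and malformed payloads
    }
  };
}

// Two chunks that split one event mid-line still parse cleanly:
const out = [];
const feed = makeSseParser(d => out.push(d));
feed('data: {"choices":[{"delta":{"content":"Hel"}}]}\ndata: {"choi');
feed('ces":[{"delta":{"content":"lo"}}]}\ndata: [DONE]\n');
console.log(out.join('')); // → Hello
```

Without the `lines.pop()` carry-over, the second event here would be fed to `JSON.parse` in two unparseable halves and silently dropped.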
package/lib/commands/sandbox.js
ADDED
@@ -0,0 +1,323 @@
'use strict';

const { execSync } = require('child_process');
const fs = require('fs');
const path = require('path');
const os = require('os');
const chalk = require('chalk');
const ui = require('../ui');
const config = require('../config');

// ---------------------------------------------------------------------------
// Syntax highlighter — colorizes code output for the terminal
// ---------------------------------------------------------------------------
const LANG_KEYWORDS = {
  javascript: /\b(const|let|var|function|return|if|else|for|while|class|import|export|from|async|await|new|this|typeof|instanceof|try|catch|throw|switch|case|break|continue|default|yield|delete|in|of|do|void|with)\b/g,
  python: /\b(def|class|return|if|elif|else|for|while|import|from|as|try|except|finally|raise|with|yield|lambda|pass|break|continue|not|and|or|is|in|True|False|None|self|print|async|await|nonlocal|global|assert|del)\b/g,
  typescript: /\b(const|let|var|function|return|if|else|for|while|class|import|export|from|async|await|new|this|typeof|instanceof|try|catch|throw|switch|case|break|continue|default|yield|interface|type|enum|implements|extends|abstract|readonly|private|public|protected|static|override|declare|namespace|module|keyof|infer)\b/g,
  csharp: /\b(using|namespace|class|public|private|protected|static|void|int|string|bool|float|double|var|new|return|if|else|for|foreach|while|try|catch|throw|switch|case|break|continue|async|await|override|virtual|abstract|interface|struct|enum|null|true|false|this|base|readonly|const|ref|out|in|params|yield|delegate|event|lock|checked|unchecked|fixed|sizeof|typeof|is|as|where|select|from|orderby|group|join|let|into)\b/g,
  go: /\b(func|return|if|else|for|range|switch|case|break|continue|var|const|type|struct|interface|map|chan|select|defer|go|import|package|fallthrough|default|nil|true|false|make|len|cap|append|copy|delete|new|panic|recover|close|iota)\b/g,
  rust: /\b(fn|let|mut|return|if|else|for|while|loop|match|struct|enum|impl|trait|pub|use|mod|crate|self|super|async|await|move|ref|where|type|const|static|unsafe|extern|dyn|macro_rules|as|in|break|continue|true|false|None|Some|Ok|Err|Box|Vec|String|Option|Result)\b/g,
};

const STRING_RE = /(["'`])(?:(?!\1|\\).|\\.)*?\1/g;
const NUMBER_RE = /\b(\d+\.?\d*(?:e[+-]?\d+)?)\b/gi;
const COMMENT_LINE_RE = /(\/\/.*$|#(?!!).*$)/gm;
const COMMENT_BLOCK_RE = /(\/\*[\s\S]*?\*\/)/g;
const FUNC_CALL_RE = /\b([a-zA-Z_]\w*)\s*\(/g;
const DECORATOR_RE = /(@\w+)/g;

function highlight(code, lang = 'javascript') {
  const langKey = lang.toLowerCase().replace(/^(js|node)$/, 'javascript').replace(/^(ts)$/, 'typescript').replace(/^(py)$/, 'python').replace(/^(cs)$/, 'csharp').replace(/^(rs)$/, 'rust');

  // Placeholder system to avoid double-coloring
  const placeholders = [];
  const ph = (styled) => { placeholders.push(styled); return `\x00PH${placeholders.length - 1}\x00`; };

  let result = code;

  // 1. Comments first (highest priority)
  result = result.replace(COMMENT_BLOCK_RE, (m) => ph(chalk.gray.italic(m)));
  result = result.replace(COMMENT_LINE_RE, (m) => ph(chalk.gray.italic(m)));

  // 2. Strings
  result = result.replace(STRING_RE, (m) => ph(chalk.green(m)));

  // 3. Decorators
  result = result.replace(DECORATOR_RE, (m) => ph(chalk.yellow(m)));

  // 4. Function calls
  result = result.replace(FUNC_CALL_RE, (m, name) => ph(chalk.yellow(name)) + '(');

  // 5. Numbers
  result = result.replace(NUMBER_RE, (m) => ph(chalk.magenta(m)));

  // 6. Keywords
  const keywordRe = LANG_KEYWORDS[langKey] || LANG_KEYWORDS.javascript;
  result = result.replace(keywordRe, (m) => ph(chalk.cyan.bold(m)));

  // Restore placeholders
  result = result.replace(/\x00PH(\d+)\x00/g, (_, i) => placeholders[parseInt(i)]);

  return result;
}

// ---------------------------------------------------------------------------
// Sandbox — isolated code execution with colored output
// ---------------------------------------------------------------------------
const SANDBOX_DIR = path.join(os.tmpdir(), 'navada-sandbox');

function ensureSandbox() {
  if (!fs.existsSync(SANDBOX_DIR)) fs.mkdirSync(SANDBOX_DIR, { recursive: true });
}

function detectLang(code, lang) {
  if (lang) return lang.toLowerCase();
  if (/\bdef\s+\w+|import\s+\w+|print\s*\(/.test(code)) return 'python';
  if (/\bfunc\s+\w+|package\s+\w+|fmt\./.test(code)) return 'go';
  if (/\bfn\s+\w+|let\s+mut\b|use\s+std/.test(code)) return 'rust';
  if (/\busing\s+System|namespace\s+|Console\./.test(code)) return 'csharp';
  return 'javascript';
}

function runCode(code, lang) {
  ensureSandbox();
  const detected = detectLang(code, lang);

  let file, cmd;
  switch (detected) {
    case 'python':
    case 'py':
      file = path.join(SANDBOX_DIR, `run_${Date.now()}.py`);
      fs.writeFileSync(file, code);
      cmd = `${process.platform === 'win32' ? 'python' : 'python3'} "${file}"`;
      break;

    case 'javascript':
    case 'js':
    case 'node':
      file = path.join(SANDBOX_DIR, `run_${Date.now()}.js`);
      fs.writeFileSync(file, code);
      cmd = `node "${file}"`;
      break;

    case 'typescript':
    case 'ts':
      file = path.join(SANDBOX_DIR, `run_${Date.now()}.ts`);
      fs.writeFileSync(file, code);
      cmd = `npx tsx "${file}" 2>/dev/null || npx ts-node "${file}"`;
      break;

    default:
      return { lang: detected, error: `Language "${detected}" not supported for execution. Supported: javascript, python, typescript` };
  }

  try {
    const output = execSync(cmd, {
      timeout: 30000,
      encoding: 'utf-8',
      cwd: SANDBOX_DIR,
      stdio: ['pipe', 'pipe', 'pipe'],
      env: { ...process.env, NODE_NO_WARNINGS: '1' },
    });
    // Clean up
    try { fs.unlinkSync(file); } catch {}
    return { lang: detected, output: output.trim(), exitCode: 0 };
  } catch (e) {
    try { fs.unlinkSync(file); } catch {}
    return { lang: detected, output: (e.stdout || '').trim(), error: (e.stderr || e.message || '').trim(), exitCode: e.status || 1 };
  }
}

// ---------------------------------------------------------------------------
// Display — code + output with colors
// ---------------------------------------------------------------------------
function displayCode(code, lang) {
  const detected = detectLang(code, lang);
  const lines = code.split('\n');
  const gutterWidth = String(lines.length).length;

  console.log('');
  console.log(ui.dim(`─── ${detected.toUpperCase()} ${'─'.repeat(40)}`));
  console.log('');

  lines.forEach((line, i) => {
    const num = chalk.gray(String(i + 1).padStart(gutterWidth) + ' │ ');
    const colored = highlight(line, detected);
    console.log(' ' + num + colored);
  });

  console.log('');
}

function displayOutput(result) {
  if (result.error && !result.output) {
    console.log(ui.dim('─── ERROR ' + '─'.repeat(38)));
    console.log('');
    result.error.split('\n').forEach(line => {
      console.log(' ' + chalk.red(line));
    });
  } else {
    console.log(ui.dim('─── OUTPUT ' + '─'.repeat(37)));
    console.log('');
    if (result.output) {
      result.output.split('\n').forEach(line => {
        console.log(' ' + chalk.white(line));
      });
    }
    if (result.error) {
      console.log('');
      console.log(ui.dim('─── STDERR ' + '─'.repeat(37)));
      result.error.split('\n').forEach(line => {
        console.log(' ' + chalk.yellow(line));
      });
    }
  }
  console.log('');
  const status = result.exitCode === 0 ? chalk.green('✓ exit 0') : chalk.red(`✗ exit ${result.exitCode}`);
  console.log(' ' + status);
  console.log('');
}

// ---------------------------------------------------------------------------
// Command registration
// ---------------------------------------------------------------------------
module.exports = function(reg) {

  reg('sandbox', 'Run code in a sandbox with syntax highlighting', async (args) => {
    if (!args.length) {
      console.log(ui.header('CODING SANDBOX'));
      console.log(ui.dim('Run code with syntax-highlighted output'));
      console.log('');
      console.log(ui.cmd('sandbox run <lang>', 'Enter code to run (opens multi-line input)'));
      console.log(ui.cmd('sandbox exec <file>', 'Run a file in the sandbox'));
      console.log(ui.cmd('sandbox highlight <file>', 'Syntax-highlight a file (no execution)'));
      console.log(ui.cmd('sandbox demo', 'Run a demo to test colors'));
      console.log('');
      console.log(ui.dim('Languages: javascript, python, typescript'));
      console.log(ui.dim('Shorthand: /run <code> — quick-run a one-liner'));
      return;
    }

    const sub = args[0];

    if (sub === 'demo') {
      // Demo all language highlighting
      const demos = {
        javascript: `const greet = (name) => {
  // Welcome message
  const time = new Date().getHours();
  if (time < 12) return \`Good morning, \${name}!\`;
  return \`Hello, \${name}!\`;
};

console.log(greet("NAVADA"));
console.log("Version:", 3.02);`,

        python: `def fibonacci(n):
    """Generate fibonacci sequence"""
    a, b = 0, 1
    result = []
    for _ in range(n):
        result.append(a)
        a, b = b, a + b
    return result

# Print first 10 numbers
print(fibonacci(10))
print(f"Sum: {sum(fibonacci(10))}")`,
      };

      for (const [lang, code] of Object.entries(demos)) {
        displayCode(code, lang);
        const result = runCode(code, lang);
        displayOutput(result);
      }
      return;
    }

    if (sub === 'highlight' && args[1]) {
      const filePath = path.resolve(args.slice(1).join(' '));
      if (!fs.existsSync(filePath)) { console.log(ui.error(`File not found: ${filePath}`)); return; }
      const code = fs.readFileSync(filePath, 'utf-8');
      const ext = path.extname(filePath).slice(1);
      displayCode(code, ext);
      return;
    }

    if (sub === 'exec' && args[1]) {
      const filePath = path.resolve(args.slice(1).join(' '));
      if (!fs.existsSync(filePath)) { console.log(ui.error(`File not found: ${filePath}`)); return; }
      const code = fs.readFileSync(filePath, 'utf-8');
      const ext = path.extname(filePath).slice(1);
      displayCode(code, ext);
      const result = runCode(code, ext);
      displayOutput(result);
      return;
    }

    if (sub === 'run') {
      const lang = args[1] || 'javascript';
      console.log(ui.dim(`Enter ${lang} code (type END on a new line to execute):`));
      console.log('');

      // Collect multi-line input
      const readline = require('readline');
      const rl = readline.createInterface({ input: process.stdin, output: process.stdout, prompt: chalk.gray(' > ') });
      const lines = [];

      return new Promise((resolve) => {
        rl.prompt();
        rl.on('line', (line) => {
          if (line.trim() === 'END') {
            rl.close();
            const code = lines.join('\n');
            displayCode(code, lang);
            const result = runCode(code, lang);
            displayOutput(result);
            resolve();
          } else {
            lines.push(line);
            rl.prompt();
          }
        });
      });
    }

    // Quick run — rest of args is code
    const code = args.join(' ');
    const lang = detectLang(code);
    displayCode(code, lang);
    const result = runCode(code, lang);
    displayOutput(result);

  }, { category: 'CODE', aliases: ['sb'], subs: ['run', 'exec', 'highlight', 'demo'] });

  // Quick run shortcut
  reg('run', 'Quick-run code in the sandbox', async (args) => {
    if (!args.length) { console.log(ui.dim('Usage: /run console.log("hello") or /run py print("hello")')); return; }

    // Check for language prefix
    const langShorts = { js: 'javascript', py: 'python', ts: 'typescript', node: 'javascript' };
    let lang = null;
    let code = args.join(' ');

    if (langShorts[args[0]]) {
      lang = langShorts[args[0]];
      code = args.slice(1).join(' ');
    }

    const detected = detectLang(code, lang);
    displayCode(code, detected);
    const result = runCode(code, detected);
    displayOutput(result);

  }, { category: 'CODE' });

};

// Export for agent tool use
module.exports.highlight = highlight;
module.exports.runCode = runCode;
module.exports.displayCode = displayCode;
module.exports.displayOutput = displayOutput;
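The highlighter in sandbox.js avoids double-coloring (a keyword inside an already-green string) by swapping each styled span for a `\x00PH<n>\x00` placeholder and restoring all placeholders in a final pass. A stripped-down sketch of that protect-then-restore ordering, using bracket markers instead of chalk colors (the function name `tinyHighlight` and the markers are illustrative):

```javascript
// Minimal sketch of the placeholder pass: strings are protected first,
// so the later keyword pass cannot recolor their contents.
function tinyHighlight(code) {
  const placeholders = [];
  const ph = (styled) => { placeholders.push(styled); return `\x00PH${placeholders.length - 1}\x00`; };

  let result = code;
  // 1. Strings first (would be [green] in the real module)
  result = result.replace(/(["'`])(?:(?!\1|\\).|\\.)*?\1/g, (m) => ph(`[str:${m}]`));
  // 2. Keywords second — the "let" inside the string is already hidden
  result = result.replace(/\b(const|let|return)\b/g, (m) => ph(`[kw:${m}]`));
  // 3. Restore every placeholder
  return result.replace(/\x00PH(\d+)\x00/g, (_, i) => placeholders[parseInt(i, 10)]);
}

console.log(tinyHighlight('const msg = "let it be";'));
// → [kw:const] msg = [str:"let it be"];
```

Run the passes in the opposite order and the `let` inside the string literal would get a keyword marker too, which is exactly the artifact the `\x00PH` scheme prevents.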
package/lib/commands/system.js
CHANGED
|
@@ -65,10 +65,15 @@ module.exports = function(reg) {
   }, { category: 'SYSTEM' });

   // --- /login ---
-  reg('login', 'Set API key (Anthropic, OpenAI, NAVADA Edge, or HuggingFace)', (args) => {
+  reg('login', 'Set API key (Anthropic, OpenAI, NVIDIA, NAVADA Edge, or HuggingFace)', (args) => {
     if (!args[0]) {
       console.log(ui.dim('Usage: /login <api-key>'));
-      console.log(ui.dim('Accepts:
+      console.log(ui.dim('Accepts:'));
+      console.log(ui.dim('  sk-ant-...   Anthropic (full agent + tool use)'));
+      console.log(ui.dim('  sk-...       OpenAI (GPT-4o)'));
+      console.log(ui.dim('  nvapi-...    NVIDIA (Llama, Mistral, DeepSeek — FREE)'));
+      console.log(ui.dim('  nv_edge_...  NAVADA Edge (MCP + Dashboard)'));
+      console.log(ui.dim('  hf_...       HuggingFace (Qwen Coder — FREE)'));
       return;
     }
     const key = args[0].trim();
@@ -77,6 +82,10 @@ module.exports = function(reg) {
     if (key.startsWith('sk-ant')) {
       config.set('anthropicKey', key);
       console.log(ui.success('Anthropic key saved — full agent mode with tool use enabled'));
+    } else if (key.startsWith('nvapi-')) {
+      config.set('nvidiaKey', key);
+      console.log(ui.success('NVIDIA key saved — Llama, Mistral, DeepSeek, Gemma enabled'));
+      console.log(ui.dim('/nvidia models to see all available models'));
     } else if (key.startsWith('hf_')) {
       config.set('hfToken', key);
       navada.init({ hfToken: key });
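The `/login` dispatch above routes a pasted key purely by its prefix, and branch order matters: `sk-ant` must be tested before any plain `sk-` branch or every Anthropic key would be misrouted. The same first-match routing can be sketched as a table (the `routeKey` helper and the `edgeKey`/`openaiKey` config names are illustrative — the diff only shows the `anthropicKey`, `nvidiaKey`, and `hfToken` branches):

```javascript
// Sketch of prefix-based key routing mirroring the /login branches above.
// First match wins, so more specific prefixes come first.
const PROVIDERS = [
  ['sk-ant',   'anthropicKey'],
  ['nvapi-',   'nvidiaKey'],
  ['hf_',      'hfToken'],
  ['nv_edge_', 'edgeKey'],    // hypothetical name — not shown in the diff
  ['sk-',      'openaiKey'],  // hypothetical name; must come after 'sk-ant'
];

function routeKey(key) {
  const hit = PROVIDERS.find(([prefix]) => key.startsWith(prefix));
  return hit ? hit[1] : null;
}

console.log(routeKey('sk-ant-abc123')); // → anthropicKey
console.log(routeKey('nvapi-xyz'));     // → nvidiaKey
console.log(routeKey('sk-abc'));        // → openaiKey
```

A chain of `if/else if` with the specific prefix first, as in the diff, is equivalent to this ordered table.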
package/lib/config.js
CHANGED
|
@@ -15,13 +15,28 @@ function ensureDir() {
 function load() {
   try {
     if (!fs.existsSync(CONFIG_FILE)) return {};
-
-
+    const raw = fs.readFileSync(CONFIG_FILE, 'utf-8');
+    if (!raw.trim()) return {};
+    return JSON.parse(raw);
+  } catch (e) {
+    // Back up corrupted config
+    try {
+      const backupPath = CONFIG_FILE + '.bak';
+      if (fs.existsSync(CONFIG_FILE)) fs.copyFileSync(CONFIG_FILE, backupPath);
+    } catch {}
+    return {};
+  }
 }

 function save(config) {
   ensureDir();
-
+  const data = JSON.stringify(config, null, 2);
+  // Write atomically via temp file
+  const tmpFile = CONFIG_FILE + '.tmp';
+  fs.writeFileSync(tmpFile, data);
+  fs.renameSync(tmpFile, CONFIG_FILE);
+  // Restrict permissions on config file (contains API keys)
+  try { fs.chmodSync(CONFIG_FILE, 0o600); } catch {}
 }

 function isFirstRun() { return !fs.existsSync(CONFIG_FILE); }
package/lib/serve.js
CHANGED
|
@@ -9,8 +9,10 @@ function start(port = 7800) {
   const server = http.createServer(async (req, res) => {
     const url = req.url.split('?')[0];

-    // CORS
-
+    // CORS — restrict to local/Tailscale origins
+    const origin = req.headers.origin || '';
+    const allowedOrigin = (origin.includes('localhost') || origin.includes('127.0.0.1') || origin.includes('100.')) ? origin : '';
+    res.setHeader('Access-Control-Allow-Origin', allowedOrigin || `http://localhost:${port}`);
     res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
     res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
     if (req.method === 'OPTIONS') { res.writeHead(204); return res.end(); }
@@ -24,25 +26,36 @@ function start(port = 7800) {
     // Execute command
     if (url === '/api/cmd' && req.method === 'POST') {
       let body = '';
-
+      let bodySize = 0;
+      const MAX_BODY = 8192;
+      req.on('data', c => {
+        bodySize += c.length;
+        if (bodySize > MAX_BODY) { req.destroy(); return; }
+        body += c;
+      });
       req.on('end', async () => {
         try {
+          if (bodySize > MAX_BODY) { res.writeHead(413); return res.end(JSON.stringify({ error: 'Request too large' })); }
           const { command } = JSON.parse(body);
-          if (!command) { res.writeHead(400); return res.end(JSON.stringify({ error: 'command required' })); }
+          if (!command || typeof command !== 'string') { res.writeHead(400); return res.end(JSON.stringify({ error: 'command required (string)' })); }
+          if (command.length > 2000) { res.writeHead(400); return res.end(JSON.stringify({ error: 'command too long' })); }

           // Capture output
           const orig = console.log;
           const buf = [];
           console.log = (...args) => buf.push(args.join(' '));
-
-
+          try {
+            await execute(command);
+          } finally {
+            console.log = orig;
+          }

           const output = buf.join('\n').replace(/\x1b\[[0-9;]*m/g, ''); // strip ANSI
           res.writeHead(200, { 'Content-Type': 'application/json' });
           res.end(JSON.stringify({ command, output }));
         } catch (e) {
           res.writeHead(500, { 'Content-Type': 'application/json' });
-          res.end(JSON.stringify({ error:
+          res.end(JSON.stringify({ error: 'Internal server error' }));
         }
       });
       return;
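The CORS change above echoes the request origin back only when it looks local or Tailscale-like (addresses containing `100.`), otherwise falling back to `http://localhost:<port>`. The predicate can be sketched in isolation; note that the substring checks are deliberately loose, so a hostname such as `evil100.example` would also pass — a stricter variant would parse the origin with `new URL()` and compare hostnames (the `allowOrigin` helper below is illustrative, not an export of the package):

```javascript
// Sketch of the origin filter used above, same substring logic:
// echo a local-looking origin, otherwise fall back to localhost.
function allowOrigin(origin, port) {
  const ok = origin.includes('localhost') || origin.includes('127.0.0.1') || origin.includes('100.');
  return (ok && origin) ? origin : `http://localhost:${port}`;
}

console.log(allowOrigin('http://localhost:3000', 7800)); // → http://localhost:3000
console.log(allowOrigin('https://evil.example', 7800));  // → http://localhost:7800
console.log(allowOrigin('', 7800));                      // → http://localhost:7800
```

Returning a single concrete origin (rather than `*`) is what lets browsers apply the allow-list per request while still blocking arbitrary cross-site pages from the `/api/cmd` endpoint.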
package/lib/ui.js
CHANGED
|
@@ -90,8 +90,9 @@ function sessionPanel(width) {
     kvLine(' Model: ', model),
     kvLine(' Tier: ', tier),
     '',
-    sectionLine('
-    kvLine('
+    sectionLine('Usage'),
+    kvLine(' Messages:', String(state.messages || 0)),
+    kvLine(' Tokens: ', String(state.tokens?.total || 0)),
     kvLine(' Cost: ', `$${(state.cost || 0).toFixed(4)}`),
     '',
     sectionLine('Configuration'),
@@ -114,15 +115,18 @@ function sessionPanel(width) {
   return lines;
 }

+let _pythonCache = null;
 function detectPython() {
+  if (_pythonCache !== null) return _pythonCache;
   try {
     const { execSync } = require('child_process');
     const py = process.platform === 'win32' ? 'python' : 'python3';
     const ver = execSync(`${py} --version`, { timeout: 3000, encoding: 'utf-8' }).trim();
-
+    _pythonCache = style('success', ver.replace('Python ', ''));
   } catch {
-
+    _pythonCache = style('offline', 'not found');
   }
+  return _pythonCache;
 }

 function footerBar(width) {
@@ -178,7 +182,9 @@ function prompt() {
   try { state = require('./agent').sessionState; } catch { state = {}; }
   const mode = state?.learningMode;
   const modeTag = mode ? style('accent', `[${mode}]`) + ' ' : '';
-
+  const msgCount = state?.messages || 0;
+  const countTag = msgCount > 0 ? style('dim', `(${msgCount}) `) : '';
+  return modeTag + countTag + style('dim', 'navada') + style('accent', '> ');
 }

 function box(title, content) {
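`detectPython()` now memoizes its result in a module-level variable, so the `execSync('python3 --version')` probe runs at most once per process instead of on every panel render. The pattern in isolation, with the expensive probe faked by a counter (names here are illustrative):

```javascript
// Module-level memo: null means "not probed yet"; any other value is final.
let _cache = null;
let probes = 0;

function detectPython() {
  if (_cache !== null) return _cache;
  probes++; // stands in for the execSync('python3 --version') call
  _cache = '3.12.0';
  return _cache;
}

detectPython();
detectPython();
console.log(probes); // → 1  (the probe ran only once)
```

One caveat of caching in a module-level `let`: a Python installed mid-session is not noticed until the CLI restarts, which is an acceptable trade for a prompt that repaints often.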
package/package.json
CHANGED
|
@@ -1,6 +1,6 @@
 {
   "name": "navada-edge-cli",
-  "version": "3.0
+  "version": "3.2.0",
   "description": "Interactive CLI for the NAVADA Edge Network — explore nodes, agents, Cloudflare, AI, Docker, and MCP from your terminal",
   "main": "lib/cli.js",
   "bin": {
@@ -41,14 +41,14 @@
     "url": "git+https://github.com/Navada25/edge-sdk.git"
   },
   "dependencies": {
-    "navada-edge-sdk": "^1.1.0",
     "chalk": "^4.1.2",
     "cli-table3": "^0.6.5",
+    "navada-edge-sdk": "^1.1.0",
     "ora": "^5.4.1"
   },
   "optionalDependencies": {
-    "
-    "
+    "nodemailer": "^8.0.4",
+    "qrcode-terminal": "^0.12.0"
   },
   "publishConfig": {
     "access": "public"