navada-edge-cli 3.5.9 → 3.5.10

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +87 -18
  2. package/package.json +1 -1
package/README.md CHANGED
@@ -88,6 +88,16 @@ Developers, system administrators, and infrastructure engineers prefer CLIs beca
 
  NAVADA Edge CLI is an AI-powered operating system layer that runs in your terminal. It is the first interface to the NAVADA Edge Network -- a distributed computing platform built for AI workloads, infrastructure management, and developer tooling.
 
+ ### Two parts, one CLI
+
+ The NAVADA Edge CLI has two distinct modes of operation:
+
+ **Part 1: Standalone CLI** -- install from npm, runs on your machine, no account needed. Full AI agent with file operations, shell access, Python execution, code sandbox, and 6 AI providers. The free tier works out of the box. When it runs out, add your own API key. This is a complete, independent developer tool.
+
+ **Part 2: Cloud Compute** -- opt-in. Sign up at the Edge Portal, generate an API key, and run tasks 24/7 on AWS or Azure. Your laptop can be closed while jobs run in the cloud. Monitoring, output streaming, and task management built into the same CLI.
+
+ Part 1 works without Part 2. Part 2 extends Part 1.
+
  ### What it does
 
  The CLI wraps your terminal in a conversational AI agent. It has tools for file operations, shell execution, Docker management, remote SSH, database queries, cloud services, image generation, and more. You can use slash commands for precision or type naturally and let the agent figure out what to do.
@@ -172,7 +182,7 @@ navada> list files in my home directory
  navada> create a file called test.txt with "Hello from NAVADA"
  ```
 
- The free tier uses GPT-4o-mini via the NAVADA Edge server. No API key required -- install and go.
+ The free tier uses Grok via the NAVADA Edge server. No API key required -- install and go. File operations (create, read, edit, delete folders and files) work on the free tier without any AI provider.
 
  To unlock full agent mode with tool use, add your own API key:
 
@@ -195,7 +205,7 @@ The CLI supports 6 AI providers. Each is activated by logging in with the corres
 
  | Provider | Key Prefix | Model | Cost | Tool Use |
  |---|---|---|---|---|
- | **NAVADA Free Tier** | (none needed) | GPT-4o-mini | Free (30 RPM) | No |
+ | **NAVADA Free Tier** | (none needed) | Grok (via Edge server) | Free (30 RPM) | File ops only |
  | **Anthropic** | `sk-ant-...` | Claude Sonnet 4 | Paid | Yes (full agent) |
  | **OpenAI** | `sk-...` | GPT-4o | Paid | Yes |
  | **Google Gemini** | `AIza...` | Gemini 2.0 Flash | Free | No |
@@ -259,7 +269,7 @@ navada> /model deepseek-r1 # always use DeepSeek R1
 
  ## Commands
 
- 75 commands organised by category. Use `/help` inside the CLI for the full list.
+ 72 commands (91 with aliases) organised by category. Use `/help` inside the CLI for the full list.
 
  ### AI
 
@@ -355,7 +365,26 @@ navada> /model deepseek-r1 # always use DeepSeek R1
  |---|---|
  | `/db <sql>` | Query PostgreSQL |
 
- ### EDGE
+ ### FILES
+
+ | Command | Description |
+ |---|---|
+ | `/read <path>` | Read a file (with line numbers) |
+ | `/write <path> <content>` | Write content to a file |
+ | `/edit <path> <search> -> <replace>` | Find and replace in a file |
+ | `/delete <path>` | Delete a file or empty directory |
+ | `/ls [path]` | List files and directories |
+ | `/mkdir <path>` | Create a directory |
+ | `/touch <path>` | Create an empty file |
+
+ File operations also work via natural language on any tier (no API key needed):
+
+ ```
+ navada> create a folder on my desktop called MyProject
+ Created folder: C:\Users\you\Desktop\MyProject
+ ```
+
+ ### EDGE (Part 2 -- Cloud Compute)
 
  | Command | Description |
  |---|---|
@@ -363,7 +392,12 @@ navada> /model deepseek-r1 # always use DeepSeek R1
  | `/edge status` | Check Edge Network connection |
  | `/edge logout` | Disconnect from Edge Network |
  | `/edge tier` | Show current tier and limits |
+ | `/edge setup` | Create agent.md and sub-agents directory |
  | `/onboard` | Open Edge Portal to create account |
+ | `/offload <command>` | Run a task 24/7 on the cloud |
+ | `/sessions` | View your cloud task sessions |
+ | `/attach <session-id>` | Stream output from a running task |
+ | `/kill <session-id>` | Stop a running cloud task |
 
  ### TASKS
 
@@ -427,6 +461,7 @@ navada> /model deepseek-r1 # always use DeepSeek R1
  | `/activity` | Recent activity log |
  | `/version` | Version and tier info |
  | `/upgrade` | Check for CLI updates |
+ | `/audit` | Security and compliance audit (30 checks) |
  | `/exit` | Exit CLI |
 
  ---
@@ -469,12 +504,15 @@ The agent (Anthropic and OpenAI providers) supports tool use. The AI model decid
  | `shell` | Local | Run any shell command on your machine |
  | `read_file` | Local | Read files from your filesystem |
  | `write_file` | Local | Create or modify files |
+ | `edit_file` | Local | Find and replace text in a file |
+ | `delete_file` | Local | Delete a file or empty directory |
  | `list_files` | Local | Browse directories |
  | `system_info` | Local | CPU, RAM, disk, hostname, OS |
  | `python_exec` | Local | Execute Python code |
  | `python_pip` | Local | Install Python packages |
  | `python_script` | Local | Run a Python script file |
  | `sandbox_run` | Local | Run code with syntax highlighting |
+ | `founder_info` | Local | Accurate answers about the NAVADA founder (CV-grounded) |
  | `network_status` | Network | Ping all NAVADA Edge nodes |
  | `lucas_exec` | Remote | Run bash on EC2 via Lucas CTO |
  | `lucas_ssh` | Remote | SSH to any network node |
@@ -483,7 +521,6 @@ The agent (Anthropic and OpenAI providers) supports tool use. The AI model decid
  | `docker_registry` | Remote | Query the private Docker registry |
  | `send_email` | Remote | Send email via SMTP or MCP |
  | `generate_image` | Remote | Generate images (Flux or DALL-E) |
- | `founder_info` | Local | Information about the NAVADA founder |
 
  The execution loop works like this:
 
@@ -537,11 +574,12 @@ navada> /doctor
 
  ### API key tiers
 
- | Tier | Requests/day | Tokens/day | Edge tasks | Max runtime |
- |---|---|---|---|---|
- | **Free** | 100 | 50K | 10 | 5 minutes |
- | **Pro** | Coming soon | -- | -- | -- |
- | **Enterprise** | Coming soon | -- | -- | -- |
+ | Tier | Cloud Compute | Pricing |
+ |---|---|---|
+ | **Free** | 3 sessions, 1 concurrent, 5 min max | Free |
+ | **Starter** | 10 sessions, 3 concurrent, 15 min max | TBC |
+ | **Pro** | 50 sessions, 5 concurrent, 1 hour max | TBC |
+ | **Enterprise** | 500 sessions, 10 concurrent, 4 hour max | TBC |
 
  ### Portal
 
@@ -555,17 +593,25 @@ This is the early stage of an AI-powered operating system.
 
  Today, the CLI is a terminal agent -- you install it, type naturally, and it executes tasks on your machine using AI. But the terminal is just the first interface layer. The architecture is designed for what comes next.
 
- ### Where this is going
+ ### What is built today
 
- **Local LLM execution.** The CLI currently routes to cloud AI providers. The next step is running models locally on NVIDIA GPUs via Docker containers. Instead of paying per token to a cloud API, your home server runs Llama, Mistral, or DeepSeek locally. The CLI routes to whichever is fastest -- local GPU or cloud API -- transparently.
+ **Part 1: Standalone CLI.** Install from npm and go. Full local tools (file CRUD, shell, Python, sandbox), 6 AI providers, conversation history, agent.md customisation, sub-agents, learning modes, code sandbox. Works completely independently with no account or sign-up required. Free tier included, or bring your own API key.
 
- **Edge compute offloading.** Today, `/lucas exec` sends a command to one remote node. In the future, you will be able to offload long-running tasks (ML training, batch processing, video encoding) to any node in the Edge Network. The CLI submits a task, the network schedules it on an available node, and you get notified when it completes.
+ **Part 2: Cloud Compute.** Sign up at the Edge Portal, generate an API key, and offload tasks to AWS (EC2) for 24/7 execution. Tasks run when your laptop is closed. Monitor sessions, stream output, kill tasks -- all from the CLI. Authenticated via `nv_edge_` API keys with tier-based limits.
 
- **Sub-agents.** Lucas CTO is the first sub-agent -- an autonomous infrastructure manager. More are planned: a security agent for vulnerability scanning, a data agent for ETL pipelines, a monitoring agent for alerting. Each runs in its own container and communicates via MCP.
+ **agent.md customisation.** Every user can define their own `agent.md` file at `~/.navada/agent.md` -- a plain-text configuration that shapes the AI's personality, tools, and behaviour. Your agent becomes uniquely yours. Sub-agents live in `~/.navada/agents/` and can be switched mid-session with `/agent use <name>`.
 
- **agent.md customisation.** Every user will be able to define their own `agent.md` file -- a plain-text configuration that shapes the AI's personality, tools, and behaviour. Your agent becomes uniquely yours: different system prompts, different tool sets, different priorities. This is the NAVADA moat -- an AI operating system that adapts to each user, not a one-size-fits-all chatbot.
+ **Knowledge skills.** The CLI uses Python-based knowledge skills for grounded, accurate responses. Instead of the AI hallucinating, factual data is baked into Python scripts that the agent calls as tools. No RAG infrastructure needed -- just a Python file and a prompt.
 
- **Multi-device.** The CLI already supports mobile access via `/serve`. The vision is a unified agent layer across terminal, web, and mobile -- same context, same tools, same conversation -- wherever you are.
+ ### Where this is going
+
+ **Azure compute node.** A second cloud region for redundancy and lower latency. The CLI will route tasks to the nearest available node.
+
+ **Local LLM execution.** Running models locally on NVIDIA GPUs via Docker containers. Instead of paying per token to a cloud API, your home server runs Llama, Mistral, or DeepSeek locally. The CLI routes to whichever is fastest -- local GPU or cloud API -- transparently.
+
+ **More sub-agents.** Lucas CTO is the first sub-agent -- an autonomous infrastructure manager. More are planned: a security agent for vulnerability scanning, a data agent for ETL pipelines, a monitoring agent for alerting. Each runs in its own container and communicates via MCP.
+
+ **Multi-device.** The CLI supports mobile access via `/serve`. The vision is a unified agent layer across terminal, web, and mobile -- same context, same tools, same conversation -- wherever you are.
 
  ### The operating system analogy
 
@@ -609,11 +655,34 @@ NAVADA_LUCAS=http://100.x.x.x:8820
 
  | Package | Install | Purpose |
  |---|---|---|
- | **navada-edge-sdk** | `npm i navada-edge-sdk` | SDK for Node.js applications |
+ | **navada-edge-sdk** | `npm i navada-edge-sdk` | SDK for Node.js applications -- build on top of the Edge Network |
  | **navada-edge-cli** | `npm i -g navada-edge-cli` | AI agent in your terminal |
- | **Edge Portal** | [portal.navada-edge-server.uk](https://portal.navada-edge-server.uk) | Account management and API keys |
+ | **Edge Portal** | [portal.navada-edge-server.uk](https://portal.navada-edge-server.uk) | Account management, API keys, usage dashboard |
+ | **Edge Compute** | via CLI `/offload` | 24/7 task execution on AWS/Azure |
  | **MCP Server** | `POST /mcp` | JSON-RPC tool server (18 tools) |
 
+ ### SDK usage
+
+ The CLI ships with the NAVADA Edge SDK as a dependency. You can also install it independently to build your own applications:
+
+ ```bash
+ npm install navada-edge-sdk
+ ```
+
+ ```javascript
+ const navada = require('navada-edge-sdk');
+
+ // Configure
+ navada.init({ mcpApiKey: 'nv_edge_your_key' });
+
+ // Use
+ const status = await navada.network.ping();
+ const result = await navada.mcp.call('server_status');
+ const image = await navada.cloudflare.flux.generate('a sunset over mountains');
+ ```
+
+ The SDK provides programmatic access to the same services the CLI uses: network nodes, MCP tools, Cloudflare (R2, Flux, Stream, DNS), AI providers, Docker registry, and PostgreSQL.
+
  ---
 
  ## Telemetry
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "navada-edge-cli",
- "version": "3.5.9",
+ "version": "3.5.10",
  "description": "Interactive CLI for the NAVADA Edge Network — explore nodes, agents, Cloudflare, AI, Docker, and MCP from your terminal",
  "main": "lib/cli.js",
  "bin": {