@halfagiraf/clawx 0.2.8 → 0.2.9

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +5 -0
  2. package/package.json +1 -1
package/README.md CHANGED
@@ -6,6 +6,11 @@
 
 Terminal-first coding agent — runs locally with Ollama, DeepSeek, OpenAI, or any OpenAI-compatible endpoint.
 
+> **Always update before use — this project is in active development and can change by the hour:**
+> ```bash
+> npm install -g @halfagiraf/clawx@latest
+> ```
+
 > **Beta** — Clawx is under active development. It works well with the providers we've tested (Ollama, DeepSeek, OpenAI, Anthropic) but not every combination has been battle-tested yet. If you hit a bug, [open an issue](https://github.com/stevenmcsorley/clawx/issues) — we fix things fast.
 
 Clawx started because tools like OpenClaw kept getting heavier. Prompts ballooned, context windows filled up, and local models choked. We wanted the good parts — the tool-calling loop, the terminal UI, the coding tools — without the bloat. So we built something lean on top of the open-source [pi-coding-agent](https://github.com/badlogic/pi-mono) SDK: an agent that runs local models on modest hardware, hits DeepSeek when you need more muscle, and scales up to frontier models when the task calls for it. No token budget wasted on platform overhead. Just the model, the tools, and your prompt.
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@halfagiraf/clawx",
-  "version": "0.2.8",
+  "version": "0.2.9",
   "description": "Terminal-first coding agent — runs locally with Ollama, DeepSeek, OpenAI, or any OpenAI-compatible endpoint",
   "type": "module",
   "bin": {