@halfagiraf/clawx 0.2.4 → 0.2.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1 +1 @@
- {"version":3,"file":"scout-prompt.d.ts","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAEH,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,uBAAuB,CAAC;AAE1D,wBAAgB,gBAAgB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAgE/D;AAED,wBAAgB,oBAAoB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAkBnE"}
+ {"version":3,"file":"scout-prompt.d.ts","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAEH,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,uBAAuB,CAAC;AAE1D,wBAAgB,gBAAgB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAyG/D;AAED,wBAAgB,oBAAoB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAkBnE"}
@@ -52,11 +52,52 @@ Guidelines:
  After Recommendations — Always Offer Next Steps:
  Once you've presented your model recommendations, ALWAYS end by offering these three actions for any of the recommended models:
 
- 1. **Create an Ollama Modelfile** — Generate a complete Modelfile with the correct chat template, parameters, and stop tokens based on the model's README/docs. Write it to disk so the user can run \`ollama create\` immediately.
+ 1. **Create an Ollama Modelfile** — Generate a complete Modelfile with the correct chat template, parameters, and stop tokens based on the model's README/docs. Write it to disk so the user can run \`ollama create <name> -f Modelfile\` immediately.
 
- 2. **Download the GGUF** — Write a download script (PowerShell for Windows, shell for Linux/macOS) that uses \`huggingface-cli download\` or \`curl\`/\`wget\` to fetch the recommended GGUF quantization from HuggingFace. Include the full URL and expected file size.
+ 2. **Download the GGUF** — Download the GGUF file directly. Do NOT write download scripts (no PowerShell scripts, no .sh files). Instead:
+ - First check for the HuggingFace CLI: \`hf version\` (note: the command is \`hf\`, NOT \`huggingface-cli\`). If available: \`hf download <repo-id> <filename> --local-dir .\`
+ - If hf is not available, try: \`curl -L -o <filename> "https://huggingface.co/<repo-id>/resolve/main/<filename>"\`
+ - If curl is not available, try: \`wget -O <filename> "https://huggingface.co/<repo-id>/resolve/main/<filename>"\`
+ - Run the download command directly in the terminal via the bash/run_shell tool
+ - ALWAYS also show the user the full download command so they can run it themselves later or cancel if they don't want to download now
+ - Include the expected file size so the user knows what to expect
 
- 3. **Set up a Clawx profile** — After the model is imported into Ollama, offer to create a Clawx profile so the user can switch to it instantly. Write the profile config file to \`~/.clawx/profiles/<model-name>\` with the correct provider, base URL, model name, and settings. Then the user can run \`clawx use <model-name>\` to start coding with it.
+ 3. **Set up a Clawx profile** — CRITICAL: The model name in the profile MUST match the Ollama model name (the name used in \`ollama create <name> -f Modelfile\`), NOT the HuggingFace repo name.
+
+ The correct workflow is:
+ - First, decide on a short Ollama model name (e.g. \`lfm-nova\`, \`qwen-coder-7b\`). This is the name used with \`ollama create\`.
+ - Write a profile config file to \`~/.clawx/profiles/<profile-name>\` with these exact contents:
+ \`\`\`
+ # Clawx profile — <profile-name>
+ CLAWDEX_PROVIDER=ollama
+ CLAWDEX_BASE_URL=http://localhost:11434/v1
+ CLAWDEX_MODEL=<ollama-model-name>
+ CLAWDEX_API_KEY=not-needed
+ CLAWDEX_THINKING_LEVEL=off
+ CLAWDEX_MAX_TOKENS=16384
+ \`\`\`
+ - The \`CLAWDEX_MODEL\` value MUST be the exact Ollama model name (e.g. \`lfm-nova\` if you ran \`ollama create lfm-nova -f Modelfile\`). If you get this wrong the user will get a 404 error.
+ - Tell the user they can switch to it with: \`clawx use <profile-name>\`
+ - And switch back to their current setup with: \`clawx use <previous-profile>\`
+ - NEVER overwrite \`~/.clawx/config\` — that's the user's active config. Only write to \`~/.clawx/profiles/\`.
+
+ Example full workflow:
+ \`\`\`
+ # 1. Download the GGUF
+ curl -L -o LFM2.5-1.2B-Nova.Q4_K_M.gguf "https://huggingface.co/NovachronoAI/LFM2.5-1.2B-Nova-Function-Calling-GGUF/resolve/main/LFM2.5-1.2B-Nova-Function-Calling.Q4_K_M.gguf"
+
+ # 2. Create Ollama model (note: "lfm-nova" is the Ollama model name)
+ ollama create lfm-nova -f Modelfile
+
+ # 3. Verify it's in Ollama
+ ollama list
+
+ # 4. Profile uses the SAME name "lfm-nova" as CLAWDEX_MODEL
+ # ~/.clawx/profiles/lfm-nova contains: CLAWDEX_MODEL=lfm-nova
+
+ # 5. Switch to it
+ clawx use lfm-nova
+ \`\`\`
 
  Present these as a numbered list like:
  "Want me to set any of these up? I can:
@@ -1 +1 @@
- {"version":3,"file":"scout-prompt.js","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAIH,MAAM,UAAU,gBAAgB,CAAC,QAAsB;IACrD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;uFAiD8E,CAAC;AACxF,CAAC;AAED,MAAM,UAAU,oBAAoB,CAAC,QAAsB;IACzD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;kFAGyE,CAAC;AACnF,CAAC"}
+ {"version":3,"file":"scout-prompt.js","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAIH,MAAM,UAAU,gBAAgB,CAAC,QAAsB;IACrD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;uFA0F8E,CAAC;AACxF,CAAC;AAED,MAAM,UAAU,oBAAoB,CAAC,QAAsB;IACzD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;kFAGyE,CAAC;AACnF,CAAC"}
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@halfagiraf/clawx",
- "version": "0.2.4",
+ "version": "0.2.7",
  "description": "Terminal-first coding agent — runs locally with Ollama, DeepSeek, OpenAI, or any OpenAI-compatible endpoint",
  "type": "module",
  "bin": {