@halfagiraf/clawx 0.2.5 → 0.2.7

@@ -1 +1 @@
- {"version":3,"file":"scout-prompt.d.ts","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAEH,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,uBAAuB,CAAC;AAE1D,wBAAgB,gBAAgB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAmF/D;AAED,wBAAgB,oBAAoB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAkBnE"}
+ {"version":3,"file":"scout-prompt.d.ts","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAEH,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,uBAAuB,CAAC;AAE1D,wBAAgB,gBAAgB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAyG/D;AAED,wBAAgB,oBAAoB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAkBnE"}
@@ -55,17 +55,20 @@ Once you've presented your model recommendations, ALWAYS end by offering these t
  1. **Create an Ollama Modelfile** — Generate a complete Modelfile with the correct chat template, parameters, and stop tokens based on the model's README/docs. Write it to disk so the user can run \`ollama create <name> -f Modelfile\` immediately.
 
  2. **Download the GGUF** — Download the GGUF file directly. Do NOT write download scripts (no PowerShell scripts, no .sh files). Instead:
- - First check if \`huggingface-cli\` is available by running: \`huggingface-cli --version\`
- - If available, use: \`huggingface-cli download <repo-id> <filename> --local-dir .\`
- - If not available, use \`curl -L -o <filename> https://huggingface.co/<repo-id>/resolve/main/<filename>\`
+ - First check for the HuggingFace CLI: \`hf version\` (note: the command is \`hf\`, NOT \`huggingface-cli\`). If available: \`hf download <repo-id> <filename> --local-dir .\`
+ - If hf is not available, try: \`curl -L -o <filename> "https://huggingface.co/<repo-id>/resolve/main/<filename>"\`
+ - If curl is not available, try: \`wget -O <filename> "https://huggingface.co/<repo-id>/resolve/main/<filename>"\`
  - Run the download command directly in the terminal via the bash/run_shell tool
- - ALWAYS also show the user the download command so they can run it themselves later or cancel if they don't want to download now
+ - ALWAYS also show the user the full download command so they can run it themselves later or cancel if they don't want to download now
  - Include the expected file size so the user knows what to expect
 
- 3. **Set up a Clawx profile** — IMPORTANT: Use the clawx CLI commands to manage profiles. NEVER write directly to config files or overwrite the user's current config. The correct workflow is:
- - Write a profile config file to \`~/.clawx/profiles/<model-name>\` with these exact contents:
+ 3. **Set up a Clawx profile** — CRITICAL: The model name in the profile MUST match the Ollama model name (the name used in \`ollama create <name> -f Modelfile\`), NOT the HuggingFace repo name.
+
+ The correct workflow is:
+ - First, decide on a short Ollama model name (e.g. \`lfm-nova\`, \`qwen-coder-7b\`). This is the name used with \`ollama create\`.
+ - Write a profile config file to \`~/.clawx/profiles/<profile-name>\` with these exact contents:
  \`\`\`
- # Clawx profile — <model-name>
+ # Clawx profile — <profile-name>
  CLAWDEX_PROVIDER=ollama
  CLAWDEX_BASE_URL=http://localhost:11434/v1
  CLAWDEX_MODEL=<ollama-model-name>
@@ -73,10 +76,29 @@ Once you've presented your model recommendations, ALWAYS end by offering these t
  CLAWDEX_THINKING_LEVEL=off
  CLAWDEX_MAX_TOKENS=16384
  \`\`\`
- - Then tell the user they can switch to it with: \`clawx use <model-name>\`
+ - The \`CLAWDEX_MODEL\` value MUST be the exact Ollama model name (e.g. \`lfm-nova\` if you ran \`ollama create lfm-nova -f Modelfile\`). If you get this wrong the user will get a 404 error.
+ - Tell the user they can switch to it with: \`clawx use <profile-name>\`
  - And switch back to their current setup with: \`clawx use <previous-profile>\`
  - NEVER overwrite \`~/.clawx/config\` — that's the user's active config. Only write to \`~/.clawx/profiles/\`.
 
+ Example full workflow:
+ \`\`\`
+ # 1. Download the GGUF
+ curl -L -o LFM2.5-1.2B-Nova.Q4_K_M.gguf "https://huggingface.co/NovachronoAI/LFM2.5-1.2B-Nova-Function-Calling-GGUF/resolve/main/LFM2.5-1.2B-Nova-Function-Calling.Q4_K_M.gguf"
+
+ # 2. Create Ollama model (note: "lfm-nova" is the Ollama model name)
+ ollama create lfm-nova -f Modelfile
+
+ # 3. Verify it's in Ollama
+ ollama list
+
+ # 4. Profile uses the SAME name "lfm-nova" as CLAWDEX_MODEL
+ # ~/.clawx/profiles/lfm-nova contains: CLAWDEX_MODEL=lfm-nova
+
+ # 5. Switch to it
+ clawx use lfm-nova
+ \`\`\`
+
  Present these as a numbered list like:
  "Want me to set any of these up? I can:
  1. Create an Ollama Modelfile for [model name]
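Step 1 in the diffed prompt tells the agent to generate an Ollama Modelfile, but the diff never shows what one looks like. A minimal sketch, assuming a ChatML-style chat template: the GGUF filename is taken from the example workflow in the hunk, while the TEMPLATE, stop token, and parameters are illustrative and would really come from the model's README, as the prompt instructs.

```shell
# Sketch: write a hypothetical Modelfile for a local GGUF.
# The TEMPLATE and stop token are ChatML-style placeholders; real
# values must come from the model's README/docs, per the prompt.
cat > Modelfile <<'EOF'
FROM ./LFM2.5-1.2B-Nova.Q4_K_M.gguf

TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""

PARAMETER stop "<|im_end|>"
PARAMETER num_ctx 8192
EOF

head -n 1 Modelfile
```

The user would then run `ollama create lfm-nova -f Modelfile`, matching the example workflow in the prompt.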
@@ -1 +1 @@
- {"version":3,"file":"scout-prompt.js","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAIH,MAAM,UAAU,gBAAgB,CAAC,QAAsB;IACrD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;uFAoE8E,CAAC;AACxF,CAAC;AAED,MAAM,UAAU,oBAAoB,CAAC,QAAsB;IACzD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;kFAGyE,CAAC;AACnF,CAAC"}
+ {"version":3,"file":"scout-prompt.js","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAIH,MAAM,UAAU,gBAAgB,CAAC,QAAsB;IACrD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;uFA0F8E,CAAC;AACxF,CAAC;AAED,MAAM,UAAU,oBAAoB,CAAC,QAAsB;IACzD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;kFAGyE,CAAC;AACnF,CAAC"}
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@halfagiraf/clawx",
- "version": "0.2.5",
+ "version": "0.2.7",
  "description": "Terminal-first coding agent — runs locally with Ollama, DeepSeek, OpenAI, or any OpenAI-compatible endpoint",
  "type": "module",
  "bin": {