@halfagiraf/clawx 0.2.2 → 0.2.4
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -649,6 +649,49 @@ Here's an excerpt from a real session. The agent searched for coding models, che
 an Ollama Modelfile for your chosen model?
 ```
 
+Ask it to create a Modelfile and it does the rest — inspects the model's GGUF files, reads the README for the prompt format, and generates a complete Ollama setup:
+
+```
+> Yes, create an Ollama Modelfile for NovachronoAI/LFM2.5-1.2B-Nova-Function-Calling-GGUF
+
+hf_model_info { "model_id": "NovachronoAI/LFM2.5-1.2B-Nova-Function-Calling-GGUF" }
+
+GGUF files (6):
+LFM2.5-1.2B-Nova-Function-Calling.Q2_K.gguf
+LFM2.5-1.2B-Nova-Function-Calling.Q4_K_M.gguf
+LFM2.5-1.2B-Nova-Function-Calling.Q8_0.gguf
+...
+
+hf_readme { "model_id": "NovachronoAI/LFM2.5-1.2B-Nova-Function-Calling-GGUF" }
+
+ChatML Native: Uses the standard <|im_start|> format
+JSON Reliability: 97% syntax accuracy
+Expected output: <tool_call>{"name": "...", "arguments": {...}}</tool_call>
+
+[writes Modelfile]
+FROM ./LFM2.5-1.2B-Nova-Function-Calling.Q4_K_M.gguf
+TEMPLATE """<|im_start|>system
+{{ .System }}<|im_end|>
+{{ range .Messages }}<|im_start|>{{ .Role }}
+{{ .Content }}<|im_end|>
+{{ end }}<|im_start|>assistant
+"""
+PARAMETER temperature 0.1
+PARAMETER num_ctx 4096
+PARAMETER stop <|im_end|>
+PARAMETER stop <|im_start|>
+
+[writes download_model.ps1] — PowerShell script to download the GGUF
+[writes test_function_calling.py] — Python test script to verify it works
+
+Quick Start:
+.\download_model.ps1
+ollama create lfm-nova -f Modelfile
+ollama run lfm-nova "Calculate area of circle with radius 5"
+
+Expected VRAM: ~0.8GB — plenty of room on your 12GB RTX 3060
+```
+
 Scout uses the same model/provider flags as the main TUI. You can run it with a local model (`-m qwen2.5-coder:7b-instruct -p ollama`) or a cloud API (`-p deepseek`). The text tool parser works in scout mode too, so models that output tool calls as text (like Qwen) will still work.
 
 ### Basic REPL commands
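The README above notes that the text tool parser lets models such as Qwen emit tool calls as plain text in the `<tool_call>{"name": ..., "arguments": {...}}</tool_call>` shape shown in the session excerpt. Clawx's actual parser is not part of this diff; purely as an illustration of that format, a minimal Python sketch of the extraction might look like this:

```python
import json
import re

# Matches <tool_call>{...}</tool_call> blocks emitted as plain text.
# DOTALL lets the JSON payload span multiple lines.
TOOL_CALL_RE = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)

def parse_text_tool_calls(text: str) -> list[dict]:
    """Extract {"name": ..., "arguments": ...} payloads from model output."""
    calls = []
    for match in TOOL_CALL_RE.finditer(text):
        try:
            payload = json.loads(match.group(1))
        except json.JSONDecodeError:
            continue  # skip malformed JSON rather than failing the whole turn
        if isinstance(payload, dict) and "name" in payload:
            calls.append(payload)
    return calls

reply = 'Sure.\n<tool_call>{"name": "hf_model_info", "arguments": {"model_id": "NovachronoAI/LFM2.5-1.2B-Nova-Function-Calling-GGUF"}}</tool_call>'
print(parse_text_tool_calls(reply)[0]["name"])  # hf_model_info
```

Tolerating malformed JSON (rather than raising) matters with small quantized models, whose tool-call syntax accuracy is below 100%.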
@@ -1 +1 @@
-{"version":3,"file":"scout-prompt.d.ts","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAEH,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,uBAAuB,CAAC;AAE1D,wBAAgB,gBAAgB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,
+{"version":3,"file":"scout-prompt.d.ts","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAEH,OAAO,KAAK,EAAE,YAAY,EAAE,MAAM,uBAAuB,CAAC;AAE1D,wBAAgB,gBAAgB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAgE/D;AAED,wBAAgB,oBAAoB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CAkBnE"}
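The Modelfile in the README excerpt above uses Ollama's Go-template syntax (`{{ .System }}`, `{{ range .Messages }}`). Ollama performs this rendering itself; as a sketch only, here is a Python re-implementation of what that particular ChatML TEMPLATE expands to for a short conversation:

```python
def render_chatml(system: str, messages: list[dict]) -> str:
    """Mimic the ChatML TEMPLATE from the Modelfile above (illustration only)."""
    # <|im_start|>system / <|im_end|> wrap the system prompt...
    out = f"<|im_start|>system\n{system}<|im_end|>\n"
    # ...then each message gets its own role-tagged block...
    for msg in messages:
        out += f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n"
    # ...and an open assistant block cues the model to respond.
    out += "<|im_start|>assistant\n"
    return out

prompt = render_chatml(
    "You are a function-calling assistant.",
    [{"role": "user", "content": "Calculate area of circle with radius 5"}],
)
print(prompt)
```

This also makes clear why the Modelfile sets `PARAMETER stop <|im_end|>` and `stop <|im_start|>`: generation must halt before the model starts fabricating the next turn.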
@@ -49,6 +49,23 @@ Guidelines:
 - Be proactive: if the user asks about "coding models", search for multiple relevant terms
 - Format recommendations clearly with model name, quant, estimated VRAM, and key strengths
 
+After Recommendations — Always Offer Next Steps:
+Once you've presented your model recommendations, ALWAYS end by offering these three actions for any of the recommended models:
+
+1. **Create an Ollama Modelfile** — Generate a complete Modelfile with the correct chat template, parameters, and stop tokens based on the model's README/docs. Write it to disk so the user can run \`ollama create\` immediately.
+
+2. **Download the GGUF** — Write a download script (PowerShell for Windows, shell for Linux/macOS) that uses \`huggingface-cli download\` or \`curl\`/\`wget\` to fetch the recommended GGUF quantization from HuggingFace. Include the full URL and expected file size.
+
+3. **Set up a Clawx profile** — After the model is imported into Ollama, offer to create a Clawx profile so the user can switch to it instantly. Write the profile config file to \`~/.clawx/profiles/<model-name>\` with the correct provider, base URL, model name, and settings. Then the user can run \`clawx use <model-name>\` to start coding with it.
+
+Present these as a numbered list like:
+"Want me to set any of these up? I can:
+1. Create an Ollama Modelfile for [model name]
+2. Download the [quant] GGUF (~X GB)
+3. Set up a Clawx profile so you can \`clawx use [name]\` to start coding with it
+
+Just pick a model and I'll do all three, or tell me which steps you want."
+
 You are conversational and helpful. Research thoroughly before making recommendations.`;
 }
 export function buildScoutChatPrompt(hardware) {
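The prompt hunk above instructs the agent to report estimated VRAM for each quant. The diff does not say how that estimate is made; one common back-of-the-envelope rule (an assumption here, not Clawx's documented formula) is parameter count times bits-per-weight divided by 8, plus some headroom for the KV cache and runtime buffers. For the 1.2B model at Q4_K_M that lands near the ~0.8GB figure in the README session excerpt:

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                     overhead_gb: float = 0.1) -> float:
    """Rough GGUF VRAM estimate: quantized weights plus fixed headroom.

    bits_per_weight is quant-dependent (roughly 4.85 for Q4_K_M, 8.5 for Q8_0);
    both the bit widths and the overhead are approximations, not values
    taken from this package.
    """
    weights_gb = params_billion * bits_per_weight / 8  # GB, since params are in billions
    return round(weights_gb + overhead_gb, 2)

print(estimate_vram_gb(1.2, 4.85))  # → 0.83, close to the ~0.8GB quoted above
```

Real usage grows with `num_ctx`, so a figure like this is a floor, not a guarantee.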
@@ -1 +1 @@
-{"version":3,"file":"scout-prompt.js","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAIH,MAAM,UAAU,gBAAgB,CAAC,QAAsB;IACrD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO
+{"version":3,"file":"scout-prompt.js","sourceRoot":"","sources":["../../src/utils/scout-prompt.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAIH,MAAM,UAAU,gBAAgB,CAAC,QAAsB;IACrD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;uFAiD8E,CAAC;AACxF,CAAC;AAED,MAAM,UAAU,oBAAoB,CAAC,QAAsB;IACzD,MAAM,OAAO,GAAG;QACd,UAAU,QAAQ,CAAC,GAAG,EAAE;QACxB,WAAW,QAAQ,CAAC,IAAI,EAAE;QAC1B,iBAAiB,QAAQ,CAAC,GAAG,EAAE;QAC/B,SAAS,QAAQ,CAAC,EAAE,EAAE;QACtB,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,YAAY,QAAQ,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE;KACnD;SACE,MAAM,CAAC,OAAO,CAAC;SACf,IAAI,CAAC,IAAI,CAAC,CAAC;IAEd,OAAO;;;EAGP,OAAO;;;kFAGyE,CAAC;AACnF,CAAC"}