stably 4.12.20 → 4.12.21
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +0 -33
- package/dist/index.mjs +1 -1
- package/dist/stably-browser.js +145 -33
- package/package.json +2 -2
package/README.md
CHANGED
@@ -44,36 +44,3 @@ Stably CLI supports two auth modes:
 
 If both are present, the CLI will honor the environment variables and warn that stored OAuth credentials are being ignored. To switch back to browser login, unset the env vars.
 
-## Cheap local/CI agent testing (Ollama)
-
-You can run the agent against Ollama via its Anthropic-compatible API (cheap, local, and deterministic enough for smoke tests).
-This is **dev-only** behavior (it is ignored in `NODE_ENV=production` builds).
-
-- `STABLY_BYPASS_AI_PROXY=1` (dev-only; ensures we don't override `ANTHROPIC_BASE_URL`)
-- Dev-only convenience: `STABLY_USE_OLLAMA=1` (dev-only; ignored when `NODE_ENV=production`)
-- `STABLY_AGENT_MODEL=qwen2.5-coder:0.5b` (fastest/cheapest suggested default)
-
-### Ollama: fastest local model (recommended)
-
-If you want a convenience helper that starts Ollama (if needed), pulls the model, and prints the exact env vars to use:
-
-```bash
-pnpm ollama:dev-env
-```
-
-Notes:
-- On Linux, this script will try to install Ollama automatically (requires `sudo` or root).
-- On macOS, it will try Homebrew if available (`brew install ollama` / `brew install --cask ollama`); otherwise it will prompt you to install Ollama manually.
-
-1) Start Ollama and pull the tiny model:
-
-```bash
-ollama serve
-ollama pull qwen2.5-coder:0.5b
-```
-
-2) Run the CLI against Ollama (Anthropic-compatible API):
-
-```bash
-STABLY_BYPASS_AI_PROXY=1 STABLY_USE_OLLAMA=1 STABLY_AGENT_MODEL=qwen2.5-coder:0.5b pnpm dev
-```