opencommit 3.2.10 → 3.2.12

Files changed (3)
  1. package/README.md +38 -0
  2. package/out/cli.cjs +2210 -774
  3. package/package.json +1 -1
package/README.md CHANGED
@@ -74,6 +74,22 @@ oco config set OCO_API_URL='http://192.168.1.10:11434/api/chat'
 
  where 192.168.1.10 is an example of the endpoint URL where you have Ollama set up.
 
+ #### Troubleshooting Ollama IPv6/IPv4 connection issues
+
+ If you encounter an issue with Ollama, such as the error
+
+ ```sh
+ ✖ local model issues. details: connect ECONNREFUSED ::1:11434
+ ```
+
+ it is likely because Ollama is not listening on IPv6 by default. To fix this, set the OLLAMA_HOST environment variable to 0.0.0.0 before starting Ollama:
+
+ ```bash
+ export OLLAMA_HOST=0.0.0.0
+ ```
+
+ This makes Ollama listen on all network interfaces rather than only the default loopback address, resolving the connection issue. You can add this line to your shell configuration file (such as `.bashrc` or `.zshrc`) to make it persistent across sessions.
+
  ### Flags
 
  There are multiple optional flags that can be used with the `oco` command:
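The `OLLAMA_HOST` fix above only takes effect in the shell that launches Ollama. As a minimal sketch (the variable name is the one documented above; the echo verification step itself is just an illustration, not from the package), you can confirm the setting before restarting the server:

```shell
# Make Ollama bind to all interfaces instead of only the loopback.
export OLLAMA_HOST=0.0.0.0

# Ollama reads OLLAMA_HOST at startup, so verify it in the same
# shell session that will run `ollama serve`.
echo "OLLAMA_HOST is set to: ${OLLAMA_HOST}"
```

If the variable prints empty here, the export was made in a different shell session and Ollama will still bind only to its default address.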
@@ -185,6 +201,28 @@ or for as a cheaper option:
  oco config set OCO_MODEL=gpt-3.5-turbo
  ```
 
+ ### Model Management
+
+ OpenCommit automatically fetches available models from your provider when you run `oco setup`. Models are cached for 7 days to reduce API calls.
+
+ To see available models for your current provider:
+
+ ```sh
+ oco models
+ ```
+
+ To refresh the model list (e.g., after new models are released):
+
+ ```sh
+ oco models --refresh
+ ```
+
+ To see models for a specific provider:
+
+ ```sh
+ oco models --provider anthropic
+ ```
+
  ### Switch to other LLM providers with a custom URL
 
  By default OpenCommit uses [OpenAI](https://openai.com).
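The 7-day cache described in the Model Management hunk above can be reasoned about from the shell to decide when `oco models --refresh` is worth running. This is a sketch under assumptions: the cache file path below is hypothetical (the package's real cache location is not shown in this diff), and the age check relies on GNU `touch`/`find`:

```shell
# Hypothetical cache file -- opencommit's actual location may differ.
CACHE_FILE="${TMPDIR:-/tmp}/oc-models-cache.json"
MAX_AGE_DAYS=7

# Simulate a stale cache: create the file with an mtime 8 days ago.
touch -d '8 days ago' "$CACHE_FILE"

# `find -mtime +N` matches files modified strictly more than N whole
# days ago, so +6 matches anything 7 or more days old.
if [ -n "$(find "$CACHE_FILE" -mtime +"$((MAX_AGE_DAYS - 1))")" ]; then
  STATUS=stale   # time to run: oco models --refresh
else
  STATUS=fresh   # cached model list is still within 7 days
fi
echo "model cache is $STATUS"
```

The same off-by-one detail matters in any TTL check: `-mtime +7` would only match files at least 8 days old, silently extending the cache window by a day.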