opencommit 3.2.9 → 3.2.11
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +17 -1
- package/out/cli.cjs +2062 -902
- package/package.json +2 -1
package/README.md
CHANGED
@@ -74,6 +74,22 @@ oco config set OCO_API_URL='http://192.168.1.10:11434/api/chat'
 
 where 192.168.1.10 is example of endpoint URL, where you have ollama set up.
 
+#### Troubleshooting Ollama IPv6/IPv4 Connection Fix
+
+If you encounter issues with Ollama, such as the error
+
+```sh
+✖ local model issues. details: connect ECONNREFUSED ::1:11434
+```
+
+It's likely because Ollama is not listening on IPv6 by default. To fix this, you can set the OLLAMA_HOST environment variable to 0.0.0.0 before starting Ollama:
+
+```bash
+export OLLAMA_HOST=0.0.0.0
+```
+
+This will make Ollama listen on all interfaces, including IPv6 and IPv4, resolving the connection issue. You can add this line to your shell configuration file (like `.bashrc` or `.zshrc`) to make it persistent across sessions.
+
 ### Flags
 
 There are multiple optional flags that can be used with the `oco` command:
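The `ECONNREFUSED ::1:11434` error in the hunk above typically arises when the client resolves `localhost` to the IPv6 loopback (`::1`) while Ollama is bound only to IPv4. A minimal Python sketch (not part of this package; 11434 is Ollama's default port) that shows how your own system resolves `localhost`, which can confirm whether this is the failure mode:

```python
import socket

# List the addresses "localhost" resolves to on this machine. If the IPv6
# loopback (::1) appears first and the server listens only on IPv4, a client
# that tries results in order fails with "connect ECONNREFUSED ::1:11434".
for family, _type, _proto, _canonname, sockaddr in socket.getaddrinfo(
    "localhost", 11434, proto=socket.IPPROTO_TCP
):
    print(family.name, "->", sockaddr[0])
```

If `::1` is listed before `127.0.0.1`, the `OLLAMA_HOST=0.0.0.0` workaround from the diff (binding the server to all interfaces) sidesteps the mismatch.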
@@ -106,7 +122,7 @@ Create a `.env` file and add OpenCommit config variables there like this:
 
 ```env
 ...
-OCO_AI_PROVIDER=<openai (default), anthropic, azure, ollama, gemini, flowise, deepseek>
+OCO_AI_PROVIDER=<openai (default), anthropic, azure, ollama, gemini, flowise, deepseek, aimlapi>
 OCO_API_KEY=<your OpenAI API token> // or other LLM provider API token
 OCO_API_URL=<may be used to set proxy path to OpenAI api>
 OCO_API_CUSTOM_HEADERS=<JSON string of custom HTTP headers to include in API requests>