opencommit 3.0.16 → 3.0.18
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +25 -15
- package/out/cli.cjs +10555 -9511
- package/package.json +12 -4
package/README.md
CHANGED
@@ -58,7 +58,7 @@ git add <files...>
 oco
 ```
 
-
+### Running locally with Ollama
 
 You can also run it with local model through ollama:
 
@@ -71,12 +71,29 @@ git add <files...>
 OCO_AI_PROVIDER='ollama' opencommit
 ```
 
+If you want to use a model other than mistral, you can do so by setting the `OCO_AI_PROVIDER` environment variable as follows:
+
+```sh
+OCO_AI_PROVIDER='ollama/llama3:8b' opencommit
+```
+
+if you have ollama that is set up in docker/ on another machine with GPUs (not locally), you can change the default endpoint url.
+You can do so by setting the `OCO_OLLAMA_API_URL` environment variable as follows:
+
+```sh
+OCO_OLLAMA_API_URL='http://192.168.1.10:11434/api/chat' opencommit
+```
+
+where 192.168.1.10 is example of endpoint URL, where you have ollama set up.
+
 ### Flags
 
 There are multiple optional flags that can be used with the `oco` command:
 
 #### Use Full GitMoji Specification
 
+Link to the GitMoji specification: https://gitmoji.dev/
+
 This flag can only be used if the `OCO_EMOJI` configuration item is set to `true`. This flag allows users to use all emojis in the GitMoji specification, By default, the GitMoji full specification is set to `false`, which only includes 10 emojis (🐛✨📝🚀✅♻️⬆️🔧🌐💡).
 This is due to limit the number of tokens sent in each request. However, if you would like to use the full GitMoji specification, you can use the `--fgm` flag.
 
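The Ollama additions in the hunk above can be exercised with plain environment variables; a minimal sketch, assuming `opencommit` is installed and an Ollama host is reachable (the `192.168.1.10` address is the README's own placeholder), with the actual invocation left commented so the snippet runs anywhere:

```shell
# Point opencommit at a non-default Ollama model and a remote Ollama host.
# Values come from the README hunk above; the host address is a placeholder.
export OCO_AI_PROVIDER='ollama/llama3:8b'
export OCO_OLLAMA_API_URL='http://192.168.1.10:11434/api/chat'

# With changes staged via `git add`, the generation step would then be:
# opencommit

echo "provider=$OCO_AI_PROVIDER"
echo "endpoint=$OCO_OLLAMA_API_URL"
```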
@@ -105,11 +122,12 @@ OCO_TOKENS_MAX_OUTPUT=<max response tokens (default: 500)>
 OCO_OPENAI_BASE_PATH=<may be used to set proxy path to OpenAI api>
 OCO_DESCRIPTION=<postface a message with ~3 sentences description of the changes>
 OCO_EMOJI=<boolean, add GitMoji>
-OCO_MODEL=<either 'gpt-4', 'gpt-4-turbo', 'gpt-3.5-turbo' (default), 'gpt-3.5-turbo-0125', 'gpt-4-1106-preview', 'gpt-4-turbo-preview' or 'gpt-4-0125-preview'>
+OCO_MODEL=<either 'gpt-4o', 'gpt-4', 'gpt-4-turbo', 'gpt-3.5-turbo' (default), 'gpt-3.5-turbo-0125', 'gpt-4-1106-preview', 'gpt-4-turbo-preview' or 'gpt-4-0125-preview'>
 OCO_LANGUAGE=<locale, scroll to the bottom to see options>
 OCO_MESSAGE_TEMPLATE_PLACEHOLDER=<message template placeholder, default: '$msg'>
 OCO_PROMPT_MODULE=<either conventional-commit or @commitlint, default: conventional-commit>
 OCO_ONE_LINE_COMMIT=<one line commit message, default: false>
+OCO_AI_PROVIDER=<anthropic, azure, ollama or ollama/model default ollama model: mistral>
 ```
 
 ### Global config for all repos
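Taken together, the two keys added in this hunk (`OCO_MODEL` and `OCO_AI_PROVIDER`) let one configuration pin both model and provider. A hypothetical fragment combining them — note the `~/.opencommit` path and plain key=value format are assumptions, not stated in this diff:

```shell
# Hypothetical ~/.opencommit contents after setting the 3.0.18 options
# (written as plain key=value assignments, which are also valid shell)
OCO_MODEL=gpt-4o
OCO_AI_PROVIDER=ollama/mistral
OCO_ONE_LINE_COMMIT=false
```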
@@ -119,7 +137,7 @@ Local config still has more priority than Global config, but you may set `OCO_MO
 Simply set any of the variables above like this:
 
 ```sh
-oco config set OCO_MODEL=gpt-
+oco config set OCO_MODEL=gpt-4o
 ```
 
 Configure [GitMoji](https://gitmoji.dev/) to preface a message.
@@ -136,7 +154,7 @@ oco config set OCO_EMOJI=false
 
 ### Switch to GPT-4 or other models
 
-By default, OpenCommit uses `gpt-
+By default, OpenCommit uses `gpt-4o` model.
 
 You may switch to GPT-4 which performs better, but costs ~x15 times more 🤠
 
@@ -147,17 +165,9 @@ oco config set OCO_MODEL=gpt-4
 or for as a cheaper option:
 
 ```sh
-oco config set OCO_MODEL=gpt-
+oco config set OCO_MODEL=gpt-4o-mini
 ```
 
-or for GPT-4 Turbo (Preview) which is more capable, has knowledge of world events up to April 2023, a 128k context window and 2-3x cheaper vs GPT-4:
-
-```sh
-oco config set OCO_MODEL=gpt-4-0125-preview
-```
-
-Make sure that you spell it `gpt-4` (lowercase) and that you have API access to the 4th model. Even if you have ChatGPT+, that doesn't necessarily mean that you have API access to GPT-4.
-
 ### Switch to Azure OpenAI
 
 By default OpenCommit uses [OpenAI](https://openai.com).
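The model switches in this hunk all go through the same `oco config set OCO_MODEL=...` call shown in the README. A sketch of the choices after 3.0.18, with the real CLI call left commented so the block runs without `oco` installed:

```shell
# Model names taken from the updated README hunks above.
DEFAULT_MODEL='gpt-4o'      # new default in 3.0.18
CHEAP_MODEL='gpt-4o-mini'   # the documented cheaper option

# The actual command would be, e.g.:
# oco config set OCO_MODEL=gpt-4o-mini

echo "default=$DEFAULT_MODEL cheaper=$CHEAP_MODEL"
```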
@@ -213,7 +223,7 @@ Replace `<module>` with either `conventional-commit` or `@commitlint`.
 
 #### Example:
 
-To switch to using
+To switch to using the `'@commitlint` prompt module, run:
 
 ```sh
 oco config set OCO_PROMPT_MODULE=@commitlint
@@ -381,7 +391,7 @@ jobs:
 OCO_OPENAI_BASE_PATH: ''
 OCO_DESCRIPTION: false
 OCO_EMOJI: false
-OCO_MODEL: gpt-
+OCO_MODEL: gpt-4o
 OCO_LANGUAGE: en
 OCO_PROMPT_MODULE: conventional-commit
 ```