opencommit 3.1.1 → 3.2.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +24 -38
  2. package/out/cli.cjs +2414 -195
  3. package/package.json +5 -5
package/README.md CHANGED
@@ -28,30 +28,19 @@ You can use OpenCommit by simply running it via the CLI like this `oco`. 2 secon
  npm install -g opencommit
  ```
 
- Alternatively run it via `npx opencommit` or `bunx opencommit`
-
- MacOS may ask to run the command with `sudo` when installing a package globally.
-
- 2. Get your API key from [OpenAI](https://platform.openai.com/account/api-keys). Make sure that you add your payment details, so the API works.
+ 2. Get your API key from [OpenAI](https://platform.openai.com/account/api-keys) or other supported LLM providers (we support them all). Make sure that you add your OpenAI payment details to your account, so the API works.
 
  3. Set the key to OpenCommit config:
 
  ```sh
- oco config set OCO_OPENAI_API_KEY=<your_api_key>
+ oco config set OCO_API_KEY=<your_api_key>
  ```
 
  Your API key is stored locally in the `~/.opencommit` config file.
 
  ## Usage
 
- You can call OpenCommit directly to generate a commit message for your staged changes:
-
- ```sh
- git add <files...>
- opencommit
- ```
-
- You can also use the `oco` shortcut:
+ You can call OpenCommit with `oco` command to generate a commit message for your staged changes:
 
  ```sh
  git add <files...>
@@ -70,21 +59,17 @@ You can also run it with local model through ollama:
 
  ```sh
  git add <files...>
- oco config set OCO_AI_PROVIDER='ollama'
+ oco config set OCO_AI_PROVIDER='ollama' OCO_MODEL='llama3:8b'
  ```
 
- If you want to use a model other than mistral (default), you can do so by setting the `OCO_AI_PROVIDER` environment variable as follows:
-
- ```sh
- oco config set OCO_AI_PROVIDER='ollama/llama3:8b'
- ```
+ Default model is `mistral`.
 
  If you have ollama that is set up in docker/ on another machine with GPUs (not locally), you can change the default endpoint url.
 
- You can do so by setting the `OCO_OLLAMA_API_URL` environment variable as follows:
+ You can do so by setting the `OCO_API_URL` environment variable as follows:
 
  ```sh
- oco config set OCO_OLLAMA_API_URL='http://192.168.1.10:11434/api/chat'
+ oco config set OCO_API_URL='http://192.168.1.10:11434/api/chat'
  ```
 
  where 192.168.1.10 is example of endpoint URL, where you have ollama set up.
@@ -121,22 +106,21 @@ Create a `.env` file and add OpenCommit config variables there like this:
 
  ```env
  ...
- OCO_OPENAI_API_KEY=<your OpenAI API token>
+ OCO_AI_PROVIDER=<openai (default), anthropic, azure, ollama, gemini, flowise>
+ OCO_API_KEY=<your OpenAI API token> // or other LLM provider API token
+ OCO_API_URL=<may be used to set proxy path to OpenAI api>
  OCO_TOKENS_MAX_INPUT=<max model token limit (default: 4096)>
  OCO_TOKENS_MAX_OUTPUT=<max response tokens (default: 500)>
- OCO_OPENAI_BASE_PATH=<may be used to set proxy path to OpenAI api>
  OCO_DESCRIPTION=<postface a message with ~3 sentences description of the changes>
  OCO_EMOJI=<boolean, add GitMoji>
- OCO_MODEL=<either 'gpt-4o', 'gpt-4', 'gpt-4-turbo', 'gpt-3.5-turbo' (default), 'gpt-3.5-turbo-0125', 'gpt-4-1106-preview', 'gpt-4-turbo-preview' or 'gpt-4-0125-preview' or any string basically, but it should be a valid model name>
+ OCO_MODEL=<either 'gpt-4o', 'gpt-4', 'gpt-4-turbo', 'gpt-3.5-turbo' (default), 'gpt-3.5-turbo-0125', 'gpt-4-1106-preview', 'gpt-4-turbo-preview' or 'gpt-4-0125-preview' or any Anthropic or Ollama model or any string basically, but it should be a valid model name>
  OCO_LANGUAGE=<locale, scroll to the bottom to see options>
  OCO_MESSAGE_TEMPLATE_PLACEHOLDER=<message template placeholder, default: '$msg'>
  OCO_PROMPT_MODULE=<either conventional-commit or @commitlint, default: conventional-commit>
  OCO_ONE_LINE_COMMIT=<one line commit message, default: false>
- OCO_AI_PROVIDER=<openai (default), anthropic, azure, ollama or ollama/model>
- ...
  ```
 
- This are not all the config options, but you get the point.
+ Global configs are same as local configs, but they are stored in the global `~/.opencommit` config file and set with `oco config set` command, e.g. `oco config set OCO_MODEL=gpt-4o`.
 
  ### Global config for all repos
 
@@ -188,26 +172,26 @@ or for as a cheaper option:
  oco config set OCO_MODEL=gpt-3.5-turbo
  ```
 
- ### Switch to Azure OpenAI
+ ### Switch to other LLM providers with a custom URL
 
  By default OpenCommit uses [OpenAI](https://openai.com).
 
- You could switch to [Azure OpenAI Service](https://learn.microsoft.com/azure/cognitive-services/openai/)🚀
+ You could switch to [Azure OpenAI Service](https://learn.microsoft.com/azure/cognitive-services/openai/) or Flowise or Ollama.
 
  ```sh
- opencommit config set OCO_AI_PROVIDER=azure
- ```
+ oco config set OCO_AI_PROVIDER=azure OCO_API_KEY=<your_azure_api_key> OCO_API_URL=<your_azure_endpoint>
 
- Of course need to set 'OCO_OPENAI_API_KEY'. And also need to set the
- 'OPENAI_BASE_PATH' for the endpoint and set the deployment name to
- 'model'.
+ oco config set OCO_AI_PROVIDER=flowise OCO_API_KEY=<your_flowise_api_key> OCO_API_URL=<your_flowise_endpoint>
+
+ oco config set OCO_AI_PROVIDER=ollama OCO_API_KEY=<your_ollama_api_key> OCO_API_URL=<your_ollama_endpoint>
+ ```
 
  ### Locale configuration
 
  To globally specify the language used to generate commit messages:
 
  ```sh
- # de, German ,Deutsch
+ # de, German, Deutsch
  oco config set OCO_LANGUAGE=de
  oco config set OCO_LANGUAGE=German
  oco config set OCO_LANGUAGE=Deutsch
@@ -223,12 +207,14 @@ All available languages are currently listed in the [i18n](https://github.com/di
 
  ### Push to git (gonna be deprecated)
 
- A prompt to ushing to git is on by default but if you would like to turn it off just use:
+ A prompt for pushing to git is on by default but if you would like to turn it off just use:
 
  ```sh
  oco config set OCO_GITPUSH=false
  ```
 
+ and it will exit right after commit is confirmed without asking if you would like to push to remote.
+
  ### Switch to `@commitlint`
 
  OpenCommit allows you to choose the prompt module used to generate commit messages. By default, OpenCommit uses its conventional-commit message generator. However, you can switch to using the `@commitlint` prompt module if you prefer. This option lets you generate commit messages in respect with the local config.
@@ -403,7 +389,7 @@ jobs:
  # set openAI api key in repo actions secrets,
  # for openAI keys go to: https://platform.openai.com/account/api-keys
  # for repo secret go to: <your_repo_url>/settings/secrets/actions
- OCO_OPENAI_API_KEY: ${{ secrets.OCO_OPENAI_API_KEY }}
+ OCO_API_KEY: ${{ secrets.OCO_API_KEY }}
 
  # customization
  OCO_TOKENS_MAX_INPUT: 4096
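
The README changes above boil down to a config-key rename: `OCO_OPENAI_API_KEY` becomes `OCO_API_KEY`, and `OCO_OPENAI_BASE_PATH` / `OCO_OLLAMA_API_URL` collapse into a single `OCO_API_URL`. A minimal migration sketch after upgrading, using only the `oco config set` commands shown in the diff (the key value and Ollama endpoint are placeholders):

```sh
# Re-set the API key under its new name (old OCO_OPENAI_API_KEY is no longer read).
oco config set OCO_API_KEY=<your_api_key>

# If you previously used OCO_OLLAMA_API_URL for a remote Ollama instance,
# the endpoint now lives in the provider-agnostic OCO_API_URL:
oco config set OCO_AI_PROVIDER='ollama' OCO_MODEL='llama3:8b'
oco config set OCO_API_URL='http://192.168.1.10:11434/api/chat'
```

CI users should likewise rename the secret reference in their workflow from `secrets.OCO_OPENAI_API_KEY` to `secrets.OCO_API_KEY`, as the last hunk shows.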