gitme-ai 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
gitme_ai-0.1.0/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 RohitB2005
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
@@ -0,0 +1,247 @@
+ Metadata-Version: 2.4
+ Name: gitme-ai
+ Version: 0.1.0
+ Summary: AI-powered git commit message generator
+ Author-email: Rohit Balaji <rohitvb26@gmail.com>
+ License: MIT
+ Project-URL: Homepage, https://github.com/RohitB2005/gitme-cli
+ Project-URL: Repository, https://github.com/RohitB2005/gitme-cli
+ Project-URL: Bug Tracker, https://github.com/RohitB2005/gitme-cli/issues
+ Keywords: git,cli,ai,commit,llm,developer-tools
+ Classifier: Development Status :: 3 - Alpha
+ Classifier: Environment :: Console
+ Classifier: Intended Audience :: Developers
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Topic :: Software Development :: Version Control :: Git
+ Requires-Python: >=3.11
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: click>=8.0
+ Requires-Dist: rich>=13.0
+ Requires-Dist: pyperclip>=1.8
+ Requires-Dist: requests>=2.28
+ Requires-Dist: tomli-w>=1.0
+ Dynamic: license-file
+
+ # gitme-cli
+
+ AI-powered git commit message generator. Reads your staged changes and generates a well-formatted [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0-beta.2/) message using a local or cloud LLM.
+
+ **Local-first and privacy-friendly by default** — your diffs never leave your machine unless you choose a cloud provider.
+
+ ---
+
+ ## Requirements
+
+ - Python 3.11 or higher
+ - Git
+
+ ---
+
+ ## Installation
+
+ ```bash
+ pip install gitme-cli
+ ```
+
+ ---
+
+ ## Quick Start
+
+ Stage your changes as you normally would, then run `gitme`:
+
+ ```bash
+ git add .
+ gitme
+ ```
+
+ That's it. gitme reads your staged diff, sends it to the AI, and prints a suggested commit message in your terminal.
+
+ By default it uses **Ollama with llama3.2** running locally. See [Providers](#providers) below to set up Ollama or switch to a cloud provider.
+
+ ---
+
+ ## Usage
+
+ ### Basic
+
+ ```bash
+ gitme
+ ```
+
+ ### Copy to clipboard
+
+ ```bash
+ gitme --copy
+ ```
+
+ The message is printed to the terminal and copied to your clipboard. You can then paste it directly into `git commit -m "..."`.
+
+ ### Add extra context
+
+ ```bash
+ gitme --context "this is a hotfix for the login bug in production"
+ ```
+
+ The context is appended to the prompt so the model has more information to work with. Useful when the diff alone doesn't tell the full story.
+
+ ---
+
+ ## Providers
+
+ gitme supports three providers. Set your preferred one once and it applies to every future run.
+
+ ### Ollama (default — local, free, private)
+
+ Ollama runs models on your own machine. No API key needed, and no data leaves your computer.
+
+ **1. Install Ollama**
+
+ Download from [ollama.com](https://ollama.com) and open the app. You'll see it in your menu bar (or system tray) when it's running.
+
+ **2. Pull the model**
+
+ ```bash
+ ollama pull llama3.2
+ ```
+
+ **3. Run gitme**
+
+ ```bash
+ gitme
+ ```
+
+ Ollama is already the default — nothing else to configure.
+
+ To use a different Ollama model:
+
+ ```bash
+ gitme-config set model mistral
+ ```
+
+ ---
+
+ ### OpenAI (cloud, best quality)
+
+ Uses the OpenAI API. You need an API key and credit — `gpt-4o-mini` is cheap and works well for commit messages.
+
+ **1. Configure gitme**
+
+ ```bash
+ gitme-config set openai_api_key YOUR_API_KEY
+ gitme-config set provider openai
+ gitme-config set model gpt-4o-mini
+ ```
+
+ **2. Run gitme**
+
+ ```bash
+ gitme
+ ```
+
+ Your API key is stored locally in `~/.gitme.toml` and never shared.
+
+ ---
+
+ ### OpenRouter (cloud, free tier available)
+
+ OpenRouter provides access to many models, including free ones. No credit card is required for the free tier.
+
+ Get a free API key at [openrouter.ai](https://openrouter.ai).
+
+ **1. Configure gitme**
+
+ ```bash
+ gitme-config set openrouter_api_key YOUR_API_KEY
+ gitme-config set provider openrouter
+ gitme-config set model "nvidia/nemotron-3-nano-30b-a3b:free"
+ ```
+
+ **2. Run gitme**
+
+ ```bash
+ gitme
+ ```
+
+ ---
+
+ ## Configuration
+
+ Configuration is stored in `~/.gitme.toml` and shared across all your projects.
+
+ ### View current config
+
+ ```bash
+ gitme-config show
+ ```
+
+ Output:
+
+ ```
+ provider: ollama
+ model: llama3.2
+ style: conventional
+ ```
+
+ ### Set a value
+
+ ```bash
+ gitme-config set <key> <value>
+ ```
+
+ **Available keys:**
+
+ | Key | Description | Default |
+ |-----|-------------|---------|
+ | `provider` | Which provider to use: `ollama`, `openai`, or `openrouter` | `ollama` |
+ | `model` | Model name for the active provider | `llama3.2` |
+ | `style` | Commit style (currently `conventional`) | `conventional` |
+ | `openai_api_key` | Your OpenAI API key | — |
+ | `openrouter_api_key` | Your OpenRouter API key | — |
+
+ ### Examples
+
+ ```bash
+ # Switch to OpenAI
+ gitme-config set provider openai
+ gitme-config set model gpt-4o-mini
+ gitme-config set openai_api_key sk-...
+
+ # Switch back to Ollama
+ gitme-config set provider ollama
+ gitme-config set model llama3.2
+ ```
+
+ ---
+
+ ## Commit Format
+
+ gitme follows the [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0-beta.2/) specification:
+
+ ```
+ type(scope): short description
+
+ - optional body explaining why the change was made
+
+ Fixes #123 (optional footer)
+ ```
+
+ **Commit types:**
+
+ | Type | When to use |
+ |------|-------------|
+ | `feat` | New feature or capability |
+ | `fix` | Bug fix |
+ | `refactor` | Code restructured without changing behaviour |
+ | `docs` | Documentation changes only |
+ | `style` | Formatting or whitespace, no logic changes |
+ | `test` | Adding or updating tests |
+ | `chore` | Maintenance, config changes, dependency updates |
+
+ ---
+
+ ## License
+
+ MIT
File without changes
@@ -0,0 +1,91 @@
+ import click
+ from rich.console import Console
+ from rich.panel import Panel
+
+ from .diff import get_staged_diff
+ from .prompt import SYSTEM_PROMPT, build_prompt
+ from .providers.ollama import OllamaProvider
+ from .providers.openai import OpenAIProvider
+ from .providers.openrouter import OpenRouterProvider
+ from .config import load_config, save_config, show_config
+
+ console = Console()
+
+
+ def get_provider(cfg: dict):
+     provider = cfg.get("provider", "ollama")
+     model = cfg.get("model", "llama3.2")
+
+     if provider == "ollama":
+         return OllamaProvider(model=model)
+     elif provider == "openai":
+         return OpenAIProvider(model=model, api_key=cfg.get("openai_api_key", ""))
+     elif provider == "openrouter":
+         return OpenRouterProvider(model=model, api_key=cfg.get("openrouter_api_key", ""))
+     else:
+         raise RuntimeError(
+             f"Unknown provider '{provider}'. Choose: ollama, openai, openrouter"
+         )
+
+
+ @click.command()
+ @click.option("--copy", is_flag=True, help="Copy the result to clipboard.")
+ @click.option("--context", default="", help="Extra context to include in the prompt.")
+ def main(copy, context):
+     """Generate a git commit message from your staged changes."""
+     try:
+         diff = get_staged_diff()
+     except (EnvironmentError, ValueError, RuntimeError) as e:
+         # RuntimeError is included because get_staged_diff raises it
+         # when "git diff --cached" itself fails.
+         console.print(f"[red]Error:[/red] {e}")
+         raise SystemExit(1)
+
+     user_prompt = build_prompt(diff)
+     if context:
+         user_prompt += f"\n\nExtra context: {context}"
+
+     console.print("[dim]Generating commit message...[/dim]")
+
+     try:
+         cfg = load_config()
+         provider = get_provider(cfg)
+         message = provider.generate(SYSTEM_PROMPT, user_prompt)
+     except RuntimeError as e:
+         console.print(f"[red]Error:[/red] {e}")
+         raise SystemExit(1)
+
+     console.print(Panel(message, title="suggested commit message", border_style="green"))
+
+     if copy:
+         try:
+             import pyperclip
+             pyperclip.copy(message)
+             console.print("[dim]Copied to clipboard.[/dim]")
+         except Exception:
+             console.print("[yellow]Could not copy to clipboard.[/yellow]")
+
+
+ @click.group()
+ def config():
+     """Manage gitme configuration."""
+     pass
+
+
+ @config.command("set")
+ @click.argument("key")
+ @click.argument("value")
+ def config_set(key, value):
+     """Set a config value. E.g.: gitme-config set provider openai"""
+     valid_keys = ["provider", "model", "style", "openai_api_key", "openrouter_api_key"]
+     if key not in valid_keys:
+         console.print(f"[red]Unknown key '{key}'. Valid keys: {', '.join(valid_keys)}[/red]")
+         raise SystemExit(1)
+     save_config({key: value})
+     console.print(f"[green]Set {key} = {value}[/green]")
+
+
+ @config.command("show")
+ def config_show():
+     """Show current configuration."""
+     cfg = show_config()
+     for key, value in cfg.items():
+         console.print(f"[dim]{key}:[/dim] {value}", highlight=False)
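The `get_provider` dispatch above could also be expressed as a registry mapping, which keeps the valid-provider list in one place. A minimal sketch under that assumption — the string values here are stand-ins for the real provider classes:

```python
# Hypothetical registry variant of get_provider's if/elif chain.
# Strings stand in for OllamaProvider / OpenAIProvider / OpenRouterProvider.
PROVIDERS = {
    "ollama": "OllamaProvider",
    "openai": "OpenAIProvider",
    "openrouter": "OpenRouterProvider",
}

def pick_provider(cfg: dict) -> str:
    # Missing key falls back to the documented default, "ollama".
    name = cfg.get("provider", "ollama")
    if name not in PROVIDERS:
        raise RuntimeError(
            f"Unknown provider '{name}'. Choose: {', '.join(PROVIDERS)}"
        )
    return PROVIDERS[name]

choice = pick_provider({})  # empty config -> default provider
```

One advantage of the registry form is that the error message and the dispatch can never drift out of sync when a provider is added.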
@@ -0,0 +1,31 @@
+ import tomllib
+ import tomli_w
+ from pathlib import Path
+
+ CONFIG_PATH = Path.home() / ".gitme.toml"
+
+ DEFAULTS = {
+     "provider": "ollama",
+     "model": "llama3.2",
+     "style": "conventional",
+ }
+
+ def load_config() -> dict:
+     if not CONFIG_PATH.exists():
+         return DEFAULTS.copy()
+
+     with open(CONFIG_PATH, "rb") as f:
+         user_config = tomllib.load(f)
+
+     return {**DEFAULTS, **user_config}
+
+ def save_config(updates: dict) -> None:
+     config = load_config()
+     config.update(updates)
+
+     with open(CONFIG_PATH, "wb") as f:
+         tomli_w.dump(config, f)
+
+
+ def show_config() -> dict:
+     return load_config()
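The `{**DEFAULTS, **user_config}` merge in `load_config` means user-set keys override defaults while unset keys fall back. A small self-contained sketch of that semantics (no file I/O, names are illustrative):

```python
# Demonstrates the defaults-merge semantics used by load_config:
# in a dict unpack, later mappings win, so user keys override defaults.
DEFAULTS = {"provider": "ollama", "model": "llama3.2", "style": "conventional"}

def merge_config(user_config: dict) -> dict:
    return {**DEFAULTS, **user_config}

merged = merge_config({"provider": "openai", "openai_api_key": "sk-test"})
```

Note that keys outside `DEFAULTS` (such as the API keys) simply pass through, which is why `gitme-config set openai_api_key ...` works without a default entry.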
@@ -0,0 +1,27 @@
+ import subprocess
+
+ def get_staged_diff() -> str:
+     repo_exists = subprocess.run(
+         ["git", "rev-parse", "--is-inside-work-tree"],
+         capture_output=True,
+         text=True
+     )
+
+     if repo_exists.returncode != 0:
+         raise EnvironmentError("Not currently inside a git repository.")
+
+     result = subprocess.run(
+         ["git", "diff", "--cached"],
+         capture_output=True,
+         text=True
+     )
+
+     if result.returncode != 0:
+         raise RuntimeError(f"Git error: {result.stderr.strip()}")
+
+     diff = result.stdout.strip()
+
+     if not diff:
+         raise ValueError("No staged changes found. Run 'git add' first.")
+
+     return diff
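The same check sequence can be written with an injectable process runner, which makes it testable without patching `subprocess` (as the project's tests do). A sketch under that assumption — `staged_diff` and `fake_run` are hypothetical names, not part of the package:

```python
import subprocess
from types import SimpleNamespace

def staged_diff(run=subprocess.run):
    # Same three-stage flow as get_staged_diff, but the runner is a parameter.
    probe = run(["git", "rev-parse", "--is-inside-work-tree"],
                capture_output=True, text=True)
    if probe.returncode != 0:
        raise EnvironmentError("Not currently inside a git repository.")
    result = run(["git", "diff", "--cached"], capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"Git error: {result.stderr.strip()}")
    diff = result.stdout.strip()
    if not diff:
        raise ValueError("No staged changes found. Run 'git add' first.")
    return diff

# Exercise the happy path with a canned runner instead of a real repo.
def fake_run(cmd, **kwargs):
    return SimpleNamespace(returncode=0,
                           stdout="diff --git a/foo b/foo\n+hi", stderr="")

demo = staged_diff(run=fake_run)
```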
@@ -0,0 +1,40 @@
+ SYSTEM_PROMPT = """You are an expert software engineer writing a git commit message based on staged changes from a git diff.
+
+ Your output MUST follow the Conventional Commits specification.
+
+ Subject line rules:
+ - format: type(scope): description
+ - type must be one of: feat, fix, refactor, docs, style, test, chore
+ - scope is optional — only include it if changes target one clear area, never use a filename
+ - less than 72 characters, imperative mood ("add" not "adds" or "added"), no trailing period
+ - description must capture intent, not list what files changed
+
+ Commit types:
+ - feat: introduces a new feature or capability
+ - fix: patches a bug or error
+ - refactor: restructures code without changing behaviour or fixing a bug
+ - docs: documentation changes only, no logic changes
+ - style: formatting or whitespace only, no logic changes
+ - test: adding or updating tests only
+ - chore: maintenance, config changes, dependency updates
+
+ Body rules:
+ - omit the body entirely unless the WHY behind the change is genuinely non-obvious
+ - if included, start one blank line after the subject line
+ - write 2-4 bullet points explaining WHY, not what was changed
+ - do NOT mention filenames, function names, variable names, or command names
+ - do NOT use filler words or opinions ("intuitive", "user-friendly", "clean")
+
+ Footer rules:
+ - omit the footer entirely unless there is a real breaking change or a real issue reference
+ - BREAKING CHANGE only applies if existing users' workflows would break — adding features never qualifies
+ - do NOT invent issue numbers — only include "Fixes #N" if the diff references a real issue
+ - "No breaking changes" is NOT a valid footer — omit the footer entirely instead
+
+ Output rules:
+ - output the commit message only
+ - no explanations, no preamble, no markdown code fences"""
+
+
+ def build_prompt(diff: str) -> str:
+     return f"Here is the staged git diff to write a commit message for:\n\n{diff}"
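`build_prompt` produces only the user message; `cli.py` then appends any `--context` text and each provider pairs it with `SYSTEM_PROMPT` in a two-message chat payload. A sketch of that assembly (the `build_messages` helper is illustrative, not part of the package):

```python
def build_messages(system_prompt: str, diff: str, context: str = "") -> list[dict]:
    # Mirrors how cli.py and the providers combine the pieces:
    # user prompt = fixed preamble + diff, plus optional extra context.
    user = f"Here is the staged git diff to write a commit message for:\n\n{diff}"
    if context:
        user += f"\n\nExtra context: {context}"
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user},
    ]

msgs = build_messages("You write commit messages.", "+ hello", context="hotfix")
```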
File without changes
@@ -0,0 +1,8 @@
+ from abc import ABC, abstractmethod
+
+ class BaseProvider(ABC):
+
+     @abstractmethod
+     def generate(self, system_prompt: str, user_prompt: str) -> str:
+         """Send prompts to the LLM and return commit message."""
+         pass
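Because `generate` is the only abstract method, adding a new backend means one small subclass. A minimal sketch — `EchoProvider` is a hypothetical offline provider, useful for tests, not part of the package:

```python
from abc import ABC, abstractmethod

class BaseProvider(ABC):
    @abstractmethod
    def generate(self, system_prompt: str, user_prompt: str) -> str:
        """Send prompts to the LLM and return commit message."""

class EchoProvider(BaseProvider):
    # Hypothetical no-network provider: returns a canned message,
    # handy for exercising the CLI flow without an LLM.
    def generate(self, system_prompt: str, user_prompt: str) -> str:
        return "chore: placeholder commit message"

msg = EchoProvider().generate("sys", "user")
```

ABC also guarantees that a provider missing `generate` fails loudly at instantiation time rather than at call time.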
@@ -0,0 +1,33 @@
+ import requests
+ from .base import BaseProvider
+
+
+ class OllamaProvider(BaseProvider):
+
+     def __init__(self, model: str = "llama3.2"):
+         self.model = model
+         self.url = "http://127.0.0.1:11434/api/chat"
+
+     def generate(self, system_prompt: str, user_prompt: str) -> str:
+         payload = {
+             "model": self.model,
+             "stream": False,
+             "messages": [
+                 {"role": "system", "content": system_prompt},
+                 {"role": "user", "content": user_prompt},
+             ]
+         }
+
+         try:
+             response = requests.post(self.url, json=payload, timeout=60)
+             response.raise_for_status()
+         except requests.exceptions.ConnectionError:
+             raise RuntimeError(
+                 "Could not connect to Ollama. Is it running? Try: ollama serve"
+             )
+         except requests.exceptions.Timeout:
+             raise RuntimeError(
+                 "Ollama took too long to respond. Try a smaller model."
+             )
+
+         return response.json()["message"]["content"].strip()
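The request body above is the interesting part: `stream: False` asks Ollama's `/api/chat` endpoint for a single JSON object instead of a chunked stream, which is what lets `generate` read `["message"]["content"]` directly. A self-contained sketch of the payload construction (the helper name is illustrative):

```python
def ollama_payload(model: str, system_prompt: str, user_prompt: str) -> dict:
    # Same request body OllamaProvider posts; stream=False requests one
    # complete JSON response rather than newline-delimited chunks.
    return {
        "model": model,
        "stream": False,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
    }

payload = ollama_payload("llama3.2", "sys", "user")
```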
@@ -0,0 +1,43 @@
+ import requests
+ from .base import BaseProvider
+
+ class OpenAIProvider(BaseProvider):
+
+     def __init__(self, model: str = "gpt-4o-mini", api_key: str = ""):
+         self.model = model
+         self.api_key = api_key
+         self.url = "https://api.openai.com/v1/chat/completions"
+
+     def generate(self, system_prompt: str, user_prompt: str) -> str:
+         if not self.api_key:
+             raise RuntimeError(
+                 "No OpenAI API key set. Run: gitme-config set openai_api_key YOUR_KEY"
+             )
+
+         headers = {
+             "Authorization": f"Bearer {self.api_key}",
+             "Content-Type": "application/json",
+         }
+         payload = {
+             "model": self.model,
+             "messages": [
+                 {"role": "system", "content": system_prompt},
+                 {"role": "user", "content": user_prompt},
+             ]
+         }
+
+         try:
+             response = requests.post(self.url, json=payload, headers=headers, timeout=30)
+             response.raise_for_status()
+         except requests.exceptions.ConnectionError:
+             raise RuntimeError("Could not connect to OpenAI. Check your internet connection.")
+         except requests.exceptions.Timeout:
+             raise RuntimeError("OpenAI took too long to respond.")
+         except requests.exceptions.HTTPError as e:
+             if response.status_code == 401:
+                 raise RuntimeError("Invalid OpenAI API key.")
+             if response.status_code == 429:
+                 raise RuntimeError("OpenAI rate limit hit. Wait a moment and try again.")
+             raise RuntimeError(f"OpenAI error: {e}")
+
+         return response.json()["choices"][0]["message"]["content"].strip()
@@ -0,0 +1,44 @@
+ import requests
+ from .base import BaseProvider
+
+
+ class OpenRouterProvider(BaseProvider):
+
+     def __init__(self, model: str = "nvidia/nemotron-3-nano-30b-a3b:free", api_key: str = ""):
+         self.model = model
+         self.api_key = api_key
+         self.url = "https://openrouter.ai/api/v1/chat/completions"
+
+     def generate(self, system_prompt: str, user_prompt: str) -> str:
+         if not self.api_key:
+             raise RuntimeError(
+                 "No OpenRouter API key set. Run: gitme-config set openrouter_api_key YOUR_KEY"
+             )
+
+         headers = {
+             "Authorization": f"Bearer {self.api_key}",
+             "Content-Type": "application/json",
+         }
+         payload = {
+             "model": self.model,
+             "messages": [
+                 {"role": "system", "content": system_prompt},
+                 {"role": "user", "content": user_prompt},
+             ]
+         }
+
+         try:
+             response = requests.post(self.url, json=payload, headers=headers, timeout=30)
+             response.raise_for_status()
+         except requests.exceptions.ConnectionError:
+             raise RuntimeError("Could not connect to OpenRouter. Check your internet connection.")
+         except requests.exceptions.Timeout:
+             raise RuntimeError("OpenRouter took too long to respond.")
+         except requests.exceptions.HTTPError as e:
+             if response.status_code == 401:
+                 raise RuntimeError("Invalid OpenRouter API key.")
+             if response.status_code == 429:
+                 raise RuntimeError("OpenRouter rate limit hit. Wait a moment and try again.")
+             raise RuntimeError(f"OpenRouter error: {e}")
+
+         return response.json()["choices"][0]["message"]["content"].strip()
@@ -0,0 +1,20 @@
+ LICENSE
+ README.md
+ pyproject.toml
+ gitme/__init__.py
+ gitme/cli.py
+ gitme/config.py
+ gitme/diff.py
+ gitme/prompt.py
+ gitme/providers/__init__.py
+ gitme/providers/base.py
+ gitme/providers/ollama.py
+ gitme/providers/openai.py
+ gitme/providers/openrouter.py
+ gitme_ai.egg-info/PKG-INFO
+ gitme_ai.egg-info/SOURCES.txt
+ gitme_ai.egg-info/dependency_links.txt
+ gitme_ai.egg-info/requires.txt
+ gitme_ai.egg-info/top_level.txt
+ tests/test_diff.py
+ tests/test_prompt.py
@@ -0,0 +1,5 @@
+ click>=8.0
+ rich>=13.0
+ pyperclip>=1.8
+ requests>=2.28
+ tomli-w>=1.0
@@ -0,0 +1 @@
+ gitme
@@ -0,0 +1,33 @@
+ [project]
+ name = "gitme-ai"
+ version = "0.1.0"
+ description = "AI-powered git commit message generator"
+ readme = "README.md"
+ requires-python = ">=3.11"
+ license = { text = "MIT" }
+ authors = [
+     { name = "Rohit Balaji", email = "rohitvb26@gmail.com" }
+ ]
+ keywords = ["git", "cli", "ai", "commit", "llm", "developer-tools"]
+ classifiers = [
+     "Development Status :: 3 - Alpha",
+     "Environment :: Console",
+     "Intended Audience :: Developers",
+     "License :: OSI Approved :: MIT License",
+     "Programming Language :: Python :: 3",
+     "Programming Language :: Python :: 3.11",
+     "Programming Language :: Python :: 3.12",
+     "Topic :: Software Development :: Version Control :: Git",
+ ]
+ dependencies = [
+     "click>=8.0",
+     "rich>=13.0",
+     "pyperclip>=1.8",
+     "requests>=2.28",
+     "tomli-w>=1.0",
+ ]
+
+ [project.urls]
+ Homepage = "https://github.com/RohitB2005/gitme-cli"
+ Repository = "https://github.com/RohitB2005/gitme-cli"
+ "Bug Tracker" = "https://github.com/RohitB2005/gitme-cli/issues"
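Notably, the `pyproject.toml` shown here defines no `[project.scripts]` table, yet the README invokes `gitme` and `gitme-config` as commands. If those console scripts are meant to be installed by this file, a table along these lines would be needed — the module paths are assumptions inferred from `cli.py` (`main` and `config` are the Click entry objects there):

```toml
# Hypothetical entry points, inferred from gitme/cli.py; not present
# in the published pyproject.toml.
[project.scripts]
gitme = "gitme.cli:main"
gitme-config = "gitme.cli:config"
```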
@@ -0,0 +1,4 @@
+ [egg_info]
+ tag_build =
+ tag_date = 0
+
@@ -0,0 +1,33 @@
+ import pytest
+ from unittest.mock import patch, MagicMock
+ from gitme.diff import get_staged_diff
+
+
+ def test_raises_outside_git_repo():
+     mock = MagicMock()
+     mock.returncode = 1
+     with patch("subprocess.run", return_value=mock):
+         with pytest.raises(EnvironmentError, match="git repository"):
+             get_staged_diff()
+
+
+ def test_raises_when_nothing_staged():
+     def fake_run(cmd, **kwargs):
+         m = MagicMock()
+         m.returncode = 0
+         m.stdout = ""
+         return m
+     with patch("subprocess.run", side_effect=fake_run):
+         with pytest.raises(ValueError, match="No staged changes"):
+             get_staged_diff()
+
+
+ def test_returns_diff_string():
+     def fake_run(cmd, **kwargs):
+         m = MagicMock()
+         m.returncode = 0
+         m.stdout = "diff --git a/foo.py b/foo.py\n+hello"
+         return m
+     with patch("subprocess.run", side_effect=fake_run):
+         result = get_staged_diff()
+         assert "diff" in result
@@ -0,0 +1,16 @@
+ from gitme.prompt import build_prompt, SYSTEM_PROMPT
+
+
+ def test_build_prompt_contains_diff():
+     diff = "diff --git a/foo.py b/foo.py\n+hello"
+     result = build_prompt(diff)
+     assert diff in result
+
+
+ def test_system_prompt_mentions_conventional_commits():
+     assert "Conventional Commits" in SYSTEM_PROMPT
+
+
+ def test_system_prompt_lists_types():
+     for t in ["feat", "fix", "refactor", "docs", "style", "test", "chore"]:
+         assert t in SYSTEM_PROMPT