gac-1.7.0.tar.gz → gac-1.8.0.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release.



Files changed (35)
  1. {gac-1.7.0 → gac-1.8.0}/PKG-INFO +2 -8
  2. {gac-1.7.0 → gac-1.8.0}/README.md +1 -7
  3. {gac-1.7.0 → gac-1.8.0}/src/gac/__version__.py +1 -1
  4. {gac-1.7.0 → gac-1.8.0}/src/gac/ai.py +2 -0
  5. {gac-1.7.0 → gac-1.8.0}/src/gac/ai_utils.py +1 -1
  6. {gac-1.7.0 → gac-1.8.0}/src/gac/init_cli.py +2 -1
  7. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/__init__.py +2 -0
  8. gac-1.8.0/src/gac/providers/chutes.py +71 -0
  9. {gac-1.7.0 → gac-1.8.0}/.gitignore +0 -0
  10. {gac-1.7.0 → gac-1.8.0}/LICENSE +0 -0
  11. {gac-1.7.0 → gac-1.8.0}/pyproject.toml +0 -0
  12. {gac-1.7.0 → gac-1.8.0}/src/gac/__init__.py +0 -0
  13. {gac-1.7.0 → gac-1.8.0}/src/gac/cli.py +0 -0
  14. {gac-1.7.0 → gac-1.8.0}/src/gac/config.py +0 -0
  15. {gac-1.7.0 → gac-1.8.0}/src/gac/config_cli.py +0 -0
  16. {gac-1.7.0 → gac-1.8.0}/src/gac/constants.py +0 -0
  17. {gac-1.7.0 → gac-1.8.0}/src/gac/diff_cli.py +0 -0
  18. {gac-1.7.0 → gac-1.8.0}/src/gac/errors.py +0 -0
  19. {gac-1.7.0 → gac-1.8.0}/src/gac/git.py +0 -0
  20. {gac-1.7.0 → gac-1.8.0}/src/gac/main.py +0 -0
  21. {gac-1.7.0 → gac-1.8.0}/src/gac/preprocess.py +0 -0
  22. {gac-1.7.0 → gac-1.8.0}/src/gac/prompt.py +0 -0
  23. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/anthropic.py +0 -0
  24. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/cerebras.py +0 -0
  25. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/gemini.py +0 -0
  26. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/groq.py +0 -0
  27. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/lmstudio.py +0 -0
  28. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/ollama.py +0 -0
  29. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/openai.py +0 -0
  30. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/openrouter.py +0 -0
  31. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/streamlake.py +0 -0
  32. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/synthetic.py +0 -0
  33. {gac-1.7.0 → gac-1.8.0}/src/gac/providers/zai.py +0 -0
  34. {gac-1.7.0 → gac-1.8.0}/src/gac/security.py +0 -0
  35. {gac-1.7.0 → gac-1.8.0}/src/gac/utils.py +0 -0
{gac-1.7.0 → gac-1.8.0}/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: gac
-Version: 1.7.0
+Version: 1.8.0
 Summary: AI-powered Git commit message generator with multi-provider support
 Project-URL: Homepage, https://github.com/cellwebb/gac
 Project-URL: Documentation, https://github.com/cellwebb/gac#readme
@@ -58,7 +58,7 @@ Description-Content-Type: text/markdown
 
 - **LLM-Powered Commit Messages:** Automatically generates clear, concise, and context-aware commit messages using large language models.
 - **Deep Contextual Analysis:** Understands your code by analyzing staged changes, repository structure, and recent commit history to provide highly relevant suggestions.
-- **Multi-Provider & Model Support:** Flexibly works with leading AI providers (Anthropic, Cerebras, Gemini, Groq, OpenAI, OpenRouter, Streamlake/Vanchin, Z.AI) and local providers (LM Studio, Ollama), easily configured through an interactive setup or environment variables.
+- **Multi-Provider & Model Support:** Flexibly works with leading AI providers (Anthropic, Cerebras, Chutes.ai, Gemini, Groq, OpenAI, OpenRouter, Streamlake/Vanchin, Synthetic.new, & Z.AI) and local providers (LM Studio & Ollama), easily configured through an interactive setup or environment variables.
 - **Seamless Git Workflow:** Integrates smoothly into your existing Git routine as a simple drop-in replacement for `git commit`.
 - **Extensive Customization:** Tailor commit messages to your needs with a rich set of flags, including one-liners (`-o`), AI hints (`-h`), scope inference (`-s`), and specific model selection (`-m`).
 - **Streamlined Workflow Commands:** Boost your productivity with convenient options to stage all changes (`-a`), auto-confirm commits (`-y`), and push to your remote repository (`-p`) in a single step.
@@ -70,12 +70,6 @@ Description-Content-Type: text/markdown
 
 gac analyzes your staged changes to generate high-quality commit messages with the help of large language models. The tool uses a sophisticated prompt architecture that separates system instructions from user data, enabling better AI understanding and more consistent results.
 
-### Technical Architecture
-
-- **Dual-Prompt System**: GAC uses a separated prompt architecture where system instructions (role definition, conventions, examples) are sent as system messages, while git data (diffs, status) are sent as user messages. This follows AI best practices for improved model performance.
-- **Smart Context Analysis**: The tool examines your repository structure, recent commit history, and README files to understand the broader context of your changes.
-- **Intelligent Diff Processing**: Large diffs are automatically preprocessed to focus on the most important changes while staying within token limits.
-
 ## How to Use
 
 ```sh
{gac-1.7.0 → gac-1.8.0}/README.md

@@ -14,7 +14,7 @@
 
 - **LLM-Powered Commit Messages:** Automatically generates clear, concise, and context-aware commit messages using large language models.
 - **Deep Contextual Analysis:** Understands your code by analyzing staged changes, repository structure, and recent commit history to provide highly relevant suggestions.
-- **Multi-Provider & Model Support:** Flexibly works with leading AI providers (Anthropic, Cerebras, Gemini, Groq, OpenAI, OpenRouter, Streamlake/Vanchin, Z.AI) and local providers (LM Studio, Ollama), easily configured through an interactive setup or environment variables.
+- **Multi-Provider & Model Support:** Flexibly works with leading AI providers (Anthropic, Cerebras, Chutes.ai, Gemini, Groq, OpenAI, OpenRouter, Streamlake/Vanchin, Synthetic.new, & Z.AI) and local providers (LM Studio & Ollama), easily configured through an interactive setup or environment variables.
 - **Seamless Git Workflow:** Integrates smoothly into your existing Git routine as a simple drop-in replacement for `git commit`.
 - **Extensive Customization:** Tailor commit messages to your needs with a rich set of flags, including one-liners (`-o`), AI hints (`-h`), scope inference (`-s`), and specific model selection (`-m`).
 - **Streamlined Workflow Commands:** Boost your productivity with convenient options to stage all changes (`-a`), auto-confirm commits (`-y`), and push to your remote repository (`-p`) in a single step.
@@ -26,12 +26,6 @@
 
 gac analyzes your staged changes to generate high-quality commit messages with the help of large language models. The tool uses a sophisticated prompt architecture that separates system instructions from user data, enabling better AI understanding and more consistent results.
 
-### Technical Architecture
-
-- **Dual-Prompt System**: GAC uses a separated prompt architecture where system instructions (role definition, conventions, examples) are sent as system messages, while git data (diffs, status) are sent as user messages. This follows AI best practices for improved model performance.
-- **Smart Context Analysis**: The tool examines your repository structure, recent commit history, and README files to understand the broader context of your changes.
-- **Intelligent Diff Processing**: Large diffs are automatically preprocessed to focus on the most important changes while staying within token limits.
-
 ## How to Use
 
 ```sh
{gac-1.7.0 → gac-1.8.0}/src/gac/__version__.py

@@ -1,3 +1,3 @@
 """Version information for gac package."""
 
-__version__ = "1.7.0"
+__version__ = "1.8.0"
{gac-1.7.0 → gac-1.8.0}/src/gac/ai.py

@@ -12,6 +12,7 @@ from gac.errors import AIError
 from gac.providers import (
     call_anthropic_api,
     call_cerebras_api,
+    call_chutes_api,
     call_gemini_api,
     call_groq_api,
     call_lmstudio_api,
@@ -69,6 +70,7 @@ def generate_commit_message(
     provider_funcs = {
         "anthropic": call_anthropic_api,
         "cerebras": call_cerebras_api,
+        "chutes": call_chutes_api,
         "gemini": call_gemini_api,
         "groq": call_groq_api,
         "lmstudio": call_lmstudio_api,
{gac-1.7.0 → gac-1.8.0}/src/gac/ai_utils.py

@@ -98,7 +98,7 @@ def generate_with_retries(
         "cerebras",
         "gemini",
         "groq",
-        "lmstudio",
+        "lm-studio",
         "ollama",
         "openai",
         "openrouter",
@@ -32,8 +32,9 @@ def init() -> None:
32
32
  click.echo(f"Created $HOME/.gac.env at {GAC_ENV_PATH}.")
33
33
 
34
34
  providers = [
35
- ("Anthropic", "claude-3-5-haiku-latest"),
35
+ ("Anthropic", "claude-haiku-4-5"),
36
36
  ("Cerebras", "qwen-3-coder-480b"),
37
+ ("Chutes.ai", "zai-org/GLM-4.6-FP8"),
37
38
  ("Gemini", "gemini-2.5-flash"),
38
39
  ("Groq", "meta-llama/llama-4-maverick-17b-128e-instruct"),
39
40
  ("LM Studio", "gemma3"),
{gac-1.7.0 → gac-1.8.0}/src/gac/providers/__init__.py

@@ -2,6 +2,7 @@
 
 from .anthropic import call_anthropic_api
 from .cerebras import call_cerebras_api
+from .chutes import call_chutes_api
 from .gemini import call_gemini_api
 from .groq import call_groq_api
 from .lmstudio import call_lmstudio_api
@@ -15,6 +16,7 @@ from .zai import call_zai_api, call_zai_coding_api
 __all__ = [
     "call_anthropic_api",
     "call_cerebras_api",
+    "call_chutes_api",
     "call_gemini_api",
     "call_groq_api",
     "call_lmstudio_api",
gac-1.8.0/src/gac/providers/chutes.py (new file)

@@ -0,0 +1,71 @@
+"""Chutes.ai API provider for gac."""
+
+import os
+
+import httpx
+
+from gac.errors import AIError
+
+
+def call_chutes_api(model: str, messages: list[dict], temperature: float, max_tokens: int) -> str:
+    """Call Chutes.ai API directly.
+
+    Chutes.ai provides an OpenAI-compatible API for serverless, decentralized AI compute.
+
+    Args:
+        model: The model to use (e.g., 'deepseek-ai/DeepSeek-V3-0324')
+        messages: List of message dictionaries with 'role' and 'content' keys
+        temperature: Controls randomness (0.0-1.0)
+        max_tokens: Maximum tokens in the response
+
+    Returns:
+        The generated commit message
+
+    Raises:
+        AIError: If authentication fails, API errors occur, or response is invalid
+    """
+    api_key = os.getenv("CHUTES_API_KEY")
+    if not api_key:
+        raise AIError.authentication_error("CHUTES_API_KEY environment variable not set")
+
+    base_url = os.getenv("CHUTES_BASE_URL", "https://llm.chutes.ai")
+    url = f"{base_url}/v1/chat/completions"
+
+    headers = {
+        "Content-Type": "application/json",
+        "Authorization": f"Bearer {api_key}",
+    }
+
+    data = {
+        "model": model,
+        "messages": messages,
+        "temperature": temperature,
+        "max_tokens": max_tokens,
+    }
+
+    try:
+        response = httpx.post(url, headers=headers, json=data, timeout=120)
+        response.raise_for_status()
+        response_data = response.json()
+        content = response_data["choices"][0]["message"]["content"]
+        if content is None:
+            raise AIError.model_error("Chutes.ai API returned null content")
+        if content == "":
+            raise AIError.model_error("Chutes.ai API returned empty content")
+        return content
+    except httpx.HTTPStatusError as e:
+        status_code = e.response.status_code
+        error_text = e.response.text
+
+        if status_code == 429:
+            raise AIError.rate_limit_error(f"Chutes.ai API rate limit exceeded: {error_text}") from e
+        elif status_code in (502, 503):
+            raise AIError.connection_error(f"Chutes.ai API service unavailable: {status_code} - {error_text}") from e
+        else:
+            raise AIError.model_error(f"Chutes.ai API error: {status_code} - {error_text}") from e
+    except httpx.ConnectError as e:
+        raise AIError.connection_error(f"Chutes.ai API connection error: {str(e)}") from e
+    except httpx.TimeoutException as e:
+        raise AIError.timeout_error(f"Chutes.ai API request timed out: {str(e)}") from e
+    except Exception as e:
+        raise AIError.model_error(f"Error calling Chutes.ai API: {str(e)}") from e