ngpt 2.0.0__tar.gz → 2.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: ngpt
- Version: 2.0.0
+ Version: 2.1.0
  Summary: A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.
  Project-URL: Homepage, https://github.com/nazdridoy/ngpt
  Project-URL: Repository, https://github.com/nazdridoy/ngpt
@@ -37,14 +37,17 @@ Description-Content-Type: text/markdown
  [![PyPI version](https://img.shields.io/pypi/v/ngpt.svg)](https://pypi.org/project/ngpt/)
  [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
  [![Python Versions](https://img.shields.io/pypi/pyversions/ngpt.svg)](https://pypi.org/project/ngpt/)
+ [![Documentation](https://img.shields.io/badge/docs-available-brightgreen.svg)](https://nazdridoy.github.io/ngpt/)

  A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.

  ## Table of Contents
  - [Quick Start](#quick-start)
  - [Features](#features)
+ - [Documentation](#documentation)
  - [Installation](#installation)
  - [Usage](#usage)
+ - [Documentation](https://nazdridoy.github.io/ngpt/)
  - [CLI Tool](#as-a-cli-tool)
  - [Python Library](#as-a-library)
  - [Configuration](#configuration)
@@ -93,6 +96,12 @@ ngpt --text
  - 🧩 **Clean Code Generation**: Output code without markdown or explanations
  - 📝 **Rich Multiline Editor**: Interactive multiline text input with syntax highlighting and intuitive controls

+ ## Documentation
+
+ Comprehensive documentation, including API reference, usage guides, and examples, is available at:
+
+ **[https://nazdridoy.github.io/ngpt/](https://nazdridoy.github.io/ngpt/)**
+
  ## Installation

  ```bash
@@ -220,6 +229,9 @@ You can configure the client using the following options:
  | `--list-models` | List all available models for the selected configuration (can be combined with --config-index) |
  | `--web-search` | Enable web search capability |
  | `-n, --no-stream` | Return the whole response without streaming |
+ | `--temperature` | Set temperature (controls randomness, default: 0.7) |
+ | `--top_p` | Set top_p (controls diversity, default: 1.0) |
+ | `--max_length` | Set maximum response length in tokens |
  | `--config` | Path to a custom configuration file or, when used without a value, enters interactive configuration mode |
  | `--config-index` | Index of the configuration to use (default: 0) |
  | `--remove` | Remove the configuration at the specified index (requires --config and --config-index) |
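The three new flags added above mirror standard sampling parameters of OpenAI-compatible chat-completion APIs. As a hedged sketch (not ngpt's actual implementation — the `build_payload` helper and model name below are illustrative only), this is how such flags plausibly map onto a request body:

```python
# Hypothetical sketch: mapping CLI-style options (--temperature, --top_p,
# --max_length) onto an OpenAI-compatible chat-completion request body.
# build_payload is an illustrative helper, not part of ngpt's API.

def build_payload(prompt, temperature=0.7, top_p=1.0, max_length=None):
    """Assemble a chat-completion request body from CLI-style options."""
    payload = {
        "model": "gpt-3.5-turbo",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,  # --temperature (default 0.7)
        "top_p": top_p,              # --top_p (default 1.0)
    }
    if max_length is not None:
        # --max_length caps the response length in tokens; OpenAI-style
        # APIs call this field "max_tokens".
        payload["max_tokens"] = max_length
    return payload

example = build_payload("Explain awk", temperature=0.2, max_length=128)
print(example["max_tokens"])  # → 128
```

Note that `--max_length` is only included in the request when set, matching the convention that an omitted `max_tokens` leaves the limit to the server.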
@@ -3,14 +3,17 @@
  [![PyPI version](https://img.shields.io/pypi/v/ngpt.svg)](https://pypi.org/project/ngpt/)
  [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
  [![Python Versions](https://img.shields.io/pypi/pyversions/ngpt.svg)](https://pypi.org/project/ngpt/)
+ [![Documentation](https://img.shields.io/badge/docs-available-brightgreen.svg)](https://nazdridoy.github.io/ngpt/)

  A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.

  ## Table of Contents
  - [Quick Start](#quick-start)
  - [Features](#features)
+ - [Documentation](#documentation)
  - [Installation](#installation)
  - [Usage](#usage)
+ - [Documentation](https://nazdridoy.github.io/ngpt/)
  - [CLI Tool](#as-a-cli-tool)
  - [Python Library](#as-a-library)
  - [Configuration](#configuration)
@@ -59,6 +62,12 @@ ngpt --text
  - 🧩 **Clean Code Generation**: Output code without markdown or explanations
  - 📝 **Rich Multiline Editor**: Interactive multiline text input with syntax highlighting and intuitive controls

+ ## Documentation
+
+ Comprehensive documentation, including API reference, usage guides, and examples, is available at:
+
+ **[https://nazdridoy.github.io/ngpt/](https://nazdridoy.github.io/ngpt/)**
+
  ## Installation

  ```bash
@@ -186,6 +195,9 @@ You can configure the client using the following options:
  | `--list-models` | List all available models for the selected configuration (can be combined with --config-index) |
  | `--web-search` | Enable web search capability |
  | `-n, --no-stream` | Return the whole response without streaming |
+ | `--temperature` | Set temperature (controls randomness, default: 0.7) |
+ | `--top_p` | Set top_p (controls diversity, default: 1.0) |
+ | `--max_length` | Set maximum response length in tokens |
  | `--config` | Path to a custom configuration file or, when used without a value, enters interactive configuration mode |
  | `--config-index` | Index of the configuration to use (default: 0) |
  | `--remove` | Remove the configuration at the specified index (requires --config and --config-index) |
@@ -0,0 +1,93 @@
+ ---
+ ---
+
+ # Contributing to NGPT
+
+ Thank you for your interest in contributing to NGPT! This document provides guidelines and instructions for contributing to this project.
+
+ ## Development Setup
+
+ 1. Fork the repository
+ 2. Clone your fork: `git clone https://github.com/YOUR_USERNAME/ngpt.git`
+ 3. Navigate to the project directory: `cd ngpt`
+ 4. Set up Python environment:
+    - It's recommended to use a virtual environment
+    - Create a virtual environment: `python -m venv .venv`
+    - Activate the virtual environment:
+      - Windows: `.venv\Scripts\activate`
+      - Unix/MacOS: `source .venv/bin/activate`
+ 5. Install dependencies: `pip install -e .`
+ 6. Open the project in your preferred code editor
+
+ ## Code Structure
+
+ - `ngpt/` - Main package directory
+   - `__init__.py` - Package initialization
+   - `cli.py` - Command-line interface implementation
+   - `config.py` - Configuration management
+   - `client.py` - Client implementation
+ - `.github/` - GitHub workflows and templates
+ - `pyproject.toml` - Project configuration and dependencies
+
+ ## Code Style Guidelines
+
+ - Follow PEP 8 style guidelines for Python code
+ - Use consistent indentation (4 spaces)
+ - Write descriptive docstrings for functions and classes
+ - Add type hints where appropriate
+ - Add comments for complex logic
+
+ ## Pull Request Guidelines
+
+ Before submitting a pull request, please make sure that:
+
+ - Your code follows the project's coding conventions
+ - You have tested your changes thoroughly
+ - All existing tests pass (if applicable)
+ - The commit messages are clear and follow conventional commit guidelines as specified in [COMMIT_GUIDELINES.md](COMMIT_GUIDELINES.md)
+ - You have provided a detailed explanation of the changes in the pull request description
+
+ ## Submitting Changes
+
+ 1. Create a new branch: `git checkout -b feature/your-feature-name`
+ 2. Make your changes
+ 3. Test thoroughly
+ 4. Commit with clear messages: `git commit -m "feat: description"`
+ 5. Push to your fork: `git push origin feature/your-feature-name`
+ 6. Open a Pull Request against the main repository
+
+ ## Testing
+
+ Before submitting your changes, please test:
+
+ - Basic functionality
+ - Any new features you've added
+ - Any components you've modified
+ - Ensure all tests pass if there's a test suite
+
+ ## Issue Reporting
+
+ When opening an issue, please:
+
+ - Use a clear and descriptive title
+ - Provide a detailed description of the issue, including the environment and steps to reproduce
+ - Include any relevant logs or code snippets
+ - Specify your Python version and operating system
+ - Search the repository for similar issues before creating a new one
+
+ ## Feature Requests
+
+ Feature requests are welcome! To submit a feature request:
+
+ - Use a clear and descriptive title
+ - Provide a detailed description of the proposed feature
+ - Explain why this feature would be useful to NGPT users
+ - If possible, suggest how it might be implemented
+
+ ## Questions and Discussions
+
+ For questions about the project that aren't bugs or feature requests, please use GitHub Discussions instead of opening an issue. This helps keep the issue tracker focused on bugs and features.
+
+ ## License
+
+ By contributing to this project, you agree that your contributions will be licensed under the same [LICENSE](LICENSE) as the project.
@@ -0,0 +1,24 @@
+ ---
+ ---
+
+ MIT License
+
+ Copyright (c) 2025 nazdridoy
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
@@ -0,0 +1,27 @@
+ # nGPT Documentation
+
+ Welcome to the documentation for nGPT, a lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.
+
+ ## Table of Contents
+
+ - [Overview](overview.md)
+ - [Installation](installation.md)
+ - [Usage](usage/README.md)
+   - [Library Usage](usage/library_usage.md)
+   - [CLI Usage](usage/cli_usage.md)
+ - [API Reference](api/README.md)
+   - [NGPTClient](api/client.md)
+   - [Configuration](api/config.md)
+ - [Examples](examples/README.md)
+   - [Basic Examples](examples/basic.md)
+   - [Advanced Examples](examples/advanced.md)
+   - [Custom Integrations](examples/integrations.md)
+ - [Configuration](configuration.md)
+ - [Contributing](CONTRIBUTING.md)
+ - [License](LICENSE.md)
+
+ ## Getting Started
+
+ For a quick start, refer to the [Installation](installation.md) and [Usage](usage/README.md) guides.
+
+ If you're primarily interested in using nGPT as a library in your Python projects, head directly to the [Library Usage](usage/library_usage.md) guide.
@@ -0,0 +1,7 @@
+ title: nGPT Documentation
+ description: A lightweight Python CLI and library for interacting with OpenAI-compatible APIs
+ baseurl: /ngpt
+ remote_theme: pages-themes/hacker
+ plugins:
+   - jekyll-remote-theme
+ show_downloads: false
@@ -0,0 +1,47 @@
+ # API Reference
+
+ This section provides detailed documentation for the nGPT API, including all classes, methods, functions, and parameters.
+
+ ## Overview
+
+ nGPT's API consists of two main components:
+
+ 1. **NGPTClient**: The main client class used to interact with AI providers
+ 2. **Configuration Utilities**: Functions for managing configuration files and settings
+
+ ## Table of Contents
+
+ - [NGPTClient](client.md) - Primary client for interacting with LLM APIs
+   - [Initialization](client.md#initialization)
+   - [Chat Method](client.md#chat-method)
+   - [Generate Shell Command](client.md#generate-shell-command)
+   - [Generate Code](client.md#generate-code)
+   - [List Models](client.md#list-models)
+
+ - [Configuration](config.md) - Functions for managing configurations
+   - [Loading Configurations](config.md#loading-configurations)
+   - [Creating Configurations](config.md#creating-configurations)
+   - [Editing Configurations](config.md#editing-configurations)
+   - [Removing Configurations](config.md#removing-configurations)
+   - [Configuration Paths](config.md#configuration-paths)
+
+ ## Quick Reference
+
+ ```python
+ # Import core components
+ from ngpt import NGPTClient, load_config
+
+ # Import configuration utilities
+ from ngpt.config import (
+     load_configs,
+     get_config_path,
+     get_config_dir,
+     add_config_entry,
+     remove_config_entry
+ )
+
+ # Import version information
+ from ngpt import __version__
+ ```
+
+ For complete documentation on using these components, see the linked reference pages.
@@ -0,0 +1,263 @@
+ # NGPTClient
+
+ `NGPTClient` is the primary class for interacting with OpenAI-compatible APIs. It provides methods for chat completion, code generation, shell command generation, and model listing.
+
+ ## Initialization
+
+ ```python
+ from ngpt import NGPTClient
+
+ client = NGPTClient(
+     api_key: str = "",
+     base_url: str = "https://api.openai.com/v1/",
+     provider: str = "OpenAI",
+     model: str = "gpt-3.5-turbo"
+ )
+ ```
+
+ ### Parameters
+
+ | Parameter | Type | Default | Description |
+ |-----------|------|---------|-------------|
+ | `api_key` | `str` | `""` | API key for authentication |
+ | `base_url` | `str` | `"https://api.openai.com/v1/"` | Base URL for the API endpoint |
+ | `provider` | `str` | `"OpenAI"` | Name of the provider (for reference only) |
+ | `model` | `str` | `"gpt-3.5-turbo"` | Default model to use for completion |
+
+ ### Examples
+
+ ```python
+ # Basic initialization with OpenAI
+ client = NGPTClient(api_key="your-openai-api-key")
+
+ # Using a different provider
+ client = NGPTClient(
+     api_key="your-api-key",
+     base_url="https://api.groq.com/openai/v1/",
+     provider="Groq",
+     model="llama3-70b-8192"
+ )
+
+ # Using a local Ollama instance
+ client = NGPTClient(
+     api_key="",  # No key needed for local Ollama
+     base_url="http://localhost:11434/v1/",
+     provider="Ollama-Local",
+     model="llama3"
+ )
+ ```
+
+ ## Chat Method
+
+ The primary method for interacting with the AI model.
+
+ ```python
+ response = client.chat(
+     prompt: str,
+     stream: bool = True,
+     temperature: float = 0.7,
+     max_tokens: Optional[int] = None,
+     messages: Optional[List[Dict[str, str]]] = None,
+     web_search: bool = False,
+     **kwargs
+ ) -> str
+ ```
+
+ ### Parameters
+
+ | Parameter | Type | Default | Description |
+ |-----------|------|---------|-------------|
+ | `prompt` | `str` | Required | The user's message |
+ | `stream` | `bool` | `True` | Whether to stream the response |
+ | `temperature` | `float` | `0.7` | Controls randomness in the response (0.0-1.0) |
+ | `max_tokens` | `Optional[int]` | `None` | Maximum number of tokens to generate |
+ | `messages` | `Optional[List[Dict[str, str]]]` | `None` | Optional list of message objects for conversation history |
+ | `web_search` | `bool` | `False` | Whether to enable web search capability |
+ | `**kwargs` | | | Additional arguments to pass to the API |
+
+ ### Returns
+
+ - When `stream=False`: A string containing the complete response
+ - When `stream=True`: A generator yielding response chunks that can be iterated over
+
+ ### Examples
+
+ ```python
+ # Basic chat with streaming
+ for chunk in client.chat("Tell me about quantum computing"):
+     print(chunk, end="", flush=True)
+ print()  # Final newline
+
+ # Without streaming
+ response = client.chat("Tell me about quantum computing", stream=False)
+ print(response)
+
+ # With conversation history
+ messages = [
+     {"role": "system", "content": "You are a helpful assistant."},
+     {"role": "user", "content": "Hello, who are you?"},
+     {"role": "assistant", "content": "I'm an AI assistant. How can I help you today?"},
+     {"role": "user", "content": "Tell me about yourself"}
+ ]
+ response = client.chat("", messages=messages, stream=False)
+ print(response)
+
+ # With web search
+ response = client.chat("What's the latest news about AI?", web_search=True, stream=False)
+ print(response)
+
+ # With temperature control
+ response = client.chat("Write a creative story", temperature=0.9, stream=False)  # More random
+ response = client.chat("Explain how a CPU works", temperature=0.2, stream=False)  # More focused
+
+ # With token limit
+ response = client.chat("Summarize this concept", max_tokens=100, stream=False)
+ ```
+
+ ## Generate Shell Command
+
+ Generates a shell command based on the prompt, optimized for the user's operating system.
+
+ ```python
+ command = client.generate_shell_command(
+     prompt: str,
+     web_search: bool = False
+ ) -> str
+ ```
+
+ ### Parameters
+
+ | Parameter | Type | Default | Description |
+ |-----------|------|---------|-------------|
+ | `prompt` | `str` | Required | Description of the command to generate |
+ | `web_search` | `bool` | `False` | Whether to enable web search capability |
+
+ ### Returns
+
+ A string containing the generated shell command appropriate for the user's OS.
+
+ ### Examples
+
+ ```python
+ # Generate a command to find large files
+ command = client.generate_shell_command("find all files larger than 100MB")
+ print(f"Generated command: {command}")
+
+ # Execute the generated command
+ import subprocess
+ result = subprocess.run(command, shell=True, capture_output=True, text=True)
+ print(f"Output: {result.stdout}")
+
+ # With web search for more current command syntax
+ command = client.generate_shell_command(
+     "show Docker container resource usage in a nice format",
+     web_search=True
+ )
+ ```
+
+ ## Generate Code
+
+ Generates clean code based on the prompt, without markdown formatting or explanations.
+
+ ```python
+ code = client.generate_code(
+     prompt: str,
+     language: str = "python",
+     web_search: bool = False
+ ) -> str
+ ```
+
+ ### Parameters
+
+ | Parameter | Type | Default | Description |
+ |-----------|------|---------|-------------|
+ | `prompt` | `str` | Required | Description of the code to generate |
+ | `language` | `str` | `"python"` | Programming language to generate code in |
+ | `web_search` | `bool` | `False` | Whether to enable web search capability |
+
+ ### Returns
+
+ A string containing the generated code without any markdown formatting or explanations.
+
+ ### Examples
+
+ ```python
+ # Generate Python code (default)
+ python_code = client.generate_code("function to calculate fibonacci numbers")
+ print(python_code)
+
+ # Generate JavaScript code
+ js_code = client.generate_code(
+     "function to validate email addresses using regex",
+     language="javascript"
+ )
+ print(js_code)
+
+ # Generate code with web search for latest best practices
+ react_code = client.generate_code(
+     "create a React component that fetches and displays data from an API",
+     language="jsx",
+     web_search=True
+ )
+ ```
+
+ ## List Models
+
+ Retrieves the list of available models from the API.
+
+ ```python
+ models = client.list_models() -> list
+ ```
+
+ ### Returns
+
+ A list of available model objects, or an empty list if the request failed.
+
+ ### Examples
+
+ ```python
+ # Get available models
+ models = client.list_models()
+
+ # Print model IDs
+ if models:
+     print("Available models:")
+     for model in models:
+         print(f"- {model.get('id')}")
+ else:
+     print("No models available or could not retrieve models")
+
+ # Filter for specific models
+ gpt4_models = [m for m in models if "gpt-4" in m.get('id', '')]
+ for model in gpt4_models:
+     print(f"GPT-4 model: {model.get('id')}")
+ ```
+
+ ## Error Handling
+
+ The `NGPTClient` methods include basic error handling for common issues:
+
+ - HTTP errors (401, 404, 429, etc.)
+ - Connection errors
+ - Timeout errors
+ - JSON parsing errors
+
+ For production code, it's recommended to implement your own error handling:
+
+ ```python
+ import requests
+
+ try:
+     response = client.chat("Tell me about quantum computing")
+ except requests.exceptions.HTTPError as e:
+     if e.response.status_code == 401:
+         print("Authentication failed. Check your API key.")
+     elif e.response.status_code == 429:
+         print("Rate limit exceeded. Please wait and try again.")
+     else:
+         print(f"HTTP error: {e}")
+ except requests.exceptions.ConnectionError:
+     print("Connection error. Check your internet and base URL.")
+ except Exception as e:
+     print(f"An unexpected error occurred: {e}")
+ ```
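Beyond catching exceptions, production callers often retry transient failures such as rate limits. A minimal, self-contained sketch (the `chat_with_retry` helper and the stand-in `flaky` function are illustrative, not part of ngpt):

```python
# Illustrative sketch: wrapping any chat-style callable with simple
# retry and exponential backoff for transient failures (e.g. HTTP 429).
import time

def chat_with_retry(chat_fn, prompt, retries=3, backoff=1.0):
    """Call chat_fn(prompt); on failure, wait and retry up to `retries` times."""
    for attempt in range(retries):
        try:
            return chat_fn(prompt)
        except Exception:
            if attempt == retries - 1:
                raise  # out of attempts: surface the last error
            time.sleep(backoff * (2 ** attempt))  # 1x, 2x, 4x, ...

# Stand-in for client.chat that fails once, then succeeds:
calls = {"n": 0}
def flaky(prompt):
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient error")
    return f"ok: {prompt}"

print(chat_with_retry(flaky, "hello", backoff=0.01))  # → ok: hello
```

With a real client you would pass something like `lambda p: client.chat(p, stream=False)` as `chat_fn`.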