ngpt 1.1.2__tar.gz → 1.1.4__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- ngpt-1.1.4/COMMIT_GUIDELINES.md +116 -0
- ngpt-1.1.4/CONTRIBUTING.md +90 -0
- ngpt-1.1.4/PKG-INFO +280 -0
- ngpt-1.1.4/README.md +247 -0
- {ngpt-1.1.2 → ngpt-1.1.4}/ngpt/cli.py +1 -2
- {ngpt-1.1.2 → ngpt-1.1.4}/ngpt/client.py +1 -1
- {ngpt-1.1.2 → ngpt-1.1.4}/ngpt/config.py +33 -13
- {ngpt-1.1.2 → ngpt-1.1.4}/pyproject.toml +1 -1
- {ngpt-1.1.2 → ngpt-1.1.4}/uv.lock +1 -1
- ngpt-1.1.2/PKG-INFO +0 -192
- ngpt-1.1.2/README.md +0 -159
- {ngpt-1.1.2 → ngpt-1.1.4}/.github/workflows/python-publish.yml +0 -0
- {ngpt-1.1.2 → ngpt-1.1.4}/.gitignore +0 -0
- {ngpt-1.1.2 → ngpt-1.1.4}/.python-version +0 -0
- {ngpt-1.1.2 → ngpt-1.1.4}/LICENSE +0 -0
- {ngpt-1.1.2 → ngpt-1.1.4}/ngpt/__init__.py +0 -0
ngpt-1.1.4/COMMIT_GUIDELINES.md
ADDED
@@ -0,0 +1,116 @@
# Commit Message Guidelines

## Introduction

Consistent and well-formatted commit messages provide a better project history, make it easier to understand changes, facilitate automatic changelog generation, and help identify bugs. These guidelines ensure that our commit messages remain uniform, descriptive, and useful to all project contributors.

## Message Format

```
type: <brief summary (max 50 chars)>

- [type] key change 1 (max 60 chars per line)
- [type] key change 2
- [type] key change N (include all significant changes)
```

## Best Practices

- Use the imperative mood in the subject line (e.g., "Add feature" not "Added feature")
- Don't end the subject line with a period
- Start with a capital letter
- Separate subject from body with a blank line
- Wrap body text at 72 characters
- Use the body to explain what and why vs. how

## Atomic Commits

Each commit should represent a single logical change:
- Make focused commits that address a single concern
- Split work into multiple commits when appropriate
- Avoid mixing unrelated changes in the same commit

## Issue References

Link to issues in your commit messages:
- Use "Fixes #123" to automatically close an issue
- Use "Relates to #123" for changes related to but not resolving an issue
- Always include issue numbers for bug fixes

## Valid Types

Choose the most specific type for your changes:

- `feat`: New user features (not for new files without user features)
- `fix`: Bug fixes/corrections to errors
- `refactor`: Restructured code (no behavior change)
- `style`: Formatting/whitespace changes
- `docs`: Documentation only
- `test`: Test-related changes
- `perf`: Performance improvements
- `build`: Build system changes
- `ci`: CI pipeline changes
- `chore`: Routine maintenance tasks
- `revert`: Reverting previous changes
- `add`: New files/resources with no user-facing features
- `remove`: Removing files/code
- `update`: Changes to existing functionality
- `security`: Security-related changes
- `i18n`: Internationalization
- `a11y`: Accessibility improvements
- `api`: API-related changes
- `ui`: User interface changes
- `data`: Database changes
- `config`: Configuration changes
- `init`: Initial commit/project setup

## Examples

### Good Examples

#### Bug Fix:
```
fix: Address memory leak in data processing pipeline

- [fix] Release resources in DataProcessor.cleanUp()
- [fix] Add null checks to prevent NPE in edge cases
- [perf] Optimize large dataset handling

Fixes #456
```

#### New Feature:
```
feat: Add user profile export functionality

- [feat] Create export button in profile settings
- [feat] Implement JSON and CSV export options
- [security] Apply rate limiting to prevent abuse

Relates to #789
```

#### Refactoring:
```
refactor: Simplify authentication flow

- [refactor] Extract login logic to separate service
- [refactor] Reduce complexity in AuthManager
- [test] Add unit tests for new service

Part of #234
```

### Poor Example:
```
Made some changes to fix stuff

Changed a bunch of files to make the login work better.
Also fixed that other bug people were complaining about.
```

## Additional Resources

- [Conventional Commits](https://www.conventionalcommits.org/)
- [How to Write a Git Commit Message](https://chris.beams.io/posts/git-commit/)
- [A Note About Git Commit Messages](https://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html)
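The format and subject-line rules above are mechanical enough to check in code. The sketch below is purely illustrative — `VALID_TYPES` and `check_subject` are hypothetical helpers, not part of the ngpt codebase:

```python
import re

# Hypothetical helper, not part of ngpt: checks a commit subject line
# against the rules above (type prefix, capitalization, length, period).
VALID_TYPES = {
    "feat", "fix", "refactor", "style", "docs", "test", "perf", "build",
    "ci", "chore", "revert", "add", "remove", "update", "security",
    "i18n", "a11y", "api", "ui", "data", "config", "init",
}

def check_subject(subject: str) -> list:
    """Return a list of rule violations for a commit subject line."""
    problems = []
    m = re.match(r"^([a-z0-9]+): (.+)$", subject)
    if not m:
        return ["subject must look like 'type: <summary>'"]
    type_, summary = m.groups()
    if type_ not in VALID_TYPES:
        problems.append(f"unknown type '{type_}'")
    if len(summary) > 50:
        problems.append("summary exceeds 50 characters")
    if not summary[0].isupper():
        problems.append("summary should start with a capital letter")
    if summary.endswith("."):
        problems.append("summary should not end with a period")
    return problems
```

Such a check could be wired into a `commit-msg` git hook, but that is left as an exercise.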
ngpt-1.1.4/CONTRIBUTING.md
ADDED
@@ -0,0 +1,90 @@
# Contributing to NGPT

Thank you for your interest in contributing to NGPT! This document provides guidelines and instructions for contributing to this project.

## Development Setup

1. Fork the repository
2. Clone your fork: `git clone https://github.com/YOUR_USERNAME/ngpt.git`
3. Navigate to the project directory: `cd ngpt`
4. Set up Python environment:
   - It's recommended to use a virtual environment
   - Create a virtual environment: `python -m venv .venv`
   - Activate the virtual environment:
     - Windows: `.venv\Scripts\activate`
     - Unix/MacOS: `source .venv/bin/activate`
5. Install dependencies: `pip install -e .`
6. Open the project in your preferred code editor

## Code Structure

- `ngpt/` - Main package directory
  - `__init__.py` - Package initialization
  - `cli.py` - Command-line interface implementation
  - `config.py` - Configuration management
  - `client.py` - Client implementation
- `.github/` - GitHub workflows and templates
- `pyproject.toml` - Project configuration and dependencies

## Code Style Guidelines

- Follow PEP 8 style guidelines for Python code
- Use consistent indentation (4 spaces)
- Write descriptive docstrings for functions and classes
- Add type hints where appropriate
- Add comments for complex logic

## Pull Request Guidelines

Before submitting a pull request, please make sure that:

- Your code follows the project's coding conventions
- You have tested your changes thoroughly
- All existing tests pass (if applicable)
- The commit messages are clear and follow conventional commit guidelines as specified in [COMMIT_GUIDELINES.md](COMMIT_GUIDELINES.md)
- You have provided a detailed explanation of the changes in the pull request description

## Submitting Changes

1. Create a new branch: `git checkout -b feature/your-feature-name`
2. Make your changes
3. Test thoroughly
4. Commit with clear messages: `git commit -m "feat: description"`
5. Push to your fork: `git push origin feature/your-feature-name`
6. Open a Pull Request against the main repository

## Testing

Before submitting your changes, please test:

- Basic functionality
- Any new features you've added
- Any components you've modified
- Ensure all tests pass if there's a test suite

## Issue Reporting

When opening an issue, please:

- Use a clear and descriptive title
- Provide a detailed description of the issue, including the environment and steps to reproduce
- Include any relevant logs or code snippets
- Specify your Python version and operating system
- Search the repository for similar issues before creating a new one

## Feature Requests

Feature requests are welcome! To submit a feature request:

- Use a clear and descriptive title
- Provide a detailed description of the proposed feature
- Explain why this feature would be useful to NGPT users
- If possible, suggest how it might be implemented

## Questions and Discussions

For questions about the project that aren't bugs or feature requests, please use GitHub Discussions instead of opening an issue. This helps keep the issue tracker focused on bugs and features.

## License

By contributing to this project, you agree that your contributions will be licensed under the same [LICENSE](LICENSE) as the project.
ngpt-1.1.4/PKG-INFO
ADDED
@@ -0,0 +1,280 @@
Metadata-Version: 2.4
Name: ngpt
Version: 1.1.4
Summary: A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.
Project-URL: Homepage, https://github.com/nazdridoy/ngpt
Project-URL: Repository, https://github.com/nazdridoy/ngpt
Project-URL: Bug Tracker, https://github.com/nazdridoy/ngpt/issues
Author-email: nazDridoy <nazdridoy399@gmail.com>
License: MIT
License-File: LICENSE
Keywords: ai,api-client,chatgpt,cli,gpt,gpt4free,llm,ngpt,openai
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: End Users/Desktop
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Communications :: Chat
Classifier: Topic :: Internet :: WWW/HTTP
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Utilities
Requires-Python: >=3.8
Requires-Dist: requests>=2.31.0
Description-Content-Type: text/markdown

# nGPT

[](https://pypi.org/project/ngpt/)
[](https://opensource.org/licenses/MIT)
[](https://pypi.org/project/ngpt/)

A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.

## Table of Contents
- [Quick Start](#quick-start)
- [Features](#features)
- [Installation](#installation)
- [Usage](#usage)
  - [CLI Tool](#as-a-cli-tool)
  - [Python Library](#as-a-library)
- [Configuration](#configuration)
  - [Command Line Options](#command-line-options)
  - [Interactive Configuration](#interactive-configuration)
  - [Configuration File](#configuration-file)
  - [Configuration Priority](#configuration-priority)
- [Contributing](#contributing)
- [License](#license)

## Quick Start

```bash
# Install
pip install ngpt

# Chat with default settings
ngpt "Tell me about quantum computing"

# Generate code
ngpt --code "function to calculate the Fibonacci sequence"

# Generate and execute shell commands
ngpt --shell "list all files in the current directory"
```

## Features

- ✅ **Dual Mode**: Use as a CLI tool or import as a Python library
- 🪶 **Lightweight**: Minimal dependencies (just `requests`)
- 🔄 **API Flexibility**: Works with OpenAI, Ollama, Groq, and any compatible endpoint
- 📊 **Streaming Responses**: Real-time output for better user experience
- 🔍 **Web Search**: Integrated with compatible API endpoints
- ⚙️ **Multiple Configurations**: Cross-platform config system supporting different profiles
- 💻 **Shell Command Generation**: OS-aware command execution
- 🧩 **Clean Code Generation**: Output code without markdown or explanations

## Installation

```bash
pip install ngpt
```

Requires Python 3.8 or newer.

## Usage

### As a CLI Tool

```bash
# Basic chat (default mode)
ngpt "Hello, how are you?"

# Show version information
ngpt -v

# Show active configuration
ngpt --show-config

# Show all configurations
ngpt --show-config --all

# With custom options
ngpt --api-key your-key --base-url http://your-endpoint --model your-model "Hello"

# Enable web search (if your API endpoint supports it)
ngpt --web-search "What's the latest news about AI?"

# Generate and execute shell commands (using -s or --shell flag)
# OS-aware: generates appropriate commands for Windows, macOS, or Linux
ngpt -s "list all files in current directory"
# On Windows generates: dir
# On Linux/macOS generates: ls -la

# Generate clean code (using -c or --code flag)
# Returns only code without markdown formatting or explanations
ngpt -c "create a python function that calculates fibonacci numbers"
```

### As a Library

```python
from ngpt import NGPTClient, load_config

# Load the first configuration (index 0) from config file
config = load_config(config_index=0)

# Initialize the client with config
client = NGPTClient(**config)

# Or initialize with custom parameters
client = NGPTClient(
    api_key="your-key",
    base_url="http://your-endpoint",
    provider="openai",
    model="o3-mini"
)

# Chat
response = client.chat("Hello, how are you?")

# Chat with web search (if your API endpoint supports it)
response = client.chat("What's the latest news about AI?", web_search=True)

# Generate shell command
command = client.generate_shell_command("list all files")

# Generate code
code = client.generate_code("create a python function that calculates fibonacci numbers")
```

#### Advanced Library Usage

```python
# Stream responses
for chunk in client.chat("Write a poem about Python", stream=True):
    print(chunk, end="", flush=True)

# Customize system prompt
response = client.chat(
    "Explain quantum computing",
    system_prompt="You are a quantum physics professor. Explain complex concepts simply."
)

# OS-aware shell commands
# Automatically generates appropriate commands for the current OS
command = client.generate_shell_command("find large files")
import subprocess
result = subprocess.run(command, shell=True, capture_output=True, text=True)
print(result.stdout)

# Clean code generation
# Returns only code without markdown or explanations
code = client.generate_code("function that converts Celsius to Fahrenheit")
print(code)
```

## Configuration

### Command Line Options

You can configure the client using the following options:

| Option | Description |
|--------|-------------|
| `--api-key` | API key for the service |
| `--base-url` | Base URL for the API |
| `--model` | Model to use |
| `--web-search` | Enable web search capability |
| `--config` | Path to a custom configuration file or, when used without a value, enters interactive configuration mode |
| `--config-index` | Index of the configuration to use (default: 0) |
| `--show-config` | Show configuration details and exit |
| `--all` | Used with `--show-config` to display all configurations |
| `-s, --shell` | Generate and execute shell commands |
| `-c, --code` | Generate clean code output |
| `-v, --version` | Show version information |

### Interactive Configuration

The `--config` option without arguments enters interactive configuration mode, allowing you to add or edit configurations:

```bash
# Add a new configuration
ngpt --config

# Edit an existing configuration at index 1
ngpt --config --config-index 1
```

In interactive mode:
- When editing an existing configuration, press Enter to keep the current values
- When creating a new configuration, press Enter to use default values
- For security, your API key is not displayed when editing configurations

### Configuration File

nGPT uses a configuration file stored in the standard user config directory for your operating system:

- **Linux**: `~/.config/ngpt/ngpt.conf` or `$XDG_CONFIG_HOME/ngpt/ngpt.conf`
- **macOS**: `~/Library/Application Support/ngpt/ngpt.conf`
- **Windows**: `%APPDATA%\ngpt\ngpt.conf`

The configuration file uses a JSON list format, allowing you to store multiple configurations. You can select which configuration to use with the `--config-index` argument (or by default, index 0 is used).

#### Multiple Configurations Example (`ngpt.conf`)
```json
[
  {
    "api_key": "your-openai-api-key-here",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-4o"
  },
  {
    "api_key": "your-groq-api-key-here",
    "base_url": "https://api.groq.com/openai/v1/",
    "provider": "Groq",
    "model": "llama3-70b-8192"
  },
  {
    "api_key": "your-ollama-key-if-needed",
    "base_url": "http://localhost:11434/v1/",
    "provider": "Ollama-Local",
    "model": "llama3"
  }
]
```

### Configuration Priority

nGPT determines configuration values in the following order (highest priority first):

1. Command line arguments (`--api-key`, `--base-url`, `--model`)
2. Environment variables (`OPENAI_API_KEY`, `OPENAI_BASE_URL`, `OPENAI_MODEL`)
3. Configuration file (selected by `--config-index`, defaults to index 0)
4. Default values

## Contributing

We welcome contributions to nGPT! Whether it's bug fixes, feature additions, or documentation improvements, your help is appreciated.

To contribute:

1. Fork the repository
2. Create a feature branch: `git checkout -b feature/your-feature-name`
3. Make your changes
4. Commit with clear messages following conventional commit guidelines
5. Push to your fork and submit a pull request

Please check the [CONTRIBUTING.md](CONTRIBUTING.md) file for detailed guidelines on code style, pull request process, and development setup.

## License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
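The configuration-priority chain described above can be sketched as a small fallback loop. This is illustrative only — the `resolve` helper and `ENV_MAP` are hypothetical, not nGPT's actual resolution code (which lives in `ngpt/config.py` and may differ in detail):

```python
import os

# Hypothetical mapping of config keys to the env vars listed above.
ENV_MAP = {
    "api_key": "OPENAI_API_KEY",
    "base_url": "OPENAI_BASE_URL",
    "model": "OPENAI_MODEL",
}

def resolve(cli_args: dict, file_config: dict, defaults: dict) -> dict:
    """Pick each value from the highest-priority source that provides it:
    CLI arguments > environment variables > config file > defaults."""
    resolved = {}
    for key, default in defaults.items():
        env_value = os.environ.get(ENV_MAP.get(key, ""))
        resolved[key] = (
            cli_args.get(key)
            or env_value
            or file_config.get(key)
            or default
        )
    return resolved
```

For example, `resolve({"model": "gpt-4o"}, {"base_url": "http://localhost:11434/v1/"}, defaults)` would take the model from the CLI argument and the base URL from the config file, with everything else falling back to defaults.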
ngpt-1.1.4/README.md
ADDED
@@ -0,0 +1,247 @@
# nGPT

[](https://pypi.org/project/ngpt/)
[](https://opensource.org/licenses/MIT)
[](https://pypi.org/project/ngpt/)

A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.

## Table of Contents
- [Quick Start](#quick-start)
- [Features](#features)
- [Installation](#installation)
- [Usage](#usage)
  - [CLI Tool](#as-a-cli-tool)
  - [Python Library](#as-a-library)
- [Configuration](#configuration)
  - [Command Line Options](#command-line-options)
  - [Interactive Configuration](#interactive-configuration)
  - [Configuration File](#configuration-file)
  - [Configuration Priority](#configuration-priority)
- [Contributing](#contributing)
- [License](#license)

## Quick Start

```bash
# Install
pip install ngpt

# Chat with default settings
ngpt "Tell me about quantum computing"

# Generate code
ngpt --code "function to calculate the Fibonacci sequence"

# Generate and execute shell commands
ngpt --shell "list all files in the current directory"
```

## Features

- ✅ **Dual Mode**: Use as a CLI tool or import as a Python library
- 🪶 **Lightweight**: Minimal dependencies (just `requests`)
- 🔄 **API Flexibility**: Works with OpenAI, Ollama, Groq, and any compatible endpoint
- 📊 **Streaming Responses**: Real-time output for better user experience
- 🔍 **Web Search**: Integrated with compatible API endpoints
- ⚙️ **Multiple Configurations**: Cross-platform config system supporting different profiles
- 💻 **Shell Command Generation**: OS-aware command execution
- 🧩 **Clean Code Generation**: Output code without markdown or explanations

## Installation

```bash
pip install ngpt
```

Requires Python 3.8 or newer.

## Usage

### As a CLI Tool

```bash
# Basic chat (default mode)
ngpt "Hello, how are you?"

# Show version information
ngpt -v

# Show active configuration
ngpt --show-config

# Show all configurations
ngpt --show-config --all

# With custom options
ngpt --api-key your-key --base-url http://your-endpoint --model your-model "Hello"

# Enable web search (if your API endpoint supports it)
ngpt --web-search "What's the latest news about AI?"

# Generate and execute shell commands (using -s or --shell flag)
# OS-aware: generates appropriate commands for Windows, macOS, or Linux
ngpt -s "list all files in current directory"
# On Windows generates: dir
# On Linux/macOS generates: ls -la

# Generate clean code (using -c or --code flag)
# Returns only code without markdown formatting or explanations
ngpt -c "create a python function that calculates fibonacci numbers"
```

### As a Library

```python
from ngpt import NGPTClient, load_config

# Load the first configuration (index 0) from config file
config = load_config(config_index=0)

# Initialize the client with config
client = NGPTClient(**config)

# Or initialize with custom parameters
client = NGPTClient(
    api_key="your-key",
    base_url="http://your-endpoint",
    provider="openai",
    model="o3-mini"
)

# Chat
response = client.chat("Hello, how are you?")

# Chat with web search (if your API endpoint supports it)
response = client.chat("What's the latest news about AI?", web_search=True)

# Generate shell command
command = client.generate_shell_command("list all files")

# Generate code
code = client.generate_code("create a python function that calculates fibonacci numbers")
```

#### Advanced Library Usage

```python
# Stream responses
for chunk in client.chat("Write a poem about Python", stream=True):
    print(chunk, end="", flush=True)

# Customize system prompt
response = client.chat(
    "Explain quantum computing",
    system_prompt="You are a quantum physics professor. Explain complex concepts simply."
)

# OS-aware shell commands
# Automatically generates appropriate commands for the current OS
command = client.generate_shell_command("find large files")
import subprocess
result = subprocess.run(command, shell=True, capture_output=True, text=True)
print(result.stdout)

# Clean code generation
# Returns only code without markdown or explanations
code = client.generate_code("function that converts Celsius to Fahrenheit")
print(code)
```

## Configuration

### Command Line Options

You can configure the client using the following options:

| Option | Description |
|--------|-------------|
| `--api-key` | API key for the service |
| `--base-url` | Base URL for the API |
| `--model` | Model to use |
| `--web-search` | Enable web search capability |
| `--config` | Path to a custom configuration file or, when used without a value, enters interactive configuration mode |
| `--config-index` | Index of the configuration to use (default: 0) |
| `--show-config` | Show configuration details and exit |
| `--all` | Used with `--show-config` to display all configurations |
| `-s, --shell` | Generate and execute shell commands |
| `-c, --code` | Generate clean code output |
| `-v, --version` | Show version information |

### Interactive Configuration

The `--config` option without arguments enters interactive configuration mode, allowing you to add or edit configurations:

```bash
# Add a new configuration
ngpt --config

# Edit an existing configuration at index 1
ngpt --config --config-index 1
```

In interactive mode:
- When editing an existing configuration, press Enter to keep the current values
- When creating a new configuration, press Enter to use default values
- For security, your API key is not displayed when editing configurations

### Configuration File

nGPT uses a configuration file stored in the standard user config directory for your operating system:

- **Linux**: `~/.config/ngpt/ngpt.conf` or `$XDG_CONFIG_HOME/ngpt/ngpt.conf`
- **macOS**: `~/Library/Application Support/ngpt/ngpt.conf`
- **Windows**: `%APPDATA%\ngpt\ngpt.conf`

The configuration file uses a JSON list format, allowing you to store multiple configurations. You can select which configuration to use with the `--config-index` argument (or by default, index 0 is used).

#### Multiple Configurations Example (`ngpt.conf`)
```json
[
  {
    "api_key": "your-openai-api-key-here",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-4o"
  },
  {
    "api_key": "your-groq-api-key-here",
    "base_url": "https://api.groq.com/openai/v1/",
    "provider": "Groq",
    "model": "llama3-70b-8192"
  },
  {
    "api_key": "your-ollama-key-if-needed",
    "base_url": "http://localhost:11434/v1/",
    "provider": "Ollama-Local",
    "model": "llama3"
  }
]
```

### Configuration Priority

nGPT determines configuration values in the following order (highest priority first):

1. Command line arguments (`--api-key`, `--base-url`, `--model`)
2. Environment variables (`OPENAI_API_KEY`, `OPENAI_BASE_URL`, `OPENAI_MODEL`)
3. Configuration file (selected by `--config-index`, defaults to index 0)
4. Default values

## Contributing

We welcome contributions to nGPT! Whether it's bug fixes, feature additions, or documentation improvements, your help is appreciated.

To contribute:

1. Fork the repository
2. Create a feature branch: `git checkout -b feature/your-feature-name`
3. Make your changes
4. Commit with clear messages following conventional commit guidelines
5. Push to your fork and submit a pull request

Please check the [CONTRIBUTING.md](CONTRIBUTING.md) file for detailed guidelines on code style, pull request process, and development setup.

## License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
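The JSON list format described under Configuration File is simple enough to read directly. The helper below is a hypothetical sketch of that — ngpt ships its own `load_config`, which may behave differently:

```python
import json
from pathlib import Path

def load_config_at_index(path, config_index: int = 0) -> dict:
    """Hypothetical sketch: read a JSON list of configurations (like
    the ngpt.conf example above) and return the entry at config_index."""
    configs = json.loads(Path(path).read_text())
    if not 0 <= config_index < len(configs):
        raise IndexError(f"no configuration at index {config_index}")
    return configs[config_index]
```

With the three-entry example file above, `load_config_at_index(path, 1)` would return the Groq entry.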
{ngpt-1.1.2 → ngpt-1.1.4}/ngpt/cli.py
@@ -35,11 +35,10 @@ def show_config_help():
     print(" 3. Or set environment variables:")
     print(" - OPENAI_API_KEY")
     print(" - OPENAI_BASE_URL")
-    print(" - OPENAI_PROVIDER")
     print(" - OPENAI_MODEL")

     print(" 4. Or provide command line arguments:")
-    print(" ngpt --api-key your-key --base-url https://api.example.com \"Your prompt\"")
+    print(" ngpt --api-key your-key --base-url https://api.example.com --model your-model \"Your prompt\"")

     print(" 5. Use --config-index to specify which configuration to use:")
     print(" ngpt --config-index 1 \"Your prompt\"")
{ngpt-1.1.2 → ngpt-1.1.4}/ngpt/client.py
@@ -10,7 +10,7 @@ class NGPTClient:
         self,
         api_key: str = "",
         base_url: str = "https://api.openai.com/v1/",
-        provider: str = "OpenAI",
+        provider: str = "OpenAI",
         model: str = "gpt-3.5-turbo"
     ):
         self.api_key = api_key
```diff
@@ -51,23 +51,44 @@ def add_config_entry(config_path: Path, config_index: Optional[int] = None) -> N
     """Add a new configuration entry or update existing one at the specified index."""
     configs = load_configs(custom_path=str(config_path))
 
-    #
-
+    # Determine if we're editing an existing config or creating a new one
+    is_existing_config = config_index is not None and config_index < len(configs)
+
+    # Set up entry based on whether we're editing or creating
+    if is_existing_config:
+        # Use existing config as the base when editing
+        entry = configs[config_index].copy()
+        print("Enter configuration details (press Enter to keep current values):")
+    else:
+        # Use default config as the base when creating new
+        entry = DEFAULT_CONFIG_ENTRY.copy()
+        print("Enter configuration details (press Enter to use default values):")
 
-    # Interactive configuration
-    print("Enter configuration details (press Enter to use default values):")
     try:
-
-
-
-
+        # For API key, just show the prompt without the current value for security
+        user_input = input(f"API Key: ")
+        if user_input:
+            entry["api_key"] = user_input
+
+        # For other fields, show current/default value and keep it if Enter is pressed
+        user_input = input(f"Base URL [{entry['base_url']}]: ")
+        if user_input:
+            entry["base_url"] = user_input
+
+        user_input = input(f"Provider [{entry['provider']}]: ")
+        if user_input:
+            entry["provider"] = user_input
+
+        user_input = input(f"Model [{entry['model']}]: ")
+        if user_input:
+            entry["model"] = user_input
 
     # Add or update the entry
-    if
-        configs[config_index] =
+    if is_existing_config:
+        configs[config_index] = entry
         print(f"Updated configuration at index {config_index}")
     else:
-        configs.append(
+        configs.append(entry)
         print(f"Added new configuration at index {len(configs)-1}")
 
     # Save the updated configs
```
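The prompt loop added above boils down to "an empty reply keeps the current value, anything else overrides it". A sketch of that merge logic, with replies supplied as a dict instead of `input()` for testability (the `DEFAULT_CONFIG_ENTRY` values are assumed here from the client defaults):

```python
# Assumed to match the defaults in ngpt/client.py
DEFAULT_CONFIG_ENTRY = {
    "api_key": "",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-3.5-turbo",
}

def updated_entry(base, replies):
    """Apply user replies to a base entry: an empty reply keeps the
    existing value, a non-empty reply overrides it."""
    entry = base.copy()
    for field, reply in replies.items():
        if reply:
            entry[field] = reply
    return entry
```

Editing an existing config passes that config as `base`; creating a new one passes `DEFAULT_CONFIG_ENTRY`, which is exactly the branch the hunk introduces.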
```diff
@@ -128,8 +149,7 @@ def load_config(custom_path: Optional[str] = None, config_index: int = 0) -> Dic
     # Override with environment variables if they exist
     env_mapping = {
         "OPENAI_API_KEY": "api_key",
-        "OPENAI_BASE_URL": "base_url",
-        "OPENAI_PROVIDER": "provider",
+        "OPENAI_BASE_URL": "base_url",
         "OPENAI_MODEL": "model"
     }
 
```
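With `OPENAI_PROVIDER` dropped, only three variables are consulted. The override step the mapping drives amounts to the following sketch (the environment is passed in explicitly here so the behaviour can be exercised without touching the real `os.environ`):

```python
import os

ENV_MAPPING = {
    "OPENAI_API_KEY": "api_key",
    "OPENAI_BASE_URL": "base_url",
    "OPENAI_MODEL": "model",
}

def apply_env_overrides(config, environ=None):
    """Copy each set environment variable over its config key,
    leaving unset variables and unrelated keys alone."""
    environ = os.environ if environ is None else environ
    merged = dict(config)
    for env_var, key in ENV_MAPPING.items():
        value = environ.get(env_var)
        if value:
            merged[key] = value
    return merged
```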
```diff
@@ -1,6 +1,6 @@
 [project]
 name = "ngpt"
-version = "1.1.2"
+version = "1.1.4"
 description = "A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints."
 authors = [
     {name = "nazDridoy", email = "nazdridoy399@gmail.com"},
```
ngpt-1.1.2/PKG-INFO DELETED

@@ -1,192 +0,0 @@

Metadata-Version: 2.4
Name: ngpt
Version: 1.1.2
Summary: A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.
Project-URL: Homepage, https://github.com/nazdridoy/ngpt
Project-URL: Repository, https://github.com/nazdridoy/ngpt
Project-URL: Bug Tracker, https://github.com/nazdridoy/ngpt/issues
Author-email: nazDridoy <nazdridoy399@gmail.com>
License: MIT
License-File: LICENSE
Keywords: ai,api-client,chatgpt,cli,gpt,gpt4free,llm,ngpt,openai
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: End Users/Desktop
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Communications :: Chat
Classifier: Topic :: Internet :: WWW/HTTP
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Utilities
Requires-Python: >=3.8
Requires-Dist: requests>=2.31.0
Description-Content-Type: text/markdown

# nGPT

A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.

## Features

- Dual mode: Use as a CLI tool or import as a library
- Minimal dependencies
- Customizable API endpoints and providers
- Streaming responses
- Web search capability (supported by compatible API endpoints)
- Cross-platform configuration system
- Experimental features:
  - Shell command generation and execution (OS-aware)
  - Code generation with clean output

## Installation

```bash
pip install ngpt
```

## Usage

### As a CLI Tool

```bash
# Basic chat (default mode)
ngpt "Hello, how are you?"

# Show version information
ngpt -v

# Show active configuration
ngpt --show-config

# Show all configurations
ngpt --show-config --all

# With custom options
ngpt --api-key your-key --base-url http://your-endpoint "Hello"

# Enable web search (if your API endpoint supports it)
ngpt --web-search "What's the latest news about AI?"

# Generate and execute shell commands (using -s or --shell flag)
ngpt -s "list all files in current directory"

# Generate code (using -c or --code flag)
ngpt -c "create a python function that calculates fibonacci numbers"
```

### As a Library

```python
from ngpt import NGPTClient, load_config

# Load the first configuration (index 0) from config file
config = load_config(config_index=0)

# Initialize the client with config
client = NGPTClient(**config)

# Or initialize with custom parameters
client = NGPTClient(
    api_key="your-key",
    base_url="http://your-endpoint",
    provider="openai",
    model="o3-mini"
)

# Chat
response = client.chat("Hello, how are you?")

# Chat with web search (if your API endpoint supports it)
response = client.chat("What's the latest news about AI?", web_search=True)

# Generate shell command
command = client.generate_shell_command("list all files")

# Generate code
code = client.generate_code("create a python function that calculates fibonacci numbers")
```

## Configuration

### Command Line Options

You can configure the client using the following options:

- `--api-key`: API key for the service
- `--base-url`: Base URL for the API
- `--model`: Model to use
- `--web-search`: Enable web search capability (Note: Your API endpoint must support this feature)
- `--config`: Path to a custom configuration file
- `--config-index`: Index of the configuration to use from the config file (default: 0)
- `--show-config`: Show configuration details and exit
- `--all`: Used with `--show-config` to display details for all configurations

### Configuration File

nGPT uses a configuration file stored in the standard user config directory for your operating system:

- **Linux**: `~/.config/ngpt/ngpt.conf` or `$XDG_CONFIG_HOME/ngpt/ngpt.conf`
- **macOS**: `~/Library/Application Support/ngpt/ngpt.conf`
- **Windows**: `%APPDATA%\ngpt\ngpt.conf`

The configuration file uses a JSON list format, allowing you to store multiple configurations. You can select which configuration to use with the `--config-index` argument (by default, index 0 is used).

#### Multiple Configurations Example (`ngpt.conf`)
```json
[
  {
    "api_key": "your-openai-api-key-here",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-4o"
  },
  {
    "api_key": "your-groq-api-key-here",
    "base_url": "https://api.groq.com/openai/v1/",
    "provider": "Groq",
    "model": "llama3-70b-8192"
  },
  {
    "api_key": "your-ollama-key-if-needed",
    "base_url": "http://localhost:11434/v1/",
    "provider": "Ollama-Local",
    "model": "llama3"
  }
]
```

### Configuration Priority

nGPT determines configuration values in the following order (highest priority first):

1. Command line arguments (`--api-key`, `--base-url`, `--model`)
2. Environment variables (`OPENAI_API_KEY`, `OPENAI_BASE_URL`, `OPENAI_MODEL`)
3. Configuration file (selected by `--config-index`, defaults to index 0)
4. Default values

## Special Features

### OS-Aware Shell Commands

Shell command generation is OS-aware, providing appropriate commands for your operating system (Windows, macOS, or Linux) and shell type (bash, powershell, etc.).

### Clean Code Generation

Code generation uses an improved prompt that ensures only clean code is returned, without markdown formatting or unnecessary explanations.

## Implementation Notes

This library uses direct HTTP requests instead of the OpenAI client library, allowing it to work with custom API endpoints that support additional parameters like `provider` and `web_search`. All parameters are sent directly in the request body.

## License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
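The "Implementation Notes" above say that non-standard fields such as `provider` and `web_search` ride directly in the chat request body alongside the usual OpenAI chat-completions parameters. A hedged sketch of such a payload builder — the names of the extra fields are assumptions drawn from the notes, not a documented wire format:

```python
def build_chat_payload(model, prompt, provider=None, web_search=False, stream=True):
    """Assemble an OpenAI-style chat-completions body, attaching the
    non-standard provider/web_search fields only when requested."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }
    if provider:
        payload["provider"] = provider
    if web_search:
        payload["web_search"] = True
    return payload
```

Because the extra keys are only added when set, the same builder works against plain OpenAI endpoints, which would reject or ignore unknown fields depending on the server.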
ngpt-1.1.2/README.md DELETED

@@ -1,159 +0,0 @@

# nGPT

A lightweight Python CLI and library for interacting with OpenAI-compatible APIs, supporting both official and self-hosted LLM endpoints.

## Features

- Dual mode: Use as a CLI tool or import as a library
- Minimal dependencies
- Customizable API endpoints and providers
- Streaming responses
- Web search capability (supported by compatible API endpoints)
- Cross-platform configuration system
- Experimental features:
  - Shell command generation and execution (OS-aware)
  - Code generation with clean output

## Installation

```bash
pip install ngpt
```

## Usage

### As a CLI Tool

```bash
# Basic chat (default mode)
ngpt "Hello, how are you?"

# Show version information
ngpt -v

# Show active configuration
ngpt --show-config

# Show all configurations
ngpt --show-config --all

# With custom options
ngpt --api-key your-key --base-url http://your-endpoint "Hello"

# Enable web search (if your API endpoint supports it)
ngpt --web-search "What's the latest news about AI?"

# Generate and execute shell commands (using -s or --shell flag)
ngpt -s "list all files in current directory"

# Generate code (using -c or --code flag)
ngpt -c "create a python function that calculates fibonacci numbers"
```

### As a Library

```python
from ngpt import NGPTClient, load_config

# Load the first configuration (index 0) from config file
config = load_config(config_index=0)

# Initialize the client with config
client = NGPTClient(**config)

# Or initialize with custom parameters
client = NGPTClient(
    api_key="your-key",
    base_url="http://your-endpoint",
    provider="openai",
    model="o3-mini"
)

# Chat
response = client.chat("Hello, how are you?")

# Chat with web search (if your API endpoint supports it)
response = client.chat("What's the latest news about AI?", web_search=True)

# Generate shell command
command = client.generate_shell_command("list all files")

# Generate code
code = client.generate_code("create a python function that calculates fibonacci numbers")
```

## Configuration

### Command Line Options

You can configure the client using the following options:

- `--api-key`: API key for the service
- `--base-url`: Base URL for the API
- `--model`: Model to use
- `--web-search`: Enable web search capability (Note: Your API endpoint must support this feature)
- `--config`: Path to a custom configuration file
- `--config-index`: Index of the configuration to use from the config file (default: 0)
- `--show-config`: Show configuration details and exit
- `--all`: Used with `--show-config` to display details for all configurations

### Configuration File

nGPT uses a configuration file stored in the standard user config directory for your operating system:

- **Linux**: `~/.config/ngpt/ngpt.conf` or `$XDG_CONFIG_HOME/ngpt/ngpt.conf`
- **macOS**: `~/Library/Application Support/ngpt/ngpt.conf`
- **Windows**: `%APPDATA%\ngpt\ngpt.conf`

The configuration file uses a JSON list format, allowing you to store multiple configurations. You can select which configuration to use with the `--config-index` argument (by default, index 0 is used).

#### Multiple Configurations Example (`ngpt.conf`)
```json
[
  {
    "api_key": "your-openai-api-key-here",
    "base_url": "https://api.openai.com/v1/",
    "provider": "OpenAI",
    "model": "gpt-4o"
  },
  {
    "api_key": "your-groq-api-key-here",
    "base_url": "https://api.groq.com/openai/v1/",
    "provider": "Groq",
    "model": "llama3-70b-8192"
  },
  {
    "api_key": "your-ollama-key-if-needed",
    "base_url": "http://localhost:11434/v1/",
    "provider": "Ollama-Local",
    "model": "llama3"
  }
]
```

### Configuration Priority

nGPT determines configuration values in the following order (highest priority first):

1. Command line arguments (`--api-key`, `--base-url`, `--model`)
2. Environment variables (`OPENAI_API_KEY`, `OPENAI_BASE_URL`, `OPENAI_MODEL`)
3. Configuration file (selected by `--config-index`, defaults to index 0)
4. Default values

## Special Features

### OS-Aware Shell Commands

Shell command generation is OS-aware, providing appropriate commands for your operating system (Windows, macOS, or Linux) and shell type (bash, powershell, etc.).

### Clean Code Generation

Code generation uses an improved prompt that ensures only clean code is returned, without markdown formatting or unnecessary explanations.

## Implementation Notes

This library uses direct HTTP requests instead of the OpenAI client library, allowing it to work with custom API endpoints that support additional parameters like `provider` and `web_search`. All parameters are sent directly in the request body.

## License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.