model_forge_llm-0.2.0-py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
model_forge_llm-0.2.0.dist-info/METADATA ADDED
@@ -0,0 +1,327 @@
1
+ Metadata-Version: 2.4
2
+ Name: model-forge-llm
3
+ Version: 0.2.0
4
+ Summary: A reusable library for managing LLM providers, authentication, and model selection.
5
+ Author: ModelForge Contributors
6
+ License: MIT License
7
+
8
+ Copyright (c) 2025 Shuhai Miao
9
+
10
+ Permission is hereby granted, free of charge, to any person obtaining a copy
11
+ of this software and associated documentation files (the "Software"), to deal
12
+ in the Software without restriction, including without limitation the rights
13
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
14
+ copies of the Software, and to permit persons to whom the Software is
15
+ furnished to do so, subject to the following conditions:
16
+
17
+ The above copyright notice and this permission notice shall be included in all
18
+ copies or substantial portions of the Software.
19
+
20
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
21
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
22
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
23
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
24
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
25
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
26
+ SOFTWARE.
27
+
28
+ Project-URL: Homepage, https://github.com/your-org/model-forge
29
+ Project-URL: Repository, https://github.com/your-org/model-forge
30
+ Project-URL: Issues, https://github.com/your-org/model-forge/issues
31
+ Project-URL: Documentation, https://model-forge.readthedocs.io
32
+ Keywords: llm,ai,langchain,openai,ollama,providers,authentication
33
+ Classifier: Development Status :: 4 - Beta
34
+ Classifier: Intended Audience :: Developers
35
+ Classifier: License :: OSI Approved :: MIT License
36
+ Classifier: Operating System :: OS Independent
37
+ Classifier: Programming Language :: Python :: 3
38
+ Classifier: Programming Language :: Python :: 3.11
39
+ Classifier: Programming Language :: Python :: 3.12
40
+ Classifier: Topic :: Software Development :: Libraries :: Python Modules
41
+ Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
42
+ Requires-Python: <4.0,>=3.11
43
+ Description-Content-Type: text/markdown
44
+ License-File: LICENSE
45
+ Requires-Dist: click>=8.1.7
46
+ Requires-Dist: requests>=2.32.3
47
+ Requires-Dist: langchain-core>=0.3.0
48
+ Requires-Dist: langchain-openai>=0.3.0
49
+ Requires-Dist: langchain-community>=0.3.0
50
+ Requires-Dist: langchain-google-genai>=2.1.5
51
+ Requires-Dist: langchain-github-copilot>=0.4.0
52
+ Provides-Extra: dev
53
+ Requires-Dist: pytest>=8.2.2; extra == "dev"
54
+ Requires-Dist: pytest-mock>=3.14.0; extra == "dev"
55
+ Requires-Dist: ruff>=0.7.0; extra == "dev"
56
+ Requires-Dist: mypy>=1.11.0; extra == "dev"
57
+ Requires-Dist: pre-commit>=3.8.0; extra == "dev"
58
+ Requires-Dist: pytest-cov>=5.0.0; extra == "dev"
59
+ Requires-Dist: types-requests>=2.32.0; extra == "dev"
60
+ Requires-Dist: twine>=6.1.0; extra == "dev"
61
+ Requires-Dist: requests-mock>=1.12.1; extra == "dev"
62
+ Dynamic: license-file
63
+
64
+ # Model Forge Library
65
+
66
+ A reusable library for managing LLM providers, authentication, and model selection.
67
+
68
+ This library is intended to be used by various Python-based AI projects to provide a consistent way to handle LLM interactions.
69
+
70
+ ## High-Level Design
71
+
72
+ The library is composed of three core modules:
73
+
74
+ - **`config`**: Manages configuration files with a two-tier system of global (`~/.config/model-forge/config.json`) and local (`./.model-forge/config.json`) files, where all provider and model settings are stored.
75
+ - **`auth`**: Provides a suite of authentication strategies (API Key, OAuth 2.0 Device Flow, and a No-Op for local models) and handles secure credential storage in configuration files.
76
+ - **`registry`**: Acts as the main entry point and factory. It reads the configuration, invokes the appropriate authentication strategy, and instantiates ready-to-use, LangChain-compatible LLM objects.
77
+
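+ The snippet below sketches that flow (read config, resolve auth, build a LangChain-compatible model) in plain Python. It is purely illustrative: apart from the responsibilities described above, the function name, dictionary keys, and parameters are hypothetical and are not ModelForge's internal API.
+
+ ```python
+ from typing import Any, Callable
+
+ # Hypothetical sketch of the config -> auth -> registry flow described above.
+ # None of these names are ModelForge internals.
+ def build_llm(
+     model_id: str,
+     config: dict[str, Any],
+     auth_strategies: dict[str, Callable[[dict[str, Any]], dict[str, Any]]],
+     factories: dict[str, Callable[..., Any]],
+ ) -> Any:
+     provider, _, model_name = model_id.partition("/")  # e.g. "ollama/qwen3:1.7b"
+     provider_cfg = config["providers"][provider]  # settings loaded by the config module
+     creds = auth_strategies[provider_cfg["auth"]](provider_cfg)  # API key, OAuth token, or no-op
+     return factories[provider](model=model_name, **creds)  # ready-to-use LLM instance
+ ```
+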
78
+ ## 🛠️ **Quick Start**
79
+
80
+ ### **Option 1: Traditional Development Setup (Recommended)**
81
+ Best for developers who will use ModelForge frequently:
82
+
83
+ ```bash
84
+ # 1. Run setup script
85
+ ./setup.sh
86
+
87
+ # 2. Use Poetry directly (faster for repeated use)
88
+ poetry run modelforge config show
89
+ poetry run modelforge config add --provider openai --model gpt-4
90
+ ```
91
+
92
+ ### **Option 2: Wrapper Script (Quick Usage)**
93
+ Best for occasional use, CI/CD, or Docker environments:
94
+
95
+ ```bash
96
+ # Single command that handles setup + execution
97
+ ./modelforge.sh config show
98
+ ./modelforge.sh config add --provider openai --model gpt-4
99
+ ```
100
+
101
+ **Performance Comparison:**
102
+ - **Traditional**: ~0.9s per command
103
+ - **Wrapper**: ~1.6s per command (includes setup overhead)
104
+
105
+ ## Local Development & Testing
106
+
107
+ To test the library locally, you can use the built-in Command-Line Interface (CLI).
108
+
109
+ **Option 1: Using the setup script (recommended)**
110
+ ```bash
111
+ ./setup.sh
112
+ ```
113
+
114
+ **Option 2: Manual setup**
115
+ 1. **Set up a virtual environment:**
116
+ ```bash
117
+ python -m venv venv
118
+ source venv/bin/activate
119
+ ```
120
+
121
+ 2. **Install the library in editable mode:**
122
+ This allows you to use the CLI, and any code changes take effect immediately without reinstalling.
123
+ ```bash
124
+ pip install -e .
125
+ ```
126
+
127
+ 3. **Use the CLI to manage your models:**
128
+ ```bash
129
+ # Show the current configuration
130
+ modelforge config show
131
+
132
+ # Add a local Ollama model
133
+ modelforge config add --provider ollama --model qwen3:1.7b
134
+
135
+ # Add OpenAI models with API key
136
+ modelforge config add --provider openai --model gpt-4o-mini --api-key "YOUR_API_KEY_HERE"
137
+ modelforge config add --provider openai --model gpt-4o --api-model-name "gpt-4o" --api-key "YOUR_API_KEY_HERE"
138
+
139
+ # Add a provider requiring an API key (Google Gemini)
140
+ modelforge config add --provider google --model gemini-pro --api-model-name "gemini-1.5-pro" --api-key "YOUR_API_KEY_HERE"
141
+
142
+ # Add GitHub Copilot and trigger the device authentication flow
143
+ modelforge config add --provider github_copilot --model claude-3.7-sonnet --dev-auth
144
+
145
+ # Set a model to be the default
146
+ modelforge config use --provider ollama --model qwen3:1.7b
147
+ ```
148
+
149
+ ## Available Models and Providers
150
+
151
+ **📚 Model Reference:**
152
+ For a comprehensive list of available providers and models, visit **[models.dev](https://models.dev)** - your go-to resource for:
153
+
154
+ - **Provider Documentation**: Detailed information about each LLM provider
155
+ - **Model Specifications**: Complete model listings with capabilities and pricing
156
+ - **API References**: Authentication methods and integration guides
157
+ - **Model Comparisons**: Performance metrics and use case recommendations
158
+
159
+ **Supported Providers:**
160
+ - **OpenAI**: GPT-4, GPT-4o, GPT-3.5-turbo, and more
161
+ - **Ollama**: Local models like Llama, Qwen, Mistral, and others
162
+ - **GitHub Copilot**: Claude, GPT-4, and other models via GitHub *(Enhanced Support)*
163
+ - **Google Gemini**: Gemini Pro, Gemini Flash, and other Google models
164
+
165
+ ### 🚀 **Enhanced GitHub Copilot Support**
166
+
167
+ ModelForge provides **two-tier GitHub Copilot integration** for optimal performance:
168
+
169
+ #### **🎯 Tier 1: Dedicated ChatGitHubCopilot (Recommended)**
170
+ When `langchain-github-copilot` is installed, ModelForge uses the specialized GitHub Copilot class:
171
+
172
+ ```bash
173
+ # Install the enhanced GitHub Copilot support
174
+ poetry add langchain-github-copilot
175
+
176
+ # Add GitHub Copilot with device authentication
177
+ ./modelforge.sh config add --provider github_copilot --model claude-3.7-sonnet --dev-auth
178
+ ```
179
+
180
+ **Benefits:**
181
+ - ✅ **Optimized for 25-minute token lifecycle**
182
+ - ✅ **GitHub-specific rate limiting**
183
+ - ✅ **Enhanced error handling**
184
+ - ✅ **Built-in token refresh**
185
+
186
+ #### **🔄 Tier 2: OpenAI-Compatible Fallback**
187
+ If `langchain-github-copilot` is not available, ModelForge automatically falls back to OpenAI-compatible mode:
188
+
189
+ ```bash
190
+ # Works even without langchain-github-copilot installed
191
+ ./modelforge.sh config add --provider github_copilot --model claude-3.7-sonnet --dev-auth
192
+ ```
193
+
194
+ **Characteristics:**
195
+ - ⚡ **Universal compatibility**
196
+ - 🛠️ **Manual token management**
197
+ - 📊 **Standard OpenAI interface**
198
+
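+ In code, the two tiers reduce to an import-and-fall-back pattern. The sketch below is illustrative only: the `ChatGitHubCopilot` class name and the endpoint URL are assumptions, not values taken from ModelForge itself.
+
+ ```python
+ # Illustrative two-tier pattern. ChatGitHubCopilot and the endpoint URL are
+ # assumptions; ModelForge's own fallback logic may differ.
+ from langchain_openai import ChatOpenAI
+
+ try:
+     from langchain_github_copilot import ChatGitHubCopilot  # Tier 1, if installed
+     llm = ChatGitHubCopilot(model="claude-3.7-sonnet")
+ except ImportError:
+     # Tier 2: OpenAI-compatible fallback; token management is manual here
+     llm = ChatOpenAI(
+         model="claude-3.7-sonnet",
+         base_url="https://api.githubcopilot.com",  # assumed endpoint
+         api_key="YOUR_COPILOT_TOKEN",
+     )
+ ```
+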
199
+ #### **🔍 Installation Options**
200
+
201
+ ```bash
202
+ # Option 1: Full installation with GitHub Copilot enhancement
203
+ git clone <repo>
204
+ cd model-forge
205
+ ./setup.sh
206
+ poetry add langchain-github-copilot
207
+
208
+ # Option 2: Basic installation (fallback mode)
209
+ git clone <repo>
210
+ cd model-forge
211
+ ./setup.sh
212
+ # Uses OpenAI-compatible fallback automatically
213
+ ```
214
+
215
+ Use [models.dev](https://models.dev) to explore the full ecosystem and find the perfect model for your use case!
216
+
217
+ ## Configuration System
218
+
219
+ ModelForge uses a **two-tier configuration system** that provides flexibility for both personal and project-specific setups:
220
+
221
+ ### 🌍 **Global Configuration** (`~/.config/model-forge/config.json`)
222
+ - **Location**: User's config directory (follows XDG Base Directory Standard)
223
+ - **Purpose**: System-wide model configurations shared across all projects
224
+ - **Use case**: Personal API keys, frequently used models, default settings
225
+
226
+ ### 📁 **Local Configuration** (`./.model-forge/config.json`)
227
+ - **Location**: Current working directory (project-specific)
228
+ - **Purpose**: Project-specific model configurations
229
+ - **Use case**: Team projects, specific model requirements, environment-specific settings
230
+
231
+ ### 🔄 **Precedence Rules**
232
+ 1. **Local First**: If a local config exists, it takes precedence
233
+ 2. **Global Fallback**: If no local config, the global config is used
234
+ 3. **Auto-Creation**: If neither exists, a new global config is created
235
+
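+ A minimal sketch of those rules, assuming only the two paths documented above (the actual logic lives in `modelforge.config` and may differ):
+
+ ```python
+ # Sketch of the local-first / global-fallback / auto-create lookup.
+ from pathlib import Path
+
+ LOCAL_CONFIG = Path("./.model-forge/config.json")
+ GLOBAL_CONFIG = Path.home() / ".config" / "model-forge" / "config.json"
+
+ def active_config_path() -> Path:
+     if LOCAL_CONFIG.exists():  # 1. local first
+         return LOCAL_CONFIG
+     if GLOBAL_CONFIG.exists():  # 2. global fallback
+         return GLOBAL_CONFIG
+     GLOBAL_CONFIG.parent.mkdir(parents=True, exist_ok=True)  # 3. auto-create global
+     GLOBAL_CONFIG.write_text("{}")
+     return GLOBAL_CONFIG
+ ```
+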
236
+ ### 💡 **Managing Configurations**
237
+ ```bash
238
+ # View current configuration (shows which config is active)
239
+ modelforge config show
240
+
241
+ # Add to global configuration (default)
242
+ modelforge config add --provider openai --model gpt-4o --api-key "YOUR_KEY"
243
+
244
+ # Add to local configuration (project-specific)
245
+ modelforge config add --provider openai --model gpt-4o --api-key "YOUR_KEY" --local
246
+ ```
247
+
248
+ Both configuration files use the same JSON structure and are fully compatible with all ModelForge features.
249
+
250
+ ## Code Quality & Development
251
+
252
+ ModelForge maintains high code quality standards with automated tooling:
253
+
254
+ ### 🔧 **Quality Tools**
255
+ - **Ruff**: Fast linting and formatting
256
+ - **MyPy**: Type checking for reliability
257
+ - **Pre-commit**: Automated quality checks
258
+ - **GitHub Actions**: CI/CD pipeline
259
+ - **Pytest**: Comprehensive testing with coverage
260
+
261
+ ### 📋 **Code Review Guidelines**
262
+ We provide comprehensive code review guidelines for consistent quality:
263
+ - **[Detailed Guidelines](CODE_REVIEW_GUIDELINES.md)**: Complete review criteria and examples
264
+ - **[LLM Prompt](PROMPT_CODE_REVIEW.md)**: Quick prompt for AI-assisted code reviews
265
+
266
+ ### 🚀 **Development Commands**
267
+ ```bash
268
+ # Format and check code
269
+ poetry run ruff format .
270
+ poetry run ruff check .
271
+
272
+ # Type checking
273
+ poetry run mypy src/modelforge
274
+
275
+ # Run tests with coverage
276
+ poetry run pytest --cov=src/modelforge
277
+
278
+ # Run all quality checks
279
+ poetry run pre-commit run --all-files
280
+ ```
281
+
282
+ ## Integration Guide
283
+
284
+ To use this library in a host application (e.g., RAG-Forge):
285
+
286
+ 1. **Install the library:**
287
+ ```bash
288
+ # Quick setup (recommended for development)
289
+ cd /path/to/model-forge && ./setup.sh
290
+
291
+ # Or install manually from a local path
292
+ pip install -e /path/to/model-forge
293
+
294
+ # In the future, you would install from a package registry like PyPI
295
+ # pip install model-forge-llm
296
+ ```
297
+
298
+ 2. **Use the `ModelForgeRegistry` in your application:**
299
+ ```python
300
+ from modelforge.registry import ModelForgeRegistry
301
+
302
+ # 1. Initialize the registry
303
+ registry = ModelForgeRegistry()
304
+
305
+ # 2. See which models the user has configured
306
+ available_models = registry.list_models()
307
+ print(f"Available models: {available_models}")
308
+ # Example output: ['ollama/qwen3:1.7b', 'github_copilot/claude-3.7-sonnet']
309
+
310
+ # 3. Get a fully authenticated model instance
311
+ if available_models:
312
+ model_id = available_models[0]
313
+ llm = registry.get_model_instance(model_id)
314
+
315
+ if llm:
316
+ # Now you have a LangChain-compatible LLM object to use
317
+ response = llm.invoke("Tell me a joke.")
318
+ print(response)
319
+ ```
320
+
321
+ ## Features
322
+
323
+ - **Multi-Provider Support**: OpenAI, Ollama, GitHub Copilot, Google Gemini
324
+ - **Flexible Authentication**: API Key, OAuth 2.0 Device Flow, Local (no auth)
325
+ - **Secure Credential Storage**: Stores API keys and tokens in configuration files
326
+ - **LangChain Integration**: Provides ready-to-use LangChain-compatible model instances
327
+ - **Centralized Configuration**: A single configuration system (with global and local tiers) managing all providers and models
model_forge_llm-0.2.0.dist-info/RECORD ADDED
@@ -0,0 +1,14 @@
1
+ model_forge_llm-0.2.0.dist-info/licenses/LICENSE,sha256=g3TKKxfqGlKmBRdCKelUPNrTUBdvDzn0yopE8WeJNDw,1068
2
+ modelforge/__init__.py,sha256=mTsSBmyZPmvMm92g9N0S_47QWM5raWuMLd2f9i81rNg,183
3
+ modelforge/auth.py,sha256=ju0ARgMCEb1cnEY_nJTbGR1o3MaEd-_H4YxFTpbE9Vg,18680
4
+ modelforge/cli.py,sha256=QTgHhUq3gtOb74XLzRAhsX8wF0cdwXPgSwGee2BvIeU,27189
5
+ modelforge/config.py,sha256=77dF1q2P2fo7hBF7YR8OBdcn3yVSSqkQXvJ3NoJvQh8,6814
6
+ modelforge/exceptions.py,sha256=Ld3RG1DuUiekAa1JM4-ivOqfHleN5o2rve1QJ-yZ5No,720
7
+ modelforge/logging_config.py,sha256=SQlPMQkAIRIoItvcfM_P5ZEvfnsvc_O8tniuHN8keNM,1754
8
+ modelforge/modelsdev.py,sha256=9hKPBlE4E2-FY_OTRfxyWkchpmHMwf4rzNh8om4UW5Q,14083
9
+ modelforge/registry.py,sha256=MW3mxYrFdKw5URc4Tn0g9IJIo0DcC8h0YHAXKxjsL5E,10136
10
+ model_forge_llm-0.2.0.dist-info/METADATA,sha256=fkH-IavtAegD3D4Uapef3eEfVdu372HPTjqavM6-43E,12515
11
+ model_forge_llm-0.2.0.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
12
+ model_forge_llm-0.2.0.dist-info/entry_points.txt,sha256=y5CZ8kHJCfMVzoj4KX6dMnlMHb4DAvgRy6jUlM0a8wE,50
13
+ model_forge_llm-0.2.0.dist-info/top_level.txt,sha256=ZyKkWfjMHbQ1mlQyLT2xZlbeT0M1jOjB4nRWVR_YiI8,11
14
+ model_forge_llm-0.2.0.dist-info/RECORD,,
model_forge_llm-0.2.0.dist-info/WHEEL ADDED
@@ -0,0 +1,5 @@
1
+ Wheel-Version: 1.0
2
+ Generator: setuptools (80.9.0)
3
+ Root-Is-Purelib: true
4
+ Tag: py3-none-any
5
+
model_forge_llm-0.2.0.dist-info/entry_points.txt ADDED
@@ -0,0 +1,2 @@
1
+ [console_scripts]
2
+ modelforge = modelforge.cli:cli
model_forge_llm-0.2.0.dist-info/licenses/LICENSE ADDED
@@ -0,0 +1,21 @@
1
+ MIT License
2
+
3
+ Copyright (c) 2025 Shuhai Miao
4
+
5
+ Permission is hereby granted, free of charge, to any person obtaining a copy
6
+ of this software and associated documentation files (the "Software"), to deal
7
+ in the Software without restriction, including without limitation the rights
8
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
+ copies of the Software, and to permit persons to whom the Software is
10
+ furnished to do so, subject to the following conditions:
11
+
12
+ The above copyright notice and this permission notice shall be included in all
13
+ copies or substantial portions of the Software.
14
+
15
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
+ SOFTWARE.
model_forge_llm-0.2.0.dist-info/top_level.txt ADDED
@@ -0,0 +1 @@
1
+ modelforge
modelforge/__init__.py ADDED
@@ -0,0 +1,7 @@
1
+ """ModelForge: A reusable library for managing LLM providers and authentication."""
2
+
3
+ __version__ = "0.2.0"
4
+
5
+ from .registry import ModelForgeRegistry
6
+
7
+ __all__ = ["ModelForgeRegistry"]