ccs-llmconnector 1.0.0-py3-none-any.whl → 1.0.2-py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {ccs_llmconnector-1.0.0.dist-info → ccs_llmconnector-1.0.2.dist-info}/METADATA +48 -12
- {ccs_llmconnector-1.0.0.dist-info → ccs_llmconnector-1.0.2.dist-info}/RECORD +6 -6
- {ccs_llmconnector-1.0.0.dist-info → ccs_llmconnector-1.0.2.dist-info}/WHEEL +0 -0
- {ccs_llmconnector-1.0.0.dist-info → ccs_llmconnector-1.0.2.dist-info}/entry_points.txt +0 -0
- {ccs_llmconnector-1.0.0.dist-info → ccs_llmconnector-1.0.2.dist-info}/licenses/LICENSE +0 -0
- {ccs_llmconnector-1.0.0.dist-info → ccs_llmconnector-1.0.2.dist-info}/top_level.txt +0 -0
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
-Name:
-Version: 1.0.
+Name: ccs-llmconnector
+Version: 1.0.2
 Summary: Lightweight wrapper around different LLM provider Python SDK Responses APIs.
 Author: CCS
 License: MIT
@@ -14,18 +14,22 @@ Requires-Dist: anthropic
 Requires-Dist: xai-sdk
 Dynamic: license-file
 
-# llmconnector
+# ccs-llmconnector
 
-`llmconnector` is a thin Python wrapper around leading large-language-model SDKs,
-including the OpenAI Responses API, Google's Gemini SDK,
-Messages API. It exposes a minimal interface that
-options such as API key, prompt, optional reasoning
-and image inputs, and
-your account with each provider.
+`ccs-llmconnector` is a thin Python wrapper around leading large-language-model SDKs,
+including the OpenAI Responses API, Google's Gemini SDK, Anthropic's Claude
+Messages API, and xAI's Grok chat API. It exposes a minimal interface that
+forwards the most common options such as API key, prompt, optional reasoning
+effort hints, token limits, and image inputs, and includes helpers to enumerate
+the models available to your account with each provider.
 
 ## Installation
 
 ```bash
+# from PyPI (normalized project name)
+pip install ccs-llmconnector
+
+# or from source (this repository)
 pip install .
 ```
 
@@ -34,6 +38,7 @@ pip install .
 - `openai` (installed automatically with the package)
 - `google-genai` (installed automatically with the package)
 - `anthropic` (installed automatically with the package)
+- `xai-sdk` (installed automatically with the package; requires Python 3.10+)
 
 ## Components
 
@@ -166,7 +171,8 @@ for model in client.list_models(api_key="sk-ant-api-key"):
 
 Use the `GrokClient` when you want direct access to xAI's Grok chat API.
 
-> Requires the `xai-sdk` Python package (
+> Requires the `xai-sdk` Python package (installed automatically with `llmconnector`).
+> Note: `xai-sdk` targets Python 3.10 and newer.
 
 ```python
 from llmconnector import GrokClient
@@ -278,7 +284,7 @@ for model in client.list_models(api_key="sk-your-api-key"):
 
 ### Usage
 
-`LLMClient` routes requests to whichever provider has been registered; OpenAI, Gemini, and
+`LLMClient` routes requests to whichever provider has been registered; OpenAI, Gemini, Anthropic, and Grok (alias: `xai`) are configured by default when their dependencies are available. The client also exposes `list_models` to surface the identifiers available for the selected provider.
 
 ```python
 from llmconnector import LLMClient
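The registry-based routing that the new README text describes can be sketched in a few lines. This is a toy illustration of the pattern only, not the package's actual implementation; the class and method names below mirror the README (`register_provider`, `generate_response`) but the real signatures may differ.

```python
# Toy sketch of provider-registry routing as described in the README.
# Illustrative only; not ccs-llmconnector's actual code.
from typing import Callable, Dict


class RegistryClient:
    """Routes generate_response calls to whichever provider is registered."""

    def __init__(self) -> None:
        self._providers: Dict[str, Callable[..., str]] = {}

    def register_provider(self, name: str, client: Callable[..., str]) -> None:
        self._providers[name] = client

    def generate_response(self, provider: str, api_key: str, prompt: str, model: str) -> str:
        if provider not in self._providers:
            raise KeyError(f"unknown provider: {provider}")
        return self._providers[provider](api_key=api_key, prompt=prompt, model=model)


client = RegistryClient()
# A stub provider standing in for, e.g., an OpenAI-backed client.
client.register_provider("openai", lambda api_key, prompt, model: f"[{model}] echo: {prompt}")
print(client.generate_response("openai", api_key="sk-test", prompt="Hello", model="gpt-4o"))
```

An unregistered provider name raises `KeyError`, which matches the README's point that only registered providers are routable.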
@@ -328,7 +334,7 @@ for model in llm_client.list_models(provider="openai", api_key="sk-your-api-key"
 
 | Parameter | Type | Required | Description |
 |-----------|------|----------|-------------|
-| `provider` | `str` | Yes | Registered provider key (default registry includes `'openai'`, `'gemini'`,
+| `provider` | `str` | Yes | Registered provider key (default registry includes `'openai'`, `'gemini'`, `'anthropic'`, `'grok'`/`'xai'`). |
 | `api_key` | `str` | Yes | Provider-specific API key. |
 | `prompt` | `Optional[str]` | Conditional | Plain-text prompt. Required unless `images` is supplied. |
 | `model` | `str` | Yes | Provider-specific model identifier. |
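The `Conditional` requirement in the parameter table (`prompt` is required unless `images` is supplied) can be captured by a small check. The helper below is hypothetical, written only to illustrate the documented rule; it is not part of the package.

```python
# Hypothetical check for the documented rule:
# 'prompt' is required unless 'images' is supplied.
from typing import Optional, Sequence


def validate_inputs(prompt: Optional[str], images: Optional[Sequence[bytes]]) -> None:
    """Raise ValueError when neither a prompt nor any images are provided."""
    if not prompt and not images:
        raise ValueError("either 'prompt' or 'images' must be supplied")


validate_inputs(prompt="Describe this image", images=None)  # ok: prompt alone
validate_inputs(prompt=None, images=[b"\x89PNG"])           # ok: images alone
```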
@@ -339,6 +345,34 @@ for model in llm_client.list_models(provider="openai", api_key="sk-your-api-key"
 Use `LLMClient.register_provider(name, client)` to add additional providers that implement
 `generate_response` with the same signature.
 
+## CLI
+
+The package provides a simple CLI entry point named `client_cli` (see `pyproject.toml`).
+It reads API keys from environment variables and supports generating responses and
+listing models.
+
+- API key environment variables:
+  - OpenAI: `OPENAI_API_KEY`
+  - Gemini: `GEMINI_API_KEY` (fallback: `GOOGLE_API_KEY`)
+  - Anthropic: `ANTHROPIC_API_KEY`
+  - Grok/xAI: `GROK_API_KEY` or `XAI_API_KEY` (either works)
+
+Examples:
+
+```bash
+# Generate a response
+client_cli respond --provider openai --model gpt-4o --prompt "Hello!"
+
+# List models for one provider (human-readable)
+client_cli models --provider gemini
+
+# List models for one provider (JSON)
+client_cli models --provider anthropic --json
+
+# List models for all registered providers
+client_cli all-models
+```
+
 ## Development
 
 The project uses a standard `pyproject.toml` based packaging layout with sources
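The environment-variable lookup order documented in the new CLI section (primary variable first, then the listed fallback) can be sketched with a small resolver. This is a sketch under the assumption that the CLI checks the variables in the documented order; the helper name is illustrative, not the package's API.

```python
# Sketch of the documented API-key lookup order; illustrative only.
import os
from typing import Mapping, Optional

# Per-provider variables, checked left to right as the CLI section documents.
KEY_VARS = {
    "openai": ("OPENAI_API_KEY",),
    "gemini": ("GEMINI_API_KEY", "GOOGLE_API_KEY"),  # GOOGLE_API_KEY is the fallback
    "anthropic": ("ANTHROPIC_API_KEY",),
    "grok": ("GROK_API_KEY", "XAI_API_KEY"),         # either variable works
}


def resolve_api_key(provider: str, env: Optional[Mapping[str, str]] = None) -> Optional[str]:
    """Return the first non-empty key variable set for the provider, else None."""
    env = os.environ if env is None else env
    for var in KEY_VARS.get(provider, ()):
        value = env.get(var)
        if value:
            return value
    return None


print(resolve_api_key("gemini", {"GOOGLE_API_KEY": "g-123"}))  # falls back to GOOGLE_API_KEY
```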
@@ -347,3 +381,5 @@ stored under `src/`. Install the project in editable mode and run your preferred
 ```bash
 pip install -e .
 ```
+
+Requires Python 3.8+.
@@ -1,4 +1,4 @@
-ccs_llmconnector-1.0.
+ccs_llmconnector-1.0.2.dist-info/licenses/LICENSE,sha256=YYl_gt0O2aJW046pklgKWlVVZZpFcTIOsycrs69ltn4,1061
 llmconnector/__init__.py,sha256=SCCVGnaj8aFeE5ugvgf2bGmCLt29R__hoRu0qKhJA4c,1174
 llmconnector/anthropic_client.py,sha256=v-BEEZQFEY5GunbzSnZLy0dSAmMqaegc2u86WPvCLOE,6196
 llmconnector/client.py,sha256=SVn-afiwjdnFnlDflN-WiGK1wdFyazh5wSmUTtRWQmU,4834
@@ -7,8 +7,8 @@ llmconnector/gemini_client.py,sha256=OoLI9Svk0XTXqPDFsY6u4cdiVQV7t3AOGN_yIBGiMIg
 llmconnector/grok_client.py,sha256=ZLDIkb17ayLYxWxs0KZYbpR_PD1Ocze0fAmUXxouS0k,4678
 llmconnector/openai_client.py,sha256=3FTOq-M0IP3n9Q48P9JDCT46wP79OAVW4klAhFeuT2g,4717
 llmconnector/py.typed,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
-ccs_llmconnector-1.0.
-ccs_llmconnector-1.0.
-ccs_llmconnector-1.0.
-ccs_llmconnector-1.0.
-ccs_llmconnector-1.0.
+ccs_llmconnector-1.0.2.dist-info/METADATA,sha256=Q1Q53upLNJwsHGXLl5xTRd26pKa-nv1nn_GZnZX71PY,14051
+ccs_llmconnector-1.0.2.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+ccs_llmconnector-1.0.2.dist-info/entry_points.txt,sha256=eFvLY3nHAG_QhaKlemhhK7echfezW0KiMdSNMZOStLc,60
+ccs_llmconnector-1.0.2.dist-info/top_level.txt,sha256=Doer7TAUsN8UXQfPHPNsuBXVNCz2uV-Q0v4t4fwv_MM,13
+ccs_llmconnector-1.0.2.dist-info/RECORD,,
Files without changes: WHEEL, entry_points.txt, licenses/LICENSE, top_level.txt