lollms-client 0.24.2__py3-none-any.whl → 0.27.0__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- lollms_client/__init__.py +3 -2
- lollms_client/llm_bindings/azure_openai/__init__.py +364 -0
- lollms_client/llm_bindings/claude/__init__.py +549 -0
- lollms_client/llm_bindings/gemini/__init__.py +501 -0
- lollms_client/llm_bindings/grok/__init__.py +536 -0
- lollms_client/llm_bindings/groq/__init__.py +292 -0
- lollms_client/llm_bindings/hugging_face_inference_api/__init__.py +307 -0
- lollms_client/llm_bindings/litellm/__init__.py +201 -0
- lollms_client/llm_bindings/lollms/__init__.py +2 -0
- lollms_client/llm_bindings/mistral/__init__.py +298 -0
- lollms_client/llm_bindings/open_router/__init__.py +304 -0
- lollms_client/llm_bindings/openai/__init__.py +30 -9
- lollms_client/lollms_core.py +338 -162
- lollms_client/lollms_discussion.py +135 -37
- lollms_client/lollms_llm_binding.py +4 -0
- lollms_client/lollms_types.py +9 -1
- lollms_client/lollms_utilities.py +68 -0
- lollms_client/mcp_bindings/remote_mcp/__init__.py +82 -4
- lollms_client-0.27.0.dist-info/METADATA +604 -0
- {lollms_client-0.24.2.dist-info → lollms_client-0.27.0.dist-info}/RECORD +23 -14
- lollms_client-0.24.2.dist-info/METADATA +0 -239
- {lollms_client-0.24.2.dist-info → lollms_client-0.27.0.dist-info}/WHEEL +0 -0
- {lollms_client-0.24.2.dist-info → lollms_client-0.27.0.dist-info}/licenses/LICENSE +0 -0
- {lollms_client-0.24.2.dist-info → lollms_client-0.27.0.dist-info}/top_level.txt +0 -0
lollms_client-0.24.2.dist-info/METADATA
@@ -1,239 +0,0 @@
-Metadata-Version: 2.4
-Name: lollms_client
-Version: 0.24.2
-Summary: A client library for LoLLMs generate endpoint
-Author-email: ParisNeo <parisneoai@gmail.com>
-License: Apache Software License
-Project-URL: Homepage, https://github.com/ParisNeo/lollms_client
-Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.8
-Classifier: Programming Language :: Python :: 3.9
-Classifier: Programming Language :: Python :: 3.10
-Classifier: Programming Language :: Python :: 3.11
-Classifier: Programming Language :: Python :: 3.12
-Classifier: License :: OSI Approved :: Apache Software License
-Classifier: Operating System :: OS Independent
-Classifier: Intended Audience :: Developers
-Classifier: Intended Audience :: Science/Research
-Requires-Python: >=3.7
-Description-Content-Type: text/markdown
-License-File: LICENSE
-Requires-Dist: requests
-Requires-Dist: ascii-colors
-Requires-Dist: pipmaster
-Requires-Dist: pyyaml
-Requires-Dist: tiktoken
-Requires-Dist: pydantic
-Requires-Dist: numpy
-Requires-Dist: pillow
-Requires-Dist: sqlalchemy
-Dynamic: license-file
-
-# LoLLMs Client Library
-
-[](https://opensource.org/licenses/Apache-2.0)
-[](https://badge.fury.io/py/lollms_client)
-[](https://pypi.org/project/lollms_client/)
-[](https://pepy.tech/project/lollms-client)
-[](DOC_USE.md)
-[](DOC_DEV.md)
-[](https://github.com/ParisNeo/lollms_client/stargazers/)
-[](https://github.com/ParisNeo/lollms_client/issues)
-
-**`lollms_client`** is a powerful and flexible Python library designed to simplify interactions with the **LoLLMs (Lord of Large Language Models)** ecosystem and various other Large Language Model (LLM) backends. It provides a unified API for text generation, multimodal operations (text-to-image, text-to-speech, etc.), and robust function calling through the Model Context Protocol (MCP).
-
-Whether you're connecting to a remote LoLLMs server, an Ollama instance, the OpenAI API, or running models locally using GGUF (via `llama-cpp-python` or a managed `llama.cpp` server), Hugging Face Transformers, or vLLM, `lollms-client` offers a consistent and developer-friendly experience.
-
-## Key Features
-
-* 🔌 **Versatile Binding System:** Seamlessly switch between different LLM backends (LoLLMs, Ollama, OpenAI, Llama.cpp, Transformers, vLLM, OpenLLM) without major code changes.
-* 🗣️ **Multimodal Support:** Interact with models capable of processing images and generate various outputs like speech (TTS) and images (TTI).
-* 🤖 **Function Calling with MCP:** Empowers LLMs to use external tools and functions through the Model Context Protocol (MCP), with built-in support for local Python tool execution via the `local_mcp` binding and its default tools (file I/O, internet search, Python interpreter, image generation).
-* 🚀 **Streaming & Callbacks:** Efficiently handle real-time text generation with customizable callback functions, including during MCP interactions.
-* 💬 **Discussion Management:** Utilities to easily manage and format conversation histories for chat applications.
-* ⚙️ **Configuration Management:** Flexible ways to configure bindings and generation parameters.
-* 🧩 **Extensible:** Designed to easily incorporate new LLM backends and modality services, including custom MCP toolsets.
-* 📝 **High-Level Operations:** Includes convenience methods for complex tasks like sequential summarization and deep text analysis directly within `LollmsClient`.
-
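As a sketch of the binding switch described in the feature list above, using only constructor parameters that appear elsewhere in this README (`binding_name`, `host_address`, `model_name`); anything beyond these is binding-specific and an assumption:

```python
from lollms_client import LollmsClient

# Same client API, different backends: only the constructor arguments change.
lc_lollms = LollmsClient(binding_name="lollms", host_address="http://localhost:9600")
lc_ollama = LollmsClient(binding_name="ollama", model_name="mistral")

# Downstream code is identical whichever binding is active.
print(lc_ollama.generate_text("Hello!", n_predict=32))
```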
-## Installation
-
-You can install `lollms_client` directly from PyPI:
-
-```bash
-pip install lollms-client
-```
-
-This will install the core library. Some bindings may require additional dependencies (e.g., `llama-cpp-python`, `torch`, `transformers`, `ollama`, `vllm`). The library attempts to manage these using `pipmaster`, but for complex dependencies (especially those requiring compilation like `llama-cpp-python` with GPU support), manual installation might be preferred.
-
-## Quick Start
-
-Here's a very basic example of how to use `LollmsClient` to generate text with a LoLLMs server (ensure one is running at `http://localhost:9600`):
-
-```python
-from lollms_client import LollmsClient, MSG_TYPE
-from ascii_colors import ASCIIColors
-
-# Callback for streaming output
-def simple_streaming_callback(chunk: str, msg_type: MSG_TYPE, params=None, metadata=None) -> bool:
-    if msg_type == MSG_TYPE.MSG_TYPE_CHUNK:
-        print(chunk, end="", flush=True)
-    elif msg_type == MSG_TYPE.MSG_TYPE_EXCEPTION:
-        ASCIIColors.error(f"\nStreaming Error: {chunk}")
-    return True  # True to continue streaming
-
-try:
-    # Initialize client to connect to a LoLLMs server.
-    # For other backends, change 'binding_name' and provide necessary parameters.
-    # See DOC_USE.md for detailed initialization examples.
-    lc = LollmsClient(
-        binding_name="lollms",
-        host_address="http://localhost:9600"
-    )
-
-    prompt = "Tell me a fun fact about space."
-    ASCIIColors.yellow(f"Prompt: {prompt}")
-
-    # Generate text with streaming
-    ASCIIColors.green("Streaming Response:")
-    response_text = lc.generate_text(
-        prompt,
-        n_predict=100,
-        stream=True,
-        streaming_callback=simple_streaming_callback
-    )
-    print("\n--- End of Stream ---")
-
-    # The 'response_text' variable will contain the full concatenated text
-    # if streaming_callback returns True throughout.
-    if isinstance(response_text, str):
-        ASCIIColors.cyan(f"\nFull streamed text collected: {response_text[:100]}...")
-    elif isinstance(response_text, dict) and "error" in response_text:
-        ASCIIColors.error(f"Error during generation: {response_text['error']}")
-
-except ValueError as ve:
-    ASCIIColors.error(f"Initialization Error: {ve}")
-    ASCIIColors.info("Ensure a LoLLMs server is running or configure another binding.")
-except ConnectionRefusedError:
-    ASCIIColors.error("Connection refused. Is the LoLLMs server running at http://localhost:9600?")
-except Exception as e:
-    ASCIIColors.error(f"An unexpected error occurred: {e}")
-```
-
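For comparison, a minimal non-streaming sketch. It assumes that leaving out `stream` and `streaming_callback` makes `generate_text` return the full response in one piece (a string on success, an error dictionary otherwise, as in the example above):

```python
from lollms_client import LollmsClient

lc = LollmsClient(binding_name="lollms", host_address="http://localhost:9600")

# No callback: block until generation completes and collect the whole answer.
response = lc.generate_text("Tell me a fun fact about space.", n_predict=100)
if isinstance(response, str):
    print(response)
else:
    print(f"Generation failed: {response}")
```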
-### Function Calling with MCP
-
-`lollms-client` supports robust function calling via the Model Context Protocol (MCP), allowing LLMs to interact with your custom Python tools or pre-defined utilities.
-
-```python
-from lollms_client import LollmsClient, MSG_TYPE
-from ascii_colors import ASCIIColors, trace_exception
-import json  # For pretty printing results
-
-# Example callback for MCP streaming
-def mcp_stream_callback(chunk: str, msg_type: MSG_TYPE, metadata: dict = None, turn_history: list = None) -> bool:
-    metadata = metadata or {}
-    if msg_type == MSG_TYPE.MSG_TYPE_CHUNK:
-        ASCIIColors.success(chunk, end="", flush=True)  # LLM's final answer or thought process
-    elif msg_type == MSG_TYPE.MSG_TYPE_STEP_START:
-        ASCIIColors.info(f"\n>> MCP Step Start: {metadata.get('tool_name', chunk)}", flush=True)
-    elif msg_type == MSG_TYPE.MSG_TYPE_STEP_END:
-        ASCIIColors.success(f"\n<< MCP Step End: {metadata.get('tool_name', chunk)} -> Result: {json.dumps(metadata.get('result', ''))}", flush=True)
-    elif msg_type == MSG_TYPE.MSG_TYPE_INFO and metadata.get("type") == "tool_call_request":
-        ASCIIColors.info(f"\nAI requests: {metadata.get('name')}({metadata.get('params')})", flush=True)
-    return True
-
-try:
-    # Initialize LollmsClient with an LLM binding and the local_mcp binding
-    lc = LollmsClient(
-        binding_name="ollama", model_name="mistral",  # Example LLM
-        mcp_binding_name="local_mcp"  # Enables default tools (file_writer, internet_search, etc.)
-        # or custom tools if mcp_binding_config.tools_folder_path is set.
-    )
-
-    user_query = "What were the main AI headlines last week and write a summary to 'ai_news.txt'?"
-    ASCIIColors.blue(f"User Query: {user_query}")
-    ASCIIColors.yellow("AI Processing with MCP (streaming):")
-
-    mcp_result = lc.generate_with_mcp(
-        prompt=user_query,
-        streaming_callback=mcp_stream_callback
-    )
-    print("\n--- End of MCP Interaction ---")
-
-    if mcp_result.get("error"):
-        ASCIIColors.error(f"MCP Error: {mcp_result['error']}")
-    else:
-        ASCIIColors.cyan(f"\nFinal Answer from AI: {mcp_result.get('final_answer', 'N/A')}")
-        ASCIIColors.magenta("\nTool Calls Made:")
-        for tc in mcp_result.get("tool_calls", []):
-            print(f"  - Tool: {tc.get('name')}, Params: {tc.get('params')}, Result (first 50 chars): {str(tc.get('result'))[:50]}...")
-
-except Exception as e:
-    ASCIIColors.error(f"An error occurred in MCP example: {e}")
-    trace_exception(e)  # trace_exception is provided by ascii_colors
-```
-
-For a comprehensive guide on function calling and setting up tools, please refer to the [Usage Guide (DOC_USE.md)](DOC_USE.md).
-
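As a rough, hypothetical sketch of what a custom tool for the `local_mcp` binding might look like: the entry-point name and signature below are guesses for illustration only, and the authoritative contract (including the `.mcp.json` schema) is in [DOC_USE.md](DOC_USE.md):

```python
# Hypothetical file: my_tools/word_counter.py
# A matching my_tools/word_counter.mcp.json would declare the tool's name,
# description, and parameter schema; its exact format is defined in DOC_USE.md.
# Assumption: the binding loads a function from the .py script and passes the
# declared parameters as keyword arguments.

def execute(text: str) -> dict:
    """Hypothetical entry point: return basic statistics about a text."""
    words = text.split()
    return {
        "word_count": len(words),
        "character_count": len(text),
    }
```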
-## Documentation
-
-For more in-depth information, please refer to:
-
-* **[Usage Guide (DOC_USE.md)](DOC_USE.md):** Learn how to use `LollmsClient`, different bindings, modality features, function calling with MCP, and high-level operations.
-* **[Developer Guide (DOC_DEV.md)](DOC_DEV.md):** Understand the architecture, how to create new bindings (LLM, modality, MCP), and contribute to the library.
-
-## Core Concepts
-
-```mermaid
-graph LR
-    A[Your Application] --> LC[LollmsClient];
-
-    subgraph LollmsClient_Core
-        LC -- Manages --> LLB[LLM Binding];
-        LC -- Manages --> MCPB[MCP Binding];
-        LC -- Orchestrates --> MCP_Interaction[generate_with_mcp];
-        LC -- Provides --> HighLevelOps["High-Level Ops<br>(summarize, deep_analyze etc.)"];
-        LC -- Provides Access To --> DM[DiscussionManager];
-        LC -- Provides Access To --> ModalityBindings["TTS, TTI, STT etc."];
-    end
-
-    subgraph LLM_Backends
-        LLB --> LollmsServer[LoLLMs Server];
-        LLB --> OllamaServer[Ollama];
-        LLB --> OpenAPIServer[OpenAI API];
-        LLB --> LocalGGUF["Local GGUF<br>(pythonllamacpp / llamacpp server)"];
-        LLB --> LocalHF["Local HuggingFace<br>(transformers / vLLM)"];
-    end
-
-    MCP_Interaction --> MCPB;
-    MCPB --> LocalTools["Local Python Tools<br>(via local_mcp)"];
-    MCPB --> RemoteTools["Remote MCP Tool Servers<br>(Future Potential)"];
-
-    ModalityBindings --> ModalityServices["Modality Services<br>(e.g., LoLLMs Server TTS/TTI, local Bark/XTTS)"];
-```
-
-* **`LollmsClient`**: The central class for all interactions. It holds the currently active LLM binding, an optional MCP binding, and provides access to modality bindings and high-level operations.
-* **LLM Bindings**: These are plugins that allow `LollmsClient` to communicate with different LLM backends. You choose a binding (e.g., `"ollama"`, `"lollms"`, `"pythonllamacpp"`) when you initialize `LollmsClient`.
-* **🔧 MCP Bindings**: Enable tool use and function calling. `lollms-client` includes `local_mcp` for executing Python tools. It discovers tools from a specified folder (or uses its default set), each defined by a `.py` script and a `.mcp.json` metadata file.
-* **Modality Bindings**: Similar to LLM bindings, but for services like Text-to-Speech (`tts`), Text-to-Image (`tti`), etc.
-* **High-Level Operations**: Methods directly on `LollmsClient` (e.g., `sequential_summarize`, `deep_analyze`, `generate_code`, `yes_no`) for performing complex, multi-step AI tasks.
-* **`LollmsDiscussion`**: Helps manage and format conversation histories for chat applications.
-
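A hedged sketch of the high-level operations mentioned above (`yes_no` and `sequential_summarize` are method names listed in this README, but the argument shapes below are assumptions; see [DOC_USE.md](DOC_USE.md) for the actual signatures):

```python
from lollms_client import LollmsClient

lc = LollmsClient(binding_name="ollama", model_name="mistral")

# Assumed usage: yes_no takes a question and returns a boolean-like answer.
is_planet = lc.yes_no("Is Mars a planet?")
print(f"Is Mars a planet? {is_planet}")

# Assumed usage: sequential_summarize takes the full text to condense.
# "long_report.txt" is a placeholder input file for this sketch.
with open("long_report.txt", "r", encoding="utf-8") as f:
    summary = lc.sequential_summarize(f.read())
print(summary)
```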
-## Examples
-
-The `examples/` directory in this repository contains a rich set of scripts demonstrating various features:
-* Basic text generation with different bindings.
-* Streaming and non-streaming examples.
-* Multimodal generation (text with images).
-* Using built-in methods for summarization and Q&A.
-* Implementing and using function calls with **`generate_with_mcp`** and the `local_mcp` binding (see `examples/function_calling_with_local_custom_mcp.py` and `examples/local_mcp.py`).
-* Text-to-Speech and Text-to-Image generation.
-
-Explore these examples to see `lollms-client` in action!
-
-## Contributing
-
-Contributions are welcome! Whether it's bug reports, feature suggestions, documentation improvements, or new bindings, please feel free to open an issue or submit a pull request on our [GitHub repository](https://github.com/ParisNeo/lollms_client).
-
-## License
-
-This project is licensed under the **Apache 2.0 License**. See the [LICENSE](LICENSE) file for details.
-
-## Changelog
-
-For a list of changes and updates, please refer to the [CHANGELOG.md](CHANGELOG.md) file.