lollms-client 0.15.2__py3-none-any.whl → 0.17.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release.



Files changed (39)
  1. examples/generate_and_speak/generate_and_speak.py +251 -0
  2. examples/generate_game_sfx/generate_game_fx.py +240 -0
  3. examples/simple_text_gen_with_image_test.py +8 -8
  4. examples/text_2_image.py +0 -1
  5. examples/text_gen.py +1 -1
  6. lollms_client/__init__.py +1 -1
  7. lollms_client/llm_bindings/llamacpp/__init__.py +61 -11
  8. lollms_client/llm_bindings/lollms/__init__.py +31 -24
  9. lollms_client/llm_bindings/ollama/__init__.py +47 -27
  10. lollms_client/llm_bindings/openai/__init__.py +62 -35
  11. lollms_client/llm_bindings/openllm/__init__.py +4 -1
  12. lollms_client/llm_bindings/pythonllamacpp/__init__.py +3 -0
  13. lollms_client/llm_bindings/tensor_rt/__init__.py +4 -1
  14. lollms_client/llm_bindings/transformers/__init__.py +3 -0
  15. lollms_client/llm_bindings/vllm/__init__.py +4 -1
  16. lollms_client/lollms_core.py +65 -33
  17. lollms_client/lollms_llm_binding.py +76 -22
  18. lollms_client/lollms_stt_binding.py +3 -15
  19. lollms_client/lollms_tti_binding.py +5 -29
  20. lollms_client/lollms_ttm_binding.py +5 -28
  21. lollms_client/lollms_tts_binding.py +4 -28
  22. lollms_client/lollms_ttv_binding.py +4 -28
  23. lollms_client/lollms_utilities.py +5 -3
  24. lollms_client/stt_bindings/lollms/__init__.py +5 -4
  25. lollms_client/stt_bindings/whisper/__init__.py +304 -0
  26. lollms_client/stt_bindings/whispercpp/__init__.py +380 -0
  27. lollms_client/tti_bindings/lollms/__init__.py +4 -6
  28. lollms_client/ttm_bindings/audiocraft/__init__.py +281 -0
  29. lollms_client/ttm_bindings/bark/__init__.py +339 -0
  30. lollms_client/tts_bindings/bark/__init__.py +336 -0
  31. lollms_client/tts_bindings/piper_tts/__init__.py +343 -0
  32. lollms_client/tts_bindings/xtts/__init__.py +317 -0
  33. lollms_client-0.17.0.dist-info/METADATA +183 -0
  34. lollms_client-0.17.0.dist-info/RECORD +65 -0
  35. lollms_client-0.15.2.dist-info/METADATA +0 -192
  36. lollms_client-0.15.2.dist-info/RECORD +0 -56
  37. {lollms_client-0.15.2.dist-info → lollms_client-0.17.0.dist-info}/WHEEL +0 -0
  38. {lollms_client-0.15.2.dist-info → lollms_client-0.17.0.dist-info}/licenses/LICENSE +0 -0
  39. {lollms_client-0.15.2.dist-info → lollms_client-0.17.0.dist-info}/top_level.txt +0 -0
@@ -0,0 +1,183 @@
+ Metadata-Version: 2.4
+ Name: lollms_client
+ Version: 0.17.0
+ Summary: A client library for LoLLMs generate endpoint
+ Author-email: ParisNeo <parisneoai@gmail.com>
+ License: Apache Software License
+ Project-URL: Homepage, https://github.com/ParisNeo/lollms_client
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.8
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: License :: OSI Approved :: Apache Software License
+ Classifier: Operating System :: OS Independent
+ Classifier: Intended Audience :: Developers
+ Classifier: Intended Audience :: Science/Research
+ Requires-Python: >=3.7
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: requests
+ Requires-Dist: ascii-colors
+ Requires-Dist: pipmaster
+ Requires-Dist: pyyaml
+ Requires-Dist: tiktoken
+ Requires-Dist: pydantic
+ Requires-Dist: numpy
+ Requires-Dist: pillow
+ Dynamic: license-file
+
+ # LoLLMs Client Library
+
+ [![License](https://img.shields.io/badge/License-Apache_2.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
+ [![PyPI version](https://badge.fury.io/py/lollms_client.svg)](https://badge.fury.io/py/lollms_client)
+ [![Python Versions](https://img.shields.io/pypi/pyversions/lollms_client.svg)](https://pypi.org/project/lollms_client/)
+ [![Downloads](https://static.pepy.tech/personalized-badge/lollms-client?period=total&units=international_system&left_color=grey&right_color=green&left_text=Downloads)](https://pepy.tech/project/lollms-client)
+ [![Documentation - Usage](https://img.shields.io/badge/docs-Usage%20Guide-brightgreen)](DOC_USE.md)
+ [![Documentation - Developer](https://img.shields.io/badge/docs-Developer%20Guide-blue)](DOC_DEV.md)
+ [![GitHub stars](https://img.shields.io/github/stars/ParisNeo/lollms_client.svg?style=social&label=Star&maxAge=2592000)](https://github.com/ParisNeo/lollms_client/stargazers/)
+ [![GitHub issues](https://img.shields.io/github/issues/ParisNeo/lollms_client.svg)](https://github.com/ParisNeo/lollms_client/issues)
+
+ **`lollms_client`** is a powerful and flexible Python library designed to simplify interactions with the **LoLLMs (Lord of Large Language Models)** ecosystem and various other Large Language Model (LLM) backends. It provides a unified API for text generation, multimodal operations (text-to-image, text-to-speech, etc.), function calling, and advanced AI-driven tasks.
+
+ Whether you're connecting to a remote LoLLMs server, an Ollama instance, the OpenAI API, or running models locally using GGUF (via `llama-cpp-python` or a managed `llama.cpp` server), Hugging Face Transformers, or vLLM, `lollms-client` offers a consistent and developer-friendly experience.
+
+ ## Key Features
+
+ * 🔌 **Versatile Binding System:** Seamlessly switch between different LLM backends (LoLLMs, Ollama, OpenAI, Llama.cpp, Transformers, vLLM, OpenLLM) without major code changes.
+ * 🗣️ **Multimodal Support:** Interact with models capable of processing images and generate various outputs like speech (TTS) and images (TTI).
+ * 🚀 **Streaming & Callbacks:** Efficiently handle real-time text generation with customizable callback functions.
+ * 🛠️ **Task-Oriented Library:** High-level `TasksLibrary` for common operations like summarization, Q&A, code generation, and structured data extraction.
+ * 📞 **Function Calling:** Enable LLMs to invoke your custom Python functions, bridging the gap between language models and external tools or data sources.
+ * 💬 **Discussion Management:** Utilities to easily manage and format conversation histories for chat applications.
+ * ⚙️ **Configuration Management:** Flexible ways to configure bindings and generation parameters.
+ * 🧩 **Extensible:** Designed to easily incorporate new LLM backends and modality services.
+
+ ## Installation
+
+ You can install `lollms_client` directly from PyPI:
+
+ ```bash
+ pip install lollms-client
+ ```
+
+ This will install the core library. Some bindings may require additional dependencies (e.g., `llama-cpp-python`, `torch`, `transformers`, `ollama`, `vllm`). The library attempts to manage these using `pipmaster`, but for complex dependencies (especially those requiring compilation like `llama-cpp-python` with GPU support), manual installation might be preferred.
+
+ ## Quick Start
+
+ Here's a very basic example of how to use `LollmsClient` to generate text with a LoLLMs server (ensure one is running at `http://localhost:9600`):
+
+ ```python
+ from lollms_client import LollmsClient, MSG_TYPE
+ from ascii_colors import ASCIIColors
+
+ # Callback for streaming output
+ def simple_streaming_callback(chunk: str, msg_type: MSG_TYPE, params=None, metadata=None) -> bool:
+     if msg_type == MSG_TYPE.MSG_TYPE_CHUNK:
+         print(chunk, end="", flush=True)
+     elif msg_type == MSG_TYPE.MSG_TYPE_EXCEPTION:
+         ASCIIColors.error(f"\nStreaming Error: {chunk}")
+     return True  # True to continue streaming
+
+ try:
+     # Initialize client to connect to a LoLLMs server.
+     # For other backends, change 'binding_name' and provide necessary parameters.
+     # See DOC_USE.md for detailed initialization examples.
+     lc = LollmsClient(
+         binding_name="lollms",
+         host_address="http://localhost:9600"
+     )
+
+     prompt = "Tell me a fun fact about space."
+     ASCIIColors.yellow(f"Prompt: {prompt}")
+
+     # Generate text with streaming
+     ASCIIColors.green("Streaming Response:")
+     response_text = lc.generate_text(
+         prompt,
+         n_predict=100,
+         stream=True,
+         streaming_callback=simple_streaming_callback
+     )
+     print("\n--- End of Stream ---")
+
+     # The 'response_text' variable will contain the full concatenated text
+     # if streaming_callback returns True throughout.
+     if isinstance(response_text, str):
+         ASCIIColors.cyan(f"\nFull streamed text collected: {response_text[:100]}...")
+     elif isinstance(response_text, dict) and "error" in response_text:
+         ASCIIColors.error(f"Error during generation: {response_text['error']}")
+
+ except ValueError as ve:
+     ASCIIColors.error(f"Initialization Error: {ve}")
+     ASCIIColors.info("Ensure a LoLLMs server is running or configure another binding.")
+ except ConnectionRefusedError:
+     ASCIIColors.error("Connection refused. Is the LoLLMs server running at http://localhost:9600?")
+ except Exception as e:
+     ASCIIColors.error(f"An unexpected error occurred: {e}")
+ ```
+
+ ## Documentation
+
+ For more in-depth information, please refer to:
+
+ * **[Usage Guide (DOC_USE.md)](DOC_USE.md):** Learn how to use `LollmsClient`, different bindings, modality features, `TasksLibrary`, and `FunctionCalling_Library` with comprehensive examples.
+ * **[Developer Guide (DOC_DEV.md)](DOC_DEV.md):** Understand the architecture, how to create new bindings, and contribute to the library.
+
+ ## Core Concepts
+
+ ```mermaid
+ graph LR
+     A[Your Application] --> LC[LollmsClient];
+
+     subgraph LollmsClient_Core
+         LC -- Manages --> LLB[LLM Binding];
+         LC -- Provides Access To --> TL[TasksLibrary];
+         LC -- Provides Access To --> FCL[FunctionCalling_Library];
+         LC -- Provides Access To --> DM[DiscussionManager];
+         LC -- Provides Access To --> ModalityBindings[TTS, TTI, STT etc.];
+     end
+
+     subgraph LLM_Backends
+         LLB --> LollmsServer[LoLLMs Server];
+         LLB --> OllamaServer[Ollama];
+         LLB --> OpenAPIServer[OpenAI API];
+         LLB --> LocalGGUF["Local GGUF<br>(pythonllamacpp / llamacpp server)"];
+         LLB --> LocalHF["Local HuggingFace<br>(transformers / vLLM)"];
+     end
+
+     ModalityBindings --> ModalityServices["Modality Services<br>(e.g., LoLLMs Server TTS/TTI)"];
+ ```
+
+ * **`LollmsClient`**: The central class for all interactions. It holds the currently active LLM binding and provides access to modality bindings and helper libraries.
+ * **LLM Bindings**: These are plugins that allow `LollmsClient` to communicate with different LLM backends. You choose a binding (e.g., `"ollama"`, `"lollms"`, `"pythonllamacpp"`) when you initialize `LollmsClient`.
+ * **Modality Bindings**: Similar to LLM bindings, but for services like Text-to-Speech (`tts`), Text-to-Image (`tti`), etc.
+ * **`TasksLibrary`**: Offers high-level functions for common AI tasks (summarization, Q&A) built on top of `LollmsClient`.
+ * **`FunctionCalling_Library`**: Enables you to define Python functions that the LLM can request to execute, allowing for tool usage.
+ * **`LollmsDiscussion`**: Helps manage and format conversation histories for chat applications.
+
+ ## Examples
+
+ The `examples/` directory in this repository contains a rich set of scripts demonstrating various features:
+ * Basic text generation with different bindings.
+ * Streaming and non-streaming examples.
+ * Multimodal generation (text with images).
+ * Using `TasksLibrary` for summarization and Q&A.
+ * Implementing and using function calls.
+ * Text-to-Speech and Text-to-Image generation.
+
+ Explore these examples to see `lollms-client` in action!
+
+ ## Contributing
+
+ Contributions are welcome! Whether it's bug reports, feature suggestions, documentation improvements, or new bindings, please feel free to open an issue or submit a pull request on our [GitHub repository](https://github.com/ParisNeo/lollms_client).
+
+ ## License
+
+ This project is licensed under the **Apache 2.0 License**. See the [LICENSE](LICENSE) file for details.
+
+ ## Changelog
+
+ For a list of changes and updates, please refer to the [CHANGELOG.md](CHANGELOG.md) file.
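[Editor's note: a minimal, standard-library-only sketch of how one might probe for those optional backends before choosing a binding. The mapping of PyPI names to import names below is an assumption for illustration; it is not part of the `lollms_client` API.]

```python
from importlib.util import find_spec

# Optional backend packages and the module name each is imported as
# (hypothetical selection, for illustration only).
optional_backends = {
    "llamacpp": "llama_cpp",         # provided by llama-cpp-python
    "transformers": "transformers",  # Hugging Face Transformers
    "vllm": "vllm",
}

# find_spec returns None when a top-level package is not installed.
available = {name: find_spec(module) is not None
             for name, module in optional_backends.items()}
print(available)
```

Running this in a fresh environment would typically report `False` for all three until the corresponding extras are installed manually.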
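[Editor's note: the callback contract above can be exercised without a running server. The sketch below stubs `MSG_TYPE` with a minimal enum and feeds hand-written chunks; nothing here is real lollms output.]

```python
from enum import Enum, auto

# Minimal stand-in for lollms_client.MSG_TYPE, only for this demo.
class MSG_TYPE(Enum):
    MSG_TYPE_CHUNK = auto()
    MSG_TYPE_EXCEPTION = auto()

collected = []

def demo_callback(chunk: str, msg_type: MSG_TYPE) -> bool:
    # Accumulate text chunks; returning False would stop the stream.
    if msg_type == MSG_TYPE.MSG_TYPE_CHUNK:
        collected.append(chunk)
    return True

# Simulated chunks, delivered the way a streaming server would.
for chunk in ["Space ", "is ", "big."]:
    if not demo_callback(chunk, MSG_TYPE.MSG_TYPE_CHUNK):
        break

full_text = "".join(collected)
print(full_text)  # Space is big.
```

The point of the contract is that the callback sees every chunk as it arrives while the caller still receives the fully concatenated text at the end.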
@@ -0,0 +1,65 @@
+ examples/simple_text_gen_test.py,sha256=RoX9ZKJjGMujeep60wh5WT_GoBn0O9YKJY6WOy-ZmOc,8710
+ examples/simple_text_gen_with_image_test.py,sha256=rR1O5Prcb52UHtJ3c6bv7VuTd1cvbkr5aNZU-v-Rs3Y,9263
+ examples/text_2_audio.py,sha256=MfL4AH_NNwl6m0I0ywl4BXRZJ0b9Y_9fRqDIe6O-Sbw,3523
+ examples/text_2_image.py,sha256=naBL_wXWbUxDzy0F4hOj6me4tHaR8Aib9XBsj7aiyuc,6372
+ examples/text_and_image_2_audio.py,sha256=QLvSsLff8VZZa7k7K1EFGlPpQWZy07zM4Fnli5btAl0,2074
+ examples/text_gen.py,sha256=pqQz0y_jZZCdxE5u_8d21EYPciX-UZ35zrlDxLGDP5E,1021
+ examples/text_gen_system_prompt.py,sha256=jRQeGe1IVu_zRHX09CFiDYi7WrK9Zd5FlMqC_gnVH-g,1018
+ examples/article_summary/article_summary.py,sha256=CR8mCBNcZEVCR-q34uOmrJyMlG-xk4HkMbsV-TOZEnk,1978
+ examples/deep_analyze/deep_analyse.py,sha256=fZNmDrfEAuxEAfdbjAgJYIh1k6wbiuZ4RvwHRvtyUs8,971
+ examples/deep_analyze/deep_analyze_multiple_files.py,sha256=fOryShA33P4IFxcxUDe-nJ2kW0v9w9yW8KsToS3ETl8,1032
+ examples/function_call/functions_call_with images.py,sha256=jrNtTF7lAzad25Ob0Yv4pwLs12HSzDamKKR9ORkNWjc,1888
+ examples/generate_and_speak/generate_and_speak.py,sha256=RAlvRwtEKXCh894l9M3iQbADe8CvF5N442jtRurK02I,13908
+ examples/generate_game_sfx/generate_game_fx.py,sha256=MgLNGi4hGBRoyr4bqYuCUdCSqd-ldDVfF0VSDUjgzsg,10467
+ examples/personality_test/chat_test.py,sha256=o2jlpoddFc-T592iqAiA29xk3x27KsdK5DluqxBwHqw,1417
+ examples/personality_test/chat_with_aristotle.py,sha256=4X_fwubMpd0Eq2rCReS2bgVlUoAqJprjkLXk2Jz6pXU,1774
+ examples/personality_test/tesks_test.py,sha256=7LIiwrEbva9WWZOLi34fsmCBN__RZbPpxoUOKA_AtYk,1924
+ examples/test_local_models/local_chat.py,sha256=slakja2zaHOEAUsn2tn_VmI4kLx6luLBrPqAeaNsix8,456
+ lollms_client/__init__.py,sha256=gnRUMaensmKS0xLWcPnEhZzlwu6zbXt9M2NIlW35YXM,823
+ lollms_client/lollms_config.py,sha256=goEseDwDxYJf3WkYJ4IrLXwg3Tfw73CXV2Avg45M_hE,21876
+ lollms_client/lollms_core.py,sha256=6Uvm0qI1NZAzVu9kCoa-gCmBIe5FxRqAywfFEFqQliY,79894
+ lollms_client/lollms_discussion.py,sha256=9b83m0D894jwpgssWYTQHbVxp1gJoI-J947Ui_dRXII,2073
+ lollms_client/lollms_functions.py,sha256=p8SFtmEPqvVCsIz2fZ5HxyOHaxjrAo5c12uTzJnb6m8,3594
+ lollms_client/lollms_js_analyzer.py,sha256=01zUvuO2F_lnUe_0NLxe1MF5aHE1hO8RZi48mNPv-aw,8361
+ lollms_client/lollms_llm_binding.py,sha256=Ekh_UzqOh7KyF4asjj83a_I40d7Iki7OjKNKtKF5tEs,9714
+ lollms_client/lollms_python_analyzer.py,sha256=7gf1fdYgXCOkPUkBAPNmr6S-66hMH4_KonOMsADASxc,10246
+ lollms_client/lollms_stt_binding.py,sha256=jAUhLouEhh2hmm1bK76ianfw_6B59EHfY3FmLv6DU-g,5111
+ lollms_client/lollms_tasks.py,sha256=Tgqces03gPTHFJCcPaeN9vBCsil3SSJX7nQAjCQ2-yg,34393
+ lollms_client/lollms_tti_binding.py,sha256=afO0-d-Kqsmh8UHTijTvy6dZAt-XDB6R-IHmdbf-_fs,5928
+ lollms_client/lollms_ttm_binding.py,sha256=FjVVSNXOZXK1qvcKEfxdiX6l2b4XdGOSNnZ0utAsbDg,4167
+ lollms_client/lollms_tts_binding.py,sha256=5cJYECj8PYLJAyB6SEH7_fhHYK3Om-Y3arkygCnZ24o,4342
+ lollms_client/lollms_ttv_binding.py,sha256=KkTaHLBhEEdt4sSVBlbwr5i_g_TlhcrwrT-7DjOsjWQ,4131
+ lollms_client/lollms_types.py,sha256=cfc1sremM8KR4avkYX99fIVkkdRvXErrCWKGjLrgv50,2723
+ lollms_client/lollms_utilities.py,sha256=WiG-HHMdo86j3LBndcBQ-PbMqQ8kGKLp1e9WuLDzRVU,7048
+ lollms_client/llm_bindings/__init__.py,sha256=9sWGpmWSSj6KQ8H4lKGCjpLYwhnVdL_2N7gXCphPqh4,14
+ lollms_client/llm_bindings/llamacpp/__init__.py,sha256=rMfxOiLVsVwg_VOPSxMGcO401V6JFfpWXTvbpPZqI14,58424
+ lollms_client/llm_bindings/lollms/__init__.py,sha256=poGr9H3UshRUqmiAiiRW8_1Q8rBj3Q-mhBacdnp7C7Y,13157
+ lollms_client/llm_bindings/ollama/__init__.py,sha256=8Kn8OI0PcT8QWVv5w7NDcsP99AvQ9TKMj-1VL3DZLfU,27770
+ lollms_client/llm_bindings/openai/__init__.py,sha256=IGvsWbI0uw6x04IQ7u4GAM1AaVFSendLjkbKvjQ6-AM,13993
+ lollms_client/llm_bindings/openllm/__init__.py,sha256=xv2XDhJNCYe6NPnWBboDs24AQ1VJBOzsTuMcmuQ6xYY,29864
+ lollms_client/llm_bindings/pythonllamacpp/__init__.py,sha256=7dM42TCGKh0eV0njNL1tc9cInhyvBRIXzN3dcy12Gl0,33551
+ lollms_client/llm_bindings/tensor_rt/__init__.py,sha256=nPaNhGRd-bsG0UlYwcEqjd_UagCMEf5VEbBUW-GWu6A,32203
+ lollms_client/llm_bindings/transformers/__init__.py,sha256=9LkqEC5bp1zHgyeGEcPQ3_uqvEAEf_B4p9DztcBaC5w,37211
+ lollms_client/llm_bindings/vllm/__init__.py,sha256=2NqeeqYWXNq1aNicdcAwN9DaoL4gq96GZ7hsKErfC6c,32187
+ lollms_client/stt_bindings/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ lollms_client/stt_bindings/lollms/__init__.py,sha256=jBz3285atdPRqQe9ZRrb-AvjqKRB4f8tjLXjma0DLfE,6082
+ lollms_client/stt_bindings/whisper/__init__.py,sha256=1NYczN4xm1gXCLmgymTF_zIIEbY5_s-lt5bFxIZ1uHw,15447
+ lollms_client/stt_bindings/whispercpp/__init__.py,sha256=B45lOn5rSoHXJSG9duPzBEPBOoNTrDzpCUdOL8KHaDM,21874
+ lollms_client/tti_bindings/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ lollms_client/tti_bindings/lollms/__init__.py,sha256=GJShFW6Y8MrfM9PXaPrdAp8OUpD6rraSRFt8ZOnrauo,8735
+ lollms_client/ttm_bindings/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ lollms_client/ttm_bindings/audiocraft/__init__.py,sha256=a0k6wTrHth6GaVOiNnVboeFY3oKVvCQPbQlqO38XEyc,14328
+ lollms_client/ttm_bindings/bark/__init__.py,sha256=Pr3ou2a-7hNYDqbkxrAbghZpO5HvGUhz7e-7VGXIHHA,18976
+ lollms_client/ttm_bindings/lollms/__init__.py,sha256=DU3WLmJaWNM1NAMtJsnaFo4Y9wlfc675M8aUiaLnojA,3143
+ lollms_client/tts_bindings/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ lollms_client/tts_bindings/bark/__init__.py,sha256=cpnmr6rmXcNdy4ib_5UHAbUP5oGoMJwB931_vU6VI14,19480
+ lollms_client/tts_bindings/lollms/__init__.py,sha256=8x2_T9XscvISw2TiaLoFxvrS7TIsVLdqbwSc04cX-wc,7164
+ lollms_client/tts_bindings/piper_tts/__init__.py,sha256=0IEWG4zH3_sOkSb9WbZzkeV5Lvhgp5Gs2-2GN51MTjA,18930
+ lollms_client/tts_bindings/xtts/__init__.py,sha256=FgcdUH06X6ZR806WQe5ixaYx0QoxtAcOgYo87a2qxYc,18266
+ lollms_client/ttv_bindings/__init__.py,sha256=UZ8o2izQOJLQgtZ1D1cXoNST7rzqW22rL2Vufc7ddRc,3141
+ lollms_client/ttv_bindings/lollms/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ lollms_client-0.17.0.dist-info/licenses/LICENSE,sha256=HrhfyXIkWY2tGFK11kg7vPCqhgh5DcxleloqdhrpyMY,11558
+ lollms_client-0.17.0.dist-info/METADATA,sha256=8Yr67tAA-uvLMsS5XABaicDYGSyl_jqRofN8NVWwsT8,9798
+ lollms_client-0.17.0.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+ lollms_client-0.17.0.dist-info/top_level.txt,sha256=NI_W8S4OYZvJjb0QWMZMSIpOrYzpqwPGYaklhyWKH2w,23
+ lollms_client-0.17.0.dist-info/RECORD,,
@@ -1,192 +0,0 @@
1
- Metadata-Version: 2.4
2
- Name: lollms_client
3
- Version: 0.15.2
4
- Summary: A client library for LoLLMs generate endpoint
5
- Author-email: ParisNeo <parisneoai@gmail.com>
6
- License: Apache Software License
7
- Project-URL: Homepage, https://github.com/ParisNeo/lollms_client
8
- Classifier: Programming Language :: Python :: 3
9
- Classifier: Programming Language :: Python :: 3.8
10
- Classifier: Programming Language :: Python :: 3.9
11
- Classifier: Programming Language :: Python :: 3.10
12
- Classifier: Programming Language :: Python :: 3.11
13
- Classifier: Programming Language :: Python :: 3.12
14
- Classifier: License :: OSI Approved :: Apache Software License
15
- Classifier: Operating System :: OS Independent
16
- Classifier: Intended Audience :: Developers
17
- Classifier: Intended Audience :: Science/Research
18
- Requires-Python: >=3.7
19
- Description-Content-Type: text/markdown
20
- License-File: LICENSE
21
- Requires-Dist: requests
22
- Requires-Dist: ascii-colors
23
- Requires-Dist: pipmaster
24
- Requires-Dist: pyyaml
25
- Requires-Dist: tiktoken
26
- Requires-Dist: pydantic
27
- Requires-Dist: numpy
28
- Requires-Dist: pillow
29
- Dynamic: license-file
30
-
31
- # lollms_client
32
-
33
- [![Python Version](https://img.shields.io/pypi/pyversions/lollms-client)](https://pypi.org/project/lollms-client/) [![PyPI Downloads](https://img.shields.io/pypi/dw/lollms-client)](https://pypi.org/project/lollms-client/) [![Apache License](https://img.shields.io/apache/2.0)](https://www.apache.org/licenses/LICENSE-2.0)
34
-
35
- Welcome to the lollms_client repository! This library is built by [ParisNeo](https://github.com/ParisNeo) and provides a convenient way to interact with the lollms (Lord Of Large Language Models) API. It is available on [PyPI](https://pypi.org/project/lollms-client/) and distributed under the Apache 2.0 License.
36
-
37
- ## Installation
38
-
39
- To install the library from PyPI using `pip`, run:
40
-
41
- ```
42
- pip install lollms-client
43
- ```
44
-
45
- ## Getting Started
46
-
47
- The LollmsClient class is the gateway to interacting with the lollms API. Here's how you can instantiate it in various ways to suit your needs:
48
-
49
- ```python
50
- from lollms_client import LollmsClient
51
-
52
- # Default instantiation using the local lollms service - hosted at http://localhost:9600
53
- lc = LollmsClient()
54
-
55
- # Specify a custom host and port
56
- lc = LollmsClient(host_address="http://some.lollms.server:9600")
57
-
58
- # Use a specific model with a local ollama server
59
- lc = LollmsClient("ollama", model_name="phi4:latest")
60
- # Use a specific model with an Ollama binding on the server, with a context size of 32800
61
- lc = LollmsClient(
62
- "ollama",
63
- host_address="http://some.other.server:11434",
64
- model_name="phi4:latest",
65
- ctx_size=32800,
66
- )
67
- # Use a specific model with a local or remote OpenAI server (you can either set your key as an environment variable or pass it here)
68
- lc = LollmsClient("openai", model_name="gpt-3.5-turbo-0125", service_key="Key, or don't put anything if you have already an environment variable with these informations")
69
-
70
- # Use a specific model with a other OpenAI compatible server
71
- lc = LollmsClient("openai", host_address="http://some.other.server", model_name="gpt-3.5-turbo-0125")
72
- ```
73
-
74
- ### Text Generation
75
-
76
- Use `generate_text()` for generating text from the lollms API.
77
-
78
- ```python
79
- response = lc.generate_text(prompt="Once upon a time", stream=False, temperature=0.5)
80
- print(response)
81
- ```
82
- ```python
83
- response = lc.generate_text(prompt="Once upon a time", images= ["path to image1", "path to image 2"] stream=False, temperature=0.5)
84
- print(response)
85
- ```
86
-
87
- ### Code Generation
88
-
89
- The `generate_code()` function allows you to generate code snippets based on your input. Here's how you can use it:
90
-
91
- ```python
92
- # A generic case to generate a snippet in python
93
- response = lc.generate_code(prompt="Create a function to add all numbers of a list", language='python')
94
- print(response)
95
-
96
- # generate_code can also be used to generate responses ready to be parsed - with json or yaml for instance
97
- response = lc.generate_code(prompt="Mr Alex Brown presents himself to the pharmacist. He is 20 years old and seeks an appointment for the 12th of October. Fill out his application.", language='json', template="""
98
- {
99
- "name":"the first name of the person"
100
- "family_name":"the family name of the person"
101
- "age":"the age of the person"
102
- "appointment_date":"the date of the appointment"
103
- "reason":"the reason for the appointment. if not specified fill out with 'N/A'"
104
- }
105
- """)
106
- data = json.loads(response)
107
- print(data['name'], data['family_name'], "- Reason:", data['reason'])
108
- ```
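[Editor's note: assuming the model honours the template, the returned string is plain JSON and can be parsed directly. The `response` value below is a hand-written stand-in for model output, not something `generate_code` actually produced.]

```python
import json

# Hypothetical model output matching the template above.
response = """
{
    "name": "Alex",
    "family_name": "Brown",
    "age": "20",
    "appointment_date": "12/10/2025",
    "reason": "N/A"
}
"""

# The whole point of templated generate_code calls: the reply is machine-parseable.
data = json.loads(response)
print(data["name"], data["family_name"], "- Reason:", data["reason"])
```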
109
-
110
-
111
- ### List Mounted Personalities (only on lollms)
112
-
113
- List mounted personalities of the lollms API with the `listMountedPersonalities()` method.
114
-
115
- ```python
116
- response = lc.listMountedPersonalities()
117
- print(response)
118
- ```
119
-
120
- ### List Models
121
-
122
- List available models of the lollms API with the `listModels()` method.
123
-
124
- ```python
125
- response = lc.listModels()
126
- print(response)
127
- ```
128
-
129
- ## Complete Example
130
-
131
- ```python
132
- import json
133
- from datetime import datetime
134
-
135
- # Assuming LollmsClient is already imported and instantiated as lc
136
- lc = LollmsClient()
137
-
138
- # Generate code using the LollmsClient
139
- response = lc.generate_code(
140
- prompt="Mr Alex Brown presents himself to the pharmacist. He is 20 years old and seeks an appointment for the 12th of October. Fill out his application.",
141
- language='json',
142
- template="""
143
- {
144
- "name": "the first name of the person",
145
- "family_name": "the family name of the person",
146
- "age": "the age of the person",
147
- "appointment_date": "the date of the appointment in the format DD/MM/YYYY",
148
- "reason": "the reason for the appointment. if not specified fill out with 'N/A'"
149
- }
150
- """
151
- )
152
-
153
- # Parse the JSON response
154
- data = json.loads(response)
155
-
156
- # Function to validate the data
157
- def validate_data(data):
158
- try:
159
- # Validate age
160
- if not (0 < int(data['age']) < 120):
161
- raise ValueError("Invalid age provided.")
162
-
163
- # Validate appointment date
164
- appointment_date = datetime.strptime(data['appointment_date'], '%d/%m/%Y')
165
- if appointment_date < datetime.now():
166
- raise ValueError("Appointment date cannot be in the past.")
167
-
168
- # Validate name fields
169
- if not data['name'] or not data['family_name']:
170
- raise ValueError("Name fields cannot be empty.")
171
-
172
- return True
173
- except Exception as e:
174
- print(f"Validation Error: {e}")
175
- return False
176
-
177
- # Function to simulate a response to the user
178
- def simulate_response(data):
179
- if validate_data(data):
180
- print(f"Appointment confirmed for {data['name']} {data['family_name']}.")
181
- print(f"Date: {data['appointment_date']}")
182
- print(f"Reason: {data['reason']}")
183
- else:
184
- print("Failed to confirm appointment due to invalid data.")
185
-
186
- # Execute the simulation
187
- simulate_response(data)
188
- ```
189
-
190
- Feel free to contribute to the project by submitting issues or pull requests. Follow [ParisNeo](https://github.com/ParisNeo) on [GitHub](https://github.com/ParisNeo), [Twitter](https://twitter.com/ParisNeo_AI), [Discord](https://discord.gg/BDxacQmv), [Sub-Reddit](r/lollms), and [Instagram](https://www.instagram.com/spacenerduino/) for updates and news.
191
-
192
- Happy coding!
@@ -1,56 +0,0 @@
1
- examples/simple_text_gen_test.py,sha256=RoX9ZKJjGMujeep60wh5WT_GoBn0O9YKJY6WOy-ZmOc,8710
2
- examples/simple_text_gen_with_image_test.py,sha256=V5dc6iLScpsVGDAd1xxbWMsdWqEZHbupyc6VaxH9S6o,9263
3
- examples/text_2_audio.py,sha256=MfL4AH_NNwl6m0I0ywl4BXRZJ0b9Y_9fRqDIe6O-Sbw,3523
4
- examples/text_2_image.py,sha256=Ri7lQ-GW54YWQh2eofcaN6LpwFoorbpJsJffrcXl3cg,6415
5
- examples/text_and_image_2_audio.py,sha256=QLvSsLff8VZZa7k7K1EFGlPpQWZy07zM4Fnli5btAl0,2074
6
- examples/text_gen.py,sha256=O3wuvsbEJMRSjIWBV828BXzIYtED-VgR85tXCqBBtZY,930
7
- examples/text_gen_system_prompt.py,sha256=jRQeGe1IVu_zRHX09CFiDYi7WrK9Zd5FlMqC_gnVH-g,1018
8
- examples/article_summary/article_summary.py,sha256=CR8mCBNcZEVCR-q34uOmrJyMlG-xk4HkMbsV-TOZEnk,1978
9
- examples/deep_analyze/deep_analyse.py,sha256=fZNmDrfEAuxEAfdbjAgJYIh1k6wbiuZ4RvwHRvtyUs8,971
10
- examples/deep_analyze/deep_analyze_multiple_files.py,sha256=fOryShA33P4IFxcxUDe-nJ2kW0v9w9yW8KsToS3ETl8,1032
11
- examples/function_call/functions_call_with images.py,sha256=jrNtTF7lAzad25Ob0Yv4pwLs12HSzDamKKR9ORkNWjc,1888
12
- examples/personality_test/chat_test.py,sha256=o2jlpoddFc-T592iqAiA29xk3x27KsdK5DluqxBwHqw,1417
13
- examples/personality_test/chat_with_aristotle.py,sha256=4X_fwubMpd0Eq2rCReS2bgVlUoAqJprjkLXk2Jz6pXU,1774
14
- examples/personality_test/tesks_test.py,sha256=7LIiwrEbva9WWZOLi34fsmCBN__RZbPpxoUOKA_AtYk,1924
15
- examples/test_local_models/local_chat.py,sha256=slakja2zaHOEAUsn2tn_VmI4kLx6luLBrPqAeaNsix8,456
16
- lollms_client/__init__.py,sha256=MH3-9CNgMJ7G_XfeOTnt3eDxU-tLqngSmx4HryurM3c,823
17
- lollms_client/lollms_config.py,sha256=goEseDwDxYJf3WkYJ4IrLXwg3Tfw73CXV2Avg45M_hE,21876
18
- lollms_client/lollms_core.py,sha256=YJqvURx8nWQfnvorqGleR5qkNTlk_u-9mLeU07j4FjY,78296
19
- lollms_client/lollms_discussion.py,sha256=9b83m0D894jwpgssWYTQHbVxp1gJoI-J947Ui_dRXII,2073
20
- lollms_client/lollms_functions.py,sha256=p8SFtmEPqvVCsIz2fZ5HxyOHaxjrAo5c12uTzJnb6m8,3594
21
- lollms_client/lollms_js_analyzer.py,sha256=01zUvuO2F_lnUe_0NLxe1MF5aHE1hO8RZi48mNPv-aw,8361
22
- lollms_client/lollms_llm_binding.py,sha256=sRtCUvXLdlGpaMzAYpqzDdwbCAHvwK5GFEqk3_-WxCU,7004
23
- lollms_client/lollms_python_analyzer.py,sha256=7gf1fdYgXCOkPUkBAPNmr6S-66hMH4_KonOMsADASxc,10246
24
- lollms_client/lollms_stt_binding.py,sha256=ovmpFF0fnmPC9VNi1-rxAJA8xI4JZDUBh_YwdtoTx28,5818
25
- lollms_client/lollms_tasks.py,sha256=Tgqces03gPTHFJCcPaeN9vBCsil3SSJX7nQAjCQ2-yg,34393
26
- lollms_client/lollms_tti_binding.py,sha256=CBCdXt6GQzRPzq7eEujQ5mBOoYcqUUdYY-POHLbJhx8,7469
27
- lollms_client/lollms_ttm_binding.py,sha256=ymGvHtFqesm32y1ZoyIgMBC1PckTABS-DOh-8SvMkRs,5706
28
- lollms_client/lollms_tts_binding.py,sha256=c4PQVe6NyPUtNguKMPo5L2nHJXjoIEQpCtVLK06p_iA,5880
29
- lollms_client/lollms_ttv_binding.py,sha256=u-gLIe22tbu4YsKA5RTyUT7iBlKxPXDmoQzccG3_KuA,5672
30
- lollms_client/lollms_types.py,sha256=cfc1sremM8KR4avkYX99fIVkkdRvXErrCWKGjLrgv50,2723
31
- lollms_client/lollms_utilities.py,sha256=YAgamfp0pBVApR68AHKjhp1lh6isMNF8iadwWLl63c0,7045
32
- lollms_client/llm_bindings/__init__.py,sha256=9sWGpmWSSj6KQ8H4lKGCjpLYwhnVdL_2N7gXCphPqh4,14
33
- lollms_client/llm_bindings/llamacpp/__init__.py,sha256=0XXkzoAn4OYE6R0rLm363-KNnkB4-_cgrkEdIsyCuzM,55386
34
- lollms_client/llm_bindings/lollms/__init__.py,sha256=a36AMPFEf3xK4zx1M_L9PC-3-b0iiDf7eyLkknPjgaY,12356
35
- lollms_client/llm_bindings/ollama/__init__.py,sha256=eHRwOcon61r1ISER-47M8zrZhRCKhuRFvHrnrLuY5Lw,26547
36
- lollms_client/llm_bindings/openai/__init__.py,sha256=NDZIdzW0pnHy9gPXSKfFyS6SPIOOxj9ZEzEE7gZT2NQ,12054
37
- lollms_client/llm_bindings/openllm/__init__.py,sha256=LDEwcT8CCsWrTs0ZyUg5OgP_1RV5HdCkDQmF2f5XSLo,29623
38
- lollms_client/llm_bindings/pythonllamacpp/__init__.py,sha256=xh2faZa57Nn6jscWhhu0WyRvhCC8kZ9cBJFKaE7Ddos,33332
39
- lollms_client/llm_bindings/tensor_rt/__init__.py,sha256=IY4CrHVpHY77R1rzsl3iwcoarDjYD24n7bFKk_69PD8,31983
40
- lollms_client/llm_bindings/transformers/__init__.py,sha256=gcpEQo-cs0Gzk-_gIB8fL_UjE2T_KJ1Y3FQLVA2mA94,36992
41
- lollms_client/llm_bindings/vllm/__init__.py,sha256=ZRCR7g3A2kHQ_07viNrNnVHoIGj5TNA4Q41rQWeTlxw,31967
42
- lollms_client/stt_bindings/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
43
- lollms_client/stt_bindings/lollms/__init__.py,sha256=7-IZkrsn15Vaz0oqkqCxMeNQfMkeilbgScLlrrywES4,6098
44
- lollms_client/tti_bindings/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
45
- lollms_client/tti_bindings/lollms/__init__.py,sha256=y1mcAsaYnWUVEw1Wq3Gxur4srwKDWu2IKRgwtsSPifY,9082
46
- lollms_client/ttm_bindings/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
47
- lollms_client/ttm_bindings/lollms/__init__.py,sha256=DU3WLmJaWNM1NAMtJsnaFo4Y9wlfc675M8aUiaLnojA,3143
48
- lollms_client/tts_bindings/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
49
- lollms_client/tts_bindings/lollms/__init__.py,sha256=8x2_T9XscvISw2TiaLoFxvrS7TIsVLdqbwSc04cX-wc,7164
50
- lollms_client/ttv_bindings/__init__.py,sha256=UZ8o2izQOJLQgtZ1D1cXoNST7rzqW22rL2Vufc7ddRc,3141
51
- lollms_client/ttv_bindings/lollms/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
52
- lollms_client-0.15.2.dist-info/licenses/LICENSE,sha256=HrhfyXIkWY2tGFK11kg7vPCqhgh5DcxleloqdhrpyMY,11558
53
- lollms_client-0.15.2.dist-info/METADATA,sha256=tDCoHcNRJ2GxcnU000H-d90Poyvf6NlH_huEumLhy7U,7276
54
- lollms_client-0.15.2.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
55
- lollms_client-0.15.2.dist-info/top_level.txt,sha256=NI_W8S4OYZvJjb0QWMZMSIpOrYzpqwPGYaklhyWKH2w,23
56
- lollms_client-0.15.2.dist-info/RECORD,,