deepanything-0.1.0-py3-none-any.whl

deepanything-0.1.0.dist-info/METADATA
@@ -0,0 +1,191 @@
+ Metadata-Version: 2.1
+ Name: deepanything
+ Version: 0.1.0
+ Summary: DeepAnything is a project that provides DeepSeek R1's deep thinking capabilities for various large language models (LLMs).
+ Author: Junity
+ Author-email: 1727636624@qq.com
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Operating System :: OS Independent
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+
+ # DeepAnything
+
+ [![Python Version](https://img.shields.io/badge/python-3.8%2B-blue)](https://www.python.org/)
+ [![License](https://img.shields.io/badge/license-MIT-green)](LICENSE)
+ [中文版](README_zh.md)
+
+ DeepAnything is a project that provides DeepSeek R1's deep thinking capabilities for various large language models (LLMs).
+
+ ## Key Features
+
+ - Provides an interface similar to OpenAI's
+ - Supports using various models compatible with the OpenAI API as response models and thinking models
+ - Offers a server that, with a simple configuration, exposes an OpenAI-compatible API that can be called with the official OpenAI SDK
+ - Supports using QwQ-32B as a thinking model
+
+ ## Installation Guide
+
+ Install via pip:
+ ```bash
+ pip install deepanything
+ ```
+
+ ## Quick Start
+
+ ### 1. Integrate into Code
+ #### Chat Completion
+
+ ```python
+ from deepanything.ReasonClient import DeepseekReasonClient
+ from deepanything.ResponseClient import OpenaiResponseClient
+ from deepanything.DeepAnythingClient import DeepAnythingClient
+
+ # Client for the thinking (reasoning) model
+ think_client = DeepseekReasonClient(
+     base_url="https://api.siliconflow.cn/v1",
+     api_key="sk-xxxxxxxxx"
+ )
+
+ # Client for the response model
+ response_client = OpenaiResponseClient(
+     base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
+     api_key="sk-xxxxxxxxxx",
+ )
+
+ # Combine the thinking client and the response client
+ da_client = DeepAnythingClient(
+     reason_client=think_client,
+     response_client=response_client,
+     reason_prompt="<Think>{}</Think>"
+ )
+
+ completions = da_client.chat_completion(
+     messages=[
+         {
+             "role": "user",
+             "content": "Hello"
+         }
+     ],
+     reason_model="Pro/deepseek-ai/DeepSeek-R1",
+     response_model="qwen-max-latest",
+     show_model="R1-qwen-max"
+ )
+ ```
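+
+ If the returned `completions` object follows the OpenAI chat-completion schema (an assumption here, based on the OpenAI-compatible interface described above), the reply text can be read in the usual way:
+
+ ```python
+ # Sketch under the assumption that `completions` is an OpenAI-style ChatCompletion object.
+ print(completions.choices[0].message.content)
+ ```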
+
+ #### Streaming Call
+ ```python
+ stream = da_client.chat_completion(
+     messages=[
+         {
+             "role": "user",
+             "content": "Hello"
+         }
+     ],
+     reason_model="Pro/deepseek-ai/DeepSeek-R1",
+     response_model="qwen-max-latest",
+     show_model="R1-qwen-max",
+     stream=True
+ )
+
+ for chunk in stream:
+     print(chunk)
+ ```
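+
+ Rather than printing raw chunks, the incremental text can be assembled as below. This is a minimal sketch assuming the chunks follow the OpenAI streaming (chat.completion.chunk) schema, which the snippet above does not guarantee:
+
+ ```python
+ # Sketch assuming OpenAI-style streaming chunks; field names may differ in practice.
+ for chunk in stream:
+     if chunk.choices and chunk.choices[0].delta.content:
+         print(chunk.choices[0].delta.content, end="", flush=True)
+ ```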
+
+ #### Asynchronous Usage
+ ```python
+ from deepanything.ReasonClient import AsyncDeepseekReasonClient
+ from deepanything.ResponseClient import AsyncOpenaiResponseClient
+ from deepanything.DeepAnythingClient import AsyncDeepAnythingClient
+ import asyncio
+
+ think_client = AsyncDeepseekReasonClient(
+     base_url="https://api.siliconflow.cn/v1",
+     api_key="sk-xxxxxxxxx"
+ )
+
+ response_client = AsyncOpenaiResponseClient(
+     base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
+     api_key="sk-xxxxxxxxxx",
+ )
+
+ da_client = AsyncDeepAnythingClient(
+     reason_client=think_client,
+     response_client=response_client,
+     reason_prompt="<Think>{}</Think>"
+ )
+
+ async def main():
+     completions = await da_client.chat_completion(
+         messages=[
+             {
+                 "role": "user",
+                 "content": "Hello"
+             }
+         ],
+         reason_model="Pro/deepseek-ai/DeepSeek-R1",
+         response_model="qwen-max-latest",
+         show_model="R1-qwen-max"
+     )
+     print(completions)
+
+ asyncio.run(main())
+ ```
+
+ ### 2. Use as a Server
+ ```bash
+ python -m deepanything --host <host> --port <port> --config config.json
+ ```
+
+ | Parameter | Description |
+ | --- | --- |
+ | --host | Server listening address; overrides the setting in config.json |
+ | --port | Server listening port; overrides the setting in config.json |
+ | --config | Configuration file path |
+
+ #### Configuration File Format
+ Below is an example of a configuration file:
+
+ ```json
+ {
+     "host" : "0.0.0.0",
+     "port" : 8080,
+     "reason_clients": [
+         {
+             "name" : "siliconflow",
+             "type" : "deepseek",
+             "base_url" : "https://api.siliconflow.cn/v1",
+             "api_key" : "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
+         }
+     ],
+     "response_clients": [
+         {
+             "name" : "qwen",
+             "type" : "openai",
+             "base_url" : "https://dashscope.aliyuncs.com/compatible-mode/v1",
+             "api_key" : "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
+         }
+     ],
+     "models": [
+         {
+             "name": "R1-Qwen-max",
+             "reason_client" : "siliconflow",
+             "response_client" : "qwen",
+             "reason_model": "Pro/deepseek-ai/DeepSeek-R1",
+             "response_model" : "qwen-max-latest"
+         }
+     ],
+     "api_keys": [
+         "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
+     ]
+ }
+ ```
+ **Detailed Explanation**
+
+ - reason_clients: Configuration for thinking models. Currently the `deepseek` and `openai` types are supported. When the type is `openai`, DeepAnything uses the model's output directly as the thinking content; QwQ-32B is recommended in this case.
+ - response_clients: Configuration for response models. Currently only the `openai` type is supported.
+ - api_keys: API keys used to authenticate callers. If this field is omitted or set to an empty list, the server does not require API key authentication.
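+
+ Because the server exposes an OpenAI-compatible API, it can be called with the official OpenAI SDK once it is running (for example, launched with `python -m deepanything --config config.json` using the configuration above). Below is a minimal sketch, assuming the server is reachable at `http://127.0.0.1:8080` as configured, that its OpenAI-compatible routes live under `/v1`, and that one of the keys from `api_keys` is supplied:
+
+ ```python
+ from openai import OpenAI
+
+ # Assumed values: host/port, model name, and API key are taken from the sample config above;
+ # the /v1 route prefix is an assumption about the server's OpenAI-compatible layout.
+ client = OpenAI(
+     base_url="http://127.0.0.1:8080/v1",
+     api_key="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"  # one of the keys listed in api_keys
+ )
+
+ completion = client.chat.completions.create(
+     model="R1-Qwen-max",  # the model name defined in the "models" section
+     messages=[{"role": "user", "content": "Hello"}]
+ )
+ print(completion.choices[0].message.content)
+ ```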
+
+ ## License
+ This project is licensed under the [MIT License](LICENSE).
+
+ ## Contact Us
+ Email: 1737636624@qq.com
+
+ GitHub Issues: https://github.com/yourusername/deepseek-r1-integration/issues

deepanything-0.1.0.dist-info/RECORD
@@ -0,0 +1,16 @@
+ deepanything/DeepAnythingClient.py,sha256=7yf4iteGjcDWcLKd3GD0OWv-A_d54QKVAuSHMl7-T54,15042
+ deepanything/ReasonClient.py,sha256=8uV2wuDUmUP7ICCOfClVm5J2gBcdQJwI5F_-ftxS2Ho,5941
+ deepanything/ResponseClient.py,sha256=oWPIQXknm7QEkG5Ysx9ejKUyISd0cHZF-HVG0fersOQ,2871
+ deepanything/Stream.py,sha256=8ESR8ttjyPZ-uXPDENsVWUzaL34_GT2OZBJ0PWu7vsA,1578
+ deepanything/Utility.py,sha256=HmiXU5X1eQO9iL292lfFA3dbGVjEPIM4Ayvw8Z8y2lk,6677
+ deepanything/__init__.py,sha256=DDDKiR4kl2PhOsVHVQ7GPKNEo5qIaO8w2lbTRn-2XWQ,97
+ deepanything/__main__.py,sha256=x9LQGBlVnyML4yJ6Y3Rrt1pZ4xkLu0uZ569wKqwmpJQ,941
+ deepanything/Server/Server.py,sha256=rrTVahuDe0U1U6V6IfWJ7p5c-LbOQ15T42lymbwzna4,7211
+ deepanything/Server/Types.py,sha256=iN3X2QDnOyKwpuifUymeu1qg4OZ8uu4QG9ThttZFSDQ,390
+ deepanything/Server/__init__.py,sha256=BjkolZTph3fa10A3u02hdPMkSgpDp8BhFv8cf-BNReQ,29
+ deepanything-0.1.0.dist-info/LICENSE,sha256=JWYd2E-mcNcSYjT5nk4ayM5kkkDq6ZlOxVcYsyqCIwU,1059
+ deepanything-0.1.0.dist-info/METADATA,sha256=JhgQ6s8hUUDsltC7_Y06K-Sg8QaBAPZJgEGK-FGRzlA,5628
+ deepanything-0.1.0.dist-info/WHEEL,sha256=yQN5g4mg4AybRjkgi-9yy4iQEFibGQmlz78Pik5Or-A,92
+ deepanything-0.1.0.dist-info/entry_points.txt,sha256=UT4gNGx6dJsKBjZIl3VkMekh385O5WMbMidAAla6UB4,60
+ deepanything-0.1.0.dist-info/top_level.txt,sha256=wGeRb__4jEJTclCUl0cxhgubD_Bq-QT38VIH6C4KpzY,13
+ deepanything-0.1.0.dist-info/RECORD,,

deepanything-0.1.0.dist-info/WHEEL
@@ -0,0 +1,5 @@
+ Wheel-Version: 1.0
+ Generator: bdist_wheel (0.41.2)
+ Root-Is-Purelib: true
+ Tag: py3-none-any
+

deepanything-0.1.0.dist-info/entry_points.txt
@@ -0,0 +1,2 @@
+ [console_scripts]
+ deepanything = deepanything.__main__:main

deepanything-0.1.0.dist-info/top_level.txt
@@ -0,0 +1 @@
+ deepanything