local-openai2anthropic 0.1.0__py3-none-any.whl → 0.3.6__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- local_openai2anthropic/__init__.py +1 -1
- local_openai2anthropic/__main__.py +7 -0
- local_openai2anthropic/config.py +132 -18
- local_openai2anthropic/converter.py +107 -250
- local_openai2anthropic/daemon.py +382 -0
- local_openai2anthropic/daemon_runner.py +116 -0
- local_openai2anthropic/main.py +256 -33
- local_openai2anthropic/openai_types.py +149 -0
- local_openai2anthropic/protocol.py +1 -1
- local_openai2anthropic/router.py +211 -520
- local_openai2anthropic/streaming/__init__.py +6 -0
- local_openai2anthropic/streaming/handler.py +444 -0
- local_openai2anthropic/tools/__init__.py +14 -0
- local_openai2anthropic/tools/handler.py +357 -0
- local_openai2anthropic/utils/__init__.py +18 -0
- local_openai2anthropic/utils/tokens.py +96 -0
- local_openai2anthropic-0.3.6.dist-info/METADATA +374 -0
- local_openai2anthropic-0.3.6.dist-info/RECORD +25 -0
- local_openai2anthropic-0.1.0.dist-info/METADATA +0 -689
- local_openai2anthropic-0.1.0.dist-info/RECORD +0 -15
- {local_openai2anthropic-0.1.0.dist-info → local_openai2anthropic-0.3.6.dist-info}/WHEEL +0 -0
- {local_openai2anthropic-0.1.0.dist-info → local_openai2anthropic-0.3.6.dist-info}/entry_points.txt +0 -0
- {local_openai2anthropic-0.1.0.dist-info → local_openai2anthropic-0.3.6.dist-info}/licenses/LICENSE +0 -0
@@ -1,689 +0,0 @@
Metadata-Version: 2.4
Name: local-openai2anthropic
Version: 0.1.0
Summary: A lightweight proxy server that converts Anthropic Messages API to OpenAI API
Project-URL: Homepage, https://github.com/dongfangzan/local-openai2anthropic
Project-URL: Repository, https://github.com/dongfangzan/local-openai2anthropic
Project-URL: Issues, https://github.com/dongfangzan/local-openai2anthropic/issues
Author-email: dongfangzan <zsybook0124@163.com>
Maintainer-email: dongfangzan <zsybook0124@163.com>
License: Apache-2.0
License-File: LICENSE
Keywords: anthropic,api,claude,messages,openai,proxy
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.12
Requires-Dist: anthropic>=0.30.0
Requires-Dist: fastapi>=0.100.0
Requires-Dist: httpx>=0.25.0
Requires-Dist: openai>=1.30.0
Requires-Dist: pydantic-settings>=2.0.0
Requires-Dist: pydantic>=2.0.0
Requires-Dist: uvicorn[standard]>=0.23.0
Provides-Extra: dev
Requires-Dist: black>=23.0.0; extra == 'dev'
Requires-Dist: mypy>=1.0.0; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.21.0; extra == 'dev'
Requires-Dist: pytest>=7.0.0; extra == 'dev'
Requires-Dist: ruff>=0.1.0; extra == 'dev'
Description-Content-Type: text/markdown

# local-openai2anthropic

[Python 3.12+](https://www.python.org/downloads/)
[License: Apache-2.0](https://opensource.org/licenses/Apache-2.0)

A lightweight proxy server that converts **Anthropic Messages API** requests to **OpenAI API** calls.

This allows you to use applications built for Claude's API with any OpenAI-compatible backend (OpenAI, Azure, local vLLM, etc.).

## Features

- ✅ **Full Messages API compatibility** - All Anthropic Messages API features
- ✅ **Streaming support** - Server-sent events (SSE) for real-time responses
- ✅ **Tool/Function calling** - Convert between Anthropic tools and OpenAI functions
- ✅ **Vision/Multimodal** - Image input support
- ✅ **Official SDK types** - Uses the official `anthropic` and `openai` Python SDKs for type safety
- ✅ **Easy configuration** - Environment variables or a `.env` file
- ✅ **CORS support** - Ready for browser-based applications
- ✅ **Self-hosted** - Run locally or deploy to your own infrastructure

## Quick Start

### Installation

```bash
# Install from source (recommended for now)
git clone https://github.com/dongfangzan/local-openai2anthropic.git
cd local-openai2anthropic
pip install -e ".[dev]"

# Or install directly from PyPI
pip install local-openai2anthropic
```

### Configuration

Set your OpenAI API key:

```bash
export OA2A_OPENAI_API_KEY="sk-..."
```

Or create a `.env` file:

```env
OA2A_OPENAI_API_KEY=sk-...
OA2A_OPENAI_BASE_URL=https://api.openai.com/v1
OA2A_HOST=0.0.0.0
OA2A_PORT=8080
```

### Run the Server

```bash
# Using the CLI command
local-openai2anthropic

# Or using the short alias
oa2a

# Or using the Python module
python -m local_openai2anthropic
```

## Usage Examples

### Using with Anthropic Python SDK

```python
import anthropic

# Point to your local proxy instead of Anthropic's API
client = anthropic.Anthropic(
    base_url="http://localhost:8080",
    api_key="dummy-key",  # Not used, but required by the SDK
)

# Use the Messages API normally
message = client.messages.create(
    model="gpt-4o",  # This will be passed to OpenAI
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude!"}
    ],
)

print(message.content[0].text)
```

### Using with Streaming

```python
import anthropic

client = anthropic.Anthropic(
    base_url="http://localhost:8080",
    api_key="dummy-key",
)

stream = client.messages.create(
    model="gpt-4o",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Count to 10"}],
    stream=True,
)

for event in stream:
    if event.type == "content_block_delta":
        print(event.delta.text, end="", flush=True)
```
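
The raw event loop above works, but the official `anthropic` SDK also ships a streaming helper. A minimal sketch of the same request using `client.messages.stream()`, assuming the proxy forwards SSE events unchanged:

```python
import anthropic

client = anthropic.Anthropic(
    base_url="http://localhost:8080",  # the local proxy
    api_key="dummy-key",
)

# messages.stream() is a context manager in the official anthropic SDK;
# text_stream yields only the text deltas, so no manual event filtering is needed.
with client.messages.stream(
    model="gpt-4o",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Count to 10"}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```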

### Using with Tool Calling

```python
import anthropic

client = anthropic.Anthropic(
    base_url="http://localhost:8080",
    api_key="dummy-key",
)

message = client.messages.create(
    model="gpt-4o",
    max_tokens=1024,
    tools=[
        {
            "name": "get_weather",
            "description": "Get weather for a location",
            "input_schema": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ],
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
)

if message.stop_reason == "tool_use":
    tool_use = message.content[-1]
    print(f"Tool called: {tool_use.name}")
    print(f"Input: {tool_use.input}")
```
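
To finish the tool loop, run the tool yourself and send the result back as a `tool_result` block in a follow-up user message, just as with the real Anthropic API. A sketch continuing the example above (the weather value is a hypothetical placeholder, and `tools` stands for the same tool list passed in the first request):

```python
# Hypothetical local tool execution, for illustration only.
weather_report = f"22°C and sunny in {tool_use.input['location']}"

follow_up = client.messages.create(
    model="gpt-4o",
    max_tokens=1024,
    tools=tools,  # the same tool definitions as in the first request
    messages=[
        {"role": "user", "content": "What's the weather in Tokyo?"},
        # Echo the assistant turn that contains the tool_use block.
        {"role": "assistant", "content": message.content},
        # Return the tool output, keyed by the tool_use block's id.
        {
            "role": "user",
            "content": [
                {
                    "type": "tool_result",
                    "tool_use_id": tool_use.id,
                    "content": weather_report,
                }
            ],
        },
    ],
)

print(follow_up.content[0].text)
```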

### Using with Vision (Images)

```python
import anthropic
import base64

client = anthropic.Anthropic(
    base_url="http://localhost:8080",
    api_key="dummy-key",
)

# Read image and encode as base64
with open("image.png", "rb") as f:
    image_data = base64.b64encode(f.read()).decode()

message = client.messages.create(
    model="gpt-4o",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What's in this image?"},
                {
                    "type": "image",
                    "source": {
                        "type": "base64",
                        "media_type": "image/png",
                        "data": image_data,
                    },
                },
            ],
        }
    ],
)

print(message.content[0].text)
```

## Configuration Options

All configuration is done via environment variables (with the `OA2A_` prefix) or a `.env` file:

| Variable | Default | Description |
|----------|---------|-------------|
| `OA2A_OPENAI_API_KEY` | *Required* | Your OpenAI API key |
| `OA2A_OPENAI_BASE_URL` | `https://api.openai.com/v1` | OpenAI API base URL |
| `OA2A_OPENAI_ORG_ID` | `None` | OpenAI organization ID |
| `OA2A_OPENAI_PROJECT_ID` | `None` | OpenAI project ID |
| `OA2A_HOST` | `0.0.0.0` | Server host to bind |
| `OA2A_PORT` | `8080` | Server port |
| `OA2A_REQUEST_TIMEOUT` | `300.0` | Request timeout in seconds |
| `OA2A_API_KEY` | `None` | Optional API key to protect your proxy |
| `OA2A_CORS_ORIGINS` | `["*"]` | Allowed CORS origins |
| `OA2A_LOG_LEVEL` | `INFO` | Logging level |
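
For example, if you set `OA2A_API_KEY` on the proxy, clients pass that value as their API key instead of a throwaway one. A minimal sketch, assuming the proxy validates the key that the Anthropic SDK sends in its `x-api-key` header:

```python
import anthropic

# Hypothetical: the proxy was started with OA2A_API_KEY=my-proxy-secret.
client = anthropic.Anthropic(
    base_url="http://localhost:8080",
    api_key="my-proxy-secret",  # must match OA2A_API_KEY
)
```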

## Use Cases

### 1. Using Claude-based Apps with OpenAI Models

Many applications are built specifically for Claude's API. This proxy lets you use them with GPT-4, GPT-3.5, or any other OpenAI-compatible model.

### 2. Local Development with vLLM

Run a local vLLM server with an OpenAI-compatible API, then use this proxy to test Claude-integrated apps:

```bash
# Terminal 1: Start vLLM
vllm serve meta-llama/Llama-2-7b-chat-hf --api-key dummy

# Terminal 2: Start the proxy pointing at vLLM
export OA2A_OPENAI_API_KEY=dummy
export OA2A_OPENAI_BASE_URL=http://localhost:8000/v1
local-openai2anthropic

# Terminal 3: Use the Claude SDK with the local model
python my_claude_app.py  # Point it at http://localhost:8080
```

### 3. Azure OpenAI Service

```bash
export OA2A_OPENAI_API_KEY="your-azure-key"
export OA2A_OPENAI_BASE_URL="https://your-resource.openai.azure.com/openai/deployments/your-deployment"
local-openai2anthropic
```

### 4. Other OpenAI-Compatible APIs

- **Groq**
- **Together AI**
- **Fireworks**
- **Anyscale**
- **LocalAI**
- **llama.cpp server**

## API Coverage

### Supported Anthropic Features

| Feature | Status | Notes |
|---------|--------|-------|
| `messages.create()` | ✅ Full | All parameters supported |
| Streaming | ✅ Full | SSE with all event types |
| Tool use | ✅ Full | Converted to OpenAI functions |
| Vision | ✅ Full | Images converted to base64 data URLs |
| System prompts | ✅ Full | String or array format |
| Stop sequences | ✅ Full | Passed through |
| Temperature | ✅ Full | |
| Top P | ✅ Full | |
| Top K | ✅ Full | |
| Max tokens | ✅ Full | |
| Thinking | ⚠️ Partial | Mapped to `reasoning_effort` where supported |
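
For example, an Anthropic-style extended-thinking request is accepted and, per the table above, mapped onto the backend's `reasoning_effort` parameter when the target model supports it. A sketch; how `budget_tokens` translates into an effort level is an implementation detail of the proxy:

```python
import anthropic

client = anthropic.Anthropic(base_url="http://localhost:8080", api_key="dummy-key")

# Extended thinking in Anthropic's request format; the proxy translates this
# to reasoning_effort for backends/models that accept it.
message = client.messages.create(
    model="gpt-4o",  # hypothetical: substitute a reasoning-capable model name
    max_tokens=2048,
    thinking={"type": "enabled", "budget_tokens": 1024},
    messages=[{"role": "user", "content": "Work out 24 * 17 step by step."}],
)

print(message.content[-1].text)
```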

### Not Supported

- **Prompt caching** (`cache_control`) - OpenAI has no equivalent
- **Computer use (beta)** - Requires native Claude capabilities

## Architecture

```
┌─────────────────────────────────────────────────────────────────┐
│                        Your Application                         │
│                   (uses Anthropic Python SDK)                   │
└───────────────────────┬─────────────────────────────────────────┘
                        │ Anthropic Messages API format
                        ▼
┌─────────────────────────────────────────────────────────────────┐
│                  local-openai2anthropic Proxy                   │
│   ┌──────────────┐    ┌──────────────┐    ┌──────────────┐      │
│   │  Anthropic   │───▶│  Converter   │───▶│    OpenAI    │      │
│   │   Request    │    │              │    │   Request    │      │
│   └──────────────┘    └──────────────┘    └──────────────┘      │
│                                                                 │
│   ┌──────────────┐    ┌──────────────┐    ┌──────────────┐      │
│   │  Anthropic   │◀───│  Converter   │◀───│    OpenAI    │      │
│   │   Response   │    │              │    │   Response   │      │
│   └──────────────┘    └──────────────┘    └──────────────┘      │
└───────────────────────┬─────────────────────────────────────────┘
                        │ OpenAI API format
                        ▼
┌─────────────────────────────────────────────────────────────────┐
│                    OpenAI-compatible Backend                    │
│                (OpenAI, Azure, vLLM, Groq, etc.)                │
└─────────────────────────────────────────────────────────────────┘
```
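
The same round trip can be exercised without the SDK. A minimal sketch using `httpx` (already a dependency of the project), assuming the proxy exposes the standard Anthropic `/v1/messages` path:

```python
import httpx

response = httpx.post(
    "http://localhost:8080/v1/messages",
    headers={
        "x-api-key": "dummy-key",           # or the value of OA2A_API_KEY, if set
        "anthropic-version": "2023-06-01",  # standard Messages API version header
        "content-type": "application/json",
    },
    json={
        "model": "gpt-4o",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60.0,
)
response.raise_for_status()
print(response.json()["content"][0]["text"])
```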

## Development

```bash
# Clone the repository
git clone https://github.com/dongfangzan/local-openai2anthropic.git
cd local-openai2anthropic

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest

# Format and lint
black src/
ruff check src/

# Type check
mypy src/
```

## License

Apache License 2.0 - See [LICENSE](LICENSE) for details.

## Acknowledgments

This project is based on the Anthropic API implementation from [vLLM](https://github.com/vllm-project/vllm), adapted to work as a standalone proxy service.
@@ -1,15 +0,0 @@
local_openai2anthropic/__init__.py,sha256=jgIoIwQXIXS83WbRUx2CF1x0A8DloLduoUIUGXwWhSU,1059
local_openai2anthropic/config.py,sha256=Z7GYUPNSvtkKM-ZyG3bodywIMO7sPqG4JK42HwyDjWE,1900
local_openai2anthropic/converter.py,sha256=0Zx_CmYsr9i4315lEPy9O2YV5o05ZYhWtW5n6uQ38Qk,21668
local_openai2anthropic/main.py,sha256=SmhOW39qgyztY0PdmKTUeMQQ3qucE0cfODNTvtjbps8,5358
local_openai2anthropic/protocol.py,sha256=vUEgxtRPFll6jEtLc4DyxTLCBjrWIEScZXhEqe4uibk,5185
local_openai2anthropic/router.py,sha256=Er2LGA0KY_qJDFSd7-23zrnADqIxrlz8bJ36C5ENXLY,31352
local_openai2anthropic/tavily_client.py,sha256=QsBhnyF8BFWPAxB4XtWCCpHCquNL5SW93-zjTTi4Meg,3774
local_openai2anthropic/server_tools/__init__.py,sha256=QlJfjEta-HOCtLe7NaY_fpbEKv-ZpInjAnfmSqE9tbk,615
local_openai2anthropic/server_tools/base.py,sha256=pNFsv-jSgxVrkY004AHAcYMNZgVSO8ZOeCzQBUtQ3vU,5633
local_openai2anthropic/server_tools/web_search.py,sha256=1C7lX_cm-tMaN3MsCjinEZYPJc_Hj4yAxYay9h8Zbvs,6543
local_openai2anthropic-0.1.0.dist-info/METADATA,sha256=zCcNG8r140_XLo54415DyJDZM3u6svxW8VBVZtXYQiI,22272
local_openai2anthropic-0.1.0.dist-info/WHEEL,sha256=WLgqFyCfm_KASv4WHyYy0P3pM_m7J5L9k2skdKLirC8,87
local_openai2anthropic-0.1.0.dist-info/entry_points.txt,sha256=hdc9tSJUNxyNLXcTYye5SuD2K0bEQhxBhGnWTFup6ZM,116
local_openai2anthropic-0.1.0.dist-info/licenses/LICENSE,sha256=X3_kZy3lJvd_xp8IeyUcIAO2Y367MXZc6aaRx8BYR_s,11369
local_openai2anthropic-0.1.0.dist-info/RECORD,,