llm-proxy 1.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +34 -0
- package/docs/directoryStructure.png +0 -0
- package/docs/plan.excalidraw +1758 -0
- package/docs/plan.png +0 -0
- package/jest.config.js +10 -0
- package/package.json +29 -0
- package/src/clients/AwsBedrockAnthropicClient.ts +56 -0
- package/src/clients/OpenAIClient.ts +66 -0
- package/src/index.ts +145 -0
- package/src/middleware/InputFormatAdapter.ts +51 -0
- package/src/middleware/OutputFormatAdapter.ts +83 -0
- package/src/middleware/ProviderFinder.ts +23 -0
- package/src/services/AwsBedrockAnthropicService.ts +111 -0
- package/src/services/ClientService.ts +29 -0
- package/src/services/OpenAIService.ts +68 -0
- package/src/test/AwsBedrockAnthropicClient.test.ts +98 -0
- package/src/test/AwsBedrockAnthropicStreamClient.test.ts +81 -0
- package/src/test/OpenAIClient.test.ts +58 -0
- package/src/types/index.ts +234 -0
- package/tsconfig.json +110 -0
package/README.md
ADDED
@@ -0,0 +1,34 @@
# LLM Proxy

The main purpose of the project is to allow users to make OpenAI-compatible requests to any provider. In the background, the LLM Proxy figures out which provider you are trying to use based on the model you pass.
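Roughly, provider detection can be sketched like this. Note that `findProvider` and the string matching rules below are a hypothetical illustration, not the package's actual `ProviderFinder` middleware:

```javascript
// Hypothetical sketch of provider detection from the model identifier.
// The package's real src/middleware/ProviderFinder.ts may use different rules.
function findProvider(model) {
  // Bedrock model IDs carry the upstream vendor as a prefix, e.g.
  // "anthropic.claude-3-sonnet-20240229-v1:0".
  if (model.startsWith("anthropic.")) return "aws-bedrock-anthropic";
  if (model.startsWith("gpt-")) return "openai";
  throw new Error(`Unknown provider for model: ${model}`);
}
```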
For example:

```javascript
const openaiFormatMessage = [
  {
    role: "system",
    content: "You are a helpful assistant.",
  },
  {
    role: "user",
    content: "What is the meaning of life?",
  },
];

const openaiCompatiblePayload = {
  messages: openaiFormatMessage,
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  maxTokens: 60,
  temperature: 0.5,
};
```
In this example, the message is an OpenAI-format message and the payload is also OpenAI-compatible, but the user is in fact trying to communicate with Anthropic's Claude 3 Sonnet on AWS.

The LLM Proxy will find out that you are trying to communicate with Claude 3 Sonnet on AWS and reformat the message and the payload into Anthropic-compatible ones.


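The reformatting step can be sketched roughly as below, assuming the target is the Anthropic Messages API shape (system prompt lifted into a top-level `system` field, `maxTokens` mapped to snake_case `max_tokens`). The helper `toAnthropicFormat` is a hypothetical illustration, not the package's actual adapter middleware:

```javascript
// Hypothetical sketch of the OpenAI -> Anthropic payload transformation.
// The package's real src/middleware adapters may differ in detail.
function toAnthropicFormat(payload) {
  // Anthropic's Messages API keeps the system prompt outside the messages array.
  const system = payload.messages
    .filter((m) => m.role === "system")
    .map((m) => m.content)
    .join("\n");
  return {
    model: payload.model,
    system,
    messages: payload.messages.filter((m) => m.role !== "system"),
    max_tokens: payload.maxTokens,
    temperature: payload.temperature,
  };
}

const anthropicPayload = toAnthropicFormat({
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What is the meaning of life?" },
  ],
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  maxTokens: 60,
  temperature: 0.5,
});
```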
Binary file