supercompat 4.2.0 → 4.2.2
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +93 -0
- package/package.json +1 -1
package/README.md
ADDED
@@ -0,0 +1,93 @@

Supercompat is an AI compatibility layer without compromises: a library that lets you call **any LLM provider** through the **OpenAI SDK** (or the **Anthropic SDK**). Swap one adapter and the same `client.responses.create()` call reaches Anthropic, Google, Groq, Mistral, Together, OpenRouter, Perplexity, Ollama, or Azure — with the original SDK types intact.

It runs in-process. No proxy server, no request forwarding, no extra latency. Supercompat installs a custom `fetch` on the SDK instance and routes calls locally.
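The custom-`fetch` idea can be pictured with a small sketch. Everything below — the `FetchLike` type, `localFetch`, the example URL — is an illustrative assumption, not Supercompat's actual internals; it only shows how a fetch-shaped function can answer an SDK's requests in-process instead of over the network.

```typescript
// Toy model of the custom-fetch pattern (assumed shapes, not Supercompat's code).
// An SDK handed a fetch-compatible function calls it for every request, so the
// function can translate and answer the request locally, with no proxy server.
type FetchLike = (
  url: string,
  init?: { body?: string },
) => Promise<{ json: () => Promise<unknown> }>

const localFetch: FetchLike = async (url, init) => {
  // This is where a real adapter would translate the OpenAI-shaped request
  // for another provider; here we just echo the parsed payload back.
  const payload = init?.body ? JSON.parse(init.body) : {}
  return { json: async () => ({ routedFrom: url, echoed: payload }) }
}

// Any SDK that accepts a `fetch` option would invoke localFetch instead of HTTP.
const res = await localFetch('https://api.openai.com/v1/responses', {
  body: JSON.stringify({ model: 'demo', input: 'Say hello.' }),
})
console.log(await res.json())
```

Because the interception happens at the `fetch` boundary, the SDK's own request building, retries, and TypeScript types are untouched.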

Full docs: **[supercompat.com/docs](https://supercompat.com/docs)**.

## Install

```bash
npm install supercompat openai
```

## Quick example

```tsx
import {
  supercompat,
  anthropicClientAdapter,
  completionsRunAdapter,
  memoryStorageAdapter,
} from 'supercompat/openai'
import Anthropic from '@anthropic-ai/sdk'

const client = supercompat({
  clientAdapter: anthropicClientAdapter({ anthropic: new Anthropic() }),
  storageAdapter: memoryStorageAdapter(),
  runAdapter: completionsRunAdapter(),
})

const response = await client.responses.create({
  model: 'claude-sonnet-4-6',
  input: 'Say hello.',
})

console.log(response.output_text)
```

`client` is a real `OpenAI` instance with the real TypeScript types. Every call made on it — `responses`, `chat.completions`, `beta.threads` — is intercepted by Supercompat and translated into a request against the Anthropic SDK. Switching providers is a change to `clientAdapter`; everything else stays the same.
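As a rough mental model of how the three adapter types divide the work, here is a toy sketch. The `ClientAdapter` and `StorageAdapter` interfaces, `echoClient`, `memoryStorage`, and `runOnce` are all illustrative assumptions, not Supercompat's real types:

```typescript
// Toy model of the adapter split (assumed interfaces, not Supercompat's).
// The client adapter talks to a provider, the storage adapter persists state,
// and a run step wires the two together; swapping one leaves the others alone.
type ClientAdapter = { complete: (input: string) => Promise<string> }
type StorageAdapter = {
  save: (id: string, text: string) => void
  load: (id: string) => string | undefined
}

// Stand-in for a provider-backed client adapter.
const echoClient: ClientAdapter = { complete: async (input) => `echo: ${input}` }

// Stand-in for an in-memory storage adapter.
const memoryStorage = (): StorageAdapter => {
  const store = new Map<string, string>()
  return { save: (id, text) => void store.set(id, text), load: (id) => store.get(id) }
}

// Stand-in for a run step: call the provider, persist the result.
const runOnce = async (client: ClientAdapter, storage: StorageAdapter, input: string) => {
  const output = await client.complete(input)
  storage.save(input, output) // persist the turn so a later run can recover it
  return output
}

const storage = memoryStorage()
const out = await runOnce(echoClient, storage, 'hello')
console.log(out, storage.load('hello'))
```

The point of the split is that each seam varies independently: a different provider changes only the client adapter, a different database changes only the storage adapter.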

## Persistent state

`memoryStorageAdapter` is fine for one-shot scripts but loses everything on restart. For persisted conversations, threads, and runs, swap it for [`prismaStorageAdapter`](https://supercompat.com/docs/adapters/storage-adapters/prisma):

```tsx
import { PrismaClient } from '@prisma/client'
import {
  supercompat,
  anthropicClientAdapter,
  completionsRunAdapter,
  prismaStorageAdapter,
} from 'supercompat/openai'
import Anthropic from '@anthropic-ai/sdk'

const prisma = new PrismaClient()

const client = supercompat({
  clientAdapter: anthropicClientAdapter({ anthropic: new Anthropic() }),
  storageAdapter: prismaStorageAdapter({ prisma }),
  runAdapter: completionsRunAdapter(),
})

// Continue a conversation across requests with previous_response_id:
const first = await client.responses.create({
  model: 'claude-sonnet-4-6',
  input: 'My name is Alice.',
})

const second = await client.responses.create({
  model: 'claude-sonnet-4-6',
  input: 'What did I just tell you?',
  previous_response_id: first.id,
})
```

Conversations, responses, assistants, threads, messages, and runs all land in Postgres. See [Storage adapters](https://supercompat.com/docs/adapters/storage-adapters) for every option — including OpenAI-managed and Azure-managed state.
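The `previous_response_id` chaining can be pictured with a toy store. The `Turn` type, `createResponse`, and the `resp_` ids below are assumptions for illustration, not Supercompat's actual storage code; they only show how a response id can key the history a later request replays:

```typescript
// Toy model of previous_response_id chaining (assumed shapes, not real code).
// Each response id maps to the full list of turns up to that response, so a
// follow-up request can rebuild the conversation from a single id.
type Turn = { input: string; output: string }
const responses = new Map<string, Turn[]>()

function createResponse(input: string, previousResponseId?: string) {
  // Recover earlier turns (if any) from the referenced response.
  const history = previousResponseId ? responses.get(previousResponseId) ?? [] : []
  const output = `reply #${history.length + 1}` // stand-in for a provider call
  const turns = [...history, { input, output }]
  const id = `resp_${responses.size + 1}`
  responses.set(id, turns)
  return { id, output_text: output, turns }
}

const a = createResponse('My name is Alice.')
const b = createResponse('What did I just tell you?', a.id)
// b.turns carries both turns, so the provider sees the earlier message.
console.log(b.turns.length) // → 2
```

With a database-backed store in place of the `Map`, the same chaining survives process restarts, which is what the Prisma adapter buys you.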

## Where to go next

- **[Installation](https://supercompat.com/docs/getting-started/installation)** — install the package, pick a provider SDK, and wire them together.
- **[Comparison](https://supercompat.com/docs/getting-started/comparison)** — how Supercompat compares to Vercel AI SDK, LiteLLM, LangChain, and others.
- **[Output SDKs](https://supercompat.com/docs/output-sdks)** — return an OpenAI-shaped or Anthropic-shaped client. Works with every provider.
- **[Adapters](https://supercompat.com/docs/adapters)** — the three adapter types (client, storage, run) and how they compose.
- **[Providers](https://supercompat.com/docs/providers)** — setup notes for OpenAI, Anthropic, Google, Azure, and every other backend.
- **[Tools](https://supercompat.com/docs/tools)** — function calling, web search, file search, code interpreter, and computer use.
- **[Streaming](https://supercompat.com/docs/streaming)** — stream deltas through the OpenAI SDK regardless of which provider is behind it.

## Links

- Docs: [supercompat.com/docs](https://supercompat.com/docs)
- GitHub: [github.com/supercorp-ai/supercompat](https://github.com/supercorp-ai/supercompat)
- Supported by [Supercorp](https://supercorp.ai)