unified-ai-router 3.1.1 → 3.1.4
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.env.example +2 -1
- package/package.json +1 -1
- package/readme.md +19 -21
package/.env.example
CHANGED
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "unified-ai-router",
-  "version": "3.1.1",
+  "version": "3.1.4",
   "description": "A unified interface for multiple LLM providers with automatic fallback. This project includes an OpenAI-compatible server and a deployable Telegram bot with a Mini App interface. It supports major providers like OpenAI, Google, Grok, and more, ensuring reliability and flexibility for your AI applications.",
   "license": "ISC",
   "author": "mlibre",
package/readme.md
CHANGED
@@ -4,7 +4,6 @@ Unified AI Router is a comprehensive toolkit for AI applications, featuring:
 
 - A unified interface for multiple LLM providers with automatic fallback (the core router library)
 - An OpenAI-compatible server for seamless API integration
-- A deployable Telegram bot with Mini App interface
 
 It supports major providers like OpenAI, Google, Grok, and more, ensuring reliability and flexibility.
 
@@ -13,6 +12,7 @@ It supports major providers like OpenAI, Google, Grok, and more, ensuring reliab
 - [๐ Usage](#-usage)
 - [๐ Basic Library Usage](#-basic-library-usage)
 - [๐ OpenAI-Compatible Server](#-openai-compatible-server)
+- [Setup](#setup)
 - [๐งช Testing](#-testing)
 - [๐ง Supported Providers](#-supported-providers)
 - [๐ API Keys](#-api-keys)
@@ -31,15 +31,14 @@ It supports major providers like OpenAI, Google, Grok, and more, ensuring reliab
 - **OpenAI-Compatible Server**: Drop-in replacement for the OpenAI API, enabling easy integration with existing tools and clients
 - **Streaming and Non-Streaming Support**: Handles both streaming and non-streaming responses
 - **Tool Calling**: Full support for tools in LLM interactions
-- **Telegram Bot Integration**: Deployable as a Telegram bot with an interactive Mini App interface
 
 ## ๐ ๏ธ Installation
 
 ```bash
 npm i unified-ai-router
 # OR
-git clone https://github.com/mlibre/
-cd
+git clone https://github.com/mlibre/Unified-AI-Router
+cd Unified-AI-Router
 npm i
 ```
 
@@ -87,16 +86,14 @@ console.log(response);
 The OpenAI-compatible server provides a drop-in replacement for the OpenAI API. It routes requests through the unified router with fallback logic, ensuring high availability.
 The server uses the provider configurations defined in [provider.js](provider.js) file, and requires API keys set in a `.env` file.
 
-
+1. Configure your providers in `provider.js`. Add new provider or modify existing ones with the appropriate `name`, `apiKey` (referencing the corresponding env variable), `model`, and `apiUrl` for the providers you want to use.
 
-
+2. Copy the example environment file:
 
 ```bash
 cp .env.example .env
 ```
 
-2. Configure your providers in `provider.js`. Add new provider or modify existing ones with the appropriate `name`, `apiKey` (referencing the corresponding env variable), `model`, and `apiUrl` for the providers you want to use.
-
 3. Edit `.env` and add your API keys for the desired providers (see [๐ API Keys](#-api-keys) for sources).
 
 To start the server locally, run:
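As an aside on the reordered setup steps above: the readme names the `name`, `apiKey`, `model`, and `apiUrl` fields for each entry in `provider.js`, but the file itself is not part of this diff. A minimal sketch of what one entry could look like is given below; the array/export structure and all example values are assumptions, not taken from the package.

```js
// Hypothetical provider.js entry (sketch only).
// The field names name, apiKey, model, and apiUrl come from the readme;
// the surrounding structure and the example values are assumptions.
module.exports = [
  {
    name: "openai",                        // provider label used by the router
    apiKey: process.env.OPENAI_API_KEY,    // reads the key set in .env
    model: "gpt-4o-mini",                  // placeholder model id
    apiUrl: "https://api.openai.com/v1"    // OpenAI-compatible base URL
  }
];
```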
@@ -105,7 +102,13 @@ To start the server locally, run:
 npm start
 ```
 
-The server listens at `http://localhost:3000
+The server listens at `http://localhost:3000/` and supports the following OpenAI-compatible endpoints:
+
+- `POST /v1/chat/completions` - Chat completions (streaming and non-streaming)
+- `POST /chat/completions` - Chat completions (streaming and non-streaming)
+- `GET /v1/models` - List available models
+- `GET /models` - List available models
+- `GET /health` - Health check
 
 ### ๐งช Testing
 
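For context on the endpoint list added above, a request against a locally running server could look like the following sketch (Node 18+, run as an ES module); the endpoint paths and port come from the readme, while the model id and message are placeholders.

```js
// Sketch: exercising the OpenAI-compatible endpoints listed in the diff above.
// Assumes the server started with `npm start` is listening on http://localhost:3000.
const base = "http://localhost:3000";

// GET /health - health check
console.log(await (await fetch(`${base}/health`)).text());

// GET /v1/models - list available models
console.log(await (await fetch(`${base}/v1/models`)).json());

// POST /v1/chat/completions - non-streaming chat completion
const res = await fetch(`${base}/v1/chat/completions`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "gpt-4o-mini",                              // placeholder model id
    messages: [{ role: "user", content: "Hello!" }],
    stream: false
  })
});
console.log(await res.json());
```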
@@ -137,6 +140,7 @@ node tests/tools.js
 - Vercel
 - Cerebras
 - LLM7
+- Any Other OpenAI Compatible Server
 
 ## ๐ API Keys
 
@@ -152,7 +156,6 @@ Get your API keys from the following providers:
 - **Vercel AI Gateway**: [vercel.com/docs/ai/ai-gateway](https://vercel.com/docs/ai-gateway)
 - **Cerebras**: [cloud.cerebras.ai](https://cloud.cerebras.ai)
 - **LLM7**: [token.llm7.io](https://token.llm7.io/)
-- Seems like it does not support tool calling
 
 ## ๐ผ Vercel Deployment (Telegram Bot)
 
@@ -214,19 +217,14 @@ curl "https://ai-router-flame.vercel.app/api?register_webhook=true"
 
 After deploying the bot, you need to configure the Telegram Mini App and menu button:
 
-
-- Go to [@BotFather](https://t.me/botfather)
-- Send `/mybots` and select your bot
-- Go to `Bot Settings` → `Configure Mini App`
-- Set the Mini App URL to: `https://ai-router-flame.vercel.app`
+**Configure Mini App:**
 
-
-
-
-
-- Ensure the URL shown is: `https://ai-router-flame.vercel.app`
+- Go to [@BotFather](https://t.me/botfather)
+- Send `/mybots` and select your bot
+- Go to `Bot Settings` → `Configure Mini App`
+- Set the Mini App URL to: `https://ai-router-flame.vercel.app`
 
-Once configured, users can access the Mini App by sending `/start` or `/app` to your bot
+Once configured, users can access the Mini App by sending `/start` or `/app` to your bot.
 
 An example of a deployed bot is accessible on Telegram: [https://t.me/freePulseAIbot](https://t.me/freePulseAIbot)
 