unified-ai-router 3.1.1 → 3.1.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/.env.example +2 -1
  2. package/package.json +1 -1
  3. package/readme.md +20 -20
package/.env.example CHANGED
@@ -9,4 +9,5 @@ VERCEL_AI_GATEWAY_API_KEY=API_KEY
  COHERE_API_KEY=API_KEY
  CEREBRAS_API_KEY=API_KEY
  LLM7_API_KEY=API_KEY
- SEARX_URL=https://searx.perennialte.ch
+ SEARX_URL=https://searx.perennialte.ch
+ PORT=3000
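
Not part of the published diff, but for context: the newly added `PORT` variable makes the server's listen port configurable rather than hard-coded. A minimal sketch of how such a value is typically consumed in a Node server, assuming `dotenv` is used to load `.env` (the package may load its environment differently):

```js
// Illustrative only -- not the package's actual server code.
require("dotenv").config();                     // load .env into process.env (assumes dotenv)
const port = Number(process.env.PORT) || 3000;  // fall back to 3000 when PORT is unset
console.log(`Server would listen on http://localhost:${port}`);
```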
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "unified-ai-router",
- "version": "3.1.1",
+ "version": "3.1.3",
  "description": "A unified interface for multiple LLM providers with automatic fallback. This project includes an OpenAI-compatible server and a deployable Telegram bot with a Mini App interface. It supports major providers like OpenAI, Google, Grok, and more, ensuring reliability and flexibility for your AI applications.",
  "license": "ISC",
  "author": "mlibre",
package/readme.md CHANGED
@@ -4,7 +4,6 @@ Unified AI Router is a comprehensive toolkit for AI applications, featuring:

  - A unified interface for multiple LLM providers with automatic fallback (the core router library)
  - An OpenAI-compatible server for seamless API integration
- - A deployable Telegram bot with Mini App interface

  It supports major providers like OpenAI, Google, Grok, and more, ensuring reliability and flexibility.

@@ -13,6 +12,7 @@ It supports major providers like OpenAI, Google, Grok, and more, ensuring reliab
  - [📖 Usage](#-usage)
  - [📚 Basic Library Usage](#-basic-library-usage)
  - [🔌 OpenAI-Compatible Server](#-openai-compatible-server)
+ - [Setup](#setup)
  - [🧪 Testing](#-testing)
  - [🔧 Supported Providers](#-supported-providers)
  - [🔑 API Keys](#-api-keys)
@@ -31,15 +31,14 @@ It supports major providers like OpenAI, Google, Grok, and more, ensuring reliab
  - **OpenAI-Compatible Server**: Drop-in replacement for the OpenAI API, enabling easy integration with existing tools and clients
  - **Streaming and Non-Streaming Support**: Handles both streaming and non-streaming responses
  - **Tool Calling**: Full support for tools in LLM interactions
- - **Telegram Bot Integration**: Deployable as a Telegram bot with an interactive Mini App interface

  ## 🛠️ Installation

  ```bash
  npm i unified-ai-router
  # OR
- git clone https://github.com/mlibre/AIRouter
- cd AIRouter
+ git clone https://github.com/mlibre/Unified-AI-Router
+ cd Unified-AI-Router
  npm i
  ```

@@ -89,14 +88,14 @@ The server uses the provider configurations defined in [provider.js](provider.js

  #### Setup

- 1. Copy the example environment file:
+ 1. Configure your providers in `provider.js`. Add new provider or modify existing ones with the appropriate `name`, `apiKey` (referencing the corresponding env variable), `model`, and `apiUrl` for the providers you want to use.
+
+ 2. Copy the example environment file:

  ```bash
  cp .env.example .env
  ```

- 2. Configure your providers in `provider.js`. Add new provider or modify existing ones with the appropriate `name`, `apiKey` (referencing the corresponding env variable), `model`, and `apiUrl` for the providers you want to use.
-
  3. Edit `.env` and add your API keys for the desired providers (see [🔑 API Keys](#-api-keys) for sources).

  To start the server locally, run:
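
For orientation only (not part of the diff): a provider entry of the kind step 1 describes might look like the sketch below. The field names `name`, `apiKey`, `model`, and `apiUrl` come from the readme text; the export shape, the example model, and the URL are assumptions, not the package's actual `provider.js`.

```js
// Hypothetical provider.js entry -- field names taken from the readme, values illustrative.
module.exports = [
  {
    name: "cerebras",                      // label for this provider in the router
    apiKey: process.env.CEREBRAS_API_KEY,  // references the matching variable in .env
    model: "llama3.1-8b",                  // example model; use whatever the provider offers
    apiUrl: "https://api.cerebras.ai/v1"   // example OpenAI-compatible base URL
  }
];
```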
@@ -105,7 +104,13 @@ To start the server locally, run:
  npm start
  ```

- The server listens at `http://localhost:3000/v1/chat/completions` and supports standard OpenAI endpoints like `/v1/chat/completions`.
+ The server listens at `http://localhost:3000/` and supports the following OpenAI-compatible endpoints:
+
+ - `POST /v1/chat/completions` - Chat completions (streaming and non-streaming)
+ - `POST /chat/completions` - Chat completions (streaming and non-streaming)
+ - `GET /v1/models` - List available models
+ - `GET /models` - List available models
+ - `GET /health` - Health check

  ### 🧪 Testing

@@ -137,6 +142,7 @@ node tests/tools.js
  - Vercel
  - Cerebras
  - LLM7
+ - Any Other OpenAI Compatible Server

  ## 🔑 API Keys

@@ -152,7 +158,6 @@ Get your API keys from the following providers:
  - **Vercel AI Gateway**: [vercel.com/docs/ai/ai-gateway](https://vercel.com/docs/ai-gateway)
  - **Cerebras**: [cloud.cerebras.ai](https://cloud.cerebras.ai)
  - **LLM7**: [token.llm7.io](https://token.llm7.io/)
- - Seems like it does not support tool calling

  ## 🔼 Vercel Deployment (Telegram Bot)

@@ -214,19 +219,14 @@ curl "https://ai-router-flame.vercel.app/api?register_webhook=true"

  After deploying the bot, you need to configure the Telegram Mini App and menu button:

- 1. **Configure Mini App:**
- - Go to [@BotFather](https://t.me/botfather)
- - Send `/mybots` and select your bot
- - Go to `Bot Settings` → `Configure Mini App`
- - Set the Mini App URL to: `https://ai-router-flame.vercel.app`
+ **Configure Mini App:**

- 2. **Configure Menu Button:**
- - Go to [@BotFather](https://t.me/botfather)
- - Send `/mybots` and select your bot
- - Go to `Bot Settings` → `Menu Button`
- - Ensure the URL shown is: `https://ai-router-flame.vercel.app`
+ - Go to [@BotFather](https://t.me/botfather)
+ - Send `/mybots` and select your bot
+ - Go to `Bot Settings` → `Configure Mini App`
+ - Set the Mini App URL to: `https://ai-router-flame.vercel.app`

- Once configured, users can access the Mini App by sending `/start` or `/app` to your bot, or through the menu button.
+ Once configured, users can access the Mini App by sending `/start` or `/app` to your bot.

  An example of a deployed bot is accessible on Telegram: [https://t.me/freePulseAIbot](https://t.me/freePulseAIbot)