@vly-ai/integrations 0.6.0 → 0.6.1
- package/README.md +16 -17
- package/package.json +2 -2
package/README.md
CHANGED
@@ -1,20 +1,28 @@
 # @vly-ai/integrations
 
-
+Access AI models and email services through [vly.ai](https://vly.ai) without managing multiple API keys or accounts. Built on the AI SDK for reliable, type-safe AI Gateway integration.
+
+## Why use @vly-ai/integrations?
+
+Instead of creating separate accounts and managing API keys for OpenAI, Anthropic, SendGrid, Resend, etc., get a single deployment token from [vly.ai](https://vly.ai) and access all these services with automatic usage-based billing:
+
+- **No API key management** - One token for all services (AI models, email, etc.)
+- **No separate accounts** - Skip creating accounts with OpenAI, Anthropic, SendGrid, etc.
+- **Usage-based billing** - Pay only for what you use, billed through your vly.ai deployment
+- **Multiple AI models** - Access GPT-5, Claude 4, Gemini, Llama, and more through one API
+- **Built on AI SDK** - Type-safe, production-ready integration
 
 ## Installation
 
 ```bash
-npm install @vly-ai/integrations
-# or
-yarn add @vly-ai/integrations
-# or
 pnpm add @vly-ai/integrations
 ```
 
-
+## Getting Started
 
-
+1. Sign up at [vly.ai](https://vly.ai) and create a deployment
+2. Get your deployment token from the dashboard
+3. Set it as an environment variable: `VLY_INTEGRATION_KEY`
 
 ## Usage
 

@@ -22,7 +30,7 @@ No additional dependencies required! Built with fetch and AI SDK.
 import { createVlyIntegrations } from '@vly-ai/integrations';
 
 const vly = createVlyIntegrations({
-  deploymentToken: process.env.VLY_INTEGRATION_KEY,
+  deploymentToken: process.env.VLY_INTEGRATION_KEY,
   debug: false // optional
 });
 

@@ -188,15 +196,6 @@ const vly = createVlyIntegrations({
 });
 ```
 
-## What's New in v0.2.0
-
-- ✅ **AI SDK Integration**: Now powered by `@ai-sdk/openai-compatible` for better reliability
-- ✅ **No more axios**: Replaced with built-in fetch for lighter weight
-- ✅ **New AI models**: Support for GPT-5, Claude 4, and Claude 3.7 models
-- ✅ **Direct AI SDK access**: Get the provider for advanced AI SDK usage
-- ✅ **Better streaming**: Improved streaming support with AI SDK
-- ✅ **Type safety**: Enhanced TypeScript support
-
 ## Billing
 
 All API calls are automatically billed to your deployment based on usage. The billing happens transparently through your deployment token, and usage information is included in the API responses.
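Pieced together, the setup flow introduced in the new README (deployment token in `VLY_INTEGRATION_KEY`, then `createVlyIntegrations`) looks like the sketch below. It is a minimal illustration assembled only from the snippets shown in the diff above; the missing-variable guard is an illustrative addition, not part of the package's documented API.

```ts
import { createVlyIntegrations } from '@vly-ai/integrations';

// Step 3 of "Getting Started": the deployment token comes from VLY_INTEGRATION_KEY.
const deploymentToken = process.env.VLY_INTEGRATION_KEY;
if (!deploymentToken) {
  // Illustrative guard only; the README itself passes process.env.VLY_INTEGRATION_KEY directly.
  throw new Error('VLY_INTEGRATION_KEY is not set');
}

// Initialization as shown in the README's Usage section.
const vly = createVlyIntegrations({
  deploymentToken,
  debug: false, // optional
});
```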
package/package.json
CHANGED
@@ -1,7 +1,7 @@
 {
   "name": "@vly-ai/integrations",
-  "version": "0.6.0",
-  "description": "
+  "version": "0.6.1",
+  "description": "Access AI models and email services through vly.ai without managing multiple API keys or accounts",
   "main": "./dist/index.js",
   "module": "./dist/index.mjs",
   "types": "./dist/index.d.ts",