@genui-a3/providers 0.0.0 → 0.0.1
- package/README.md +186 -0
- package/package.json +1 -1
package/README.md
ADDED
# @genui-a3/providers

[npm](https://www.npmjs.com/package/@genui-a3/providers)
[License](https://github.com/generalui/a3/blob/main/LICENSE)

**LLM provider implementations for the [A3 agentic framework](https://www.npmjs.com/package/@genui-a3/core).**

Ships with **AWS Bedrock**, **Anthropic**, and **OpenAI** providers out of the box.
All providers support blocking and streaming modes, model fallback, and structured output via Zod schemas.

## Install

```bash
npm install @genui-a3/providers @genui-a3/core
```

`@genui-a3/core` is a **peer dependency** — it must be installed alongside this package.

## Quick Start

### AWS Bedrock

```typescript
import { createBedrockProvider } from '@genui-a3/providers/bedrock'

const provider = createBedrockProvider({
  models: ['us.anthropic.claude-sonnet-4-5-20250929-v1:0'],
  region: 'us-east-1', // optional, defaults to AWS SDK default
})
```

### OpenAI

```typescript
import { createOpenAIProvider } from '@genui-a3/providers/openai'

const provider = createOpenAIProvider({
  models: ['gpt-4o', 'gpt-4o-mini'],
  apiKey: process.env.OPENAI_API_KEY, // optional, defaults to OPENAI_API_KEY env var
})
```

### Use with A3

```typescript
import { ChatSession, MemorySessionStore } from '@genui-a3/core'

const session = new ChatSession({
  sessionId: 'user-123',
  store: new MemorySessionStore(),
  initialAgentId: 'greeting',
  initialState: {},
  provider, // any provider from above
})

// Blocking
const response = await session.send({ message: 'Hello!' })

// Streaming
for await (const event of session.send({ message: 'Hello!', stream: true })) {
  console.log(event)
}
```

## Providers

### Bedrock — `createBedrockProvider(config)`

Communicates with AWS Bedrock via the [Converse API](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html).

| Option | Type | Required | Description |
|---|---|---|---|
| `models` | `string[]` | Yes | Model IDs in preference order (first = primary, rest = fallbacks) |
| `region` | `string` | No | AWS region. Defaults to the AWS SDK default |

**Behaviour:**

- Uses **tool-based JSON extraction** (a `structuredResponse` tool) for reliable structured output
- **Streaming** yields text deltas in real time, then emits a validated tool-call result at the end
- **Merges sequential same-role messages** to satisfy Bedrock's alternating-role requirement
- **Prepends an initial user message** (`"Hi"`) so the conversation always starts with a user turn

**Prerequisites:** AWS credentials configured via environment variables, IAM role, or AWS profile — the same setup the AWS SDK expects.
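The role-merging behaviour can be sketched as follows. This is a minimal illustration, not the package's actual implementation, and the `Msg` shape is an assumption for the example:

```typescript
// Hypothetical message shape for illustration only.
type Msg = { role: 'user' | 'assistant'; content: string }

// Collapse runs of same-role messages into single turns so the
// resulting list strictly alternates, as Bedrock's Converse API requires.
function mergeSameRole(messages: Msg[]): Msg[] {
  const merged: Msg[] = []
  for (const msg of messages) {
    const last = merged[merged.length - 1]
    if (last && last.role === msg.role) {
      last.content += '\n' + msg.content // fold into the previous turn
    } else {
      merged.push({ ...msg })
    }
  }
  return merged
}
```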
---

### OpenAI — `createOpenAIProvider(config)`

Communicates with the OpenAI Chat Completions API using [structured output](https://platform.openai.com/docs/guides/structured-outputs) (`response_format: json_schema`).

| Option | Type | Required | Description |
|---|---|---|---|
| `models` | `string[]` | Yes | Model IDs in preference order (first = primary, rest = fallbacks) |
| `apiKey` | `string` | No | API key. Defaults to the `OPENAI_API_KEY` env var |
| `baseURL` | `string` | No | Custom base URL for Azure OpenAI or compatible endpoints |
| `organization` | `string` | No | OpenAI organization ID |

**Behaviour:**

- Uses **structured output** (`response_format` with `json_schema`) — no tool calls required
- **Enforces strict schemas** automatically (`additionalProperties: false`, all properties `required`)
- **Streaming** extracts `chatbotMessage` text progressively from the JSON response via a character-level state machine, yielding text deltas in real time
- Detects **truncated responses** (`finish_reason: length`) and surfaces them as errors

**Prerequisites:** An OpenAI API key, either passed directly or set as `OPENAI_API_KEY`.
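The character-level extraction can be illustrated with a minimal sketch. This is not the package's actual code — it is an example of the general technique: scan streamed JSON chunks for the `"chatbotMessage"` key, then yield the string value's characters as they arrive (escape sequences are passed through verbatim here, a simplification):

```typescript
// Minimal state machine that progressively extracts one JSON string
// value from streamed chunks.
class ChatbotMessageExtractor {
  private buffer = ''
  private inValue = false
  private done = false
  private escaped = false

  // Feed a streamed chunk; returns any new characters of the value.
  push(chunk: string): string {
    let out = ''
    for (const ch of chunk) {
      if (this.done) break
      if (!this.inValue) {
        this.buffer += ch
        // Enter the value once we have seen `"chatbotMessage": "`.
        if (/"chatbotMessage"\s*:\s*"$/.test(this.buffer)) this.inValue = true
        continue
      }
      if (this.escaped) {
        out += ch
        this.escaped = false
      } else if (ch === '\\') {
        out += ch
        this.escaped = true
      } else if (ch === '"') {
        this.done = true // unescaped quote closes the value
      } else {
        out += ch
      }
    }
    return out
  }
}
```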

## Model Fallback

Both providers support automatic model fallback. List models in order of preference:

```typescript
const provider = createBedrockProvider({
  models: [
    'us.anthropic.claude-sonnet-4-5-20250929-v1:0', // primary
    'us.anthropic.claude-haiku-4-5-20251001-v1:0', // fallback
  ],
})
```

If the primary model fails, the provider automatically retries with the next model in the list. If all models fail, the last error is thrown.
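The fallback behaviour amounts to a try-in-order loop, sketched here under assumed names (`callModel` is hypothetical, standing in for a single-model request):

```typescript
// Try each model in order; return the first success.
// If every model fails, rethrow the last error.
async function withFallback<T>(
  models: string[],
  callModel: (model: string) => Promise<T>, // hypothetical per-model call
): Promise<T> {
  let lastError: unknown
  for (const model of models) {
    try {
      return await callModel(model)
    } catch (err) {
      lastError = err // remember and move on to the next model
    }
  }
  throw lastError
}
```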

## Per-Agent Provider Override

Each agent can override the session-level provider:

```typescript
import { createOpenAIProvider } from '@genui-a3/providers/openai'
import { createBedrockProvider } from '@genui-a3/providers/bedrock'

// Session uses Bedrock by default
const session = new ChatSession({
  provider: createBedrockProvider({ models: ['us.anthropic.claude-sonnet-4-5-20250929-v1:0'] }),
  // ...
})

// This agent uses OpenAI instead
const premiumAgent = {
  id: 'premium',
  provider: createOpenAIProvider({ models: ['gpt-4o'] }),
  // ...
}
```

## Provider Interface

Both providers implement the `Provider` interface from `@genui-a3/core`:

| Member | Description |
|---|---|
| `sendRequest(request)` | Blocking request → `Promise<ProviderResponse>` |
| `sendRequestStream(request)` | Streaming request → `AsyncGenerator<StreamEvent>` |
| `name` | Human-readable name (`'bedrock'` or `'openai'`) |

To create a custom provider, implement this interface and pass it to `ChatSession` or an individual agent.
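A custom provider might look like the following sketch. The type shapes here are illustrative stand-ins — the real `ProviderRequest`, `ProviderResponse`, and `StreamEvent` types come from `@genui-a3/core` and will differ:

```typescript
// Illustrative stand-ins for the core types (assumptions, not the real API).
type ProviderRequest = { messages: { role: string; content: string }[] }
type ProviderResponse = { text: string }
type StreamEvent = { type: 'text-delta'; delta: string }

// A minimal echo provider implementing the interface's three members.
const echoProvider = {
  name: 'echo',
  async sendRequest(request: ProviderRequest): Promise<ProviderResponse> {
    const last = request.messages[request.messages.length - 1]
    return { text: `echo: ${last.content}` }
  },
  async *sendRequestStream(request: ProviderRequest): AsyncGenerator<StreamEvent> {
    const { text } = await this.sendRequest(request)
    for (const word of text.split(' ')) {
      yield { type: 'text-delta', delta: word + ' ' }
    }
  },
}
```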

## Exports

This package uses [subpath exports](https://nodejs.org/api/packages.html#subpath-exports). Import from the specific provider entry point:

```typescript
// ✅ Correct
import { createBedrockProvider } from '@genui-a3/providers/bedrock'
import { createOpenAIProvider } from '@genui-a3/providers/openai'

// ❌ No bare import
import { ... } from '@genui-a3/providers'
```

| Entry point | Export | Description |
|---|---|---|
| `@genui-a3/providers/bedrock` | `createBedrockProvider` | Factory function returning a Bedrock `Provider` |
| `@genui-a3/providers/bedrock` | `BedrockProviderConfig` | TypeScript config interface |
| `@genui-a3/providers/openai` | `createOpenAIProvider` | Factory function returning an OpenAI `Provider` |
| `@genui-a3/providers/openai` | `OpenAIProviderConfig` | TypeScript config interface |

## Requirements

- Node.js 20.19.0+
- TypeScript 5.9+
- `@genui-a3/core` ≥ 0.1.5 (peer dependency)
- **Bedrock**: AWS credentials configured in the environment
- **OpenAI**: `OPENAI_API_KEY` environment variable or `apiKey` config option

## License

ISC