march-ai-sdk 0.3.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +732 -0
- package/dist/app-C_umwZXh.d.ts +790 -0
- package/dist/extensions/langgraph.d.ts +144 -0
- package/dist/extensions/langgraph.js +326 -0
- package/dist/extensions/langgraph.js.map +1 -0
- package/dist/extensions/vercel-ai.d.ts +124 -0
- package/dist/extensions/vercel-ai.js +177 -0
- package/dist/extensions/vercel-ai.js.map +1 -0
- package/dist/index.d.ts +260 -0
- package/dist/index.js +1695 -0
- package/dist/index.js.map +1 -0
- package/dist/proto/gateway.proto +99 -0
- package/package.json +83 -0
- package/src/agent-state-client.ts +115 -0
- package/src/agent.ts +293 -0
- package/src/api-paths.ts +60 -0
- package/src/app.ts +235 -0
- package/src/artifact.ts +59 -0
- package/src/attachment-client.ts +78 -0
- package/src/checkpoint-client.ts +175 -0
- package/src/conversation-client.ts +109 -0
- package/src/conversation-message.ts +61 -0
- package/src/conversation.ts +123 -0
- package/src/exceptions.ts +78 -0
- package/src/extensions/index.ts +6 -0
- package/src/extensions/langgraph.ts +351 -0
- package/src/extensions/vercel-ai.ts +177 -0
- package/src/gateway-client.ts +420 -0
- package/src/heartbeat.ts +89 -0
- package/src/index.ts +70 -0
- package/src/memory-client.ts +125 -0
- package/src/memory.ts +68 -0
- package/src/message.ts +178 -0
- package/src/proto/gateway.proto +99 -0
- package/src/streamer.ts +242 -0
- package/src/types.ts +196 -0
package/README.md
ADDED
@@ -0,0 +1,732 @@

# March Agent SDK (TypeScript)

A TypeScript SDK for building AI agents on the March AI platform. It provides agent registration, message handling via Kafka (through the Agent Gateway), response streaming, and integrations with LangGraph and the Vercel AI SDK.

## Quick Start

The fastest way to get started is with the CLI:

```bash
npx march-start my-agent
cd my-agent
# Edit .env with your configuration
pnpm dev
```

## Overview

The March Agent SDK (`march-ai-sdk`) enables developers to build AI agents that integrate with the March AI Management platform. Agents register themselves, receive messages through Kafka topics, and stream responses back to users.

### Key Features

- **Agent Registration**: Automatic registration with AI Inventory
- **Message Handling**: Function-based handlers with sender filtering
- **Response Streaming**: Chunked responses via Kafka/Centrifugo
- **Conversation History**: Access to message history via the Conversation Store
- **LangGraph Integration**: HTTP-based checkpoint saver for LangGraph graphs
- **Vercel AI SDK Integration**: Persistent message history for the Vercel AI SDK
- **Attachment Handling**: Download images and PDFs for processing
- **Long-term Memory**: Semantic search across user conversations
- **Error Handling**: Automatic error responses to users on handler failures

## Architecture

```
┌─────────────────────────────────────────────────────────────────────────────┐
│                           Your Agent Application                            │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│  MarchAgentApp                                                              │
│    ├── registerMe() → Agent                                                 │
│    │     ├── onMessage() handler                                            │
│    │     └── streamer() → Streamer                                          │
│    └── run() → Start consume loop                                           │
│                                                                             │
│  Extensions:                                                                │
│    ├── HTTPCheckpointSaver (LangGraph)                                      │
│    └── VercelAIMessageStore (Vercel AI SDK)                                 │
│                                                                             │
└──────────────────────────────────┬──────────────────────────────────────────┘
                                   │ gRPC (Kafka) + HTTP (APIs)
                                   ▼
┌─────────────────────────────────────────────────────────────────────────────┐
│                                Agent Gateway                                │
├─────────────────────────────────────────────────────────────────────────────┤
│  gRPC: AgentStream (bidirectional)    HTTP: /s/{service}/* proxy            │
│  - Auth                               - /s/ai-inventory/*                   │
│  - Subscribe/Unsubscribe              - /s/conversation-store/*             │
│  - Produce/Consume messages           - /s/ai-memory/*                      │
│                                       - /s/attachment/*                     │
└─────────────────┬───────────────────────────────────┬───────────────────────┘
                  │                                   │
                  ▼                                   ▼
          ┌──────────────┐                 ┌────────────────────┐
          │    Kafka     │                 │   AI Inventory /   │
          │ {agent}.inbox│                 │ Conversation Store │
          └──────────────┘                 └────────────────────┘
```

### Message Flow

```mermaid
sequenceDiagram
    participant User
    participant Provider
    participant Router
    participant Gateway as Agent Gateway
    participant Agent as Your Agent
    participant ConvStore as Conversation Store

    User->>Provider: Send message
    Provider->>Router: Kafka: router.inbox
    Router->>Gateway: Kafka: {agent}.inbox
    Gateway->>Agent: gRPC stream: message

    Agent->>ConvStore: Get conversation history
    ConvStore-->>Agent: Message history

    loop Streaming Response
        Agent->>Gateway: gRPC stream: content chunk
        Gateway->>Router: Kafka: router.inbox
        Router->>Provider: Kafka: user.inbox
        Provider->>User: WebSocket: chunk
    end

    Agent->>Gateway: gRPC stream: done=true
```

## Installation

```bash
# Using pnpm
pnpm add march-ai-sdk

# Using npm
npm install march-ai-sdk

# Using yarn
yarn add march-ai-sdk
```

Or scaffold a new project with the CLI:

```bash
npx march-start my-agent
```

### Peer Dependencies

The SDK requires the following peer dependencies:

```bash
pnpm add @grpc/grpc-js @grpc/proto-loader zod
```

### Optional Extensions

```bash
# For LangGraph support
pnpm add @langchain/langgraph @langchain/langgraph-checkpoint @langchain/core

# For Vercel AI SDK support
pnpm add ai @ai-sdk/openai
```

## Usage

```typescript
import { config } from 'dotenv'
import { MarchAgentApp, Message } from 'march-ai-sdk'

// Load environment variables
config({ path: '.env', override: true })

// Initialize the app
const app = new MarchAgentApp({
  gatewayUrl: process.env.GATEWAY_URL || 'agent-gateway:8080',
  apiKey: process.env.GATEWAY_API_KEY || 'your-api-key',
})

async function main() {
  // Register an agent
  const agent = await app.registerMe({
    name: 'my-assistant',
    about: 'A helpful AI assistant',
    document: 'Answers general questions and provides helpful information.',
    representationName: 'My Assistant',
  })

  // Handle messages
  agent.onMessage(async (message: Message, sender: string) => {
    // Access conversation history
    const history = await message.conversation?.getHistory({ limit: 10 })

    // Stream response
    const streamer = agent.streamer(message)
    streamer.stream('Hello! ')
    streamer.stream('How can I help you today?')
    await streamer.finish()
  })

  // Run the agent
  await app.run()
}

main().catch(console.error)
```

## Configuration

### Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `GATEWAY_URL` | Agent Gateway endpoint | Required |
| `GATEWAY_API_KEY` | API key for authentication | Required |
| `HEARTBEAT_INTERVAL` | Heartbeat frequency (seconds) | `60` |
| `CONNECTION_SECURE` | Use TLS for connections | `false` |
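A minimal sketch of reading these variables and applying the documented defaults (`loadConfig` is our own helper, not part of the SDK):

```typescript
// Read gateway settings from the environment, applying the documented
// defaults for the optional variables. (Hypothetical helper, not SDK API.)
function loadConfig(env: Record<string, string | undefined>) {
  const gatewayUrl = env.GATEWAY_URL
  const apiKey = env.GATEWAY_API_KEY
  if (!gatewayUrl || !apiKey) {
    throw new Error('GATEWAY_URL and GATEWAY_API_KEY are required')
  }
  return {
    gatewayUrl,
    apiKey,
    heartbeatInterval: Number(env.HEARTBEAT_INTERVAL ?? '60'),
    secure: env.CONNECTION_SECURE === 'true',
  }
}
```

The returned object maps directly onto the `MarchAgentApp` constructor options.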

### MarchAgentApp Options

```typescript
interface AppOptions {
  gatewayUrl: string            // Gateway endpoint (host:port)
  apiKey: string                // Authentication key
  heartbeatInterval?: number    // Heartbeat frequency in seconds (default: 60)
  maxConcurrentTasks?: number   // Max concurrent message handlers (default: 100)
  errorMessageTemplate?: string // Error message sent to users
  secure?: boolean              // Use TLS for gRPC and HTTPS for HTTP (default: false)
}

const app = new MarchAgentApp({
  gatewayUrl: 'agent-gateway:8080',
  apiKey: 'your-api-key',
  heartbeatInterval: 60,
  secure: false,
  errorMessageTemplate: 'Sorry, something went wrong. Please try again.',
})
```

## API Reference

### MarchAgentApp

Main application class for initializing and running agents.

```typescript
import { MarchAgentApp } from 'march-ai-sdk'

const app = new MarchAgentApp({
  gatewayUrl: 'agent-gateway:8080',
  apiKey: 'your-api-key',
})

// Register an agent
const agent = await app.registerMe({
  name: 'agent-name',             // Unique identifier (used for routing)
  about: 'Short description',     // Brief description for agent selection
  document: 'Full docs...',       // Detailed documentation
  representationName: 'Name',     // Display name (optional)
  baseUrl: 'http://...',          // Base URL for artifacts (optional)
  metadata: { key: 'value' },     // Additional metadata (optional)
  relatedPages: [                 // Related pages (optional)
    { name: 'Dashboard', endpoint: '/dashboard' }
  ],
})

// Start consuming messages (blocks until shutdown)
await app.run()
```

### Agent

Handles message registration and response streaming.

```typescript
// Register message handler (all senders)
agent.onMessage(async (message, sender) => {
  // Handle message
})

// Filter by sender
agent.onMessage(async (message, sender) => {
  // Only handles messages from 'user'
}, { senders: ['user'] })

// Exclude sender
agent.onMessage(async (message, sender) => {
  // Handles all except 'other-agent'
}, { senders: ['~other-agent'] })
```

### Message

Represents an incoming message with conversation context.

```typescript
agent.onMessage(async (message: Message, sender: string) => {
  // Core properties
  message.id               // Message ID
  message.conversationId   // Conversation ID
  message.userId           // User who sent the message
  message.content          // Message content
  message.createdAt        // Timestamp

  // Metadata
  message.metadata         // Custom metadata from frontend
  message.schema           // JSON schema for form responses

  // Attachments
  message.hasAttachment()              // Check if attachment exists
  message.attachment                   // AttachmentInfo object
  await message.getAttachmentBytes()   // Download as Buffer
  await message.getAttachmentBase64()  // Download as base64 string

  // Conversation history
  await message.conversation?.getHistory({ limit: 10 })
  await message.conversation?.getAgentHistory({ limit: 10 })

  // Long-term memory
  await message.memory?.queryAboutUser({ query: '...', limit: 5 })
  await message.memory?.getUserSummary()
})
```

#### Message Properties

| Property | Type | Description |
|----------|------|-------------|
| `id` | `string` | Message ID |
| `conversationId` | `string` | Conversation ID |
| `userId` | `string` | User who sent the message |
| `content` | `string` | Message content |
| `createdAt` | `string` | ISO timestamp |
| `metadata` | `Record<string, unknown> \| undefined` | Custom metadata |
| `schema` | `Record<string, unknown> \| undefined` | JSON schema for forms |
| `attachment` | `AttachmentInfo \| undefined` | File attachment metadata |
| `conversation` | `Conversation \| undefined` | Conversation helper |
| `memory` | `Memory \| undefined` | Long-term memory helper |
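A downloaded attachment is often handed to a multimodal model as a data URL. A small illustrative helper for that step (not SDK API; the MIME type would come from the attachment metadata):

```typescript
// Wrap a base64 payload in a data: URL, e.g. for sending an image
// attachment to a multimodal model. (Illustrative helper, not SDK API.)
function toDataUrl(mimeType: string, base64: string): string {
  return `data:${mimeType};base64,${base64}`
}
```

In a handler this might be used as `toDataUrl('image/png', await message.getAttachmentBase64())`.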

### Streamer

Streams response chunks back to the user (or another agent).

```typescript
agent.onMessage(async (message, sender) => {
  // Basic streaming
  const streamer = agent.streamer(message)
  streamer.stream('Chunk 1...')
  streamer.stream('Chunk 2...')
  await streamer.finish()

  // Streaming with options
  const streamer2 = agent.streamer(message, {
    sendTo: 'user',   // or another agent name
    awaiting: true,   // Set awaiting_route to this agent
  })

  // Non-persisted content (not saved to DB)
  streamer2.stream('Thinking...', { persist: false, eventType: 'thinking' })

  // Persisted content
  streamer2.stream('Here is my response...')

  // Set response schema for forms
  streamer2.setResponseSchema({
    type: 'object',
    properties: {
      name: { type: 'string' }
    }
  })

  // Add artifacts
  streamer2.addArtifact({
    url: '/dashboard/123',
    type: 'iframe',
    title: 'Dashboard',
  })

  // Set message metadata
  streamer2.setMessageMetadata({
    model: 'gpt-4o',
    confidence: 0.95,
  })

  await streamer2.finish()
})
```

#### Streamer Methods

| Method | Description |
|--------|-------------|
| `stream(content, options?)` | Stream a content chunk |
| `write(content, persist?)` | Alias for `stream()` |
| `finish(awaiting?)` | Finish streaming with done=true signal |
| `setResponseSchema(schema)` | Set JSON Schema for form rendering |
| `setMessageMetadata(metadata)` | Set message metadata |
| `addArtifact(artifact)` | Add an artifact to the message |
| `setArtifacts(artifacts)` | Set all artifacts at once |
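When forwarding tokens from an LLM, the usual pattern is to call `stream()` per chunk and `finish()` once the source is exhausted. The helper below is our own sketch (not part of the SDK) that pipes any async iterable of strings through an object with the `stream`/`finish` shape documented above:

```typescript
// Minimal structural type matching the Streamer surface documented above.
// (The pipe helper is our own sketch, not part of march-ai-sdk.)
interface ChunkSink {
  stream(content: string): void
  finish(): Promise<void>
}

// Forward every chunk from an async iterable (e.g. an LLM token stream)
// to the sink, then signal completion via finish().
async function pipeToStreamer(
  chunks: AsyncIterable<string>,
  sink: ChunkSink,
): Promise<void> {
  for await (const chunk of chunks) {
    sink.stream(chunk)
  }
  await sink.finish()
}
```

With a real streamer this would be `await pipeToStreamer(tokenStream, agent.streamer(message))`, assuming the token source yields plain strings.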

## Extensions

### LangGraph Integration

Use `HTTPCheckpointSaver` to persist LangGraph state via the Conversation Store checkpoint API.

```typescript
import { MarchAgentApp, Message } from 'march-ai-sdk'
import { HTTPCheckpointSaver } from 'march-ai-sdk/extensions/langgraph'
import { StateGraph, START, END, Annotation } from '@langchain/langgraph'
import { ChatOpenAI } from '@langchain/openai'
import { HumanMessage, AIMessage, BaseMessage } from '@langchain/core/messages'

const app = new MarchAgentApp({
  gatewayUrl: process.env.GATEWAY_URL!,
  apiKey: process.env.GATEWAY_API_KEY!,
})

// Create checkpointer
const checkpointer = new HTTPCheckpointSaver(app)

// Define state
const StateAnnotation = Annotation.Root({
  messages: Annotation<BaseMessage[]>({
    reducer: (x, y) => x.concat(y),
  }),
})

// Create graph
const llm = new ChatOpenAI({ model: 'gpt-4o-mini', streaming: true })

async function respond(state: typeof StateAnnotation.State) {
  const response = await llm.invoke(state.messages)
  return { messages: [response] }
}

const graph = new StateGraph(StateAnnotation)
  .addNode('respond', respond)
  .addEdge(START, 'respond')
  .addEdge('respond', END)

const compiled = graph.compile({ checkpointer })

// Register agent
const agent = await app.registerMe({
  name: 'langgraph-agent',
  about: 'Agent using LangGraph for state management',
  document: 'Uses LangGraph to maintain conversation state.',
})

agent.onMessage(async (message: Message, sender: string) => {
  const config = { configurable: { thread_id: message.conversationId } }

  // Add user message and invoke graph
  const result = await compiled.invoke(
    { messages: [new HumanMessage(message.content)] },
    config
  )

  // Stream response
  const streamer = agent.streamer(message)
  const lastMessage = result.messages[result.messages.length - 1]
  streamer.stream(lastMessage.content as string)
  await streamer.finish()
})

await app.run()
```

### Vercel AI SDK Integration

Use `VercelAIMessageStore` to persist message history for the Vercel AI SDK.

```typescript
import { MarchAgentApp, Message } from 'march-ai-sdk'
import { VercelAIMessageStore } from 'march-ai-sdk/extensions/vercel-ai'
import { streamText, CoreMessage } from 'ai'
import { openai } from '@ai-sdk/openai'

const app = new MarchAgentApp({
  gatewayUrl: process.env.GATEWAY_URL!,
  apiKey: process.env.GATEWAY_API_KEY!,
})

// Create message store
const store = new VercelAIMessageStore(app)

const agent = await app.registerMe({
  name: 'vercel-ai-agent',
  about: 'Agent using Vercel AI SDK',
  document: 'Uses Vercel AI SDK for LLM interactions.',
})

agent.onMessage(async (message: Message, sender: string) => {
  // Load previous messages
  const history = await store.load(message.conversationId)

  // Build messages array
  const messages: CoreMessage[] = [
    { role: 'system', content: 'You are a helpful assistant.' },
    ...history,
    { role: 'user', content: message.content }
  ]

  const streamer = agent.streamer(message)

  // Stream response
  const result = await streamText({
    model: openai('gpt-4o'),
    messages,
    onChunk: ({ chunk }) => {
      if (chunk.type === 'text-delta') {
        streamer.stream(chunk.textDelta)
      }
    }
  })

  // result.text is a promise that resolves once the stream is consumed
  const text = await result.text
  await streamer.finish()

  // Save updated history
  await store.save(message.conversationId, [
    ...history,
    { role: 'user', content: message.content },
    { role: 'assistant', content: text }
  ])
})

await app.run()
```

#### VercelAIMessageStore Methods

| Method | Description |
|--------|-------------|
| `load(conversationId)` | Load message history |
| `save(conversationId, messages)` | Save message history |
| `clear(conversationId)` | Clear message history |
| `append(conversationId, messages)` | Append to existing history |
| `getLastMessages(conversationId, count)` | Get last N messages |
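Persisted histories grow without bound, so it is common to trim before calling `save()`. A sketch of a trim step that keeps a leading system message plus the most recent turns (our own helper, with assumed message shapes, not SDK API):

```typescript
// Message shape as stored by the history store; the roles mirror the
// examples above. (Sketch only, not SDK API.)
type Role = 'system' | 'user' | 'assistant'
interface StoredMessage {
  role: Role
  content: string
}

// Keep the most recent `max` messages while always preserving a leading
// system message - a common trim step before calling save().
function trimHistory(history: StoredMessage[], max: number): StoredMessage[] {
  const system = history.length > 0 && history[0].role === 'system'
    ? [history[0]]
    : []
  const rest = history.slice(system.length)
  return [...system, ...rest.slice(Math.max(0, rest.length - max))]
}
```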

## Advanced Usage

### Inter-Agent Communication

Agents can send messages to other agents using the `sendTo` option.

```typescript
// Handle user messages - forward to specialist
agent.onMessage(async (message, sender) => {
  const streamer = agent.streamer(message, { sendTo: 'specialist-agent' })
  streamer.stream('Forwarding your question to the specialist...')
  await streamer.finish()
}, { senders: ['user'] })

// Handle specialist responses - send back to user
agent.onMessage(async (message, sender) => {
  const streamer = agent.streamer(message, { sendTo: 'user' })
  streamer.stream(`The specialist says: ${message.content}`)
  await streamer.finish()
}, { senders: ['specialist-agent'] })
```

### Dynamic Forms with Response Schema

Request structured input from users using JSON Schema forms.

```typescript
agent.onMessage(async (message, sender) => {
  // Check if this is a form response
  if (message.schema) {
    const data = JSON.parse(message.content)
    const streamer = agent.streamer(message)
    streamer.stream(`Thank you, ${data.name}! I received your message.`)
    await streamer.finish()
    return
  }

  // Request form input
  const streamer = agent.streamer(message, { awaiting: true })
  streamer.setResponseSchema({
    type: 'object',
    title: 'Contact Information',
    properties: {
      name: { type: 'string', title: 'Full Name' },
      email: { type: 'string', format: 'email', title: 'Email' },
      message: { type: 'string', title: 'Message' }
    },
    required: ['name', 'email']
  })
  streamer.stream('Please fill out your contact information:')
  await streamer.finish()
})
```
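Before trusting a submitted form, it is worth checking the schema's `required` fields against the parsed payload. A small sketch of that check (our own helper, not SDK API; real code might use a full JSON Schema validator instead):

```typescript
// Return the names of required fields that are absent or empty in the
// parsed form response. (Illustrative helper, not SDK API.)
function missingRequired(
  schema: { required?: string[] },
  data: Record<string, unknown>,
): string[] {
  return (schema.required ?? []).filter(
    key => data[key] === undefined || data[key] === null || data[key] === ''
  )
}
```

In the handler above, `missingRequired(message.schema, JSON.parse(message.content))` could gate the thank-you response and re-prompt for anything missing.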

### Error Handling

Customize error handling behavior:

```typescript
const app = new MarchAgentApp({
  gatewayUrl: '...',
  apiKey: '...',
  errorMessageTemplate: 'Sorry, something went wrong. Please try again.',
})

const agent = await app.registerMe({ ... })

// Disable automatic error responses
agent.sendErrorResponses = false

agent.onMessage(async (message, sender) => {
  try {
    // Your logic
  } catch (error) {
    // Custom error handling
    const streamer = agent.streamer(message)
    streamer.stream(`I encountered an issue: ${error}`)
    await streamer.finish()
  }
})
```

## Project Structure

```
ai-ts-framework/
├── src/
│   ├── index.ts                # Package exports
│   ├── app.ts                  # MarchAgentApp
│   ├── agent.ts                # Agent class
│   ├── message.ts              # Message class
│   ├── streamer.ts             # Streamer class
│   ├── conversation.ts         # Conversation helper
│   ├── memory.ts               # Memory helper
│   ├── conversation-client.ts  # HTTP client for Conversation Store
│   ├── checkpoint-client.ts    # HTTP client for checkpoints
│   ├── agent-state-client.ts   # HTTP client for agent state
│   ├── memory-client.ts        # HTTP client for AI Memory
│   ├── attachment-client.ts    # HTTP client for attachments
│   ├── gateway-client.ts       # gRPC client for Agent Gateway
│   ├── heartbeat.ts            # Heartbeat manager
│   ├── api-paths.ts            # API endpoint configuration
│   ├── types.ts                # TypeScript types
│   ├── exceptions.ts           # Custom exceptions
│   ├── proto/
│   │   └── gateway.proto       # gRPC protocol definition
│   └── extensions/
│       ├── index.ts
│       ├── langgraph.ts        # LangGraph HTTPCheckpointSaver
│       └── vercel-ai.ts        # Vercel AI MessageStore
├── tests/
│   ├── agent.test.ts
│   ├── app.test.ts
│   ├── checkpoint-client.test.ts
│   ├── message.test.ts
│   └── streamer.test.ts
├── tsup.config.ts              # Build configuration
├── vitest.config.ts            # Test configuration
├── tsconfig.json
├── package.json
└── README.md
```

## Development

### Running Tests

```bash
# Install dependencies
pnpm install

# Run tests
pnpm test

# Run tests in watch mode
pnpm test:watch

# Run tests with coverage
pnpm test:coverage
```

### Building

```bash
# Build the package
pnpm build

# Type check without building
pnpm typecheck
```

## Dependencies

| Package | Purpose |
|---------|---------|
| @grpc/grpc-js | gRPC communication with Agent Gateway |
| @grpc/proto-loader | Protobuf loading |
| zod | Schema validation |

### Optional Dependencies

| Package | Install Command | Purpose |
|---------|-----------------|---------|
| @langchain/langgraph | `pnpm add @langchain/langgraph` | LangGraph state management |
| @langchain/langgraph-checkpoint | `pnpm add @langchain/langgraph-checkpoint` | Checkpoint interface |
| ai | `pnpm add ai` | Vercel AI SDK |
| @ai-sdk/openai | `pnpm add @ai-sdk/openai` | OpenAI provider for Vercel AI |

## TypeScript Types

The SDK exports all types for TypeScript users:

```typescript
import type {
  // Core types
  Message,
  Streamer,
  Agent,
  MarchAgentApp,

  // Options
  AppOptions,
  RegisterOptions,
  StreamOptions,
  StreamerOptions,

  // Data types
  KafkaMessage,
  AgentRegistrationData,
  AttachmentInfo,
  ConversationData,

  // Memory types
  MemoryMessage,
  MemorySearchResult,
  UserSummary,

  // Handler types
  MessageHandler,
  SenderFilterOptions,
} from 'march-ai-sdk'
```

## Exceptions

The SDK provides typed exceptions:

```typescript
import {
  MarchAgentError,      // Base error class
  RegistrationError,    // Agent registration failed
  KafkaError,           // Kafka communication error
  ConfigurationError,   // Invalid configuration
  APIException,         // HTTP API error
  HeartbeatError,       // Heartbeat failed
  GatewayError,         // Gateway communication error
} from 'march-ai-sdk'
```
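Since the specific errors extend `MarchAgentError`, a handler can match the most specific class first and fall back to the base class. A sketch of that pattern using local stand-in classes (import the real ones from `march-ai-sdk` in actual code; the `status` field here is an assumption, not documented SDK API):

```typescript
// Stand-ins mirroring the documented hierarchy; in real code these come
// from 'march-ai-sdk'. The `status` field is a hypothetical example.
class MarchAgentError extends Error {}
class APIException extends MarchAgentError {
  constructor(message: string, public status?: number) {
    super(message)
  }
}

// Classify an unknown error: most specific class first, base class next,
// anything else last.
function describe(err: unknown): string {
  if (err instanceof APIException) return `API error (status ${err.status ?? 'unknown'})`
  if (err instanceof MarchAgentError) return `SDK error: ${err.message}`
  return 'unexpected error'
}
```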

## License

MIT