@mastra/client-js 1.15.2-alpha.0 → 1.15.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -1,5 +1,223 @@
  # @mastra/client-js
 
+ ## 1.15.2
+
+ ### Patch Changes
+
+ - Remove incorrect deprecation markers from `getTask()` and `cancelTask()` in the Mastra A2A client. ([#15941](https://github.com/mastra-ai/mastra/pull/15941))
+
+ - Add durable agents with resumable streams ([#12557](https://github.com/mastra-ai/mastra/pull/12557))
+
+ Durable agents make agent execution resilient to disconnections, crashes, and long-running operations.
+
+ ### The Problem
+
+ Standard agent streaming has two fragility points:
+ 1. **Connection drops** - If a client disconnects mid-stream (network blip, browser refresh, mobile app backgrounded), all subsequent events are lost. The client has no way to "catch up" on what they missed.
+ 2. **Long-running operations** - Agent loops with tool calls can take minutes. Holding an HTTP connection open that long is unreliable. If the server restarts or the connection times out, the work is lost.
+
+ ### The Solution
+
+ **Resumable streams** solve connection drops. Every event is cached with a sequential index. If a client disconnects at event 5, they can reconnect and request events starting from index 6. They receive cached events immediately, then continue with live events as they arrive.
+
+ **Durable execution** solves long-running operations. Instead of executing the agent loop directly in the HTTP request, execution happens in a workflow engine (built-in evented engine or Inngest). The HTTP request just subscribes to events. If the connection drops, execution continues. The client can reconnect anytime to observe progress.
+
+ ### Usage
+
+ Wrap any existing `Agent` with durability using factory functions:
+
+ ```typescript
+ import { openai } from '@ai-sdk/openai';
+ import { Agent } from '@mastra/core/agent';
+ import { createDurableAgent } from '@mastra/core/agent/durable';
+
+ const agent = new Agent({
+   id: 'my-agent',
+   model: openai('gpt-4'),
+   instructions: 'You are helpful',
+ });
+
+ const durableAgent = createDurableAgent({ agent });
+ ```
+
+ **Factory functions for different execution strategies:**
+
+ | Factory                                  | Execution                           | Use Case                        |
+ | ---------------------------------------- | ----------------------------------- | ------------------------------- |
+ | `createDurableAgent({ agent })`          | Local, synchronous                  | Development, simple deployments |
+ | `createEventedAgent({ agent })`          | Fire-and-forget via workflow engine | Long-running operations         |
+ | `createInngestAgent({ agent, inngest })` | Inngest-powered                     | Production, distributed systems |
+
+ ### Resumable Streams
+
+ ```typescript
+ // Start streaming
+ const { runId, output } = await durableAgent.stream('Analyze this data...');
+
+ // Client disconnects at event 5...
+
+ // Reconnect and resume from where we left off
+ const { output: resumed } = await durableAgent.observe(runId, { offset: 6 });
+ // Receives events 6, 7, 8... from cache, then continues with live events
+ ```
+
+ ### PubSub and Cache
+
+ Durable agents use two infrastructure components:
+
+ | Component  | Purpose                                   | Default               |
+ | ---------- | ----------------------------------------- | --------------------- |
+ | **PubSub** | Real-time event delivery during streaming | `EventEmitterPubSub`  |
+ | **Cache**  | Stores events for replay on reconnection  | `InMemoryServerCache` |
+
+ When `stream()` is called, events flow through pubsub in real-time. The cache stores each event with a sequential index. When `observe()` is called, missed events replay from cache before continuing with live events.
+
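The indexed cache and replay-from-offset mechanism can be sketched in a few lines of TypeScript. `ReplayBuffer` and its method names are hypothetical illustrations of the idea, not part of the Mastra API:

```typescript
// Minimal sketch of the resumable-stream idea: every event is cached
// under a sequential index, and a reconnecting client replays cached
// events from a given offset before switching to live events.
type StreamEvent = { index: number; data: string };

class ReplayBuffer {
  private events: StreamEvent[] = [];

  // Cache an event under the next sequential index (the `stream()` path).
  append(data: string): StreamEvent {
    const event = { index: this.events.length, data };
    this.events.push(event);
    return event;
  }

  // Replay cached events starting at `offset` (the `observe()` path).
  replayFrom(offset = 0): StreamEvent[] {
    return this.events.slice(offset);
  }
}

const buffer = new ReplayBuffer();
['start', 'tool-call', 'tool-result', 'finish'].forEach(e => buffer.append(e));

// A client that disconnected after receiving events 0 and 1 resumes at offset 2:
const missed = buffer.replayFrom(2);
```

In the real implementation the cache backend is pluggable (`InMemoryServerCache` by default, Redis-backed for multi-instance deployments), but the offset semantics are the same.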
+ **Configure via Mastra instance (recommended):**
+
+ ```typescript
+ const mastra = new Mastra({
+   cache: new RedisServerCache({ url: 'redis://...' }),
+   pubsub: new RedisPubSub({ url: 'redis://...' }),
+   agents: {
+     // Inherits cache and pubsub from Mastra
+     myAgent: createDurableAgent({ agent }),
+   },
+ });
+ ```
+
+ **Configure per-agent (overrides Mastra):**
+
+ ```typescript
+ const durableAgent = createDurableAgent({
+   agent,
+   cache: new RedisServerCache({ url: 'redis://...' }),
+   pubsub: new RedisPubSub({ url: 'redis://...' }),
+ });
+ ```
+
+ **Disable caching (streams won't be resumable):**
+
+ ```typescript
+ const durableAgent = createDurableAgent({ agent, cache: false });
+ ```
+
+ For single-instance deployments, the defaults work fine. For multi-instance deployments (load balancer, horizontal scaling), use Redis-backed implementations so any instance can serve reconnection requests.
+
+ ### Class Hierarchy
+ - `DurableAgent` extends `Agent` - base class with resumable streams
+ - `EventedAgent` extends `DurableAgent` - fire-and-forget execution
+ - `InngestAgent` extends `DurableAgent` - Inngest-powered execution
+
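The hierarchy can be pictured with stub classes (stubs for illustration only; the real classes in `@mastra/core` carry the streaming and execution logic):

```typescript
// Stub classes mirroring the documented inheritance chain.
class Agent {}
class DurableAgent extends Agent {}        // resumable streams
class EventedAgent extends DurableAgent {} // fire-and-forget execution
class InngestAgent extends DurableAgent {} // Inngest-powered execution

// Every durable variant is still an Agent, so existing Agent-typed code accepts it.
const agent: Agent = new InngestAgent();
```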
+ - Updated dependencies [[`920c757`](https://github.com/mastra-ai/mastra/commit/920c75799c6bd71787d86deaf654a35af4c839ca), [`d587199`](https://github.com/mastra-ai/mastra/commit/d5871993c0371bde2b0717d6b47194755baa1443), [`1fe2533`](https://github.com/mastra-ai/mastra/commit/1fe2533c4382ca6858aac7c4b63e888c2eac6541), [`f8694b6`](https://github.com/mastra-ai/mastra/commit/f8694b6fa0b7a5cde71d794c3bbef4957c55bcb8)]:
+   - @mastra/core@1.30.0
+
+ ## 1.15.2-alpha.1
+
+ ### Patch Changes
+
+ - Add durable agents with resumable streams ([#12557](https://github.com/mastra-ai/mastra/pull/12557))
+
+ Durable agents make agent execution resilient to disconnections, crashes, and long-running operations.
+
+ ### The Problem
+
+ Standard agent streaming has two fragility points:
+ 1. **Connection drops** - If a client disconnects mid-stream (network blip, browser refresh, mobile app backgrounded), all subsequent events are lost. The client has no way to "catch up" on what they missed.
+ 2. **Long-running operations** - Agent loops with tool calls can take minutes. Holding an HTTP connection open that long is unreliable. If the server restarts or the connection times out, the work is lost.
+
+ ### The Solution
+
+ **Resumable streams** solve connection drops. Every event is cached with a sequential index. If a client disconnects at event 5, they can reconnect and request events starting from index 6. They receive cached events immediately, then continue with live events as they arrive.
+
+ **Durable execution** solves long-running operations. Instead of executing the agent loop directly in the HTTP request, execution happens in a workflow engine (built-in evented engine or Inngest). The HTTP request just subscribes to events. If the connection drops, execution continues. The client can reconnect anytime to observe progress.
+
+ ### Usage
+
+ Wrap any existing `Agent` with durability using factory functions:
+
+ ```typescript
+ import { openai } from '@ai-sdk/openai';
+ import { Agent } from '@mastra/core/agent';
+ import { createDurableAgent } from '@mastra/core/agent/durable';
+
+ const agent = new Agent({
+   id: 'my-agent',
+   model: openai('gpt-4'),
+   instructions: 'You are helpful',
+ });
+
+ const durableAgent = createDurableAgent({ agent });
+ ```
+
+ **Factory functions for different execution strategies:**
+
+ | Factory                                  | Execution                           | Use Case                        |
+ | ---------------------------------------- | ----------------------------------- | ------------------------------- |
+ | `createDurableAgent({ agent })`          | Local, synchronous                  | Development, simple deployments |
+ | `createEventedAgent({ agent })`          | Fire-and-forget via workflow engine | Long-running operations         |
+ | `createInngestAgent({ agent, inngest })` | Inngest-powered                     | Production, distributed systems |
+
+ ### Resumable Streams
+
+ ```typescript
+ // Start streaming
+ const { runId, output } = await durableAgent.stream('Analyze this data...');
+
+ // Client disconnects at event 5...
+
+ // Reconnect and resume from where we left off
+ const { output: resumed } = await durableAgent.observe(runId, { offset: 6 });
+ // Receives events 6, 7, 8... from cache, then continues with live events
+ ```
+
+ ### PubSub and Cache
+
+ Durable agents use two infrastructure components:
+
+ | Component  | Purpose                                   | Default               |
+ | ---------- | ----------------------------------------- | --------------------- |
+ | **PubSub** | Real-time event delivery during streaming | `EventEmitterPubSub`  |
+ | **Cache**  | Stores events for replay on reconnection  | `InMemoryServerCache` |
+
+ When `stream()` is called, events flow through pubsub in real-time. The cache stores each event with a sequential index. When `observe()` is called, missed events replay from cache before continuing with live events.
+
+ **Configure via Mastra instance (recommended):**
+
+ ```typescript
+ const mastra = new Mastra({
+   cache: new RedisServerCache({ url: 'redis://...' }),
+   pubsub: new RedisPubSub({ url: 'redis://...' }),
+   agents: {
+     // Inherits cache and pubsub from Mastra
+     myAgent: createDurableAgent({ agent }),
+   },
+ });
+ ```
+
+ **Configure per-agent (overrides Mastra):**
+
+ ```typescript
+ const durableAgent = createDurableAgent({
+   agent,
+   cache: new RedisServerCache({ url: 'redis://...' }),
+   pubsub: new RedisPubSub({ url: 'redis://...' }),
+ });
+ ```
+
+ **Disable caching (streams won't be resumable):**
+
+ ```typescript
+ const durableAgent = createDurableAgent({ agent, cache: false });
+ ```
+
+ For single-instance deployments, the defaults work fine. For multi-instance deployments (load balancer, horizontal scaling), use Redis-backed implementations so any instance can serve reconnection requests.
+
+ ### Class Hierarchy
+ - `DurableAgent` extends `Agent` - base class with resumable streams
+ - `EventedAgent` extends `DurableAgent` - fire-and-forget execution
+ - `InngestAgent` extends `DurableAgent` - Inngest-powered execution
+
+ - Updated dependencies [[`920c757`](https://github.com/mastra-ai/mastra/commit/920c75799c6bd71787d86deaf654a35af4c839ca), [`1fe2533`](https://github.com/mastra-ai/mastra/commit/1fe2533c4382ca6858aac7c4b63e888c2eac6541), [`f8694b6`](https://github.com/mastra-ai/mastra/commit/f8694b6fa0b7a5cde71d794c3bbef4957c55bcb8)]:
+   - @mastra/core@1.30.0-alpha.1
+
  ## 1.15.2-alpha.0
 
  ### Patch Changes
@@ -3,7 +3,7 @@ name: mastra-client-js
  description: Documentation for @mastra/client-js. Use when working with @mastra/client-js APIs, configuration, or implementation.
  metadata:
    package: "@mastra/client-js"
-   version: "1.15.2-alpha.0"
+   version: "1.15.2"
  ---
 
  ## When to use
@@ -1,5 +1,5 @@
  {
-   "version": "1.15.2-alpha.0",
+   "version": "1.15.2",
    "package": "@mastra/client-js",
    "exports": {
      "RequestContext": {
package/dist/index.cjs CHANGED
@@ -1558,6 +1558,51 @@ var Agent = class extends BaseResource {
      };
      return streamResponse;
    }
+   /**
+    * Observe (reconnect to) an existing agent stream.
+    * Use this to resume receiving events after a disconnection.
+    *
+    * @param params.runId - The run ID to observe
+    * @param params.offset - Optional position to resume from (0-based). If omitted, replays all events.
+    * @returns Promise containing a streaming Response
+    *
+    * @example
+    * ```typescript
+    * // Reconnect to a stream from a specific position
+    * const response = await client.agents('my-agent').observe({
+    *   runId: 'run-123',
+    *   offset: 42, // Resume from event 42
+    * });
+    *
+    * await response.processDataStream({
+    *   onChunk: (chunk) => console.log('Received:', chunk),
+    * });
+    * ```
+    */
+   async observe(params) {
+     const response = await this.request(`/agents/${this.agentId}/observe`, {
+       method: "POST",
+       body: params,
+       stream: true
+     });
+     if (!response.body) {
+       throw new Error("No response body");
+     }
+     const streamResponse = new Response(response.body, {
+       status: response.status,
+       statusText: response.statusText,
+       headers: response.headers
+     });
+     streamResponse.processDataStream = async ({
+       onChunk
+     }) => {
+       await processMastraStream({
+         stream: streamResponse.body,
+         onChunk
+       });
+     };
+     return streamResponse;
+   }
    /**
     * Resumes a suspended agent stream with custom resume data.
     * Used to continue execution after a suspension point (e.g., workflow suspend within an agent).
@@ -2311,19 +2356,32 @@ var Run = class extends BaseResource {
      return response.body.pipeThrough(this.createChunkTransformStream());
    }
    /**
-    * Observes workflow stream for a workflow run
-    * @returns Promise containing the workflow execution results
+    * Observe (reconnect to) an existing workflow stream.
+    * Use this to resume receiving events after a disconnection.
+    *
+    * @param params.offset - Optional position to resume from (0-based). If omitted, replays all events.
+    * @returns Promise containing a ReadableStream of workflow events
+    *
+    * @example
+    * ```typescript
+    * // Reconnect to a workflow stream from a specific position
+    * const stream = await run.observe({ offset: 42 });
+    *
+    * for await (const event of stream) {
+    *   console.log('Received:', event);
+    * }
+    * ```
     */
-   async observeStream() {
+   async observe(params) {
      const searchParams = new URLSearchParams();
      searchParams.set("runId", this.runId);
-     const response = await this.request(
-       `/workflows/${this.workflowId}/observe-stream?${searchParams.toString()}`,
-       {
-         method: "POST",
-         stream: true
-       }
-     );
+     if (params?.offset !== void 0) {
+       searchParams.set("offset", String(params.offset));
+     }
+     const response = await this.request(`/workflows/${this.workflowId}/observe?${searchParams.toString()}`, {
+       method: "POST",
+       stream: true
+     });
      if (!response.ok) {
        throw new Error(`Failed to observe workflow stream: ${response.statusText}`);
      }
@@ -2332,6 +2390,14 @@ var Run = class extends BaseResource {
      }
      return response.body.pipeThrough(this.createChunkTransformStream());
    }
+   /**
+    * Observes workflow stream for a workflow run
+    * @deprecated Use `observe()` instead for better control over replay position
+    * @returns Promise containing the workflow execution results
+    */
+   async observeStream() {
+     return this.observe();
+   }
    /**
     * Resumes a suspended workflow step asynchronously and returns a promise that resolves when the workflow is complete
     * @param params - Object containing the step, resumeData and requestContext