@hotmeshio/hotmesh 0.4.3 β†’ 0.5.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,44 +1,52 @@
 # HotMesh

- **Permanent-Memory Workflows & AI Agents**
+ **🧠 Workflow That Remembers**

 ![beta release](https://img.shields.io/badge/release-beta-blue.svg) ![made with typescript](https://img.shields.io/badge/built%20with-typescript-lightblue.svg)

- **HotMesh** is a Temporal-style workflow engine that runs natively on PostgreSQL, with a powerful twist: every workflow maintains permanent state that persists independently of the workflow itself.
+ HotMesh brings a **memory model** to your automation: durable entities that hold context, support concurrency, and evolve across runs. Built on PostgreSQL, it treats your database not just as storage, but as the runtime hub for agents, pipelines, and long-lived processes.

- This means:
+ Use HotMesh to:

- * Any number of lightweight, thread-safe **hook workers** can attach to the same workflow record at any time.
- * These hooks can safely **read and write** to shared state.
- * The result is a **durable execution model** with **evolving memory**, ideal for **human-in-the-loop processes** and **AI agents that learn over time**.
+ * **Store Evolving State** – Retain memory/state between executions
+ * **Coordinate Distributed Work** – Safely allow multiple workers to act on shared state
+ * **Track and Replay** – Full audit history and replay support by default

 ---

 ## Table of Contents

- 1. 🚀 Quick Start
- 2. 🧠 How Permanent Memory Works
- 3. 🔌 Hooks & Entity API
- 4. 🤖 Building Durable AI Agents
- 5. 🔬 Advanced Patterns & Recipes
- 6. 📚 Documentation & Links
+ 1. [Quick Start](#quick-start)
+ 2. [Permanent Memory Architecture](#permanent-memory-architecture)
+ 3. [Durable AI Agents](#durable-ai-agents)
+ 4. [Building Pipelines with State](#building-pipelines-with-state)
+ 5. [Documentation & Links](#documentation--links)

 ---

- ## 🚀 Quick Start
+ ## Quick Start
+
+ ### Prerequisites
+
+ * PostgreSQL (or Supabase)
+ * Node.js 16+

 ### Install
+
 ```bash
 npm install @hotmeshio/hotmesh
 ```

- ### Start a workflow
- ```typescript
- // index.ts
+ ### Connect to a Database
+
+ HotMesh adopts Temporal.io's developer-friendly syntax for authoring workers, workflows, and clients; the `init` and `start` methods should look familiar.
+
+ ```ts
 import { MemFlow } from '@hotmeshio/hotmesh';
 import { Client as Postgres } from 'pg';

 async function main() {
+   // MemFlow will auto-provision the database upon init
   const mf = await MemFlow.init({
     appId: 'my-app',
     engine: {
@@ -49,13 +57,13 @@ async function main() {
     }
   });

-   // Kick off a workflow
+   // Start a workflow with an assigned ID and arguments
   const handle = await mf.workflow.start({
-     entity: 'user',
-     workflowName: 'userExample',
-     workflowId: 'jane@hotmesh.com',
-     args: ['Jane'],
-     taskQueue: 'entityqueue'
+     entity: 'research-agent',
+     workflowName: 'researchAgent',
+     workflowId: 'agent-session-jane-001',
+     args: ['What are the long-term impacts of renewable energy subsidies?'],
+     taskQueue: 'agents'
   });

   console.log('Result:', await handle.result());
@@ -64,172 +72,298 @@ async function main() {
 main().catch(console.error);
 ```

+ ### System Benefits
+
+ * **No Setup Required** – Tables and indexes are provisioned automatically
+ * **Shared State** – Every worker shares access to the same entity memory
+ * **Coordination by Design** – PostgreSQL handles consistency and isolation
+ * **Tenant Isolation** – Each app maintains its own schema
+ * **Scalable Defaults** – Partitioned tables and index support included
+
 ---

- ## 🧠 How Permanent Memory Works
+ ## Permanent Memory Architecture
+
+ Every workflow in HotMesh is backed by an "entity": a versioned JSONB record that tracks its memory and state transitions.

- * **Entity = persistent JSON record** – each workflow's memory is stored as a JSONB row in your Postgres database
- * **Atomic operations** (`set`, `merge`, `append`, `increment`, `toggle`, `delete`, …)
- * **Transactional** – every update participates in the workflow/DB transaction
- * **Time-travel-safe** – full replay compatibility; side-effect detector guarantees determinism
- * **Hook-friendly** – any worker with the record ID can attach and mutate its slice of the JSON
- * **Index-friendly** – entity data is stored as JSONB; add partial indexes for improved query analysis.
+ * **Entities** – Represent long-lived state for a workflow or agent
+ * **Commands** – Modify state with methods like `set`, `merge`, `append`, `increment`
+ * **Consistency** – All updates are transactional with Postgres
+ * **Replay Safety** – Protects against duplicated side effects during re-execution
+ * **Partial Indexing** – Optimized querying of fields within large JSON structures
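
*Editor's note:* the entity commands listed above behave like deep operations on a JSON document. The sketch below is illustrative only, not the HotMesh implementation: state lives in a plain in-memory object instead of a transactional JSONB row, and the `set`/`merge`/`append`/`increment` helpers here are hypothetical stand-ins for the library's methods.

```ts
// Illustrative sketch of entity command semantics (NOT the HotMesh
// implementation): state lives in a plain object instead of a JSONB row.
type Json = { [key: string]: any };

const store: Json = {};

// set: replace the whole document
function set(value: Json): void {
  Object.keys(store).forEach((k) => delete store[k]);
  Object.assign(store, value);
}

// merge: recursively merge plain objects; other values are overwritten
function merge(target: Json, patch: Json): void {
  for (const [k, v] of Object.entries(patch)) {
    if (v && typeof v === 'object' && !Array.isArray(v) && target[k] && typeof target[k] === 'object') {
      merge(target[k], v);
    } else {
      target[k] = v;
    }
  }
}

// Resolve a dotted path like 'metrics.count' for targeted updates
function atPath(path: string): { parent: Json; key: string } {
  const keys = path.split('.');
  let parent: Json = store;
  for (const k of keys.slice(0, -1)) parent = parent[k];
  return { parent, key: keys[keys.length - 1] };
}

function append(path: string, item: any): void {
  const { parent, key } = atPath(path);
  parent[key].push(item);
}

function increment(path: string, by: number): void {
  const { parent, key } = atPath(path);
  parent[key] += by;
}

set({ user: { name: 'Jane' }, metrics: { count: 0 }, log: [] });
merge(store, { user: { lastSeen: '2024-01-01' } });  // name is preserved
append('log', 'hello');
increment('metrics.count', 1);
console.log(JSON.stringify(store));
```

In the real library these commands are additionally transactional and replay-safe; the sketch only shows the shape of the state transitions.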
+
+ ### Example: Partial Index for Premium Users

- **Example: Adding a Partial Index for Specific Entity Types**
 ```sql
- -- Create a partial index for 'user' entities with specific entity values
+ -- Index only those user entities that are marked as premium
 CREATE INDEX idx_user_premium ON your_app.jobs (id)
 WHERE entity = 'user' AND (context->>'isPremium')::boolean = true;
 ```
- This index will only be used for queries that match both conditions, making lookups for premium users much faster.
+
+ This index improves performance for filtered queries while reducing index size.

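*Editor's note:* Postgres can only use a partial index when the query's `WHERE` clause implies the index predicate. A hypothetical lookup helper (the schema and column names simply mirror the index above; this helper is not part of HotMesh):

```ts
// Hypothetical query builder: the WHERE clause repeats the partial
// index predicate (entity = 'user' AND isPremium) so the planner
// is able to use idx_user_premium.
function premiumUserLookup(schema: string): string {
  return [
    'SELECT id, context',
    `FROM ${schema}.jobs`,
    `WHERE entity = 'user'`,
    `AND (context->>'isPremium')::boolean = true`,
  ].join('\n');
}

const sql = premiumUserLookup('your_app');
console.log(sql);
```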
 ---

- ## 🔌 Hooks & Entity API – Full Example
+ ## Durable AI Agents

- HotMesh hooks are powerful because they can be called both internally (from within a workflow) and externally (from outside, even after the workflow completes). This means you can:
+ Agents often require memory: context that persists between invocations, spans multiple perspectives, or outlives a single process. HotMesh supports that natively.

- * Start a workflow that sets up initial state
- * Have the workflow call some hooks internally
- * Let the workflow complete
- * Continue to update the workflow's entity state from the outside via hooks
- * Build long-running processes that evolve over time
+ The following example builds a "research agent" that runs sub-flows and reflects on the results.

- Here's a complete example showing both internal and external hook usage:
+ ### Research Agent Example

- ```typescript
- import { MemFlow } from '@hotmeshio/hotmesh';
+ #### Main Coordinator Agent

- /* ------------ Main workflow ------------ */
- export async function userExample(name: string): Promise<any> {
-   //the entity method provides transactional, replayable access to shared job state
+ ```ts
+ export async function researchAgent(query: string): Promise<any> {
   const entity = await MemFlow.workflow.entity();

-   //create the initial entity (even arrays are supported)
+   // Set up shared memory for this agent session
   await entity.set({
-     user: { name },
-     hooks: {},
-     metrics: { count: 0 }
+     query,
+     findings: [],
+     perspectives: {},
+     confidence: 0,
+     status: 'researching'
+   });
+
+   // Launch perspective hooks in parallel (no need to await here)
+   const optimistic = MemFlow.workflow.execHook({
+     taskQueue: 'agents',
+     workflowName: 'optimisticPerspective',
+     args: [query]
+   });
+
+   const skeptical = MemFlow.workflow.execHook({
+     taskQueue: 'agents',
+     workflowName: 'skepticalPerspective',
+     args: [query]
   });

-   // Call one hook internally
-   const result1 = await MemFlow.workflow.execHook({
-     taskQueue: 'entityqueue',
-     workflowName: 'hook1',
-     args: [name, 'hook1'],
-     signalId: 'hook1-complete'
+   // Launch a child workflow with its own isolated entity/state
+   const factChecker = await MemFlow.workflow.execChild({
+     entity: 'fact-checker',
+     workflowName: 'factCheckAgent',
+     workflowId: `fact-check-${Date.now()}`,
+     args: [query],
+     taskQueue: 'agents'
   });

-   // merge the result
-   await entity.merge({ hooks: { r1: result1 } });
-   await entity.increment('metrics.count', 1);
+   // Wait for all views to complete before analyzing
+   await Promise.all([optimistic, skeptical, factChecker]);

-   return "The main has completed; the db record persists and can be hydrated; hook in from the outside!";
+   // Final synthesis: aggregate and compare all perspectives
+   await MemFlow.workflow.execHook({
+     taskQueue: 'perspectives',
+     workflowName: 'synthesizePerspectives',
+     args: []
+   });
+
+   return await entity.get();
 }
+ ```
+
+ #### Hooks: Perspectives

- /* ------------ Hook 1 (hooks have access to methods like sleepFor) ------------ */
- export async function hook1(name: string, kind: string): Promise<any> {
-   await MemFlow.workflow.sleepFor('2 seconds');
-   const res = { kind, processed: true, at: Date.now() };
-   await MemFlow.workflow.signal('hook1-complete', res);
+ ```ts
+ // Optimistic hook looks for affirming evidence
+ export async function optimisticPerspective(query: string, config: { signal: string }): Promise<void> {
+   const entity = await MemFlow.workflow.entity();
+   const findings = await searchForSupportingEvidence(query);
+   await entity.merge({ perspectives: { optimistic: { findings, confidence: 0.8 } } });
+   // signal the caller that this perspective is done
+   await MemFlow.workflow.signal(config.signal, {});
 }

- /* ------------ Hook 2 (hooks can access shared job entity) ------------ */
- export async function hook2(name: string, kind: string): Promise<void> {
+ // Skeptical hook seeks out contradictions and counterpoints
+ export async function skepticalPerspective(query: string, config: { signal: string }): Promise<void> {
   const entity = await MemFlow.workflow.entity();
-   await entity.merge({ user: { lastSeen: new Date().toISOString() } });
-   await MemFlow.workflow.signal('hook2-complete', { ok: true });
+   const counterEvidence = await searchForCounterEvidence(query);
+   await entity.merge({ perspectives: { skeptical: { counterEvidence, confidence: 0.6 } } });
+   await MemFlow.workflow.signal(config.signal, {});
 }
+ ```
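
*Editor's note:* in the hooks above, the caller's `execHook` promise resolves when the hook fires the signal named in its injected `config.signal`. The self-contained sketch below illustrates only that handshake pattern; the in-memory signal bus and the `execHook`/`signal` helpers here are invented stand-ins for HotMesh's Postgres-backed signaling, not its internals.

```ts
// In-memory stand-in for signal-based hook completion (illustrative only).
type Handler = (payload: unknown) => void;
const waiters = new Map<string, Handler>();
let lastSignalPayload: unknown;

// Resolve the promise registered for a signal id.
function signal(signalId: string, payload: unknown): void {
  lastSignalPayload = payload;
  waiters.get(signalId)?.(payload);
  waiters.delete(signalId);
}

// "execHook": run the hook with a generated signal id and resolve
// only when the hook signals back.
function execHook(hook: (config: { signal: string }) => Promise<void>): Promise<unknown> {
  const signalId = `sig-${Math.random().toString(36).slice(2)}`;
  return new Promise((resolve) => {
    waiters.set(signalId, resolve);
    void hook({ signal: signalId });
  });
}

async function demoHook(config: { signal: string }): Promise<void> {
  // ...do work, write to shared state, then report completion...
  signal(config.signal, { done: true });
}

execHook(demoHook).then((result) => console.log('hook completed:', result));
```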

- /* ------------ Worker/Hook Registration ------------ */
- async function startWorker() {
-   const mf = await MemFlow.init({
-     appId: 'my-app',
-     engine: {
-       connection: {
-         class: Postgres,
-         options: { connectionString: process.env.DATABASE_URL }
-       }
-     }
-   });
+ #### Child Agent: Fact Checker

-   const worker = await mf.worker.create({
-     taskQueue: 'entityqueue',
-     workflow: example
-   });
+ ```ts
+ // A dedicated child agent with its own entity type and context
+ export async function factCheckAgent(query: string): Promise<any> {
+   const entity = await MemFlow.workflow.entity();
+   await entity.set({ query, sources: [], verifications: [] });

-   await mf.worker.create({
-     taskQueue: 'entityqueue',
-     workflow: hook1
+   await MemFlow.workflow.execHook({
+     taskQueue: 'agents',
+     workflowName: 'verifySourceCredibility',
+     args: [query]
   });

-   await mf.worker.create({
-     taskQueue: 'entityqueue',
-     workflow: hook2
-   });
+   return await entity.get();
+ }
+ ```
+
+ #### Synthesis
+
+ ```ts
+ // Synthesis hook aggregates different viewpoints
+ export async function synthesizePerspectives(config: { signal: string }): Promise<void> {
+   const entity = await MemFlow.workflow.entity();
+   const context = await entity.get();
+
+   const result = await analyzePerspectives(context.perspectives);

-   console.log('Workers and hooks started and listening...');
+   await entity.merge({
+     perspectives: {
+       synthesis: {
+         finalAssessment: result,
+         confidence: calculateConfidence(context.perspectives)
+       }
+     },
+     status: 'completed'
+   });
+   await MemFlow.workflow.signal(config.signal, {});
 }
 ```

- ### The Power of External Hooks
+ ---

- One of HotMesh's most powerful features is that workflow entities remain accessible even after the main workflow completes. By providing the original workflow ID, any authorized client can:
+ ## Building Pipelines with State

- * Hook into existing workflow entities
- * Update state and trigger new processing
- * Build evolving, long-running processes
- * Enable human-in-the-loop workflows
- * Create AI agents that learn over time
+ HotMesh treats pipelines as long-lived records, not ephemeral jobs. Every pipeline run is stateful, resumable, and traceable.

- Here's how to hook into an existing workflow from the outside:
+ ### Set Up a Data Pipeline

- ```typescript
- /* ------------ External Hook Example ------------ */
- async function externalHookExample() {
-   const client = new MemFlow.Client({
-     appId: 'my-app',
-     engine: {
-       connection: {
-         class: Postgres,
-         options: { connectionString: process.env.DATABASE_URL }
-       }
-     }
+ ```ts
+ export async function dataPipeline(source: string): Promise<void> {
+   const entity = await MemFlow.workflow.entity();
+
+   // Initial policy and tracking setup
+   await entity.set({
+     source,
+     pipeline: { version: 1, policy: { refreshInterval: '24 hours' } },
+     changeLog: []
   });

-   // Start hook2 externally by providing the original workflow ID
-   await client.workflow.hook({
-     workflowId: 'jane@hotmesh.com', //id of the target workflow
-     taskQueue: 'entityqueue',
-     workflowName: 'hook2',
-     args: [name, 'external-hook']
+   // Trigger the recurring orchestration pipeline
+   await MemFlow.workflow.execHook({
+     taskQueue: 'pipeline',
+     workflowName: 'runPipeline',
+     args: [true]
   });
 }
 ```

- ---
+ ### Orchestration Hook
+
+ ```ts
+ export async function runPipeline(repeat = false): Promise<void> {
+   do {
+     // Perform the transformation step
+     await MemFlow.workflow.execHook({
+       taskQueue: 'transform',
+       workflowName: 'cleanData',
+       args: []
+     });
+
+     if (repeat) {
+       // Schedule the next execution
+       await MemFlow.workflow.execHook({
+         taskQueue: 'scheduler',
+         workflowName: 'scheduleRefresh',
+         args: []
+       });
+     }
+   } while (repeat);
+ }

- ## 🤖 Building Durable AI Agents
+ /**
+  * Hook to clean and transform data
+  */
+ export async function cleanData(signalInfo?: { signal: string }): Promise<void> {
+   const entity = await MemFlow.workflow.entity();
+
+   // Simulate data cleaning
+   await entity.merge({
+     status: 'cleaning',
+     lastCleanedAt: new Date().toISOString()
+   });
+
+   // Add to changelog
+   await entity.append('changeLog', {
+     action: 'clean',
+     timestamp: new Date().toISOString()
+   });
+
+   // Signal completion if called via execHook
+   if (signalInfo?.signal) {
+     await MemFlow.workflow.signal(signalInfo.signal, {
+       status: 'cleaned',
+       timestamp: new Date().toISOString()
+     });
+   }
+ }
+
+ /**
+  * Hook to schedule the next refresh based on policy
+  */
+ export async function scheduleRefresh(signalInfo?: { signal: string }): Promise<void> {
+   const entity = await MemFlow.workflow.entity();
+
+   // Get the refresh interval from policy
+   const currentEntity = await entity.get();
+   const refreshInterval = currentEntity.pipeline.policy.refreshInterval;
+
+   // Sleep for the configured interval
+   await MemFlow.workflow.sleepFor(refreshInterval);
+
+   // Update status after sleep
+   await entity.merge({
+     status: 'ready_for_refresh',
+     nextRefreshAt: new Date().toISOString()
+   });

- Permanent memory unlocks a straightforward pattern for agentic systems:
+   // Add to changelog
+   await entity.append('changeLog', {
+     action: 'schedule_refresh',
+     timestamp: new Date().toISOString(),
+     nextRefresh: new Date().toISOString()
+   });
+
+   // Signal completion if called via execHook
+   if (signalInfo?.signal) {
+     await MemFlow.workflow.signal(signalInfo.signal, {
+       status: 'scheduled',
+       nextRefresh: new Date().toISOString()
+     });
+   }
+ }
+ ```
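
*Editor's note:* `sleepFor` accepts human-readable duration strings; this README uses `'2 seconds'` and `'24 hours'`. The parser below is my own illustration of how such strings could map to milliseconds; it is not HotMesh's actual parser, whose accepted grammar may differ.

```ts
// Hypothetical parser for duration strings like '24 hours' (illustrative only).
const UNIT_MS: Record<string, number> = {
  second: 1_000,
  minute: 60_000,
  hour: 3_600_000,
  day: 86_400_000,
};

function toMillis(duration: string): number {
  const match = duration.trim().match(/^(\d+)\s+(second|minute|hour|day)s?$/);
  if (!match) throw new Error(`Unparseable duration: ${duration}`);
  return Number(match[1]) * UNIT_MS[match[2]];
}

console.log(toMillis('2 seconds'));  // 2000
console.log(toMillis('24 hours'));   // 86400000
```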
+
+ ### Trigger from Outside

- 1. **Planner workflow** – sketches a task list, seeds entity state.
- 2. **Tool hooks** – execute individual tasks, feeding intermediate results back into state.
- 3. **Reflector hook** – periodically summarizes state into long-term memory embeddings.
- 4. **Supervisor workflow** – monitors metrics stored in state and decides when to finish.
+ ```ts
+ // External systems can trigger a single pipeline run
+ export async function triggerRefresh() {
+   const client = new MemFlow.Client({/*...*/});

- Because every step is durable *and* shares the same knowledge object, agents can pause,
- restart, scale horizontally, and keep evolving their world-model indefinitely.
+   await client.workflow.hook({
+     workflowId: 'pipeline-123',
+     taskQueue: 'pipeline',
+     workflowName: 'runPipeline',
+     args: []
+   });
+ }
+ ```

 ---

- ## 📚 Documentation & Links
+ ## Documentation & Links

- * SDK API – [https://hotmeshio.github.io/sdk-typescript](https://hotmeshio.github.io/sdk-typescript)
- * Examples – [https://github.com/hotmeshio/samples-typescript](https://github.com/hotmeshio/samples-typescript)
+ * SDK Reference – [hotmeshio.github.io/sdk-typescript](https://hotmeshio.github.io/sdk-typescript)
+ * Examples – [github.com/hotmeshio/samples-typescript](https://github.com/hotmeshio/samples-typescript)

 ---

 ## License

- Apache 2.0 – see `LICENSE` for details.
+ Apache 2.0 – See `LICENSE` for details.
@@ -15,6 +15,7 @@ declare class MemFlowWaitForError extends Error {
 workflowId: string;
 index: number;
 workflowDimension: string;
+ type: string;
 constructor(params: MemFlowWaitForErrorType);
 }
 declare class MemFlowProxyError extends Error {
@@ -31,10 +32,12 @@ declare class MemFlowProxyError extends Error {
 workflowDimension: string;
 workflowId: string;
 workflowTopic: string;
+ type: string;
 constructor(params: MemFlowProxyErrorType);
 }
 declare class MemFlowChildError extends Error {
 await: boolean;
+ entity: string;
 arguments: string[];
 backoffCoefficient: number;
 code: number;
@@ -49,6 +52,7 @@ declare class MemFlowChildError extends Error {
 parentWorkflowId: string;
 workflowId: string;
 workflowTopic: string;
+ type: string;
 constructor(params: MemFlowChildErrorType);
 }
 declare class MemFlowWaitForAllError extends Error {
@@ -61,6 +65,7 @@ declare class MemFlowWaitForAllError extends Error {
 parentWorkflowId: string;
 workflowId: string;
 workflowTopic: string;
+ type: string;
 constructor(params: MemFlowWaitForAllErrorType);
 }
 declare class MemFlowSleepError extends Error {
@@ -69,22 +74,27 @@ declare class MemFlowSleepError extends Error {
 duration: number;
 index: number;
 workflowDimension: string;
+ type: string;
 constructor(params: MemFlowSleepErrorType);
 }
 declare class MemFlowTimeoutError extends Error {
 code: number;
+ type: string;
 constructor(message: string, stack?: string);
 }
 declare class MemFlowMaxedError extends Error {
 code: number;
+ type: string;
 constructor(message: string, stackTrace?: string);
 }
 declare class MemFlowFatalError extends Error {
 code: number;
+ type: string;
 constructor(message: string, stackTrace?: string);
 }
 declare class MemFlowRetryError extends Error {
 code: number;
+ type: string;
 constructor(message: string, stackTrace?: string);
 }
 declare class MapDataError extends Error {
@@ -19,6 +19,7 @@ exports.SetStateError = SetStateError;
 class MemFlowWaitForError extends Error {
 constructor(params) {
 super(`WaitFor Interruption`);
+ this.type = 'MemFlowWaitForError';
 this.signalId = params.signalId;
 this.index = params.index;
 this.workflowDimension = params.workflowDimension;
@@ -29,6 +30,7 @@ exports.MemFlowWaitForError = MemFlowWaitForError;
 class MemFlowProxyError extends Error {
 constructor(params) {
 super(`ProxyActivity Interruption`);
+ this.type = 'MemFlowProxyError';
 this.arguments = params.arguments;
 this.workflowId = params.workflowId;
 this.workflowTopic = params.workflowTopic;
@@ -48,6 +50,7 @@ exports.MemFlowProxyError = MemFlowProxyError;
 class MemFlowChildError extends Error {
 constructor(params) {
 super(`ExecChild Interruption`);
+ this.type = 'MemFlowChildError';
 this.arguments = params.arguments;
 this.workflowId = params.workflowId;
 this.workflowTopic = params.workflowTopic;
@@ -56,6 +59,7 @@ class MemFlowChildError extends Error {
 this.persistent = params.persistent;
 this.signalIn = params.signalIn;
 this.originJobId = params.originJobId;
+ this.entity = params.entity;
 this.index = params.index;
 this.workflowDimension = params.workflowDimension;
 this.code = enums_1.HMSH_CODE_MEMFLOW_CHILD;
@@ -69,6 +73,7 @@ exports.MemFlowChildError = MemFlowChildError;
 class MemFlowWaitForAllError extends Error {
 constructor(params) {
 super(`Collation Interruption`);
+ this.type = 'MemFlowWaitForAllError';
 this.items = params.items;
 this.size = params.size;
 this.workflowId = params.workflowId;
@@ -84,6 +89,7 @@ exports.MemFlowWaitForAllError = MemFlowWaitForAllError;
 class MemFlowSleepError extends Error {
 constructor(params) {
 super(`SleepFor Interruption`);
+ this.type = 'MemFlowSleepError';
 this.duration = params.duration;
 this.workflowId = params.workflowId;
 this.index = params.index;
@@ -95,6 +101,7 @@ exports.MemFlowSleepError = MemFlowSleepError;
 class MemFlowTimeoutError extends Error {
 constructor(message, stack) {
 super(message);
+ this.type = 'MemFlowTimeoutError';
 if (this.stack) {
 this.stack = stack;
 }
@@ -105,6 +112,7 @@ exports.MemFlowTimeoutError = MemFlowTimeoutError;
 class MemFlowMaxedError extends Error {
 constructor(message, stackTrace) {
 super(message);
+ this.type = 'MemFlowMaxedError';
 if (stackTrace) {
 this.stack = stackTrace;
 }
@@ -115,6 +123,7 @@ exports.MemFlowMaxedError = MemFlowMaxedError;
 class MemFlowFatalError extends Error {
 constructor(message, stackTrace) {
 super(message);
+ this.type = 'MemFlowFatalError';
 if (stackTrace) {
 this.stack = stackTrace;
 }
@@ -125,6 +134,7 @@ exports.MemFlowFatalError = MemFlowFatalError;
 class MemFlowRetryError extends Error {
 constructor(message, stackTrace) {
 super(message);
+ this.type = 'MemFlowRetryError';
 if (stackTrace) {
 this.stack = stackTrace;
 }
@@ -1,10 +1,10 @@
 {
 "name": "@hotmeshio/hotmesh",
- "version": "0.4.2",
+ "version": "0.5.1",
 "description": "Permanent-Memory Workflows & AI Agents",
 "main": "./build/index.js",
 "types": "./build/index.d.ts",
- "homepage": "https://hotmesh.io/",
+ "homepage": "https://github.com/hotmeshio/sdk-typescript/",
 "publishConfig": {
 "access": "public"
 },
@@ -31,7 +31,7 @@ declare class Trigger extends Activity {
 getJobStatus(): number;
 resolveJobId(context: Partial<JobState>): string;
 resolveJobKey(context: Partial<JobState>): string;
- setStateNX(status?: number, entity?: string): Promise<void>;
+ setStateNX(status?: number, entity?: string | undefined): Promise<void>;
 setStats(transaction?: ProviderTransaction): Promise<void>;
 }
 export { Trigger };
@@ -9,6 +9,7 @@ const reporter_1 = require("../reporter");
 const serializer_1 = require("../serializer");
 const telemetry_1 = require("../telemetry");
 const activity_1 = require("./activity");
+ const mapper_1 = require("../mapper");
 class Trigger extends activity_1.Activity {
 constructor(config, data, metadata, hook, engine, context) {
 super(config, data, metadata, hook, engine, context);
@@ -27,7 +28,9 @@ class Trigger extends activity_1.Activity {
 this.mapJobData();
 this.adjacencyList = await this.filterAdjacent();
 const initialStatus = this.initStatus(options, this.adjacencyList.length);
- await this.setStateNX(initialStatus, options?.entity);
+ // config.entity is a pipe expression; if 'entity' exists, it will resolve
+ const resolvedEntity = new mapper_1.MapperService({ entity: this.config.entity }, this.context).mapRules()?.entity;
+ await this.setStateNX(initialStatus, options?.entity || resolvedEntity);
 await this.setStatus(initialStatus);
 this.bindSearchData(options);
 this.bindMarkerData(options);
@@ -6,6 +6,7 @@ import { Entity } from './entity';
 import { WorkerService } from './worker';
 import { WorkflowService } from './workflow';
 import { WorkflowHandleService } from './handle';
+ import { didInterrupt } from './workflow/interruption';
 /**
 * The MemFlow service is a collection of services that
 * emulate Temporal's capabilities, but instead are
@@ -106,6 +107,13 @@ declare class MemFlowClass {
 * including: `execChild`, `waitFor`, `sleep`, etc
 */
 static workflow: typeof WorkflowService;
+ /**
+ * Checks if an error is a HotMesh reserved error type that indicates
+ * a workflow interruption rather than a true error condition.
+ *
+ * @see {@link utils/interruption.didInterrupt} for detailed documentation
+ */
+ static didInterrupt: typeof didInterrupt;
 /**
 * Shutdown everything. All connections, workers, and clients will be closed.
 * Include in your signal handlers to ensure a clean shutdown.
@@ -9,6 +9,7 @@ const entity_1 = require("./entity");
 const worker_1 = require("./worker");
 const workflow_1 = require("./workflow");
 const handle_1 = require("./handle");
+ const interruption_1 = require("./workflow/interruption");
 /**
 * The MemFlow service is a collection of services that
 * emulate Temporal's capabilities, but instead are
@@ -120,3 +121,10 @@ MemFlowClass.Worker = worker_1.WorkerService;
 * including: `execChild`, `waitFor`, `sleep`, etc
 */
 MemFlowClass.workflow = workflow_1.WorkflowService;
+ /**
+ * Checks if an error is a HotMesh reserved error type that indicates
+ * a workflow interruption rather than a true error condition.
+ *
+ * @see {@link utils/interruption.didInterrupt} for detailed documentation
+ */
+ MemFlowClass.didInterrupt = interruption_1.didInterrupt;
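
*Editor's note:* this release adds a `type` discriminator to every MemFlow error class and exposes `MemFlow.didInterrupt` for distinguishing workflow interruptions from real failures. The sketch below is my own minimal re-implementation of such a check, based only on the `type` values visible in this diff; the shipped `didInterrupt` lives in `./workflow/interruption` and may differ.

```ts
// Minimal sketch of an interruption check based on the `type` field
// this diff adds to each MemFlow error class (illustrative only).
const INTERRUPTION_TYPES = new Set([
  'MemFlowWaitForError',
  'MemFlowProxyError',
  'MemFlowChildError',
  'MemFlowWaitForAllError',
  'MemFlowSleepError',
]);

function didInterruptSketch(err: unknown): boolean {
  return (
    err instanceof Error &&
    INTERRUPTION_TYPES.has((err as Error & { type?: string }).type ?? '')
  );
}

// Simulate the error a sleepFor interruption would raise:
const sleepInterrupt = Object.assign(new Error('SleepFor Interruption'), {
  type: 'MemFlowSleepError',
});
console.log(didInterruptSketch(sleepInterrupt)); // true
console.log(didInterruptSketch(new Error('boom'))); // false
```

With a check like this, a workflow's `catch` block can rethrow interruptions so the engine can replay, while handling genuine application errors normally.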
@@ -17,7 +17,7 @@
 * * Service Meshes
 * * Master Data Management systems
 */
- declare const APP_VERSION = "4";
+ declare const APP_VERSION = "5";
 declare const APP_ID = "memflow";
 /**
 * returns a new memflow workflow schema
@@ -20,7 +20,7 @@
 */
 Object.defineProperty(exports, "__esModule", { value: true });
 exports.APP_ID = exports.APP_VERSION = exports.getWorkflowYAML = void 0;
- const APP_VERSION = '4';
+ const APP_VERSION = '5';
 exports.APP_VERSION = APP_VERSION;
 const APP_ID = 'memflow';
 exports.APP_ID = APP_ID;
@@ -86,6 +86,9 @@ const getWorkflowYAML = (app, version) => {
86
86
  signalIn:
87
87
  description: if false, the job will not support subordinated hooks
88
88
  type: boolean
89
+ entity:
90
+ description: the entity type for this workflow instance
91
+ type: string
89
92
 
90
93
  output:
91
94
  schema:
@@ -120,6 +123,7 @@ const getWorkflowYAML = (app, version) => {
120
123
  trigger:
121
124
  title: Main Flow Trigger
122
125
  type: trigger
126
+ entity: '{$self.input.data.entity}'
  job:
  maps:
  done: false
@@ -249,6 +253,9 @@ const getWorkflowYAML = (app, version) => {
  await:
  type: string
  description: when set to false, do not await the child flow's completion
+ entity:
+ type: string
+ description: the entity type for the child workflow
  591:
  schema:
  type: object
@@ -367,6 +374,9 @@ const getWorkflowYAML = (app, version) => {
  description: the arguments to pass to the activity
  items:
  type: string
+ entity:
+ type: string
+ description: the entity type for the child workflow
  maps:
  arguments: '{worker.output.data.arguments}'
  workflowDimension: '{worker.output.data.workflowDimension}'
@@ -379,6 +389,7 @@ const getWorkflowYAML = (app, version) => {
  workflowId: '{worker.output.data.workflowId}'
  workflowName: '{worker.output.data.workflowName}'
  workflowTopic: '{worker.output.data.workflowTopic}'
+ entity: '{worker.output.data.entity}'
  backoffCoefficient:
  '@pipe':
  - ['{worker.output.data.backoffCoefficient}','{trigger.output.data.backoffCoefficient}']
@@ -992,6 +1003,9 @@ const getWorkflowYAML = (app, version) => {
  await:
  type: string
  description: when set to false, do not await the child flow's completion
+ entity:
+ type: string
+ description: the entity type for the child workflow
  591:
  schema:
  type: object
@@ -1109,6 +1123,9 @@ const getWorkflowYAML = (app, version) => {
  description: the arguments to pass to the activity
  items:
  type: string
+ entity:
+ type: string
+ description: the entity type for the child workflow
  maps:
  arguments: '{signaler_worker.output.data.arguments}'
  workflowDimension: '{signaler_worker.output.data.workflowDimension}'
@@ -1121,6 +1138,7 @@ const getWorkflowYAML = (app, version) => {
  workflowId: '{signaler_worker.output.data.workflowId}'
  workflowName: '{signaler_worker.output.data.workflowName}'
  workflowTopic: '{signaler_worker.output.data.workflowTopic}'
+ entity: '{signaler_worker.output.data.entity}'
  backoffCoefficient:
  '@pipe':
  - ['{signaler_worker.output.data.backoffCoefficient}','{trigger.output.data.backoffCoefficient}']
@@ -1875,6 +1893,9 @@ const getWorkflowYAML = (app, version) => {
  description: the arguments to pass to the activity
  items:
  type: string
+ entity:
+ type: string
+ description: the entity type for the child workflow
  maps:
  arguments:
  '@pipe':
@@ -1946,6 +1967,11 @@ const getWorkflowYAML = (app, version) => {
  - ['{collator_trigger.output.data.items}', '{collator_cycle_hook.output.data.cur_index}']
  - ['{@array.get}', maximumInterval]
  - ['{@object.get}']
+ entity:
+ '@pipe':
+ - ['{collator_trigger.output.data.items}', '{collator_cycle_hook.output.data.cur_index}']
+ - ['{@array.get}', entity]
+ - ['{@object.get}']
  output:
  schema:
  type: object
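The new collator `entity` mapping above composes `{@array.get}` and `{@object.get}`: the current item is selected by index, then its `entity` field is read. A minimal sketch of that composition in plain TypeScript (names and item shape are illustrative, not HotMesh internals):

```typescript
// Stand-in for a collator item; 'entity' is the field the new mapping reads.
type CollatorItem = { entity?: string; maximumInterval?: string };

// Equivalents of the '@array.get' and '@object.get' pipe functions.
const arrayGet = <T>(items: T[], index: number): T => items[index];
const objectGet = <T, K extends keyof T>(obj: T, key: K): T[K] => obj[key];

const items: CollatorItem[] = [
  { entity: 'order', maximumInterval: '120s' },
  { entity: 'invoice', maximumInterval: '60s' },
];

// Equivalent of: ['{items}', '{cur_index}'] -> '{@array.get}' -> 'entity' -> '{@object.get}'
const resolveEntity = (curIndex: number): string | undefined =>
  objectGet(arrayGet(items, curIndex), 'entity');

console.log(resolveEntity(1)); // 'invoice'
```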
@@ -322,6 +322,31 @@ class WorkerService {
  const workflowResponse = await storage_1.asyncLocalStorage.run(context, async () => {
  return await workflowFunction.apply(this, workflowInput.arguments);
  });
+ // If the embedded function has a try/catch, it can intercept the interruption throw.
+ // Rethrow here to interrupt the workflow if the embedded function caught and suppressed it.
+ if (interruptionRegistry.length > 0) {
+ const payload = interruptionRegistry[0];
+ switch (payload.type) {
+ case 'MemFlowWaitForError':
+ throw new errors_1.MemFlowWaitForError(payload);
+ case 'MemFlowProxyError':
+ throw new errors_1.MemFlowProxyError(payload);
+ case 'MemFlowChildError':
+ throw new errors_1.MemFlowChildError(payload);
+ case 'MemFlowSleepError':
+ throw new errors_1.MemFlowSleepError(payload);
+ case 'MemFlowTimeoutError':
+ throw new errors_1.MemFlowTimeoutError(payload.message, payload.stack);
+ case 'MemFlowMaxedError':
+ throw new errors_1.MemFlowMaxedError(payload.message, payload.stack);
+ case 'MemFlowFatalError':
+ throw new errors_1.MemFlowFatalError(payload.message, payload.stack);
+ case 'MemFlowRetryError':
+ throw new errors_1.MemFlowRetryError(payload.message, payload.stack);
+ default:
+ throw new errors_1.MemFlowRetryError(`Unknown interruption type: ${payload.type}`);
+ }
+ }
  return {
  code: 200,
  status: stream_1.StreamStatus.SUCCESS,
@@ -431,6 +456,7 @@ class WorkerService {
  maximumAttempts: err.maximumAttempts || enums_1.HMSH_MEMFLOW_MAX_ATTEMPTS,
  maximumInterval: err.maximumInterval || (0, utils_1.s)(enums_1.HMSH_MEMFLOW_MAX_INTERVAL),
  originJobId: err.originJobId,
+ entity: err.entity,
  parentWorkflowId: err.parentWorkflowId,
  expire: err.expire,
  persistent: err.persistent,
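The guard added to `WorkerService` exists because HotMesh drives replay by throwing reserved interruption errors; a user-level `try/catch` inside the workflow function can accidentally swallow them. Since the interruption is also recorded in a registry, the engine can replay it after the function returns. A self-contained sketch of that pattern (stand-in error classes and names, not the package's real ones):

```typescript
// Stand-in interruption types; the real ones live in modules/errors.
class SleepInterruption extends Error {}
class WaitInterruption extends Error {}

type Interruption = { type: 'Sleep' | 'Wait'; message: string };
const registry: Interruption[] = [];

// Engine side: run the workflow fn, then re-raise any registered interruption
// that a user-level try/catch may have suppressed.
function runWorkflow(fn: () => void): void {
  fn();
  if (registry.length > 0) {
    const payload = registry[0];
    switch (payload.type) {
      case 'Sleep': throw new SleepInterruption(payload.message);
      case 'Wait': throw new WaitInterruption(payload.message);
    }
  }
}

// API side: register the interruption, then throw it.
function sleepFor(): never {
  registry.push({ type: 'Sleep', message: 'sleep requested' });
  throw new SleepInterruption('sleep requested');
}

let rethrown = false;
try {
  runWorkflow(() => {
    try { sleepFor(); } catch { /* user code accidentally swallows it */ }
  });
} catch (e) {
  rethrown = e instanceof SleepInterruption;
}
console.log(rethrown); // true — the engine still saw the interruption
```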
@@ -22,7 +22,7 @@ function getChildInterruptPayload(context, options, execIndex) {
  }
  const parentWorkflowId = workflowId;
  const taskQueueName = options.taskQueue ?? options.entity;
- const workflowName = options.entity ?? options.workflowName;
+ const workflowName = options.taskQueue ? options.workflowName : (options.entity ?? options.workflowName);
  const workflowTopic = `${taskQueueName}-${workflowName}`;
  return {
  arguments: [...(options.args || [])],
@@ -32,6 +32,7 @@ function getChildInterruptPayload(context, options, execIndex) {
  maximumAttempts: options?.config?.maximumAttempts ?? common_1.HMSH_MEMFLOW_MAX_ATTEMPTS,
  maximumInterval: (0, common_1.s)(options?.config?.maximumInterval ?? common_1.HMSH_MEMFLOW_MAX_INTERVAL),
  originJobId: originJobId ?? workflowId,
+ entity: options.entity,
  expire: options.expire ?? expire,
  persistent: options.persistent,
  signalIn: options.signalIn,
@@ -81,6 +82,7 @@ async function execChild(options) {
  const interruptionMessage = getChildInterruptPayload(context, options, execIndex);
  interruptionRegistry.push({
  code: common_1.HMSH_CODE_MEMFLOW_CHILD,
+ type: 'MemFlowChildError',
  ...interruptionMessage,
  });
  await (0, common_1.sleepImmediate)();
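The one-line change in `getChildInterruptPayload` alters how the child workflow topic is derived: previously `entity` overrode `workflowName` even when an explicit `taskQueue` was supplied; now an explicit `taskQueue` preserves the caller's `workflowName`, and `entity` only substitutes when no `taskQueue` is given. A runnable sketch of the new resolution (names illustrative):

```typescript
type ChildOptions = { taskQueue?: string; entity?: string; workflowName?: string };

// Mirrors the 0.5.x resolution shown in the hunk above.
function resolveTopic(options: ChildOptions): string {
  const taskQueueName = options.taskQueue ?? options.entity;
  const workflowName = options.taskQueue
    ? options.workflowName
    : (options.entity ?? options.workflowName);
  return `${taskQueueName}-${workflowName}`;
}

// entity alone still doubles as both queue and workflow name:
console.log(resolveTopic({ entity: 'order' })); // 'order-order'
// an explicit taskQueue now keeps the caller's workflowName
// (0.4.x would have produced 'q1-order'):
console.log(resolveTopic({ taskQueue: 'q1', workflowName: 'ship', entity: 'order' })); // 'q1-ship'
```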
@@ -3,8 +3,8 @@ import { HookOptions } from './common';
  * Extended hook options that include signal configuration
  */
  export interface ExecHookOptions extends HookOptions {
- /** Signal ID to send after hook execution */
- signalId: string;
+ /** Signal ID to send after hook execution; if not provided, a random one will be generated */
+ signalId?: string;
  }
  /**
  * Executes a hook function and awaits the signal response.
@@ -3,6 +3,7 @@ Object.defineProperty(exports, "__esModule", { value: true });
  exports.execHook = void 0;
  const hook_1 = require("./hook");
  const waitFor_1 = require("./waitFor");
+ const interruption_1 = require("./interruption");
  /**
  * Executes a hook function and awaits the signal response.
  * This is a convenience method that combines `hook()` and `waitFor()` operations.
@@ -60,14 +61,23 @@ const waitFor_1 = require("./waitFor");
  * ```
  */
  async function execHook(options) {
- // Create hook options with signal field added to args
- const hookOptions = {
- ...options,
- args: [...options.args, { signal: options.signalId }]
- };
- // Execute the hook with the signal information
- await (0, hook_1.hook)(hookOptions);
- // Wait for the signal response and return it
- return await (0, waitFor_1.waitFor)(options.signalId);
+ try {
+ if (!options.signalId) {
+ options.signalId = 'memflow-hook-' + crypto.randomUUID();
+ }
+ const hookOptions = {
+ ...options,
+ args: [...options.args, { signal: options.signalId, $memflow: true }]
+ };
+ // Execute the hook with the signal information
+ await (0, hook_1.hook)(hookOptions);
+ // Wait for the signal response and return it
+ return await (0, waitFor_1.waitFor)(options.signalId);
+ }
+ catch (error) {
+ if ((0, interruption_1.didInterrupt)(error)) {
+ throw error;
+ }
+ }
  }
  exports.execHook = execHook;
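`execHook` now treats `signalId` as optional: when omitted, it mints a unique ID with a `memflow-hook-` prefix via `crypto.randomUUID()`. A small sketch of that defaulting behavior in isolation (the helper name is illustrative, not part of the package API):

```typescript
import { randomUUID } from 'node:crypto';

// Mirrors the defaulting logic in execHook above: callers may omit signalId,
// and a unique, recognizably prefixed one is generated in place.
function ensureSignalId(options: { signalId?: string }): string {
  if (!options.signalId) {
    options.signalId = 'memflow-hook-' + randomUUID();
  }
  return options.signalId;
}

console.log(ensureSignalId({}).startsWith('memflow-hook-')); // true
console.log(ensureSignalId({ signalId: 'my-signal' })); // 'my-signal'
```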
@@ -0,0 +1,26 @@
+ /**
+ * Checks if an error is a HotMesh reserved error type that indicates
+ * a workflow interruption rather than a true error condition.
+ *
+ * When this returns true, you can safely return from your workflow function.
+ * The workflow engine will handle the interruption automatically.
+ *
+ * @example
+ * ```typescript
+ * try {
+ * await someWorkflowOperation();
+ * } catch (error) {
+ * // Check if this is a HotMesh interruption
+ * if (didInterrupt(error)) {
+ * // Rethrow the error if HotMesh interruption
+ * throw error;
+ * }
+ * // Handle actual error
+ * console.error('Workflow failed:', error);
+ * }
+ * ```
+ *
+ * @param error - The error to check
+ * @returns true if the error is a HotMesh interruption
+ */
+ export declare function didInterrupt(error: Error): boolean;
@@ -0,0 +1,41 @@
+ "use strict";
+ Object.defineProperty(exports, "__esModule", { value: true });
+ exports.didInterrupt = void 0;
+ const errors_1 = require("../../../modules/errors");
+ /**
+ * Checks if an error is a HotMesh reserved error type that indicates
+ * a workflow interruption rather than a true error condition.
+ *
+ * When this returns true, you can safely return from your workflow function.
+ * The workflow engine will handle the interruption automatically.
+ *
+ * @example
+ * ```typescript
+ * try {
+ * await someWorkflowOperation();
+ * } catch (error) {
+ * // Check if this is a HotMesh interruption
+ * if (didInterrupt(error)) {
+ * // Rethrow the error if HotMesh interruption
+ * throw error;
+ * }
+ * // Handle actual error
+ * console.error('Workflow failed:', error);
+ * }
+ * ```
+ *
+ * @param error - The error to check
+ * @returns true if the error is a HotMesh interruption
+ */
+ function didInterrupt(error) {
+ return (error instanceof errors_1.MemFlowChildError ||
+ error instanceof errors_1.MemFlowFatalError ||
+ error instanceof errors_1.MemFlowMaxedError ||
+ error instanceof errors_1.MemFlowProxyError ||
+ error instanceof errors_1.MemFlowRetryError ||
+ error instanceof errors_1.MemFlowSleepError ||
+ error instanceof errors_1.MemFlowTimeoutError ||
+ error instanceof errors_1.MemFlowWaitForError ||
+ error instanceof errors_1.MemFlowWaitForAllError);
+ }
+ exports.didInterrupt = didInterrupt;
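The new `didInterrupt` helper is a plain `instanceof` check across the reserved MemFlow error classes, so user code can distinguish engine interruptions from genuine failures. A self-contained sketch of the same pattern with stand-in classes (the real ones live in `modules/errors`):

```typescript
// Stand-in reserved error classes for illustration only.
class MemFlowSleepError extends Error {}
class MemFlowWaitForError extends Error {}

// instanceof-based filter, as in the hunk above (abbreviated to two classes).
function didInterrupt(error: Error): boolean {
  return (
    error instanceof MemFlowSleepError ||
    error instanceof MemFlowWaitForError
  );
}

console.log(didInterrupt(new MemFlowSleepError('sleeping'))); // true
console.log(didInterrupt(new Error('real failure')));         // false
```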
@@ -67,6 +67,7 @@ function wrapActivity(activityName, options) {
  const interruptionMessage = getProxyInterruptPayload(context, activityName, execIndex, args, options);
  interruptionRegistry.push({
  code: common_1.HMSH_CODE_MEMFLOW_PROXY,
+ type: 'MemFlowProxyError',
  ...interruptionMessage,
  });
  await (0, common_1.sleepImmediate)();
@@ -43,6 +43,7 @@ async function sleepFor(duration) {
  };
  interruptionRegistry.push({
  code: common_1.HMSH_CODE_MEMFLOW_SLEEP,
+ type: 'MemFlowSleepError',
  ...interruptionMessage,
  });
  await (0, common_1.sleepImmediate)();
@@ -45,11 +45,10 @@ async function waitFor(signalId) {
  signalId,
  index: execIndex,
  workflowDimension,
- };
- interruptionRegistry.push({
+ type: 'MemFlowWaitForError',
  code: common_1.HMSH_CODE_MEMFLOW_WAIT,
- ...interruptionMessage,
- });
+ };
+ interruptionRegistry.push(interruptionMessage);
  await (0, common_1.sleepImmediate)();
  throw new common_1.MemFlowWaitForError(interruptionMessage);
  }
@@ -9,6 +9,7 @@ interface BaseActivity {
  statusThreshold?: number;
  statusThresholdType?: 'stop' | 'throw' | 'stall';
  input?: Record<string, any>;
+ entity?: string;
  output?: Record<string, any>;
  settings?: Record<string, any>;
  job?: Record<string, any>;
@@ -73,6 +74,7 @@ interface TriggerActivityStats {
  interface TriggerActivity extends BaseActivity {
  type: 'trigger';
  stats?: TriggerActivityStats;
+ entity?: string;
  }
  interface AwaitActivity extends BaseActivity {
  type: 'await';
@@ -6,6 +6,7 @@ export type MemFlowChildErrorType = {
  expire?: number;
  persistent?: boolean;
  signalIn?: boolean;
+ entity?: string;
  maximumAttempts?: number;
  maximumInterval?: number;
  originJobId: string | null;
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@hotmeshio/hotmesh",
- "version": "0.4.3",
+ "version": "0.5.1",
  "description": "Permanent-Memory Workflows & AI Agents",
  "main": "./build/index.js",
  "types": "./build/index.d.ts",