@hotmeshio/hotmesh 0.5.3 → 0.5.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (38)
  1. package/README.md +218 -249
  2. package/build/index.d.ts +1 -3
  3. package/build/index.js +1 -5
  4. package/build/modules/enums.d.ts +4 -0
  5. package/build/modules/enums.js +5 -1
  6. package/build/modules/utils.d.ts +1 -9
  7. package/build/modules/utils.js +0 -6
  8. package/build/package.json +3 -4
  9. package/build/services/connector/factory.d.ts +2 -2
  10. package/build/services/connector/factory.js +11 -8
  11. package/build/services/connector/providers/postgres.d.ts +47 -0
  12. package/build/services/connector/providers/postgres.js +107 -0
  13. package/build/services/hotmesh/index.d.ts +8 -0
  14. package/build/services/hotmesh/index.js +27 -0
  15. package/build/services/memflow/client.d.ts +1 -1
  16. package/build/services/memflow/client.js +8 -6
  17. package/build/services/memflow/worker.js +3 -0
  18. package/build/services/pipe/functions/cron.js +1 -1
  19. package/build/services/store/providers/postgres/kvtables.js +19 -6
  20. package/build/services/store/providers/postgres/postgres.js +13 -2
  21. package/build/services/stream/providers/postgres/postgres.d.ts +6 -3
  22. package/build/services/stream/providers/postgres/postgres.js +169 -59
  23. package/build/services/sub/providers/postgres/postgres.d.ts +9 -0
  24. package/build/services/sub/providers/postgres/postgres.js +109 -18
  25. package/build/services/worker/index.js +4 -0
  26. package/build/types/hotmesh.d.ts +19 -5
  27. package/build/types/index.d.ts +0 -2
  28. package/env.example +11 -0
  29. package/index.ts +0 -4
  30. package/package.json +3 -4
  31. package/build/services/meshdata/index.d.ts +0 -795
  32. package/build/services/meshdata/index.js +0 -1235
  33. package/build/services/meshos/index.d.ts +0 -293
  34. package/build/services/meshos/index.js +0 -547
  35. package/build/types/manifest.d.ts +0 -52
  36. package/build/types/manifest.js +0 -2
  37. package/build/types/meshdata.d.ts +0 -252
  38. package/build/types/meshdata.js +0 -2
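Items 31–38 in the list above carry all-negative deltas (`+0 -N`): the MeshData and MeshOS services, the manifest types, and the meshdata types were deleted outright in 0.5.5. As an illustrative sketch (the parser and helper below are hypothetical, not part of the package), a delta line from such a list can be classified like this:

```typescript
// Hypothetical helper: classify entries from a "Files changed" list.
// A "+0 -N" delta marks a file that was removed outright in the new version.
type FileDelta = { path: string; added: number; removed: number };

function parseDelta(line: string): FileDelta {
  const match = line.match(/^(\S+) \+(\d+) -(\d+)$/);
  if (!match) throw new Error(`Unrecognized delta line: ${line}`);
  return { path: match[1], added: Number(match[2]), removed: Number(match[3]) };
}

function isRemovedFile(delta: FileDelta): boolean {
  // No lines added and some removed: the whole file was deleted
  return delta.added === 0 && delta.removed > 0;
}
```

Consumers still importing `MeshData` or `MeshOS` will need to migrate before upgrading; the `index.d.ts` and `index.js` diffs below show both exports dropped.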
package/README.md CHANGED
@@ -1,52 +1,135 @@
  # HotMesh
 
- **🧠 Workflow That Remembers**
+ **Durable Memory + Coordinated Execution**
 
- ![beta release](https://img.shields.io/badge/release-beta-blue.svg) ![made with typescript](https://img.shields.io/badge/built%20with-typescript-lightblue.svg)
+ ![beta release](https://img.shields.io/badge/release-beta-blue.svg) ![made with typescript](https://img.shields.io/badge/built%20with-typescript-lightblue.svg)
 
- HotMesh brings a **memory model** to your automation: durable entities that hold context, support concurrency, and evolve across runs. Built on PostgreSQL, it treats your database not just as storage—but as the runtime hub for agents, pipelines, and long-lived processes.
+ HotMesh removes the repetitive glue of building durable agents, pipelines, and long‑running workflows. Instead of you designing queues, schedulers, cache layers, and ad‑hoc state stores for each agent or pipeline, HotMesh standardizes the pattern:
 
- Use HotMesh to:
+ * **Entity (Core Memory)**: The authoritative JSONB document + its indexable “surface” fields.
+ * **Hooks (Durable Units of Work)**: Re‑entrant, idempotent functions that *maintain* the entity over time.
+ * **Workflow (Coordinator)**: The thin orchestration entry that seeds state, spawns hooks, and optionally synthesizes results.
+ * **Commands (State Mutation API)**: Atomic `set / merge / append / increment / tag / signal` updates with optimistic invariants handled by Postgres transactions.
 
- * **Store Evolving State** Retain memory/state between executions
- * **Coordinate Distributed Work** – Safely allow multiple workers to act on shared state
- * **Track and Replay** – Full audit history and replay support by default
+ You focus on *what should change in memory*; HotMesh handles *how it changes safely and durably.*
+
+ ---
+
+ ## Why It’s Easier with HotMesh
+
+ | Problem You Usually Solve Manually | HotMesh Built‑In | Impact |
+ | --------------------------------------------------- | ----------------------------------------- | ---------------------------------------------- |
+ | Designing per‑agent persistence and caches | Unified JSONB entity + typed accessors | One memory model across agents/pipelines |
+ | Preventing race conditions on shared state | Transactional hook writes | Safe parallel maintenance |
+ | Coordinating multi-perspective / multi-step work | Hook spawning + signals | Decomposed work without orchestration glue |
+ | Schema evolution / optional fields | Flexible JSONB + selective indexes | Add / adapt state incrementally |
+ | Querying live pipeline / agent status | SQL over materialized surfaces | Operational observability using standard tools |
+ | Avoiding duplicate side-effects during retry/replay | Deterministic re‑entry + idempotent hooks | Simplifies error handling |
+ | Per‑tenant isolation | Schema (or prefix) scoping | Clean multi‑tenant boundary |
+ | Background progression / fan‑out | `execHook` + signals | Natural concurrency without queue plumbing |
+
+ ---
+
+ ## Core Abstractions
+
+ ### 1. Entities
+
+ Durable JSONB documents representing *process memory*. Each entity:
+
+ * Has a stable identity (`workflowId` / logical key).
+ * Evolves via atomic commands.
+ * Is versioned implicitly by transactional history.
+ * Can be partially indexed for targeted query performance.
+
+ > **Design Note:** Treat entity shape as *contractual surface* + *freeform interior*. Index only the minimal surface required for lookups or dashboards.
+
+ ### 2. Hooks
+
+ Re‑entrant, idempotent, interruptible units of work that *maintain* an entity. Hooks can:
+
+ * Start, stop, or be re‑invoked without corrupting state.
+ * Run concurrently (Postgres ensures isolation on write).
+ * Emit signals to let coordinators or sibling hooks know a perspective / phase completed.
+
+ ### 3. Workflow Coordinators
+
+ Thin entrypoints that:
+
+ * Seed initial entity state.
+ * Fan out perspective / phase hooks.
+ * Optionally synthesize or finalize.
+ * Return a snapshot (often the final entity state) — *the workflow result is just memory*.
+
+ ### 4. Commands (Entity Mutation Primitives)
+
+ | Command | Purpose | Example |
+ | ----------- | ----------------------------------------- | ------------------------------------------------ |
+ | `set` | Replace full value (first write or reset) | `await e.set({ user: { id: 123, name: "John" } })` |
+ | `merge` | Deep JSON merge | `await e.merge({ user: { email: "john@example.com" } })` |
+ | `append` | Append to an array field | `await e.append('items', { id: 1, name: "New Item" })` |
+ | `prepend` | Add to start of array field | `await e.prepend('items', { id: 0, name: "First Item" })` |
+ | `remove` | Remove item from array by index | `await e.remove('items', 0)` |
+ | `increment` | Numeric counters / progress | `await e.increment('counter', 5)` |
+ | `toggle` | Toggle boolean value | `await e.toggle('settings.enabled')` |
+ | `setIfNotExists` | Set value only if path doesn't exist | `await e.setIfNotExists('user.id', 123)` |
+ | `delete` | Remove field at specified path | `await e.delete('user.email')` |
+ | `get` | Read value at path (or full entity) | `await e.get('user.email')` |
+ | `signal` | Mark hook milestone / unlock waiters | `await MemFlow.workflow.signal('phase-x', data)` |
+
+ The Entity module also provides static methods for cross-entity querying:
+
+ ```typescript
+ // Find entities matching conditions
+ const activeUsers = await Entity.find('user', {
+ status: 'active',
+ country: 'US'
+ });
+
+ // Find by specific field condition
+ const highValueOrders = await Entity.findByCondition(
+ 'order',
+ 'total_amount',
+ 1000,
+ '>=',
+ hotMeshClient
+ );
+
+ // Find single entity by ID
+ const user = await Entity.findById('user', 'user123', hotMeshClient);
+
+ // Create optimized index for queries
+ await Entity.createIndex('user', 'email', hotMeshClient);
+ ```
 
  ---
 
  ## Table of Contents
 
  1. [Quick Start](#quick-start)
- 2. [Permanent Memory Architecture](#permanent-memory-architecture)
+ 2. [Memory Architecture](#memory-architecture)
  3. [Durable AI Agents](#durable-ai-agents)
- 4. [Building Pipelines with State](#building-pipelines-with-state)
- 5. [Documentation & Links](#documentation--links)
+ 4. [Stateful Pipelines](#stateful-pipelines)
+ 5. [Indexing Strategy](#indexing-strategy)
+ 6. [Operational Notes](#operational-notes)
+ 7. [Documentation & Links](#documentation--links)
 
  ---
 
  ## Quick Start
 
- ### Prerequisites
-
- * PostgreSQL (or Supabase)
- * Node.js 16+
-
  ### Install
 
  ```bash
  npm install @hotmeshio/hotmesh
  ```
 
- ### Connect to a Database
-
- HotMesh leverages Temporal.io's developer-friendly syntax for authoring workers, workflows, and clients. The `init` and `start` methods should look familiar.
-
+ ### Minimal Setup
  ```ts
  import { MemFlow } from '@hotmeshio/hotmesh';
  import { Client as Postgres } from 'pg';
 
  async function main() {
- // MemFlow will auto-provision the database upon init
+ // Auto-provisions required tables/index scaffolding on first run
  const mf = await MemFlow.init({
  appId: 'my-app',
  engine: {
@@ -57,302 +140,188 @@ async function main() {
  }
  });
 
- // Start a workflow with an assigned ID and arguments
+ // Start a durable research agent (entity-backed workflow)
  const handle = await mf.workflow.start({
  entity: 'research-agent',
  workflowName: 'researchAgent',
  workflowId: 'agent-session-jane-001',
- args: ['What are the long-term impacts of renewable energy subsidies?'],
+ args: ['Long-term impacts of renewable energy subsidies'],
  taskQueue: 'agents'
  });
 
- console.log('Result:', await handle.result());
+ console.log('Final Memory Snapshot:', await handle.result());
  }
 
  main().catch(console.error);
  ```
 
- ### System Benefits
-
- * **No Setup Required** – Tables and indexes are provisioned automatically
- * **Shared State** – Every worker shares access to the same entity memory
- * **Coordination by Design** PostgreSQL handles consistency and isolation
- * **Tenant Isolation** – Each app maintains its own schema
- * **Scalable Defaults** – Partitioned tables and index support included
+ ### Value Checklist (What You Did *Not* Have To Do)
+ - Create tables / migrations
+ - Define per-agent caches
+ - Implement optimistic locking
+ - Build a queue fan‑out mechanism
+ - Hand-roll replay protection
 
  ---
 
- ## Permanent Memory Architecture
-
- Every workflow in HotMesh is backed by an "entity": a versioned, JSONB record that tracks its memory and state transitions.
-
- * **Entities** – Represent long-lived state for a workflow or agent
- * **Commands** – Modify state with methods like `set`, `merge`, `append`, `increment`
- * **Consistency** – All updates are transactional with Postgres
- * **Replay Safety** – Protects against duplicated side effects during re-execution
- * **Partial Indexing** – Optimized querying of fields within large JSON structures
+ ## Memory Architecture
+ Each workflow = **1 durable entity**. Hooks are stateless functions *shaped by* that entity's evolving JSON. You can inspect or modify it at any time using ordinary SQL or the provided API.
 
- ### Example: Partial Index for Premium Users
+ ### Programmatic Indexing
+ ```ts
+ // Create index for premium research agents
+ await MemFlow.Entity.createIndex('research-agent', 'isPremium', hotMeshClient);
+
+ // Find premium agents needing verification
+ const agents = await MemFlow.Entity.find('research-agent', {
+ isPremium: true,
+ needsVerification: true
+ }, hotMeshClient);
+ ```
 
+ ### Direct SQL Access
  ```sql
- -- Index only those user entities that are marked as premium
- CREATE INDEX idx_user_premium ON your_app.jobs (id)
- WHERE entity = 'user' AND (context->>'isPremium')::boolean = true;
+ -- Same index via SQL (more control over index type/conditions)
+ CREATE INDEX idx_research_agents_premium ON my_app.jobs (id)
+ WHERE entity = 'research-agent' AND (context->>'isPremium')::boolean = true;
+
+ -- Ad hoc query example
+ SELECT id, context->>'status' as status, context->>'confidence' as confidence
+ FROM my_app.jobs
+ WHERE entity = 'research-agent'
+ AND (context->>'isPremium')::boolean = true
+ AND (context->>'confidence')::numeric > 0.8;
  ```
 
- This index improves performance for filtered queries while reducing index size.
+ **Guidelines:**
+ 1. *Model intent, not mechanics.* Keep ephemeral calculation artifacts minimal; store derived values only if reused.
+ 2. *Index sparingly.* Each index is a write amplification cost. Start with 1–2 selective partial indexes.
+ 3. *Keep arrays append‑only where possible.* Supports audit and replay semantics cheaply.
+ 4. *Choose your tool:* Use Entity methods for standard queries, raw SQL for complex analytics or custom indexes.
 
  ---
 
  ## Durable AI Agents
+ Agents become simpler: the *agent* is the memory record; hooks supply perspectives, verification, enrichment, or lifecycle progression.
 
- Agents often require memory—context that persists between invocations, spans multiple perspectives, or outlives a single process.
-
- The following example builds a "research agent" that executes hooks with different perspectives and then synthesizes. The data-first approach sets up initial state and then uses temporary hook functions to augment over the lifecycle of the entity record.
-
- ### Research Agent Example
-
- #### Main Coordinator Agent
-
+ ### Coordinator (Research Agent)
  ```ts
- export async function researchAgent(query: string): Promise<any> {
- const agent = await MemFlow.workflow.entity();
+ export async function researchAgent(query: string) {
+ const entity = await MemFlow.workflow.entity();
 
- // Set up shared memory for this agent session
- const initialState = {
+ const initial = {
  query,
  findings: [],
  perspectives: {},
  confidence: 0,
  verification: {},
  status: 'researching',
- startTime: new Date().toISOString(),
- }
- await agent.set<typeof initialState>(initialState);
-
- // Launch perspective hooks in parallel
- await MemFlow.workflow.execHook({
- taskQueue: 'agents',
- workflowName: 'optimisticPerspective',
- args: [query],
- signalId: 'optimistic-complete'
- });
-
- await MemFlow.workflow.execHook({
- taskQueue: 'agents',
- workflowName: 'skepticalPerspective',
- args: [query],
- signalId: 'skeptical-complete'
- });
+ startTime: new Date().toISOString()
+ };
+ await entity.set<typeof initial>(initial);
 
- await MemFlow.workflow.execHook({
- taskQueue: 'agents',
- workflowName: 'verificationHook',
- args: [query],
- signalId: 'verification-complete'
- });
-
- await MemFlow.workflow.execHook({
- taskQueue: 'perspectives',
- workflowName: 'synthesizePerspectives',
- args: [],
- signalId: 'synthesis-complete',
- });
+ // Fan-out perspectives
+ await MemFlow.workflow.execHook({ taskQueue: 'agents', workflowName: 'optimisticPerspective', args: [query], signalId: 'optimistic-complete' });
+ await MemFlow.workflow.execHook({ taskQueue: 'agents', workflowName: 'skepticalPerspective', args: [query], signalId: 'skeptical-complete' });
+ await MemFlow.workflow.execHook({ taskQueue: 'agents', workflowName: 'verificationHook', args: [query], signalId: 'verification-complete' });
+ await MemFlow.workflow.execHook({ taskQueue: 'agents', workflowName: 'synthesizePerspectives', args: [], signalId: 'synthesis-complete' });
 
- // return analysis, verification, and synthesis
- return await agent.get();
+ return await entity.get();
  }
  ```
 
- > 💡 A complete implementation of this Research Agent example with tests, OpenAI integration, and multi-perspective analysis can be found in the [agent test suite](https://github.com/hotmeshio/sdk-typescript/tree/main/tests/memflow/agent).
-
- #### Perspective Hooks
-
+ ### Synthesis Hook
  ```ts
- // Optimistic hook looks for affirming evidence
- export async function optimisticPerspective(query: string, config: {signal: string}): Promise<void> {
- const entity = await MemFlow.workflow.entity();
- const findings = await searchForSupportingEvidence(query);
- await entity.merge({ perspectives: { optimistic: { findings, confidence: 0.8 }}});
- //signal the caller to notify all done
- await MemFlow.workflow.signal(config.signal, {});
- }
-
- // Skeptical hook seeks out contradictions and counterpoints
- export async function skepticalPerspective(query: string, config: {signal: string}): Promise<void> {
- const entity = await MemFlow.workflow.entity();
- const counterEvidence = await searchForCounterEvidence(query);
- await entity.merge({ perspectives: { skeptical: { counterEvidence, confidence: 0.6 }}});
- await MemFlow.workflow.signal(config.signal, {});
- }
-
- // Other hooks...
- ```
-
- #### Synthesis Hook
+ export async function synthesizePerspectives({ signal }: { signal: string }) {
+ const e = await MemFlow.workflow.entity();
+ const ctx = await e.get();
 
- ```ts
- // Synthesis hook aggregates different viewpoints
- export async function synthesizePerspectives(config: {signal: string}): Promise<void> {
- const entity = await MemFlow.workflow.entity();
- const context = await entity.get();
-
- const result = await analyzePerspectives(context.perspectives);
-
- await entity.merge({
+ const synthesized = await analyzePerspectives(ctx.perspectives);
+ await e.merge({
  perspectives: {
  synthesis: {
- finalAssessment: result,
- confidence: calculateConfidence(context.perspectives)
+ finalAssessment: synthesized,
+ confidence: calculateConfidence(ctx.perspectives)
  }
  },
  status: 'completed'
  });
- await MemFlow.workflow.signal(config.signal, {});
+ await MemFlow.workflow.signal(signal, {});
  }
  ```
 
- ---
-
- ## Building Pipelines with State
+ > **Pattern:** Fan-out hooks that write *adjacent* subtrees (e.g., `perspectives.optimistic`, `perspectives.skeptical`). A final hook merges a compact synthesis object. Avoid cross-hook mutation of the same nested branch.
 
- HotMesh treats pipelines as long-lived records. Every pipeline run is stateful, resumable, and traceable.
+ ---
 
- ### Setup a Data Pipeline
+ ## Stateful Pipelines
+ Pipelines are identical in structure to agents: a coordinator seeds memory; phase hooks advance state; the entity is the audit trail.
 
+ ### Document Processing Pipeline (Coordinator)
  ```ts
- export async function dataPipeline(source: string): Promise<void> {
- const entity = await MemFlow.workflow.entity();
-
- // Initial policy and tracking setup
- await entity.set({
- source,
- pipeline: { version: 1, policy: { refreshInterval: '24 hours' } },
- changeLog: []
- });
-
- // Trigger the recurring orchestration pipeline
- await MemFlow.workflow.execHook({
- taskQueue: 'pipeline',
- workflowName: 'runPipeline',
- args: [true]
- });
- }
- ```
-
- ### Orchestration Hook
+ export async function documentProcessingPipeline() {
+ const pipeline = await MemFlow.workflow.entity();
 
- ```ts
- export async function runPipeline(repeat = false): Promise<void> {
- do {
- // Perform transformation step
+ const initial = {
+ documentId: `doc-${Date.now()}`,
+ status: 'started',
+ startTime: new Date().toISOString(),
+ imageRefs: [],
+ extractedInfo: [],
+ validationResults: [],
+ finalResult: null,
+ processingSteps: [],
+ errors: [],
+ pageSignals: {}
+ };
+ await pipeline.set<typeof initial>(initial);
+
+ await pipeline.merge({ status: 'loading-images' });
+ await pipeline.append('processingSteps', 'image-load-started');
+ const imageRefs = await activities.loadImagePages();
+ if (!imageRefs?.length) throw new Error('No image references found');
+ await pipeline.merge({ imageRefs });
+ await pipeline.append('processingSteps', 'image-load-completed');
+
+ // Page hooks
+ for (const [i, ref] of imageRefs.entries()) {
+ const page = i + 1;
  await MemFlow.workflow.execHook({
- taskQueue: 'transform',
- workflowName: 'cleanData',
- args: []
- });
-
- if (repeat) {
- // Schedule next execution
- await MemFlow.workflow.execHook({
- taskQueue: 'scheduler',
- workflowName: 'scheduleRefresh',
- args: []
- });
- }
- } while (repeat)
- }
-
- /**
- * Hook to clean and transform data
- */
- export async function cleanData(signalInfo?: { signal: string }): Promise<void> {
- const entity = await MemFlow.workflow.entity();
-
- // Simulate data cleaning
- await entity.merge({
- status: 'cleaning',
- lastCleanedAt: new Date().toISOString()
- });
-
- // Add to changelog
- await entity.append('changeLog', {
- action: 'clean',
- timestamp: new Date().toISOString()
- });
-
- // Signal completion if called via execHook
- if (signalInfo?.signal) {
- await MemFlow.workflow.signal(signalInfo.signal, {
- status: 'cleaned',
- timestamp: new Date().toISOString()
- });
- }
- }
-
- /**
- * Hook to schedule the next refresh based on policy
- */
- export async function scheduleRefresh(signalInfo?: { signal: string }): Promise<void> {
- const entity = await MemFlow.workflow.entity();
-
- // Get refresh interval from policy
- const currentEntity = await entity.get();
- const refreshInterval = currentEntity.pipeline.policy.refreshInterval;
-
- // Sleep for the configured interval
- await MemFlow.workflow.sleepFor(refreshInterval);
-
- // Update status after sleep
- await entity.merge({
- status: 'ready_for_refresh',
- nextRefreshAt: new Date().toISOString()
- });
-
- // Add to changelog
- await entity.append('changeLog', {
- action: 'schedule_refresh',
- timestamp: new Date().toISOString(),
- nextRefresh: new Date().toISOString()
- });
-
- // Signal completion if called via execHook
- if (signalInfo?.signal) {
- await MemFlow.workflow.signal(signalInfo.signal, {
- status: 'scheduled',
- nextRefresh: new Date().toISOString()
+ taskQueue: 'pipeline',
+ workflowName: 'pageProcessingHook',
+ args: [ref, page, initial.documentId],
+ signalId: `page-${page}-complete`
  });
  }
- }
- ```
 
- ### Trigger from Outside
+ // Validation
+ await MemFlow.workflow.execHook({ taskQueue: 'pipeline', workflowName: 'validationHook', args: [initial.documentId], signalId: 'validation-complete' });
+ // Approval
+ await MemFlow.workflow.execHook({ taskQueue: 'pipeline', workflowName: 'approvalHook', args: [initial.documentId], signalId: 'approval-complete' });
+ // Notification
+ await MemFlow.workflow.execHook({ taskQueue: 'pipeline', workflowName: 'notificationHook', args: [initial.documentId], signalId: 'processing-complete' });
 
- ```ts
- // External systems can trigger a single pipeline run
- export async function triggerRefresh() {
- const client = new MemFlow.Client({/*...*/});
-
- await client.workflow.hook({
- workflowId: 'pipeline-123',
- taskQueue: 'pipeline',
- workflowName: 'runPipeline',
- args: []
- });
+ await pipeline.merge({ status: 'completed', completedAt: new Date().toISOString() });
+ await pipeline.append('processingSteps', 'pipeline-completed');
+ return await pipeline.get();
  }
  ```
 
- > 💡 A complete implementation of this Pipeline example with OpenAI Vision integration, processing hooks, and document workflow automation can be found in the [pipeline test suite](https://github.com/hotmeshio/sdk-typescript/tree/main/tests/memflow/pipeline).
+ **Operational Characteristics:**
+ - *Replay Friendly*: Each hook can be retried; pipeline memory records invariant progress markers (`processingSteps`).
+ - *Parallelizable*: Pages fan out naturally without manual queue wiring.
+ - *Auditable*: Entire lifecycle captured in a single evolving JSON record.
 
  ---
 
  ## Documentation & Links
-
- * SDK Reference[hotmeshio.github.io/sdk-typescript](https://hotmeshio.github.io/sdk-typescript)
- * Examples[github.com/hotmeshio/samples-typescript](https://github.com/hotmeshio/samples-typescript)
+ * **SDK Reference** – https://hotmeshio.github.io/sdk-typescript
+ * **Agent Example Tests** – https://github.com/hotmeshio/sdk-typescript/tree/main/tests/memflow/agent
+ * **Pipeline Example Tests** https://github.com/hotmeshio/sdk-typescript/tree/main/tests/memflow/pipeline
+ * **Sample Projects** – https://github.com/hotmeshio/samples-typescript
 
  ---
 
  ## License
-
- Apache 2.0 See `LICENSE` for details.
+ Apache 2.0 with commercial restrictions* – see `LICENSE`.
+ >*NOTE: It's open source with one commercial exception: Build, sell, and share solutions made with HotMesh. But don't white-label the orchestration core and repackage it as your own workflow-as-a-service.
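The new README's command table describes `merge` as a deep JSON merge. As a standalone sketch of those semantics (an illustration only, not HotMesh's actual implementation — the real command additionally runs inside a Postgres transaction):

```typescript
// Sketch of deep-merge semantics as described in the README's command table:
// nested objects are merged recursively; scalars, arrays, and new keys replace.
type Json = { [key: string]: unknown };

function deepMerge(target: Json, patch: Json): Json {
  const out: Json = { ...target };
  for (const [key, value] of Object.entries(patch)) {
    const existing = out[key];
    if (
      value && typeof value === 'object' && !Array.isArray(value) &&
      existing && typeof existing === 'object' && !Array.isArray(existing)
    ) {
      out[key] = deepMerge(existing as Json, value as Json); // recurse into nested objects
    } else {
      out[key] = value; // scalars, arrays, and new keys replace wholesale
    }
  }
  return out;
}

// Mirrors the README example: merging an email into an existing user object
const state = deepMerge(
  { user: { id: 123, name: 'John' } },
  { user: { email: 'john@example.com' } }
);
```

After the merge, `state.user` holds `id`, `name`, and `email` — the behavior the table's `merge` row implies, in contrast to `set`, which replaces the full value.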
package/build/index.d.ts CHANGED
@@ -10,8 +10,6 @@ import { WorkerService as Worker } from './services/memflow/worker';
  import { WorkflowService as workflow } from './services/memflow/workflow';
  import { WorkflowHandleService as WorkflowHandle } from './services/memflow/handle';
  import { proxyActivities } from './services/memflow/workflow/proxyActivities';
- import { MeshData } from './services/meshdata';
- import { MeshOS } from './services/meshos';
  import * as Errors from './modules/errors';
  import * as Utils from './modules/utils';
  import * as Enums from './modules/enums';
@@ -22,5 +20,5 @@ import { RedisConnection as ConnectorIORedis } from './services/connector/provid
  import { RedisConnection as ConnectorRedis } from './services/connector/providers/redis';
  import { NatsConnection as ConnectorNATS } from './services/connector/providers/nats';
  export { Connector, //factory
- ConnectorIORedis, ConnectorNATS, ConnectorPostgres, ConnectorRedis, HotMesh, HotMeshConfig, MeshCall, MeshData, MemFlow, MeshOS, Client, Connection, proxyActivities, Search, Entity, Worker, workflow, WorkflowHandle, Enums, Errors, Utils, KeyStore, };
+ ConnectorIORedis, ConnectorNATS, ConnectorPostgres, ConnectorRedis, HotMesh, HotMeshConfig, MeshCall, MemFlow, Client, Connection, proxyActivities, Search, Entity, Worker, workflow, WorkflowHandle, Enums, Errors, Utils, KeyStore, };
  export * as Types from './types';
package/build/index.js CHANGED
@@ -23,7 +23,7 @@ var __importStar = (this && this.__importStar) || function (mod) {
  return result;
  };
  Object.defineProperty(exports, "__esModule", { value: true });
- exports.Types = exports.KeyStore = exports.Utils = exports.Errors = exports.Enums = exports.WorkflowHandle = exports.workflow = exports.Worker = exports.Entity = exports.Search = exports.proxyActivities = exports.Connection = exports.Client = exports.MeshOS = exports.MemFlow = exports.MeshData = exports.MeshCall = exports.HotMesh = exports.ConnectorRedis = exports.ConnectorPostgres = exports.ConnectorNATS = exports.ConnectorIORedis = exports.Connector = void 0;
+ exports.Types = exports.KeyStore = exports.Utils = exports.Errors = exports.Enums = exports.WorkflowHandle = exports.workflow = exports.Worker = exports.Entity = exports.Search = exports.proxyActivities = exports.Connection = exports.Client = exports.MemFlow = exports.MeshCall = exports.HotMesh = exports.ConnectorRedis = exports.ConnectorPostgres = exports.ConnectorNATS = exports.ConnectorIORedis = exports.Connector = void 0;
  const hotmesh_1 = require("./services/hotmesh");
  Object.defineProperty(exports, "HotMesh", { enumerable: true, get: function () { return hotmesh_1.HotMesh; } });
  const meshcall_1 = require("./services/meshcall");
@@ -46,10 +46,6 @@ const handle_1 = require("./services/memflow/handle");
  Object.defineProperty(exports, "WorkflowHandle", { enumerable: true, get: function () { return handle_1.WorkflowHandleService; } });
  const proxyActivities_1 = require("./services/memflow/workflow/proxyActivities");
  Object.defineProperty(exports, "proxyActivities", { enumerable: true, get: function () { return proxyActivities_1.proxyActivities; } });
- const meshdata_1 = require("./services/meshdata");
- Object.defineProperty(exports, "MeshData", { enumerable: true, get: function () { return meshdata_1.MeshData; } });
- const meshos_1 = require("./services/meshos");
- Object.defineProperty(exports, "MeshOS", { enumerable: true, get: function () { return meshos_1.MeshOS; } });
  const Errors = __importStar(require("./modules/errors"));
  exports.Errors = Errors;
  const Utils = __importStar(require("./modules/utils"));
package/build/modules/enums.d.ts CHANGED
@@ -108,3 +108,7 @@ export declare const HMSH_EXPIRE_DURATION: number;
  export declare const HMSH_FIDELITY_SECONDS: number;
  export declare const HMSH_SCOUT_INTERVAL_SECONDS: number;
  export declare const HMSH_GUID_SIZE: number;
+ /**
+ * Default task queue name used when no task queue is specified
+ */
+ export declare const DEFAULT_TASK_QUEUE = "default";
package/build/modules/enums.js CHANGED
@@ -1,6 +1,6 @@
  "use strict";
  Object.defineProperty(exports, "__esModule", { value: true });
- exports.HMSH_GUID_SIZE = exports.HMSH_SCOUT_INTERVAL_SECONDS = exports.HMSH_FIDELITY_SECONDS = exports.HMSH_EXPIRE_DURATION = exports.HMSH_XPENDING_COUNT = exports.HMSH_XCLAIM_COUNT = exports.HMSH_XCLAIM_DELAY_MS = exports.HMSH_BLOCK_TIME_MS = exports.HMSH_MEMFLOW_EXP_BACKOFF = exports.HMSH_MEMFLOW_MAX_INTERVAL = exports.HMSH_MEMFLOW_MAX_ATTEMPTS = exports.HMSH_GRADUATED_INTERVAL_MS = exports.HMSH_MAX_TIMEOUT_MS = exports.HMSH_MAX_RETRIES = exports.MAX_DELAY = exports.MAX_STREAM_RETRIES = exports.INITIAL_STREAM_BACKOFF = exports.MAX_STREAM_BACKOFF = exports.HMSH_EXPIRE_JOB_SECONDS = exports.HMSH_OTT_WAIT_TIME = exports.HMSH_DEPLOYMENT_PAUSE = exports.HMSH_DEPLOYMENT_DELAY = exports.HMSH_ACTIVATION_MAX_RETRY = exports.HMSH_QUORUM_DELAY_MS = exports.HMSH_QUORUM_ROLLCALL_CYCLES = exports.HMSH_STATUS_UNKNOWN = exports.HMSH_CODE_MEMFLOW_RETRYABLE = exports.HMSH_CODE_MEMFLOW_FATAL = exports.HMSH_CODE_MEMFLOW_MAXED = exports.HMSH_CODE_MEMFLOW_TIMEOUT = exports.HMSH_CODE_MEMFLOW_WAIT = exports.HMSH_CODE_MEMFLOW_PROXY = exports.HMSH_CODE_MEMFLOW_CHILD = exports.HMSH_CODE_MEMFLOW_ALL = exports.HMSH_CODE_MEMFLOW_SLEEP = exports.HMSH_CODE_UNACKED = exports.HMSH_CODE_TIMEOUT = exports.HMSH_CODE_UNKNOWN = exports.HMSH_CODE_INTERRUPT = exports.HMSH_CODE_NOTFOUND = exports.HMSH_CODE_PENDING = exports.HMSH_CODE_SUCCESS = exports.HMSH_SIGNAL_EXPIRE = exports.HMSH_IS_CLUSTER = exports.HMSH_TELEMETRY = exports.HMSH_LOGLEVEL = void 0;
+ exports.DEFAULT_TASK_QUEUE = exports.HMSH_GUID_SIZE = exports.HMSH_SCOUT_INTERVAL_SECONDS = exports.HMSH_FIDELITY_SECONDS = exports.HMSH_EXPIRE_DURATION = exports.HMSH_XPENDING_COUNT = exports.HMSH_XCLAIM_COUNT = exports.HMSH_XCLAIM_DELAY_MS = exports.HMSH_BLOCK_TIME_MS = exports.HMSH_MEMFLOW_EXP_BACKOFF = exports.HMSH_MEMFLOW_MAX_INTERVAL = exports.HMSH_MEMFLOW_MAX_ATTEMPTS = exports.HMSH_GRADUATED_INTERVAL_MS = exports.HMSH_MAX_TIMEOUT_MS = exports.HMSH_MAX_RETRIES = exports.MAX_DELAY = exports.MAX_STREAM_RETRIES = exports.INITIAL_STREAM_BACKOFF = exports.MAX_STREAM_BACKOFF = exports.HMSH_EXPIRE_JOB_SECONDS = exports.HMSH_OTT_WAIT_TIME = exports.HMSH_DEPLOYMENT_PAUSE = exports.HMSH_DEPLOYMENT_DELAY = exports.HMSH_ACTIVATION_MAX_RETRY = exports.HMSH_QUORUM_DELAY_MS = exports.HMSH_QUORUM_ROLLCALL_CYCLES = exports.HMSH_STATUS_UNKNOWN = exports.HMSH_CODE_MEMFLOW_RETRYABLE = exports.HMSH_CODE_MEMFLOW_FATAL = exports.HMSH_CODE_MEMFLOW_MAXED = exports.HMSH_CODE_MEMFLOW_TIMEOUT = exports.HMSH_CODE_MEMFLOW_WAIT = exports.HMSH_CODE_MEMFLOW_PROXY = exports.HMSH_CODE_MEMFLOW_CHILD = exports.HMSH_CODE_MEMFLOW_ALL = exports.HMSH_CODE_MEMFLOW_SLEEP = exports.HMSH_CODE_UNACKED = exports.HMSH_CODE_TIMEOUT = exports.HMSH_CODE_UNKNOWN = exports.HMSH_CODE_INTERRUPT = exports.HMSH_CODE_NOTFOUND = exports.HMSH_CODE_PENDING = exports.HMSH_CODE_SUCCESS = exports.HMSH_SIGNAL_EXPIRE = exports.HMSH_IS_CLUSTER = exports.HMSH_TELEMETRY = exports.HMSH_LOGLEVEL = void 0;
  /**
  * Determines the log level for the application. The default is 'info'.
  */
@@ -132,3 +132,7 @@ exports.HMSH_FIDELITY_SECONDS = process.env.HMSH_FIDELITY_SECONDS
  exports.HMSH_SCOUT_INTERVAL_SECONDS = parseInt(process.env.HMSH_SCOUT_INTERVAL_SECONDS, 10) || 60;
  // UTILS
  exports.HMSH_GUID_SIZE = Math.min(parseInt(process.env.HMSH_GUID_SIZE, 10) || 22, 32);
+ /**
+ * Default task queue name used when no task queue is specified
+ */
+ exports.DEFAULT_TASK_QUEUE = 'default';
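The enums change above introduces `DEFAULT_TASK_QUEUE` alongside the existing env-driven constants, which resolve as `parseInt(process.env.X, 10) || fallback` (with `HMSH_GUID_SIZE` additionally capped at 32). A self-contained sketch of that resolution pattern (the `intFromEnv` helper and the inline `env` object are hypothetical, for illustration only):

```typescript
// Sketch of the configuration pattern visible in enums.js above:
// numeric env vars fall back to a default when unset or non-numeric,
// and HMSH_GUID_SIZE is additionally capped at 32.
function intFromEnv(
  env: Record<string, string | undefined>,
  key: string,
  fallback: number
): number {
  // parseInt('') is NaN, which is falsy, so the fallback applies
  return parseInt(env[key] ?? '', 10) || fallback;
}

const env: Record<string, string | undefined> = { HMSH_GUID_SIZE: '48' };

const scoutInterval = intFromEnv(env, 'HMSH_SCOUT_INTERVAL_SECONDS', 60); // unset -> 60
const guidSize = Math.min(intFromEnv(env, 'HMSH_GUID_SIZE', 22), 32);     // 48 capped to 32
const DEFAULT_TASK_QUEUE: string = 'default';                              // hard-coded, not env-driven
```

Note the edge case this pattern carries: an explicit `0` in the environment is falsy and would also fall back to the default.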