@hotmeshio/hotmesh 0.5.7 → 0.5.8

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,289 +1,189 @@
1
1
  # HotMesh
2
2
 
3
- **Durable Memory + Coordinated Execution**
3
+ **Integrate AI automation into your current stack — without breaking it**
4
4
 
5
5
  ![beta release](https://img.shields.io/badge/release-beta-blue.svg)
6
6
 
7
- HotMesh removes the repetitive glue of building durable agents, pipelines, and long‑running workflows. You focus on *what* to change; HotMesh handles *how*, safely and durably.
7
+ HotMesh modernizes existing business systems by introducing a durable workflow layer that connects AI, automation, and human-in-the-loop steps **without replacing your current stack**.
8
+ Each process runs with persistent memory in Postgres, surviving retries, crashes, and human delays.
8
9
 
9
- ---
10
-
11
- ## Why Choose HotMesh
12
-
13
- - **Zero Boilerplate** - Transactional Postgres without the setup hassle
14
- - **Built-in Durability** - Automatic crash recovery and replay protection
15
- - **Parallel by Default** - Run hooks concurrently without coordination
16
- - **SQL-First** - Query pipeline status and agent memory directly
10
+ ```bash
11
+ npm install @hotmeshio/hotmesh
12
+ ```
17
13
 
18
14
  ---
19
15
 
20
- ## Core Abstractions
21
-
22
- ### 1. Entities
23
-
24
- Durable JSONB documents representing *process memory*. Each entity:
25
-
26
- * Has a stable identity (`workflowId` / logical key).
27
- * Evolves via atomic commands.
28
- * Is versioned implicitly by transactional history.
29
- * Can be partially indexed for targeted query performance.
30
-
31
- > **Design Note:** Treat entity shape as *contractual surface* + *freeform interior*. Index only the minimal surface required for lookups or dashboards.
32
-
33
- ### 2. Hooks
34
-
35
- Re‑entrant, idempotent, interruptible units of work that *maintain* an entity. Hooks can:
36
-
37
- * Start, stop, or be re‑invoked without corrupting state.
38
- * Run concurrently (Postgres ensures isolation on write).
39
- * Emit signals to let coordinators or sibling hooks know a perspective / phase completed.
40
-
41
- ### 3. Workflow Coordinators
16
+ ## What It Solves
42
17
 
43
- Thin entrypoints that:
18
+ Modernization often stalls where systems meet people and AI.
19
+ HotMesh builds a **durable execution bridge** across those seams — linking your database, APIs, RPA, and AI agents into one recoverable process.
44
20
 
45
- * Seed initial entity state.
46
- * Fan out perspective / phase hooks.
47
- * Optionally synthesize or finalize.
48
- * Return a snapshot (often the final entity state) *the workflow result is just memory*.
49
-
50
- ### 4. Commands (Entity Mutation Primitives)
51
-
52
- | Command | Purpose | Example |
53
- | ----------- | ----------------------------------------- | ------------------------------------------------ |
54
- | `set` | Replace full value (first write or reset) | `await e.set({ user: { id: 123, name: "John" } })` |
55
- | `merge` | Deep JSON merge | `await e.merge({ user: { email: "john@example.com" } })` |
56
- | `append` | Append to an array field | `await e.append('items', { id: 1, name: "New Item" })` |
57
- | `prepend` | Add to start of array field | `await e.prepend('items', { id: 0, name: "First Item" })` |
58
- | `remove` | Remove item from array by index | `await e.remove('items', 0)` |
59
- | `increment` | Numeric counters / progress | `await e.increment('counter', 5)` |
60
- | `toggle` | Toggle boolean value | `await e.toggle('settings.enabled')` |
61
- | `setIfNotExists` | Set value only if path doesn't exist | `await e.setIfNotExists('user.id', 123)` |
62
- | `delete` | Remove field at specified path | `await e.delete('user.email')` |
63
- | `get` | Read value at path (or full entity) | `await e.get('user.email')` |
64
- | `signal` | Mark hook milestone / unlock waiters | `await MemFlow.workflow.signal('phase-x', data)` |
21
+ * **AI that can fail safely** — retries, resumable state, and confidence tracking
22
+ * **Human steps that don’t block** — pause for days, resume instantly
23
+ * **Legacy systems that stay connected** — SQL and RPA coexist seamlessly
24
+ * **Full visibility** — query workflows and outcomes directly in SQL
65
25
 
66
26
  ---
67
27
 
68
- ## Table of Contents
28
+ ## Core Model
69
29
 
70
- 1. [Quick Start](#quick-start)
71
- 2. [Memory Architecture](#memory-architecture)
72
- 3. [Durable AI Agents](#durable-ai-agents)
73
- 4. [Stateful Pipelines](#stateful-pipelines)
74
- 5. [Indexing Strategy](#indexing-strategy)
75
- 6. [Operational Notes](#operational-notes)
76
- 7. [Documentation & Links](#documentation--links)
77
-
78
- ---
30
+ ### Entity — the Business Process Record
79
31
 
80
- ## Quick Start
32
+ Every workflow writes to a durable JSON document in Postgres called an **Entity**.
33
+ It becomes the shared memory between APIs, RPA jobs, LLM agents, and human operators.
81
34
 
82
- ### Install
83
-
84
- ```bash
85
- npm install @hotmeshio/hotmesh
86
- ```
87
-
88
- ### Minimal Setup
89
35
  ```ts
90
- import { MemFlow } from '@hotmeshio/hotmesh';
91
- import { Client as Postgres } from 'pg';
92
-
93
- async function main() {
94
- // Auto-provisions required tables/index scaffolding on first run
95
- const mf = await MemFlow.init({
96
- appId: 'my-app',
97
- engine: {
98
- connection: {
99
- class: Postgres,
100
- options: { connectionString: process.env.DATABASE_URL }
101
- }
102
- }
103
- });
104
-
105
- // Start a durable research agent (entity-backed workflow)
106
- const handle = await mf.workflow.start({
107
- entity: 'research-agent',
108
- workflowName: 'researchAgent',
109
- workflowId: 'agent-session-jane-001',
110
- args: ['Long-term impacts of renewable energy subsidies'],
111
- taskQueue: 'agents'
112
- });
113
-
114
- console.log('Final Memory Snapshot:', await handle.result());
115
- }
116
-
117
- main().catch(console.error);
36
+ const e = await MemFlow.workflow.entity();
37
+
38
+ // initialize from a source event
39
+ await e.set({
40
+ caseId: "A42",
41
+ stage: "verification",
42
+ retries: 0,
43
+ notes: []
44
+ });
45
+
46
+ // AI step adds structured output
47
+ await e.merge({
48
+ aiSummary: { result: "Verified coverage", confidence: 0.93 },
49
+ stage: "approval",
50
+ });
51
+
52
+ // human operator review
53
+ await e.append("notes", { reviewer: "ops1", comment: "ok to proceed" });
54
+
55
+ // maintain counters
56
+ await e.increment("retries", 1);
57
+
58
+ // retrieve current process state
59
+ const data = await e.get();
118
60
  ```
119
61
 
120
- ### Value Checklist (What You Did *Not* Have To Do)
121
- - Create tables / migrations
122
- - Define per-agent caches
123
- - Implement optimistic locking
124
- - Build a queue fan‑out mechanism
125
- - Hand-roll replay protection
62
+ **Minimal surface contract**
126
63
 
127
- ---
128
-
129
- ## Memory Architecture
130
- Each workflow = **1 durable entity**. Hooks are stateless functions *shaped by* that entity's evolving JSON. You can inspect or modify it at any time using ordinary SQL or the provided API.
64
+ | Command | Purpose |
65
+ | ------------- | ---------------------------------- |
66
+ | `set()` | Initialize workflow state |
67
+ | `merge()` | Update any JSON path |
68
+ | `append()` | Add entries to lists (logs, notes) |
69
+ | `increment()` | Maintain counters or metrics |
70
+ | `get()` | Retrieve current state |
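`merge()` above performs a deep JSON merge into the entity. As a mental model only — `deepMerge` here is a hypothetical helper, not the HotMesh implementation, and the real command runs transactionally in Postgres — the semantics can be sketched like this:

```typescript
// Hypothetical sketch of deep-merge semantics, for illustration only —
// NOT the @hotmeshio/hotmesh implementation.
type Json = string | number | boolean | null | Json[] | { [k: string]: Json };

function isPlainObject(v: Json): v is { [k: string]: Json } {
  return typeof v === "object" && v !== null && !Array.isArray(v);
}

// Objects merge key-by-key; arrays and scalars are replaced wholesale
// (an assumption — which is why the examples use append() for lists).
function deepMerge(base: Json, patch: Json): Json {
  if (isPlainObject(base) && isPlainObject(patch)) {
    const out: { [k: string]: Json } = { ...base };
    for (const [k, v] of Object.entries(patch)) {
      out[k] = k in out ? deepMerge(out[k], v) : v;
    }
    return out;
  }
  return patch; // non-objects replace rather than merge
}

const state = { caseId: "A42", stage: "verification", retries: 0 };
const next = deepMerge(state, {
  aiSummary: { result: "Verified coverage", confidence: 0.93 },
  stage: "approval",
});
```

Untouched keys (`caseId`, `retries`) survive the merge while `stage` is overwritten and `aiSummary` is added alongside them.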
131
71
 
132
- ### Programmatic Indexing
133
- ```ts
134
- // Create index for premium research agents
135
- await MemFlow.Entity.createIndex('research-agent', 'isPremium', hotMeshClient);
136
-
137
- // Find premium agents needing verification
138
- const agents = await MemFlow.Entity.find('research-agent', {
139
- isPremium: true,
140
- needsVerification: true
141
- }, hotMeshClient);
142
- ```
72
+ Entities are stored in plain SQL tables, directly queryable:
143
73
 
144
- ### Direct SQL Access
145
74
  ```sql
146
- -- Same index via SQL (more control over index type/conditions)
147
- CREATE INDEX idx_research_agents_premium ON my_app.jobs (id)
148
- WHERE entity = 'research-agent' AND (context->>'isPremium')::boolean = true;
149
-
150
- -- Ad hoc query example
151
- SELECT id, context->>'status' as status, context->>'confidence' as confidence
75
+ SELECT id, context->>'stage', context->'aiSummary'->>'result'
152
76
  FROM my_app.jobs
153
- WHERE entity = 'research-agent'
154
- AND (context->>'isPremium')::boolean = true
155
- AND (context->>'confidence')::numeric > 0.8;
77
+ WHERE entity = 'claims-review'
78
+ AND context->>'stage' != 'complete';
156
79
  ```
157
80
 
158
- **Guidelines:**
159
- 1. *Model intent, not mechanics.* Keep ephemeral calculation artifacts minimal; store derived values only if reused.
160
- 2. *Index sparingly.* Each index is a write amplification cost. Start with 1–2 selective partial indexes.
161
- 3. *Keep arrays append‑only where possible.* Supports audit and replay semantics cheaply.
162
- 4. *Choose your tool:* Use Entity methods for standard queries, raw SQL for complex analytics or custom indexes.
163
-
164
81
  ---
165
82
 
166
- ## Durable AI Agents
167
- Agents become simpler: the *agent* is the memory record; hooks supply perspectives, verification, enrichment, or lifecycle progression.
83
+ ### Hooks — Parallel Work Units
84
+
85
+ Hooks are stateless functions that operate on the shared Entity.
86
+ Each hook executes independently (API, RPA, or AI), retrying automatically until success.
168
87
 
169
- ### Coordinator (Research Agent)
170
88
  ```ts
171
- export async function researchAgent(query: string) {
172
- const entity = await MemFlow.workflow.entity();
173
-
174
- const initial = {
175
- query,
176
- findings: [],
177
- perspectives: {},
178
- confidence: 0,
179
- verification: {},
180
- status: 'researching',
181
- startTime: new Date().toISOString()
182
- };
183
- await entity.set<typeof initial>(initial);
184
-
185
- // Fan-out perspectives
186
- await MemFlow.workflow.execHook({ taskQueue: 'agents', workflowName: 'optimisticPerspective', args: [query], signalId: 'optimistic-complete' });
187
- await MemFlow.workflow.execHook({ taskQueue: 'agents', workflowName: 'skepticalPerspective', args: [query], signalId: 'skeptical-complete' });
188
- await MemFlow.workflow.execHook({ taskQueue: 'agents', workflowName: 'verificationHook', args: [query], signalId: 'verification-complete' });
189
- await MemFlow.workflow.execHook({ taskQueue: 'agents', workflowName: 'synthesizePerspectives', args: [], signalId: 'synthesis-complete' });
190
-
191
- return await entity.get();
192
- }
89
+ await MemFlow.workflow.execHook({
90
+   taskQueue: "agents",
+   workflowName: "verifyCoverage",
91
+ args: ["A42"]
92
+ });
193
93
  ```
194
94
 
195
- ### Synthesis Hook
95
+ To run independent work in parallel, use a **batch execution** pattern:
96
+
196
97
  ```ts
197
- export async function synthesizePerspectives({ signal }: { signal: string }) {
198
- const e = await MemFlow.workflow.entity();
199
- const ctx = await e.get();
200
-
201
- const synthesized = await analyzePerspectives(ctx.perspectives);
202
- await e.merge({
203
- perspectives: {
204
- synthesis: {
205
- finalAssessment: synthesized,
206
- confidence: calculateConfidence(ctx.perspectives)
207
- }
208
- },
209
- status: 'completed'
210
- });
211
- await MemFlow.workflow.signal(signal, {});
212
- }
98
+ // Run independent research perspectives in parallel using batch execution
99
+ await MemFlow.workflow.execHookBatch([
100
+ {
101
+ key: 'optimistic',
102
+ options: {
103
+ taskQueue: 'agents',
104
+ workflowName: 'optimisticPerspective',
105
+ args: [query],
106
+ signalId: 'optimistic-complete'
107
+ }
108
+ },
109
+ {
110
+ key: 'skeptical',
111
+ options: {
112
+ taskQueue: 'agents',
113
+ workflowName: 'skepticalPerspective',
114
+ args: [query],
115
+ signalId: 'skeptical-complete'
116
+ }
117
+ }
118
+ ]);
213
119
  ```
214
120
 
215
- > **Pattern:** Fan-out hooks that write *adjacent* subtrees (e.g., `perspectives.optimistic`, `perspectives.skeptical`). A final hook merges a compact synthesis object. Avoid cross-hook mutation of the same nested branch.
121
+ Each hook runs in its own recoverable context, allowing AI, API, and RPA agents to operate independently while writing to the same durable Entity.
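The batch pattern fires every hook first and only then awaits the completion signals, so a fast hook cannot signal before the framework is ready to receive it. A toy in-memory sketch of that fan-out/fan-in shape (illustration only — `signal`, `waitFor`, and `execHookBatchSketch` are stand-ins, not the MemFlow API):

```typescript
// Toy in-memory signal bus sketching the fan-out/fan-in shape of
// batch execution. Illustration only — NOT the MemFlow implementation.
type Resolver = (value: unknown) => void;
const waiters = new Map<string, Resolver[]>();
const delivered = new Map<string, unknown>();

function signal(id: string, value: unknown): void {
  delivered.set(id, value); // buffer so late waiters still resolve
  for (const resolve of waiters.get(id) ?? []) resolve(value);
  waiters.delete(id);
}

function waitFor(id: string): Promise<unknown> {
  if (delivered.has(id)) return Promise.resolve(delivered.get(id));
  return new Promise((resolve) => {
    const list = waiters.get(id) ?? [];
    list.push(resolve);
    waiters.set(id, list);
  });
}

async function execHookBatchSketch(
  hooks: { key: string; signalId: string; run: () => Promise<void> }[]
): Promise<Record<string, unknown>> {
  // STEP 1: dispatch all hooks (their signals may arrive at any time)
  await Promise.all(hooks.map((h) => h.run()));
  // STEP 2: await every completion signal in one place
  const results = await Promise.all(hooks.map((h) => waitFor(h.signalId)));
  // STEP 3: key results by hook name
  return Object.fromEntries(hooks.map((h, i) => [h.key, results[i]]));
}
```

Because delivered signals are buffered, the batch resolves even when a hook finishes before its waiter is registered — the race the shipped `execHookBatch` is documented to avoid.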
216
122
 
217
123
  ---
218
124
 
219
- ## Stateful Pipelines
220
- Pipelines are identical in structure to agents: a coordinator seeds memory; phase hooks advance state; the entity is the audit trail.
125
+ ## Example — AI-Assisted Claims Review
221
126
 
222
- ### Document Processing Pipeline (Coordinator)
223
127
  ```ts
224
- export async function documentProcessingPipeline() {
225
- const pipeline = await MemFlow.workflow.entity();
226
-
227
- const initial = {
228
- documentId: `doc-${Date.now()}`,
229
- status: 'started',
230
- startTime: new Date().toISOString(),
231
- imageRefs: [],
232
- extractedInfo: [],
233
- validationResults: [],
234
- finalResult: null,
235
- processingSteps: [],
236
- errors: [],
237
- pageSignals: {}
238
- };
239
- await pipeline.set<typeof initial>(initial);
240
-
241
- await pipeline.merge({ status: 'loading-images' });
242
- await pipeline.append('processingSteps', 'image-load-started');
243
- const imageRefs = await activities.loadImagePages();
244
- if (!imageRefs?.length) throw new Error('No image references found');
245
- await pipeline.merge({ imageRefs });
246
- await pipeline.append('processingSteps', 'image-load-completed');
247
-
248
- // Page hooks
249
- for (const [i, ref] of imageRefs.entries()) {
250
- const page = i + 1;
251
- await MemFlow.workflow.execHook({
252
- taskQueue: 'pipeline',
253
- workflowName: 'pageProcessingHook',
254
- args: [ref, page, initial.documentId],
255
- signalId: `page-${page}-complete`
256
- });
257
- }
128
+ export async function claimsWorkflow(caseId: string) {
129
+ const e = await MemFlow.workflow.entity();
130
+ await e.set({ caseId, stage: "intake", approved: false });
131
+
132
+ // Run verification and summarization in parallel
133
+ await MemFlow.workflow.execHookBatch([
134
+ {
135
+ key: 'verifyCoverage',
136
+ options: {
137
+ taskQueue: 'agents',
138
+ workflowName: 'verifyCoverage',
139
+ args: [caseId],
140
+ signalId: 'verify-complete'
141
+ }
142
+ },
143
+ {
144
+ key: 'generateSummary',
145
+ options: {
146
+ taskQueue: 'agents',
147
+ workflowName: 'generateSummary',
148
+ args: [caseId],
149
+ signalId: 'summary-complete'
150
+ }
151
+ }
152
+ ]);
258
153
 
259
- // Validation
260
- await MemFlow.workflow.execHook({ taskQueue: 'pipeline', workflowName: 'validationHook', args: [initial.documentId], signalId: 'validation-complete' });
261
- // Approval
262
- await MemFlow.workflow.execHook({ taskQueue: 'pipeline', workflowName: 'approvalHook', args: [initial.documentId], signalId: 'approval-complete' });
263
- // Notification
264
- await MemFlow.workflow.execHook({ taskQueue: 'pipeline', workflowName: 'notificationHook', args: [initial.documentId], signalId: 'processing-complete' });
154
+ // Wait for human sign-off
155
+ const approval = await MemFlow.workflow.waitFor("human-approval");
156
+ await e.merge({ approved: approval === true, stage: "complete" });
265
157
 
266
- await pipeline.merge({ status: 'completed', completedAt: new Date().toISOString() });
267
- await pipeline.append('processingSteps', 'pipeline-completed');
268
- return await pipeline.get();
158
+ return await e.get();
269
159
  }
270
160
  ```
271
161
 
272
- **Operational Characteristics:**
273
- - *Replay Friendly*: Each hook can be retried; pipeline memory records invariant progress markers (`processingSteps`).
274
- - *Parallelizable*: Pages fan out naturally without manual queue wiring.
275
- - *Auditable*: Entire lifecycle captured in a single evolving JSON record.
162
+ This bridges:
163
+
164
+ * an existing insurance or EHR system (status + audit trail)
165
+ * LLM agents for data validation and summarization
166
+ * a human reviewer for final sign-off
167
+
168
+ — all within one recoverable workflow record.
276
169
 
277
170
  ---
278
171
 
279
- ## Documentation & Links
280
- * **SDK Reference** – https://hotmeshio.github.io/sdk-typescript
281
- * **Agent Example Tests** – https://github.com/hotmeshio/sdk-typescript/tree/main/tests/memflow/agent
282
- * **Pipeline Example Tests** – https://github.com/hotmeshio/sdk-typescript/tree/main/tests/memflow/pipeline
283
- * **Sample Projects** https://github.com/hotmeshio/samples-typescript
172
+ ## Why It Fits Integration Work
173
+
174
+ HotMesh is purpose-built for **incremental modernization**.
175
+
176
+ | Need | What HotMesh Provides |
177
+ | ----------------------------- | ---------------------------------------- |
178
+ | Tie AI into legacy apps | Durable SQL bridge with full visibility |
179
+ | Keep human review steps | Wait-for-signal workflows |
180
+ | Handle unstable APIs | Built-in retries and exponential backoff |
181
+ | Trace process across systems | Unified JSON entity per workflow |
182
+ | Store long-running AI results | Durable state for agents and automations |
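The retry policy itself belongs to HotMesh; the general retry-with-exponential-backoff technique the table refers to can be sketched as follows (`withBackoff` and its defaults are hypothetical, not HotMesh's actual API):

```typescript
// Generic retry-with-exponential-backoff sketch — an illustration of the
// technique, NOT HotMesh's actual retry policy or API.
async function withBackoff<T>(
  fn: () => Promise<T>,
  maxAttempts = 5,
  baseMs = 100
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // delays double each attempt: 100ms, 200ms, 400ms, ... capped at 5s
      const delay = Math.min(baseMs * 2 ** attempt, 5_000);
      await new Promise((r) => setTimeout(r, delay));
    }
  }
  throw lastError; // exhausted all attempts
}
```

An unstable downstream API wrapped this way absorbs transient failures; the durable Entity makes the same idea safe across process crashes, not just within one.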
284
183
 
285
184
  ---
286
185
 
287
186
  ## License
288
- Apache 2.0 with commercial restrictions* – see `LICENSE`.
289
- >*NOTE: It's open source with one commercial exception: Build, sell, and share solutions made with HotMesh. But don't white-label the orchestration core and repackage it as your own workflow-as-a-service.
187
+
188
+ Apache 2.0 — free to build, integrate, and deploy.
189
+ Do not resell the core engine as a hosted service.
@@ -0,0 +1,54 @@
1
+ import { ExecHookOptions } from './execHook';
2
+ /**
3
+ * Configuration for a single hook in a batch execution
4
+ */
5
+ export interface BatchHookConfig<T = any> {
6
+ /** Unique key to identify this hook's result in the returned object */
7
+ key: string;
8
+ /** Hook execution options */
9
+ options: ExecHookOptions;
10
+ }
11
+ /**
12
+ * Executes multiple hooks in parallel and awaits all their signal responses.
13
+ * This solves the race condition where Promise.all() with execHook() would prevent
14
+ * all waitFor() registrations from completing.
15
+ *
16
+ * The method ensures all waitFor() registrations happen before any hooks execute,
17
+ * preventing signals from being sent before the framework is ready to receive them.
18
+ *
19
+ * @template T - Object type with keys matching the batch hook keys and values as expected response types
20
+ * @param {BatchHookConfig[]} hookConfigs - Array of hook configurations with unique keys
21
+ * @returns {Promise<T>} Object with keys from hookConfigs and values as the signal responses
22
+ *
23
+ * @example
24
+ * ```typescript
25
+ * // Execute multiple research perspectives in parallel
26
+ * const results = await MemFlow.workflow.execHookBatch<{
27
+ * optimistic: OptimisticResult;
28
+ * skeptical: SkepticalResult;
29
+ * }>([
30
+ * {
31
+ * key: 'optimistic',
32
+ * options: {
33
+ * taskQueue: 'agents',
34
+ * workflowName: 'optimisticPerspective',
35
+ * args: [query],
36
+ * signalId: 'optimistic-complete'
37
+ * }
38
+ * },
39
+ * {
40
+ * key: 'skeptical',
41
+ * options: {
42
+ * taskQueue: 'agents',
43
+ * workflowName: 'skepticalPerspective',
44
+ * args: [query],
45
+ * signalId: 'skeptical-complete'
46
+ * }
47
+ * }
48
+ * ]);
49
+ *
50
+ * // results.optimistic contains the OptimisticResult
51
+ * // results.skeptical contains the SkepticalResult
52
+ * ```
53
+ */
54
+ export declare function execHookBatch<T extends Record<string, any>>(hookConfigs: BatchHookConfig[]): Promise<T>;
@@ -0,0 +1,77 @@
1
+ "use strict";
2
+ Object.defineProperty(exports, "__esModule", { value: true });
3
+ exports.execHookBatch = void 0;
4
+ const hook_1 = require("./hook");
5
+ const waitFor_1 = require("./waitFor");
6
+ /**
7
+ * Executes multiple hooks in parallel and awaits all their signal responses.
8
+ * This solves the race condition where Promise.all() with execHook() would prevent
9
+ * all waitFor() registrations from completing.
10
+ *
11
+ * The method ensures all waitFor() registrations happen before any hooks execute,
12
+ * preventing signals from being sent before the framework is ready to receive them.
13
+ *
14
+ * @template T - Object type with keys matching the batch hook keys and values as expected response types
15
+ * @param {BatchHookConfig[]} hookConfigs - Array of hook configurations with unique keys
16
+ * @returns {Promise<T>} Object with keys from hookConfigs and values as the signal responses
17
+ *
18
+ * @example
19
+ * ```typescript
20
+ * // Execute multiple research perspectives in parallel
21
+ * const results = await MemFlow.workflow.execHookBatch<{
22
+ * optimistic: OptimisticResult;
23
+ * skeptical: SkepticalResult;
24
+ * }>([
25
+ * {
26
+ * key: 'optimistic',
27
+ * options: {
28
+ * taskQueue: 'agents',
29
+ * workflowName: 'optimisticPerspective',
30
+ * args: [query],
31
+ * signalId: 'optimistic-complete'
32
+ * }
33
+ * },
34
+ * {
35
+ * key: 'skeptical',
36
+ * options: {
37
+ * taskQueue: 'agents',
38
+ * workflowName: 'skepticalPerspective',
39
+ * args: [query],
40
+ * signalId: 'skeptical-complete'
41
+ * }
42
+ * }
43
+ * ]);
44
+ *
45
+ * // results.optimistic contains the OptimisticResult
46
+ * // results.skeptical contains the SkepticalResult
47
+ * ```
48
+ */
49
+ async function execHookBatch(hookConfigs) {
50
+ // Generate signal IDs for hooks that don't have them
51
+ const processedConfigs = hookConfigs.map(config => ({
52
+ ...config,
53
+ options: {
54
+ ...config.options,
55
+ signalId: config.options.signalId || `memflow-hook-${crypto.randomUUID()}`
56
+ }
57
+ }));
58
+ // STEP 1: Fire off all hooks (but don't await them)
59
+ // This registers the hooks/streams with the system immediately
60
+ await Promise.all(processedConfigs.map(config => {
61
+ const hookOptions = {
62
+ ...config.options,
63
+ args: [...config.options.args, {
64
+ signal: config.options.signalId,
65
+ $memflow: true
66
+ }]
67
+ };
68
+ return (0, hook_1.hook)(hookOptions);
69
+ }));
70
+ // STEP 2: Await all waitFor operations
71
+ // This ensures all waitFor registrations happen in the same call stack
72
+ // before any MemFlowWaitForError is thrown (via setImmediate mechanism)
73
+ const results = await Promise.all(processedConfigs.map(config => (0, waitFor_1.waitFor)(config.options.signalId)));
74
+ // STEP 3: Return results as a keyed object
75
+ return Object.fromEntries(processedConfigs.map((config, i) => [config.key, results[i]]));
76
+ }
77
+ exports.execHookBatch = execHookBatch;
@@ -6,6 +6,7 @@ import { enrich } from './enrich';
6
6
  import { emit } from './emit';
7
7
  import { execChild, startChild } from './execChild';
8
8
  import { execHook } from './execHook';
9
+ import { execHookBatch } from './execHookBatch';
9
10
  import { proxyActivities } from './proxyActivities';
10
11
  import { search } from './searchMethods';
11
12
  import { random } from './random';
@@ -53,6 +54,7 @@ export declare class WorkflowService {
53
54
  static executeChild: typeof execChild;
54
55
  static startChild: typeof startChild;
55
56
  static execHook: typeof execHook;
57
+ static execHookBatch: typeof execHookBatch;
56
58
  static proxyActivities: typeof proxyActivities;
57
59
  static search: typeof search;
58
60
  static entity: typeof entity;
@@ -9,6 +9,7 @@ const enrich_1 = require("./enrich");
9
9
  const emit_1 = require("./emit");
10
10
  const execChild_1 = require("./execChild");
11
11
  const execHook_1 = require("./execHook");
12
+ const execHookBatch_1 = require("./execHookBatch");
12
13
  const proxyActivities_1 = require("./proxyActivities");
13
14
  const searchMethods_1 = require("./searchMethods");
14
15
  const random_1 = require("./random");
@@ -71,6 +72,7 @@ WorkflowService.execChild = execChild_1.execChild;
71
72
  WorkflowService.executeChild = execChild_1.executeChild;
72
73
  WorkflowService.startChild = execChild_1.startChild;
73
74
  WorkflowService.execHook = execHook_1.execHook;
75
+ WorkflowService.execHookBatch = execHookBatch_1.execHookBatch;
74
76
  WorkflowService.proxyActivities = proxyActivities_1.proxyActivities;
75
77
  WorkflowService.search = searchMethods_1.search;
76
78
  WorkflowService.entity = entityMethods_1.entity;
@@ -267,9 +267,32 @@ const KVTables = (context) => ({
267
267
  field TEXT NOT NULL,
268
268
  value TEXT,
269
269
  type ${schemaName}.type_enum NOT NULL,
270
+ created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
271
+ updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
270
272
  PRIMARY KEY (job_id, field),
271
273
  FOREIGN KEY (job_id) REFERENCES ${fullTableName} (id) ON DELETE CASCADE
272
274
  ) PARTITION BY HASH (job_id);
275
+ `);
276
+ // Create trigger function for updating updated_at only for mutable types
277
+ await client.query(`
278
+ CREATE OR REPLACE FUNCTION ${schemaName}.update_attributes_updated_at()
279
+ RETURNS TRIGGER AS $$
280
+ BEGIN
281
+ IF NEW.type IN ('udata', 'jdata', 'hmark', 'jmark') AND
282
+ (OLD.value IS NULL OR NEW.value <> OLD.value) THEN
283
+ NEW.updated_at = NOW();
284
+ END IF;
285
+ RETURN NEW;
286
+ END;
287
+ $$ LANGUAGE plpgsql;
288
+ `);
289
+ // Create trigger for updated_at updates
290
+ await client.query(`
291
+ DROP TRIGGER IF EXISTS trg_update_attributes_updated_at ON ${attributesTableName};
292
+ CREATE TRIGGER trg_update_attributes_updated_at
293
+ BEFORE UPDATE ON ${attributesTableName}
294
+ FOR EACH ROW
295
+ EXECUTE FUNCTION ${schemaName}.update_attributes_updated_at();
273
296
  `);
274
297
  // Create partitions for attributes table
275
298
  await client.query(`
@@ -8,6 +8,7 @@ declare class PostgresSubService extends SubService<PostgresClientType & Provide
8
8
  private static clientSubscriptions;
9
9
  private static clientHandlers;
10
10
  private instanceSubscriptions;
11
+ private instanceId;
11
12
  constructor(eventClient: PostgresClientType & ProviderClient, storeClient?: PostgresClientType & ProviderClient);
12
13
  init(namespace: string, appId: string, engineId: string, logger: ILogger): Promise<void>;
13
14
  private setupNotificationHandler;
@@ -11,7 +11,8 @@ class PostgresSubService extends index_1.SubService {
11
11
  constructor(eventClient, storeClient) {
12
12
  super(eventClient, storeClient);
13
13
  // Instance-level subscriptions for cleanup
14
- this.instanceSubscriptions = new Set();
14
+ this.instanceSubscriptions = new Map(); // topic -> callbackKey mapping
15
+ this.instanceId = crypto_1.default.randomUUID();
15
16
  }
16
17
  async init(namespace = key_1.HMNS, appId, engineId, logger) {
17
18
  this.namespace = namespace;
@@ -36,8 +37,10 @@ class PostgresSubService extends index_1.SubService {
36
37
  if (callbacks && callbacks.size > 0) {
37
38
  try {
38
39
  const payload = JSON.parse(msg.payload || '{}');
39
- // Call all callbacks registered for this channel across all SubService instances
40
- callbacks.forEach((callback) => {
40
+ // Collect callbacks first to avoid modification during iteration
41
+ const callbackArray = Array.from(callbacks.entries());
42
+ // Call all callbacks
43
+ callbackArray.forEach(([callbackKey, callback], index) => {
41
44
  try {
42
45
  callback(msg.channel, payload);
43
46
  }
@@ -97,13 +100,16 @@ class PostgresSubService extends index_1.SubService {
97
100
  // Start listening to the safe topic (only once per channel across all instances)
98
101
  await this.eventClient.query(`LISTEN "${safeKey}"`);
99
102
  }
100
- // Add this callback to the list
101
- callbacks.set(this, callback);
103
+ // Generate unique callback key to avoid overwrites
104
+ const callbackKey = `${this.instanceId}-${Date.now()}-${Math.random()}`;
105
+ // Add this callback to the list with unique key
106
+ callbacks.set(callbackKey, callback);
102
107
  // Track this subscription for cleanup
103
- this.instanceSubscriptions.add(safeKey);
108
+ this.instanceSubscriptions.set(safeKey, callbackKey);
104
109
  this.logger.debug(`postgres-subscribe`, {
105
110
  originalKey,
106
111
  safeKey,
112
+ callbackKey,
107
113
  totalCallbacks: callbacks.size,
108
114
  });
109
115
  }
@@ -117,11 +123,12 @@ class PostgresSubService extends index_1.SubService {
117
123
  return;
118
124
  }
119
125
  const callbacks = clientSubscriptions.get(safeKey);
120
- if (!callbacks || callbacks.size === 0) {
126
+ const callbackKey = this.instanceSubscriptions.get(safeKey);
127
+ if (!callbacks || callbacks.size === 0 || !callbackKey) {
121
128
  return;
122
129
  }
123
- // Remove callback from this specific instance
124
- callbacks.delete(this);
130
+ // Remove callback using the tracked unique key
131
+ callbacks.delete(callbackKey);
125
132
  // Remove from instance tracking
126
133
  this.instanceSubscriptions.delete(safeKey);
127
134
  // Stop listening to the safe topic if no more callbacks exist
@@ -132,6 +139,7 @@ class PostgresSubService extends index_1.SubService {
132
139
  this.logger.debug(`postgres-unsubscribe`, {
133
140
  originalKey,
134
141
  safeKey,
142
+ callbackKey,
135
143
  remainingCallbacks: callbacks.size,
136
144
  });
137
145
  }
@@ -144,10 +152,10 @@ class PostgresSubService extends index_1.SubService {
144
152
  if (!clientSubscriptions) {
145
153
  return;
146
154
  }
147
- for (const safeKey of this.instanceSubscriptions) {
155
+ for (const [safeKey, callbackKey] of this.instanceSubscriptions) {
148
156
  const callbacks = clientSubscriptions.get(safeKey);
149
157
  if (callbacks) {
150
- callbacks.delete(this);
158
+ callbacks.delete(callbackKey);
151
159
  // If no more callbacks exist for this channel, stop listening
152
160
  if (callbacks.size === 0) {
153
161
  clientSubscriptions.delete(safeKey);
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@hotmeshio/hotmesh",
3
- "version": "0.5.7",
3
+ "version": "0.5.8",
4
4
  "description": "Permanent-Memory Workflows & AI Agents",
5
5
  "main": "./build/index.js",
6
6
  "types": "./build/index.d.ts",
@@ -106,7 +106,7 @@
106
106
  "eslint-config-prettier": "^9.1.0",
107
107
  "eslint-plugin-import": "^2.29.1",
108
108
  "eslint-plugin-prettier": "^5.1.3",
109
- "javascript-obfuscator": "^4.1.1",
109
+ "javascript-obfuscator": "^0.6.2",
110
110
  "jest": "^29.5.0",
111
111
  "nats": "^2.28.0",
112
112
  "openai": "^5.9.0",