pulse-sdk 0.0.4 → 0.0.6

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,177 +1,116 @@
- # Pulse SDK — TypeScript
+ # Pulse TypeScript SDK

- TypeScript client for Pulse Broker (gRPC). This README is written for end users and explains installation, configuration, and basic usage with practical examples.
+ Official TypeScript client for [Pulse Broker](https://github.com/marcosrosa/pulse).

  ## Installation

- Install from npm (published package):
-
  ```bash
  npm install pulse-sdk
  ```

- Or install locally from the repository (for development):
-
- ```bash
- npm install
- ```
-
- To compile the TypeScript sources (optional for development/CI):
-
- ```bash
- npm run build
- ```
-
- Run the lightweight integration tests (a test gRPC server is started automatically):
-
- ```bash
- npm test
- ```
-
  ## Configuration

- The SDK supports configuration via (in priority order):
+ The SDK looks for a `pulse.yaml` (or `pulse.yml`) file in your project root. If not found, it defaults to `localhost:5555` (HTTP) and `localhost:5556` (gRPC).

- 1. Programmatic: pass a `PulseConfig` object when creating `Producer` or `Consumer`.
- 2. Environment variables: `PULSE_GRPC_URL`, `PULSE_EVENT_TYPES` (comma-separated), `PULSE_CONSUMER_NAME`.
- 3. File: a `pulse.yml` or `pulse.yaml` file placed at the process working directory or in `~/.pulse/pulse.yml`.
-
- The SDK exposes two helpers:
-
- - `loadConfig(configPath?: string): PulseConfig` — reads `pulse.yml` / env and returns a merged config object.
- - `initFromConfig(cfg: PulseConfig): Promise<void>` — calls the broker `CreateTopic` RPC for every topic declared in `cfg.topics`. This lets the SDK create any required topics automatically at startup (useful for tests or first-run).
-
- Example `pulse.yml`:
+ ### Example `pulse.yaml`

  ```yaml
- grpcUrl: localhost:5556
- eventTypes:
-   - events
-   - transactions
- consumerName: my-consumer
+ # Connection Settings
+ broker:
+   host: "localhost"
+   http_port: 5555
+   grpc_port: 5556
+   timeout_ms: 5000
+
+ # Client Defaults
+ client:
+   id: "my-typescript-app"
+   auto_commit: true # Automatically commit offsets after successful processing
+   max_retries: 3
+
+ # Topic Configuration
  topics:
-   - name: events
-     fifo: false
-   - name: transactions
-     fifo: true
- ```
-
- Usage example (auto-create topics then start):
-
- ```ts
- import { loadConfig, initFromConfig, Producer, Consumer } from 'pulse-sdk';
-
- async function main() {
-   const cfg = loadConfig(); // reads pulse.yml or environment
-   await initFromConfig(cfg); // create topics listed in cfg.topics (no-op if none)
-
-   const producer = new Producer(cfg);
-   await producer.send('events', { type: 'user.created', id: 1 });
-
-   const consumer = new Consumer(cfg);
-   consumer.on('events', (msg) => console.log('received', msg.payload));
-   await consumer.start('events', cfg.consumerName || 'default');
- }
-
- main().catch(console.error);
+   - name: "events"
+     create_if_missing: true
+     config:
+       fifo: false
+       retention_bytes: 1073741824 # 1GB
+     consume:
+       auto_commit: true
+
+   - name: "transactions"
+     create_if_missing: true
+     config:
+       fifo: true
+     consume:
+       auto_commit: false # Manual commit required
+
+   - name: "logs"
+     create_if_missing: true
+     config:
+       fifo: false
+     consume:
+       auto_commit: true
  ```

- Notes:
- - `initFromConfig` calls the gRPC `CreateTopic` RPC. If the broker responds with an error for a topic that already exists, the SDK logs a warning and continues.
- - Ensure the broker is running and reachable at `cfg.grpcUrl` before calling `initFromConfig`.
- - The SDK bundles `pulse.proto` inside the package so consumers do not need to copy proto files into their projects.
+ ## Usage

- ## Quick Start
+ ### Producer

- Import from the published package and pass configuration programmatically:
+ You can send objects (automatically serialized to JSON), strings, or raw buffers.

- ```ts
- import { Producer, Consumer } from 'pulse-sdk';
+ ```typescript
+ import { Producer } from 'pulse-sdk';

- const cfg = { grpcUrl: 'localhost:50052', eventTypes: ['events'] };
+ // Initialize (uses pulse.yaml or defaults)
+ // You can override settings: new Producer("10.0.0.1", 9090)
+ const producer = new Producer();

- // Send a JSON-serializable event
- const producer = new Producer(cfg);
- await producer.send('events', { type: 'user.created', id: 123 });
+ async function main() {
+   // Send JSON
+   await producer.send("events", { type: "user_created", id: 123 });

- // Consume events and register a handler
- const consumer = new Consumer(cfg);
- consumer.on('events', (msg) => {
-   console.log('received', msg.payload);
- });
- await consumer.start('events', 'my-consumer');
- ```
+   // Send String
+   await producer.send("logs", "raw log line");

- ### Programmatic override
+   // Send Buffer
+   await producer.send("logs", Buffer.from("binary data"));

- You can create and pass `PulseConfig` directly in code:
+   producer.close();
+ }

- ```ts
- const cfg = { grpcUrl: '10.0.0.5:50052', eventTypes: ['events'] };
- const producer = new Producer(cfg);
+ main();
  ```

- ## Technical Details
-
- - The SDK dynamically loads `pulse.proto` at runtime using `@grpc/proto-loader` and `@grpc/grpc-js`.
- - The protobuf field `bytes payload` is used to carry the message body; the SDK serializes JSON payloads into that field by default.
- - `Consumer.start()` opens a server stream (`Consume`) and dispatches incoming messages to handlers registered via `consumer.on(topic, handler)`.
-
- ## Examples
+ ### Consumer

- Producer (send multiple events):
+ Use the `consumer` function to register message handlers, and `run` to start the loop.

- ```ts
- const producer = new Producer(loadConfig());
- await producer.send('events', { type: 'login', userId: 1 });
- await producer.send('events', { type: 'logout', userId: 1 });
- ```
-
- Consumer (simple processing):
-
- ```ts
- const cfg = loadConfig();
- const consumer = new Consumer(cfg);
+ ```typescript
+ import { consumer, run, commit, Message } from 'pulse-sdk';

- consumer.on('events', (msg) => {
-   console.log('event payload', msg.payload);
+ // Simple Consumer (uses auto_commit from config)
+ consumer("events", async (msg: Message) => {
+   console.log(`Received event:`, msg.payload);
+   // msg.payload is an object if JSON, string if String, else Buffer
  });

- await consumer.start('events', cfg.consumerName || 'default');
- ```
-
- ## Local Tests
-
- The tests in `tests/` start a lightweight gRPC test server automatically; run them with `npm test`.
-
- ## FAQ (short)
-
- - Can I use decorators for handlers? Yes — TypeScript supports decorators, but this SDK does not use them by default. We can add decorator helpers later if desired.
- - How do I enable TLS / secure credentials? The client factory currently uses `createInsecure()` by default. We can expose a `credentials` option on `PulseConfig` to accept `grpc.credentials.createSsl(...)` or other credential objects.
-
- **Grouped consumption**
-
- - **What it does:** When `grouped` is `true` (the default) the SDK coalesces consumers inside the same process that use the same `consumerName` into a single streaming connection. Messages are distributed (round-robin) among the registered handlers so each message is delivered to only one handler in the group (1x per client id).
- - **Why:** This mirrors the Python SDK behavior where handlers registered with `grouped=True` share a single consumer stream and avoid duplicate processing inside the same client.
- - **How to configure:**
-   - `pulse.yml` (recommended): set `grouped: true` or `grouped: false` under top-level config.
-   - Environment variable: `PULSE_GROUPED=true|false`.
-   - Programmatically: pass `grouped` in the `PulseConfig` passed to `Producer`/`Consumer`.
- - **Default:** `grouped` defaults to `true`.
- - **Grouped=false behavior:** When `grouped` is `false`, the SDK will ensure consumers use unique consumer IDs (if you pass the default client name), so multiple consumers in the same process each receive all messages independently (useful for testing or when you want duplicate consumption).
- - **Example `pulse.yml` entry:**
-
- ```yaml
- grpcUrl: localhost:5556
- grouped: true
+ // Manual Commit Consumer
+ // Override config params directly in the options object
+ consumer("transactions", async (msg: Message) => {
+   try {
+     await processPayment(msg.payload);
+     await commit(); // Manually commit offset
+     console.log(`Processed transaction ${msg.offset}`);
+   } catch (e) {
+     console.error(`Failed to process: ${e}`);
+   }
+ }, { autoCommit: false });
+
+ // Start all consumers
+ run();
+
+ async function processPayment(data: any) {
+   // ... implementation
+ }
  ```
-
- The test-suite includes integration tests that validate both `grouped=true` and `grouped=false` behaviour.
-
- ## Contributing
-
- Please open a PR with tests. The existing tests validate basic Producer/Consumer behaviour.
-
- ---
-
- If you want this README expanded to mirror the Python SDK more closely (topic-level config, manual offset commits, longer examples), tell me which sections to add and I will update it.
package/dist/config.d.ts CHANGED
@@ -1,15 +1,26 @@
1
+ export interface BrokerConfig {
2
+ host: string;
3
+ http_port: number;
4
+ grpc_port: number;
5
+ timeout_ms: number;
6
+ }
7
+ export interface ClientConfig {
8
+ id: string;
9
+ auto_commit: boolean;
10
+ max_retries: number;
11
+ }
1
12
  export interface TopicConfig {
2
13
  name: string;
3
- fifo?: boolean;
4
- retention_bytes?: number;
5
- retention_time?: number;
14
+ create_if_missing?: boolean;
15
+ config?: {
16
+ fifo?: boolean;
17
+ retention_bytes?: number;
18
+ };
6
19
  }
7
20
  export interface PulseConfig {
8
- grpcUrl: string;
9
- eventTypes: string[];
10
- consumerName?: string;
11
- grouped?: boolean;
12
- topics?: TopicConfig[];
21
+ broker: BrokerConfig;
22
+ client: ClientConfig;
23
+ topics: TopicConfig[];
13
24
  }
14
25
  export declare function loadConfig(configPath?: string): PulseConfig;
15
- export declare function initFromConfig(cfg: PulseConfig): Promise<void>;
26
+ export declare function getConfig(): PulseConfig;
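Editor's note: the reshaped `PulseConfig` drops the flat `grpcUrl` / `eventTypes` / `consumerName` fields in favour of nested `broker`, `client`, and `topics` blocks, and `initFromConfig` is replaced by a cached `getConfig`. A minimal sketch of reading the new shape through the declared helpers; the logged values and the way the address is assembled mirror what the SDK does internally, but the snippet itself is illustrative:

```typescript
import { getConfig, loadConfig, PulseConfig } from 'pulse-sdk';

// Load from pulse.yaml / pulse.yml in the working directory (or fall back to defaults).
const cfg: PulseConfig = loadConfig();

// The gRPC address is assembled from broker.host and broker.grpc_port.
console.log('gRPC address:', `${cfg.broker.host}:${cfg.broker.grpc_port}`);
console.log('client id:', cfg.client.id, 'auto_commit:', cfg.client.auto_commit);

// Topic entries and their nested `config` block are optional.
for (const t of cfg.topics) {
  console.log(`topic ${t.name}: fifo=${t.config?.fifo ?? false}, create_if_missing=${t.create_if_missing ?? false}`);
}

// getConfig() caches the loaded config and returns the same object on every call.
console.log(getConfig() === getConfig()); // true
```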
package/dist/config.js CHANGED
@@ -4,73 +4,62 @@ var __importDefault = (this && this.__importDefault) || function (mod) {
4
4
  };
5
5
  Object.defineProperty(exports, "__esModule", { value: true });
6
6
  exports.loadConfig = loadConfig;
7
- exports.initFromConfig = initFromConfig;
7
+ exports.getConfig = getConfig;
8
8
  const fs_1 = __importDefault(require("fs"));
9
9
  const path_1 = __importDefault(require("path"));
10
10
  const os_1 = __importDefault(require("os"));
11
11
  const js_yaml_1 = __importDefault(require("js-yaml"));
12
- const client_1 = require("./proto/client");
13
- const DEFAULTS = {
14
- grpcUrl: 'localhost:50052',
15
- eventTypes: ['events'],
16
- grouped: true,
12
+ const DEFAULT_CONFIG = {
13
+ broker: {
14
+ host: 'localhost',
15
+ http_port: 5555,
16
+ grpc_port: 5556,
17
+ timeout_ms: 5000,
18
+ },
19
+ client: {
20
+ id: 'typescript-client',
21
+ auto_commit: true,
22
+ max_retries: 3,
23
+ },
24
+ topics: [],
17
25
  };
18
26
  function loadConfig(configPath) {
19
27
  const candidates = [
20
28
  configPath,
21
- path_1.default.resolve(process.cwd(), 'pulse.yml'),
22
29
  path_1.default.resolve(process.cwd(), 'pulse.yaml'),
30
+ path_1.default.resolve(process.cwd(), 'pulse.yml'),
23
31
  path_1.default.join(os_1.default.homedir(), '.pulse', 'pulse.yml'),
24
32
  ].filter(Boolean);
25
33
  let fileCfg = {};
26
34
  for (const p of candidates) {
27
35
  if (p && fs_1.default.existsSync(p)) {
28
- const raw = fs_1.default.readFileSync(p, 'utf8');
29
- const parsed = js_yaml_1.default.load(raw);
30
- fileCfg = parsed || {};
31
- break;
36
+ try {
37
+ const raw = fs_1.default.readFileSync(p, 'utf8');
38
+ fileCfg = js_yaml_1.default.load(raw) || {};
39
+ break;
40
+ }
41
+ catch (e) {
42
+ console.warn(`Failed to load config from ${p}:`, e);
43
+ }
32
44
  }
33
45
  }
34
- const envCfg = {};
35
- if (process.env.PULSE_GRPC_URL)
36
- envCfg.grpcUrl = process.env.PULSE_GRPC_URL;
37
- if (process.env.PULSE_EVENT_TYPES)
38
- envCfg.eventTypes = process.env.PULSE_EVENT_TYPES.split(',');
39
- if (process.env.PULSE_CONSUMER_NAME)
40
- envCfg.consumerName = process.env.PULSE_CONSUMER_NAME;
41
- if (process.env.PULSE_GROUPED)
42
- envCfg.grouped = process.env.PULSE_GROUPED === 'true';
43
- const merged = Object.assign({}, DEFAULTS, fileCfg, envCfg);
44
- // Ensure eventTypes array exists
45
- if (!merged.eventTypes)
46
- merged.eventTypes = DEFAULTS.eventTypes;
47
- if (merged.grouped === undefined)
48
- merged.grouped = true;
49
- return merged;
46
+ // Deep merge defaults with file config
47
+ const config = JSON.parse(JSON.stringify(DEFAULT_CONFIG)); // Deep copy
48
+ if (fileCfg.broker) {
49
+ config.broker = { ...config.broker, ...fileCfg.broker };
50
+ }
51
+ if (fileCfg.client) {
52
+ config.client = { ...config.client, ...fileCfg.client };
53
+ }
54
+ if (fileCfg.topics) {
55
+ config.topics = fileCfg.topics;
56
+ }
57
+ return config;
50
58
  }
51
- // Initialize topics from config using gRPC CreateTopic RPC.
52
- async function initFromConfig(cfg) {
53
- if (!cfg.topics || cfg.topics.length === 0)
54
- return;
55
- const client = (0, client_1.createClient)(cfg.grpcUrl);
56
- for (const t of cfg.topics) {
57
- const req = {
58
- topic: t.name || t['name'],
59
- fifo: !!t.fifo,
60
- retention_bytes: t.retention_bytes || 0,
61
- retention_time: t.retention_time || 0,
62
- };
63
- await new Promise((resolve, reject) => {
64
- client.CreateTopic(req, (err, res) => {
65
- if (err) {
66
- // If topic exists the server may return an error; log and continue
67
- console.warn('CreateTopic error for', req.topic, err.message || err);
68
- resolve();
69
- }
70
- else {
71
- resolve();
72
- }
73
- });
74
- });
59
+ let _config = null;
60
+ function getConfig() {
61
+ if (!_config) {
62
+ _config = loadConfig();
75
63
  }
64
+ return _config;
76
65
  }
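For clarity on the merge behaviour above: `loadConfig` deep-copies `DEFAULT_CONFIG`, overlays the `broker` and `client` keys found in the YAML file, and replaces `topics` wholesale rather than merging it. A hedged sketch of the effective result for a partial config file; the file contents and values are hypothetical:

```typescript
import { loadConfig } from 'pulse-sdk';

// Suppose pulse.yaml contains only:
//
//   broker:
//     host: "broker.internal"
//   client:
//     auto_commit: false
//
// Keys missing from the file keep the built-in defaults.
const cfg = loadConfig();

console.log(cfg.broker.host);        // "broker.internal"   (from file)
console.log(cfg.broker.grpc_port);   // 5556                (default)
console.log(cfg.client.auto_commit); // false               (from file)
console.log(cfg.client.id);          // "typescript-client" (default)
console.log(cfg.topics);             // []  (no topics block in the file, so the default empty list)
```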
package/dist/consumer.d.ts CHANGED
@@ -1,12 +1,12 @@
1
- import { PulseConfig } from './config';
2
- export type EventHandler = (payload: any) => void;
3
- export declare class Consumer {
4
- private config;
5
- private handlers;
6
- private client;
7
- private unregisterFns;
8
- constructor(config: PulseConfig);
9
- on(eventType: string, handler: EventHandler): void;
10
- start(topic?: string, consumerName?: string): Promise<void>;
11
- close(): void;
1
+ import { Message, commit } from './message';
2
+ export { commit, Message };
3
+ interface ConsumerOptions {
4
+ host?: string;
5
+ port?: number;
6
+ consumerGroup?: string;
7
+ autoCommit?: boolean;
8
+ grouped?: boolean;
12
9
  }
10
+ export declare function stop(): void;
11
+ export declare function consumer(topic: string, handler: (msg: Message) => void | Promise<void>, options?: ConsumerOptions): void;
12
+ export declare function run(): Promise<void>;
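The new declarations also expose `stop()`, which the README does not mention. A minimal shutdown sketch, assuming a broker is reachable with the default configuration and that an `events` topic exists:

```typescript
import { consumer, run, stop, Message } from 'pulse-sdk';

consumer('events', async (msg: Message) => {
  console.log('got', msg.payload);
});

// run() returns a promise that never resolves, so awaiting it keeps the process alive.
run();

// stop() cancels every open Consume stream and clears the registered handlers,
// letting the process exit cleanly.
process.on('SIGINT', () => {
  stop();
  process.exit(0);
});
```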
package/dist/consumer.js CHANGED
@@ -1,99 +1,173 @@
1
1
  "use strict";
2
2
  Object.defineProperty(exports, "__esModule", { value: true });
3
- exports.Consumer = void 0;
3
+ exports.Message = exports.commit = void 0;
4
+ exports.stop = stop;
5
+ exports.consumer = consumer;
6
+ exports.run = run;
7
+ const config_1 = require("./config");
4
8
  const client_1 = require("./proto/client");
5
- const consumerManager_1 = require("./consumerManager");
6
- const crypto_1 = require("crypto");
7
9
  const message_1 = require("./message");
8
- class Consumer {
9
- constructor(config) {
10
- this.config = config;
11
- this.handlers = {};
12
- this.unregisterFns = [];
13
- this.client = (0, client_1.createClient)(config.grpcUrl);
10
+ Object.defineProperty(exports, "Message", { enumerable: true, get: function () { return message_1.Message; } });
11
+ Object.defineProperty(exports, "commit", { enumerable: true, get: function () { return message_1.commit; } });
12
+ const crypto_1 = require("crypto");
13
+ const _consumers = [];
14
+ const _cleanupFunctions = [];
15
+ function stop() {
16
+ _cleanupFunctions.forEach(fn => fn());
17
+ _cleanupFunctions.length = 0;
18
+ _consumers.length = 0;
19
+ }
20
+ function consumer(topic, handler, options = {}) {
21
+ const config = (0, config_1.getConfig)();
22
+ const host = options.host || config.broker.host;
23
+ const port = options.port || config.broker.grpc_port;
24
+ const baseGroup = options.consumerGroup || config.client.id;
25
+ const grouped = options.grouped !== undefined ? options.grouped : true;
26
+ let group = baseGroup;
27
+ if (!grouped) {
28
+ group = `${baseGroup}-${(0, crypto_1.randomUUID)().replace(/-/g, '')}`;
14
29
  }
15
- on(eventType, handler) {
16
- if (!this.handlers[eventType]) {
17
- this.handlers[eventType] = [];
30
+ let autoCommit = options.autoCommit;
31
+ if (autoCommit === undefined) {
32
+ autoCommit = config.client.auto_commit;
33
+ // Look up the topic-specific config (not yet used; see note below)
34
+ const topicCfg = config.topics?.find(t => t.name === topic);
35
+ // Note: the Python SDK also honours a per-topic `consume.auto_commit` setting, but
36
+ // TopicConfig here only describes `create_if_missing` and `config` (fifo, retention),
37
+ // so the global client.auto_commit default applies until TopicConfig is expanded.
38
+ }
39
+ _consumers.push({
40
+ topic,
41
+ host,
42
+ port,
43
+ group,
44
+ autoCommit: autoCommit,
45
+ handler,
46
+ grouped,
47
+ });
48
+ }
49
+ async function run() {
50
+ // Group consumers by (topic, host, port, group)
51
+ const groupedMap = new Map();
52
+ for (const c of _consumers) {
53
+ const key = `${c.topic}:${c.host}:${c.port}:${c.group}`;
54
+ if (!groupedMap.has(key)) {
55
+ groupedMap.set(key, {
56
+ topic: c.topic,
57
+ host: c.host,
58
+ port: c.port,
59
+ group: c.group,
60
+ autoCommit: c.autoCommit,
61
+ handlers: [],
62
+ });
18
63
  }
19
- this.handlers[eventType].push(handler);
64
+ groupedMap.get(key).handlers.push(c.handler);
20
65
  }
21
- async start(topic, consumerName = 'default') {
22
- const topicName = topic || this.config.eventTypes[0];
23
- // If grouped is enabled (default true) use shared in-process stream,
24
- // otherwise create a dedicated stream (each consumer gets all messages).
25
- const grouped = this.config.grouped !== false;
26
- if (grouped) {
27
- // register handlers for this topic to shared stream
28
- const handlers = this.handlers[topicName] || [];
29
- for (const h of handlers) {
30
- const unregister = (0, consumerManager_1.registerSharedHandler)(this.config.grpcUrl, topicName, consumerName, (msg, stub, offset) => {
31
- // run handler with context so commit() works
32
- // debug: console.log('consumer.wrapper.invoke', consumerName, topicName);
33
- (0, message_1.runWithContext)({ stub: stub || this.client, topic: topicName, consumerName, offset: offset ?? msg.offset }, () => {
34
- try {
35
- h(msg);
36
- }
37
- catch (e) { /* ignore */ }
38
- });
39
- });
40
- this.unregisterFns.push(unregister);
66
+ const promises = [];
67
+ for (const groupConfig of groupedMap.values()) {
68
+ promises.push(consumeLoopGroup(groupConfig));
69
+ }
70
+ // The per-group streams run in the background; they reconnect on errors and
71
+ // only finish when stop() is called, so there is nothing useful to await here.
72
+ // Python's run() blocks. To mirror that, return a promise that never resolves:
73
+ // `await run()` keeps the Node process alive,
74
+ // while calling run() without awaiting simply starts the consumers
75
+ // and lets the rest of the script continue.
76
+ return new Promise(() => { });
77
+ }
78
+ async function consumeLoopGroup(groupConfig) {
79
+ const address = `${groupConfig.host}:${groupConfig.port}`;
80
+ // console.log(`Starting consumer for topic '${groupConfig.topic}' (group: ${groupConfig.group}) on ${address}`);
81
+ let handlerIdx = 0;
82
+ let currentStream = null;
83
+ let retryTimeout = null;
84
+ let isStopped = false;
85
+ const cleanup = () => {
86
+ isStopped = true;
87
+ if (currentStream) {
88
+ try {
89
+ currentStream.cancel();
41
90
  }
42
- // Return a promise that never resolves (stream runs until process exits)
43
- return new Promise(() => { });
44
- }
45
- // If grouped is explicitly false, and the consumerName equals the configured
46
- // client name or the default literal, generate a unique consumer id so each
47
- // consumer receives messages independently (mirrors Python behaviour).
48
- if (!grouped) {
49
- const base = this.config.consumerName || 'default';
50
- if (consumerName === base || consumerName === 'default') {
51
- consumerName = `${base}-${(0, crypto_1.randomUUID)().replace(/-/g, '')}`;
91
+ catch (e) {
92
+ // Ignore cancel errors
52
93
  }
94
+ currentStream = null;
53
95
  }
54
- const req = { topic: topicName, consumer_name: consumerName, offset: 0 };
55
- let stream = null;
56
- try {
57
- stream = this.client.Consume(req);
96
+ if (retryTimeout) {
97
+ clearTimeout(retryTimeout);
98
+ retryTimeout = null;
58
99
  }
59
- catch (err) {
60
- // If the client throws synchronously (e.g. CANCELLED), return a promise
61
- // that is rejected but handled to avoid an unhandled rejection when
62
- // callers don't await `start()` (tests call start() without awaiting).
63
- const p = Promise.reject(err);
64
- p.catch(() => { }); // swallow to avoid uncaught rejection
65
- return p;
100
+ };
101
+ _cleanupFunctions.push(cleanup);
102
+ const startStream = async () => {
103
+ if (isStopped)
104
+ return;
105
+ try {
106
+ const client = (0, client_1.createClient)(address);
107
+ const req = {
108
+ topic: groupConfig.topic,
109
+ consumer_name: groupConfig.group,
110
+ offset: 0,
111
+ };
112
+ const stream = client.Consume(req);
113
+ currentStream = stream;
114
+ stream.on('data', async (protoMsg) => {
115
+ if (isStopped)
116
+ return;
117
+ // Pause stream to process message sequentially
118
+ stream.pause();
119
+ const msg = new message_1.Message(protoMsg);
120
+ if (groupConfig.handlers.length === 0) {
121
+ stream.resume();
122
+ return;
123
+ }
124
+ const handler = groupConfig.handlers[handlerIdx % groupConfig.handlers.length];
125
+ handlerIdx++;
126
+ const ctx = {
127
+ stub: client,
128
+ topic: groupConfig.topic,
129
+ consumerGroup: groupConfig.group,
130
+ offset: msg.offset,
131
+ committed: false,
132
+ };
133
+ try {
134
+ await (0, message_1.runWithContext)(ctx, async () => {
135
+ await handler(msg);
136
+ if (groupConfig.autoCommit && !ctx.committed) {
137
+ await (0, message_1.commit)();
138
+ }
139
+ });
140
+ }
141
+ catch (e) {
142
+ console.error(`Error processing message: ${e}`);
143
+ }
144
+ finally {
145
+ // Resume stream after processing
146
+ stream.resume();
147
+ }
148
+ });
149
+ stream.on('error', (err) => {
150
+ if (isStopped)
151
+ return;
152
+ // 1 = CANCELLED
153
+ if (err.code === 1)
154
+ return;
155
+ console.error(`Connection lost for ${groupConfig.topic}: ${err.message}. Retrying in 5s...`);
156
+ retryTimeout = setTimeout(startStream, 5000);
157
+ });
158
+ stream.on('end', () => {
159
+ if (isStopped)
160
+ return;
161
+ // console.warn(`Stream ended for ${groupConfig.topic}. Retrying in 5s...`);
162
+ retryTimeout = setTimeout(startStream, 5000);
163
+ });
66
164
  }
67
- stream.on('data', (msg) => {
68
- const message = new message_1.Message(msg);
69
- const handlers = this.handlers[req.topic] || [];
70
- for (const h of handlers) {
71
- // run handler within AsyncLocalStorage context so commit() can access stub and offset
72
- (0, message_1.runWithContext)({ stub: this.client, topic: req.topic, consumerName, offset: message.offset }, () => {
73
- try {
74
- h(message);
75
- }
76
- catch (e) { /* handler error ignored here */ }
77
- });
78
- }
79
- });
80
- const p = new Promise((resolve, reject) => {
81
- stream.on('end', () => resolve());
82
- stream.on('error', (e) => reject(e));
83
- });
84
- // prevent unhandled rejections when callers don't await the returned promise
85
- p.catch(() => { });
86
- return p;
87
- }
88
- // unregister any shared handlers when this consumer is discarded
89
- close() {
90
- for (const u of this.unregisterFns) {
91
- try {
92
- u();
93
- }
94
- catch (e) { /* ignore */ }
165
+ catch (e) {
166
+ if (isStopped)
167
+ return;
168
+ console.error(`Unexpected error in consumer ${groupConfig.topic}: ${e.message}`);
169
+ retryTimeout = setTimeout(startStream, 5000);
95
170
  }
96
- this.unregisterFns = [];
97
- }
171
+ };
172
+ startStream();
98
173
  }
99
- exports.Consumer = Consumer;
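The rewritten loop keys streams by `(topic, host, port, group)` and round-robins messages across handlers that share a key, while `grouped: false` appends a random suffix to the group name so the handler gets its own stream. A hedged sketch of both modes; the topic name and the assumption of a locally running broker are illustrative:

```typescript
import { consumer, run, Message } from 'pulse-sdk';

// A and B share the default group (config.client.id), so they share one Consume
// stream and each message is dispatched to exactly one of them, round-robin.
consumer('events', async (msg: Message) => console.log('A got offset', msg.offset));
consumer('events', async (msg: Message) => console.log('B got offset', msg.offset));

// C opts out of grouping: it is registered under a unique consumer name
// (`<group>-<uuid>`), gets its own stream, and receives every message independently.
consumer('events', async (msg: Message) => console.log('C got offset', msg.offset), { grouped: false });

run();
```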
package/dist/index.d.ts CHANGED
@@ -2,4 +2,3 @@ export * from './producer';
2
2
  export * from './consumer';
3
3
  export * from './config';
4
4
  export * from './message';
5
- export * from './consumerManager';
package/dist/index.js CHANGED
@@ -18,4 +18,3 @@ __exportStar(require("./producer"), exports);
18
18
  __exportStar(require("./consumer"), exports);
19
19
  __exportStar(require("./config"), exports);
20
20
  __exportStar(require("./message"), exports);
21
- __exportStar(require("./consumerManager"), exports);
package/dist/message.d.ts CHANGED
@@ -1,24 +1,19 @@
1
- export interface IMessage {
2
- offset: number;
3
- timestamp: number;
4
- payload: any;
5
- headers: Record<string, string>;
6
- }
7
- interface MessageContext {
1
+ export interface MessageContext {
8
2
  stub: any;
9
3
  topic: string;
10
- consumerName: string;
4
+ consumerGroup: string;
11
5
  offset: number;
12
- committed?: boolean;
6
+ committed: boolean;
13
7
  }
14
- export declare function runWithContext(ctx: MessageContext, fn: () => void): void;
15
- export declare function getContext(): MessageContext | undefined;
8
+ export declare function runWithContext(ctx: MessageContext, fn: () => void | Promise<void>): void | Promise<void>;
16
9
  export declare function commit(): Promise<void>;
17
- export declare class Message implements IMessage {
10
+ export declare class Message {
18
11
  offset: number;
19
12
  timestamp: number;
20
- payload: any;
21
- headers: Record<string, string>;
13
+ private _rawPayload;
14
+ private _headers;
22
15
  constructor(protoMsg: any);
16
+ get payload(): any;
17
+ get rawPayload(): Buffer;
18
+ get headers(): Record<string, string>;
23
19
  }
24
- export {};
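Payload decoding is now lazy: `Message` keeps the raw bytes and the `payload` getter decodes them according to the `payload-type` header, with `rawPayload` always available. A small sketch of inspecting a message inside a handler; the topic name is illustrative:

```typescript
import { consumer, run, Message } from 'pulse-sdk';

consumer('events', async (msg: Message) => {
  // Headers come straight from the broker; 'payload-type' is set by Producer.send().
  console.log('headers:', msg.headers);
  console.log('offset:', msg.offset, 'timestamp:', msg.timestamp);

  // Decoded per 'payload-type': object for 'json', string for 'string',
  // Buffer for 'bytes' (unknown types fall back to a JSON parse attempt).
  console.log('payload:', msg.payload);

  // rawPayload always returns the untouched bytes.
  console.log('raw length:', msg.rawPayload.length);
});

run();
```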
package/dist/message.js CHANGED
@@ -2,76 +2,73 @@
2
2
  Object.defineProperty(exports, "__esModule", { value: true });
3
3
  exports.Message = void 0;
4
4
  exports.runWithContext = runWithContext;
5
- exports.getContext = getContext;
6
5
  exports.commit = commit;
7
6
  const async_hooks_1 = require("async_hooks");
8
- const storage = new async_hooks_1.AsyncLocalStorage();
7
+ const contextStorage = new async_hooks_1.AsyncLocalStorage();
9
8
  function runWithContext(ctx, fn) {
10
- storage.run(ctx, fn);
11
- }
12
- function getContext() {
13
- return storage.getStore();
9
+ return contextStorage.run(ctx, fn);
14
10
  }
15
11
  async function commit() {
16
- const ctx = storage.getStore();
17
- if (!ctx)
12
+ const ctx = contextStorage.getStore();
13
+ if (!ctx) {
18
14
  throw new Error('commit() called outside of a consumer handler');
19
- if (ctx.committed)
15
+ }
16
+ if (ctx.committed) {
20
17
  return;
18
+ }
21
19
  return new Promise((resolve, reject) => {
22
- try {
23
- ctx.stub.CommitOffset({ topic: ctx.topic, consumer_name: ctx.consumerName, offset: ctx.offset + 1 }, (err, res) => {
24
- if (err)
25
- return reject(err);
26
- ctx.committed = true;
27
- resolve();
28
- });
29
- }
30
- catch (e) {
31
- reject(e);
32
- }
20
+ const req = {
21
+ topic: ctx.topic,
22
+ consumer_name: ctx.consumerGroup,
23
+ offset: ctx.offset + 1,
24
+ };
25
+ ctx.stub.CommitOffset(req, (err, res) => {
26
+ if (err) {
27
+ console.error(`Error committing offset: ${err}`);
28
+ return reject(err);
29
+ }
30
+ // console.log(`Committed offset ${req.offset} for ${req.consumer_name}`);
31
+ ctx.committed = true;
32
+ resolve();
33
+ });
33
34
  });
34
35
  }
35
36
  class Message {
36
37
  constructor(protoMsg) {
37
- this.offset = protoMsg.offset;
38
- this.timestamp = protoMsg.timestamp;
39
- this.headers = {};
40
- try {
41
- this.headers = Object.assign({}, protoMsg.headers);
42
- }
43
- catch (e) {
44
- this.headers = {};
45
- }
46
- const buf = protoMsg.payload;
47
- const ptype = this.headers['payload-type'];
38
+ this.offset = typeof protoMsg.offset === 'string' ? parseInt(protoMsg.offset, 10) : protoMsg.offset;
39
+ this.timestamp = typeof protoMsg.timestamp === 'string' ? parseInt(protoMsg.timestamp, 10) : protoMsg.timestamp;
40
+ this._rawPayload = protoMsg.payload;
41
+ this._headers = protoMsg.headers || {};
42
+ }
43
+ get payload() {
44
+ const ptype = this._headers['payload-type'];
48
45
  if (ptype === 'json') {
49
46
  try {
50
- this.payload = JSON.parse(buf.toString());
47
+ return JSON.parse(this._rawPayload.toString('utf-8'));
51
48
  }
52
49
  catch (e) {
53
- this.payload = buf;
50
+ return this._rawPayload;
54
51
  }
55
52
  }
56
- else if (ptype === 'string') {
57
- try {
58
- this.payload = buf.toString('utf8');
59
- }
60
- catch (e) {
61
- this.payload = buf;
62
- }
53
+ if (ptype === 'string') {
54
+ return this._rawPayload.toString('utf-8');
63
55
  }
64
- else if (ptype === 'bytes') {
65
- this.payload = buf;
56
+ if (ptype === 'bytes') {
57
+ return this._rawPayload;
66
58
  }
67
- else {
68
- try {
69
- this.payload = JSON.parse(buf.toString());
70
- }
71
- catch (e) {
72
- this.payload = buf;
73
- }
59
+ // Fallback
60
+ try {
61
+ return JSON.parse(this._rawPayload.toString('utf-8'));
62
+ }
63
+ catch (e) {
64
+ return this._rawPayload;
74
65
  }
75
66
  }
67
+ get rawPayload() {
68
+ return this._rawPayload;
69
+ }
70
+ get headers() {
71
+ return this._headers;
72
+ }
76
73
  }
77
74
  exports.Message = Message;
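`commit()` reads its target from the `AsyncLocalStorage` context that the consumer loop establishes per message, commits `offset + 1`, and is a no-op on a second call for the same message. A hedged sketch of the resulting rules; the topic name and handler body are illustrative and assume a reachable broker:

```typescript
import { consumer, run, commit, Message } from 'pulse-sdk';

consumer('transactions', async (msg: Message) => {
  // Inside the handler the context is set, so this commits msg.offset + 1.
  await commit();

  // A second call for the same message is skipped (ctx.committed is already true).
  await commit();
}, { autoCommit: false });

// Outside of a handler there is no context, so this rejects with
// "commit() called outside of a consumer handler".
commit().catch((err) => console.error(err.message));

run();
```

With `autoCommit` left at its default, the consumer loop itself calls `commit()` after the handler returns unless the handler has already committed.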
package/dist/producer.d.ts CHANGED
@@ -1,11 +1,8 @@
1
- import { PulseConfig } from './config';
2
- export interface PublishResponse {
3
- id: string;
4
- offset: number | string;
5
- }
6
1
  export declare class Producer {
7
- private config;
8
2
  private client;
9
- constructor(config: PulseConfig);
10
- send(topic: string, payload: any, headers?: Record<string, string>): Promise<PublishResponse>;
3
+ private config;
4
+ constructor(host?: string, port?: number);
5
+ private setupTopics;
6
+ send(topic: string, payload: any): Promise<void>;
7
+ close(): void;
11
8
  }
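The constructor change is breaking: 0.0.4 took a `PulseConfig` object, while 0.0.6 takes optional `host` / `port` overrides and reads everything else from `pulse.yaml`, and `send()` now resolves to `void` instead of a `PublishResponse`. A short before/after sketch with an illustrative address and topic:

```typescript
import { Producer } from 'pulse-sdk';

async function main() {
  // 0.0.4: new Producer({ grpcUrl: '10.0.0.5:50052', eventTypes: ['events'] })
  // 0.0.6: pulse.yaml (or the built-in defaults) supplies the config; host/port can be overridden:
  const producer = new Producer('10.0.0.5', 5556);

  // send() now resolves to void; the old { id, offset } response shape is gone.
  await producer.send('events', { type: 'user_created', id: 1 });
  producer.close();
}

main().catch(console.error);
```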
package/dist/producer.js CHANGED
@@ -1,25 +1,68 @@
1
1
  "use strict";
2
2
  Object.defineProperty(exports, "__esModule", { value: true });
3
3
  exports.Producer = void 0;
4
+ const config_1 = require("./config");
4
5
  const client_1 = require("./proto/client");
5
6
  class Producer {
6
- constructor(config) {
7
- this.config = config;
8
- this.client = (0, client_1.createClient)(config.grpcUrl);
7
+ constructor(host, port) {
8
+ this.config = (0, config_1.getConfig)();
9
+ const h = host || this.config.broker.host;
10
+ const p = port || this.config.broker.grpc_port;
11
+ const address = `${h}:${p}`;
12
+ this.client = (0, client_1.createClient)(address);
13
+ this.setupTopics();
9
14
  }
10
- async send(topic, payload, headers) {
15
+ setupTopics() {
16
+ const topics = this.config.topics || [];
17
+ for (const topicCfg of topics) {
18
+ if (topicCfg.create_if_missing) {
19
+ const req = {
20
+ topic: topicCfg.name,
21
+ fifo: topicCfg.config?.fifo || false,
22
+ retention_bytes: topicCfg.config?.retention_bytes || 0,
23
+ };
24
+ this.client.CreateTopic(req, (err, res) => {
25
+ // Ignore errors (e.g. topic already exists)
26
+ if (err) {
27
+ // console.warn(`Failed to create topic ${topicCfg.name}:`, err.message);
28
+ }
29
+ });
30
+ }
31
+ }
32
+ }
33
+ async send(topic, payload) {
34
+ let data;
35
+ const headers = {};
36
+ if (Buffer.isBuffer(payload)) {
37
+ data = payload;
38
+ headers['payload-type'] = 'bytes';
39
+ }
40
+ else if (typeof payload === 'string') {
41
+ data = Buffer.from(payload, 'utf-8');
42
+ headers['payload-type'] = 'string';
43
+ }
44
+ else if (typeof payload === 'object') {
45
+ data = Buffer.from(JSON.stringify(payload), 'utf-8');
46
+ headers['payload-type'] = 'json';
47
+ }
48
+ else {
49
+ throw new Error('Payload must be Buffer, string, or object');
50
+ }
11
51
  const req = {
12
52
  topic,
13
- payload: Buffer.from(JSON.stringify(payload)),
14
- headers: headers || {},
53
+ payload: data,
54
+ headers,
15
55
  };
16
56
  return new Promise((resolve, reject) => {
17
57
  this.client.Publish(req, (err, res) => {
18
58
  if (err)
19
59
  return reject(err);
20
- resolve({ id: res.id, offset: res.offset });
60
+ resolve();
21
61
  });
22
62
  });
23
63
  }
64
+ close() {
65
+ this.client.close();
66
+ }
24
67
  }
25
68
  exports.Producer = Producer;
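To make the serialization contract explicit: `send()` tags every publish with a `payload-type` header (`json`, `string`, or `bytes`), and the consuming side's `Message.payload` getter decodes according to that same header. A minimal round-trip sketch, assuming a local broker and an existing `logs` topic:

```typescript
import { Producer, consumer, run, Message } from 'pulse-sdk';

const producer = new Producer();

consumer('logs', async (msg: Message) => {
  const kind = msg.headers['payload-type'];
  // 'json'   -> msg.payload is a parsed object
  // 'string' -> msg.payload is a UTF-8 string
  // 'bytes'  -> msg.payload is the raw Buffer
  console.log(kind, msg.payload);
});

async function main() {
  await producer.send('logs', { level: 'info', text: 'hello' }); // payload-type: json
  await producer.send('logs', 'plain log line');                 // payload-type: string
  await producer.send('logs', Buffer.from([0x01, 0x02]));        // payload-type: bytes
  run();
}

main().catch(console.error);
```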
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "pulse-sdk",
3
- "version": "0.0.4",
3
+ "version": "0.0.6",
4
4
  "description": "Pulse SDK for Node.js/TypeScript",
5
5
  "main": "dist/index.js",
6
6
  "types": "dist/index.d.ts",
package/dist/consumerManager.d.ts DELETED
@@ -1,4 +0,0 @@
1
- type Handler = (msg: any) => void;
2
- export declare function shutdownAll(): Promise<void>;
3
- export declare function registerSharedHandler(grpcUrl: string, topic: string, consumerName: string, handler: Handler): () => void;
4
- export {};
package/dist/consumerManager.js DELETED
@@ -1,191 +0,0 @@
1
- "use strict";
2
- Object.defineProperty(exports, "__esModule", { value: true });
3
- exports.shutdownAll = shutdownAll;
4
- exports.registerSharedHandler = registerSharedHandler;
5
- const client_1 = require("./proto/client");
6
- const message_1 = require("./message");
7
- let suppressStreamWarnings = false;
8
- async function shutdownAll() {
9
- // When shutting down tests/teardown, suppress stream warnings so Jest doesn't
10
- // complain about logs after tests are finished.
11
- suppressStreamWarnings = true;
12
- for (const [k, entry] of Array.from(registry.entries())) {
13
- try {
14
- if (entry.stream) {
15
- try {
16
- entry.stream.removeAllListeners();
17
- }
18
- catch (_) { }
19
- try {
20
- if (entry.stream.cancel)
21
- entry.stream.cancel();
22
- }
23
- catch (_) { }
24
- entry.stream = null;
25
- }
26
- try {
27
- entry.handlers.clear();
28
- }
29
- catch (_) { }
30
- }
31
- catch (_) {
32
- // ignore individual errors
33
- }
34
- try {
35
- if (entry.client && typeof entry.client.close === 'function')
36
- entry.client.close();
37
- }
38
- catch (_) { }
39
- try {
40
- registry.delete(k);
41
- }
42
- catch (_) { }
43
- }
44
- }
45
- const registry = new Map();
46
- function keyFor(grpcUrl, topic, consumerName) {
47
- return `${grpcUrl}::${topic}::${consumerName}`;
48
- }
49
- function registerSharedHandler(grpcUrl, topic, consumerName, handler) {
50
- const k = keyFor(grpcUrl, topic, consumerName);
51
- let entry = registry.get(k);
52
- if (!entry) {
53
- const client = (0, client_1.createClient)(grpcUrl);
54
- entry = { topic, consumerName, client, stream: null, handlers: new Set(), nextIndex: 0, grpcUrl };
55
- // add handler before starting the stream to avoid losing early messages
56
- entry.handlers.add(handler);
57
- registry.set(k, entry);
58
- startStream(entry);
59
- return () => {
60
- // unregister
61
- const e = registry.get(k);
62
- if (!e)
63
- return;
64
- e.handlers.delete(handler);
65
- if (e.handlers.size === 0) {
66
- try {
67
- if (e.stream && e.stream.cancel)
68
- e.stream.cancel();
69
- }
70
- catch (err) { }
71
- registry.delete(k);
72
- }
73
- };
74
- }
75
- entry.handlers.add(handler);
76
- return () => {
77
- // unregister
78
- const e = registry.get(k);
79
- if (!e)
80
- return;
81
- e.handlers.delete(handler);
82
- if (e.handlers.size === 0) {
83
- // cleanup stream
84
- try {
85
- if (e.stream && e.stream.cancel)
86
- e.stream.cancel();
87
- }
88
- catch (e) {
89
- // ignore
90
- }
91
- registry.delete(k);
92
- }
93
- };
94
- }
95
- function startStream(entry) {
96
- const req = { topic: entry.topic, consumer_name: entry.consumerName, offset: 0 };
97
- const stream = entry.client.Consume(req);
98
- entry.stream = stream;
99
- // debug: console.log('consumerManager.startStream', entry.grpcUrl, entry.topic, entry.consumerName);
100
- stream.on('data', (msg) => {
101
- const message = new message_1.Message(msg);
102
- const handlers = Array.from(entry.handlers);
103
- if (handlers.length === 0)
104
- return;
105
- if (entry.nextIndex === undefined)
106
- entry.nextIndex = 0;
107
- const h = handlers[entry.nextIndex % handlers.length];
108
- entry.nextIndex = (entry.nextIndex + 1) % handlers.length;
109
- try {
110
- // run handler inside context providing the stub for commit()
111
- // debug: console.log('consumerManager.dispatch', entry.topic, 'offset', message.offset, 'handlerIndex', entry.nextIndex);
112
- (0, message_1.runWithContext)({ stub: entry.client, topic: entry.topic, consumerName: entry.consumerName, offset: message.offset }, () => {
113
- h(message);
114
- });
115
- }
116
- catch (e) {
117
- console.warn('handler error', e);
118
- }
119
- });
120
- stream.on('error', (e) => {
121
- // Log once and clean up the registry entry to avoid reconnect storms and test leaks
122
- if (!suppressStreamWarnings) {
123
- // Ignore normal client-side cancellations which happen during unregister
124
- // and shutdown; only log unexpected stream errors.
125
- try {
126
- const code = e && typeof e.code !== 'undefined' ? e.code : null;
127
- if (code !== 1) {
128
- console.warn('shared consumer stream error for', entry.topic, e);
129
- }
130
- }
131
- catch (_) {
132
- // if anything goes wrong determining code, log the error
133
- console.warn('shared consumer stream error for', entry.topic, e);
134
- }
135
- }
136
- try {
137
- if (entry.stream) {
138
- try {
139
- entry.stream.removeAllListeners();
140
- }
141
- catch (_) { }
142
- try {
143
- if (entry.stream.cancel)
144
- entry.stream.cancel();
145
- }
146
- catch (_) { }
147
- entry.stream = null;
148
- }
149
- }
150
- catch (err) {
151
- // ignore
152
- }
153
- try {
154
- try {
155
- if (entry.client && typeof entry.client.close === 'function')
156
- entry.client.close();
157
- }
158
- catch (_) { }
159
- registry.delete(keyFor(entry.grpcUrl || '', entry.topic, entry.consumerName));
160
- }
161
- catch (err) {
162
- // ignore
163
- }
164
- });
165
- stream.on('end', () => {
166
- // stream ended; clean up entry
167
- try {
168
- if (entry.stream) {
169
- try {
170
- entry.stream.removeAllListeners();
171
- }
172
- catch (_) { }
173
- try {
174
- if (entry.stream.cancel)
175
- entry.stream.cancel();
176
- }
177
- catch (_) { }
178
- entry.stream = null;
179
- }
180
- try {
181
- if (entry.client && typeof entry.client.close === 'function')
182
- entry.client.close();
183
- }
184
- catch (_) { }
185
- registry.delete(keyFor(entry.grpcUrl || '', entry.topic, entry.consumerName));
186
- }
187
- catch (err) {
188
- // ignore
189
- }
190
- });
191
- }
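Editor's note: with `consumerManager` and the `Consumer` class removed, 0.0.4-style consumer code needs a small rewrite. A hedged migration sketch; the topic, group name, and handler body are illustrative, and the 0.0.4 code is shown only in comments:

```typescript
import { consumer, run, stop, Message } from 'pulse-sdk';

// 0.0.4 (removed in 0.0.6):
//   const c = new Consumer({ grpcUrl: 'localhost:5556', eventTypes: ['events'] });
//   c.on('events', (msg) => console.log(msg.payload));
//   await c.start('events', 'my-consumer');
//   c.close();

// 0.0.6 equivalent: register handlers first, then start every stream with run().
consumer('events', async (msg: Message) => {
  console.log(msg.payload);
}, { consumerGroup: 'my-consumer' });

run();

// close() becomes stop(), which cancels the underlying Consume streams.
process.on('SIGTERM', () => stop());
```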