smart-pool 1.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +9 -0
- package/package.json +46 -0
- package/readme.md +1007 -0
- package/src/heap.js +130 -0
- package/src/index.d.ts +118 -0
- package/src/index.js +996 -0
package/readme.md
ADDED
@@ -0,0 +1,1007 @@
# Smart Pool [Build Status](https://github.com/Marcus-Johnson/smart-pool/actions)

A high-performance, feature-rich task queue and concurrency management library for Node.js. Built for production workloads requiring advanced scheduling, priority management, batching, rate limiting, circuit breaking, worker thread support, and adaptive concurrency.

## Features

- **Priority Queue**: Binary max-heap with dynamic priority adjustments
- **Concurrency Control**: Fixed or adaptive concurrency limits
- **Rate Limiting**: Per-type rate limits with token bucket algorithm
- **Circuit Breakers**: Automatic failure detection and recovery
- **Task Batching**: Group similar tasks for efficient processing
- **Worker Threads**: Offload CPU-intensive tasks to worker threads
- **Task Dependencies**: Execute tasks only after dependencies complete
- **Caching**: Deduplicate identical pending tasks
- **Retry Logic**: Exponential backoff with configurable limits
- **Abort Support**: Cancel tasks via AbortSignal
- **Priority Aging**: Prevent starvation with automatic priority boosts
- **Priority Decay**: Reduce priority of stale high-priority tasks
- **Metrics**: Real-time performance tracking with percentiles
- **Lifecycle Hooks**: Execute code at key points in task execution
- **Sub-queues**: Isolated queues with independent concurrency limits
- **Weight-based Load**: Track and limit load by task weight

## Installation

```bash
npm install smart-pool
```

## Quick Start

```javascript
import smartPool from 'smart-pool';

const pool = smartPool(5);

const result = await pool(async () => {
  return 'Task completed';
});

console.log(result);
```

## API Reference

### `smartPool(concurrency, options)`

Creates a new task pool instance.

**Parameters:**
- `concurrency` (number): Maximum number of concurrent tasks (minimum 1)
- `options` (object, optional): Global configuration options

**Returns:** PoolInstance

### Pool Instance

#### Methods

##### `pool(task, options)`

Enqueue and execute a task.

**Parameters:**
- `task` (function): Async function to execute
- `options` (number | object): Priority (number) or task options (object)

**Task Options:**
- `priority` (number): Task priority (higher = executed sooner). Default: 0
- `weight` (number): Task weight for load tracking. Default: 1
- `type` (string): Task type for rate limiting and circuit breaking
- `cacheKey` (string): Deduplicate identical pending tasks
- `batchKey` (string): Group tasks for batch processing
- `id` (string | number): Unique task identifier
- `tags` (string[]): Tags for filtering/cancellation
- `metadata` (object): Custom metadata
- `dependsOn` (array): Task IDs that must complete first
- `deadline` (number): Unix timestamp when task expires
- `signal` (AbortSignal): Abort signal for cancellation
- `timeout` (number): Task timeout in milliseconds
- `retryCount` (number): Maximum retry attempts
- `retryDelay` (number): Initial retry delay in milliseconds
- `worker` (object): Worker thread configuration
  - `path` (string): Path to worker script
  - `data` (any): Data to pass to worker

**Returns:** Promise resolving to task result

**Example:**
```javascript
const result = await pool(
  async () => fetchData(),
  {
    priority: 10,
    type: 'api',
    retryCount: 3,
    timeout: 5000
  }
);
```

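The identification and scheduling options can be combined on a single call. A second sketch (`generateInvoice` and `orderId` are hypothetical placeholders; the docs describe `deadline` only as a Unix timestamp, so a millisecond epoch from `Date.now()` is an assumption):

```javascript
// Sketch: tagging a task for later filtering/cancellation and giving it a deadline.
// `generateInvoice` and `orderId` are placeholders, not part of smart-pool.
const invoice = await pool(
  async () => generateInvoice(orderId),
  {
    id: `invoice-${orderId}`,
    tags: ['billing'],
    metadata: { orderId },
    weight: 2,
    deadline: Date.now() + 30000 // assumed to be a millisecond epoch timestamp
  }
);
```
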
##### `pool.map(items, fn, options)`

Map function over array items using the pool.

**Parameters:**
- `items` (array): Items to process
- `fn` (function): Async function to apply to each item
- `options` (number | object): Priority or task options
  - `throwOnError` (boolean): Throw on first error. Default: true

**Returns:** Promise<array> with results

**Example:**
```javascript
const results = await pool.map(
  [1, 2, 3, 4, 5],
  async (n) => n * 2,
  { priority: 5 }
);
```

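When individual items may fail, setting `throwOnError: false` keeps the remaining items running instead of rejecting on the first error. A minimal sketch (the URLs are placeholders, and how failed entries are represented in the returned array is not specified above):

```javascript
// Sketch: keep mapping past individual failures instead of failing fast.
const pages = await pool.map(
  ['https://example.com/a', 'https://example.com/b', 'https://example.com/missing'],
  async (url) => {
    const res = await fetch(url);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return res.text();
  },
  { priority: 5, throwOnError: false }
);
```
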
##### `pool.pause()`

Pause task execution. Queued tasks remain in queue.

**Example:**
```javascript
pool.pause();
console.log(pool.isPaused);
```

##### `pool.resume()`

Resume task execution after pause.

**Example:**
```javascript
pool.resume();
```

##### `pool.cancel(query)`

Cancel pending tasks matching criteria.

**Parameters:**
- `query` (object | function):
  - Object: `{ id, tag }` to match tasks
  - Function: `(task) => boolean` predicate

**Returns:** Number of cancelled tasks

**Example:**
```javascript
await pool(async () => work(), { id: 'task-1', tags: ['batch-1'] });
await pool(async () => work(), { id: 'task-2', tags: ['batch-1'] });

pool.cancel({ tag: 'batch-1' });

pool.cancel((task) => task.priority < 5);
```

##### `pool.onIdle()`

Wait for all tasks to complete, including batched and blocked tasks.

**Returns:** Promise<{ errors, failed, metrics }>

**Example:**
```javascript
const { errors, failed, metrics } = await pool.onIdle();
console.log(`Completed with ${errors.length} errors`);
```

##### `pool.drain()`

Stop accepting new tasks and wait for completion. Equivalent to enabling drain mode and then awaiting `onIdle()`.

**Returns:** Promise<{ errors, failed, metrics }>

**Example:**
```javascript
await pool.drain();
console.log('All tasks completed, pool drained');
```

##### `pool.clear()`

Cancel all pending tasks and terminate all workers. Resets pool state.

**Returns:** Promise<void>

**Example:**
```javascript
await pool.clear();
console.log('Pool cleared');
```

##### `pool.setConcurrency(limit)`

Dynamically adjust concurrency limit.

**Parameters:**
- `limit` (number): New concurrency limit

**Example:**
```javascript
pool.setConcurrency(10);
console.log(pool.concurrency);
```

##### `pool.peek()`

View the next task to be executed without removing it.

**Returns:** Task object or null

**Example:**
```javascript
const nextTask = pool.peek();
console.log(nextTask?.priority);
```

##### `pool.remove(predicate)`

Remove tasks from queue matching predicate.

**Parameters:**
- `predicate` (function): `(task) => boolean`

**Returns:** Boolean indicating if any tasks were removed

**Example:**
```javascript
pool.remove((task) => task.priority < 3);
```

##### `pool.useQueue(name, concurrency)`

Create or get an isolated sub-queue with independent concurrency control.

**Parameters:**
- `name` (string): Sub-queue identifier
- `concurrency` (number, optional): Sub-queue concurrency. Default: parent concurrency

**Returns:** PoolInstance for the sub-queue

**Example:**
```javascript
const apiQueue = pool.useQueue('api', 3);
const dbQueue = pool.useQueue('database', 5);

await apiQueue(async () => fetchAPI());
await dbQueue(async () => queryDB());
```

##### `pool.getWorkerHealth()`

Get health status of all worker threads.

**Returns:** Array<{ path, busy, active }>

**Example:**
```javascript
const health = pool.getWorkerHealth();
health.forEach(w => {
  console.log(`Worker ${w.path}: ${w.busy ? 'busy' : 'idle'}`);
});
```

#### Properties

##### `pool.activeCount`

Number of currently executing tasks (read-only).

##### `pool.pendingCount`

Number of tasks waiting in queue, batches, or blocked by dependencies (read-only).

##### `pool.currentLoad`

Current total weight of active tasks (read-only).

##### `pool.concurrency`

Current concurrency limit (read-only).

##### `pool.isDraining`

Whether pool is in drain mode (read-only).

##### `pool.isPaused`

Whether pool is paused (read-only).

##### `pool.metrics`

Performance metrics object (read-only):
- `totalTasks`: Total tasks processed
- `successfulTasks`: Successful task count
- `failedTasks`: Failed task count
- `throughput`: Tasks per second (formatted string)
- `errorRate`: Failure rate (formatted string)
- `percentiles`: Latency percentiles
  - `p50`: Median latency (ms)
  - `p90`: 90th percentile (ms)
  - `p99`: 99th percentile (ms)

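All of the above are plain read-only values, so a lightweight status logger needs nothing beyond them. A minimal sketch (the log format is illustrative, not part of the library):

```javascript
// Sketch: snapshot the documented read-only properties and metrics in one place.
function logPoolStatus(pool) {
  const { totalTasks, throughput, errorRate, percentiles } = pool.metrics;
  console.log(
    `active=${pool.activeCount} pending=${pool.pendingCount} load=${pool.currentLoad}`,
    `limit=${pool.concurrency} paused=${pool.isPaused} draining=${pool.isDraining}`,
    `tasks=${totalTasks} throughput=${throughput} errors=${errorRate} p99=${percentiles.p99}ms`
  );
}
```
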
### Global Options

Configure pool behavior during initialization:

```javascript
const pool = smartPool(5, {
  // Queue Management
  maxQueueSize: 10000,

  // Adaptive Concurrency
  adaptive: true,
  minConcurrency: 2,
  maxConcurrency: 20,

  // Rate Limiting
  rateLimits: {
    api: { interval: 1000, tasksPerInterval: 10 },
    database: { interval: 100, tasksPerInterval: 5 }
  },

  // Circuit Breaker
  circuitThreshold: 5,
  circuitResetTimeout: 30000,

  // Batching
  batchSize: 10,
  batchTimeout: 100,

  // Retry
  retryCount: 3,
  initialRetryDelay: 100,
  retryFactor: 2,
  maxRetryDelay: 10000,

  // Priority Management
  agingThreshold: 5,
  agingBoost: 1,
  decayThreshold: 10,
  decayAmount: 1,

  // Worker Threads
  workerPoolSize: 4,
  workerPathWhitelist: ['/app/workers/'],

  // Maintenance
  interval: 1000,
  completedTaskCleanupMs: 60000,
  maxLatencyHistory: 10000,
  maxErrorHistory: 1000,

  // Events
  emitter: eventEmitter,

  // Lifecycle Hooks
  onEnqueue: (task) => console.log('Enqueued:', task.id),
  onDequeue: (task) => console.log('Dequeued:', task.id),
  beforeExecute: (task) => console.log('Executing:', task.id),
  afterExecute: (task, profile) => {
    console.log('Completed:', task.id, profile.duration, 'ms');
  }
});
```

**Option Descriptions:**

**Queue Management:**
- `maxQueueSize`: Maximum number of queued tasks. Default: 10000

**Adaptive Concurrency:**
- `adaptive`: Enable automatic concurrency adjustment. Default: false
- `minConcurrency`: Minimum concurrent tasks. Default: 1
- `maxConcurrency`: Maximum concurrent tasks. Default: 2x initial concurrency

**Rate Limiting:**
- `rateLimits`: Per-type rate limits using token bucket
  - `interval`: Time window in milliseconds
  - `tasksPerInterval`: Tasks allowed per interval

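For example, `{ interval: 1000, tasksPerInterval: 10 }` budgets at most 10 task starts per second for that type; tasks beyond the budget stay queued until the bucket refills. A minimal sketch of that configuration (`runSearch` is a placeholder):

```javascript
// Sketch: 'search' tasks are capped at 10 starts per 1000 ms window.
// Queuing 25 at once means roughly 10 start immediately and the rest
// wait for tokens to refill on later intervals.
const pool = smartPool(20, {
  rateLimits: {
    search: { interval: 1000, tasksPerInterval: 10 }
  }
});

for (let i = 0; i < 25; i++) {
  pool(async () => runSearch(i), { type: 'search' });
}
```
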
**Circuit Breaker:**
- `circuitThreshold`: Consecutive failures to open circuit. Default: 5
- `circuitResetTimeout`: Time before retry after circuit opens (ms). Default: 30000

**Batching:**
- `batchSize`: Tasks per batch. Default: 10
- `batchTimeout`: Max wait time before flushing partial batch (ms). Default: 100

**Retry:**
- `retryCount`: Default retry attempts. Default: 0
- `initialRetryDelay`: Initial retry delay (ms). Default: 100
- `retryFactor`: Backoff multiplier. Default: 2
- `maxRetryDelay`: Maximum retry delay (ms). Default: 10000

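Taken together these describe a standard exponential backoff. Assuming the conventional formula (an illustration of the option semantics, not code from the library), the delay before retry attempt `n` is:

```javascript
// Sketch: delay before retry attempt n (0-based), capped at maxRetryDelay.
// With the defaults above: 100 ms, 200 ms, 400 ms, ... up to 10000 ms.
function retryDelay(n, { initialRetryDelay = 100, retryFactor = 2, maxRetryDelay = 10000 } = {}) {
  return Math.min(initialRetryDelay * retryFactor ** n, maxRetryDelay);
}
```
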
**Priority Management:**
- `agingThreshold`: Cycles before boosting low-priority tasks. Default: undefined
- `agingBoost`: Priority increase amount. Default: 1
- `decayThreshold`: Cycles before decaying high-priority tasks. Default: undefined
- `decayAmount`: Priority decrease amount. Default: 1

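Aging and decay are applied during maintenance cycles, whose length is set by the `interval` option under Maintenance below. A rough worked example of how the numbers combine (it assumes one maintenance tick counts as one cycle, which these options imply but the text does not state outright):

```javascript
// Sketch: with interval: 1000, one cycle is ~1 second.
// A task still queued after 5 cycles (~5 s) gains agingBoost priority;
// a high-priority task still queued after 10 cycles loses decayAmount.
const pool = smartPool(3, {
  interval: 1000,
  agingThreshold: 5,  // boost after ~5 s of waiting
  agingBoost: 1,
  decayThreshold: 10, // decay after ~10 s of waiting
  decayAmount: 1
});
```
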
**Worker Threads:**
- `workerPoolSize`: Maximum worker threads. Default: 0 (disabled)
- `workerPathWhitelist`: Allowed worker script paths

**Maintenance:**
- `interval`: Maintenance cycle interval (ms). Default: 1000
- `completedTaskCleanupMs`: Time before cleaning completed task records (ms). Default: 60000
- `maxLatencyHistory`: Maximum latency samples to retain. Default: 10000
- `maxErrorHistory`: Maximum error samples to retain. Default: 1000

**Events:**
- `emitter`: EventEmitter instance for pool events

**Lifecycle Hooks:**
- `onEnqueue`: Called when task added to queue
- `onDequeue`: Called when task removed from queue
- `beforeExecute`: Called before task execution
- `afterExecute`: Called after task execution with profile data

## Tutorials

### 1. Basic Task Queue

Simple task queue with priority management:

```javascript
import smartPool from 'smart-pool';

const pool = smartPool(3);

await pool(async () => {
  console.log('Low priority task');
}, 1);

await pool(async () => {
  console.log('High priority task');
}, 10);

await pool.onIdle();
```

### 2. API Rate Limiting

Respect API rate limits with type-based limiting:

```javascript
const pool = smartPool(10, {
  rateLimits: {
    github: { interval: 3600000, tasksPerInterval: 5000 },
    twitter: { interval: 900000, tasksPerInterval: 300 }
  }
});

async function fetchGithubUser(username) {
  return pool(
    async () => {
      const res = await fetch(`https://api.github.com/users/${username}`);
      return res.json();
    },
    { type: 'github', priority: 5 }
  );
}

async function fetchTweet(id) {
  return pool(
    async () => {
      const res = await fetch(`https://api.twitter.com/tweets/${id}`);
      return res.json();
    },
    { type: 'twitter', priority: 3 }
  );
}

const users = await Promise.all([
  fetchGithubUser('alice'),
  fetchGithubUser('bob'),
  fetchGithubUser('charlie')
]);
```

### 3. Task Batching

Batch database operations for efficiency:

```javascript
const pool = smartPool(5, {
  batchSize: 50,
  batchTimeout: 100
});

async function insertUser(user) {
  return pool(
    async (batch) => {
      const ids = await db.users.insertMany(batch.map(t => t.data));
      return ids[batch.findIndex(t => t.data === user)];
    },
    {
      batchKey: 'user-insert',
      data: user
    }
  );
}

const users = Array.from({ length: 200 }, (_, i) => ({
  name: `User ${i}`,
  email: `user${i}@example.com`
}));

const ids = await Promise.all(users.map(insertUser));
console.log(`Inserted ${ids.length} users in batches`);
```

### 4. Circuit Breaker

Protect external services from cascading failures:

```javascript
const pool = smartPool(5, {
  circuitThreshold: 3,
  circuitResetTimeout: 30000,
  retryCount: 2,
  initialRetryDelay: 1000
});

async function callUnstableAPI(endpoint) {
  return pool(
    async () => {
      const res = await fetch(endpoint);
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return res.json();
    },
    { type: 'unstable-api' }
  );
}

try {
  const data = await callUnstableAPI('https://api.example.com/data');
  console.log(data);
} catch (err) {
  console.error('API call failed:', err.message);
}
```

### 5. Worker Threads

Offload CPU-intensive work to worker threads:

**worker.js:**
```javascript
import { parentPort } from 'node:worker_threads';

parentPort.on('message', ({ data }) => {
  const result = expensiveComputation(data);
  parentPort.postMessage({ result });
});

function expensiveComputation(n) {
  let sum = 0;
  for (let i = 0; i < n; i++) {
    sum += Math.sqrt(i);
  }
  return sum;
}
```

**main.js:**
```javascript
const pool = smartPool(5, {
  workerPoolSize: 4,
  workerPathWhitelist: ['/app/workers/']
});

const results = await pool.map(
  [1000000, 2000000, 3000000],
  async (n) => {
    return pool(
      async () => {},
      {
        worker: {
          path: '/app/workers/worker.js',
          data: n
        }
      }
    );
  }
);

console.log(results);
```

### 6. Task Dependencies

Execute tasks after dependencies complete:

```javascript
const pool = smartPool(10);

const userId = await pool(
  async () => db.users.create({ name: 'Alice' }),
  { id: 'create-user' }
);

const profileId = await pool(
  async () => db.profiles.create({ userId, bio: 'Developer' }),
  { id: 'create-profile', dependsOn: ['create-user'] }
);

await pool(
  async () => sendWelcomeEmail(userId),
  { dependsOn: ['create-user', 'create-profile'] }
);
```

### 7. Task Caching

Deduplicate identical pending requests:

```javascript
const pool = smartPool(5);

async function fetchUserData(userId) {
  return pool(
    async () => {
      console.log(`Fetching user ${userId}`);
      const res = await fetch(`https://api.example.com/users/${userId}`);
      return res.json();
    },
    { cacheKey: `user-${userId}` }
  );
}

const [user1, user2, user3] = await Promise.all([
  fetchUserData(123),
  fetchUserData(123),
  fetchUserData(123)
]);

console.log('Only one request made');
```

### 8. Adaptive Concurrency

Automatically adjust concurrency based on performance:

```javascript
const pool = smartPool(5, {
  adaptive: true,
  minConcurrency: 2,
  maxConcurrency: 20,
  adaptiveLatencyLow: 50,
  adaptiveLatencyHigh: 200
});

for (let i = 0; i < 1000; i++) {
  pool(async () => {
    await simulateWork();
  });
}

const monitor = setInterval(() => {
  console.log(`Current concurrency: ${pool.concurrency}`);
  console.log(`Active tasks: ${pool.activeCount}`);
  console.log(`Pending tasks: ${pool.pendingCount}`);
}, 1000);

await pool.onIdle();
clearInterval(monitor);
```

### 9. Priority Aging

Prevent task starvation with automatic priority boosts:

```javascript
const pool = smartPool(3, {
  agingThreshold: 5,
  agingBoost: 1,
  interval: 1000
});

for (let i = 0; i < 100; i++) {
  pool(
    async () => {
      console.log(`Task ${i}`);
      await sleep(100);
    },
    { priority: i < 10 ? 1 : 10 }
  );
}

await pool.onIdle();
```

### 10. Sub-queues

Isolate different workload types:

```javascript
const pool = smartPool(10);

const criticalQueue = pool.useQueue('critical', 5);
const backgroundQueue = pool.useQueue('background', 2);

await criticalQueue(async () => {
  await processPayment();
});

await backgroundQueue(async () => {
  await generateReport();
});

console.log(`Critical active: ${criticalQueue.activeCount}`);
console.log(`Background active: ${backgroundQueue.activeCount}`);
```

### 11. Timeout and Abort

Cancel tasks via timeout or AbortSignal:

```javascript
const pool = smartPool(5);

const controller = new AbortController();

const timeoutTask = pool(
  async () => {
    await longRunningOperation();
  },
  { timeout: 5000 }
);

const abortTask = pool(
  async () => {
    await anotherOperation();
  },
  { signal: controller.signal }
);

setTimeout(() => controller.abort(), 2000);

try {
  await Promise.all([timeoutTask, abortTask]);
} catch (err) {
  console.error('Task cancelled:', err.message);
}
```

### 12. Metrics and Monitoring

Track performance metrics:

```javascript
const pool = smartPool(10);

for (let i = 0; i < 1000; i++) {
  pool(async () => {
    await simulateWork();
  });
}

await pool.onIdle();

const metrics = pool.metrics;
console.log(`Total tasks: ${metrics.totalTasks}`);
console.log(`Success rate: ${((1 - parseFloat(metrics.errorRate)) * 100).toFixed(2)}%`);
console.log(`Throughput: ${metrics.throughput} tasks/sec`);
console.log(`Latency p50: ${metrics.percentiles.p50}ms`);
console.log(`Latency p90: ${metrics.percentiles.p90}ms`);
console.log(`Latency p99: ${metrics.percentiles.p99}ms`);
```

### 13. Lifecycle Hooks

Monitor task execution:

```javascript
const pool = smartPool(5, {
  onEnqueue: (task) => {
    console.log(`[ENQUEUE] ${task.id || 'anonymous'} (priority: ${task.priority})`);
  },
  onDequeue: (task) => {
    console.log(`[DEQUEUE] ${task.id || 'anonymous'}`);
  },
  beforeExecute: (task) => {
    console.log(`[EXECUTE] ${task.id || 'anonymous'}`);
  },
  afterExecute: (task, profile) => {
    console.log(`[COMPLETE] ${task.id || 'anonymous'} in ${profile.duration}ms`);
    if (profile.error) {
      console.error(`[ERROR] ${profile.error}`);
    }
  }
});

await pool(async () => {
  await performWork();
}, { id: 'my-task', priority: 10 });
```

### 14. Weight-based Load

Track and limit load by task weight:

```javascript
const pool = smartPool(100);

async function cpuIntensiveTask() {
  return pool(
    async () => {
      return performComputation();
    },
    { weight: 10 }
  );
}

async function lightweightTask() {
  return pool(
    async () => {
      return fetchData();
    },
    { weight: 1 }
  );
}

await Promise.all([
  ...Array(5).fill().map(cpuIntensiveTask),
  ...Array(50).fill().map(lightweightTask)
]);

console.log(`Current load: ${pool.currentLoad}`);
```

### 15. Task Cancellation

Cancel tasks by ID, tag, or predicate:

```javascript
const pool = smartPool(5);

for (let i = 0; i < 100; i++) {
  pool(
    async () => {
      await processItem(i);
    },
    {
      id: `task-${i}`,
      tags: i % 2 === 0 ? ['even'] : ['odd'],
      priority: i
    }
  ).catch(err => {
    if (err.message === 'Task cancelled via API') {
      console.log(`Task ${i} was cancelled`);
    }
  });
}

pool.cancel({ tag: 'even' });

pool.cancel((task) => task.priority < 50);

await pool.onIdle();
```

## Events

When an emitter is provided, the pool emits these events:

- `circuit:open` - Circuit breaker opened
- `circuit:closed` - Circuit breaker closed
- `concurrency:adjust` - Adaptive concurrency changed
- `task:retry` - Task retry attempt
- `task:timeout` - Task timeout
- `batch:flush` - Batch flushed

```javascript
import { EventEmitter } from 'events';

const emitter = new EventEmitter();
const pool = smartPool(5, { emitter });

emitter.on('circuit:open', ({ type }) => {
  console.log(`Circuit opened for ${type}`);
});

emitter.on('concurrency:adjust', ({ concurrency, reason }) => {
  console.log(`Concurrency adjusted to ${concurrency}: ${reason}`);
});

emitter.on('task:retry', ({ id, attempt, delay }) => {
  console.log(`Retrying task ${id} (attempt ${attempt}) after ${delay}ms`);
});
```

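The remaining events can be subscribed to in the same way. Their payload shapes are not documented above, so this sketch simply logs whatever the pool emits rather than destructuring specific fields:

```javascript
// Sketch: continuing the example above; payload shapes are unspecified, so log them as-is.
emitter.on('circuit:closed', (info) => console.log('circuit closed', info));
emitter.on('task:timeout', (info) => console.log('task timed out', info));
emitter.on('batch:flush', (info) => console.log('batch flushed', info));
```
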
## Best Practices

### 1. Choose Appropriate Concurrency

Start conservative and adjust based on metrics:

```javascript
const pool = smartPool(5, {
  adaptive: true,
  minConcurrency: 2,
  maxConcurrency: 20
});
```

### 2. Use Type-based Rate Limiting

Respect external API limits:

```javascript
const pool = smartPool(10, {
  rateLimits: {
    'api-provider': { interval: 60000, tasksPerInterval: 100 }
  }
});
```

### 3. Implement Circuit Breakers

Protect against cascading failures:

```javascript
const pool = smartPool(5, {
  circuitThreshold: 5,
  circuitResetTimeout: 30000,
  retryCount: 3
});
```

### 4. Batch Similar Operations

Reduce overhead for bulk operations:

```javascript
const pool = smartPool(5, {
  batchSize: 100,
  batchTimeout: 50
});
```

### 5. Use Sub-queues for Isolation

Separate critical and background work:

```javascript
const critical = pool.useQueue('critical', 10);
const background = pool.useQueue('background', 2);
```

### 6. Monitor Metrics

Track performance and adjust configuration:

```javascript
setInterval(() => {
  const { throughput, errorRate, percentiles } = pool.metrics;
  console.log({ throughput, errorRate, p99: percentiles.p99 });
}, 5000);
```

### 7. Handle Errors Gracefully

Always catch and handle task errors:

```javascript
try {
  await pool(async () => riskyOperation());
} catch (err) {
  console.error('Task failed:', err);
}
```

### 8. Clean Up Resources

Always drain or clear the pool on shutdown:

```javascript
process.on('SIGTERM', async () => {
  await pool.drain();
  process.exit(0);
});
```

## Performance Tips

1. **Batch when possible**: Use `batchKey` for operations that can be grouped
2. **Enable adaptive mode**: Let the pool optimize concurrency automatically
3. **Use worker threads**: Offload CPU-intensive tasks to avoid blocking
4. **Cache duplicate requests**: Use `cacheKey` to deduplicate pending tasks
5. **Set appropriate priorities**: High-priority tasks execute first
6. **Monitor metrics**: Use percentiles to identify bottlenecks
7. **Tune rate limits**: Match external service limits
8. **Use sub-queues**: Isolate different workload types

## License

MIT