@push.rocks/smartstream 3.2.4 → 3.3.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/dist_ts/00_commitinfo_data.js +1 -1
- package/dist_ts/smartstream.classes.smartduplex.d.ts +0 -2
- package/dist_ts/smartstream.classes.streamintake.d.ts +0 -1
- package/dist_ts/smartstream.functions.d.ts +0 -1
- package/dist_ts/smartstream.nodewebhelpers.d.ts +29 -0
- package/dist_ts/smartstream.nodewebhelpers.js +117 -3
- package/dist_ts/smartstream.plugins.d.ts +2 -2
- package/dist_ts/smartstream.plugins.js +3 -2
- package/dist_ts_web/00_commitinfo_data.js +1 -1
- package/dist_ts_web/plugins.js +1 -1
- package/npmextra.json +12 -6
- package/package.json +21 -17
- package/readme.md +335 -238
- package/ts/00_commitinfo_data.ts +1 -1
- package/ts/smartstream.nodewebhelpers.ts +119 -2
- package/ts/smartstream.plugins.ts +2 -1
- package/ts/tspublish.json +3 -0
- package/ts_web/00_commitinfo_data.ts +1 -1
- package/ts_web/plugins.ts +3 -3
- package/ts_web/tspublish.json +3 -0
- package/dist_ts/smartstream.classes.passthrough.d.ts +0 -8
- package/dist_ts/smartstream.classes.passthrough.js +0 -18
- package/dist_ts/smartstream.classes.smartstream.d.ts +0 -12
- package/dist_ts/smartstream.classes.smartstream.js +0 -48
- package/dist_ts/smartstream.duplex.d.ts +0 -23
- package/dist_ts/smartstream.duplex.js +0 -48
- package/dist_ts_web/convert.d.ts +0 -18
- package/dist_ts_web/convert.js +0 -45
package/readme.md
CHANGED
# @push.rocks/smartstream

A TypeScript-first library for creating and manipulating Node.js and Web streams with built-in backpressure handling, async transformations, and seamless Node.js ↔ Web stream interoperability.

## Issue Reporting and Security

For reporting bugs, issues, or security vulnerabilities, please visit [community.foss.global/](https://community.foss.global/). This is the central community hub for all issue reporting. Developers who sign and comply with our contribution agreement and go through identification can also get a [code.foss.global/](https://code.foss.global/) account to submit Pull Requests directly.

## Install

```bash
pnpm install @push.rocks/smartstream
```

The package ships with two entry points:

| Entry Point | Import Path | Environment |
|---|---|---|
| **Node.js** (default) | `@push.rocks/smartstream` | Node.js — full stream utilities, duplex, intake, wrappers, and Node↔Web helpers |
| **Web** | `@push.rocks/smartstream/web` | Browser & Node.js — pure Web Streams API (`WebDuplexStream`) |

## Usage

All examples use ESM / TypeScript syntax.

### 📦 Importing

```typescript
// Node.js — full API
import {
  SmartDuplex,
  StreamWrapper,
  StreamIntake,
  createTransformFunction,
  createPassThrough,
  nodewebhelpers,
} from '@push.rocks/smartstream';

// Web — browser-safe, zero Node.js dependencies
import { WebDuplexStream } from '@push.rocks/smartstream/web';
```

---

### 🔄 SmartDuplex — The Core Stream Primitive

`SmartDuplex` extends Node.js `Duplex` with first-class async support, built-in backpressure management, and a clean functional API. Instead of overriding `_transform` or `_write` manually, you pass a `writeFunction` that receives each chunk along with a `tools` object.

#### Basic Transform

```typescript
import { SmartDuplex } from '@push.rocks/smartstream';

const upperCaser = new SmartDuplex<Buffer, Buffer>({
  writeFunction: async (chunk, tools) => {
    // Return a value to push it downstream
    return Buffer.from(chunk.toString().toUpperCase());
  },
});

readableStream.pipe(upperCaser).pipe(writableStream);
```

#### Using `tools.push()` for Multiple Outputs

The `writeFunction` can emit multiple chunks per input via `tools.push()`:

```typescript
const splitter = new SmartDuplex<string, string>({
  objectMode: true,
  writeFunction: async (chunk, tools) => {
    const words = chunk.split(' ');
    for (const word of words) {
      await tools.push(word);
    }
    // Returning nothing — output was already pushed
  },
});
```

#### Final Function

Run cleanup or emit final data when the writable side ends:

```typescript
let runningTotal = 0;

const aggregator = new SmartDuplex<number, number>({
  objectMode: true,
  writeFunction: async (chunk, tools) => {
    runningTotal += chunk;
    // Don't emit anything per-chunk
  },
  finalFunction: async (tools) => {
    return runningTotal; // Emitted as the last chunk
  },
});
```
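
For comparison, the same accumulate-then-flush shape in plain Node.js streams uses a `Transform` with a `flush` callback (standard Node API; the `summer` stream below is a hypothetical example, not part of smartstream):

```typescript
import { Transform } from 'stream';

// Accumulate chunks, emit one final value on flush — the plain-Node
// analogue of writeFunction + finalFunction.
let total = 0;
const summer = new Transform({
  objectMode: true,
  transform(chunk: number, _enc, cb) {
    total += chunk;
    cb(); // emit nothing per-chunk
  },
  flush(cb) {
    cb(null, total); // emit the running total as the last chunk
  },
});

const outputs: number[] = [];
summer.on('data', (d: number) => outputs.push(d));
summer.write(1);
summer.write(2);
summer.write(3);
summer.end();

await new Promise<void>((resolve) => summer.on('end', resolve));
console.log(outputs); // [6]
```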

#### Truncating a Stream Early

Call `tools.truncate()` inside `writeFunction` to signal that no more data should be read:

```typescript
const limiter = new SmartDuplex<string, string>({
  objectMode: true,
  writeFunction: async (chunk, tools) => {
    if (chunk === 'STOP') {
      tools.truncate();
      return;
    }
    return chunk;
  },
});
```

#### Creating from a Buffer

```typescript
const stream = SmartDuplex.fromBuffer(Buffer.from('hello world'));
stream.on('data', (chunk) => console.log(chunk.toString())); // "hello world"
```

#### Creating from a Web ReadableStream

Bridge the Web Streams API into a Node.js Duplex:

```typescript
const response = await fetch('https://example.com/data');
const nodeDuplex = SmartDuplex.fromWebReadableStream(response.body);

nodeDuplex.pipe(processTransform).pipe(outputStream);
```

#### Getting Web Streams from SmartDuplex

Convert a `SmartDuplex` into a Web `ReadableStream` + `WritableStream` pair:

```typescript
const duplex = new SmartDuplex({
  writeFunction: async (chunk, tools) => {
    return transform(chunk);
  },
});

const { readable, writable } = await duplex.getWebStreams();
```

#### Debug Mode

Pass `debug: true` and `name` to get detailed internal logs:

```typescript
const stream = new SmartDuplex({
  name: 'MyStream',
  debug: true,
  writeFunction: async (chunk, tools) => chunk,
});
```

---

### 🧩 StreamWrapper — Pipeline Composition

`StreamWrapper` takes an array of streams, pipes them together, attaches error listeners on all of them, and returns a `Promise` that resolves when the pipeline finishes:

```typescript
import fs from 'fs';
import { SmartDuplex, StreamWrapper } from '@push.rocks/smartstream';

const pipeline = new StreamWrapper([
  fs.createReadStream('./input.txt'),
  new SmartDuplex({
    writeFunction: async (chunk) => Buffer.from(chunk.toString().toUpperCase()),
  }),
  fs.createWriteStream('./output.txt'),
]);

await pipeline.run();
console.log('Pipeline complete!');
```

Error handling is automatic — if any stream in the array errors, the returned promise rejects:

```typescript
pipeline.run()
  .then(() => console.log('Done'))
  .catch((err) => console.error('Pipeline failed:', err));
```

You can also listen for custom events across all streams:

```typescript
pipeline.onCustomEvent('progress', () => {
  console.log('Progress event fired');
});
```

---

### 📥 StreamIntake — Dynamic Data Injection

`StreamIntake` is a `Readable` stream that lets you programmatically push data into a pipeline. It operates in object mode by default and provides a reactive observable (`pushNextObservable`) for demand-driven data production.

```typescript
import { StreamIntake, SmartDuplex } from '@push.rocks/smartstream';

const intake = new StreamIntake<string>();

// Pipe through a transform
intake
  .pipe(new SmartDuplex({
    objectMode: true,
    writeFunction: async (chunk) => {
      console.log('Processing:', chunk);
      return chunk;
    },
  }))
  .on('data', (data) => console.log('Output:', data));

// Push data whenever it's ready
intake.pushData('Hello');
intake.pushData('World');
intake.signalEnd(); // Signal end-of-stream
```

#### Demand-driven Production with Observable

`pushNextObservable` emits whenever the stream is ready for more data — perfect for throttled or event-driven producers:

```typescript
const intake = new StreamIntake<number>();

let counter = 0;
intake.pushNextObservable.subscribe(() => {
  if (counter < 100) {
    intake.pushData(counter++);
  } else {
    intake.signalEnd();
  }
});

intake.pipe(consumer);
```

#### Creating from Existing Streams

Wrap a Node.js `Readable` or a Web `ReadableStream`:

```typescript
import fs from 'fs';

// From Node.js Readable
const intake = await StreamIntake.fromStream<Buffer>(fs.createReadStream('./data.bin'));

// From Web ReadableStream
const response = await fetch('https://example.com/stream');
const intake2 = await StreamIntake.fromStream<Uint8Array>(response.body);
```

---

### ⚡ Utility Functions

#### `createTransformFunction`

Quickly create a `SmartDuplex` from a simple async mapping function:

```typescript
import { createTransformFunction } from '@push.rocks/smartstream';

const doubler = createTransformFunction<number, number>(async (n) => n * 2);

intakeStream.pipe(doubler).pipe(outputStream);
```

#### `createPassThrough`

Create an object-mode passthrough stream (useful as an intermediary or tee point):

```typescript
import { createPassThrough } from '@push.rocks/smartstream';

const passThrough = createPassThrough();
source.pipe(passThrough).pipe(destination);
```

---

### 🌐 WebDuplexStream — Pure Web Streams API

`WebDuplexStream` extends `TransformStream` and works in both browsers and Node.js. Import it from the `/web` subpath for zero Node.js dependencies.

```typescript
import { WebDuplexStream } from '@push.rocks/smartstream/web';

const stream = new WebDuplexStream<number, number>({
  writeFunction: async (chunk, { push }) => {
    push(chunk * 2); // Push transformed data
  },
});

const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();

// Write
await writer.write(5);
await writer.write(10);
await writer.close();

// Read
const { value } = await reader.read(); // 10
const { value: v2 } = await reader.read(); // 20
```

#### From a Uint8Array

```typescript
const stream = WebDuplexStream.fromUInt8Array(new Uint8Array([1, 2, 3]));
const reader = stream.readable.getReader();
const { value } = await reader.read(); // Uint8Array [1, 2, 3]
```

#### Data Production with `readFunction`

Supply data into the stream from any async source:

```typescript
const stream = new WebDuplexStream<string, string>({
  readFunction: async (tools) => {
    await tools.write('chunk 1');
    await tools.write('chunk 2');
    tools.done(); // Signal end
  },
  writeFunction: async (chunk, { push }) => {
    push(chunk.toUpperCase());
  },
});

const reader = stream.readable.getReader();
// reads "CHUNK 1", "CHUNK 2"
```

---

### 🔀 Node ↔ Web Stream Converters

The `nodewebhelpers` namespace provides bidirectional converters between Node.js and Web Streams:

```typescript
import { nodewebhelpers } from '@push.rocks/smartstream';
```

| Function | From | To |
|---|---|---|
| `createWebReadableStreamFromFile(path)` | File path | Web `ReadableStream<Uint8Array>` |
| `convertWebReadableToNodeReadable(webStream)` | Web `ReadableStream` | Node.js `Readable` |
| `convertNodeReadableToWebReadable(nodeStream)` | Node.js `Readable` | Web `ReadableStream` |
| `convertWebWritableToNodeWritable(webWritable)` | Web `WritableStream` | Node.js `Writable` |
| `convertNodeWritableToWebWritable(nodeWritable)` | Node.js `Writable` | Web `WritableStream` |

#### Example: Serve a File as a Web ReadableStream

```typescript
const webStream = nodewebhelpers.createWebReadableStreamFromFile('./video.mp4');

// Use with fetch Response, service workers, etc.
return new Response(webStream, {
  headers: { 'Content-Type': 'video/mp4' },
});
```

#### Example: Convert Between Stream Types

```typescript
import fs from 'fs';
import { nodewebhelpers } from '@push.rocks/smartstream';

// Node → Web
const nodeReadable = fs.createReadStream('./data.bin');
const webReadable = nodewebhelpers.convertNodeReadableToWebReadable(nodeReadable);

// Web → Node
const nodeReadable2 = nodewebhelpers.convertWebReadableToNodeReadable(webReadable);
nodeReadable2.pipe(fs.createWriteStream('./copy.bin'));
```
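
For readable streams specifically, recent Node.js versions (≥ 17) also ship built-in converters, `Readable.toWeb` and `Readable.fromWeb`; the `nodewebhelpers` functions additionally cover writables and file-backed streams. A built-in round-trip sketch:

```typescript
import { Readable } from 'stream';

// Node Readable → Web ReadableStream → back to a Node Readable.
const nodeReadable = Readable.from(['hello', ' ', 'world']);
const webReadable = Readable.toWeb(nodeReadable);
const roundTripped = Readable.fromWeb(webReadable);

let result = '';
for await (const chunk of roundTripped) {
  result += chunk;
}
console.log(result); // "hello world"
```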

---

### 🏗️ Backpressure Handling

`SmartDuplex` uses a `BackpressuredArray` internally, bounded by `highWaterMark` (default: 1). When the downstream consumer is slow, the stream automatically pauses the upstream producer until space is available — no manual bookkeeping required.

```typescript
const slow = new SmartDuplex({
  name: 'SlowConsumer',
  objectMode: true,
  highWaterMark: 1,
  writeFunction: async (chunk, tools) => {
    await new Promise((resolve) => setTimeout(resolve, 200));
    return chunk;
  },
});

const fast = new SmartDuplex({
  name: 'FastProducer',
  objectMode: true,
  writeFunction: async (chunk, tools) => {
    return chunk; // Instant processing
  },
});

// Backpressure is handled automatically between fast → slow
fast.pipe(slow).on('data', (d) => console.log(d));

for (let i = 0; i < 100; i++) {
  fast.write(`chunk-${i}`);
}
fast.end();
```

---

### 🎯 Real-World Example: Processing Pipeline

```typescript
import fs from 'fs';
import { SmartDuplex, StreamWrapper } from '@push.rocks/smartstream';

// Read → Transform → Filter → Write
const pipeline = new StreamWrapper([
  fs.createReadStream('./access.log'),
  new SmartDuplex({
    writeFunction: async (chunk) => {
      // Parse each line
      return chunk.toString().split('\n');
    },
  }),
  new SmartDuplex({
    objectMode: true,
    writeFunction: async (lines: string[], tools) => {
      // Filter and push matching lines
      for (const line of lines) {
        if (line.includes('ERROR')) {
          await tools.push(line + '\n');
        }
      }
    },
  }),
  fs.createWriteStream('./errors.log'),
]);

await pipeline.run();
console.log('Error extraction complete');
```

## License and Legal Information

This repository contains open-source code licensed under the MIT License. A copy of the license can be found in the [LICENSE](./LICENSE) file.

**Please note:** The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.

### Trademarks

This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH or third parties, and are not included within the scope of the MIT license granted herein.

Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines or the guidelines of the respective third-party owners, and any usage must be approved in writing. Third-party trademarks used herein are the property of their respective owners and used only in a descriptive manner, e.g. for an implementation of an API or similar.

### Company Information

Task Venture Capital GmbH
Registered at District Court Bremen HRB 35230 HB, Germany

For any legal inquiries or further information, please contact us via email at hello@task.vc.

By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.