@push.rocks/smartstream 3.2.5 → 3.4.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (34)
  1. package/dist_ts/00_commitinfo_data.js +1 -1
  2. package/dist_ts/smartstream.classes.smartduplex.d.ts +10 -6
  3. package/dist_ts/smartstream.classes.smartduplex.js +182 -72
  4. package/dist_ts/smartstream.classes.streamintake.d.ts +0 -1
  5. package/dist_ts/smartstream.classes.streamintake.js +6 -3
  6. package/dist_ts/smartstream.classes.streamwrapper.js +11 -12
  7. package/dist_ts/smartstream.functions.d.ts +0 -1
  8. package/dist_ts/smartstream.nodewebhelpers.d.ts +2 -3
  9. package/dist_ts/smartstream.nodewebhelpers.js +97 -36
  10. package/dist_ts/smartstream.plugins.d.ts +0 -2
  11. package/dist_ts_web/00_commitinfo_data.js +1 -1
  12. package/dist_ts_web/classes.webduplexstream.js +20 -15
  13. package/dist_ts_web/plugins.js +1 -1
  14. package/npmextra.json +12 -6
  15. package/package.json +21 -17
  16. package/readme.md +335 -238
  17. package/ts/00_commitinfo_data.ts +1 -1
  18. package/ts/smartstream.classes.smartduplex.ts +211 -78
  19. package/ts/smartstream.classes.streamintake.ts +5 -2
  20. package/ts/smartstream.classes.streamwrapper.ts +11 -12
  21. package/ts/smartstream.nodewebhelpers.ts +105 -37
  22. package/ts/tspublish.json +3 -0
  23. package/ts_web/00_commitinfo_data.ts +1 -1
  24. package/ts_web/classes.webduplexstream.ts +20 -13
  25. package/ts_web/plugins.ts +3 -3
  26. package/ts_web/tspublish.json +3 -0
  27. package/dist_ts/smartstream.classes.passthrough.d.ts +0 -8
  28. package/dist_ts/smartstream.classes.passthrough.js +0 -18
  29. package/dist_ts/smartstream.classes.smartstream.d.ts +0 -12
  30. package/dist_ts/smartstream.classes.smartstream.js +0 -48
  31. package/dist_ts/smartstream.duplex.d.ts +0 -23
  32. package/dist_ts/smartstream.duplex.js +0 -48
  33. package/dist_ts_web/convert.d.ts +0 -18
  34. package/dist_ts_web/convert.js +0 -45
package/readme.md CHANGED
@@ -1,375 +1,472 @@
1
- ```markdown
2
1
  # @push.rocks/smartstream
3
- A TypeScript library to simplify the creation and manipulation of Node.js streams, providing utilities for transform, duplex, and readable/writable stream handling while managing backpressure effectively.
2
+
3
+ A TypeScript-first library for creating and manipulating Node.js and Web streams with built-in backpressure handling, async transformations, and seamless Node.js ↔ Web stream interoperability.
4
+
5
+ ## Issue Reporting and Security
6
+
7
+ For reporting bugs, issues, or security vulnerabilities, please visit [community.foss.global/](https://community.foss.global/). This is the central community hub for all issue reporting. Developers who sign and comply with our contribution agreement and go through identification can also get a [code.foss.global/](https://code.foss.global/) account to submit Pull Requests directly.
4
8
 
5
9
  ## Install
6
- To install `@push.rocks/smartstream`, you can use npm or yarn as follows:
7
10
 
8
11
  ```bash
9
- npm install @push.rocks/smartstream --save
10
- # OR
11
- yarn add @push.rocks/smartstream
12
+ pnpm install @push.rocks/smartstream
12
13
  ```
13
14
 
14
- This will add `@push.rocks/smartstream` to your project's dependencies.
15
+ The package ships with two entry points:
16
+
17
+ | Entry Point | Import Path | Environment |
18
+ |---|---|---|
19
+ | **Node.js** (default) | `@push.rocks/smartstream` | Node.js — full stream utilities, duplex, intake, wrappers, and Node↔Web helpers |
20
+ | **Web** | `@push.rocks/smartstream/web` | Browser & Node.js — pure Web Streams API (`WebDuplexStream`) |
15
21
 
16
22
  ## Usage
17
23
 
18
- The `@push.rocks/smartstream` module is designed to simplify working with Node.js streams by providing a set of utilities for creating and manipulating streams. This module makes extensive use of TypeScript for improved code quality, readability, and maintenance. ESM syntax is utilized throughout the examples.
24
+ All examples use ESM / TypeScript syntax.
25
+
26
+ ### 📦 Importing
27
+
28
+ ```typescript
29
+ // Node.js — full API
30
+ import {
31
+ SmartDuplex,
32
+ StreamWrapper,
33
+ StreamIntake,
34
+ createTransformFunction,
35
+ createPassThrough,
36
+ nodewebhelpers,
37
+ } from '@push.rocks/smartstream';
38
+
39
+ // Web — browser-safe, zero Node.js dependencies
40
+ import { WebDuplexStream } from '@push.rocks/smartstream/web';
41
+ ```
42
+
43
+ ---
44
+
45
+ ### 🔄 SmartDuplex — The Core Stream Primitive
19
46
 
20
- ### Importing the Module
47
+ `SmartDuplex` extends Node.js `Duplex` with first-class async support, built-in backpressure management, and a clean functional API. Instead of overriding `_transform` or `_write` manually, you pass a `writeFunction` that receives each chunk along with a `tools` object.
21
48
 
22
- Start by importing the module into your TypeScript file:
49
+ #### Basic Transform
23
50
 
24
51
  ```typescript
25
- import * as smartstream from '@push.rocks/smartstream';
52
+ import { SmartDuplex } from '@push.rocks/smartstream';
53
+
54
+ const upperCaser = new SmartDuplex<Buffer, Buffer>({
55
+ writeFunction: async (chunk, tools) => {
56
+ // Return a value to push it downstream
57
+ return Buffer.from(chunk.toString().toUpperCase());
58
+ },
59
+ });
60
+
61
+ readableStream.pipe(upperCaser).pipe(writableStream);
26
62
  ```
27
63
 
28
- For a more specific import, you may do the following:
64
+ #### Using `tools.push()` for Multiple Outputs
65
+
66
+ The `writeFunction` can emit multiple chunks per input via `tools.push()`:
29
67
 
30
68
  ```typescript
31
- import { SmartDuplex, StreamWrapper, StreamIntake, createTransformFunction, createPassThrough } from '@push.rocks/smartstream';
69
+ const splitter = new SmartDuplex<string, string>({
70
+ objectMode: true,
71
+ writeFunction: async (chunk, tools) => {
72
+ const words = chunk.split(' ');
73
+ for (const word of words) {
74
+ await tools.push(word);
75
+ }
76
+ // Returning nothing — output was already pushed
77
+ },
78
+ });
32
79
  ```
33
80
 
34
- ### Creating Basic Transform Streams
81
+ #### Final Function
35
82
 
36
- The module provides utilities for creating transform streams. For example, to create a transform stream that modifies chunks of data, you can use the `createTransformFunction` utility:
83
+ Run cleanup or emit final data when the writable side ends:
37
84
 
38
85
  ```typescript
39
- import { createTransformFunction } from '@push.rocks/smartstream';
86
+ let runningTotal = 0;
+ const aggregator = new SmartDuplex<number, number>({
87
+ objectMode: true,
88
+ writeFunction: async (chunk, tools) => {
89
+ runningTotal += chunk;
90
+ // Don't emit anything per-chunk
91
+ },
92
+ finalFunction: async (tools) => {
93
+ return runningTotal; // Emitted as the last chunk
94
+ },
95
+ });
96
+ ```
97
+
98
+ #### Truncating a Stream Early
99
+
100
+ Call `tools.truncate()` inside `writeFunction` to signal that no more data should be read:
40
101
 
41
- const upperCaseTransform = createTransformFunction<string, string>(async (chunk) => {
42
- return chunk.toUpperCase();
102
+ ```typescript
103
+ const limiter = new SmartDuplex<string, string>({
104
+ objectMode: true,
105
+ writeFunction: async (chunk, tools) => {
106
+ if (chunk === 'STOP') {
107
+ tools.truncate();
108
+ return;
109
+ }
110
+ return chunk;
111
+ },
43
112
  });
113
+ ```
44
114
 
45
- // Usage with pipe
46
- readableStream
47
- .pipe(upperCaseTransform)
48
- .pipe(writableStream);
115
+ #### Creating from a Buffer
116
+
117
+ ```typescript
118
+ const stream = SmartDuplex.fromBuffer(Buffer.from('hello world'));
119
+ stream.on('data', (chunk) => console.log(chunk.toString())); // "hello world"
49
120
  ```
50
121
 
51
- ### Handling Backpressure with SmartDuplex
122
+ #### Creating from a Web ReadableStream
52
123
 
53
- `SmartDuplex` is a powerful part of the `smartstream` module designed to handle backpressure effectively. Here's an example of how to create a `SmartDuplex` stream that processes data and respects the consumer's pace:
124
+ Bridge the Web Streams API into a Node.js Duplex:
54
125
 
55
126
  ```typescript
56
- import { SmartDuplex } from '@push.rocks/smartstream';
127
+ const response = await fetch('https://example.com/data');
128
+ const nodeDuplex = SmartDuplex.fromWebReadableStream(response.body);
57
129
 
58
- const processDataDuplex = new SmartDuplex({
59
- async writeFunction(chunk, { push }) {
60
- const processedChunk = await processChunk(chunk); // Assume this is a defined asynchronous function
61
- push(processedChunk);
62
- }
130
+ nodeDuplex.pipe(processTransform).pipe(outputStream);
131
+ ```
132
+
133
+ #### Getting Web Streams from SmartDuplex
134
+
135
+ Convert a `SmartDuplex` into Web `ReadableStream` + `WritableStream` pair:
136
+
137
+ ```typescript
138
+ const duplex = new SmartDuplex({
139
+ writeFunction: async (chunk, tools) => {
140
+ return transform(chunk);
141
+ },
63
142
  });
64
143
 
65
- sourceStream.pipe(processDataDuplex).pipe(destinationStream);
144
+ const { readable, writable } = await duplex.getWebStreams();
145
+ ```
146
+
147
+ #### Debug Mode
148
+
149
+ Pass `debug: true` and `name` to get detailed internal logs:
150
+
151
+ ```typescript
152
+ const stream = new SmartDuplex({
153
+ name: 'MyStream',
154
+ debug: true,
155
+ writeFunction: async (chunk, tools) => chunk,
156
+ });
66
157
  ```
67
158
 
68
- ### Combining Multiple Streams
159
+ ---
69
160
 
70
- `Smartstream` facilitates easy combining of multiple streams into a single pipeline, handling errors and cleanup automatically. Here's how you can combine multiple streams:
161
+ ### 🧩 StreamWrapper — Pipeline Composition
162
+
163
+ `StreamWrapper` takes an array of streams, pipes them together, attaches error listeners to all of them, and returns a `Promise` that resolves when the pipeline finishes:
71
164
 
72
165
  ```typescript
73
166
  import { StreamWrapper } from '@push.rocks/smartstream';
167
+ import fs from 'fs';
74
168
 
75
- const combinedStream = new StreamWrapper([
76
- readStream, // Source stream
77
- transformStream1, // Transformation
78
- transformStream2, // Another transformation
79
- writeStream // Destination stream
169
+ const pipeline = new StreamWrapper([
170
+ fs.createReadStream('./input.txt'),
171
+ new SmartDuplex({
172
+ writeFunction: async (chunk) => Buffer.from(chunk.toString().toUpperCase()),
173
+ }),
174
+ fs.createWriteStream('./output.txt'),
80
175
  ]);
81
176
 
82
- combinedStream.run()
83
- .then(() => console.log('Processing completed.'))
84
- .catch(err => console.error('An error occurred:', err));
177
+ await pipeline.run();
178
+ console.log('Pipeline complete!');
85
179
  ```
86
180
 
87
- ### Working with StreamIntake
88
-
89
- `StreamIntake` allows for more dynamic control of the reading process, facilitating scenarios where data is not continuously available:
181
+ Error handling is automatic — if any stream in the array errors, the returned promise rejects:
90
182
 
91
183
  ```typescript
92
- import { StreamIntake } from '@push.rocks/smartstream';
93
-
94
- const streamIntake = new StreamIntake<string>();
184
+ pipeline.run()
185
+ .then(() => console.log('Done'))
186
+ .catch((err) => console.error('Pipeline failed:', err));
187
+ ```
95
188
 
96
- // Dynamically push data into the intake
97
- streamIntake.pushData('Hello, World!');
98
- streamIntake.pushData('Another message');
189
+ You can also listen for custom events across all streams:
99
190
 
100
- // Signal end when no more data is to be pushed
101
- streamIntake.signalEnd();
191
+ ```typescript
192
+ pipeline.onCustomEvent('progress', () => {
193
+ console.log('Progress event fired');
194
+ });
102
195
  ```
103
196
 
104
- ### Real-world Scenario: Processing Large Files
197
+ ---
198
+
199
+ ### 📥 StreamIntake — Dynamic Data Injection
105
200
 
106
- Consider a scenario where you need to process a large CSV file, transform the data row-by-row, and then write the results to a database or another file. With `smartstream`, you could create a pipe that reads the CSV, processes each row, and handles backpressure, ensuring efficient use of resources.
201
+ `StreamIntake` is a `Readable` stream that lets you programmatically push data into a pipeline. It operates in object mode by default and provides a reactive observable (`pushNextObservable`) for demand-driven data production.
107
202
 
108
203
  ```typescript
109
- import { SmartDuplex, createTransformFunction } from '@push.rocks/smartstream';
110
- import fs from 'fs';
111
- import csvParser from 'csv-parser';
204
+ import { StreamIntake, SmartDuplex } from '@push.rocks/smartstream';
112
205
 
113
- const csvReadTransform = createTransformFunction<any, any>(async (row) => {
114
- // Process row
115
- return processedRow;
116
- });
206
+ const intake = new StreamIntake<string>();
117
207
 
118
- fs.createReadStream('path/to/largeFile.csv')
119
- .pipe(csvParser())
120
- .pipe(csvReadTransform)
208
+ // Pipe through a transform
209
+ intake
121
210
  .pipe(new SmartDuplex({
122
- async writeFunction(chunk, { push }) {
123
- await writeToDatabase(chunk); // Assume this writes to a database
124
- }
211
+ objectMode: true,
212
+ writeFunction: async (chunk) => {
213
+ console.log('Processing:', chunk);
214
+ return chunk;
215
+ },
125
216
  }))
126
- .on('finish', () => console.log('File processed successfully.'));
127
- ```
217
+ .on('data', (data) => console.log('Output:', data));
128
218
 
129
- This example demonstrates reading a large CSV file, transforming each row with `createTransformFunction`, and using a `SmartDuplex` to manage the processed data flow efficiently, ensuring no data is lost due to backpressure issues.
219
+ // Push data whenever it's ready
220
+ intake.pushData('Hello');
221
+ intake.pushData('World');
222
+ intake.signalEnd(); // Signal end-of-stream
223
+ ```
130
224
 
131
- ### Advanced Use Case: Backpressure Handling
225
+ #### Demand-driven Production with Observable
132
226
 
133
- Effective backpressure handling is crucial when working with streams to avoid overwhelming the downstream consumers. Here’s a comprehensive example that demonstrates handling backpressure in a pipeline with multiple `SmartDuplex` instances:
227
+ `pushNextObservable` emits whenever the stream is ready for more data, making it perfect for throttled or event-driven producers:
134
228
 
135
229
  ```typescript
136
- import { SmartDuplex } from '@push.rocks/smartstream';
230
+ const intake = new StreamIntake<number>();
137
231
 
138
- // Define the first SmartDuplex, which writes data slowly to simulate backpressure
139
- const slowProcessingStream = new SmartDuplex({
140
- name: 'SlowProcessor',
141
- objectMode: true,
142
- writeFunction: async (chunk, { push }) => {
143
- await new Promise(resolve => setTimeout(resolve, 100)); // Simulated delay
144
- console.log('Processed chunk:', chunk);
145
- push(chunk);
232
+ let counter = 0;
233
+ intake.pushNextObservable.subscribe(() => {
234
+ if (counter < 100) {
235
+ intake.pushData(counter++);
236
+ } else {
237
+ intake.signalEnd();
146
238
  }
147
239
  });
148
240
 
149
- // Define the second SmartDuplex as a fast processor
150
- const fastProcessingStream = new SmartDuplex({
151
- name: 'FastProcessor',
152
- objectMode: true,
153
- writeFunction: async (chunk, { push }) => {
154
- console.log('Fast processing chunk:', chunk);
155
- push(chunk);
156
- }
157
- });
241
+ intake.pipe(consumer);
242
+ ```
158
243
 
159
- // Create a StreamIntake to dynamically handle incoming data
160
- const streamIntake = new StreamIntake<string>();
244
+ #### Creating from Existing Streams
161
245
 
162
- // Chain the streams together and handle the backpressure scenario
163
- streamIntake
164
- .pipe(fastProcessingStream)
165
- .pipe(slowProcessingStream)
166
- .pipe(createPassThrough()) // Use Pass-Through to provide intermediary handling
167
- .on('data', data => console.log('Final output:', data))
168
- .on('error', error => console.error('Stream encountered an error:', error));
246
+ Wrap a Node.js `Readable` or a Web `ReadableStream`:
169
247
 
170
- // Simulate data pushing with intervals to observe backpressure handling
171
- let counter = 0;
172
- const interval = setInterval(() => {
173
- if (counter >= 10) {
174
- streamIntake.signalEnd();
175
- clearInterval(interval);
176
- } else {
177
- streamIntake.pushData(`Chunk ${counter}`);
178
- counter++;
179
- }
180
- }, 50);
248
+ ```typescript
249
+ // From Node.js Readable
250
+ const intake = await StreamIntake.fromStream<Buffer>(fs.createReadStream('./data.bin'));
251
+
252
+ // From Web ReadableStream
253
+ const response = await fetch('https://example.com/stream');
254
+ const webIntake = await StreamIntake.fromStream<Uint8Array>(response.body);
181
255
  ```
182
256
 
183
- In this advanced use case, a `SlowProcessor` and `FastProcessor` are created using `SmartDuplex`, simulating a situation where one stream is slower than another. The `StreamIntake` dynamically handles incoming chunks of data and the intermediary Pass-Through handles any potential interruptions.
257
+ ---
258
+
259
+ ### ⚡ Utility Functions
184
260
 
185
- ### Transform Streams in Parallel
261
+ #### `createTransformFunction`
186
262
 
187
- For scenarios where you need to process data in parallel:
263
+ Quickly create a `SmartDuplex` from a simple async mapping function:
188
264
 
189
265
  ```typescript
190
- import { SmartDuplex, createTransformFunction } from '@push.rocks/smartstream';
266
+ import { createTransformFunction } from '@push.rocks/smartstream';
191
267
 
192
- const parallelTransform = createTransformFunction<any, any>(async (chunk) => {
193
- // Parallel Processing
194
- const results = await Promise.all(chunk.map(async item => await processItem(item)));
195
- return results;
196
- });
268
+ const doubler = createTransformFunction<number, number>(async (n) => n * 2);
197
269
 
198
- const streamIntake = new StreamIntake<any[]>();
270
+ intakeStream.pipe(doubler).pipe(outputStream);
271
+ ```
199
272
 
200
- streamIntake
201
- .pipe(parallelTransform)
202
- .pipe(new SmartDuplex({
203
- async writeFunction(chunk, { push }) {
204
- console.log('Processed parallel chunk:', chunk);
205
- push(chunk);
206
- }
207
- }))
208
- .on('finish', () => console.log('Parallel processing completed.'));
273
+ #### `createPassThrough`
209
274
 
210
- // Simulate data pushing
211
- streamIntake.pushData([1, 2, 3, 4]);
212
- streamIntake.pushData([5, 6, 7, 8]);
213
- streamIntake.signalEnd();
275
+ Create an object-mode passthrough stream (useful as an intermediary or tee point):
276
+
277
+ ```typescript
278
+ import { createPassThrough } from '@push.rocks/smartstream';
279
+
280
+ const passThrough = createPassThrough();
281
+ source.pipe(passThrough).pipe(destination);
214
282
  ```
215
283
 
216
- ### Error Handling in Stream Pipelines
284
+ ---
285
+
286
+ ### 🌐 WebDuplexStream — Pure Web Streams API
217
287
 
218
- Error handling is an essential part of working with streams. The `StreamWrapper` assists in combining multiple streams while managing errors seamlessly:
288
+ `WebDuplexStream` extends `TransformStream` and works in both browsers and Node.js. Import it from the `/web` subpath for zero Node.js dependencies.
219
289
 
220
290
  ```typescript
221
- import { StreamWrapper } from '@push.rocks/smartstream';
291
+ import { WebDuplexStream } from '@push.rocks/smartstream/web';
222
292
 
223
- const faultyStream = new SmartDuplex({
224
- async writeFunction(chunk, { push }) {
225
- if (chunk === 'bad data') {
226
- throw new Error('Faulty data encountered');
227
- }
228
- push(chunk);
229
- }
293
+ const stream = new WebDuplexStream<number, number>({
294
+ writeFunction: async (chunk, { push }) => {
295
+ push(chunk * 2); // Push transformed data
296
+ },
230
297
  });
231
298
 
232
- const readStream = new StreamIntake<string>();
233
- const writeStream = new SmartDuplex({
234
- async writeFunction(chunk) {
235
- console.log('Written chunk:', chunk);
236
- }
237
- });
299
+ const writer = stream.writable.getWriter();
300
+ const reader = stream.readable.getReader();
238
301
 
239
- const combinedStream = new StreamWrapper([readStream, faultyStream, writeStream]);
302
+ // Write
303
+ await writer.write(5);
304
+ await writer.write(10);
305
+ await writer.close();
306
+
307
+ // Read
308
+ const { value } = await reader.read(); // 10
309
+ const { value: v2 } = await reader.read(); // 20
310
+ ```
240
311
 
241
- combinedStream.run()
242
- .then(() => console.log('Stream processing completed.'))
243
- .catch(err => console.error('Stream error:', err.message));
312
+ #### From a Uint8Array
244
313
 
245
- // Push Data
246
- readStream.pushData('good data');
247
- readStream.pushData('bad data'); // This will throw an error
248
- readStream.pushData('more good data');
249
- readStream.signalEnd();
314
+ ```typescript
315
+ const stream = WebDuplexStream.fromUInt8Array(new Uint8Array([1, 2, 3]));
316
+ const reader = stream.readable.getReader();
317
+ const { value } = await reader.read(); // Uint8Array [1, 2, 3]
250
318
  ```
251
319
 
252
- ### Testing Streams
320
+ #### Data Production with `readFunction`
253
321
 
254
- Here's an example test case using the `tap` testing framework to verify the integrity of the `SmartDuplex` from a buffer:
322
+ Supply data into the stream from any async source:
255
323
 
256
324
  ```typescript
257
- import { expect, tap } from '@push.rocks/tapbundle';
258
- import { SmartDuplex } from '@push.rocks/smartstream';
259
-
260
- tap.test('should create a SmartStream from a Buffer', async () => {
261
- const bufferData = Buffer.from('This is a test buffer');
262
- const smartStream = SmartDuplex.fromBuffer(bufferData, {});
263
-
264
- let receivedData = Buffer.alloc(0);
265
-
266
- return new Promise<void>((resolve) => {
267
- smartStream.on('data', (chunk: Buffer) => {
268
- receivedData = Buffer.concat([receivedData, chunk]);
269
- });
270
-
271
- smartStream.on('end', () => {
272
- expect(receivedData.toString()).toEqual(bufferData.toString());
273
- resolve();
274
- });
275
- });
325
+ const stream = new WebDuplexStream<string, string>({
326
+ readFunction: async (tools) => {
327
+ await tools.write('chunk 1');
328
+ await tools.write('chunk 2');
329
+ tools.done(); // Signal end
330
+ },
331
+ writeFunction: async (chunk, { push }) => {
332
+ push(chunk.toUpperCase());
333
+ },
276
334
  });
277
335
 
278
- tap.start();
336
+ const reader = stream.readable.getReader();
337
+ // reads "CHUNK 1", "CHUNK 2"
279
338
  ```
280
339
 
281
- ### Working with Files and Buffers
340
+ ---
282
341
 
283
- You can easily stream files and buffers with `smartstream`. Here’s a test illustrating reading and writing with file streams using `smartfile` combined with `smartstream` utilities:
342
+ ### 🔀 Node ↔ Web Stream Converters
284
343
 
285
- ```typescript
286
- import { tap } from '@push.rocks/tapbundle';
287
- import * as smartfile from '@push.rocks/smartfile';
288
- import { SmartDuplex, StreamWrapper } from '@push.rocks/smartstream';
344
+ The `nodewebhelpers` namespace provides bidirectional converters between Node.js and Web Streams:
289
345
 
290
- tap.test('should handle file read and write streams', async () => {
291
- const readStream = smartfile.fsStream.createReadStream('./test/assets/readabletext.txt');
292
- const writeStream = smartfile.fsStream.createWriteStream('./test/assets/writabletext.txt');
346
+ ```typescript
347
+ import { nodewebhelpers } from '@push.rocks/smartstream';
348
+ ```
293
349
 
294
- const transformStream = new SmartDuplex({
295
- async writeFunction(chunk, { push }) {
296
- const transformedChunk = chunk.toString().toUpperCase();
297
- push(transformedChunk);
298
- }
299
- });
350
+ | Function | From | To |
351
+ |---|---|---|
352
+ | `createWebReadableStreamFromFile(path)` | File path | Web `ReadableStream<Uint8Array>` |
353
+ | `convertWebReadableToNodeReadable(webStream)` | Web `ReadableStream` | Node.js `Readable` |
354
+ | `convertNodeReadableToWebReadable(nodeStream)` | Node.js `Readable` | Web `ReadableStream` |
355
+ | `convertWebWritableToNodeWritable(webWritable)` | Web `WritableStream` | Node.js `Writable` |
356
+ | `convertNodeWritableToWebWritable(nodeWritable)` | Node.js `Writable` | Web `WritableStream` |
300
357
 
301
- const streamWrapper = new StreamWrapper([readStream, transformStream, writeStream]);
358
+ #### Example: Serve a File as a Web ReadableStream
302
359
 
303
- await streamWrapper.run();
360
+ ```typescript
361
+ const webStream = nodewebhelpers.createWebReadableStreamFromFile('./video.mp4');
304
362
 
305
- const outputContent = await smartfile.fs.promises.readFile('./test/assets/writabletext.txt', 'utf-8');
306
- console.log('Output Content:', outputContent);
363
+ // Use with fetch Response, service workers, etc.
364
+ return new Response(webStream, {
365
+ headers: { 'Content-Type': 'video/mp4' },
307
366
  });
367
+ ```
308
368
 
309
- tap.start();
369
+ #### Example: Convert Between Stream Types
370
+
371
+ ```typescript
372
+ import fs from 'fs';
373
+ import { nodewebhelpers } from '@push.rocks/smartstream';
374
+
375
+ // Node → Web
376
+ const nodeReadable = fs.createReadStream('./data.bin');
377
+ const webReadable = nodewebhelpers.convertNodeReadableToWebReadable(nodeReadable);
378
+
379
+ // Web → Node
380
+ const nodeReadable2 = nodewebhelpers.convertWebReadableToNodeReadable(webReadable);
381
+ nodeReadable2.pipe(fs.createWriteStream('./copy.bin'));
310
382
  ```
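The table's two `Writable` converters have no example above. As a minimal sketch of the same Node → Web bridge using only Node.js built-ins (Node ≥ 17 ships an equivalent `stream.Writable.toWeb` helper; `convertNodeWritableToWebWritable` is assumed to behave analogously):

```typescript
import { Writable } from 'node:stream';

// Plain Node.js Writable that collects everything written to it.
const received: string[] = [];
const nodeWritable = new Writable({
  write(chunk, _encoding, callback) {
    received.push(chunk.toString());
    callback();
  },
});

// Bridge it to a Web WritableStream, which is the role
// convertNodeWritableToWebWritable plays for arbitrary Node.js Writables.
const webWritable = Writable.toWeb(nodeWritable);

// Drive it through the standard Web Streams writer API.
const writer = webWritable.getWriter();
await writer.write('hello ');
await writer.write('web streams');
await writer.close();

console.log(received.join('')); // "hello web streams"
```

With this package, the equivalent call shape per the table above would be `nodewebhelpers.convertNodeWritableToWebWritable(nodeWritable)`.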
311
383
 
312
- ### Modular and Scoped Transformations
384
+ ---
385
+
386
+ ### 🏗️ Backpressure Handling
313
387
 
314
- Creating modular and scoped transformations is straightforward with `SmartDuplex`:
388
+ `SmartDuplex` uses a `BackpressuredArray` internally, bounded by `highWaterMark` (default: 1). When the downstream consumer is slow, the stream automatically pauses the upstream producer until space is available — no manual bookkeeping required.
315
389
 
316
390
  ```typescript
317
- import { SmartDuplex } from '@push.rocks/smartstream';
391
+ const slow = new SmartDuplex({
392
+ name: 'SlowConsumer',
393
+ objectMode: true,
394
+ highWaterMark: 1,
395
+ writeFunction: async (chunk, tools) => {
396
+ await new Promise((resolve) => setTimeout(resolve, 200));
397
+ return chunk;
398
+ },
399
+ });
318
400
 
319
- type DataChunk = {
320
- id: number;
321
- data: string;
322
- };
401
+ const fast = new SmartDuplex({
402
+ name: 'FastProducer',
403
+ objectMode: true,
404
+ writeFunction: async (chunk, tools) => {
405
+ return chunk; // Instant processing
406
+ },
407
+ });
323
408
 
324
- const transformationStream1 = new SmartDuplex<DataChunk, DataChunk>({
325
- async writeFunction(chunk, { push }) {
326
- chunk.data = chunk.data.toUpperCase();
327
- push(chunk);
328
- }
329
- })
409
+ // Backpressure is handled automatically between fast → slow
410
+ fast.pipe(slow).on('data', (d) => console.log(d));
330
411
 
331
- const transformationStream2 = new SmartDuplex<DataChunk, DataChunk>({
332
- async writeFunction(chunk, { push }) {
333
- chunk.data = `${chunk.data} processed with transformation 2`;
334
- push(chunk);
335
- }
336
- });
412
+ for (let i = 0; i < 100; i++) {
413
+ fast.write(`chunk-${i}`);
414
+ }
415
+ fast.end();
416
+ ```
337
417
 
338
- const initialData: DataChunk[] = [
339
- { id: 1, data: 'first' },
340
- { id: 2, data: 'second' }
341
- ];
418
+ ---
342
419
 
343
- const intakeStream = new StreamIntake<DataChunk>();
420
+ ### 🎯 Real-World Example: Processing Pipeline
344
421
 
345
- intakeStream
346
- .pipe(transformationStream1)
347
- .pipe(transformationStream2)
348
- .on('data', data => console.log('Transformed Data:', data));
422
+ ```typescript
423
+ import fs from 'fs';
424
+ import { SmartDuplex, StreamWrapper } from '@push.rocks/smartstream';
349
425
 
350
- initialData.forEach(item => intakeStream.pushData(item));
351
- intakeStream.signalEnd();
352
- ```
426
+ // Read → Transform → Filter → Write
427
+ const pipeline = new StreamWrapper([
428
+ fs.createReadStream('./access.log'),
429
+ new SmartDuplex({
+ readableObjectMode: true, // the writeFunction below pushes string[] chunks
430
+ writeFunction: async (chunk) => {
431
+ // Parse each line
432
+ return chunk.toString().split('\n');
433
+ },
434
+ }),
435
+ new SmartDuplex({
436
+ objectMode: true,
437
+ writeFunction: async (lines: string[], tools) => {
438
+ // Filter and push matching lines
439
+ for (const line of lines) {
440
+ if (line.includes('ERROR')) {
441
+ await tools.push(line + '\n');
442
+ }
443
+ }
444
+ },
445
+ }),
446
+ fs.createWriteStream('./errors.log'),
447
+ ]);
353
448
 
354
- By leveraging `SmartDuplex`, `StreamWrapper`, and `StreamIntake`, you can streamline and enhance your data transformation pipelines in Node.js with a clear, efficient, and backpressure-friendly approach.
449
+ await pipeline.run();
450
+ console.log('Error extraction complete');
355
451
  ```
356
452
 
357
-
358
453
  ## License and Legal Information
359
454
 
360
- This repository contains open-source code that is licensed under the MIT License. A copy of the MIT License can be found in the [license](license) file within this repository.
455
+ This repository contains open-source code licensed under the MIT License. A copy of the license can be found in the [LICENSE](./LICENSE) file.
361
456
 
362
457
  **Please note:** The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.
363
458
 
364
459
  ### Trademarks
365
460
 
366
- This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH and are not included within the scope of the MIT license granted herein. Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines, and any usage must be approved in writing by Task Venture Capital GmbH.
461
+ This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH or third parties, and are not included within the scope of the MIT license granted herein.
462
+
463
+ Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines or the guidelines of the respective third-party owners, and any usage must be approved in writing. Third-party trademarks used herein are the property of their respective owners and used only in a descriptive manner, e.g. for an implementation of an API or similar.
367
464
 
368
465
  ### Company Information
369
466
 
370
- Task Venture Capital GmbH
371
- Registered at District court Bremen HRB 35230 HB, Germany
467
+ Task Venture Capital GmbH
468
+ Registered at District Court Bremen HRB 35230 HB, Germany
372
469
 
373
- For any legal inquiries or if you require further information, please contact us via email at hello@task.vc.
470
+ For any legal inquiries or further information, please contact us via email at hello@task.vc.
374
471
 
375
472
  By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.