readline-pager 0.2.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,172 @@
# readline-pager

Memory-efficient, paginated file reader for Node.js with async iteration, prefetching, backward reading, and optional worker support.

`readline-pager` reads large text files page by page without loading the entire file into memory.

- ✅ Zero dependencies
- ✅ Async iterator support (`for await...of`)
- ✅ Forward & backward reading (read from EOF → BOF)
- ✅ Optional worker thread mode (forward only)
- ✅ Up to ~3× faster than vanilla Node.js `readline`
- ✅ ~97% test coverage, fully typed (TypeScript)

---

## Installation

```bash
npm install readline-pager
```

---

## Quick Start

```ts
import { createPager } from "readline-pager";

const pager = createPager("./bigfile.txt");

for await (const page of pager) {
  console.log(page[0]); // first line of the current page
}
```

---

## Manual iteration (recommended for max throughput)

```ts
const pager = createPager("./bigfile.txt");

let page;
while ((page = await pager.next()) !== null) {
  // page: string[]
  // process the page
}
```

`pager.next()` returns:

- `Promise<string[]>` — the next page
- `Promise<null>` — end of file

> Use `while` + `next()` when raw throughput matters (see Iteration performance notes).

---

## Options

```ts
createPager(filepath, {
  pageSize?: number,   // default: 1_000
  delimiter?: string,  // default: "\n"
  prefetch?: number,   // default: 1
  backward?: boolean,  // default: false
  useWorker?: boolean, // default: false (forward only)
});
```

- `pageSize` — number of lines per page.
- `delimiter` — line separator.
- `prefetch` — maximum number of pages buffered internally. Higher values increase throughput but use more memory.
- `backward` — read the file from end → start (not supported with `useWorker`).
- `useWorker` — move parsing to a worker thread (forward only).
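
To make `pageSize` concrete, here is a small sketch (illustrative numbers, not taken from the package) of how a file is carved into pages, ignoring the extra trailing empty line the pager emits for files ending in the delimiter:

```javascript
// How a file of `lines` lines is carved into pages of `pageSize` lines each.
function pageLayout(lines, pageSize) {
  const pages = Math.ceil(lines / pageSize);
  const lastPageSize = lines - (pages - 1) * pageSize;
  return { pages, lastPageSize };
}

// With the default pageSize of 1_000, a 12_345-line file yields 13 pages,
// the last of which holds 345 lines.
console.log(pageLayout(12_345, 1_000)); // { pages: 13, lastPageSize: 345 }
```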

---

## API

### `pager.next(): Promise<string[] | null>`

Returns the next page, or `null` when finished. Empty lines are preserved.

**Note:** Unlike Node.js `readline`, which skips empty files (and may skip leading empty lines), `readline-pager` always returns every line:

- A completely empty file (0 bytes) produces `[""]` on the first read.
- Consecutive empty lines are each returned as an empty string (e.g., `["", ""]` for two empty lines), where Node.js `readline` would emit fewer or no `line` events.
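
The `[""]` behaviour mirrors JavaScript's own `String.prototype.split`, which the reader relies on when flushing its final buffer: splitting always yields at least one element, and a trailing delimiter contributes a trailing empty string.

```javascript
// split() never returns an empty array, which is why an empty file reads as [""]
console.log("".split("\n"));     // [ '' ]
console.log("a\nb".split("\n")); // [ 'a', 'b' ]
console.log("a\n".split("\n"));  // [ 'a', '' ] (trailing delimiter yields a trailing empty line)
```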

### `pager.close(): void`

Stops reading and releases resources immediately. Safe to call at any time.

### Properties

```ts
pager.lineCount; // total lines emitted so far
pager.firstLine; // first line emitted (available after the first read)
pager.lastLine;  // last line emitted (updated on each page)
```

---

## Benchmark

Run the included benchmark:

```bash
# default run
node test/benchmark.ts

# or customize
node test/benchmark.ts --lines=20000 --page-size=500 --prefetch=4 --backward
```

Benchmarks were executed on a high-end consumer Linux machine (SSD + fast CPU) using generated files.

### Summary (averages)

| Lines       | File Size (MB) | Implementation | Avg Time (ms) | Avg Throughput (MB/s) | Speedup vs `readline` |
| ----------- | -------------- | -------------- | ------------- | --------------------: | --------------------: |
| 1,000,000   | 35.29          | readline       | 100.21        | 352.31                | —                     |
| 1,000,000   | 35.29          | readline-pager | 43.31         | 815.71                | **2.32× faster**      |
| 10,000,000  | 352.86         | readline       | 802.61        | 439.80                | —                     |
| 10,000,000  | 352.86         | readline-pager | 292.33        | 1207.77               | **2.75× faster**      |
| 100,000,000 | 3528.59        | readline       | 7777.52       | 453.75                | —                     |
| 100,000,000 | 3528.59        | readline-pager | 2742.99       | 1286.50               | **2.83× faster**      |

**Key takeaways**

- `readline-pager` is consistently **~2.3×–2.8× faster** than Node.js `readline`.
- The relative gain increases with file size.
- Sustained throughput exceeds **1.2 GB/s** on large files (machine-dependent).

---

## Iteration performance notes

- **Fastest**: a manual `while ((page = await pager.next()) !== null) { ... }` loop.
  It avoids async-iterator protocol overhead and microtask churn.

- **More ergonomic**: `for await (const page of pager) { ... }`.
  Cleaner, but slightly slower in hot paths.

**Recommendation:** use the explicit `next()` loop for benchmarks and performance-critical code; use `for await...of` for clarity in less performance-sensitive code.
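
Both loop styles produce the same pages; the following quick check runs them against a stub with the same `next()`/async-iterator surface (the stub is illustrative, not the package's implementation):

```javascript
// Minimal stand-in exposing the same surface as a Pager: next() plus async iteration.
function stubPager(pages) {
  let i = 0;
  return {
    async next() {
      return i < pages.length ? pages[i++] : null;
    },
    async *[Symbol.asyncIterator]() {
      let page;
      while ((page = await this.next()) !== null) yield page;
    },
  };
}

// Collect every line using either iteration style.
async function collect(pager, useForAwait) {
  const lines = [];
  if (useForAwait) {
    for await (const page of pager) lines.push(...page);
  } else {
    let page;
    while ((page = await pager.next()) !== null) lines.push(...page);
  }
  return lines;
}

const data = [["a", "b"], ["c"]];
collect(stubPager(data), false).then((r) => console.log(r.join(","))); // a,b,c
collect(stubPager(data), true).then((r) => console.log(r.join(","))); // a,b,c
```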

---

## Development & Contributing

- Minimum supported Node.js: **18.12+** (LTS).
- Development/test environment used by the author: **Node v25.6.1**, TypeScript `~5.9.x`.
- To run the tests with coverage:

```bash
npm ci
npm test
```

If you want to contribute, open an issue or a PR. A `CONTRIBUTING.md` with larger workflow notes is welcome.

---

## License

MIT — © Morteza Jamshidi
@@ -0,0 +1,6 @@
//#region src/constants.ts
const CHUNK_SIZE = 64 * 1024;
const ENCODING = "utf8";

//#endregion
export { ENCODING as n, CHUNK_SIZE as t };
@@ -0,0 +1,18 @@

//#region src/constants.ts
const CHUNK_SIZE = 64 * 1024;
const ENCODING = "utf8";

//#endregion
Object.defineProperty(exports, 'CHUNK_SIZE', {
  enumerable: true,
  get: function () {
    return CHUNK_SIZE;
  }
});
Object.defineProperty(exports, 'ENCODING', {
  enumerable: true,
  get: function () {
    return ENCODING;
  }
});
package/dist/main.cjs ADDED
@@ -0,0 +1,298 @@
Object.defineProperty(exports, Symbol.toStringTag, { value: 'Module' });
const require_constants = require('./constants-Cyb_UQAC.cjs');
let node_fs_promises = require("node:fs/promises");
let node_worker_threads = require("node:worker_threads");

//#region src/queue.ts
function createPageQueue() {
  const queue = [];
  let resolver = null;
  return {
    queue,
    push(page) {
      queue.push(page);
      resolver?.();
      resolver = null;
    },
    wake() {
      resolver?.();
      resolver = null;
    },
    async shift(done) {
      if (queue.length) return queue.shift();
      if (done()) return null;
      await new Promise((r) => resolver = r);
      if (queue.length) return queue.shift();
      if (done()) return null;
      return null;
    }
  };
}

//#endregion
//#region src/reader/backward.reader.ts
function createBackwardReader(filepath, options) {
  const { pageSize, delimiter, prefetch } = options;
  const pageQueue = createPageQueue();
  let fd = null;
  let pos = 0;
  let buffer = "";
  let done = false;
  let closed = false;
  let emittedCount = 0;
  let firstLine = null;
  let lastLine = null;
  const local = [];
  async function init() {
    if (fd) return;
    fd = await (0, node_fs_promises.open)(filepath, "r");
    pos = (await fd.stat()).size;
    if (pos === 0) done = true;
  }
  async function fill() {
    if (done || closed) return;
    await init();
    if (!fd) return;
    while (pageQueue.queue.length < prefetch && pos > 0) {
      const readSize = Math.min(require_constants.CHUNK_SIZE, pos);
      pos -= readSize;
      const buf = Buffer.allocUnsafe(readSize);
      await fd.read(buf, 0, readSize, pos);
      buffer += buf.toString(require_constants.ENCODING);
      let idx;
      while ((idx = buffer.lastIndexOf(delimiter)) !== -1) {
        const line = buffer.slice(idx + delimiter.length);
        buffer = buffer.slice(0, idx);
        local.push(line);
        while (local.length >= pageSize) {
          const page = local.splice(0, pageSize);
          pageQueue.push(page);
        }
      }
    }
    if (pos === 0) {
      local.push(buffer);
      buffer = "";
      while (local.length > 0) {
        const sliceSize = Math.min(pageSize, local.length);
        const page = local.splice(local.length - sliceSize, sliceSize);
        pageQueue.push(page);
      }
      done = true;
      if (fd) {
        await fd.close();
        fd = null;
      }
    }
  }
  async function next() {
    if (closed) return null;
    await fill();
    const page = await pageQueue.shift(() => done);
    if (!page) return null;
    emittedCount += page.length;
    firstLine ??= page[0];
    lastLine = page[page.length - 1];
    return page;
  }
  async function close() {
    closed = true;
    done = true;
    pageQueue.queue.length = 0;
    if (fd) await fd.close();
  }
  return {
    next,
    close,
    get lineCount() {
      return emittedCount;
    },
    get firstLine() {
      return firstLine;
    },
    get lastLine() {
      return lastLine;
    },
    async *[Symbol.asyncIterator]() {
      while (true) {
        const p = await next();
        if (!p) break;
        yield p;
      }
    }
  };
}

//#endregion
//#region src/reader/forward.reader.ts
function createForwardReader(filepath, options) {
  const { pageSize, delimiter, prefetch } = options;
  const pageQueue = createPageQueue();
  let fd = null;
  let pos = 0;
  let size = 0;
  let buffer = "";
  let done = false;
  let closed = false;
  let emittedCount = 0;
  let firstLine = null;
  let lastLine = null;
  const local = [];
  async function init() {
    if (fd) return;
    fd = await (0, node_fs_promises.open)(filepath, "r");
    size = (await fd.stat()).size;
    if (size === 0) done = true;
  }
  async function fill() {
    if (done || closed) return;
    await init();
    if (!fd) return;
    while (pageQueue.queue.length < prefetch && pos < size) {
      const readSize = Math.min(require_constants.CHUNK_SIZE, size - pos);
      const buf = Buffer.allocUnsafe(readSize);
      const { bytesRead } = await fd.read(buf, 0, readSize, pos);
      pos += bytesRead;
      buffer += buf.toString(require_constants.ENCODING, 0, bytesRead);
      let idx;
      while ((idx = buffer.indexOf(delimiter)) !== -1) {
        const line = buffer.slice(0, idx);
        buffer = buffer.slice(idx + delimiter.length);
        local.push(line);
        while (local.length >= pageSize) pageQueue.push(local.splice(0, pageSize));
      }
    }
    if (pos >= size) {
      const parts = buffer.length > 0 ? buffer.split(delimiter) : [""];
      for (const line of parts) local.push(line);
      buffer = "";
      while (local.length > 0) pageQueue.push(local.splice(0, pageSize));
      done = true;
      if (fd) {
        await fd.close();
        fd = null;
      }
    }
  }
  async function next() {
    if (closed) return null;
    await fill();
    const page = await pageQueue.shift(() => done);
    if (!page) return null;
    emittedCount += page.length;
    firstLine ??= page[0];
    lastLine = page[page.length - 1];
    return page;
  }
  async function close() {
    closed = true;
    done = true;
    pageQueue.queue.length = 0;
    if (fd) await fd.close();
  }
  return {
    next,
    close,
    get lineCount() {
      return emittedCount;
    },
    get firstLine() {
      return firstLine;
    },
    get lastLine() {
      return lastLine;
    },
    async *[Symbol.asyncIterator]() {
      while (true) {
        const p = await next();
        if (!p) break;
        yield p;
      }
    }
  };
}

//#endregion
//#region src/reader/worker.reader.ts
// In the CommonJS build the worker entry is always the CommonJS file.
const workerFile = require.resolve("./worker.cjs");
function createWorkerReader(filepath, options) {
  const { pageSize, delimiter, prefetch } = options;
  const worker = new node_worker_threads.Worker(workerFile, { workerData: {
    filepath,
    pageSize,
    delimiter
  } });
  const pageQueue = createPageQueue();
  let done = false;
  let closed = false;
  let emittedCount = 0;
  let firstLine = null;
  let lastLine = null;
  worker.on("message", (msg) => {
    if (msg.type === "page") pageQueue.push(msg.data);
    if (msg.type === "done") {
      done = true;
      pageQueue.wake();
    }
  });
  async function next() {
    if (closed) return null;
    const page = await pageQueue.shift(() => done);
    if (!page) return null;
    emittedCount += page.length;
    firstLine ??= page[0];
    lastLine = page[page.length - 1];
    return page;
  }
  async function close() {
    closed = true;
    done = true;
    worker.terminate();
  }
  return {
    next,
    close,
    get lineCount() {
      return emittedCount;
    },
    get firstLine() {
      return firstLine;
    },
    get lastLine() {
      return lastLine;
    },
    async *[Symbol.asyncIterator]() {
      while (true) {
        const p = await next();
        if (!p) break;
        yield p;
      }
    }
  };
}

//#endregion
//#region src/main.ts
function createPager(filepath, options = {}) {
  const { pageSize = 1e3, delimiter = "\n", prefetch = 1, backward = false, useWorker = false } = options;
  if (!filepath) throw new Error("filepath required");
  if (pageSize <= 0) throw new RangeError("pageSize must be > 0");
  if (prefetch <= 0) throw new RangeError("prefetch must be >= 1");
  if (backward && useWorker) throw new Error("backward not supported with useWorker");
  return useWorker ? createWorkerReader(filepath, {
    pageSize,
    prefetch,
    delimiter
  }) : backward ? createBackwardReader(filepath, {
    pageSize,
    prefetch,
    delimiter
  }) : createForwardReader(filepath, {
    pageSize,
    prefetch,
    delimiter
  });
}

//#endregion
exports.createPager = createPager;
@@ -0,0 +1,22 @@
//#region src/types.d.ts
interface ReaderOptions {
  pageSize: number;
  delimiter: string;
  prefetch: number;
}
interface PagerOptions extends Partial<ReaderOptions> {
  backward?: boolean;
  useWorker?: boolean;
}
interface Pager extends AsyncIterable<string[]> {
  next(): Promise<string[] | null>;
  close(): void;
  readonly lineCount: number;
  readonly firstLine: string | null;
  readonly lastLine: string | null;
}
//#endregion
//#region src/main.d.ts
declare function createPager(filepath: string, options?: PagerOptions): Pager;
//#endregion
export { Pager, PagerOptions, ReaderOptions, createPager };
@@ -0,0 +1,22 @@
//#region src/types.d.ts
interface ReaderOptions {
  pageSize: number;
  delimiter: string;
  prefetch: number;
}
interface PagerOptions extends Partial<ReaderOptions> {
  backward?: boolean;
  useWorker?: boolean;
}
interface Pager extends AsyncIterable<string[]> {
  next(): Promise<string[] | null>;
  close(): void;
  readonly lineCount: number;
  readonly firstLine: string | null;
  readonly lastLine: string | null;
}
//#endregion
//#region src/main.d.ts
declare function createPager(filepath: string, options?: PagerOptions): Pager;
//#endregion
export { Pager, PagerOptions, ReaderOptions, createPager };
package/dist/main.mjs ADDED
@@ -0,0 +1,302 @@
import { n as ENCODING, t as CHUNK_SIZE } from "./constants-BNKQoOqH.mjs";
import { createRequire } from "node:module";
import { open } from "node:fs/promises";
import { Worker } from "node:worker_threads";

//#region \0rolldown/runtime.js
var __require = /* @__PURE__ */ createRequire(import.meta.url);

//#endregion
//#region src/queue.ts
function createPageQueue() {
  const queue = [];
  let resolver = null;
  return {
    queue,
    push(page) {
      queue.push(page);
      resolver?.();
      resolver = null;
    },
    wake() {
      resolver?.();
      resolver = null;
    },
    async shift(done) {
      if (queue.length) return queue.shift();
      if (done()) return null;
      await new Promise((r) => resolver = r);
      if (queue.length) return queue.shift();
      if (done()) return null;
      return null;
    }
  };
}

//#endregion
//#region src/reader/backward.reader.ts
function createBackwardReader(filepath, options) {
  const { pageSize, delimiter, prefetch } = options;
  const pageQueue = createPageQueue();
  let fd = null;
  let pos = 0;
  let buffer = "";
  let done = false;
  let closed = false;
  let emittedCount = 0;
  let firstLine = null;
  let lastLine = null;
  const local = [];
  async function init() {
    if (fd) return;
    fd = await open(filepath, "r");
    pos = (await fd.stat()).size;
    if (pos === 0) done = true;
  }
  async function fill() {
    if (done || closed) return;
    await init();
    if (!fd) return;
    while (pageQueue.queue.length < prefetch && pos > 0) {
      const readSize = Math.min(CHUNK_SIZE, pos);
      pos -= readSize;
      const buf = Buffer.allocUnsafe(readSize);
      await fd.read(buf, 0, readSize, pos);
      buffer += buf.toString(ENCODING);
      let idx;
      while ((idx = buffer.lastIndexOf(delimiter)) !== -1) {
        const line = buffer.slice(idx + delimiter.length);
        buffer = buffer.slice(0, idx);
        local.push(line);
        while (local.length >= pageSize) {
          const page = local.splice(0, pageSize);
          pageQueue.push(page);
        }
      }
    }
    if (pos === 0) {
      local.push(buffer);
      buffer = "";
      while (local.length > 0) {
        const sliceSize = Math.min(pageSize, local.length);
        const page = local.splice(local.length - sliceSize, sliceSize);
        pageQueue.push(page);
      }
      done = true;
      if (fd) {
        await fd.close();
        fd = null;
      }
    }
  }
  async function next() {
    if (closed) return null;
    await fill();
    const page = await pageQueue.shift(() => done);
    if (!page) return null;
    emittedCount += page.length;
    firstLine ??= page[0];
    lastLine = page[page.length - 1];
    return page;
  }
  async function close() {
    closed = true;
    done = true;
    pageQueue.queue.length = 0;
    if (fd) await fd.close();
  }
  return {
    next,
    close,
    get lineCount() {
      return emittedCount;
    },
    get firstLine() {
      return firstLine;
    },
    get lastLine() {
      return lastLine;
    },
    async *[Symbol.asyncIterator]() {
      while (true) {
        const p = await next();
        if (!p) break;
        yield p;
      }
    }
  };
}

//#endregion
//#region src/reader/forward.reader.ts
function createForwardReader(filepath, options) {
  const { pageSize, delimiter, prefetch } = options;
  const pageQueue = createPageQueue();
  let fd = null;
  let pos = 0;
  let size = 0;
  let buffer = "";
  let done = false;
  let closed = false;
  let emittedCount = 0;
  let firstLine = null;
  let lastLine = null;
  const local = [];
  async function init() {
    if (fd) return;
    fd = await open(filepath, "r");
    size = (await fd.stat()).size;
    if (size === 0) done = true;
  }
  async function fill() {
    if (done || closed) return;
    await init();
    if (!fd) return;
    while (pageQueue.queue.length < prefetch && pos < size) {
      const readSize = Math.min(CHUNK_SIZE, size - pos);
      const buf = Buffer.allocUnsafe(readSize);
      const { bytesRead } = await fd.read(buf, 0, readSize, pos);
      pos += bytesRead;
      buffer += buf.toString(ENCODING, 0, bytesRead);
      let idx;
      while ((idx = buffer.indexOf(delimiter)) !== -1) {
        const line = buffer.slice(0, idx);
        buffer = buffer.slice(idx + delimiter.length);
        local.push(line);
        while (local.length >= pageSize) pageQueue.push(local.splice(0, pageSize));
      }
    }
    if (pos >= size) {
      const parts = buffer.length > 0 ? buffer.split(delimiter) : [""];
      for (const line of parts) local.push(line);
      buffer = "";
      while (local.length > 0) pageQueue.push(local.splice(0, pageSize));
      done = true;
      if (fd) {
        await fd.close();
        fd = null;
      }
    }
  }
  async function next() {
    if (closed) return null;
    await fill();
    const page = await pageQueue.shift(() => done);
    if (!page) return null;
    emittedCount += page.length;
    firstLine ??= page[0];
    lastLine = page[page.length - 1];
    return page;
  }
  async function close() {
    closed = true;
    done = true;
    pageQueue.queue.length = 0;
    if (fd) await fd.close();
  }
  return {
    next,
    close,
    get lineCount() {
      return emittedCount;
    },
    get firstLine() {
      return firstLine;
    },
    get lastLine() {
      return lastLine;
    },
    async *[Symbol.asyncIterator]() {
      while (true) {
        const p = await next();
        if (!p) break;
        yield p;
      }
    }
  };
}

//#endregion
//#region src/reader/worker.reader.ts
const workerFile = typeof import.meta !== "undefined" ? new URL("./worker.mjs", import.meta.url) : __require.resolve("./worker.cjs");
function createWorkerReader(filepath, options) {
  const { pageSize, delimiter, prefetch } = options;
  const worker = new Worker(new URL(workerFile, import.meta.url), { workerData: {
    filepath,
    pageSize,
    delimiter
  } });
  const pageQueue = createPageQueue();
  let done = false;
  let closed = false;
  let emittedCount = 0;
  let firstLine = null;
  let lastLine = null;
  worker.on("message", (msg) => {
    if (msg.type === "page") pageQueue.push(msg.data);
    if (msg.type === "done") {
      done = true;
      pageQueue.wake();
    }
  });
  async function next() {
    if (closed) return null;
    const page = await pageQueue.shift(() => done);
    if (!page) return null;
    emittedCount += page.length;
    firstLine ??= page[0];
    lastLine = page[page.length - 1];
    return page;
  }
  async function close() {
    closed = true;
    done = true;
    worker.terminate();
  }
  return {
    next,
    close,
    get lineCount() {
      return emittedCount;
    },
    get firstLine() {
      return firstLine;
    },
    get lastLine() {
      return lastLine;
    },
    async *[Symbol.asyncIterator]() {
      while (true) {
        const p = await next();
        if (!p) break;
        yield p;
      }
    }
  };
}

//#endregion
//#region src/main.ts
function createPager(filepath, options = {}) {
  const { pageSize = 1e3, delimiter = "\n", prefetch = 1, backward = false, useWorker = false } = options;
  if (!filepath) throw new Error("filepath required");
  if (pageSize <= 0) throw new RangeError("pageSize must be > 0");
  if (prefetch <= 0) throw new RangeError("prefetch must be >= 1");
  if (backward && useWorker) throw new Error("backward not supported with useWorker");
  return useWorker ? createWorkerReader(filepath, {
    pageSize,
    prefetch,
    delimiter
  }) : backward ? createBackwardReader(filepath, {
    pageSize,
    prefetch,
    delimiter
  }) : createForwardReader(filepath, {
    pageSize,
    prefetch,
    delimiter
  });
}

//#endregion
export { createPager };
@@ -0,0 +1,38 @@
const require_constants = require('./constants-Cyb_UQAC.cjs');
let node_fs_promises = require("node:fs/promises");
let node_worker_threads = require("node:worker_threads");

//#region src/worker.ts
const { filepath, pageSize, delimiter } = node_worker_threads.workerData;
(async () => {
  const fd = await (0, node_fs_promises.open)(filepath, "r");
  const { size } = await fd.stat();
  let pos = 0;
  let buffer = "";
  const local = [];
  while (pos < size) {
    const readSize = Math.min(require_constants.CHUNK_SIZE, size - pos);
    const buf = Buffer.allocUnsafe(readSize);
    const { bytesRead } = await fd.read(buf, 0, readSize, pos);
    pos += bytesRead;
    buffer += buf.toString("utf8", 0, bytesRead);
    const parts = buffer.split(delimiter);
    buffer = parts.pop() ?? "";
    for (const line of parts) {
      local.push(line);
      if (local.length === pageSize) node_worker_threads.parentPort?.postMessage({
        type: "page",
        data: local.splice(0)
      });
    }
  }
  if (buffer !== "") local.push(buffer);
  if (local.length) node_worker_threads.parentPort?.postMessage({
    type: "page",
    data: local
  });
  node_worker_threads.parentPort?.postMessage({ type: "done" });
  await fd.close();
})();

//#endregion
@@ -0,0 +1 @@
export { };
@@ -0,0 +1 @@
export { };
@@ -0,0 +1,39 @@
import { t as CHUNK_SIZE } from "./constants-BNKQoOqH.mjs";
import { open } from "node:fs/promises";
import { parentPort, workerData } from "node:worker_threads";

//#region src/worker.ts
const { filepath, pageSize, delimiter } = workerData;
(async () => {
  const fd = await open(filepath, "r");
  const { size } = await fd.stat();
  let pos = 0;
  let buffer = "";
  const local = [];
  while (pos < size) {
    const readSize = Math.min(CHUNK_SIZE, size - pos);
    const buf = Buffer.allocUnsafe(readSize);
    const { bytesRead } = await fd.read(buf, 0, readSize, pos);
    pos += bytesRead;
    buffer += buf.toString("utf8", 0, bytesRead);
    const parts = buffer.split(delimiter);
    buffer = parts.pop() ?? "";
    for (const line of parts) {
      local.push(line);
      if (local.length === pageSize) parentPort?.postMessage({
        type: "page",
        data: local.splice(0)
      });
    }
  }
  if (buffer !== "") local.push(buffer);
  if (local.length) parentPort?.postMessage({
    type: "page",
    data: local
  });
  parentPort?.postMessage({ type: "done" });
  await fd.close();
})();

//#endregion
export { };
package/package.json ADDED
@@ -0,0 +1,52 @@
{
  "name": "readline-pager",
  "version": "0.2.1",
  "scripts": {
    "build": "tsdown",
    "pretest": "npm run build",
    "test": "node --test --experimental-test-coverage test/**/*.test.ts",
    "prepublishOnly": "npm run build && npm run test"
  },
  "devDependencies": {
    "@types/node": "~25.3.0",
    "prettier": "~3.8.1",
    "prettier-plugin-organize-imports": "~4.3.0",
    "tsdown": "~0.20.3",
    "typescript": "~5.9.3"
  },
  "type": "module",
  "main": "./dist/main.cjs",
  "module": "./dist/main.mjs",
  "types": "./dist/main.d.mts",
  "exports": {
    ".": {
      "types": "./dist/main.d.mts",
      "require": "./dist/main.cjs",
      "import": "./dist/main.mjs"
    }
  },
  "files": [
    "dist"
  ],
  "keywords": [
    "nodejs",
    "readline",
    "readline-pager",
    "pager",
    "file-reader",
    "pagination",
    "large-files",
    "async-iterator",
    "streaming",
    "memory-efficient",
    "high-performance",
    "log-processing",
    "backward-reading"
  ],
  "description": "Memory-efficient, paginated file reader for Node.js with async iteration, prefetching, backward reading and optional worker support.",
  "engines": {
    "node": ">=18.12"
  },
  "author": "Morteza Jamshidi",
  "license": "MIT"
}