shamela 1.2.2 → 1.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE.MD CHANGED
@@ -1,4 +1,4 @@
- Copyright 2024 Ragaeeb Haq>
+ Copyright 2024-2025 Ragaeeb Haq>
 
  Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
 
package/README.md CHANGED
@@ -2,6 +2,7 @@
 
  [![wakatime](https://wakatime.com/badge/user/a0b906ce-b8e7-4463-8bce-383238df6d4b/project/faef70ab-efdb-448b-ab83-0fc66c95888e.svg)](https://wakatime.com/badge/user/a0b906ce-b8e7-4463-8bce-383238df6d4b/project/faef70ab-efdb-448b-ab83-0fc66c95888e)
  [![E2E](https://github.com/ragaeeb/shamela/actions/workflows/e2e.yml/badge.svg)](https://github.com/ragaeeb/shamela/actions/workflows/e2e.yml)
+ [![Vercel Deploy](https://deploy-badge.vercel.app/vercel/shamela)](https://shamela.vercel.app)
  [![Node.js CI](https://github.com/ragaeeb/shamela/actions/workflows/build.yml/badge.svg)](https://github.com/ragaeeb/shamela/actions/workflows/build.yml) ![GitHub License](https://img.shields.io/github/license/ragaeeb/shamela)
  ![GitHub Release](https://img.shields.io/github/v/release/ragaeeb/shamela)
  [![codecov](https://codecov.io/gh/ragaeeb/shamela/graph/badge.svg?token=PK55V1R324)](https://codecov.io/gh/ragaeeb/shamela)
@@ -12,7 +13,7 @@
  ![GitHub issues](https://img.shields.io/github/issues/ragaeeb/shamela)
  ![GitHub stars](https://img.shields.io/github/stars/ragaeeb/shamela?style=social)
 
- A `Node.js` library for accessing and downloading Maktabah Shamela v4 APIs. This library provides easy-to-use functions to interact with the Shamela API, download master and book databases, and retrieve book data programmatically.
+ A universal TypeScript library for accessing and downloading Maktabah Shamela v4 APIs. The package runs in both Node.js and modern browsers, providing ergonomic helpers to interact with the Shamela API, download master and book databases, and retrieve book data programmatically.
 
  ## Table of Contents
 
@@ -26,13 +27,15 @@ A `Node.js` library for accessing and downloading Maktabah Shamela v4 APIs. This
  - [getBookMetadata](#getbookmetadata)
  - [downloadBook](#downloadbook)
  - [getBook](#getbook)
+ - [getMaster](#getmaster)
  - [getCoverUrl](#getcoverurl)
  - [Examples](#examples)
  - [Downloading the Master Database](#downloading-the-master-database)
  - [Downloading a Book](#downloading-a-book)
  - [Retrieving Book Data](#retrieving-book-data)
- - [Getting Book Cover URLs](#getting-book-cover-urls)
+ - [Retrieving Master Data in memory](#retrieving-master-data-in-memory)
  - [Data Structures](#data-structures)
+ - [Next.js demo](#nextjs-demo)
  - [Testing](#testing)
  - [License](#license)
 
@@ -67,6 +70,7 @@ Before using the library, you need to set up some environment variables for API
  - `SHAMELA_API_KEY`: Your API key for accessing the Shamela API.
  - `SHAMELA_API_MASTER_PATCH_ENDPOINT`: The endpoint URL for the master database patches.
  - `SHAMELA_API_BOOKS_ENDPOINT`: The base endpoint URL for book-related API calls.
+ - `SHAMELA_SQLJS_WASM_URL` (optional): Override the default CDN URL used to load the `sql.js` WebAssembly binary when running in the browser.
 
  You can set these variables in a `.env` file at the root of your project:
 
@@ -74,8 +78,34 @@ You can set these variables in a `.env` file at the root of your project:
  SHAMELA_API_KEY=your_api_key_here
  SHAMELA_API_MASTER_PATCH_ENDPOINT=https://shamela.ws/api/master_patch
  SHAMELA_API_BOOKS_ENDPOINT=https://shamela.ws/api/books
+ # Optional when you host sql-wasm.wasm yourself
+ # SHAMELA_SQLJS_WASM_URL=https://example.com/sql-wasm.wasm
  ```
 
+ ### Runtime configuration (browsers and serverless)
+
+ When you cannot rely on environment variables (for example in a browser, an edge worker, or a serverless function), use the `configure` helper to provide credentials at runtime:
+
+ ```ts
+ import { configure } from 'shamela';
+
+ configure({
+     apiKey: process.env.NEXT_PUBLIC_SHAMELA_KEY,
+     booksEndpoint: 'https://shamela.ws/api/books',
+     masterPatchEndpoint: 'https://shamela.ws/api/master_patch',
+     // Optional: host sql-wasm.wasm yourself to control caching/CDN placement
+     sqlJsWasmUrl: '/assets/sql-wasm.wasm',
+     // Optional: integrate with your application's logging system
+     logger: console,
+     // Optional: provide a custom fetch implementation (for tests or SSR)
+     fetchImplementation: fetch,
+ });
+ ```
+
+ You can call `configure` multiple times; values are merged, so later calls update only the keys you pass in.
+
+ The optional `logger` must expose `debug`, `info`, `warn`, and `error` methods. When omitted, the library stays silent by default.
+
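A minimal logger that satisfies this contract can simply forward to `console`; a sketch (the `[shamela]` prefix is illustrative, not a library convention):

```ts
import { configure } from 'shamela';

// All four methods are required by the documented Logger contract.
const logger = {
    debug: (...args: unknown[]) => console.debug('[shamela]', ...args),
    error: (...args: unknown[]) => console.error('[shamela]', ...args),
    info: (...args: unknown[]) => console.info('[shamela]', ...args),
    warn: (...args: unknown[]) => console.warn('[shamela]', ...args),
};

configure({ logger });

// Later calls merge with earlier ones, so this only updates the API key.
configure({ apiKey: 'your_api_key_here' });
```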
  ## Usage
80
110
 
81
111
  ### Getting Started
@@ -89,6 +119,7 @@ import {
     getBookMetadata,
     downloadBook,
     getBook,
+     getMaster,
     getCoverUrl,
  } from 'shamela';
  ```
@@ -239,6 +270,26 @@ console.log(bookData.titles?.length); // Number of title entries
  console.log(bookData.pages[0].content); // Content of the first page
  ```
 
+ #### getMaster
+
+ Retrieves the entire master dataset (authors, books, categories) as a JavaScript object, including the version number that the API reports for the snapshot.
+
+ ```typescript
+ getMaster(): Promise<MasterData>
+ ```
+
+ **Returns:** Promise that resolves to the complete master dataset with version metadata
+
+ **Example:**
+
+ ```javascript
+ const masterData = await getMaster();
+ console.log(masterData.version); // Version of the downloaded master database
+ console.log(masterData.books.length); // Number of books available
+ console.log(masterData.categories.length); // Number of categories available
+ ```
+
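Because `getMaster` returns raw table rows, joins are left to the caller. A hedged sketch that resolves category names for the first few books, assuming only the documented `id`, `name`, and `category` fields:

```ts
import { getMaster } from 'shamela';

(async () => {
    const master = await getMaster();

    // A book's `category` column holds a serialized category ID, so build a
    // lookup table from the categories array to resolve readable names.
    const categoryNames = new Map(master.categories.map((c) => [String(c.id), c.name]));

    for (const book of master.books.slice(0, 5)) {
        console.log(book.name, categoryNames.get(String(book.category)) ?? '(uncategorized)');
    }
})();
```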
  #### getCoverUrl
 
  Generates the URL for a book's cover image.
@@ -278,6 +329,7 @@ import { downloadMasterDatabase } from 'shamela';
          outputFile: { path: './shamela_master.json' },
      });
      console.log(`Master data exported to: ${jsonPath}`);
+     console.log('The JSON file includes authors, books, categories, and the master version number.');
  } catch (error) {
      console.error('Error downloading master database:', error);
  }
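The output format is inferred from the file extension; besides the `.json` shown above, `.db` and `.sqlite` are recognized. A sketch of the SQLite variant:

```ts
import { downloadMasterDatabase } from 'shamela';

(async () => {
    // A '.db' (or '.sqlite') extension exports the raw SQLite image instead
    // of the structured JSON dataset.
    const dbPath = await downloadMasterDatabase({
        outputFile: { path: './shamela_master.db' },
    });
    console.log(`SQLite master database written to: ${dbPath}`);
})();
```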
@@ -339,6 +391,24 @@ import { getBook } from 'shamela';
  })();
  ```
 
+ ### Retrieving Master Data in memory
+
+ ```javascript
+ import { getMaster } from 'shamela';
+
+ (async () => {
+     try {
+         const masterData = await getMaster();
+
+         console.log(`Master snapshot version: ${masterData.version}`);
+         console.log(`Master dataset includes ${masterData.books.length} books`);
+         console.log(`Master dataset includes ${masterData.categories.length} categories`);
+     } catch (error) {
+         console.error('Error retrieving master data:', error);
+     }
+ })();
+ ```
+
  ### Getting Book Cover URLs
 
  ```javascript
@@ -379,6 +449,7 @@ The library provides comprehensive TypeScript types for all data structures:
  - `authors`: Raw entries from the `author` table with the original `biography`, `death_text`, `death_number`, `is_deleted`, and `name` fields.
  - `books`: Raw entries from the `book` table containing the original metadata columns (`author`, `bibliography`, `category`, `date`, `hint`, `major_release`, `metadata`, `minor_release`, `pdf_links`, `printed`, `type`, and `is_deleted`).
  - `categories`: Raw entries from the `category` table including `is_deleted`, `order`, and `name`.
+ - `version`: Version number reported by the Shamela API for the downloaded master database.
 
  ### Page
 
@@ -401,12 +472,40 @@ The library provides comprehensive TypeScript types for all data structures:
  - `parseContentRobust(content: string)`: Converts Shamela page HTML into a list of structured lines while preserving title markers and punctuation.
  - `sanitizePageContent(content: string)`: Removes common footnote markers and normalises ligatures from Shamela pages.
 
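These two helpers compose naturally: sanitize a page first, then parse it into structured lines. A small sketch with invented sample markup (the title-span shape mirrors Shamela pages; the text itself is made up):

```ts
import { parseContentRobust, sanitizePageContent } from 'shamela';

// Invented sample markup: a title span followed by body text.
const raw = '<span data-type="title" id="toc-1">باب الطهارة</span> نص الصفحة هنا.';

for (const line of parseContentRobust(sanitizePageContent(raw))) {
    // `id` is only present on title lines.
    console.log(line.id ?? '-', line.text);
}
```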
+ ## Next.js demo
+
+ A minimal Next.js 16 application in `demo/` replaces the previous Storybook setup and offers an RTL-friendly explorer for the Shamela APIs. Requests are executed on the server so the browser can bypass CORS limits, and you only need to provide an API key and a book identifier at runtime.
+
+ Create a `demo/.env.local` file (or export the variables in your shell) containing the real endpoints you wish to call:
+
+ ```dotenv
+ SHAMELA_API_MASTER_PATCH_ENDPOINT=https://dev.shamela.ws/api/v1/patches/master
+ SHAMELA_API_BOOKS_ENDPOINT=https://dev.shamela.ws/api/v1/patches/book-updates
+ # Optional when hosting the wasm asset yourself
+ # SHAMELA_SQLJS_WASM_URL=https://example.com/sql-wasm.wasm
+ ```
+
+ Then launch the demo:
+
+ ```bash
+ bun run demo
+ ```
+
+ Visit [http://localhost:3000](http://localhost:3000) to enter your API key, choose a book ID, and call helpers like `getMasterMetadata`, `getMaster`, `getBook`, and `downloadMasterDatabase` directly from the interface. For production-style builds, use:
+
+ ```bash
+ bun run demo:build
+ bun run demo:start
+ ```
+
+ When deploying to Vercel, point the project to the `demo` directory and supply the same environment variables in the dashboard so the API routes can reach Shamela.
+
  ## Testing
 
- The library includes comprehensive tests. To run them, ensure you have the necessary environment variables set, then execute:
+ The library includes comprehensive tests powered by `bun test`. To run the unit suite, ensure you have the necessary environment variables set, then execute:
 
  ```bash
- bun test
+ bun test src
  ```
 
  For end-to-end tests:
@@ -415,10 +514,12 @@ For end-to-end tests:
  bun run e2e
  ```
 
- For CI environment:
+ ### Formatting
+
+ Apply Biome formatting across the repository with:
 
  ```bash
- bun run e2e:ci
+ bun run format
  ```
 
  ## License
package/dist/index.d.ts CHANGED
@@ -1,92 +1,95 @@
+ //#region src/db/types.d.ts
+
  /**
   * A record that can be deleted by patches.
   */
  type Deletable = {
- /** Indicates if it was deleted in the patch if it is set to '1' */
- is_deleted?: string;
+ /** Indicates if it was deleted in the patch if it is set to '1' */
+ is_deleted?: string;
  };
  type Unique = {
- /** Unique identifier */
- id: number;
+ /** Unique identifier */
+ id: number;
  };
  /**
   * Database row structure for the author table.
   */
  type AuthorRow = Deletable & Unique & {
- /** Author biography */
- biography: string;
- /** Death year */
- death_number: string;
- /** The death year as a text */
- death_text: string;
- /** Author name */
- name: string;
+ /** Author biography */
+ biography: string;
+ /** Death year */
+ death_number: string;
+ /** The death year as a text */
+ death_text: string;
+ /** Author name */
+ name: string;
  };
  /**
   * Database row structure for the book table.
   */
  type BookRow = Deletable & Unique & {
- /** Serialized author ID(s) "2747, 3147" or "513" */
- author: string;
- /** Bibliography information */
- bibliography: string;
- /** Category ID */
- category: string;
- /** Publication date (or 99999 for unavailable) */
- date: string;
- /** Hint or description */
- hint: string;
- /** Major version */
- major_release: string;
- /** Serialized metadata */
- metadata: string;
- /** Minor version */
- minor_release: string;
- /** Book name */
- name: string;
- /** Serialized PDF links */
- pdf_links: string;
- /** Printed flag */
- printed: string;
- /** Book type */
- type: string;
+ /** Serialized author ID(s) "2747, 3147" or "513" */
+ author: string;
+ /** Bibliography information */
+ bibliography: string;
+ /** Category ID */
+ category: string;
+ /** Publication date (or 99999 for unavailable) */
+ date: string;
+ /** Hint or description */
+ hint: string;
+ /** Major version */
+ major_release: string;
+ /** Serialized metadata */
+ metadata: string;
+ /** Minor version */
+ minor_release: string;
+ /** Book name */
+ name: string;
+ /** Serialized PDF links */
+ pdf_links: string;
+ /** Printed flag */
+ printed: string;
+ /** Book type */
+ type: string;
  };
  /**
   * Database row structure for the category table.
   */
  type CategoryRow = Deletable & Unique & {
- /** Category name */
- name: string;
- /** Category order in the list to show. */
- order: string;
+ /** Category name */
+ name: string;
+ /** Category order in the list to show. */
+ order: string;
  };
  /**
   * Database row structure for the page table.
   */
  type PageRow = Deletable & Unique & {
- /** Page content */
- content: string;
- /** Page number */
- number: string | null;
- /** Page reference */
- page: string | null;
- /** Part number */
- part: string | null;
- /** Additional metadata */
- services: string | null;
+ /** Page content */
+ content: string;
+ /** Page number */
+ number: string | null;
+ /** Page reference */
+ page: string | null;
+ /** Part number */
+ part: string | null;
+ /** Additional metadata */
+ services: string | null;
  };
  /**
   * Database row structure for the title table.
   */
  type TitleRow = Deletable & Unique & {
- /** Title content */
- content: string;
- /** Page number */
- page: string;
- /** Parent title ID */
- parent: string | null;
+ /** Title content */
+ content: string;
+ /** Page number */
+ page: string;
+ /** Parent title ID */
+ parent: string | null;
  };
-
+ //#endregion
+ //#region src/types.d.ts
  /**
   * Represents an author entity.
   */
@@ -103,94 +106,123 @@ type Category = CategoryRow;
   * A page in a book.
   */
  type Page = Pick<PageRow, 'id' | 'content'> & {
- page?: number;
- part?: string;
- number?: string;
+ page?: number;
+ part?: string;
+ number?: string;
  };
  /**
   * A title heading in a book.
   */
  type Title = Pick<TitleRow, 'id' | 'content'> & {
- page: number;
- parent?: number;
+ page: number;
+ parent?: number;
  };
  /**
   * Represents book content data.
   */
  type BookData = {
- /** Array of pages in the book */
- pages: Page[];
- /** Array of titles/chapters */
- titles: Title[];
+ /** Array of pages in the book */
+ pages: Page[];
+ /** Array of titles/chapters */
+ titles: Title[];
  };
  /**
   * Master data structure containing all core entities.
   */
  type MasterData = {
- /** Array of all authors */
- authors: Author[];
- /** Array of all books */
- books: Book[];
- /** Array of all categories */
- categories: Category[];
+ /** Array of all authors */
+ authors: Author[];
+ /** Array of all books */
+ books: Book[];
+ /** Array of all categories */
+ categories: Category[];
+ /** Version number for the downloaded master database */
+ version: number;
  };
  /**
   * Options for downloading a book.
   */
  type DownloadBookOptions = {
- /** Optional book metadata */
- bookMetadata?: GetBookMetadataResponsePayload;
- /** Output file configuration */
- outputFile: OutputOptions;
+ /** Optional book metadata */
+ bookMetadata?: GetBookMetadataResponsePayload;
+ /** Output file configuration */
+ outputFile: OutputOptions;
  };
  /**
   * Options for downloading master data.
   */
  type DownloadMasterOptions = {
- /** Optional master metadata */
- masterMetadata?: GetMasterMetadataResponsePayload;
- /** Output file configuration */
- outputFile: OutputOptions;
+ /** Optional master metadata */
+ masterMetadata?: GetMasterMetadataResponsePayload;
+ /** Output file configuration */
+ outputFile: OutputOptions;
  };
  /**
   * Options for getting book metadata.
   */
  type GetBookMetadataOptions = {
- /** Major version number */
- majorVersion: number;
- /** Minor version number */
- minorVersion: number;
+ /** Major version number */
+ majorVersion: number;
+ /** Minor version number */
+ minorVersion: number;
  };
  /**
   * Response payload for book metadata requests.
   */
  type GetBookMetadataResponsePayload = {
- /** Major release version */
- majorRelease: number;
- /** URL for major release download */
- majorReleaseUrl: string;
- /** Optional minor release version */
- minorRelease?: number;
- /** Optional URL for minor release download */
- minorReleaseUrl?: string;
+ /** Major release version */
+ majorRelease: number;
+ /** URL for major release download */
+ majorReleaseUrl: string;
+ /** Optional minor release version */
+ minorRelease?: number;
+ /** Optional URL for minor release download */
+ minorReleaseUrl?: string;
  };
  /**
   * Response payload for master metadata requests.
   */
  type GetMasterMetadataResponsePayload = {
- /** Download URL */
- url: string;
- /** Version number */
- version: number;
+ /** Download URL */
+ url: string;
+ /** Version number */
+ version: number;
+ };
+ type NodeJSOutput = {
+ /** Output file path (Node.js only) */
+ path: string;
+ writer?: never;
+ };
+ type CustomOutput = {
+ /** Custom writer used when path is not provided */
+ writer: (payload: string | Uint8Array) => Promise<void> | void;
+ path?: undefined;
  };
  /**
   * Output file options.
   */
- interface OutputOptions {
- /** Output file path */
- path: string;
- }
-
+ type OutputOptions = NodeJSOutput | CustomOutput;
+ /**
+ * Runtime configuration for the library.
+ */
+ type ShamelaConfig = {
+ /** API key used to authenticate against Shamela services */
+ apiKey?: string;
+ /** Endpoint used for book metadata */
+ booksEndpoint?: string;
+ /** Endpoint used for master metadata */
+ masterPatchEndpoint?: string;
+ /** Optional override for the sql.js wasm asset location */
+ sqlJsWasmUrl?: string;
+ /** Optional custom fetch implementation for environments without a global fetch */
+ fetchImplementation?: typeof fetch;
+ };
+ /**
+ * Valid configuration keys.
+ */
+ type ShamelaConfigKey = keyof ShamelaConfig;
+ //#endregion
+ //#region src/api.d.ts
  /**
   * Retrieves metadata for a specific book from the Shamela API.
   *
@@ -320,10 +352,57 @@ declare const downloadMasterDatabase: (options: DownloadMasterOptions) => Promis
   * ```
   */
  declare const getBook: (id: number) => Promise<BookData>;
-
+ /**
+ * Retrieves complete master data including authors, books, and categories.
+ *
+ * This convenience function downloads the master database archive, builds an in-memory
+ * SQLite database, and returns structured data for immediate consumption alongside
+ * the version number of the snapshot.
+ *
+ * @returns A promise that resolves to the complete master dataset and its version
+ */
+ declare const getMaster: () => Promise<MasterData>;
+ //#endregion
+ //#region src/utils/logger.d.ts
+ /**
+ * Signature accepted by logger methods.
+ */
+ type LogFunction = (...args: unknown[]) => void;
+ /**
+ * Contract expected from logger implementations consumed by the library.
+ */
+ interface Logger {
+ debug: LogFunction;
+ error: LogFunction;
+ info: LogFunction;
+ warn: LogFunction;
+ }
+ //#endregion
+ //#region src/config.d.ts
+ /**
+ * Runtime configuration options accepted by {@link configure}.
+ */
+ type ConfigureOptions = Partial<ShamelaConfig> & {
+ logger?: Logger;
+ };
+ /**
+ * Updates the runtime configuration for the library.
+ *
+ * This function merges the provided options with existing overrides and optionally
+ * configures a custom logger implementation.
+ *
+ * @param config - Runtime configuration overrides and optional logger instance
+ */
+ declare const configure: (config: ConfigureOptions) => void;
+ /**
+ * Clears runtime configuration overrides and restores the default logger.
+ */
+ declare const resetConfig: () => void;
+ //#endregion
+ //#region src/content.d.ts
  type Line = {
- id?: string;
- text: string;
+ id?: string;
+ text: string;
  };
  declare const parseContentRobust: (content: string) => Line[];
  /**
@@ -334,14 +413,8 @@ declare const parseContentRobust: (content: string) => Line[];
   */
  declare const sanitizePageContent: (text: string, rules?: Record<string, string>) => string;
  declare const splitPageBodyFromFooter: (content: string, footnoteMarker?: string) => readonly [string, string];
-
- type LogFunction = (...args: unknown[]) => void;
- interface Logger {
- debug: LogFunction;
- error: LogFunction;
- info: LogFunction;
- warn: LogFunction;
- }
- declare const setLogger: (newLogger?: Logger) => void;
-
- export { type Author, type Book, type BookData, type Category, type DownloadBookOptions, type DownloadMasterOptions, type GetBookMetadataOptions, type GetBookMetadataResponsePayload, type GetMasterMetadataResponsePayload, type Line, type MasterData, type OutputOptions, type Page, type Title, downloadBook, downloadMasterDatabase, getBook, getBookMetadata, getCoverUrl, getMasterMetadata, parseContentRobust, sanitizePageContent, setLogger, splitPageBodyFromFooter };
+ declare const removeArabicNumericPageMarkers: (text: string) => string;
+ declare const removeTagsExceptSpan: (content: string) => string;
+ //#endregion
+ export { Author, Book, BookData, Category, type ConfigureOptions, DownloadBookOptions, DownloadMasterOptions, GetBookMetadataOptions, GetBookMetadataResponsePayload, GetMasterMetadataResponsePayload, Line, type Logger, MasterData, OutputOptions, Page, ShamelaConfig, ShamelaConfigKey, Title, configure, downloadBook, downloadMasterDatabase, getBook, getBookMetadata, getCoverUrl, getMaster, getMasterMetadata, parseContentRobust, removeArabicNumericPageMarkers, removeTagsExceptSpan, resetConfig, sanitizePageContent, splitPageBodyFromFooter };
+ //# sourceMappingURL=index.d.ts.map
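The declarations above make the HTTP layer injectable, which is handy in tests: pass a stub through `fetchImplementation`, then call `resetConfig` to restore defaults. A hedged sketch (the `patch_url`/`version` payload fields mirror what the client appears to map into `GetMasterMetadataResponsePayload`, but the exact wire contract is an assumption here):

```ts
import { configure, getMasterMetadata, resetConfig } from 'shamela';

(async () => {
    // Stub the network layer with a canned JSON response.
    const fakeFetch = (async () =>
        new Response(JSON.stringify({ patch_url: 'https://example.com/master.zip', version: 7 }), {
            headers: { 'content-type': 'application/json' },
        })) as typeof fetch;

    configure({
        apiKey: 'test-key',
        booksEndpoint: 'https://example.com/api/books',
        masterPatchEndpoint: 'https://example.com/api/master_patch',
        fetchImplementation: fakeFetch,
    });

    const { url, version } = await getMasterMetadata();
    console.log(url, version); // https://example.com/master.zip 7

    // Restore environment-based configuration and the default (silent) logger.
    resetConfig();
})();
```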
package/dist/index.js CHANGED
@@ -1,4 +1,4 @@
- import{Database as W}from"bun:sqlite";import{promises as m}from"fs";import g from"path";import B from"process";import{URL as Z}from"url";import{Database as y}from"bun:sqlite";var I={debug:()=>{},error:()=>{},info:()=>{},warn:()=>{}},l=I,ee=(e=I)=>{if(!e.debug||!e.error||!e.info)throw new Error("Logger must implement debug, error, and info methods");l=e};var te="#",M=(e,t)=>e.query(`PRAGMA table_info(${t})`).all(),w=(e,t)=>!!e.query("SELECT name FROM sqlite_master WHERE type='table' AND name = ?1").get(t),O=(e,t)=>w(e,t)?e.query(`SELECT * FROM ${t}`).all():[],k=e=>String(e.is_deleted)==="1",C=(e,t,r)=>{let o={};for(let n of r){if(n==="id"){o.id=(t??e)?.id??null;continue}if(t&&n in t){let i=t[n];if(i!==te&&i!==null&&i!==void 0){o[n]=i;continue}}if(e&&n in e){o[n]=e[n];continue}o[n]=null}return o},re=(e,t,r)=>{let o=new Set,n=new Map;for(let a of e)o.add(String(a.id));for(let a of t)n.set(String(a.id),a);let i=[];for(let a of e){let s=n.get(String(a.id));s&&k(s)||i.push(C(a,s,r))}for(let a of t){let s=String(a.id);o.has(s)||k(a)||i.push(C(void 0,a,r))}return i},oe=(e,t,r,o)=>{if(o.length===0)return;let n=r.map(()=>"?").join(","),i=e.prepare(`INSERT INTO ${t} (${r.join(",")}) VALUES (${n})`);o.forEach(a=>{let s=r.map(c=>c in a?a[c]:null);i.run(...s)}),i.finalize()},ne=(e,t,r)=>{let o=t.query("SELECT sql FROM sqlite_master WHERE type='table' AND name = ?1").get(r);return o?.sql?(e.run(`DROP TABLE IF EXISTS ${r}`),e.run(o.sql),!0):(l.warn(`${r} table definition missing in source database`),!1)},f=(e,t,r,o)=>{if(!w(t,o)){l.warn(`${o} table missing in source database`);return}if(!ne(e,t,o))return;let n=M(t,o),i=r&&w(r,o)?M(r,o):[],a=n.map(p=>p.name);for(let p of i)if(!a.includes(p.name)){let h=p.type&&p.type.length>0?p.type:"TEXT";e.run(`ALTER TABLE ${o} ADD COLUMN ${p.name} ${h}`),a.push(p.name)}let s=O(t,o),c=r?O(r,o):[],u=re(s,c,a);oe(e,o,a,u)},N=(e,t,r)=>{let o=new y(t),n=new y(r);try{e.transaction(()=>{f(e,o,n,"page"),f(e,o,n,"title")})()}finally{o.close(),n.close()}},F=(e,t)=>{let r=new y(t);try{e.transaction(()=>{f(e,r,null,"page"),f(e,r,null,"title")})()}finally{r.close()}},U=e=>{e.run(`CREATE TABLE page (
+ import e from"sql.js";import{unzipSync as t}from"fflate";const n=Object.freeze({debug:()=>{},error:()=>{},info:()=>{},warn:()=>{}});let r=n;const i=e=>{if(!e){r=n;return}let t=[`debug`,`error`,`info`,`warn`].find(t=>typeof e[t]!=`function`);if(t)throw Error(`Logger must implement debug, error, info, and warn methods. Missing: ${String(t)}`);r=e},a=()=>r,o=()=>{r=n};var s=new Proxy({},{get:(e,t)=>{let n=a(),r=n[t];return typeof r==`function`?(...e)=>r.apply(n,e):r}});let c={};const l={apiKey:`SHAMELA_API_KEY`,booksEndpoint:`SHAMELA_API_BOOKS_ENDPOINT`,masterPatchEndpoint:`SHAMELA_API_MASTER_PATCH_ENDPOINT`,sqlJsWasmUrl:`SHAMELA_SQLJS_WASM_URL`},u=typeof process<`u`&&!!process?.env,d=e=>{let t=c[e];if(t!==void 0)return t;let n=l[e];if(u)return process.env[n]},ee=e=>{let{logger:t,...n}=e;`logger`in e&&i(t),c={...c,...n}},f=e=>e===`fetchImplementation`?c.fetchImplementation:d(e),p=()=>({apiKey:d(`apiKey`),booksEndpoint:d(`booksEndpoint`),fetchImplementation:c.fetchImplementation,masterPatchEndpoint:d(`masterPatchEndpoint`),sqlJsWasmUrl:d(`sqlJsWasmUrl`)}),m=e=>{if(e===`fetchImplementation`)throw Error(`fetchImplementation must be provided via configure().`);let t=f(e);if(!t)throw Error(`${l[e]} environment variable not set`);return t},te=()=>{c={},o()};let h=function(e){return e.Authors=`author`,e.Books=`book`,e.Categories=`category`,e.Page=`page`,e.Title=`title`,e}({});const g=(e,t)=>e.query(`PRAGMA table_info(${t})`).all(),_=(e,t)=>!!e.query(`SELECT name FROM sqlite_master WHERE type='table' AND name = ?1`).get(t),v=(e,t)=>_(e,t)?e.query(`SELECT * FROM ${t}`).all():[],y=e=>String(e.is_deleted)===`1`,b=(e,t,n)=>{let r={};for(let i of n){if(i===`id`){r.id=(t??e)?.id??null;continue}if(t&&i in t){let e=t[i];if(e!==`#`&&e!=null){r[i]=e;continue}}if(e&&i in e){r[i]=e[i];continue}r[i]=null}return r},ne=(e,t,n)=>{let r=new Set,i=new Map;for(let t of e)r.add(String(t.id));for(let e of t)i.set(String(e.id),e);let a=[];for(let t of e){let e=i.get(String(t.id));e&&y(e)||a.push(b(t,e,n))}for(let e of t){let t=String(e.id);r.has(t)||y(e)||a.push(b(void 0,e,n))}return a},re=(e,t,n,r)=>{if(r.length===0)return;let i=n.map(()=>`?`).join(`,`),a=e.prepare(`INSERT INTO ${t} (${n.join(`,`)}) VALUES (${i})`);r.forEach(e=>{let t=n.map(t=>t in e?e[t]:null);a.run(...t)}),a.finalize()},ie=(e,t,n)=>{let r=t.query(`SELECT sql FROM sqlite_master WHERE type='table' AND name = ?1`).get(n);return r?.sql?(e.run(`DROP TABLE IF EXISTS ${n}`),e.run(r.sql),!0):(s.warn(`${n} table definition missing in source database`),!1)},x=(e,t,n,r)=>{if(!_(t,r)){s.warn(`${r} table missing in source database`);return}if(!ie(e,t,r))return;let i=g(t,r),a=n&&_(n,r)?g(n,r):[],o=i.map(e=>e.name);for(let t of a)if(!o.includes(t.name)){let n=t.type&&t.type.length>0?t.type:`TEXT`;e.run(`ALTER TABLE ${r} ADD COLUMN ${t.name} ${n}`),o.push(t.name)}re(e,r,o,ne(v(t,r),n?v(n,r):[],o))},ae=(e,t,n)=>{e.transaction(()=>{x(e,t,n,h.Page),x(e,t,n,h.Title)})()},oe=(e,t)=>{e.transaction(()=>{x(e,t,null,h.Page),x(e,t,null,h.Title)})()},S=e=>{e.run(`CREATE TABLE ${h.Page} (
  id INTEGER,
  content TEXT,
  part TEXT,
@@ -6,20 +6,20 @@ import{Database as W}from"bun:sqlite";import{promises as m}from"fs";import g fro
  number TEXT,
  services TEXT,
  is_deleted TEXT
- )`),e.run(`CREATE TABLE title (
+ )`),e.run(`CREATE TABLE ${h.Title} (
  id INTEGER,
  content TEXT,
  page INTEGER,
  parent INTEGER,
  is_deleted TEXT
- )`)},ae=e=>e.query("SELECT * FROM page").all(),se=e=>e.query("SELECT * FROM title").all(),R=e=>({pages:ae(e),titles:se(e)});import ie from"path";var v=(e,t)=>{let r=e.replace(/'/g,"''");if(!/^[a-zA-Z0-9_]+$/.test(t))throw new Error("Invalid database alias");return`ATTACH DATABASE '${r}' AS ${t}`};var X=e=>{if(!/^[a-zA-Z0-9_]+$/.test(e))throw new Error("Invalid database alias");return`DETACH DATABASE ${e}`};var b=(e,t,r)=>{let o=e.query(`SELECT sql FROM ${t}.sqlite_master WHERE type='table' AND name = ?1`).get(r);if(!o?.sql)throw new Error(`Missing table definition for ${r} in ${t}`);e.run(`DROP TABLE IF EXISTS ${r}`),e.run(o.sql)},q=(e,t)=>{let r={};for(let a of t){let{name:s}=ie.parse(a);r[s]=a}Object.entries(r).forEach(([a,s])=>{e.run(v(s,a))}),b(e,"author","author"),b(e,"book","book"),b(e,"category","category");let o=e.prepare("INSERT INTO author SELECT * FROM author.author"),n=e.prepare("INSERT INTO book SELECT * FROM book.book"),i=e.prepare("INSERT INTO category SELECT * FROM category.category");e.transaction(()=>{o.run(),n.run(),i.run()})(),Object.keys(r).forEach(a=>{e.run(X(a))})},A=(e,t,r)=>{e.run(`DROP VIEW IF EXISTS ${t}`),e.run(`CREATE VIEW ${t} AS SELECT * FROM ${r}`)},j=e=>{e.run(`CREATE TABLE author (
+ )`)},C=e=>e.query(`SELECT * FROM ${h.Page}`).all(),w=e=>e.query(`SELECT * FROM ${h.Title}`).all(),T=e=>({pages:C(e),titles:w(e)});var se=class{constructor(e){this.statement=e}run=(...e)=>{e.length>0&&this.statement.bind(e),this.statement.step(),this.statement.reset()};finalize=()=>{this.statement.free()}},E=class{constructor(e){this.db=e}run=(e,t=[])=>{this.db.run(e,t)};prepare=e=>new se(this.db.prepare(e));query=e=>({all:(...t)=>this.all(e,t),get:(...t)=>this.get(e,t)});transaction=e=>()=>{this.db.run(`BEGIN TRANSACTION`);try{e(),this.db.run(`COMMIT`)}catch(e){throw this.db.run(`ROLLBACK`),e}};close=()=>{this.db.close()};export=()=>this.db.export();all=(e,t)=>{let n=this.db.prepare(e);try{t.length>0&&n.bind(t);let e=[];for(;n.step();)e.push(n.getAsObject());return e}finally{n.free()}};get=(e,t)=>this.all(e,t)[0]};let D=null,O=null;const ce=typeof process<`u`&&!!process?.versions?.node,le=()=>{if(!O){let e=f(`sqlJsWasmUrl`);if(e)O=e;else if(ce){let e=new URL(`../../node_modules/sql.js/dist/sql-wasm.wasm`,import.meta.url);O=decodeURIComponent(e.pathname)}else O=`https://cdn.jsdelivr.net/npm/sql.js@1.13.0/dist/sql-wasm.wasm`}return O},k=()=>(D||=e({locateFile:()=>le()}),D),A=async()=>new E(new(await(k())).Database),j=async e=>new E(new(await(k())).Database(e)),ue=(e,t,n)=>{let r=t.query(`SELECT sql FROM sqlite_master WHERE type='table' AND name = ?1`).get(n);if(!r?.sql)throw Error(`Missing table definition for ${n} in source database`);e.run(`DROP TABLE IF EXISTS ${n}`),e.run(r.sql)},de=async(e,t)=>{let n={author:h.Authors,book:h.Books,category:h.Categories},r={};for(let e of t){let t=n[(e.name.split(`/`).pop()?.split(`\\`).pop()??e.name).replace(/\.(sqlite|db)$/i,``).toLowerCase()];t&&(r[t]=await j(e.data))}try{let t=Object.entries(r);e.transaction(()=>{for(let[n,r]of t){ue(e,r,n);let t=r.query(`PRAGMA table_info(${n})`).all().map(e=>e.name);if(t.length===0)continue;let i=r.query(`SELECT * FROM ${n}`).all();if(i.length===0)continue;let a=t.map(()=>`?`).join(`,`),o=t.map(e=>e===`order`?`"order"`:e),s=e.prepare(`INSERT INTO ${n} (${o.join(`,`)}) VALUES (${a})`);try{for(let e of i){let n=t.map(t=>t in e?e[t]:null);s.run(...n)}}finally{s.finalize()}}})()}finally{Object.values(r).forEach(e=>e?.close())}},M=(e,t,n)=>{e.run(`DROP VIEW IF EXISTS ${t}`),e.run(`CREATE VIEW ${t} AS SELECT * FROM ${n}`)},fe=e=>{e.run(`CREATE TABLE ${h.Authors} (
  id INTEGER,
  is_deleted TEXT,
  name TEXT,
  biography TEXT,
  death_text TEXT,
  death_number TEXT
- )`),e.run(`CREATE TABLE book (
+ )`),e.run(`CREATE TABLE ${h.Books} (
  id INTEGER,
  name TEXT,
  is_deleted TEXT,
@@ -34,14 +34,14 @@ import{Database as W}from"bun:sqlite";import{promises as m}from"fs";import g fro
  hint TEXT,
  pdf_links TEXT,
  metadata TEXT
- )`),e.run(`CREATE TABLE category (
+ )`),e.run(`CREATE TABLE ${h.Categories} (
  id INTEGER,
  is_deleted TEXT,
  "order" TEXT,
  name TEXT
- )`),A(e,"authors","author"),A(e,"books","book"),A(e,"categories","category")},ce=e=>e.query("SELECT * FROM author").all(),le=e=>e.query("SELECT * FROM book").all(),pe=e=>e.query("SELECT * FROM category").all(),G=e=>({authors:ce(e),books:le(e),categories:pe(e)});var T=(e,t=["api_key","token","password","secret","auth"])=>{let r=typeof e=="string"?new URL(e):new URL(e.toString());return t.forEach(o=>{let n=r.searchParams.get(o);if(n&&n.length>6){let i=`${n.slice(0,3)}***${n.slice(-3)}`;r.searchParams.set(o,i)}else n&&r.searchParams.set(o,"***")}),r.toString()},H=e=>({content:e.content,id:e.id,...e.number&&{number:e.number},...e.page&&{page:Number(e.page)},...e.part&&{part:e.part}}),V=e=>{let t=Number(e.parent);return{content:e.content,id:e.id,page:Number(e.page),...t&&{parent:t}}};var d={"<img[^>]*>>":"",\u8204:"","\uFD40":"\u0631\u064E\u062D\u0650\u0645\u064E\u0647\u064F \u0671\u0644\u0644\u064E\u0651\u0670\u0647\u064F","\uFD41":"\u0631\u0636\u064A \u0627\u0644\u0644\u0647 \u0639\u0646\u0647","\uFD4C":"\u0635\u0644\u0649 \u0627\u0644\u0644\u0647 \u0639\u0644\u064A\u0647 \u0648\u0622\u0644\u0647 \u0648\u0633\u0644\u0645"};import{createWriteStream as ue,promises as D}from"fs";import me from"https";import ge from"os";import x from"path";import{pipeline as fe}from"stream/promises";import Te from"unzipper";var _=async(e="shamela")=>{let t=x.join(ge.tmpdir(),e);return D.mkdtemp(t)};async function E(e,t){let r=[];try{let o=await new Promise((n,i)=>{me.get(e,a=>{a.statusCode!==200?i(new Error(`Failed to download ZIP file: ${a.statusCode} ${a.statusMessage}`)):n(a)}).on("error",a=>{i(new Error(`HTTPS request failed: ${a.message}`))})});return await new Promise((n,i)=>{let a=Te.Parse(),s=[];a.on("entry",c=>{let u=(async()=>{let p=x.join(t,c.path);if(c.type==="Directory")await D.mkdir(p,{recursive:!0}),c.autodrain();else{let h=x.dirname(p);await D.mkdir(h,{recursive:!0});let Q=ue(p);await fe(c,Q),r.push(p)}})();s.push(u)}),a.on("finish",async()=>{try{await Promise.all(s),n()}catch(c){i(c)}}),a.on("error",c=>{i(new Error(`Error during extraction: ${c.message}`))}),o.pipe(a)}),r}catch(o){throw new Error(`Error processing URL: ${o.message}`)}}import{Buffer as de}from"buffer";import Ee from"https";import he from"process";import{URL as ye,URLSearchParams as we}from"url";var S=(e,t,r=!0)=>{let o=new ye(e);{let n=new we;Object.entries(t).forEach(([i,a])=>{n.append(i,a.toString())}),r&&n.append("api_key",he.env.SHAMELA_API_KEY),o.search=n.toString()}return o},P=e=>new Promise((t,r)=>{Ee.get(e,o=>{let n=o.headers["content-type"]||"",i=[];o.on("data",a=>{i.push(a)}),o.on("end",()=>{let a=de.concat(i);if(n.includes("application/json"))try{let s=JSON.parse(a.toString("utf-8"));t(s)}catch(s){r(new Error(`Failed to parse JSON: ${s.message}`))}else t(a)})}).on("error",o=>{r(new Error(`Error making request: ${o.message}`))})});import Re from"path";import be from"process";var Ae=["author.sqlite","book.sqlite","category.sqlite"],$=()=>{let e=["SHAMELA_API_MASTER_PATCH_ENDPOINT","SHAMELA_API_BOOKS_ENDPOINT","SHAMELA_API_KEY"].filter(t=>!be.env[t]);if(e.length)throw new Error(`${e.join(", ")} environment variables not set`)},z=e=>{let t=new Set(e.map(r=>Re.basename(r).toLowerCase()));return Ae.every(r=>t.has(r.toLowerCase()))};var L=e=>{let t=new Z(e);return t.protocol="https",t.toString()},J=async(e,t)=>{l.info(`Setting up book database for ${e}`);let r=await _("shamela_setupBook"),o=t||await De(e),[[n],[i]=[]]=await 
Promise.all([E(o.majorReleaseUrl,r),...o.minorReleaseUrl?[E(o.minorReleaseUrl,r)]:[]]),a=g.join(r,"book.db"),s=new W(a);try{return l.info("Creating tables"),await U(s),i?(l.info(`Applying patches from ${i} to ${n}`),await N(s,n,i)):(l.info(`Copying table data from ${n}`),await F(s,n)),{cleanup:async()=>{s.close(),await m.rm(r,{recursive:!0})},client:s}}catch(c){throw s.close(),await m.rm(r,{recursive:!0}),c}},De=async(e,t)=>{$();let r=S(`${B.env.SHAMELA_API_BOOKS_ENDPOINT}/${e}`,{major_release:(t?.majorVersion||0).toString(),minor_release:(t?.minorVersion||0).toString()});l.info(`Fetching shamela.ws book link: ${T(r)}`);try{let o=await P(r);return{majorRelease:o.major_release,majorReleaseUrl:L(o.major_release_url),...o.minor_release_url&&{minorReleaseUrl:L(o.minor_release_url)},...o.minor_release_url&&{minorRelease:o.minor_release}}}catch(o){throw new Error(`Error fetching book metadata: ${o.message}`)}},ht=async(e,t)=>{l.info(`downloadBook ${e} ${JSON.stringify(t)}`);let{client:r,cleanup:o}=await J(e,t?.bookMetadata);try{let{ext:n}=g.parse(t.outputFile.path);if(n===".json"){let i=await R(r);await Bun.file(t.outputFile.path).write(JSON.stringify(i,null,2))}else if(n===".db"||n===".sqlite"){let i=r.filename;r.close(),await m.rename(i,t.outputFile.path);let a=g.dirname(i);return await m.rm(a,{recursive:!0}),t.outputFile.path}await o()}catch(n){throw await o(),n}return t.outputFile.path},xe=async(e=0)=>{$();let t=S(B.env.SHAMELA_API_MASTER_PATCH_ENDPOINT,{version:e.toString()});l.info(`Fetching shamela.ws master database patch link: ${T(t)}`);try{let r=await P(t);return{url:r.patch_url,version:r.version}}catch(r){throw new Error(`Error fetching master patch: ${r.message}`)}},yt=e=>{let{origin:t}=new Z(B.env.SHAMELA_API_MASTER_PATCH_ENDPOINT);return`${t}/covers/${e}.jpg`},wt=async e=>{l.info(`downloadMasterDatabase ${JSON.stringify(e)}`);let t=await _("shamela_downloadMaster"),r=e.masterMetadata||await xe(0);l.info(`Downloading master database ${r.version} from: ${T(r.url)}`);let o=await E(L(r.url),t);if(l.info(`sourceTables downloaded: ${o.toString()}`),!z(o))throw l.error(`Some source tables were not found: ${o.toString()}`),new Error("Expected tables not found!");let n=g.join(t,"master.db"),i=new W(n);try{l.info("Creating tables"),await j(i),l.info("Copying data to master table"),await q(i,o);let{ext:a}=g.parse(e.outputFile.path);if(a===".json"){let s=await G(i);await Bun.file(e.outputFile.path).write(JSON.stringify(s,null,2))}i.close(),(a===".db"||a===".sqlite")&&await m.rename(n,e.outputFile.path),await m.rm(t,{recursive:!0})}finally{i.close()}return e.outputFile.path},Rt=async e=>{l.info(`getBook ${e}`);let{client:t,cleanup:r}=await J(e);try{let o=await R(t);return{pages:o.pages.map(H),titles:o.titles.map(V)}}finally{await r()}};var _e=/^[)\]\u00BB"”'’.,?!:\u061B\u060C\u061F\u06D4\u2026]+$/,Se=/[[({«“‘]$/,Pe=e=>{let t=[];for(let r of e){let o=t[t.length-1];o?.id&&_e.test(r.text)?o.text+=r.text:t.push(r)}return t},$e=e=>{let t=e.replace(/\r\n/g,`
+ )`),M(e,`authors`,h.Authors),M(e,`books`,h.Books),M(e,`categories`,h.Categories)},pe=e=>e.query(`SELECT * FROM ${h.Authors}`).all(),me=e=>e.query(`SELECT * FROM ${h.Books}`).all(),he=e=>e.query(`SELECT * FROM ${h.Categories}`).all(),N=(e,t)=>({authors:pe(e),books:me(e),categories:he(e),version:t}),P=(e,t=[`api_key`,`token`,`password`,`secret`,`auth`])=>{let n=typeof e==`string`?new URL(e):new URL(e.toString());return t.forEach(e=>{let t=n.searchParams.get(e);if(t&&t.length>6){let r=`${t.slice(0,3)}***${t.slice(-3)}`;n.searchParams.set(e,r)}else t&&n.searchParams.set(e,`***`)}),n.toString()},F=e=>({content:e.content,id:e.id,...e.number&&{number:e.number},...e.page&&{page:Number(e.page)},...e.part&&{part:e.part}}),I=e=>{let t=Number(e.parent);return{content:e.content,id:e.id,page:Number(e.page),...t&&{parent:t}}},L={"<img[^>]*>>":``,舄:``,"":`رَحِمَهُ ٱللَّٰهُ`,"":`رضي الله عنه`,"":`رَضِيَ ٱللَّٰهُ عَنْهَا`,"":`رَضِيَ اللَّهُ عَنْهُمْ`,"":`رَضِيَ ٱللَّٰهُ عَنْهُمَا`,"":`رَضِيَ اللَّهُ عَنْهُنَّ`,"":`صلى الله عليه وآله وسلم`,"":`رَحِمَهُمُ ٱللَّٰهُ`},R=e=>{let t=new URL(e);return t.protocol=`https`,t.toString()},z=e=>/\.(sqlite|db)$/i.test(e.name),B=e=>e.find(z),V=e=>{let t=/\.([^.]+)$/.exec(e);return t?`.${t[1].toLowerCase()}`:``},H=(e,t,n=!0)=>{let r=new URL(e),i=new URLSearchParams;return Object.entries(t).forEach(([e,t])=>{i.append(e,t.toString())}),n&&i.append(`api_key`,m(`apiKey`)),r.search=i.toString(),r},U=async(e,t={})=>{let n=typeof e==`string`?e:e.toString(),r=await(t.fetchImpl??p().fetchImplementation??fetch)(n);if(!r.ok)throw Error(`Error making request: ${r.status} ${r.statusText}`);if((r.headers.get(`content-type`)??``).includes(`application/json`))return await r.json();let i=await r.arrayBuffer();return new Uint8Array(i)},ge=typeof process<`u`&&!!process?.versions?.node,_e=async()=>{if(!ge)throw Error(`File system operations are only supported in Node.js environments`);return import(`node:fs/promises`)},ve=async e=>{let[t,n]=await Promise.all([_e(),import(`node:path`)]),r=n.dirname(e);return await t.mkdir(r,{recursive:!0}),t},W=async e=>{let n=await U(e),r=n instanceof Uint8Array?n.length:n&&typeof n.byteLength==`number`?n.byteLength:0;return s.debug(`unzipFromUrl:bytes`,r),new Promise((e,r)=>{let i=n instanceof Uint8Array?n:new Uint8Array(n);try{let n=t(i),r=Object.entries(n).map(([e,t])=>({data:t,name:e}));s.debug(`unzipFromUrl:entries`,r.map(e=>e.name)),e(r)}catch(e){r(Error(`Error processing URL: ${e.message}`))}})},G=async(e,t)=>{if(e.writer){await e.writer(t);return}if(!e.path)throw Error(`Output options must include either a writer or a path`);let n=await ve(e.path);typeof t==`string`?await n.writeFile(e.path,t,`utf-8`):await n.writeFile(e.path,t)},ye=[`author.sqlite`,`book.sqlite`,`category.sqlite`],K=()=>{let{apiKey:e,booksEndpoint:t,masterPatchEndpoint:n}=p(),r=[[`apiKey`,e],[`booksEndpoint`,t],[`masterPatchEndpoint`,n]].filter(([,e])=>!e).map(([e])=>e);if(r.length)throw Error(`${r.join(`, `)} environment variables not set`)},be=e=>{let t=new Set(e.map(e=>e.match(/[^\\/]+$/)?.[0]??e).map(e=>e.toLowerCase()));return ye.every(e=>t.has(e.toLowerCase()))},q=async(e,t)=>{s.info(`Setting up book database for ${e}`);let n=t||await Y(e),r=n.minorReleaseUrl?W(n.minorReleaseUrl):Promise.resolve([]),[i,a]=await Promise.all([W(n.majorReleaseUrl),r]),o=B(i);if(!o)throw Error(`Unable to locate book database in archive`);let c=await A();try{s.info(`Creating tables`),S(c);let e=await j(o.data);try{let t=B(a);if(t){s.info(`Applying patches from ${t.name} to ${o.name}`);let n=await 
j(t.data);try{ae(c,e,n)}finally{n.close()}}else s.info(`Copying table data from ${o.name}`),oe(c,e)}finally{e.close()}return{cleanup:async()=>{c.close()},client:c}}catch(e){throw c.close(),e}},J=async e=>{s.info(`Setting up master database`);let t=e||await X(0);s.info(`Downloading master database ${t.version} from: ${P(t.url)}`);let n=await W(R(t.url));if(s.debug?.(`sourceTables downloaded: ${n.map(e=>e.name).toString()}`),!be(n.map(e=>e.name)))throw s.error(`Some source tables were not found: ${n.map(e=>e.name).toString()}`),Error(`Expected tables not found!`);let r=await A();try{return s.info(`Creating master tables`),fe(r),s.info(`Copying data to master table`),await de(r,n.filter(z)),{cleanup:async()=>{r.close()},client:r,version:t.version}}catch(e){throw r.close(),e}},Y=async(e,t)=>{K();let n=H(`${m(`booksEndpoint`)}/${e}`,{major_release:(t?.majorVersion||0).toString(),minor_release:(t?.minorVersion||0).toString()});s.info(`Fetching shamela.ws book link: ${P(n)}`);try{let e=await U(n);return{majorRelease:e.major_release,majorReleaseUrl:R(e.major_release_url),...e.minor_release_url&&{minorReleaseUrl:R(e.minor_release_url)},...e.minor_release_url&&{minorRelease:e.minor_release}}}catch(e){throw Error(`Error fetching book metadata: ${e.message}`)}},xe=async(e,t)=>{if(s.info(`downloadBook ${e} ${JSON.stringify(t)}`),!t.outputFile.path)throw Error(`outputFile.path must be provided to determine output format`);let n=V(t.outputFile.path).toLowerCase(),{client:r,cleanup:i}=await q(e,t?.bookMetadata);try{if(n===`.json`){let e=await T(r);await G(t.outputFile,JSON.stringify(e,null,2))}else if(n===`.db`||n===`.sqlite`){let e=r.export();await G(t.outputFile,e)}else throw Error(`Unsupported output extension: ${n}`)}finally{await i()}return t.outputFile.path},X=async(e=0)=>{K();let t=H(m(`masterPatchEndpoint`),{version:e.toString()});s.info(`Fetching shamela.ws master database patch link: ${P(t)}`);try{let e=await U(t);return{url:e.patch_url,version:e.version}}catch(e){throw Error(`Error fetching master patch: ${e.message}`)}},Se=e=>{let t=m(`masterPatchEndpoint`),{origin:n}=new URL(t);return`${n}/covers/${e}.jpg`},Ce=async e=>{if(s.info(`downloadMasterDatabase ${JSON.stringify(e)}`),!e.outputFile.path)throw Error(`outputFile.path must be provided to determine output format`);let t=V(e.outputFile.path),{client:n,cleanup:r,version:i}=await J(e.masterMetadata);try{if(t===`.json`){let t=N(n,i);await G(e.outputFile,JSON.stringify(t,null,2))}else if(t===`.db`||t===`.sqlite`)await G(e.outputFile,n.export());else throw Error(`Unsupported output extension: ${t}`)}finally{await r()}return e.outputFile.path},we=async e=>{s.info(`getBook ${e}`);let{client:t,cleanup:n}=await q(e);try{let e=await T(t);return{pages:e.pages.map(F),titles:e.titles.map(I)}}finally{await n()}},Te=async()=>{s.info(`getMaster`);let{client:e,cleanup:t,version:n}=await J();try{return N(e,n)}finally{await t()}},Ee=/^[)\]\u00BB"”'’.,?!:\u061B\u060C\u061F\u06D4\u2026]+$/,De=/[[({«“‘]$/,Oe=e=>{let t=[];for(let n of e){let e=t[t.length-1];e?.id&&Ee.test(n.text)?e.text+=n.text:t.push(n)}return t},ke=e=>{let t=e.replace(/\r\n/g,`
  `).replace(/\r/g,`
  `);return/\n/.test(t)||(t=t.replace(/([.?!\u061F\u061B\u06D4\u2026]["“”'’»«)\]]?)\s+(?=[\u0600-\u06FF])/,`$1
  `)),t.split(`
- `).map(r=>r.replace(/^\*+/,"").trim()).filter(Boolean)},K=e=>$e(e).map(t=>({text:t})),Y=(e,t)=>{let r=new RegExp(`${t}\\s*=\\s*("([^"]*)"|'([^']*)'|([^s>]+))`,"i"),o=e.match(r);if(o)return o[2]??o[3]??o[4]},Le=e=>{let t=[],r=/<[^>]+>/g,o=0,n;for(n=r.exec(e);n;){n.index>o&&t.push({type:"text",value:e.slice(o,n.index)});let i=n[0],a=/^<\//.test(i),s=i.match(/^<\/?\s*([a-zA-Z0-9:-]+)/),c=s?s[1].toLowerCase():"";if(a)t.push({name:c,type:"end"});else{let u={};u.id=Y(i,"id"),u["data-type"]=Y(i,"data-type"),t.push({attributes:u,name:c,type:"start"})}o=r.lastIndex,n=r.exec(e)}return o<e.length&&t.push({type:"text",value:e.slice(o)}),t},Be=(e,t)=>{let r=e[e.length-1];return!t||!r||!r.id||!Se.test(r.text)||/\n/.test(t)?!1:(r.text+=t.replace(/^\s+/,""),!0)},Dt=e=>{if(!/<span[^>]*>/i.test(e))return K(e);let t=Le(`<root>${e}</root>`),r=[],o=0,n=null,i=s=>{if(!s)return;if(o>0&&n){let u=o===1?s.replace(/^\s+/,""):s;n.text+=u;return}if(Be(r,s))return;let c=s.trim();c&&r.push(...K(c))};for(let s of t)s.type==="text"?i(s.value):s.type==="start"&&s.name==="span"?s.attributes["data-type"]==="title"&&(o===0&&(n={id:s.attributes.id?.replace(/^toc-/,"")??"",text:""},r.push(n)),o+=1):s.type==="end"&&s.name==="span"&&o>0&&(o-=1,o===0&&(n=null));let a=r.map(s=>s.id?s:{...s,text:s.text.trim()});return Pe(a.map(s=>s.id?s:{...s,text:s.text})).filter(s=>s.text.length>0)},Ie=Object.entries(d).map(([e,t])=>({regex:new RegExp(e,"g"),replacement:t})),Me=e=>{if(e===d)return Ie;let t=[];for(let r in e)t.push({regex:new RegExp(r,"g"),replacement:e[r]});return t},xt=(e,t=d)=>{let r=Me(t),o=e;for(let n=0;n<r.length;n++){let{regex:i,replacement:a}=r[n];o=o.replace(i,a)}return o},_t=(e,t="_________")=>{let r="",o=e.lastIndexOf(t);return o>=0&&(r=e.slice(o+t.length),e=e.slice(0,o)),[e,r]};export{ht as downloadBook,wt as downloadMasterDatabase,Rt as getBook,De as getBookMetadata,yt as getCoverUrl,xe as getMasterMetadata,Dt as parseContentRobust,xt as sanitizePageContent,ee as setLogger,_t as splitPageBodyFromFooter};
+ `).map(e=>e.replace(/^\*+/,``).trim()).filter(Boolean)},Z=e=>ke(e).map(e=>({text:e})),Q=(e,t)=>{let n=RegExp(`${t}\\s*=\\s*("([^"]*)"|'([^']*)'|([^s>]+))`,`i`),r=e.match(n);if(r)return r[2]??r[3]??r[4]},Ae=e=>{let t=[],n=/<[^>]+>/g,r=0,i;for(i=n.exec(e);i;){i.index>r&&t.push({type:`text`,value:e.slice(r,i.index)});let a=i[0],o=/^<\//.test(a),s=a.match(/^<\/?\s*([a-zA-Z0-9:-]+)/),c=s?s[1].toLowerCase():``;if(o)t.push({name:c,type:`end`});else{let e={};e.id=Q(a,`id`),e[`data-type`]=Q(a,`data-type`),t.push({attributes:e,name:c,type:`start`})}r=n.lastIndex,i=n.exec(e)}return r<e.length&&t.push({type:`text`,value:e.slice(r)}),t},$=(e,t)=>{let n=e[e.length-1];return!t||!n||!n.id||!De.test(n.text)||/\n/.test(t)?!1:(n.text+=t.replace(/^\s+/,``),!0)},je=e=>{if(!/<span[^>]*>/i.test(e))return Z(e);let t=Ae(`<root>${e}</root>`),n=[],r=0,i=null,a=e=>{if(!e)return;if(r>0&&i){let t=r===1?e.replace(/^\s+/,``):e;i.text+=t;return}if($(n,e))return;let t=e.trim();t&&n.push(...Z(t))};for(let e of t)e.type===`text`?a(e.value):e.type===`start`&&e.name===`span`?e.attributes[`data-type`]===`title`&&(r===0&&(i={id:e.attributes.id?.replace(/^toc-/,``)??``,text:``},n.push(i)),r+=1):e.type===`end`&&e.name===`span`&&r>0&&(--r,r===0&&(i=null));return Oe(n.map(e=>e.id?e:{...e,text:e.text.trim()}).map(e=>e.id?e:{...e,text:e.text})).filter(e=>e.text.length>0)},Me=Object.entries(L).map(([e,t])=>({regex:new RegExp(e,`g`),replacement:t})),Ne=e=>{if(e===L)return Me;let t=[];for(let n in e)t.push({regex:new RegExp(n,`g`),replacement:e[n]});return t},Pe=(e,t=L)=>{let n=Ne(t),r=e;for(let e=0;e<n.length;e++){let{regex:t,replacement:i}=n[e];r=r.replace(t,i)}return r},Fe=(e,t=`_________`)=>{let n=``,r=e.lastIndexOf(t);return r>=0&&(n=e.slice(r+t.length),e=e.slice(0,r)),[e,n]},Ie=e=>e.replace(/\s?⦗[\u0660-\u0669]+⦘\s?/,` `),Le=e=>(e=e.replace(/<a[^>]*>(.*?)<\/a>/g,`$1`),e=e.replace(/<hadeeth[^>]*>|<\/hadeeth>|<hadeeth-\d+>/g,``),e);export{ee as configure,xe as downloadBook,Ce as downloadMasterDatabase,we as getBook,Y as getBookMetadata,Se as getCoverUrl,Te as getMaster,X as getMasterMetadata,je as parseContentRobust,Ie as removeArabicNumericPageMarkers,Le as removeTagsExceptSpan,te as resetConfig,Pe as sanitizePageContent,Fe as splitPageBodyFromFooter};
  //# sourceMappingURL=index.js.map
package/dist/index.js.map CHANGED
@@ -1 +1 @@
- {"version":3,"sources":["../src/api.ts","../src/db/book.ts","../src/utils/logger.ts","../src/db/master.ts","../src/db/queryBuilder.ts","../src/utils/common.ts","../src/utils/constants.ts","../src/utils/io.ts","../src/utils/network.ts","../src/utils/validation.ts","../src/content.ts"],"sourcesContent":["import { Database } from 'bun:sqlite';\nimport { promises as fs } from 'node:fs';\nimport path from 'node:path';\nimport process from 'node:process';\nimport { URL } from 'node:url';\n\nimport { applyPatches, copyTableData, createTables as createBookTables, getData as getBookData } from './db/book.js';\nimport {\n copyForeignMasterTableData,\n createTables as createMasterTables,\n getData as getMasterData,\n} from './db/master.js';\nimport type {\n BookData,\n DownloadBookOptions,\n DownloadMasterOptions,\n GetBookMetadataOptions,\n GetBookMetadataResponsePayload,\n GetMasterMetadataResponsePayload,\n} from './types.js';\nimport { mapPageRowToPage, mapTitleRowToTitle, redactUrl } from './utils/common.js';\nimport { DEFAULT_MASTER_METADATA_VERSION } from './utils/constants.js';\nimport { createTempDir, unzipFromUrl } from './utils/io.js';\nimport logger from './utils/logger.js';\nimport { buildUrl, httpsGet } from './utils/network.js';\nimport { validateEnvVariables, validateMasterSourceTables } from './utils/validation.js';\n\nconst fixHttpsProtocol = (originalUrl: string) => {\n const url = new URL(originalUrl);\n url.protocol = 'https';\n\n return url.toString();\n};\n\ntype BookUpdatesResponse = {\n major_release: number;\n major_release_url: string;\n minor_release?: number;\n minor_release_url?: string;\n};\n\n/**\n * Sets up a book database with tables and data, returning the database client.\n *\n * This helper function handles the common logic of downloading book files,\n * creating database tables, and applying patches or copying data.\n *\n * @param id - The unique identifier of the book\n * @param bookMetadata - Optional pre-fetched book metadata\n * @returns A promise that resolves to an object containing the database client and cleanup function\n */\nconst setupBookDatabase = async (\n id: number,\n bookMetadata?: GetBookMetadataResponsePayload,\n): Promise<{ client: Database; cleanup: () => Promise<void> }> => {\n logger.info(`Setting up book database for ${id}`);\n\n const outputDir = await createTempDir('shamela_setupBook');\n\n const bookResponse: GetBookMetadataResponsePayload = bookMetadata || (await getBookMetadata(id));\n const [[bookDatabase], [patchDatabase] = []]: string[][] = await Promise.all([\n unzipFromUrl(bookResponse.majorReleaseUrl, outputDir),\n ...(bookResponse.minorReleaseUrl ? 
[unzipFromUrl(bookResponse.minorReleaseUrl, outputDir)] : []),\n ]);\n const dbPath = path.join(outputDir, 'book.db');\n\n const client = new Database(dbPath);\n\n try {\n logger.info(`Creating tables`);\n await createBookTables(client);\n\n if (patchDatabase) {\n logger.info(`Applying patches from ${patchDatabase} to ${bookDatabase}`);\n await applyPatches(client, bookDatabase, patchDatabase);\n } else {\n logger.info(`Copying table data from ${bookDatabase}`);\n await copyTableData(client, bookDatabase);\n }\n\n const cleanup = async () => {\n client.close();\n await fs.rm(outputDir, { recursive: true });\n };\n\n return { cleanup, client };\n } catch (error) {\n client.close();\n await fs.rm(outputDir, { recursive: true });\n throw error;\n }\n};\n\n/**\n * Retrieves metadata for a specific book from the Shamela API.\n *\n * This function fetches book release information including major and minor release\n * URLs and version numbers from the Shamela web service.\n *\n * @param id - The unique identifier of the book to fetch metadata for\n * @param options - Optional parameters for specifying major and minor versions\n * @returns A promise that resolves to book metadata including release URLs and versions\n *\n * @throws {Error} When environment variables are not set or API request fails\n *\n * @example\n * ```typescript\n * const metadata = await getBookMetadata(123, { majorVersion: 1, minorVersion: 2 });\n * console.log(metadata.majorReleaseUrl); // Download URL for the book\n * ```\n */\nexport const getBookMetadata = async (\n id: number,\n options?: GetBookMetadataOptions,\n): Promise<GetBookMetadataResponsePayload> => {\n validateEnvVariables();\n\n const url = buildUrl(`${process.env.SHAMELA_API_BOOKS_ENDPOINT}/${id}`, {\n major_release: (options?.majorVersion || 0).toString(),\n minor_release: (options?.minorVersion || 0).toString(),\n });\n\n logger.info(`Fetching shamela.ws book link: ${redactUrl(url)}`);\n\n try {\n const response = (await httpsGet(url)) as BookUpdatesResponse;\n return {\n majorRelease: response.major_release,\n majorReleaseUrl: fixHttpsProtocol(response.major_release_url),\n ...(response.minor_release_url && { minorReleaseUrl: fixHttpsProtocol(response.minor_release_url) }),\n ...(response.minor_release_url && { minorRelease: response.minor_release }),\n };\n } catch (error: any) {\n throw new Error(`Error fetching book metadata: ${error.message}`);\n }\n};\n\n/**\n * Downloads and processes a book from the Shamela database.\n *\n * This function downloads the book's database files, applies patches if available,\n * creates the necessary database tables, and exports the data to the specified format.\n * The output can be either a JSON file or a SQLite database file.\n *\n * @param id - The unique identifier of the book to download\n * @param options - Configuration options including output file path and optional book metadata\n * @returns A promise that resolves to the path of the created output file\n *\n * @throws {Error} When download fails, database operations fail, or file operations fail\n *\n * @example\n * ```typescript\n * // Download as JSON\n * const jsonPath = await downloadBook(123, {\n * outputFile: { path: './book.json' }\n * });\n *\n * // Download as SQLite database\n * const dbPath = await downloadBook(123, {\n * outputFile: { path: './book.db' }\n * });\n * ```\n */\nexport const downloadBook = async (id: number, options: DownloadBookOptions): Promise<string> => {\n logger.info(`downloadBook ${id} ${JSON.stringify(options)}`);\n\n const { 
client, cleanup } = await setupBookDatabase(id, options?.bookMetadata);\n\n try {\n const { ext: extension } = path.parse(options.outputFile.path);\n\n if (extension === '.json') {\n const result = await getBookData(client);\n await Bun.file(options.outputFile.path).write(JSON.stringify(result, null, 2));\n } else if (extension === '.db' || extension === '.sqlite') {\n // For database files, we need to handle the file copying differently\n // since we can't move an open database file\n const tempDbPath = client.filename;\n client.close();\n await fs.rename(tempDbPath, options.outputFile.path);\n // Skip cleanup for the db file since we moved it\n const outputDir = path.dirname(tempDbPath);\n await fs.rm(outputDir, { recursive: true });\n return options.outputFile.path;\n }\n\n await cleanup();\n } catch (error) {\n await cleanup();\n throw error;\n }\n\n return options.outputFile.path;\n};\n\n/**\n * Retrieves metadata for the master database from the Shamela API.\n *\n * The master database contains information about all books, authors, and categories\n * in the Shamela library. This function fetches the download URL and version\n * information for the master database patches.\n *\n * @param version - The version number to check for updates (defaults to 0)\n * @returns A promise that resolves to master database metadata including download URL and version\n *\n * @throws {Error} When environment variables are not set or API request fails\n *\n * @example\n * ```typescript\n * const masterMetadata = await getMasterMetadata(5);\n * console.log(masterMetadata.url); // URL to download master database patch\n * console.log(masterMetadata.version); // Latest version number\n * ```\n */\nexport const getMasterMetadata = async (version: number = 0): Promise<GetMasterMetadataResponsePayload> => {\n validateEnvVariables();\n\n const url = buildUrl(process.env.SHAMELA_API_MASTER_PATCH_ENDPOINT as string, { version: version.toString() });\n\n logger.info(`Fetching shamela.ws master database patch link: ${redactUrl(url)}`);\n\n try {\n const response: Record<string, any> = await httpsGet(url);\n return { url: response.patch_url, version: response.version };\n } catch (error: any) {\n throw new Error(`Error fetching master patch: ${error.message}`);\n }\n};\n\n/**\n * Generates the URL for a book's cover image.\n *\n * This function constructs the URL to access the cover image for a specific book\n * using the book's ID and the API endpoint host.\n *\n * @param bookId - The unique identifier of the book\n * @returns The complete URL to the book's cover image\n *\n * @example\n * ```typescript\n * const coverUrl = getCoverUrl(123);\n * console.log(coverUrl); // \"https://api.shamela.ws/covers/123.jpg\"\n * ```\n */\nexport const getCoverUrl = (bookId: number) => {\n const { origin } = new URL(process.env.SHAMELA_API_MASTER_PATCH_ENDPOINT!);\n return `${origin}/covers/${bookId}.jpg`;\n};\n\n/**\n * Downloads and processes the master database from the Shamela service.\n *\n * The master database contains comprehensive information about all books, authors,\n * and categories available in the Shamela library. 
This function downloads the\n * database files, creates the necessary tables, and exports the data in the\n * specified format (JSON or SQLite).\n *\n * @param options - Configuration options including output file path and optional master metadata\n * @returns A promise that resolves to the path of the created output file\n *\n * @throws {Error} When download fails, expected tables are missing, database operations fail, or file operations fail\n *\n * @example\n * ```typescript\n * // Download master database as JSON\n * const jsonPath = await downloadMasterDatabase({\n * outputFile: { path: './master.json' }\n * });\n *\n * // Download master database as SQLite\n * const dbPath = await downloadMasterDatabase({\n * outputFile: { path: './master.db' }\n * });\n * ```\n */\nexport const downloadMasterDatabase = async (options: DownloadMasterOptions): Promise<string> => {\n logger.info(`downloadMasterDatabase ${JSON.stringify(options)}`);\n\n const outputDir = await createTempDir('shamela_downloadMaster');\n\n const masterResponse: GetMasterMetadataResponsePayload =\n options.masterMetadata || (await getMasterMetadata(DEFAULT_MASTER_METADATA_VERSION));\n\n logger.info(`Downloading master database ${masterResponse.version} from: ${redactUrl(masterResponse.url)}`);\n const sourceTables: string[] = await unzipFromUrl(fixHttpsProtocol(masterResponse.url), outputDir);\n\n logger.info(`sourceTables downloaded: ${sourceTables.toString()}`);\n\n if (!validateMasterSourceTables(sourceTables)) {\n logger.error(`Some source tables were not found: ${sourceTables.toString()}`);\n throw new Error('Expected tables not found!');\n }\n\n const dbPath = path.join(outputDir, 'master.db');\n\n const client = new Database(dbPath);\n\n try {\n logger.info(`Creating tables`);\n await createMasterTables(client);\n\n logger.info(`Copying data to master table`);\n await copyForeignMasterTableData(client, sourceTables);\n\n const { ext: extension } = path.parse(options.outputFile.path);\n\n if (extension === '.json') {\n const result = await getMasterData(client);\n await Bun.file(options.outputFile.path).write(JSON.stringify(result, null, 2));\n }\n\n client.close();\n\n if (extension === '.db' || extension === '.sqlite') {\n await fs.rename(dbPath, options.outputFile.path);\n }\n\n await fs.rm(outputDir, { recursive: true });\n } finally {\n client.close();\n }\n\n return options.outputFile.path;\n};\n\n/**\n * Retrieves complete book data including pages and titles.\n *\n * This is a convenience function that downloads a book's data and returns it\n * as a structured JavaScript object. 
The function handles the temporary file\n * creation and cleanup automatically.\n *\n * @param id - The unique identifier of the book to retrieve\n * @returns A promise that resolves to the complete book data including pages and titles\n *\n * @throws {Error} When download fails, file operations fail, or JSON parsing fails\n *\n * @example\n * ```typescript\n * const bookData = await getBook(123);\n * console.log(bookData.pages.length); // Number of pages in the book\n * console.log(bookData.titles?.length); // Number of title entries\n * ```\n */\nexport const getBook = async (id: number): Promise<BookData> => {\n logger.info(`getBook ${id}`);\n\n const { client, cleanup } = await setupBookDatabase(id);\n\n try {\n const data = await getBookData(client);\n\n const result: BookData = {\n pages: data.pages.map(mapPageRowToPage),\n titles: data.titles.map(mapTitleRowToTitle),\n };\n\n return result;\n } finally {\n await cleanup();\n }\n};\n","import { Database } from 'bun:sqlite';\nimport logger from '@/utils/logger';\nimport { type Deletable, type PageRow, Tables, type TitleRow } from './types';\n\ntype Row = Record<string, any> & Deletable;\n\nconst PATCH_NOOP_VALUE = '#';\n\n/**\n * Retrieves column information for a specified table.\n * @param db - The database instance\n * @param table - The table name to get info for\n * @returns Array of column information with name and type\n */\nconst getTableInfo = (db: Database, table: Tables) => {\n return db.query(`PRAGMA table_info(${table})`).all() as { name: string; type: string }[];\n};\n\n/**\n * Checks if a table exists in the database.\n * @param db - The database instance\n * @param table - The table name to check\n * @returns True if the table exists, false otherwise\n */\nconst hasTable = (db: Database, table: Tables): boolean => {\n const result = db.query(`SELECT name FROM sqlite_master WHERE type='table' AND name = ?1`).get(table) as\n | { name: string }\n | undefined;\n return Boolean(result);\n};\n\n/**\n * Reads all rows from a specified table.\n * @param db - The database instance\n * @param table - The table name to read from\n * @returns Array of rows, or empty array if table doesn't exist\n */\nconst readRows = (db: Database, table: Tables): Row[] => {\n if (!hasTable(db, table)) {\n return [];\n }\n\n return db.query(`SELECT * FROM ${table}`).all() as Row[];\n};\n\n/**\n * Checks if a row is marked as deleted.\n * @param row - The row to check\n * @returns True if the row has is_deleted field set to '1', false otherwise\n */\nconst isDeleted = (row: Row): boolean => {\n return String(row.is_deleted) === '1';\n};\n\n/**\n * Merges values from a base row and patch row, with patch values taking precedence.\n * @param baseRow - The original row data (can be undefined)\n * @param patchRow - The patch row data with updates (can be undefined)\n * @param columns - Array of column names to merge\n * @returns Merged row with combined values\n */\nconst mergeRowValues = (baseRow: Row | undefined, patchRow: Row | undefined, columns: string[]): Row => {\n const merged: Row = {};\n\n for (const column of columns) {\n if (column === 'id') {\n merged.id = (patchRow ?? baseRow)?.id ?? 
null;\n continue;\n }\n\n if (patchRow && column in patchRow) {\n const value = patchRow[column];\n\n if (value !== PATCH_NOOP_VALUE && value !== null && value !== undefined) {\n merged[column] = value;\n continue;\n }\n }\n\n if (baseRow && column in baseRow) {\n merged[column] = baseRow[column];\n continue;\n }\n\n merged[column] = null;\n }\n\n return merged;\n};\n\n/**\n * Merges arrays of base rows and patch rows, handling deletions and updates.\n * @param baseRows - Original rows from the base database\n * @param patchRows - Patch rows containing updates, additions, and deletions\n * @param columns - Array of column names to merge\n * @returns Array of merged rows with patches applied\n */\nconst mergeRows = (baseRows: Row[], patchRows: Row[], columns: string[]): Row[] => {\n const baseIds = new Set<string>();\n const patchById = new Map<string, Row>();\n\n for (const row of baseRows) {\n baseIds.add(String(row.id));\n }\n\n for (const row of patchRows) {\n patchById.set(String(row.id), row);\n }\n\n const merged: Row[] = [];\n\n for (const baseRow of baseRows) {\n const patchRow = patchById.get(String(baseRow.id));\n\n if (patchRow && isDeleted(patchRow)) {\n continue;\n }\n\n merged.push(mergeRowValues(baseRow, patchRow, columns));\n }\n\n for (const row of patchRows) {\n const id = String(row.id);\n\n if (baseIds.has(id) || isDeleted(row)) {\n continue;\n }\n\n merged.push(mergeRowValues(undefined, row, columns));\n }\n\n return merged;\n};\n\n/**\n * Inserts multiple rows into a specified table using a prepared statement.\n * @param db - The database instance\n * @param table - The table name to insert into\n * @param columns - Array of column names\n * @param rows - Array of row data to insert\n */\nconst insertRows = (db: Database, table: Tables, columns: string[], rows: Row[]) => {\n if (rows.length === 0) {\n return;\n }\n\n const placeholders = columns.map(() => '?').join(',');\n const statement = db.prepare(`INSERT INTO ${table} (${columns.join(',')}) VALUES (${placeholders})`);\n\n rows.forEach((row) => {\n const values = columns.map((column) => (column in row ? 
row[column] : null));\n // Spread the values array instead of passing it directly\n statement.run(...values);\n });\n\n statement.finalize();\n};\n\n/**\n * Ensures the target database has the same table schema as the source database.\n * @param target - The target database to create/update the table in\n * @param source - The source database to copy the schema from\n * @param table - The table name to ensure schema for\n * @returns True if schema was successfully ensured, false otherwise\n */\nconst ensureTableSchema = (target: Database, source: Database, table: Tables) => {\n const row = source.query(`SELECT sql FROM sqlite_master WHERE type='table' AND name = ?1`).get(table) as\n | { sql: string }\n | undefined;\n\n if (!row?.sql) {\n logger.warn(`${table} table definition missing in source database`);\n return false;\n }\n\n target.run(`DROP TABLE IF EXISTS ${table}`);\n target.run(row.sql);\n return true;\n};\n\n/**\n * Copies and patches a table from source to target database, applying patch updates if provided.\n * @param target - The target database to copy/patch the table to\n * @param source - The source database containing the base table data\n * @param patch - Optional patch database containing updates (can be null)\n * @param table - The table name to copy and patch\n */\nconst copyAndPatchTable = (target: Database, source: Database, patch: Database | null, table: Tables) => {\n if (!hasTable(source, table)) {\n logger.warn(`${table} table missing in source database`);\n return;\n }\n\n if (!ensureTableSchema(target, source, table)) {\n return;\n }\n\n const baseInfo = getTableInfo(source, table);\n const patchInfo = patch && hasTable(patch, table) ? getTableInfo(patch, table) : [];\n\n const columns = baseInfo.map((info) => info.name);\n\n for (const info of patchInfo) {\n if (!columns.includes(info.name)) {\n const columnType = info.type && info.type.length > 0 ? info.type : 'TEXT';\n target.run(`ALTER TABLE ${table} ADD COLUMN ${info.name} ${columnType}`);\n columns.push(info.name);\n }\n }\n\n const baseRows = readRows(source, table);\n const patchRows = patch ? 
readRows(patch, table) : [];\n\n const mergedRows = mergeRows(baseRows, patchRows, columns);\n\n insertRows(target, table, columns, mergedRows);\n};\n\n/**\n * Applies patches from a patch database to the main database.\n * @param db - The target database to apply patches to\n * @param aslDB - Path to the source ASL database file\n * @param patchDB - Path to the patch database file\n */\nexport const applyPatches = (db: Database, aslDB: string, patchDB: string) => {\n const source = new Database(aslDB);\n const patch = new Database(patchDB);\n\n try {\n db.transaction(() => {\n copyAndPatchTable(db, source, patch, Tables.Page);\n copyAndPatchTable(db, source, patch, Tables.Title);\n })();\n } finally {\n source.close();\n patch.close();\n }\n};\n\n/**\n * Copies table data from a source database without applying any patches.\n * @param db - The target database to copy data to\n * @param aslDB - Path to the source ASL database file\n */\nexport const copyTableData = (db: Database, aslDB: string) => {\n const source = new Database(aslDB);\n\n try {\n db.transaction(() => {\n copyAndPatchTable(db, source, null, Tables.Page);\n copyAndPatchTable(db, source, null, Tables.Title);\n })();\n } finally {\n source.close();\n }\n};\n\n/**\n * Creates the required tables (Page and Title) in the database with their schema.\n * @param db - The database instance to create tables in\n */\nexport const createTables = (db: Database) => {\n db.run(\n `CREATE TABLE ${Tables.Page} (\n id INTEGER,\n content TEXT,\n part TEXT,\n page TEXT,\n number TEXT,\n services TEXT,\n is_deleted TEXT\n )`,\n );\n db.run(\n `CREATE TABLE ${Tables.Title} (\n id INTEGER,\n content TEXT,\n page INTEGER,\n parent INTEGER,\n is_deleted TEXT\n )`,\n );\n};\n\n/**\n * Retrieves all pages from the Page table.\n * @param db - The database instance\n * @returns Array of all pages\n */\nexport const getAllPages = (db: Database) => {\n return db.query(`SELECT * FROM ${Tables.Page}`).all() as PageRow[];\n};\n\n/**\n * Retrieves all titles from the Title table.\n * @param db - The database instance\n * @returns Array of all titles\n */\nexport const getAllTitles = (db: Database) => {\n return db.query(`SELECT * FROM ${Tables.Title}`).all() as TitleRow[];\n};\n\n/**\n * Retrieves all book data including pages and titles.\n * @param db - The database instance\n * @returns Object containing arrays of pages and titles\n */\nexport const getData = (db: Database) => {\n return { pages: getAllPages(db), titles: getAllTitles(db) };\n};\n","type LogFunction = (...args: unknown[]) => void;\n\ninterface Logger {\n debug: LogFunction;\n error: LogFunction;\n info: LogFunction;\n warn: LogFunction;\n}\n\nconst SILENT_LOGGER = { debug: () => {}, error: () => {}, info: () => {}, warn: () => {} };\nlet logger: Logger = SILENT_LOGGER;\n\nexport const setLogger = (newLogger: Logger = SILENT_LOGGER) => {\n if (!newLogger.debug || !newLogger.error || !newLogger.info) {\n throw new Error('Logger must implement debug, error, and info methods');\n }\n\n logger = newLogger;\n};\n\nexport { logger as default };\n","import type { Database } from 'bun:sqlite';\nimport path from 'node:path';\n\nimport type { Author, Book, Category, MasterData } from '../types';\nimport { attachDB, detachDB } from './queryBuilder';\nimport { Tables } from './types';\n\n/**\n * Ensures the target database has the same table schema as the source database for a specific table.\n * @param db - The database instance\n * @param alias - The alias name of the attached database\n * @param 
table - The table name to ensure schema for\n * @throws {Error} When table definition is missing in the source database\n */\nconst ensureTableSchema = (db: Database, alias: string, table: Tables) => {\n const row = db.query(`SELECT sql FROM ${alias}.sqlite_master WHERE type='table' AND name = ?1`).get(table) as\n | { sql: string }\n | undefined;\n\n if (!row?.sql) {\n throw new Error(`Missing table definition for ${table} in ${alias}`);\n }\n\n db.run(`DROP TABLE IF EXISTS ${table}`);\n db.run(row.sql);\n};\n\n/**\n * Copies data from foreign master table files into the main master database.\n *\n * This function processes the source table files (author.sqlite, book.sqlite, category.sqlite)\n * by attaching them to the current database connection, then copying their data into\n * the main master database tables. It handles data transformation including filtering\n * out deleted records and converting placeholder values.\n *\n * @param db - The database client instance for the master database\n * @param sourceTables - Array of file paths to the source SQLite table files\n *\n * @throws {Error} When source files cannot be attached or data copying operations fail\n */\nexport const copyForeignMasterTableData = (db: Database, sourceTables: string[]) => {\n const aliasToPath: Record<string, string> = {};\n\n for (const tablePath of sourceTables) {\n const { name } = path.parse(tablePath);\n aliasToPath[name] = tablePath;\n }\n\n Object.entries(aliasToPath).forEach(([alias, dbPath]) => {\n db.run(attachDB(dbPath, alias));\n });\n\n ensureTableSchema(db, Tables.Authors, Tables.Authors);\n ensureTableSchema(db, Tables.Books, Tables.Books);\n ensureTableSchema(db, Tables.Categories, Tables.Categories);\n\n const insertAuthors = db.prepare(`INSERT INTO ${Tables.Authors} SELECT * FROM ${Tables.Authors}.${Tables.Authors}`);\n const insertBooks = db.prepare(`INSERT INTO ${Tables.Books} SELECT * FROM ${Tables.Books}.${Tables.Books}`);\n const insertCategories = db.prepare(\n `INSERT INTO ${Tables.Categories} SELECT * FROM ${Tables.Categories}.${Tables.Categories}`,\n );\n\n db.transaction(() => {\n insertAuthors.run();\n insertBooks.run();\n insertCategories.run();\n })();\n\n Object.keys(aliasToPath).forEach((statement) => {\n db.run(detachDB(statement));\n });\n};\n\n/**\n * Creates a backward-compatible database view for legacy table names.\n * @param db - The database instance\n * @param viewName - The name of the view to create\n * @param sourceTable - The source table to base the view on\n */\nconst createCompatibilityView = (db: Database, viewName: string, sourceTable: Tables) => {\n db.run(`DROP VIEW IF EXISTS ${viewName}`);\n db.run(`CREATE VIEW ${viewName} AS SELECT * FROM ${sourceTable}`);\n};\n\n/**\n * Creates the necessary database tables for the master database.\n *\n * This function sets up the schema for the master database by creating\n * tables for authors, books, and categories with their respective columns\n * and data types. This is typically the first step in setting up a new\n * master database. 
Also creates backward-compatible views for legacy table names.\n *\n * @param db - The database client instance where tables should be created\n *\n * @throws {Error} When table creation fails due to database constraints or permissions\n */\nexport const createTables = (db: Database) => {\n db.run(\n `CREATE TABLE ${Tables.Authors} (\n id INTEGER,\n is_deleted TEXT,\n name TEXT,\n biography TEXT,\n death_text TEXT,\n death_number TEXT\n )`,\n );\n db.run(\n `CREATE TABLE ${Tables.Books} (\n id INTEGER,\n name TEXT,\n is_deleted TEXT,\n category TEXT,\n type TEXT,\n date TEXT,\n author TEXT,\n printed TEXT,\n minor_release TEXT,\n major_release TEXT,\n bibliography TEXT,\n hint TEXT,\n pdf_links TEXT,\n metadata TEXT\n )`,\n );\n db.run(\n `CREATE TABLE ${Tables.Categories} (\n id INTEGER,\n is_deleted TEXT,\n \"order\" TEXT,\n name TEXT\n )`,\n );\n\n // Provide backward-compatible pluralised views since callers historically\n // queried \"authors\", \"books\", and \"categories\" tables.\n createCompatibilityView(db, 'authors', Tables.Authors);\n createCompatibilityView(db, 'books', Tables.Books);\n createCompatibilityView(db, 'categories', Tables.Categories);\n};\n\n/**\n * Retrieves all authors from the Authors table.\n * @param db - The database instance\n * @returns Array of all authors\n */\nexport const getAllAuthors = (db: Database) => {\n return db.query(`SELECT * FROM ${Tables.Authors}`).all() as Author[];\n};\n\n/**\n * Retrieves all books from the Books table.\n * @param db - The database instance\n * @returns Array of all books\n */\nexport const getAllBooks = (db: Database) => {\n return db.query(`SELECT * FROM ${Tables.Books}`).all() as Book[];\n};\n\n/**\n * Retrieves all categories from the Categories table.\n * @param db - The database instance\n * @returns Array of all categories\n */\nexport const getAllCategories = (db: Database) => {\n return db.query(`SELECT * FROM ${Tables.Categories}`).all() as Category[];\n};\n\n/**\n * Retrieves all master data including authors, books, and categories.\n * @param db - The database instance\n * @returns Object containing arrays of authors, books, and categories\n */\nexport const getData = (db: Database) => {\n return { authors: getAllAuthors(db), books: getAllBooks(db), categories: getAllCategories(db) } as MasterData;\n};\n","/**\n * Generates SQL to attach a database file with an alias.\n * @param {string} dbFile - Path to the database file to attach\n * @param {string} alias - Alias name for the attached database\n * @returns {string} SQL ATTACH DATABASE statement\n */\nexport const attachDB = (dbFile: string, alias: string) => {\n const escapedPath = dbFile.replace(/'/g, \"''\");\n if (!/^[a-zA-Z0-9_]+$/.test(alias)) {\n throw new Error('Invalid database alias');\n }\n return `ATTACH DATABASE '${escapedPath}' AS ${alias}`;\n};\n\n/**\n * Generates SQL to create a table with specified fields.\n * @param {string} name - Name of the table to create\n * @param {string[]} fields - Array of field definitions\n * @returns {string} SQL CREATE TABLE statement\n */\nexport const createTable = (name: string, fields: string[]) => {\n if (!/^[a-zA-Z0-9_]+$/.test(name)) {\n throw new Error('Invalid table name');\n }\n fields.forEach((field) => {\n if (field.includes(';') || field.includes('--')) {\n throw new Error('Invalid field definition');\n }\n });\n return `CREATE TABLE IF NOT EXISTS ${name} (${fields.join(', ')})`;\n};\n\n/**\n * Generates SQL to detach a database by alias.\n * @param {string} alias - Alias of the database to 
detach\n * @returns {string} SQL DETACH DATABASE statement\n */\nexport const detachDB = (alias: string) => {\n if (!/^[a-zA-Z0-9_]+$/.test(alias)) {\n throw new Error('Invalid database alias');\n }\n return `DETACH DATABASE ${alias}`;\n};\n\n/**\n * Generates an unsafe SQL INSERT statement with provided field values.\n * @param {string} table - Name of the table to insert into\n * @param {Record<string, any>} fieldToValue - Object mapping field names to values\n * @param {boolean} [isDeleted=false] - Whether to mark the record as deleted\n * @returns {string} SQL INSERT statement (unsafe - does not escape values properly)\n * @warning This function does not properly escape SQL values and should not be used with untrusted input\n */\nexport const insertUnsafely = (table: string, fieldToValue: Record<string, any>, isDeleted = false) => {\n const combinedRecords: Record<string, any> = { ...fieldToValue, is_deleted: isDeleted ? '1' : '0' };\n\n const sortedKeys = Object.keys(combinedRecords).sort();\n\n const sortedValues = sortedKeys.map((key) => combinedRecords[key]);\n\n return `INSERT INTO ${table} (${sortedKeys.toString()}) VALUES (${sortedValues\n .map((val) => {\n if (val === null) {\n return 'NULL';\n }\n\n return typeof val === 'string' ? `'${val}'` : val;\n })\n .toString()})`;\n};\n","import type { PageRow, TitleRow } from '@/db/types';\n\n/**\n * Redacts sensitive query parameters from a URL for safe logging\n * @param url - The URL to redact\n * @param sensitiveParams - Array of parameter names to redact (defaults to common sensitive params)\n * @returns The URL string with sensitive parameters redacted\n */\nexport const redactUrl = (\n url: URL | string,\n sensitiveParams: string[] = ['api_key', 'token', 'password', 'secret', 'auth'],\n): string => {\n const urlObj = typeof url === 'string' ? 
new URL(url) : new URL(url.toString());\n\n sensitiveParams.forEach((param) => {\n const value = urlObj.searchParams.get(param);\n if (value && value.length > 6) {\n const redacted = `${value.slice(0, 3)}***${value.slice(-3)}`;\n urlObj.searchParams.set(param, redacted);\n } else if (value) {\n urlObj.searchParams.set(param, '***');\n }\n });\n\n return urlObj.toString();\n};\n\nexport const mapPageRowToPage = (page: PageRow) => {\n return {\n content: page.content,\n id: page.id,\n ...(page.number && { number: page.number }),\n ...(page.page && { page: Number(page.page) }),\n ...(page.part && { part: page.part }),\n };\n};\n\nexport const mapTitleRowToTitle = (title: TitleRow) => {\n const parent = Number(title.parent);\n\n return {\n content: title.content,\n id: title.id,\n page: Number(title.page),\n ...(parent && { parent }),\n };\n};\n","/**\n * The default version number for master metadata.\n * @constant {number}\n */\nexport const DEFAULT_MASTER_METADATA_VERSION = 0;\n\n/**\n * Placeholder value used to represent unknown or missing data.\n * @constant {string}\n */\nexport const UNKNOWN_VALUE_PLACEHOLDER = '99999';\n\n/**\n * Default rules to sanitize page content.\n */\nexport const DEFAULT_SANITIZATION_RULES: Record<string, string> = {\n '<img[^>]*>>': '',\n 舄: '',\n '﵀': 'رَحِمَهُ ٱللَّٰهُ',\n '﵁': 'رضي الله عنه',\n '﵌': 'صلى الله عليه وآله وسلم',\n};\n","import { createWriteStream, promises as fs } from 'node:fs';\nimport type { IncomingMessage } from 'node:http';\nimport https from 'node:https';\nimport os from 'node:os';\nimport path from 'node:path';\nimport { pipeline } from 'node:stream/promises';\nimport unzipper, { type Entry } from 'unzipper';\n\n/**\n * Creates a temporary directory with an optional prefix.\n * @param {string} [prefix='shamela'] - The prefix to use for the temporary directory name\n * @returns {Promise<string>} A promise that resolves to the path of the created temporary directory\n */\nexport const createTempDir = async (prefix = 'shamela') => {\n const tempDirBase = path.join(os.tmpdir(), prefix);\n return fs.mkdtemp(tempDirBase);\n};\n\n/**\n * Checks if a file exists at the given path.\n * @param {string} path - The file path to check\n * @returns {Promise<boolean>} A promise that resolves to true if the file exists, false otherwise\n */\nexport const fileExists = async (filePath: string) => !!(await fs.stat(filePath).catch(() => false));\n\n/**\n * Downloads and extracts a ZIP file from a given URL without loading the entire file into memory.\n * @param {string} url - The URL of the ZIP file to download and extract\n * @param {string} outputDir - The directory where the files should be extracted\n * @returns {Promise<string[]>} A promise that resolves with the list of all extracted file paths\n * @throws {Error} When the download fails, extraction fails, or other network/filesystem errors occur\n */\nexport async function unzipFromUrl(url: string, outputDir: string): Promise<string[]> {\n const extractedFiles: string[] = [];\n\n try {\n // Make HTTPS request and get the response stream\n const response = await new Promise<IncomingMessage>((resolve, reject) => {\n https\n .get(url, (res) => {\n if (res.statusCode !== 200) {\n reject(new Error(`Failed to download ZIP file: ${res.statusCode} ${res.statusMessage}`));\n } else {\n resolve(res);\n }\n })\n .on('error', (err) => {\n reject(new Error(`HTTPS request failed: ${err.message}`));\n });\n });\n\n // Process the ZIP file using unzipper.Extract with proper event handling\n await new 
Promise<void>((resolve, reject) => {\n const unzipStream = unzipper.Parse();\n const entryPromises: Promise<void>[] = [];\n\n unzipStream.on('entry', (entry: Entry) => {\n const entryPromise = (async () => {\n const filePath = path.join(outputDir, entry.path);\n\n if (entry.type === 'Directory') {\n // Ensure the directory exists\n await fs.mkdir(filePath, { recursive: true });\n entry.autodrain();\n } else {\n // Ensure the parent directory exists\n const dir = path.dirname(filePath);\n await fs.mkdir(dir, { recursive: true });\n\n // Create write stream and pipe entry to it\n const writeStream = createWriteStream(filePath);\n await pipeline(entry, writeStream);\n extractedFiles.push(filePath);\n }\n })();\n\n entryPromises.push(entryPromise);\n });\n\n unzipStream.on('finish', async () => {\n try {\n // Wait for all entries to be processed\n await Promise.all(entryPromises);\n resolve();\n } catch (error) {\n reject(error);\n }\n });\n\n unzipStream.on('error', (error) => {\n reject(new Error(`Error during extraction: ${error.message}`));\n });\n\n // Pipe the response to the unzip stream\n response.pipe(unzipStream);\n });\n\n return extractedFiles;\n } catch (error: any) {\n throw new Error(`Error processing URL: ${error.message}`);\n }\n}\n","import { Buffer } from 'node:buffer';\nimport type { IncomingMessage } from 'node:http';\nimport https from 'node:https';\nimport process from 'node:process';\nimport { URL, URLSearchParams } from 'node:url';\n\n/**\n * Builds a URL with query parameters and optional authentication.\n * @param {string} endpoint - The base endpoint URL\n * @param {Record<string, any>} queryParams - Object containing query parameters to append\n * @param {boolean} [useAuth=true] - Whether to include the API key from environment variables\n * @returns {URL} The constructed URL object with query parameters\n */\nexport const buildUrl = (endpoint: string, queryParams: Record<string, any>, useAuth: boolean = true): URL => {\n const url = new URL(endpoint);\n {\n const params = new URLSearchParams();\n\n Object.entries(queryParams).forEach(([key, value]) => {\n params.append(key, value.toString());\n });\n\n if (useAuth) {\n params.append('api_key', process.env.SHAMELA_API_KEY!);\n }\n\n url.search = params.toString();\n }\n\n return url;\n};\n\n/**\n * Makes an HTTPS GET request and returns the response data.\n * @template T - The expected return type (Buffer or Record<string, any>)\n * @param {string | URL} url - The URL to make the request to\n * @returns {Promise<T>} A promise that resolves to the response data, parsed as JSON if content-type is application/json, otherwise as Buffer\n * @throws {Error} When the request fails or JSON parsing fails\n */\nexport const httpsGet = <T extends Buffer | Record<string, any>>(url: string | URL): Promise<T> => {\n return new Promise((resolve, reject) => {\n https\n .get(url, (res: IncomingMessage) => {\n const contentType = res.headers['content-type'] || '';\n const dataChunks: Buffer[] = [];\n\n res.on('data', (chunk: Buffer) => {\n dataChunks.push(chunk);\n });\n\n res.on('end', () => {\n const fullData = Buffer.concat(dataChunks);\n\n if (contentType.includes('application/json')) {\n try {\n const json = JSON.parse(fullData.toString('utf-8'));\n resolve(json);\n } catch (error: any) {\n reject(new Error(`Failed to parse JSON: ${error.message}`));\n }\n } else {\n resolve(fullData as T);\n }\n });\n })\n .on('error', (error) => {\n reject(new Error(`Error making request: ${error.message}`));\n });\n });\n};\n","import path 
from 'node:path';\nimport process from 'node:process';\n\nconst SOURCE_TABLES = ['author.sqlite', 'book.sqlite', 'category.sqlite'];\n\n/**\n * Validates that required environment variables are set.\n * @throws {Error} When any required environment variable is missing\n */\nexport const validateEnvVariables = () => {\n const envVariablesNotFound = [\n 'SHAMELA_API_MASTER_PATCH_ENDPOINT',\n 'SHAMELA_API_BOOKS_ENDPOINT',\n 'SHAMELA_API_KEY',\n ].filter((key) => !process.env[key]);\n\n if (envVariablesNotFound.length) {\n throw new Error(`${envVariablesNotFound.join(', ')} environment variables not set`);\n }\n};\n\n/**\n * Validates that all required master source tables are present in the provided paths.\n * @param {string[]} sourceTablePaths - Array of file paths to validate\n * @returns {boolean} True if all required source tables (author.sqlite, book.sqlite, category.sqlite) are present\n */\nexport const validateMasterSourceTables = (sourceTablePaths: string[]) => {\n const sourceTableNames = new Set(sourceTablePaths.map((tablePath) => path.basename(tablePath).toLowerCase()));\n return SOURCE_TABLES.every((table) => sourceTableNames.has(table.toLowerCase()));\n};\n","import { DEFAULT_SANITIZATION_RULES } from './utils/constants';\n\nexport type Line = {\n id?: string;\n text: string;\n};\n\nconst PUNCT_ONLY = /^[)\\]\\u00BB\"”'’.,?!:\\u061B\\u060C\\u061F\\u06D4\\u2026]+$/;\nconst OPENER_AT_END = /[[({«“‘]$/;\n\nconst mergeDanglingPunctuation = (lines: Line[]): Line[] => {\n const out: Line[] = [];\n for (const item of lines) {\n const last = out[out.length - 1];\n if (last?.id && PUNCT_ONLY.test(item.text)) {\n last.text += item.text;\n } else {\n out.push(item);\n }\n }\n return out;\n};\n\nconst splitIntoLines = (text: string) => {\n let normalized = text.replace(/\\r\\n/g, '\\n').replace(/\\r/g, '\\n');\n\n if (!/\\n/.test(normalized)) {\n normalized = normalized.replace(/([.?!\\u061F\\u061B\\u06D4\\u2026][\"“”'’»«)\\]]?)\\s+(?=[\\u0600-\\u06FF])/, '$1\\n');\n }\n\n return normalized\n .split('\\n')\n .map((line) => line.replace(/^\\*+/, '').trim())\n .filter(Boolean);\n};\n\nconst processTextContent = (content: string): Line[] => {\n return splitIntoLines(content).map((line) => ({ text: line }));\n};\n\nconst extractAttribute = (tag: string, name: string): string | undefined => {\n const pattern = new RegExp(`${name}\\\\s*=\\\\s*(\"([^\"]*)\"|'([^']*)'|([^s>]+))`, 'i');\n const match = tag.match(pattern);\n if (!match) {\n return undefined;\n }\n return match[2] ?? match[3] ?? match[4];\n};\n\ntype Token =\n | { type: 'text'; value: string }\n | { type: 'start'; name: string; attributes: Record<string, string | undefined> }\n | { type: 'end'; name: string };\n\nconst tokenize = (html: string): Token[] => {\n const tokens: Token[] = [];\n const tagRegex = /<[^>]+>/g;\n let lastIndex = 0;\n let match: RegExpExecArray | null;\n match = tagRegex.exec(html);\n\n while (match) {\n if (match.index > lastIndex) {\n tokens.push({ type: 'text', value: html.slice(lastIndex, match.index) });\n }\n\n const raw = match[0];\n const isEnd = /^<\\//.test(raw);\n const nameMatch = raw.match(/^<\\/?\\s*([a-zA-Z0-9:-]+)/);\n const name = nameMatch ? 
nameMatch[1].toLowerCase() : '';\n\n if (isEnd) {\n tokens.push({ name, type: 'end' });\n } else {\n const attributes: Record<string, string | undefined> = {};\n attributes.id = extractAttribute(raw, 'id');\n attributes['data-type'] = extractAttribute(raw, 'data-type');\n tokens.push({ attributes, name, type: 'start' });\n }\n\n lastIndex = tagRegex.lastIndex;\n match = tagRegex.exec(html);\n }\n\n if (lastIndex < html.length) {\n tokens.push({ type: 'text', value: html.slice(lastIndex) });\n }\n\n return tokens;\n};\n\nconst maybeAppendToPrevTitle = (result: Line[], raw: string) => {\n const last = result[result.length - 1];\n if (!raw) {\n return false;\n }\n if (!last || !last.id) {\n return false;\n }\n if (!OPENER_AT_END.test(last.text)) {\n return false;\n }\n if (/\\n/.test(raw)) {\n return false;\n }\n last.text += raw.replace(/^\\s+/, '');\n return true;\n};\n\nexport const parseContentRobust = (content: string): Line[] => {\n if (!/<span[^>]*>/i.test(content)) {\n return processTextContent(content);\n }\n\n const tokens = tokenize(`<root>${content}</root>`);\n const result: Line[] = [];\n\n let titleDepth = 0;\n let currentTitle: Line | null = null;\n\n const pushText = (raw: string) => {\n if (!raw) {\n return;\n }\n\n if (titleDepth > 0 && currentTitle) {\n const cleaned = titleDepth === 1 ? raw.replace(/^\\s+/, '') : raw;\n currentTitle.text += cleaned;\n return;\n }\n\n if (maybeAppendToPrevTitle(result, raw)) {\n return;\n }\n\n const text = raw.trim();\n if (text) {\n result.push(...processTextContent(text));\n }\n };\n\n for (const token of tokens) {\n if (token.type === 'text') {\n pushText(token.value);\n } else if (token.type === 'start' && token.name === 'span') {\n const dataType = token.attributes['data-type'];\n if (dataType === 'title') {\n if (titleDepth === 0) {\n const id = token.attributes.id?.replace(/^toc-/, '') ?? '';\n currentTitle = { id, text: '' };\n result.push(currentTitle);\n }\n titleDepth += 1;\n }\n } else if (token.type === 'end' && token.name === 'span') {\n if (titleDepth > 0) {\n titleDepth -= 1;\n if (titleDepth === 0) {\n currentTitle = null;\n }\n }\n }\n }\n\n const cleaned = result.map((line) => (line.id ? line : { ...line, text: line.text.trim() }));\n\n return mergeDanglingPunctuation(cleaned.map((line) => (line.id ? 
line : { ...line, text: line.text }))).filter(\n (line) => line.text.length > 0,\n );\n};\n\nconst DEFAULT_COMPILED_RULES = Object.entries(DEFAULT_SANITIZATION_RULES).map(([pattern, replacement]) => ({\n regex: new RegExp(pattern, 'g'),\n replacement,\n}));\n\n/**\n * Compiles sanitization rules into RegExp objects for performance\n */\nconst getCompiledRules = (rules: Record<string, string>) => {\n if (rules === DEFAULT_SANITIZATION_RULES) {\n return DEFAULT_COMPILED_RULES;\n }\n\n const compiled = [];\n for (const pattern in rules) {\n compiled.push({\n regex: new RegExp(pattern, 'g'),\n replacement: rules[pattern],\n });\n }\n return compiled;\n};\n\n/**\n * Sanitizes page content by applying regex replacement rules\n * @param text - The text to sanitize\n * @param rules - Optional custom rules (defaults to DEFAULT_SANITIZATION_RULES)\n * @returns The sanitized text\n */\nexport const sanitizePageContent = (\n text: string,\n rules: Record<string, string> = DEFAULT_SANITIZATION_RULES,\n): string => {\n const compiledRules = getCompiledRules(rules);\n\n let content = text;\n for (let i = 0; i < compiledRules.length; i++) {\n const { regex, replacement } = compiledRules[i];\n content = content.replace(regex, replacement);\n }\n return content;\n};\n\nexport const splitPageBodyFromFooter = (content: string, footnoteMarker = '_________') => {\n let footnote = '';\n const indexOfFootnote = content.lastIndexOf(footnoteMarker);\n\n if (indexOfFootnote >= 0) {\n footnote = content.slice(indexOfFootnote + footnoteMarker.length);\n content = content.slice(0, indexOfFootnote);\n }\n\n return [content, footnote] as const;\n};\n"],"mappings":"AAAA,OAAS,YAAAA,MAAgB,aACzB,OAAS,YAAYC,MAAU,KAC/B,OAAOC,MAAU,OACjB,OAAOC,MAAa,UACpB,OAAS,OAAAC,MAAW,MCJpB,OAAS,YAAAC,MAAgB,aCSzB,IAAMC,EAAgB,CAAE,MAAO,IAAM,CAAC,EAAG,MAAO,IAAM,CAAC,EAAG,KAAM,IAAM,CAAC,EAAG,KAAM,IAAM,CAAC,CAAE,EACrFC,EAAiBD,EAERE,GAAY,CAACC,EAAoBH,IAAkB,CAC5D,GAAI,CAACG,EAAU,OAAS,CAACA,EAAU,OAAS,CAACA,EAAU,KACnD,MAAM,IAAI,MAAM,sDAAsD,EAG1EF,EAASE,CACb,EDZA,IAAMC,GAAmB,IAQnBC,EAAe,CAACC,EAAcC,IACzBD,EAAG,MAAM,qBAAqBC,CAAK,GAAG,EAAE,IAAI,EASjDC,EAAW,CAACF,EAAcC,IAIrB,EAHQD,EAAG,MAAM,iEAAiE,EAAE,IAAIC,CAAK,EAYlGE,EAAW,CAACH,EAAcC,IACvBC,EAASF,EAAIC,CAAK,EAIhBD,EAAG,MAAM,iBAAiBC,CAAK,EAAE,EAAE,IAAI,EAHnC,CAAC,EAWVG,EAAaC,GACR,OAAOA,EAAI,UAAU,IAAM,IAUhCC,EAAiB,CAACC,EAA0BC,EAA2BC,IAA2B,CACpG,IAAMC,EAAc,CAAC,EAErB,QAAWC,KAAUF,EAAS,CAC1B,GAAIE,IAAW,KAAM,CACjBD,EAAO,IAAMF,GAAYD,IAAU,IAAM,KACzC,QACJ,CAEA,GAAIC,GAAYG,KAAUH,EAAU,CAChC,IAAMI,EAAQJ,EAASG,CAAM,EAE7B,GAAIC,IAAUd,IAAoBc,IAAU,MAAQA,IAAU,OAAW,CACrEF,EAAOC,CAAM,EAAIC,EACjB,QACJ,CACJ,CAEA,GAAIL,GAAWI,KAAUJ,EAAS,CAC9BG,EAAOC,CAAM,EAAIJ,EAAQI,CAAM,EAC/B,QACJ,CAEAD,EAAOC,CAAM,EAAI,IACrB,CAEA,OAAOD,CACX,EASMG,GAAY,CAACC,EAAiBC,EAAkBN,IAA6B,CAC/E,IAAMO,EAAU,IAAI,IACdC,EAAY,IAAI,IAEtB,QAAWZ,KAAOS,EACdE,EAAQ,IAAI,OAAOX,EAAI,EAAE,CAAC,EAG9B,QAAWA,KAAOU,EACdE,EAAU,IAAI,OAAOZ,EAAI,EAAE,EAAGA,CAAG,EAGrC,IAAMK,EAAgB,CAAC,EAEvB,QAAWH,KAAWO,EAAU,CAC5B,IAAMN,EAAWS,EAAU,IAAI,OAAOV,EAAQ,EAAE,CAAC,EAE7CC,GAAYJ,EAAUI,CAAQ,GAIlCE,EAAO,KAAKJ,EAAeC,EAASC,EAAUC,CAAO,CAAC,CAC1D,CAEA,QAAWJ,KAAOU,EAAW,CACzB,IAAMG,EAAK,OAAOb,EAAI,EAAE,EAEpBW,EAAQ,IAAIE,CAAE,GAAKd,EAAUC,CAAG,GAIpCK,EAAO,KAAKJ,EAAe,OAAWD,EAAKI,CAAO,CAAC,CACvD,CAEA,OAAOC,CACX,EASMS,GAAa,CAACnB,EAAcC,EAAeQ,EAAmBW,IAAgB,CAChF,GAAIA,EAAK,SAAW,EAChB,OAGJ,IAAMC,EAAeZ,EAAQ,IAAI,IAAM,GAAG,EAAE,KAAK,GAAG,EAC9Ca,EAAYtB,EAAG,QAAQ,eAAeC,CAAK,KAAKQ,EAAQ,KAAK,GAAG,CAAC,aAAaY,CAAY,GAAG,EAEnGD,EAAK,QAASf,GAAQ,CAClB,IAAMkB,EAASd,EAAQ,IAAKE,GAAYA,KAAUN,EAAMA,EAAIM,CAAM,EAAI,IAAK
,EAE3EW,EAAU,IAAI,GAAGC,CAAM,CAC3B,CAAC,EAEDD,EAAU,SAAS,CACvB,EASME,GAAoB,CAACC,EAAkBC,EAAkBzB,IAAkB,CAC7E,IAAMI,EAAMqB,EAAO,MAAM,gEAAgE,EAAE,IAAIzB,CAAK,EAIpG,OAAKI,GAAK,KAKVoB,EAAO,IAAI,wBAAwBxB,CAAK,EAAE,EAC1CwB,EAAO,IAAIpB,EAAI,GAAG,EACX,KANHsB,EAAO,KAAK,GAAG1B,CAAK,8CAA8C,EAC3D,GAMf,EASM2B,EAAoB,CAACH,EAAkBC,EAAkBG,EAAwB5B,IAAkB,CACrG,GAAI,CAACC,EAASwB,EAAQzB,CAAK,EAAG,CAC1B0B,EAAO,KAAK,GAAG1B,CAAK,mCAAmC,EACvD,MACJ,CAEA,GAAI,CAACuB,GAAkBC,EAAQC,EAAQzB,CAAK,EACxC,OAGJ,IAAM6B,EAAW/B,EAAa2B,EAAQzB,CAAK,EACrC8B,EAAYF,GAAS3B,EAAS2B,EAAO5B,CAAK,EAAIF,EAAa8B,EAAO5B,CAAK,EAAI,CAAC,EAE5EQ,EAAUqB,EAAS,IAAKE,GAASA,EAAK,IAAI,EAEhD,QAAWA,KAAQD,EACf,GAAI,CAACtB,EAAQ,SAASuB,EAAK,IAAI,EAAG,CAC9B,IAAMC,EAAaD,EAAK,MAAQA,EAAK,KAAK,OAAS,EAAIA,EAAK,KAAO,OACnEP,EAAO,IAAI,eAAexB,CAAK,eAAe+B,EAAK,IAAI,IAAIC,CAAU,EAAE,EACvExB,EAAQ,KAAKuB,EAAK,IAAI,CAC1B,CAGJ,IAAMlB,EAAWX,EAASuB,EAAQzB,CAAK,EACjCc,EAAYc,EAAQ1B,EAAS0B,EAAO5B,CAAK,EAAI,CAAC,EAE9CiC,EAAarB,GAAUC,EAAUC,EAAWN,CAAO,EAEzDU,GAAWM,EAAQxB,EAAOQ,EAASyB,CAAU,CACjD,EAQaC,EAAe,CAACnC,EAAcoC,EAAeC,IAAoB,CAC1E,IAAMX,EAAS,IAAIY,EAASF,CAAK,EAC3BP,EAAQ,IAAIS,EAASD,CAAO,EAElC,GAAI,CACArC,EAAG,YAAY,IAAM,CACjB4B,EAAkB5B,EAAI0B,EAAQG,QAAkB,EAChDD,EAAkB5B,EAAI0B,EAAQG,SAAmB,CACrD,CAAC,EAAE,CACP,QAAE,CACEH,EAAO,MAAM,EACbG,EAAM,MAAM,CAChB,CACJ,EAOaU,EAAgB,CAACvC,EAAcoC,IAAkB,CAC1D,IAAMV,EAAS,IAAIY,EAASF,CAAK,EAEjC,GAAI,CACApC,EAAG,YAAY,IAAM,CACjB4B,EAAkB5B,EAAI0B,EAAQ,WAAiB,EAC/CE,EAAkB5B,EAAI0B,EAAQ,YAAkB,CACpD,CAAC,EAAE,CACP,QAAE,CACEA,EAAO,MAAM,CACjB,CACJ,EAMac,EAAgBxC,GAAiB,CAC1CA,EAAG;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,UAUH,EACAA,EAAG;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,UAQH,CACJ,EAOayC,GAAezC,GACjBA,EAAG,0BAAoC,EAAE,IAAI,EAQ3C0C,GAAgB1C,GAClBA,EAAG,2BAAqC,EAAE,IAAI,EAQ5C2C,EAAW3C,IACb,CAAE,MAAOyC,GAAYzC,CAAE,EAAG,OAAQ0C,GAAa1C,CAAE,CAAE,GEnT9D,OAAO4C,OAAU,OCKV,IAAMC,EAAW,CAACC,EAAgBC,IAAkB,CACvD,IAAMC,EAAcF,EAAO,QAAQ,KAAM,IAAI,EAC7C,GAAI,CAAC,kBAAkB,KAAKC,CAAK,EAC7B,MAAM,IAAI,MAAM,wBAAwB,EAE5C,MAAO,oBAAoBC,CAAW,QAAQD,CAAK,EACvD,EAyBO,IAAME,EAAYC,GAAkB,CACvC,GAAI,CAAC,kBAAkB,KAAKA,CAAK,EAC7B,MAAM,IAAI,MAAM,wBAAwB,EAE5C,MAAO,mBAAmBA,CAAK,EACnC,ED5BA,IAAMC,EAAoB,CAACC,EAAcC,EAAeC,IAAkB,CACtE,IAAMC,EAAMH,EAAG,MAAM,mBAAmBC,CAAK,iDAAiD,EAAE,IAAIC,CAAK,EAIzG,GAAI,CAACC,GAAK,IACN,MAAM,IAAI,MAAM,gCAAgCD,CAAK,OAAOD,CAAK,EAAE,EAGvED,EAAG,IAAI,wBAAwBE,CAAK,EAAE,EACtCF,EAAG,IAAIG,EAAI,GAAG,CAClB,EAeaC,EAA6B,CAACJ,EAAcK,IAA2B,CAChF,IAAMC,EAAsC,CAAC,EAE7C,QAAWC,KAAaF,EAAc,CAClC,GAAM,CAAE,KAAAG,CAAK,EAAIC,GAAK,MAAMF,CAAS,EACrCD,EAAYE,CAAI,EAAID,CACxB,CAEA,OAAO,QAAQD,CAAW,EAAE,QAAQ,CAAC,CAACL,EAAOS,CAAM,IAAM,CACrDV,EAAG,IAAIW,EAASD,EAAQT,CAAK,CAAC,CAClC,CAAC,EAEDF,EAAkBC,mBAAkC,EACpDD,EAAkBC,eAA8B,EAChDD,EAAkBC,uBAAwC,EAE1D,IAAMY,EAAgBZ,EAAG,wDAAyF,EAC5Ga,EAAcb,EAAG,kDAAmF,EACpGc,EAAmBd,EAAG,8DAE5B,EAEAA,EAAG,YAAY,IAAM,CACjBY,EAAc,IAAI,EAClBC,EAAY,IAAI,EAChBC,EAAiB,IAAI,CACzB,CAAC,EAAE,EAEH,OAAO,KAAKR,CAAW,EAAE,QAASS,GAAc,CAC5Cf,EAAG,IAAIgB,EAASD,CAAS,CAAC,CAC9B,CAAC,CACL,EAQME,EAA0B,CAACjB,EAAckB,EAAkBC,IAAwB,CACrFnB,EAAG,IAAI,uBAAuBkB,CAAQ,EAAE,EACxClB,EAAG,IAAI,eAAekB,CAAQ,qBAAqBC,CAAW,EAAE,CACpE,EAcaC,EAAgBpB,GAAiB,CAC1CA,EAAG;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,UASH,EACAA,EAAG;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,UAiBH,EACAA,EAAG;AAAA;AAAA;AAAA;AAAA;AAAA,UAOH,EAIAiB,EAAwBjB,EAAI,kBAAyB,EACrDiB,EAAwBjB,EAAI,cAAqB,EACjDiB,EAAwBjB,EAAI,uBAA+B,CAC/D,EAOaqB,GAAiBrB,GACnBA,EAAG,4BAAuC,EAAE,IAAI,EAQ9CsB,GAAetB,GACjBA,EAAG,0BAAqC,EAAE,IAAI,EAQ5CuB,GAAoBvB,GACtBA,EAAG,8BAA0C,EAAE,IAAI,EAQjDwB,EAAWxB,IACb,CAAE,QAASqB,GAAcrB,CAAE,EAAG,MAAOsB,GAAYtB,C
AAE,EAAG,WAAYuB,GAAiBvB,CAAE,CAAE,GEtK3F,IAAMyB,EAAY,CACrBC,EACAC,EAA4B,CAAC,UAAW,QAAS,WAAY,SAAU,MAAM,IACpE,CACT,IAAMC,EAAS,OAAOF,GAAQ,SAAW,IAAI,IAAIA,CAAG,EAAI,IAAI,IAAIA,EAAI,SAAS,CAAC,EAE9E,OAAAC,EAAgB,QAASE,GAAU,CAC/B,IAAMC,EAAQF,EAAO,aAAa,IAAIC,CAAK,EAC3C,GAAIC,GAASA,EAAM,OAAS,EAAG,CAC3B,IAAMC,EAAW,GAAGD,EAAM,MAAM,EAAG,CAAC,CAAC,MAAMA,EAAM,MAAM,EAAE,CAAC,GAC1DF,EAAO,aAAa,IAAIC,EAAOE,CAAQ,CAC3C,MAAWD,GACPF,EAAO,aAAa,IAAIC,EAAO,KAAK,CAE5C,CAAC,EAEMD,EAAO,SAAS,CAC3B,EAEaI,EAAoBC,IACtB,CACH,QAASA,EAAK,QACd,GAAIA,EAAK,GACT,GAAIA,EAAK,QAAU,CAAE,OAAQA,EAAK,MAAO,EACzC,GAAIA,EAAK,MAAQ,CAAE,KAAM,OAAOA,EAAK,IAAI,CAAE,EAC3C,GAAIA,EAAK,MAAQ,CAAE,KAAMA,EAAK,IAAK,CACvC,GAGSC,EAAsBC,GAAoB,CACnD,IAAMC,EAAS,OAAOD,EAAM,MAAM,EAElC,MAAO,CACH,QAASA,EAAM,QACf,GAAIA,EAAM,GACV,KAAM,OAAOA,EAAM,IAAI,EACvB,GAAIC,GAAU,CAAE,OAAAA,CAAO,CAC3B,CACJ,EC/BO,IAAMC,EAAqD,CAC9D,cAAe,GACf,OAAG,GACH,SAAK,oGACL,SAAK,iEACL,SAAK,wHACT,ECrBA,OAAS,qBAAAC,GAAmB,YAAYC,MAAU,KAElD,OAAOC,OAAW,QAClB,OAAOC,OAAQ,KACf,OAAOC,MAAU,OACjB,OAAS,YAAAC,OAAgB,kBACzB,OAAOC,OAA8B,WAO9B,IAAMC,EAAgB,MAAOC,EAAS,YAAc,CACvD,IAAMC,EAAcL,EAAK,KAAKD,GAAG,OAAO,EAAGK,CAAM,EACjD,OAAOP,EAAG,QAAQQ,CAAW,CACjC,EAgBA,eAAsBC,EAAaC,EAAaC,EAAsC,CAClF,IAAMC,EAA2B,CAAC,EAElC,GAAI,CAEA,IAAMC,EAAW,MAAM,IAAI,QAAyB,CAACC,EAASC,IAAW,CACrEC,GACK,IAAIN,EAAMO,GAAQ,CACXA,EAAI,aAAe,IACnBF,EAAO,IAAI,MAAM,gCAAgCE,EAAI,UAAU,IAAIA,EAAI,aAAa,EAAE,CAAC,EAEvFH,EAAQG,CAAG,CAEnB,CAAC,EACA,GAAG,QAAUC,GAAQ,CAClBH,EAAO,IAAI,MAAM,yBAAyBG,EAAI,OAAO,EAAE,CAAC,CAC5D,CAAC,CACT,CAAC,EAGD,aAAM,IAAI,QAAc,CAACJ,EAASC,IAAW,CACzC,IAAMI,EAAcC,GAAS,MAAM,EAC7BC,EAAiC,CAAC,EAExCF,EAAY,GAAG,QAAUG,GAAiB,CACtC,IAAMC,GAAgB,SAAY,CAC9B,IAAMC,EAAWC,EAAK,KAAKd,EAAWW,EAAM,IAAI,EAEhD,GAAIA,EAAM,OAAS,YAEf,MAAMI,EAAG,MAAMF,EAAU,CAAE,UAAW,EAAK,CAAC,EAC5CF,EAAM,UAAU,MACb,CAEH,IAAMK,EAAMF,EAAK,QAAQD,CAAQ,EACjC,MAAME,EAAG,MAAMC,EAAK,CAAE,UAAW,EAAK,CAAC,EAGvC,IAAMC,EAAcC,GAAkBL,CAAQ,EAC9C,MAAMM,GAASR,EAAOM,CAAW,EACjChB,EAAe,KAAKY,CAAQ,CAChC,CACJ,GAAG,EAEHH,EAAc,KAAKE,CAAY,CACnC,CAAC,EAEDJ,EAAY,GAAG,SAAU,SAAY,CACjC,GAAI,CAEA,MAAM,QAAQ,IAAIE,CAAa,EAC/BP,EAAQ,CACZ,OAASiB,EAAO,CACZhB,EAAOgB,CAAK,CAChB,CACJ,CAAC,EAEDZ,EAAY,GAAG,QAAUY,GAAU,CAC/BhB,EAAO,IAAI,MAAM,4BAA4BgB,EAAM,OAAO,EAAE,CAAC,CACjE,CAAC,EAGDlB,EAAS,KAAKM,CAAW,CAC7B,CAAC,EAEMP,CACX,OAASmB,EAAY,CACjB,MAAM,IAAI,MAAM,yBAAyBA,EAAM,OAAO,EAAE,CAC5D,CACJ,CCrGA,OAAS,UAAAC,OAAc,SAEvB,OAAOC,OAAW,QAClB,OAAOC,OAAa,UACpB,OAAS,OAAAC,GAAK,mBAAAC,OAAuB,MAS9B,IAAMC,EAAW,CAACC,EAAkBC,EAAkCC,EAAmB,KAAc,CAC1G,IAAMC,EAAM,IAAIN,GAAIG,CAAQ,EAC5B,CACI,IAAMI,EAAS,IAAIN,GAEnB,OAAO,QAAQG,CAAW,EAAE,QAAQ,CAAC,CAACI,EAAKC,CAAK,IAAM,CAClDF,EAAO,OAAOC,EAAKC,EAAM,SAAS,CAAC,CACvC,CAAC,EAEGJ,GACAE,EAAO,OAAO,UAAWR,GAAQ,IAAI,eAAgB,EAGzDO,EAAI,OAASC,EAAO,SAAS,CACjC,CAEA,OAAOD,CACX,EASaI,EAAoDJ,GACtD,IAAI,QAAQ,CAACK,EAASC,IAAW,CACpCd,GACK,IAAIQ,EAAMO,GAAyB,CAChC,IAAMC,EAAcD,EAAI,QAAQ,cAAc,GAAK,GAC7CE,EAAuB,CAAC,EAE9BF,EAAI,GAAG,OAASG,GAAkB,CAC9BD,EAAW,KAAKC,CAAK,CACzB,CAAC,EAEDH,EAAI,GAAG,MAAO,IAAM,CAChB,IAAMI,EAAWpB,GAAO,OAAOkB,CAAU,EAEzC,GAAID,EAAY,SAAS,kBAAkB,EACvC,GAAI,CACA,IAAMI,EAAO,KAAK,MAAMD,EAAS,SAAS,OAAO,CAAC,EAClDN,EAAQO,CAAI,CAChB,OAASC,EAAY,CACjBP,EAAO,IAAI,MAAM,yBAAyBO,EAAM,OAAO,EAAE,CAAC,CAC9D,MAEAR,EAAQM,CAAa,CAE7B,CAAC,CACL,CAAC,EACA,GAAG,QAAUE,GAAU,CACpBP,EAAO,IAAI,MAAM,yBAAyBO,EAAM,OAAO,EAAE,CAAC,CAC9D,CAAC,CACT,CAAC,ECpEL,OAAOC,OAAU,OACjB,OAAOC,OAAa,UAEpB,IAAMC,GAAgB,CAAC,gBAAiB,cAAe,iBAAiB,EAM3DC,EAAuB,IAAM,CACtC,IAAMC,EAAuB,CACzB,oCACA,6BACA,iBACJ,EAAE,OAAQC,GAAQ,CAACJ,GAAQ,IAAII,CAAG,CAAC,EAEnC,GAAID,EAAqB,OACrB,MAAM,IAAI,MAAM,GAAGA,EAAqB,KAAK,IAAI,CAAC,gCAAgC,CAE1F,EAOaE,EAA8BC,G
AA+B,CACtE,IAAMC,EAAmB,IAAI,IAAID,EAAiB,IAAKE,GAAcT,GAAK,SAASS,CAAS,EAAE,YAAY,CAAC,CAAC,EAC5G,OAAOP,GAAc,MAAOQ,GAAUF,EAAiB,IAAIE,EAAM,YAAY,CAAC,CAAC,CACnF,ETFA,IAAMC,EAAoBC,GAAwB,CAC9C,IAAMC,EAAM,IAAIC,EAAIF,CAAW,EAC/B,OAAAC,EAAI,SAAW,QAERA,EAAI,SAAS,CACxB,EAmBME,EAAoB,MACtBC,EACAC,IAC8D,CAC9DC,EAAO,KAAK,gCAAgCF,CAAE,EAAE,EAEhD,IAAMG,EAAY,MAAMC,EAAc,mBAAmB,EAEnDC,EAA+CJ,GAAiB,MAAMK,GAAgBN,CAAE,EACxF,CAAC,CAACO,CAAY,EAAG,CAACC,CAAa,EAAI,CAAC,CAAC,EAAgB,MAAM,QAAQ,IAAI,CACzEC,EAAaJ,EAAa,gBAAiBF,CAAS,EACpD,GAAIE,EAAa,gBAAkB,CAACI,EAAaJ,EAAa,gBAAiBF,CAAS,CAAC,EAAI,CAAC,CAClG,CAAC,EACKO,EAASC,EAAK,KAAKR,EAAW,SAAS,EAEvCS,EAAS,IAAIC,EAASH,CAAM,EAElC,GAAI,CACA,OAAAR,EAAO,KAAK,iBAAiB,EAC7B,MAAMY,EAAiBF,CAAM,EAEzBJ,GACAN,EAAO,KAAK,yBAAyBM,CAAa,OAAOD,CAAY,EAAE,EACvE,MAAMQ,EAAaH,EAAQL,EAAcC,CAAa,IAEtDN,EAAO,KAAK,2BAA2BK,CAAY,EAAE,EACrD,MAAMS,EAAcJ,EAAQL,CAAY,GAQrC,CAAE,QALO,SAAY,CACxBK,EAAO,MAAM,EACb,MAAMK,EAAG,GAAGd,EAAW,CAAE,UAAW,EAAK,CAAC,CAC9C,EAEkB,OAAAS,CAAO,CAC7B,OAASM,EAAO,CACZ,MAAAN,EAAO,MAAM,EACb,MAAMK,EAAG,GAAGd,EAAW,CAAE,UAAW,EAAK,CAAC,EACpCe,CACV,CACJ,EAoBaZ,GAAkB,MAC3BN,EACAmB,IAC0C,CAC1CC,EAAqB,EAErB,IAAMvB,EAAMwB,EAAS,GAAGC,EAAQ,IAAI,0BAA0B,IAAItB,CAAE,GAAI,CACpE,eAAgBmB,GAAS,cAAgB,GAAG,SAAS,EACrD,eAAgBA,GAAS,cAAgB,GAAG,SAAS,CACzD,CAAC,EAEDjB,EAAO,KAAK,kCAAkCqB,EAAU1B,CAAG,CAAC,EAAE,EAE9D,GAAI,CACA,IAAM2B,EAAY,MAAMC,EAAS5B,CAAG,EACpC,MAAO,CACH,aAAc2B,EAAS,cACvB,gBAAiB7B,EAAiB6B,EAAS,iBAAiB,EAC5D,GAAIA,EAAS,mBAAqB,CAAE,gBAAiB7B,EAAiB6B,EAAS,iBAAiB,CAAE,EAClG,GAAIA,EAAS,mBAAqB,CAAE,aAAcA,EAAS,aAAc,CAC7E,CACJ,OAASN,EAAY,CACjB,MAAM,IAAI,MAAM,iCAAiCA,EAAM,OAAO,EAAE,CACpE,CACJ,EA4BaQ,GAAe,MAAO1B,EAAYmB,IAAkD,CAC7FjB,EAAO,KAAK,gBAAgBF,CAAE,IAAI,KAAK,UAAUmB,CAAO,CAAC,EAAE,EAE3D,GAAM,CAAE,OAAAP,EAAQ,QAAAe,CAAQ,EAAI,MAAM5B,EAAkBC,EAAImB,GAAS,YAAY,EAE7E,GAAI,CACA,GAAM,CAAE,IAAKS,CAAU,EAAIjB,EAAK,MAAMQ,EAAQ,WAAW,IAAI,EAE7D,GAAIS,IAAc,QAAS,CACvB,IAAMC,EAAS,MAAMC,EAAYlB,CAAM,EACvC,MAAM,IAAI,KAAKO,EAAQ,WAAW,IAAI,EAAE,MAAM,KAAK,UAAUU,EAAQ,KAAM,CAAC,CAAC,CACjF,SAAWD,IAAc,OAASA,IAAc,UAAW,CAGvD,IAAMG,EAAanB,EAAO,SAC1BA,EAAO,MAAM,EACb,MAAMK,EAAG,OAAOc,EAAYZ,EAAQ,WAAW,IAAI,EAEnD,IAAMhB,EAAYQ,EAAK,QAAQoB,CAAU,EACzC,aAAMd,EAAG,GAAGd,EAAW,CAAE,UAAW,EAAK,CAAC,EACnCgB,EAAQ,WAAW,IAC9B,CAEA,MAAMQ,EAAQ,CAClB,OAAST,EAAO,CACZ,YAAMS,EAAQ,EACRT,CACV,CAEA,OAAOC,EAAQ,WAAW,IAC9B,EAqBaa,GAAoB,MAAOC,EAAkB,IAAiD,CACvGb,EAAqB,EAErB,IAAMvB,EAAMwB,EAASC,EAAQ,IAAI,kCAA6C,CAAE,QAASW,EAAQ,SAAS,CAAE,CAAC,EAE7G/B,EAAO,KAAK,mDAAmDqB,EAAU1B,CAAG,CAAC,EAAE,EAE/E,GAAI,CACA,IAAM2B,EAAgC,MAAMC,EAAS5B,CAAG,EACxD,MAAO,CAAE,IAAK2B,EAAS,UAAW,QAASA,EAAS,OAAQ,CAChE,OAASN,EAAY,CACjB,MAAM,IAAI,MAAM,gCAAgCA,EAAM,OAAO,EAAE,CACnE,CACJ,EAiBagB,GAAeC,GAAmB,CAC3C,GAAM,CAAE,OAAAC,CAAO,EAAI,IAAItC,EAAIwB,EAAQ,IAAI,iCAAkC,EACzE,MAAO,GAAGc,CAAM,WAAWD,CAAM,MACrC,EA4BaE,GAAyB,MAAOlB,GAAoD,CAC7FjB,EAAO,KAAK,0BAA0B,KAAK,UAAUiB,CAAO,CAAC,EAAE,EAE/D,IAAMhB,EAAY,MAAMC,EAAc,wBAAwB,EAExDkC,EACFnB,EAAQ,gBAAmB,MAAMa,GAAkB,CAA+B,EAEtF9B,EAAO,KAAK,+BAA+BoC,EAAe,OAAO,UAAUf,EAAUe,EAAe,GAAG,CAAC,EAAE,EAC1G,IAAMC,EAAyB,MAAM9B,EAAad,EAAiB2C,EAAe,GAAG,EAAGnC,CAAS,EAIjG,GAFAD,EAAO,KAAK,4BAA4BqC,EAAa,SAAS,CAAC,EAAE,EAE7D,CAACC,EAA2BD,CAAY,EACxC,MAAArC,EAAO,MAAM,sCAAsCqC,EAAa,SAAS,CAAC,EAAE,EACtE,IAAI,MAAM,4BAA4B,EAGhD,IAAM7B,EAASC,EAAK,KAAKR,EAAW,WAAW,EAEzCS,EAAS,IAAIC,EAASH,CAAM,EAElC,GAAI,CACAR,EAAO,KAAK,iBAAiB,EAC7B,MAAMY,EAAmBF,CAAM,EAE/BV,EAAO,KAAK,8BAA8B,EAC1C,MAAMuC,EAA2B7B,EAAQ2B,CAAY,EAErD,GAAM,CAAE,IAAKX,CAAU,EAAIjB,EAAK,MAAMQ,EAAQ,WAAW,IAAI,EAE7D,GAAIS,IAAc,QAAS,CACvB,IAAMC,EAAS,MAAMC,EAAclB,CAAM,EACzC,MAAM,IAAI,KAAKO,EAAQ,WAAW,IAAI,EAAE,MAAM,KAAK,UAAUU,EAAQ,KAAM,CAA
C,CAAC,CACjF,CAEAjB,EAAO,MAAM,GAETgB,IAAc,OAASA,IAAc,YACrC,MAAMX,EAAG,OAAOP,EAAQS,EAAQ,WAAW,IAAI,EAGnD,MAAMF,EAAG,GAAGd,EAAW,CAAE,UAAW,EAAK,CAAC,CAC9C,QAAE,CACES,EAAO,MAAM,CACjB,CAEA,OAAOO,EAAQ,WAAW,IAC9B,EAqBauB,GAAU,MAAO1C,GAAkC,CAC5DE,EAAO,KAAK,WAAWF,CAAE,EAAE,EAE3B,GAAM,CAAE,OAAAY,EAAQ,QAAAe,CAAQ,EAAI,MAAM5B,EAAkBC,CAAE,EAEtD,GAAI,CACA,IAAM2C,EAAO,MAAMb,EAAYlB,CAAM,EAOrC,MALyB,CACrB,MAAO+B,EAAK,MAAM,IAAIC,CAAgB,EACtC,OAAQD,EAAK,OAAO,IAAIE,CAAkB,CAC9C,CAGJ,QAAE,CACE,MAAMlB,EAAQ,CAClB,CACJ,EUlWA,IAAMmB,GAAa,wDACbC,GAAgB,YAEhBC,GAA4BC,GAA0B,CACxD,IAAMC,EAAc,CAAC,EACrB,QAAWC,KAAQF,EAAO,CACtB,IAAMG,EAAOF,EAAIA,EAAI,OAAS,CAAC,EAC3BE,GAAM,IAAMN,GAAW,KAAKK,EAAK,IAAI,EACrCC,EAAK,MAAQD,EAAK,KAElBD,EAAI,KAAKC,CAAI,CAErB,CACA,OAAOD,CACX,EAEMG,GAAkBC,GAAiB,CACrC,IAAIC,EAAaD,EAAK,QAAQ,QAAS;AAAA,CAAI,EAAE,QAAQ,MAAO;AAAA,CAAI,EAEhE,MAAK,KAAK,KAAKC,CAAU,IACrBA,EAAaA,EAAW,QAAQ,qEAAsE;AAAA,CAAM,GAGzGA,EACF,MAAM;AAAA,CAAI,EACV,IAAKC,GAASA,EAAK,QAAQ,OAAQ,EAAE,EAAE,KAAK,CAAC,EAC7C,OAAO,OAAO,CACvB,EAEMC,EAAsBC,GACjBL,GAAeK,CAAO,EAAE,IAAKF,IAAU,CAAE,KAAMA,CAAK,EAAE,EAG3DG,EAAmB,CAACC,EAAaC,IAAqC,CACxE,IAAMC,EAAU,IAAI,OAAO,GAAGD,CAAI,0CAA2C,GAAG,EAC1EE,EAAQH,EAAI,MAAME,CAAO,EAC/B,GAAKC,EAGL,OAAOA,EAAM,CAAC,GAAKA,EAAM,CAAC,GAAKA,EAAM,CAAC,CAC1C,EAOMC,GAAYC,GAA0B,CACxC,IAAMC,EAAkB,CAAC,EACnBC,EAAW,WACbC,EAAY,EACZL,EAGJ,IAFAA,EAAQI,EAAS,KAAKF,CAAI,EAEnBF,GAAO,CACNA,EAAM,MAAQK,GACdF,EAAO,KAAK,CAAE,KAAM,OAAQ,MAAOD,EAAK,MAAMG,EAAWL,EAAM,KAAK,CAAE,CAAC,EAG3E,IAAMM,EAAMN,EAAM,CAAC,EACbO,EAAQ,OAAO,KAAKD,CAAG,EACvBE,EAAYF,EAAI,MAAM,0BAA0B,EAChDR,EAAOU,EAAYA,EAAU,CAAC,EAAE,YAAY,EAAI,GAEtD,GAAID,EACAJ,EAAO,KAAK,CAAE,KAAAL,EAAM,KAAM,KAAM,CAAC,MAC9B,CACH,IAAMW,EAAiD,CAAC,EACxDA,EAAW,GAAKb,EAAiBU,EAAK,IAAI,EAC1CG,EAAW,WAAW,EAAIb,EAAiBU,EAAK,WAAW,EAC3DH,EAAO,KAAK,CAAE,WAAAM,EAAY,KAAAX,EAAM,KAAM,OAAQ,CAAC,CACnD,CAEAO,EAAYD,EAAS,UACrBJ,EAAQI,EAAS,KAAKF,CAAI,CAC9B,CAEA,OAAIG,EAAYH,EAAK,QACjBC,EAAO,KAAK,CAAE,KAAM,OAAQ,MAAOD,EAAK,MAAMG,CAAS,CAAE,CAAC,EAGvDF,CACX,EAEMO,GAAyB,CAACC,EAAgBL,IAAgB,CAC5D,IAAMjB,EAAOsB,EAAOA,EAAO,OAAS,CAAC,EAUrC,MATI,CAACL,GAGD,CAACjB,GAAQ,CAACA,EAAK,IAGf,CAACL,GAAc,KAAKK,EAAK,IAAI,GAG7B,KAAK,KAAKiB,CAAG,EACN,IAEXjB,EAAK,MAAQiB,EAAI,QAAQ,OAAQ,EAAE,EAC5B,GACX,EAEaM,GAAsBjB,GAA4B,CAC3D,GAAI,CAAC,eAAe,KAAKA,CAAO,EAC5B,OAAOD,EAAmBC,CAAO,EAGrC,IAAMQ,EAASF,GAAS,SAASN,CAAO,SAAS,EAC3CgB,EAAiB,CAAC,EAEpBE,EAAa,EACbC,EAA4B,KAE1BC,EAAYT,GAAgB,CAC9B,GAAI,CAACA,EACD,OAGJ,GAAIO,EAAa,GAAKC,EAAc,CAChC,IAAME,EAAUH,IAAe,EAAIP,EAAI,QAAQ,OAAQ,EAAE,EAAIA,EAC7DQ,EAAa,MAAQE,EACrB,MACJ,CAEA,GAAIN,GAAuBC,EAAQL,CAAG,EAClC,OAGJ,IAAMf,EAAOe,EAAI,KAAK,EAClBf,GACAoB,EAAO,KAAK,GAAGjB,EAAmBH,CAAI,CAAC,CAE/C,EAEA,QAAW0B,KAASd,EACZc,EAAM,OAAS,OACfF,EAASE,EAAM,KAAK,EACbA,EAAM,OAAS,SAAWA,EAAM,OAAS,OAC/BA,EAAM,WAAW,WAAW,IAC5B,UACTJ,IAAe,IAEfC,EAAe,CAAE,GADNG,EAAM,WAAW,IAAI,QAAQ,QAAS,EAAE,GAAK,GACnC,KAAM,EAAG,EAC9BN,EAAO,KAAKG,CAAY,GAE5BD,GAAc,GAEXI,EAAM,OAAS,OAASA,EAAM,OAAS,QAC1CJ,EAAa,IACbA,GAAc,EACVA,IAAe,IACfC,EAAe,OAM/B,IAAME,EAAUL,EAAO,IAAKlB,GAAUA,EAAK,GAAKA,EAAO,CAAE,GAAGA,EAAM,KAAMA,EAAK,KAAK,KAAK,CAAE,CAAE,EAE3F,OAAOR,GAAyB+B,EAAQ,IAAKvB,GAAUA,EAAK,GAAKA,EAAO,CAAE,GAAGA,EAAM,KAAMA,EAAK,IAAK,CAAE,CAAC,EAAE,OACnGA,GAASA,EAAK,KAAK,OAAS,CACjC,CACJ,EAEMyB,GAAyB,OAAO,QAAQC,CAA0B,EAAE,IAAI,CAAC,CAACpB,EAASqB,CAAW,KAAO,CACvG,MAAO,IAAI,OAAOrB,EAAS,GAAG,EAC9B,YAAAqB,CACJ,EAAE,EAKIC,GAAoBC,GAAkC,CACxD,GAAIA,IAAUH,EACV,OAAOD,GAGX,IAAMK,EAAW,CAAC,EAClB,QAAWxB,KAAWuB,EAClBC,EAAS,KAAK,CACV,MAAO,IAAI,OAAOxB,EAAS,GAAG,EAC9B,YAAauB,EAAMvB,CAAO,CAC9B,CAAC,EAEL,OAAOwB,CACX,EAQaC,GAAsB,CAC/BjC,EACA+B,EAAgCH,IACvB,CACT,IAAMM,EAAgBJ,GAAiBC,CAAK,EAExC3B,EA
AUJ,EACd,QAASmC,EAAI,EAAGA,EAAID,EAAc,OAAQC,IAAK,CAC3C,GAAM,CAAE,MAAAC,EAAO,YAAAP,CAAY,EAAIK,EAAcC,CAAC,EAC9C/B,EAAUA,EAAQ,QAAQgC,EAAOP,CAAW,CAChD,CACA,OAAOzB,CACX,EAEaiC,GAA0B,CAACjC,EAAiBkC,EAAiB,cAAgB,CACtF,IAAIC,EAAW,GACTC,EAAkBpC,EAAQ,YAAYkC,CAAc,EAE1D,OAAIE,GAAmB,IACnBD,EAAWnC,EAAQ,MAAMoC,EAAkBF,EAAe,MAAM,EAChElC,EAAUA,EAAQ,MAAM,EAAGoC,CAAe,GAGvC,CAACpC,EAASmC,CAAQ,CAC7B","names":["Database","fs","path","process","URL","Database","SILENT_LOGGER","logger","setLogger","newLogger","PATCH_NOOP_VALUE","getTableInfo","db","table","hasTable","readRows","isDeleted","row","mergeRowValues","baseRow","patchRow","columns","merged","column","value","mergeRows","baseRows","patchRows","baseIds","patchById","id","insertRows","rows","placeholders","statement","values","ensureTableSchema","target","source","logger","copyAndPatchTable","patch","baseInfo","patchInfo","info","columnType","mergedRows","applyPatches","aslDB","patchDB","Database","copyTableData","createTables","getAllPages","getAllTitles","getData","path","attachDB","dbFile","alias","escapedPath","detachDB","alias","ensureTableSchema","db","alias","table","row","copyForeignMasterTableData","sourceTables","aliasToPath","tablePath","name","path","dbPath","attachDB","insertAuthors","insertBooks","insertCategories","statement","detachDB","createCompatibilityView","viewName","sourceTable","createTables","getAllAuthors","getAllBooks","getAllCategories","getData","redactUrl","url","sensitiveParams","urlObj","param","value","redacted","mapPageRowToPage","page","mapTitleRowToTitle","title","parent","DEFAULT_SANITIZATION_RULES","createWriteStream","fs","https","os","path","pipeline","unzipper","createTempDir","prefix","tempDirBase","unzipFromUrl","url","outputDir","extractedFiles","response","resolve","reject","https","res","err","unzipStream","unzipper","entryPromises","entry","entryPromise","filePath","path","fs","dir","writeStream","createWriteStream","pipeline","error","Buffer","https","process","URL","URLSearchParams","buildUrl","endpoint","queryParams","useAuth","url","params","key","value","httpsGet","resolve","reject","res","contentType","dataChunks","chunk","fullData","json","error","path","process","SOURCE_TABLES","validateEnvVariables","envVariablesNotFound","key","validateMasterSourceTables","sourceTablePaths","sourceTableNames","tablePath","table","fixHttpsProtocol","originalUrl","url","URL","setupBookDatabase","id","bookMetadata","logger","outputDir","createTempDir","bookResponse","getBookMetadata","bookDatabase","patchDatabase","unzipFromUrl","dbPath","path","client","Database","createTables","applyPatches","copyTableData","fs","error","options","validateEnvVariables","buildUrl","process","redactUrl","response","httpsGet","downloadBook","cleanup","extension","result","getData","tempDbPath","getMasterMetadata","version","getCoverUrl","bookId","origin","downloadMasterDatabase","masterResponse","sourceTables","validateMasterSourceTables","copyForeignMasterTableData","getBook","data","mapPageRowToPage","mapTitleRowToTitle","PUNCT_ONLY","OPENER_AT_END","mergeDanglingPunctuation","lines","out","item","last","splitIntoLines","text","normalized","line","processTextContent","content","extractAttribute","tag","name","pattern","match","tokenize","html","tokens","tagRegex","lastIndex","raw","isEnd","nameMatch","attributes","maybeAppendToPrevTitle","result","parseContentRobust","titleDepth","currentTitle","pushText","cleaned","token","DEFAULT_COMPILED_RULES","DEFAULT_SANITIZATION_RULES","replacement","getCompiledRules","rules","compiled","sanitizePa
geContent","compiledRules","i","regex","splitPageBodyFromFooter","footnoteMarker","footnote","indexOfFootnote"]}
1
+ {"version":3,"file":"index.js","names":["SILENT_LOGGER: Logger","currentLogger: Logger","loggerProxy: Logger","runtimeConfig: Partial<ShamelaConfig>","ENV_MAP: Record<Exclude<ShamelaConfigKey, 'fetchImplementation'>, string>","merged: Row","merged: Row[]","ensureTableSchema","statement: Statement","db: SqlJsDatabase","rows: QueryRow[]","sqlPromise: Promise<SqlJsStatic> | null","resolvedWasmPath: string | null","isNodeEnvironment","TABLE_MAP: Record<string, Tables>","tableDbs: Partial<Record<Tables, SqliteDatabase>>","createTables","getData","DEFAULT_SANITIZATION_RULES: Record<string, string>","error: any","bookResponse: GetBookMetadataResponsePayload","error: any","getBookData","response: Record<string, any>","getMasterData","out: Line[]","tokens: Token[]","match: RegExpExecArray | null","attributes: Record<string, string | undefined>","result: Line[]","currentTitle: Line | null"],"sources":["../src/utils/logger.ts","../src/config.ts","../src/db/types.ts","../src/db/book.ts","../src/db/sqlite.ts","../src/db/master.ts","../src/utils/common.ts","../src/utils/constants.ts","../src/utils/downloads.ts","../src/utils/network.ts","../src/utils/io.ts","../src/utils/validation.ts","../src/api.ts","../src/content.ts"],"sourcesContent":["/**\n * Signature accepted by logger methods.\n */\nexport type LogFunction = (...args: unknown[]) => void;\n\n/**\n * Contract expected from logger implementations consumed by the library.\n */\nexport interface Logger {\n debug: LogFunction;\n error: LogFunction;\n info: LogFunction;\n warn: LogFunction;\n}\n\n/**\n * No-op logger used when consumers do not provide their own implementation.\n */\nexport const SILENT_LOGGER: Logger = Object.freeze({\n debug: () => {},\n error: () => {},\n info: () => {},\n warn: () => {},\n});\n\nlet currentLogger: Logger = SILENT_LOGGER;\n\n/**\n * Configures the active logger or falls back to {@link SILENT_LOGGER} when undefined.\n *\n * @param newLogger - The logger instance to use for subsequent log calls\n * @throws {Error} When the provided logger does not implement the required methods\n */\nexport const configureLogger = (newLogger?: Logger) => {\n if (!newLogger) {\n currentLogger = SILENT_LOGGER;\n return;\n }\n\n const requiredMethods: Array<keyof Logger> = ['debug', 'error', 'info', 'warn'];\n const missingMethod = requiredMethods.find((method) => typeof newLogger[method] !== 'function');\n\n if (missingMethod) {\n throw new Error(\n `Logger must implement debug, error, info, and warn methods. 
Missing: ${String(missingMethod)}`,\n );\n }\n\n currentLogger = newLogger;\n};\n\n/**\n * Retrieves the currently configured logger.\n */\nexport const getLogger = () => currentLogger;\n\n/**\n * Restores the logger configuration back to {@link SILENT_LOGGER}.\n */\nexport const resetLogger = () => {\n currentLogger = SILENT_LOGGER;\n};\n\n/**\n * Proxy that delegates logging calls to the active logger at invocation time.\n */\nconst loggerProxy: Logger = new Proxy({} as Logger, {\n get: (_target, property: keyof Logger) => {\n const activeLogger = getLogger();\n const value = activeLogger[property];\n\n if (typeof value === 'function') {\n return (...args: unknown[]) => (value as LogFunction).apply(activeLogger, args);\n }\n\n return value;\n },\n}) as Logger;\n\nexport default loggerProxy;\n","import type { ShamelaConfig, ShamelaConfigKey } from './types';\nimport type { Logger } from './utils/logger';\nimport { configureLogger, resetLogger } from './utils/logger';\n\n/**\n * Mutable runtime configuration overrides supplied at runtime via {@link configure}.\n */\nlet runtimeConfig: Partial<ShamelaConfig> = {};\n\n/**\n * Mapping between configuration keys and their corresponding environment variable names.\n */\nconst ENV_MAP: Record<Exclude<ShamelaConfigKey, 'fetchImplementation'>, string> = {\n apiKey: 'SHAMELA_API_KEY',\n booksEndpoint: 'SHAMELA_API_BOOKS_ENDPOINT',\n masterPatchEndpoint: 'SHAMELA_API_MASTER_PATCH_ENDPOINT',\n sqlJsWasmUrl: 'SHAMELA_SQLJS_WASM_URL',\n};\n\n/**\n * Detects whether the Node.js {@link process} global is available for reading environment variables.\n */\nconst isProcessAvailable = typeof process !== 'undefined' && Boolean(process?.env);\n\n/**\n * Reads a configuration value either from runtime overrides or environment variables.\n *\n * @param key - The configuration key to resolve\n * @returns The resolved configuration value if present\n */\nconst readEnv = <Key extends Exclude<ShamelaConfigKey, 'fetchImplementation'>>(key: Key) => {\n const runtimeValue = runtimeConfig[key];\n\n if (runtimeValue !== undefined) {\n return runtimeValue as ShamelaConfig[Key];\n }\n\n const envKey = ENV_MAP[key];\n\n if (isProcessAvailable) {\n return process.env[envKey] as ShamelaConfig[Key];\n }\n\n return undefined as ShamelaConfig[Key];\n};\n\n/**\n * Runtime configuration options accepted by {@link configure}.\n */\nexport type ConfigureOptions = Partial<ShamelaConfig> & { logger?: Logger };\n\n/**\n * Updates the runtime configuration for the library.\n *\n * This function merges the provided options with existing overrides and optionally\n * configures a custom logger implementation.\n *\n * @param config - Runtime configuration overrides and optional logger instance\n */\nexport const configure = (config: ConfigureOptions) => {\n const { logger, ...options } = config;\n\n if ('logger' in config) {\n configureLogger(logger);\n }\n\n runtimeConfig = { ...runtimeConfig, ...options };\n};\n\n/**\n * Retrieves a single configuration value.\n *\n * @param key - The configuration key to read\n * @returns The configuration value when available\n */\nexport const getConfigValue = <Key extends ShamelaConfigKey>(key: Key) => {\n if (key === 'fetchImplementation') {\n return runtimeConfig.fetchImplementation as ShamelaConfig[Key];\n }\n\n return readEnv(key as Exclude<Key, 'fetchImplementation'>);\n};\n\n/**\n * Resolves the current configuration by combining runtime overrides and environment variables.\n *\n * @returns The resolved {@link ShamelaConfig}\n */\nexport const 
getConfig = (): ShamelaConfig => {\n return {\n apiKey: readEnv('apiKey'),\n booksEndpoint: readEnv('booksEndpoint'),\n fetchImplementation: runtimeConfig.fetchImplementation,\n masterPatchEndpoint: readEnv('masterPatchEndpoint'),\n sqlJsWasmUrl: readEnv('sqlJsWasmUrl'),\n };\n};\n\n/**\n * Retrieves a configuration value and throws if it is missing.\n *\n * @param key - The configuration key to require\n * @throws {Error} If the configuration value is not defined\n * @returns The resolved configuration value\n */\nexport const requireConfigValue = <Key extends Exclude<ShamelaConfigKey, 'fetchImplementation'>>(key: Key) => {\n if ((key as ShamelaConfigKey) === 'fetchImplementation') {\n throw new Error('fetchImplementation must be provided via configure().');\n }\n\n const value = getConfigValue(key);\n if (!value) {\n throw new Error(`${ENV_MAP[key]} environment variable not set`);\n }\n\n return value as NonNullable<ShamelaConfig[Key]>;\n};\n\n/**\n * Clears runtime configuration overrides and restores the default logger.\n */\nexport const resetConfig = () => {\n runtimeConfig = {};\n resetLogger();\n};\n","/**\n * Enumeration of database table names.\n */\nexport enum Tables {\n /** Author table */\n Authors = 'author',\n /** Book table */\n Books = 'book',\n /** Category table */\n Categories = 'category',\n /** Page table */\n Page = 'page',\n /** Title table */\n Title = 'title',\n}\n\n/**\n * A record that can be deleted by patches.\n */\nexport type Deletable = {\n /** Indicates if it was deleted in the patch if it is set to '1 */\n is_deleted?: string;\n};\n\nexport type Unique = {\n /** Unique identifier */\n id: number;\n};\n\n/**\n * Database row structure for the author table.\n */\nexport type AuthorRow = Deletable &\n Unique & {\n /** Author biography */\n biography: string;\n\n /** Death year */\n death_number: string;\n\n /** The death year as a text */\n death_text: string;\n\n /** Author name */\n name: string;\n };\n\n/**\n * Database row structure for the book table.\n */\nexport type BookRow = Deletable &\n Unique & {\n /** Serialized author ID(s) \"2747, 3147\" or \"513\" */\n author: string;\n\n /** Bibliography information */\n bibliography: string;\n\n /** Category ID */\n category: string;\n\n /** Publication date (or 99999 for unavailable) */\n date: string;\n\n /** Hint or description */\n hint: string;\n\n /** Major version */\n major_release: string;\n\n /** Serialized metadata */\n metadata: string;\n\n /** Minor version */\n minor_release: string;\n\n /** Book name */\n name: string;\n\n /** Serialized PDF links */\n pdf_links: string;\n\n /** Printed flag */\n printed: string;\n\n /** Book type */\n type: string;\n };\n\n/**\n * Database row structure for the category table.\n */\nexport type CategoryRow = Deletable &\n Unique & {\n /** Category name */\n name: string;\n\n /** Category order in the list to show. 
*/\n order: string;\n };\n\n/**\n * Database row structure for the page table.\n */\nexport type PageRow = Deletable &\n Unique & {\n /** Page content */\n content: string;\n\n /** Page number */\n number: string | null;\n\n /** Page reference */\n page: string | null;\n\n /** Part number */\n part: string | null;\n\n /** Additional metadata */\n services: string | null;\n };\n\n/**\n * Database row structure for the title table.\n */\nexport type TitleRow = Deletable &\n Unique & {\n /** Title content */\n content: string;\n\n /** Page number */\n page: string;\n\n /** Parent title ID */\n parent: string | null;\n };\n","import logger from '@/utils/logger';\nimport type { SqliteDatabase } from './sqlite';\nimport { type Deletable, type PageRow, Tables, type TitleRow } from './types';\n\ntype Row = Record<string, any> & Deletable;\n\nconst PATCH_NOOP_VALUE = '#';\n\n/**\n * Retrieves column information for a specified table.\n * @param db - The database instance\n * @param table - The table name to get info for\n * @returns Array of column information with name and type\n */\nconst getTableInfo = (db: SqliteDatabase, table: Tables) => {\n return db.query(`PRAGMA table_info(${table})`).all() as { name: string; type: string }[];\n};\n\n/**\n * Checks if a table exists in the database.\n * @param db - The database instance\n * @param table - The table name to check\n * @returns True if the table exists, false otherwise\n */\nconst hasTable = (db: SqliteDatabase, table: Tables): boolean => {\n const result = db.query(`SELECT name FROM sqlite_master WHERE type='table' AND name = ?1`).get(table) as\n | { name: string }\n | undefined;\n return Boolean(result);\n};\n\n/**\n * Reads all rows from a specified table.\n * @param db - The database instance\n * @param table - The table name to read from\n * @returns Array of rows, or empty array if table doesn't exist\n */\nconst readRows = (db: SqliteDatabase, table: Tables): Row[] => {\n if (!hasTable(db, table)) {\n return [];\n }\n\n return db.query(`SELECT * FROM ${table}`).all() as Row[];\n};\n\n/**\n * Checks if a row is marked as deleted.\n * @param row - The row to check\n * @returns True if the row has is_deleted field set to '1', false otherwise\n */\nconst isDeleted = (row: Row): boolean => {\n return String(row.is_deleted) === '1';\n};\n\n/**\n * Merges values from a base row and patch row, with patch values taking precedence.\n * @param baseRow - The original row data (can be undefined)\n * @param patchRow - The patch row data with updates (can be undefined)\n * @param columns - Array of column names to merge\n * @returns Merged row with combined values\n */\nconst mergeRowValues = (baseRow: Row | undefined, patchRow: Row | undefined, columns: string[]): Row => {\n const merged: Row = {};\n\n for (const column of columns) {\n if (column === 'id') {\n merged.id = (patchRow ?? baseRow)?.id ?? 
null;\n continue;\n }\n\n if (patchRow && column in patchRow) {\n const value = patchRow[column];\n\n if (value !== PATCH_NOOP_VALUE && value !== null && value !== undefined) {\n merged[column] = value;\n continue;\n }\n }\n\n if (baseRow && column in baseRow) {\n merged[column] = baseRow[column];\n continue;\n }\n\n merged[column] = null;\n }\n\n return merged;\n};\n\n/**\n * Merges arrays of base rows and patch rows, handling deletions and updates.\n * @param baseRows - Original rows from the base database\n * @param patchRows - Patch rows containing updates, additions, and deletions\n * @param columns - Array of column names to merge\n * @returns Array of merged rows with patches applied\n */\nconst mergeRows = (baseRows: Row[], patchRows: Row[], columns: string[]): Row[] => {\n const baseIds = new Set<string>();\n const patchById = new Map<string, Row>();\n\n for (const row of baseRows) {\n baseIds.add(String(row.id));\n }\n\n for (const row of patchRows) {\n patchById.set(String(row.id), row);\n }\n\n const merged: Row[] = [];\n\n for (const baseRow of baseRows) {\n const patchRow = patchById.get(String(baseRow.id));\n\n if (patchRow && isDeleted(patchRow)) {\n continue;\n }\n\n merged.push(mergeRowValues(baseRow, patchRow, columns));\n }\n\n for (const row of patchRows) {\n const id = String(row.id);\n\n if (baseIds.has(id) || isDeleted(row)) {\n continue;\n }\n\n merged.push(mergeRowValues(undefined, row, columns));\n }\n\n return merged;\n};\n\n/**\n * Inserts multiple rows into a specified table using a prepared statement.\n * @param db - The database instance\n * @param table - The table name to insert into\n * @param columns - Array of column names\n * @param rows - Array of row data to insert\n */\nconst insertRows = (db: SqliteDatabase, table: Tables, columns: string[], rows: Row[]) => {\n if (rows.length === 0) {\n return;\n }\n\n const placeholders = columns.map(() => '?').join(',');\n const statement = db.prepare(`INSERT INTO ${table} (${columns.join(',')}) VALUES (${placeholders})`);\n\n rows.forEach((row) => {\n const values = columns.map((column) => (column in row ? 
row[column] : null));\n // Spread the values array instead of passing it directly\n statement.run(...values);\n });\n\n statement.finalize();\n};\n\n/**\n * Ensures the target database has the same table schema as the source database.\n * @param target - The target database to create/update the table in\n * @param source - The source database to copy the schema from\n * @param table - The table name to ensure schema for\n * @returns True if schema was successfully ensured, false otherwise\n */\nconst ensureTableSchema = (target: SqliteDatabase, source: SqliteDatabase, table: Tables) => {\n const row = source.query(`SELECT sql FROM sqlite_master WHERE type='table' AND name = ?1`).get(table) as\n | { sql: string }\n | undefined;\n\n if (!row?.sql) {\n logger.warn(`${table} table definition missing in source database`);\n return false;\n }\n\n target.run(`DROP TABLE IF EXISTS ${table}`);\n target.run(row.sql);\n return true;\n};\n\n/**\n * Copies and patches a table from source to target database, applying patch updates if provided.\n * @param target - The target database to copy/patch the table to\n * @param source - The source database containing the base table data\n * @param patch - Optional patch database containing updates (can be null)\n * @param table - The table name to copy and patch\n */\nconst copyAndPatchTable = (\n target: SqliteDatabase,\n source: SqliteDatabase,\n patch: SqliteDatabase | null,\n table: Tables,\n) => {\n if (!hasTable(source, table)) {\n logger.warn(`${table} table missing in source database`);\n return;\n }\n\n if (!ensureTableSchema(target, source, table)) {\n return;\n }\n\n const baseInfo = getTableInfo(source, table);\n const patchInfo = patch && hasTable(patch, table) ? getTableInfo(patch, table) : [];\n\n const columns = baseInfo.map((info) => info.name);\n\n for (const info of patchInfo) {\n if (!columns.includes(info.name)) {\n const columnType = info.type && info.type.length > 0 ? info.type : 'TEXT';\n target.run(`ALTER TABLE ${table} ADD COLUMN ${info.name} ${columnType}`);\n columns.push(info.name);\n }\n }\n\n const baseRows = readRows(source, table);\n const patchRows = patch ? 
readRows(patch, table) : [];\n\n const mergedRows = mergeRows(baseRows, patchRows, columns);\n\n insertRows(target, table, columns, mergedRows);\n};\n\n/**\n * Applies patches from a patch database to the main database.\n * @param db - The target database to apply patches to\n * @param aslDB - Path to the source ASL database file\n * @param patchDB - Path to the patch database file\n */\nexport const applyPatches = (db: SqliteDatabase, source: SqliteDatabase, patch: SqliteDatabase) => {\n db.transaction(() => {\n copyAndPatchTable(db, source, patch, Tables.Page);\n copyAndPatchTable(db, source, patch, Tables.Title);\n })();\n};\n\n/**\n * Copies table data from a source database without applying any patches.\n * @param db - The target database to copy data to\n * @param aslDB - Path to the source ASL database file\n */\nexport const copyTableData = (db: SqliteDatabase, source: SqliteDatabase) => {\n db.transaction(() => {\n copyAndPatchTable(db, source, null, Tables.Page);\n copyAndPatchTable(db, source, null, Tables.Title);\n })();\n};\n\n/**\n * Creates the required tables (Page and Title) in the database with their schema.\n * @param db - The database instance to create tables in\n */\nexport const createTables = (db: SqliteDatabase) => {\n db.run(\n `CREATE TABLE ${Tables.Page} (\n id INTEGER,\n content TEXT,\n part TEXT,\n page TEXT,\n number TEXT,\n services TEXT,\n is_deleted TEXT\n )`,\n );\n db.run(\n `CREATE TABLE ${Tables.Title} (\n id INTEGER,\n content TEXT,\n page INTEGER,\n parent INTEGER,\n is_deleted TEXT\n )`,\n );\n};\n\n/**\n * Retrieves all pages from the Page table.\n * @param db - The database instance\n * @returns Array of all pages\n */\nexport const getAllPages = (db: SqliteDatabase) => {\n return db.query(`SELECT * FROM ${Tables.Page}`).all() as PageRow[];\n};\n\n/**\n * Retrieves all titles from the Title table.\n * @param db - The database instance\n * @returns Array of all titles\n */\nexport const getAllTitles = (db: SqliteDatabase) => {\n return db.query(`SELECT * FROM ${Tables.Title}`).all() as TitleRow[];\n};\n\n/**\n * Retrieves all book data including pages and titles.\n * @param db - The database instance\n * @returns Object containing arrays of pages and titles\n */\nexport const getData = (db: SqliteDatabase) => {\n return { pages: getAllPages(db), titles: getAllTitles(db) };\n};\n","import initSqlJs, { type Database as SqlJsDatabase, type SqlJsStatic, type Statement } from 'sql.js';\n\nimport { getConfigValue } from '../config';\n\n/**\n * Represents a row returned from a SQLite query as a generic key-value object.\n */\nexport type QueryRow = Record<string, any>;\n\n/**\n * Minimal contract for prepared statements used throughout the project.\n */\nexport interface PreparedStatement {\n run: (...params: any[]) => void;\n finalize: () => void;\n}\n\n/**\n * Interface describing reusable query helpers that return all rows or a single row.\n */\nexport interface Query {\n all: (...params: any[]) => QueryRow[];\n get: (...params: any[]) => QueryRow | undefined;\n}\n\n/**\n * Abstraction over the subset of SQLite database operations required by the library.\n */\nexport interface SqliteDatabase {\n run: (sql: string, params?: any[]) => void;\n prepare: (sql: string) => PreparedStatement;\n query: (sql: string) => Query;\n transaction: (fn: () => void) => () => void;\n close: () => void;\n export: () => Uint8Array;\n}\n\n/**\n * Adapter implementing {@link PreparedStatement} by delegating to a sql.js {@link Statement}.\n */\nclass SqlJsPreparedStatement 
implements PreparedStatement {\n constructor(private readonly statement: Statement) {}\n\n run = (...params: any[]) => {\n if (params.length > 0) {\n this.statement.bind(params);\n }\n\n this.statement.step();\n this.statement.reset();\n };\n\n finalize = () => {\n this.statement.free();\n };\n}\n\n/**\n * Wrapper providing the {@link SqliteDatabase} interface on top of a sql.js database instance.\n */\nclass SqlJsDatabaseWrapper implements SqliteDatabase {\n constructor(private readonly db: SqlJsDatabase) {}\n\n run = (sql: string, params: any[] = []) => {\n this.db.run(sql, params);\n };\n\n prepare = (sql: string): PreparedStatement => {\n return new SqlJsPreparedStatement(this.db.prepare(sql));\n };\n\n query = (sql: string): Query => {\n return {\n all: (...params: any[]) => this.all(sql, params),\n get: (...params: any[]) => this.get(sql, params),\n };\n };\n\n transaction = (fn: () => void) => {\n return () => {\n this.db.run('BEGIN TRANSACTION');\n try {\n fn();\n this.db.run('COMMIT');\n } catch (error) {\n this.db.run('ROLLBACK');\n throw error;\n }\n };\n };\n\n close = () => {\n this.db.close();\n };\n\n export = () => {\n return this.db.export();\n };\n\n private all = (sql: string, params: any[]): QueryRow[] => {\n const statement = this.db.prepare(sql);\n try {\n if (params.length > 0) {\n statement.bind(params);\n }\n\n const rows: QueryRow[] = [];\n while (statement.step()) {\n rows.push(statement.getAsObject());\n }\n return rows;\n } finally {\n statement.free();\n }\n };\n\n private get = (sql: string, params: any[]): QueryRow | undefined => {\n const rows = this.all(sql, params);\n return rows[0];\n };\n}\n\nlet sqlPromise: Promise<SqlJsStatic> | null = null;\nlet resolvedWasmPath: string | null = null;\n\nconst isNodeEnvironment = typeof process !== 'undefined' && Boolean(process?.versions?.node);\nconst DEFAULT_BROWSER_WASM_URL = 'https://cdn.jsdelivr.net/npm/sql.js@1.13.0/dist/sql-wasm.wasm';\n\n/**\n * Resolves the appropriate location of the sql.js WebAssembly binary.\n *\n * @returns The resolved path or remote URL for the sql.js wasm asset\n */\nconst getWasmPath = () => {\n if (!resolvedWasmPath) {\n const configured = getConfigValue('sqlJsWasmUrl');\n if (configured) {\n resolvedWasmPath = configured;\n } else if (isNodeEnvironment) {\n const url = new URL('../../node_modules/sql.js/dist/sql-wasm.wasm', import.meta.url);\n resolvedWasmPath = decodeURIComponent(url.pathname);\n } else {\n resolvedWasmPath = DEFAULT_BROWSER_WASM_URL;\n }\n }\n\n return resolvedWasmPath;\n};\n\n/**\n * Lazily initialises the sql.js runtime, reusing the same promise for subsequent calls.\n *\n * @returns A promise resolving to the sql.js module\n */\nconst loadSql = () => {\n if (!sqlPromise) {\n sqlPromise = initSqlJs({\n locateFile: () => getWasmPath(),\n });\n }\n\n return sqlPromise;\n};\n\n/**\n * Creates a new in-memory SQLite database instance backed by sql.js.\n *\n * @returns A promise resolving to a {@link SqliteDatabase} wrapper\n */\nexport const createDatabase = async () => {\n const SQL = await loadSql();\n return new SqlJsDatabaseWrapper(new SQL.Database());\n};\n\n/**\n * Opens an existing SQLite database from the provided binary contents.\n *\n * @param data - The Uint8Array containing the SQLite database bytes\n * @returns A promise resolving to a {@link SqliteDatabase} wrapper\n */\nexport const openDatabase = async (data: Uint8Array) => {\n const SQL = await loadSql();\n return new SqlJsDatabaseWrapper(new SQL.Database(data));\n};\n","import type { Author, 
Book, Category, MasterData } from '../types';\nimport type { SqliteDatabase } from './sqlite';\nimport { openDatabase } from './sqlite';\nimport { Tables } from './types';\n\n/**\n * Ensures the target database has the same table schema as the source database for a specific table.\n * @param db - The database instance\n * @param alias - The alias name of the attached database\n * @param table - The table name to ensure schema for\n * @throws {Error} When table definition is missing in the source database\n */\nconst ensureTableSchema = (db: SqliteDatabase, source: SqliteDatabase, table: Tables) => {\n const row = source.query(`SELECT sql FROM sqlite_master WHERE type='table' AND name = ?1`).get(table) as\n | { sql: string }\n | undefined;\n\n if (!row?.sql) {\n throw new Error(`Missing table definition for ${table} in source database`);\n }\n\n db.run(`DROP TABLE IF EXISTS ${table}`);\n db.run(row.sql);\n};\n\n/**\n * Copies data from foreign master table files into the main master database.\n *\n * This function processes the source table files (author.sqlite, book.sqlite, category.sqlite)\n * by attaching them to the current database connection, then copying their data into\n * the main master database tables. It handles data transformation including filtering\n * out deleted records and converting placeholder values.\n *\n * @param db - The database client instance for the master database\n * @param sourceTables - Array of file paths to the source SQLite table files\n *\n * @throws {Error} When source files cannot be attached or data copying operations fail\n */\nexport const copyForeignMasterTableData = async (\n db: SqliteDatabase,\n sourceTables: Array<{ name: string; data: Uint8Array }>,\n) => {\n const TABLE_MAP: Record<string, Tables> = {\n author: Tables.Authors,\n book: Tables.Books,\n category: Tables.Categories,\n };\n\n const tableDbs: Partial<Record<Tables, SqliteDatabase>> = {};\n\n for (const table of sourceTables) {\n const baseName = table.name.split('/').pop()?.split('\\\\').pop() ?? table.name;\n const normalized = baseName.replace(/\\.(sqlite|db)$/i, '').toLowerCase();\n const tableName = TABLE_MAP[normalized];\n if (!tableName) {\n continue;\n }\n\n tableDbs[tableName] = await openDatabase(table.data);\n }\n\n try {\n const entries = Object.entries(tableDbs) as Array<[Tables, SqliteDatabase]>;\n\n db.transaction(() => {\n for (const [table, sourceDb] of entries) {\n ensureTableSchema(db, sourceDb, table);\n\n const columnInfo = sourceDb.query(`PRAGMA table_info(${table})`).all() as Array<{\n name: string;\n type: string;\n }>;\n const columnNames = columnInfo.map((info) => info.name);\n if (columnNames.length === 0) {\n continue;\n }\n\n const rows = sourceDb.query(`SELECT * FROM ${table}`).all();\n if (rows.length === 0) {\n continue;\n }\n\n const placeholders = columnNames.map(() => '?').join(',');\n const sqlColumns = columnNames.map((name) => (name === 'order' ? '\"order\"' : name));\n const statement = db.prepare(`INSERT INTO ${table} (${sqlColumns.join(',')}) VALUES (${placeholders})`);\n\n try {\n for (const row of rows) {\n const values = columnNames.map((column) => (column in row ? 
row[column] : null));\n statement.run(...values);\n }\n } finally {\n statement.finalize();\n }\n }\n })();\n } finally {\n Object.values(tableDbs).forEach((database) => database?.close());\n }\n};\n\n/**\n * Creates a backward-compatible database view for legacy table names.\n * @param db - The database instance\n * @param viewName - The name of the view to create\n * @param sourceTable - The source table to base the view on\n */\nconst createCompatibilityView = (db: SqliteDatabase, viewName: string, sourceTable: Tables) => {\n db.run(`DROP VIEW IF EXISTS ${viewName}`);\n db.run(`CREATE VIEW ${viewName} AS SELECT * FROM ${sourceTable}`);\n};\n\n/**\n * Creates the necessary database tables for the master database.\n *\n * This function sets up the schema for the master database by creating\n * tables for authors, books, and categories with their respective columns\n * and data types. This is typically the first step in setting up a new\n * master database. Also creates backward-compatible views for legacy table names.\n *\n * @param db - The database client instance where tables should be created\n *\n * @throws {Error} When table creation fails due to database constraints or permissions\n */\nexport const createTables = (db: SqliteDatabase) => {\n db.run(\n `CREATE TABLE ${Tables.Authors} (\n id INTEGER,\n is_deleted TEXT,\n name TEXT,\n biography TEXT,\n death_text TEXT,\n death_number TEXT\n )`,\n );\n db.run(\n `CREATE TABLE ${Tables.Books} (\n id INTEGER,\n name TEXT,\n is_deleted TEXT,\n category TEXT,\n type TEXT,\n date TEXT,\n author TEXT,\n printed TEXT,\n minor_release TEXT,\n major_release TEXT,\n bibliography TEXT,\n hint TEXT,\n pdf_links TEXT,\n metadata TEXT\n )`,\n );\n db.run(\n `CREATE TABLE ${Tables.Categories} (\n id INTEGER,\n is_deleted TEXT,\n \"order\" TEXT,\n name TEXT\n )`,\n );\n\n // Provide backward-compatible pluralised views since callers historically\n // queried \"authors\", \"books\", and \"categories\" tables.\n createCompatibilityView(db, 'authors', Tables.Authors);\n createCompatibilityView(db, 'books', Tables.Books);\n createCompatibilityView(db, 'categories', Tables.Categories);\n};\n\n/**\n * Retrieves all authors from the Authors table.\n * @param db - The database instance\n * @returns Array of all authors\n */\nexport const getAllAuthors = (db: SqliteDatabase) => {\n return db.query(`SELECT * FROM ${Tables.Authors}`).all() as Author[];\n};\n\n/**\n * Retrieves all books from the Books table.\n * @param db - The database instance\n * @returns Array of all books\n */\nexport const getAllBooks = (db: SqliteDatabase) => {\n return db.query(`SELECT * FROM ${Tables.Books}`).all() as Book[];\n};\n\n/**\n * Retrieves all categories from the Categories table.\n * @param db - The database instance\n * @returns Array of all categories\n */\nexport const getAllCategories = (db: SqliteDatabase) => {\n return db.query(`SELECT * FROM ${Tables.Categories}`).all() as Category[];\n};\n\n/**\n * Retrieves all master data including authors, books, and categories.\n * @param db - The database instance\n * @returns Object containing arrays of authors, books, and categories\n */\nexport const getData = (db: SqliteDatabase, version: number) => {\n return {\n authors: getAllAuthors(db),\n books: getAllBooks(db),\n categories: getAllCategories(db),\n version,\n } satisfies MasterData;\n};\n","import type { PageRow, TitleRow } from '@/db/types';\n\n/**\n * Redacts sensitive query parameters from a URL for safe logging\n * @param url - The URL to redact\n * @param 
sensitiveParams - Array of parameter names to redact (defaults to common sensitive params)\n * @returns The URL string with sensitive parameters redacted\n */\nexport const redactUrl = (\n url: URL | string,\n sensitiveParams: string[] = ['api_key', 'token', 'password', 'secret', 'auth'],\n): string => {\n const urlObj = typeof url === 'string' ? new URL(url) : new URL(url.toString());\n\n sensitiveParams.forEach((param) => {\n const value = urlObj.searchParams.get(param);\n if (value && value.length > 6) {\n const redacted = `${value.slice(0, 3)}***${value.slice(-3)}`;\n urlObj.searchParams.set(param, redacted);\n } else if (value) {\n urlObj.searchParams.set(param, '***');\n }\n });\n\n return urlObj.toString();\n};\n\n/**\n * Normalises a raw page row from SQLite into a serialisable {@link Page}.\n *\n * @param page - The database row representing a page\n * @returns The mapped page with numeric fields converted where appropriate\n */\nexport const mapPageRowToPage = (page: PageRow) => {\n return {\n content: page.content,\n id: page.id,\n ...(page.number && { number: page.number }),\n ...(page.page && { page: Number(page.page) }),\n ...(page.part && { part: page.part }),\n };\n};\n\n/**\n * Normalises a raw title row from SQLite into a serialisable {@link Title}.\n *\n * @param title - The database row representing a title\n * @returns The mapped title with numeric identifiers converted\n */\nexport const mapTitleRowToTitle = (title: TitleRow) => {\n const parent = Number(title.parent);\n\n return {\n content: title.content,\n id: title.id,\n page: Number(title.page),\n ...(parent && { parent }),\n };\n};\n","/**\n * The default version number for master metadata.\n * @constant {number}\n */\nexport const DEFAULT_MASTER_METADATA_VERSION = 0;\n\n/**\n * Placeholder value used to represent unknown or missing data.\n * @constant {string}\n */\nexport const UNKNOWN_VALUE_PLACEHOLDER = '99999';\n\n/**\n * Default rules to sanitize page content.\n */\nexport const DEFAULT_SANITIZATION_RULES: Record<string, string> = {\n '<img[^>]*>>': '',\n 舄: '',\n '﵀': 'رَحِمَهُ ٱللَّٰهُ',\n '﵁': 'رضي الله عنه',\n '﵂': 'رَضِيَ ٱللَّٰهُ عَنْهَا',\n '﵃': 'رَضِيَ اللَّهُ عَنْهُمْ',\n '﵄': 'رَضِيَ ٱللَّٰهُ عَنْهُمَا',\n '﵅': 'رَضِيَ اللَّهُ عَنْهُنَّ',\n '﵌': 'صلى الله عليه وآله وسلم',\n '﵏': 'رَحِمَهُمُ ٱللَّٰهُ',\n};\n","import type { UnzippedEntry } from '@/utils/io';\n\n/**\n * Enforces HTTPS protocol for a given URL string.\n *\n * @param originalUrl - The URL that may use an insecure scheme\n * @returns The normalized URL string using the HTTPS protocol\n */\nexport const fixHttpsProtocol = (originalUrl: string): string => {\n const url = new URL(originalUrl);\n url.protocol = 'https';\n\n return url.toString();\n};\n\n/**\n * Determines whether an archive entry contains a SQLite database file.\n *\n * @param entry - The entry extracted from an archive\n * @returns True when the entry name ends with a recognized SQLite extension\n */\nexport const isSqliteEntry = (entry: UnzippedEntry): boolean => /\\.(sqlite|db)$/i.test(entry.name);\n\n/**\n * Finds the first SQLite database entry from a list of archive entries.\n *\n * @param entries - The extracted entries to inspect\n * @returns The first matching entry or undefined when not present\n */\nexport const findSqliteEntry = (entries: UnzippedEntry[]): UnzippedEntry | undefined => {\n return entries.find(isSqliteEntry);\n};\n\n/**\n * Extracts the lowercase file extension from a path or filename.\n *\n * @param filePath - The path to inspect\n * @returns 
The lowercase extension (including the dot) or an empty string\n */\nexport const getExtension = (filePath: string): string => {\n const match = /\\.([^.]+)$/.exec(filePath);\n return match ? `.${match[1].toLowerCase()}` : '';\n};\n","import { getConfig, requireConfigValue } from '@/config';\n\n/**\n * Builds a URL with query parameters and optional authentication.\n * @param {string} endpoint - The base endpoint URL\n * @param {Record<string, any>} queryParams - Object containing query parameters to append\n * @param {boolean} [useAuth=true] - Whether to include the API key from environment variables\n * @returns {URL} The constructed URL object with query parameters\n */\nexport const buildUrl = (endpoint: string, queryParams: Record<string, any>, useAuth: boolean = true): URL => {\n const url = new URL(endpoint);\n const params = new URLSearchParams();\n\n Object.entries(queryParams).forEach(([key, value]) => {\n params.append(key, value.toString());\n });\n\n if (useAuth) {\n params.append('api_key', requireConfigValue('apiKey'));\n }\n\n url.search = params.toString();\n\n return url;\n};\n\n/**\n * Makes an HTTPS GET request and returns the response data using the configured fetch implementation.\n * @template T - The expected return type (Buffer or Record<string, any>)\n * @param {string | URL} url - The URL to make the request to\n * @param options - Optional overrides including a custom fetch implementation\n * @returns {Promise<T>} A promise that resolves to the response data, parsed as JSON if content-type is application/json, otherwise as Buffer\n * @throws {Error} When the request fails or JSON parsing fails\n */\nexport const httpsGet = async <T extends Uint8Array | Record<string, any>>(\n url: string | URL,\n options: { fetchImpl?: typeof fetch } = {},\n): Promise<T> => {\n const target = typeof url === 'string' ? url : url.toString();\n const activeFetch = options.fetchImpl ?? getConfig().fetchImplementation ?? fetch;\n const response = await activeFetch(target);\n\n if (!response.ok) {\n throw new Error(`Error making request: ${response.status} ${response.statusText}`);\n }\n\n const contentType = response.headers.get('content-type') ?? 
'';\n\n if (contentType.includes('application/json')) {\n return (await response.json()) as T;\n }\n\n const buffer = await response.arrayBuffer();\n return new Uint8Array(buffer) as T;\n};\n","import { unzipSync } from 'fflate';\n\nimport type { OutputOptions } from '@/types';\nimport logger from './logger';\nimport { httpsGet } from './network';\n\n/**\n * Representation of an extracted archive entry containing raw bytes and filename metadata.\n */\nexport type UnzippedEntry = { name: string; data: Uint8Array };\n\nconst isNodeEnvironment = typeof process !== 'undefined' && Boolean(process?.versions?.node);\n\n/**\n * Dynamically imports the Node.js fs/promises module, ensuring the runtime supports file operations.\n *\n * @throws {Error} When executed in a non-Node.js environment\n * @returns The fs/promises module when available\n */\nconst ensureNodeFs = async () => {\n if (!isNodeEnvironment) {\n throw new Error('File system operations are only supported in Node.js environments');\n }\n\n return import('node:fs/promises');\n};\n\n/**\n * Ensures the directory for a file path exists, creating parent folders as needed.\n *\n * @param filePath - The target file path whose directory should be created\n * @returns The fs/promises module instance\n */\nconst ensureDirectory = async (filePath: string) => {\n const [fs, path] = await Promise.all([ensureNodeFs(), import('node:path')]);\n const directory = path.dirname(filePath);\n await fs.mkdir(directory, { recursive: true });\n return fs;\n};\n\n/**\n * Downloads a ZIP archive from the given URL and returns its extracted entries.\n *\n * @param url - The remote URL referencing a ZIP archive\n * @returns A promise resolving to the extracted archive entries\n */\nexport const unzipFromUrl = async (url: string): Promise<UnzippedEntry[]> => {\n const binary = await httpsGet<Uint8Array>(url);\n const byteLength =\n binary instanceof Uint8Array\n ? binary.length\n : binary && typeof (binary as ArrayBufferLike).byteLength === 'number'\n ? (binary as ArrayBufferLike).byteLength\n : 0;\n logger.debug('unzipFromUrl:bytes', byteLength);\n\n return new Promise((resolve, reject) => {\n const dataToUnzip = binary instanceof Uint8Array ? 
binary : new Uint8Array(binary as ArrayBufferLike);\n\n try {\n const result = unzipSync(dataToUnzip);\n const entries = Object.entries(result).map(([name, data]) => ({ data, name }));\n logger.debug(\n 'unzipFromUrl:entries',\n entries.map((entry) => entry.name),\n );\n resolve(entries);\n } catch (error: any) {\n reject(new Error(`Error processing URL: ${error.message}`));\n }\n });\n};\n\n/**\n * Creates a unique temporary directory with the provided prefix.\n *\n * @param prefix - Optional prefix for the generated directory name\n * @returns The created temporary directory path\n */\nexport const createTempDir = async (prefix = 'shamela') => {\n const [fs, os, path] = await Promise.all([ensureNodeFs(), import('node:os'), import('node:path')]);\n const base = path.join(os.tmpdir(), prefix);\n return fs.mkdtemp(base);\n};\n\n/**\n * Writes output data either using a provided writer function or to a file path.\n *\n * @param output - The configured output destination or writer\n * @param payload - The payload to persist (string or binary)\n * @throws {Error} When neither a writer nor file path is provided\n */\nexport const writeOutput = async (output: OutputOptions, payload: string | Uint8Array) => {\n if (output.writer) {\n await output.writer(payload);\n return;\n }\n\n if (!output.path) {\n throw new Error('Output options must include either a writer or a path');\n }\n\n const fs = await ensureDirectory(output.path);\n\n if (typeof payload === 'string') {\n await fs.writeFile(output.path, payload, 'utf-8');\n } else {\n await fs.writeFile(output.path, payload);\n }\n};\n","import { getConfig } from '@/config';\n\nconst SOURCE_TABLES = ['author.sqlite', 'book.sqlite', 'category.sqlite'];\n\n/**\n * Validates that required environment variables are set.\n * @throws {Error} When any required environment variable is missing\n */\nexport const validateEnvVariables = () => {\n const { apiKey, booksEndpoint, masterPatchEndpoint } = getConfig();\n const envVariablesNotFound = [\n ['apiKey', apiKey],\n ['booksEndpoint', booksEndpoint],\n ['masterPatchEndpoint', masterPatchEndpoint],\n ]\n .filter(([, value]) => !value)\n .map(([key]) => key);\n\n if (envVariablesNotFound.length) {\n throw new Error(`${envVariablesNotFound.join(', ')} environment variables not set`);\n }\n};\n\n/**\n * Validates that all required master source tables are present in the provided paths.\n * @param {string[]} sourceTablePaths - Array of file paths to validate\n * @returns {boolean} True if all required source tables (author.sqlite, book.sqlite, category.sqlite) are present\n */\nexport const validateMasterSourceTables = (sourceTablePaths: string[]) => {\n const sourceTableNames = new Set(\n sourceTablePaths\n .map((tablePath) => tablePath.match(/[^\\\\/]+$/)?.[0] ?? 
tablePath)\n .map((name) => name.toLowerCase()),\n );\n return SOURCE_TABLES.every((table) => sourceTableNames.has(table.toLowerCase()));\n};\n","import { requireConfigValue } from './config';\nimport { applyPatches, copyTableData, createTables as createBookTables, getData as getBookData } from './db/book';\nimport { copyForeignMasterTableData, createTables as createMasterTables, getData as getMasterData } from './db/master';\nimport { createDatabase, openDatabase, type SqliteDatabase } from './db/sqlite';\nimport type {\n BookData,\n DownloadBookOptions,\n DownloadMasterOptions,\n GetBookMetadataOptions,\n GetBookMetadataResponsePayload,\n GetMasterMetadataResponsePayload,\n MasterData,\n} from './types';\nimport { mapPageRowToPage, mapTitleRowToTitle, redactUrl } from './utils/common';\nimport { DEFAULT_MASTER_METADATA_VERSION } from './utils/constants';\nimport { findSqliteEntry, fixHttpsProtocol, getExtension, isSqliteEntry } from './utils/downloads';\nimport type { UnzippedEntry } from './utils/io';\nimport { unzipFromUrl, writeOutput } from './utils/io';\nimport logger from './utils/logger';\nimport { buildUrl, httpsGet } from './utils/network';\nimport { validateEnvVariables, validateMasterSourceTables } from './utils/validation';\n\n/**\n * Response payload received when requesting book update metadata from the Shamela API.\n */\ntype BookUpdatesResponse = {\n major_release: number;\n major_release_url: string;\n minor_release?: number;\n minor_release_url?: string;\n};\n\n/**\n * Sets up a book database with tables and data, returning the database client.\n *\n * This helper function handles the common logic of downloading book files,\n * creating database tables, and applying patches or copying data.\n *\n * @param id - The unique identifier of the book\n * @param bookMetadata - Optional pre-fetched book metadata\n * @returns A promise that resolves to an object containing the database client and cleanup function\n */\nconst setupBookDatabase = async (\n id: number,\n bookMetadata?: GetBookMetadataResponsePayload,\n): Promise<{ client: SqliteDatabase; cleanup: () => Promise<void> }> => {\n logger.info(`Setting up book database for ${id}`);\n\n const bookResponse: GetBookMetadataResponsePayload = bookMetadata || (await getBookMetadata(id));\n const patchEntriesPromise = bookResponse.minorReleaseUrl\n ? 
unzipFromUrl(bookResponse.minorReleaseUrl)\n : Promise.resolve<UnzippedEntry[]>([]);\n\n const [bookEntries, patchEntries] = await Promise.all([\n unzipFromUrl(bookResponse.majorReleaseUrl),\n patchEntriesPromise,\n ]);\n\n const bookEntry = findSqliteEntry(bookEntries);\n\n if (!bookEntry) {\n throw new Error('Unable to locate book database in archive');\n }\n\n const client = await createDatabase();\n\n try {\n logger.info(`Creating tables`);\n createBookTables(client);\n\n const sourceDatabase = await openDatabase(bookEntry.data);\n\n try {\n const patchEntry = findSqliteEntry(patchEntries);\n\n if (patchEntry) {\n logger.info(`Applying patches from ${patchEntry.name} to ${bookEntry.name}`);\n const patchDatabase = await openDatabase(patchEntry.data);\n\n try {\n applyPatches(client, sourceDatabase, patchDatabase);\n } finally {\n patchDatabase.close();\n }\n } else {\n logger.info(`Copying table data from ${bookEntry.name}`);\n copyTableData(client, sourceDatabase);\n }\n } finally {\n sourceDatabase.close();\n }\n\n const cleanup = async () => {\n client.close();\n };\n\n return { cleanup, client };\n } catch (error) {\n client.close();\n throw error;\n }\n};\n\n/**\n * Downloads, validates, and prepares the master SQLite database for use.\n *\n * This helper is responsible for retrieving the master archive, ensuring all\n * required tables are present, copying their contents into a fresh in-memory\n * database, and returning both the database instance and cleanup hook.\n *\n * @param masterMetadata - Optional pre-fetched metadata describing the master archive\n * @returns A promise resolving to the database client, cleanup function, and version number\n */\nconst setupMasterDatabase = async (\n masterMetadata?: GetMasterMetadataResponsePayload,\n): Promise<{ client: SqliteDatabase; cleanup: () => Promise<void>; version: number }> => {\n logger.info('Setting up master database');\n\n const masterResponse = masterMetadata || (await getMasterMetadata(DEFAULT_MASTER_METADATA_VERSION));\n\n logger.info(`Downloading master database ${masterResponse.version} from: ${redactUrl(masterResponse.url)}`);\n const sourceTables = await unzipFromUrl(fixHttpsProtocol(masterResponse.url));\n\n logger.debug?.(`sourceTables downloaded: ${sourceTables.map((table) => table.name).toString()}`);\n\n if (!validateMasterSourceTables(sourceTables.map((table) => table.name))) {\n logger.error(`Some source tables were not found: ${sourceTables.map((table) => table.name).toString()}`);\n throw new Error('Expected tables not found!');\n }\n\n const client = await createDatabase();\n\n try {\n logger.info('Creating master tables');\n createMasterTables(client);\n\n logger.info('Copying data to master table');\n await copyForeignMasterTableData(client, sourceTables.filter(isSqliteEntry));\n\n const cleanup = async () => {\n client.close();\n };\n\n return { cleanup, client, version: masterResponse.version };\n } catch (error) {\n client.close();\n throw error;\n }\n};\n\n/**\n * Retrieves metadata for a specific book from the Shamela API.\n *\n * This function fetches book release information including major and minor release\n * URLs and version numbers from the Shamela web service.\n *\n * @param id - The unique identifier of the book to fetch metadata for\n * @param options - Optional parameters for specifying major and minor versions\n * @returns A promise that resolves to book metadata including release URLs and versions\n *\n * @throws {Error} When environment variables are not set or API request fails\n *\n * 
@example\n * ```typescript\n * const metadata = await getBookMetadata(123, { majorVersion: 1, minorVersion: 2 });\n * console.log(metadata.majorReleaseUrl); // Download URL for the book\n * ```\n */\nexport const getBookMetadata = async (\n id: number,\n options?: GetBookMetadataOptions,\n): Promise<GetBookMetadataResponsePayload> => {\n validateEnvVariables();\n\n const booksEndpoint = requireConfigValue('booksEndpoint');\n const url = buildUrl(`${booksEndpoint}/${id}`, {\n major_release: (options?.majorVersion || 0).toString(),\n minor_release: (options?.minorVersion || 0).toString(),\n });\n\n logger.info(`Fetching shamela.ws book link: ${redactUrl(url)}`);\n\n try {\n const response = (await httpsGet(url)) as BookUpdatesResponse;\n return {\n majorRelease: response.major_release,\n majorReleaseUrl: fixHttpsProtocol(response.major_release_url),\n ...(response.minor_release_url && { minorReleaseUrl: fixHttpsProtocol(response.minor_release_url) }),\n ...(response.minor_release_url && { minorRelease: response.minor_release }),\n };\n } catch (error: any) {\n throw new Error(`Error fetching book metadata: ${error.message}`);\n }\n};\n\n/**\n * Downloads and processes a book from the Shamela database.\n *\n * This function downloads the book's database files, applies patches if available,\n * creates the necessary database tables, and exports the data to the specified format.\n * The output can be either a JSON file or a SQLite database file.\n *\n * @param id - The unique identifier of the book to download\n * @param options - Configuration options including output file path and optional book metadata\n * @returns A promise that resolves to the path of the created output file\n *\n * @throws {Error} When download fails, database operations fail, or file operations fail\n *\n * @example\n * ```typescript\n * // Download as JSON\n * const jsonPath = await downloadBook(123, {\n * outputFile: { path: './book.json' }\n * });\n *\n * // Download as SQLite database\n * const dbPath = await downloadBook(123, {\n * outputFile: { path: './book.db' }\n * });\n * ```\n */\nexport const downloadBook = async (id: number, options: DownloadBookOptions): Promise<string> => {\n logger.info(`downloadBook ${id} ${JSON.stringify(options)}`);\n\n if (!options.outputFile.path) {\n throw new Error('outputFile.path must be provided to determine output format');\n }\n\n const extension = getExtension(options.outputFile.path).toLowerCase();\n\n const { client, cleanup } = await setupBookDatabase(id, options?.bookMetadata);\n\n try {\n if (extension === '.json') {\n const result = await getBookData(client);\n await writeOutput(options.outputFile, JSON.stringify(result, null, 2));\n } else if (extension === '.db' || extension === '.sqlite') {\n const payload = client.export();\n await writeOutput(options.outputFile, payload);\n } else {\n throw new Error(`Unsupported output extension: ${extension}`);\n }\n } finally {\n await cleanup();\n }\n\n return options.outputFile.path;\n};\n\n/**\n * Retrieves metadata for the master database from the Shamela API.\n *\n * The master database contains information about all books, authors, and categories\n * in the Shamela library. 
This function fetches the download URL and version\n * information for the master database patches.\n *\n * @param version - The version number to check for updates (defaults to 0)\n * @returns A promise that resolves to master database metadata including download URL and version\n *\n * @throws {Error} When environment variables are not set or API request fails\n *\n * @example\n * ```typescript\n * const masterMetadata = await getMasterMetadata(5);\n * console.log(masterMetadata.url); // URL to download master database patch\n * console.log(masterMetadata.version); // Latest version number\n * ```\n */\nexport const getMasterMetadata = async (version: number = 0): Promise<GetMasterMetadataResponsePayload> => {\n validateEnvVariables();\n\n const masterEndpoint = requireConfigValue('masterPatchEndpoint');\n const url = buildUrl(masterEndpoint, { version: version.toString() });\n\n logger.info(`Fetching shamela.ws master database patch link: ${redactUrl(url)}`);\n\n try {\n const response: Record<string, any> = await httpsGet(url);\n return { url: response.patch_url, version: response.version };\n } catch (error: any) {\n throw new Error(`Error fetching master patch: ${error.message}`);\n }\n};\n\n/**\n * Generates the URL for a book's cover image.\n *\n * This function constructs the URL to access the cover image for a specific book\n * using the book's ID and the API endpoint host.\n *\n * @param bookId - The unique identifier of the book\n * @returns The complete URL to the book's cover image\n *\n * @example\n * ```typescript\n * const coverUrl = getCoverUrl(123);\n * console.log(coverUrl); // \"https://api.shamela.ws/covers/123.jpg\"\n * ```\n */\nexport const getCoverUrl = (bookId: number) => {\n const masterEndpoint = requireConfigValue('masterPatchEndpoint');\n const { origin } = new URL(masterEndpoint);\n return `${origin}/covers/${bookId}.jpg`;\n};\n\n/**\n * Downloads and processes the master database from the Shamela service.\n *\n * The master database contains comprehensive information about all books, authors,\n * and categories available in the Shamela library. 
This function downloads the\n * database files, creates the necessary tables, and exports the data in the\n * specified format (JSON or SQLite).\n *\n * @param options - Configuration options including output file path and optional master metadata\n * @returns A promise that resolves to the path of the created output file\n *\n * @throws {Error} When download fails, expected tables are missing, database operations fail, or file operations fail\n *\n * @example\n * ```typescript\n * // Download master database as JSON\n * const jsonPath = await downloadMasterDatabase({\n * outputFile: { path: './master.json' }\n * });\n *\n * // Download master database as SQLite\n * const dbPath = await downloadMasterDatabase({\n * outputFile: { path: './master.db' }\n * });\n * ```\n */\nexport const downloadMasterDatabase = async (options: DownloadMasterOptions): Promise<string> => {\n logger.info(`downloadMasterDatabase ${JSON.stringify(options)}`);\n\n if (!options.outputFile.path) {\n throw new Error('outputFile.path must be provided to determine output format');\n }\n\n const extension = getExtension(options.outputFile.path);\n const { client, cleanup, version } = await setupMasterDatabase(options.masterMetadata);\n\n try {\n if (extension === '.json') {\n const result = getMasterData(client, version);\n await writeOutput(options.outputFile, JSON.stringify(result, null, 2));\n } else if (extension === '.db' || extension === '.sqlite') {\n await writeOutput(options.outputFile, client.export());\n } else {\n throw new Error(`Unsupported output extension: ${extension}`);\n }\n } finally {\n await cleanup();\n }\n\n return options.outputFile.path;\n};\n\n/**\n * Retrieves complete book data including pages and titles.\n *\n * This is a convenience function that downloads a book's data and returns it\n * as a structured JavaScript object. 
The function handles the temporary file
 * creation and cleanup automatically.
 *
 * @param id - The unique identifier of the book to retrieve
 * @returns A promise that resolves to the complete book data including pages and titles
 *
 * @throws {Error} When download fails, file operations fail, or JSON parsing fails
 *
 * @example
 * ```typescript
 * const bookData = await getBook(123);
 * console.log(bookData.pages.length); // Number of pages in the book
 * console.log(bookData.titles?.length); // Number of title entries
 * ```
 */
export const getBook = async (id: number): Promise<BookData> => {
    logger.info(`getBook ${id}`);

    const { client, cleanup } = await setupBookDatabase(id);

    try {
        const data = await getBookData(client);

        const result: BookData = {
            pages: data.pages.map(mapPageRowToPage),
            titles: data.titles.map(mapTitleRowToTitle),
        };

        return result;
    } finally {
        await cleanup();
    }
};

/**
 * Retrieves complete master data including authors, books, and categories.
 *
 * This convenience function downloads the master database archive, builds an in-memory
 * SQLite database, and returns structured data for immediate consumption alongside
 * the version number of the snapshot.
 *
 * @returns A promise that resolves to the complete master dataset and its version
 */
export const getMaster = async (): Promise<MasterData> => {
    logger.info('getMaster');

    const { client, cleanup, version } = await setupMasterDatabase();

    try {
        return getMasterData(client, version);
    } finally {
        await cleanup();
    }
};

// ---- next embedded source file (sourcemap sourcesContent boundary) ----

import { DEFAULT_SANITIZATION_RULES } from './utils/constants';

export type Line = {
    id?: string;
    text: string;
};

const PUNCT_ONLY = /^[)\]\u00BB"”'’.,?!:\u061B\u060C\u061F\u06D4\u2026]+$/;
const OPENER_AT_END = /[[({«“‘]$/;

const mergeDanglingPunctuation = (lines: Line[]): Line[] => {
    const out: Line[] = [];
    for (const item of lines) {
        const last = out[out.length - 1];
        if (last?.id && PUNCT_ONLY.test(item.text)) {
            last.text += item.text;
        } else {
            out.push(item);
        }
    }
    return out;
};

const splitIntoLines = (text: string) => {
    let normalized = text.replace(/\r\n/g, '\n').replace(/\r/g, '\n');

    if (!/\n/.test(normalized)) {
        normalized = normalized.replace(/([.?!\u061F\u061B\u06D4\u2026]["“”'’»«)\]]?)\s+(?=[\u0600-\u06FF])/, '$1\n');
    }

    return normalized
        .split('\n')
        .map((line) => line.replace(/^\*+/, '').trim())
        .filter(Boolean);
};

const processTextContent = (content: string): Line[] => {
    return splitIntoLines(content).map((line) => ({ text: line }));
};

const extractAttribute = (tag: string, name: string): string | undefined => {
    const pattern = new RegExp(`${name}\\s*=\\s*("([^"]*)"|'([^']*)'|([^s>]+))`, 'i');
    const match = tag.match(pattern);
    if (!match) {
        return undefined;
    }
    return match[2] ?? match[3] ?? match[4];
};

type Token =
    | { type: 'text'; value: string }
    | { type: 'start'; name: string; attributes: Record<string, string | undefined> }
    | { type: 'end'; name: string };

const tokenize = (html: string): Token[] => {
    const tokens: Token[] = [];
    const tagRegex = /<[^>]+>/g;
    let lastIndex = 0;
    let match: RegExpExecArray | null;
    match = tagRegex.exec(html);

    while (match) {
        if (match.index > lastIndex) {
            tokens.push({ type: 'text', value: html.slice(lastIndex, match.index) });
        }

        const raw = match[0];
        const isEnd = /^<\//.test(raw);
        const nameMatch = raw.match(/^<\/?\s*([a-zA-Z0-9:-]+)/);
        const name = nameMatch ? nameMatch[1].toLowerCase() : '';

        if (isEnd) {
            tokens.push({ name, type: 'end' });
        } else {
            const attributes: Record<string, string | undefined> = {};
            attributes.id = extractAttribute(raw, 'id');
            attributes['data-type'] = extractAttribute(raw, 'data-type');
            tokens.push({ attributes, name, type: 'start' });
        }

        lastIndex = tagRegex.lastIndex;
        match = tagRegex.exec(html);
    }

    if (lastIndex < html.length) {
        tokens.push({ type: 'text', value: html.slice(lastIndex) });
    }

    return tokens;
};

const maybeAppendToPrevTitle = (result: Line[], raw: string) => {
    const last = result[result.length - 1];
    if (!raw) {
        return false;
    }
    if (!last || !last.id) {
        return false;
    }
    if (!OPENER_AT_END.test(last.text)) {
        return false;
    }
    if (/\n/.test(raw)) {
        return false;
    }
    last.text += raw.replace(/^\s+/, '');
    return true;
};

export const parseContentRobust = (content: string): Line[] => {
    if (!/<span[^>]*>/i.test(content)) {
        return processTextContent(content);
    }

    const tokens = tokenize(`<root>${content}</root>`);
    const result: Line[] = [];

    let titleDepth = 0;
    let currentTitle: Line | null = null;

    const pushText = (raw: string) => {
        if (!raw) {
            return;
        }

        if (titleDepth > 0 && currentTitle) {
            const cleaned = titleDepth === 1 ? raw.replace(/^\s+/, '') : raw;
            currentTitle.text += cleaned;
            return;
        }

        if (maybeAppendToPrevTitle(result, raw)) {
            return;
        }

        const text = raw.trim();
        if (text) {
            result.push(...processTextContent(text));
        }
    };

    for (const token of tokens) {
        if (token.type === 'text') {
            pushText(token.value);
        } else if (token.type === 'start' && token.name === 'span') {
            const dataType = token.attributes['data-type'];
            if (dataType === 'title') {
                if (titleDepth === 0) {
                    const id = token.attributes.id?.replace(/^toc-/, '') ?? '';
                    currentTitle = { id, text: '' };
                    result.push(currentTitle);
                }
                titleDepth += 1;
            }
        } else if (token.type === 'end' && token.name === 'span') {
            if (titleDepth > 0) {
                titleDepth -= 1;
                if (titleDepth === 0) {
                    currentTitle = null;
                }
            }
        }
    }

    const cleaned = result.map((line) => (line.id ? line : { ...line, text: line.text.trim() }));

    return mergeDanglingPunctuation(cleaned.map((line) => (line.id ? line : { ...line, text: line.text }))).filter(
        (line) => line.text.length > 0,
    );
};

const DEFAULT_COMPILED_RULES = Object.entries(DEFAULT_SANITIZATION_RULES).map(([pattern, replacement]) => ({
    regex: new RegExp(pattern, 'g'),
    replacement,
}));

/**
 * Compiles sanitization rules into RegExp objects for performance
 */
const getCompiledRules = (rules: Record<string, string>) => {
    if (rules === DEFAULT_SANITIZATION_RULES) {
        return DEFAULT_COMPILED_RULES;
    }

    const compiled = [];
    for (const pattern in rules) {
        compiled.push({
            regex: new RegExp(pattern, 'g'),
            replacement: rules[pattern],
        });
    }
    return compiled;
};

/**
 * Sanitizes page content by applying regex replacement rules
 * @param text - The text to sanitize
 * @param rules - Optional custom rules (defaults to DEFAULT_SANITIZATION_RULES)
 * @returns The sanitized text
 */
export const sanitizePageContent = (
    text: string,
    rules: Record<string, string> = DEFAULT_SANITIZATION_RULES,
): string => {
    const compiledRules = getCompiledRules(rules);

    let content = text;
    for (let i = 0; i < compiledRules.length; i++) {
        const { regex, replacement } = compiledRules[i];
        content = content.replace(regex, replacement);
    }
    return content;
};

export const splitPageBodyFromFooter = (content: string, footnoteMarker = '_________') => {
    let footnote = '';
    const indexOfFootnote = content.lastIndexOf(footnoteMarker);

    if (indexOfFootnote >= 0) {
        footnote = content.slice(indexOfFootnote + footnoteMarker.length);
        content = content.slice(0, indexOfFootnote);
    }

    return [content, footnote] as const;
};

export const removeArabicNumericPageMarkers = (text: string) => {
    return text.replace(/\s?⦗[\u0660-\u0669]+⦘\s?/, ' ');
};

export const removeTagsExceptSpan = (content: string) => {
    // Remove <a> tags and their content, keeping only the text inside
    content = content.replace(/<a[^>]*>(.*?)<\/a>/g, '$1');

    // Remove <hadeeth> tags (both self-closing, with content, and numbered)
    content = content.replace(/<hadeeth[^>]*>|<\/hadeeth>|<hadeeth-\d+>/g, '');

    return content;
};

// (remainder of the sourcemap: machine-generated base64-VLQ "mappings" data)
package/package.json CHANGED
@@ -2,19 +2,25 @@
  "author": "Ragaeeb Haq",
  "default": "./dist/index.js",
  "dependencies": {
- "unzipper": "^0.12.3"
+ "fflate": "^0.8.2",
+ "sql.js": "^1.13.0"
  },
  "description": "Library to interact with the Maktabah Shamela v4 APIs",
  "devDependencies": {
- "@biomejs/biome": "^2.2.4",
- "@types/bun": "^1.2.22",
- "@types/node": "^24.5.2",
- "@types/unzipper": "^0.10.11",
- "semantic-release": "^24.2.9",
- "tsup": "^8.5.0"
+ "@biomejs/biome": "^2.2.7",
+ "@types/node": "^24.9.1",
+ "@types/react": "^18.3.12",
+ "@types/react-dom": "^18.3.1",
+ "@types/sql.js": "^1.4.9",
+ "next": "^16.0.1-canary.0",
+ "react": "^18.3.1",
+ "react-dom": "^18.3.1",
+ "semantic-release": "^25.0.1",
+ "tsdown": "^0.15.9",
+ "typescript": "^5.9.3"
  },
  "engines": {
- "bun": ">=1.2.22",
+ "bun": ">=1.3.0",
  "node": ">=22.0.0"
  },
  "exports": {
@@ -26,6 +32,7 @@
  "files": [
  "dist/**"
  ],
+ "homepage": "https://github.com/ragaeeb/shamela",
  "keywords": [
  "shamela",
  "Arabic",
@@ -41,15 +48,17 @@
  "url": "git+https://github.com/ragaeeb/shamela.git"
  },
  "scripts": {
- "build": "tsup",
- "e2e": "bun test e2e --env-file .env",
- "e2e:ci": "bun test e2e",
+ "build": "tsdown",
+ "demo": "next dev demo",
+ "demo:build": "next build demo",
+ "demo:start": "next start demo",
+ "format": "biome format --write .",
  "lint": "biome check .",
- "test": "bun test src/ --coverage --coverage-reporter=lcov"
+ "test": "bun test src --coverage --coverage-reporter=lcov"
  },
  "sideEffects": false,
  "source": "src/index.ts",
  "type": "module",
  "types": "dist/index.d.ts",
- "version": "1.2.2"
+ "version": "1.3.0"
  }
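The dependency swap above (`unzipper` replaced by `fflate` plus `sql.js`) is what lets the same build run in both Node.js and browsers: both libraries operate on in-memory `Uint8Array`s rather than Node streams or a filesystem. Below is a rough sketch of that pattern, not the library's actual internals; the archive URL and the `page` table name are placeholders:

```typescript
import { unzipSync } from 'fflate';
import initSqlJs from 'sql.js';

// Fetch a zipped SQLite database as raw bytes (fetch exists in Node 18+ and browsers).
const res = await fetch('https://example.com/book.zip');
const archive = new Uint8Array(await res.arrayBuffer());

// fflate inflates entirely in memory: { [fileName]: Uint8Array }.
const entries = unzipSync(archive);
const dbBytes = Object.values(entries)[0];

// sql.js runs SQLite as WebAssembly; in a browser you may need to pass
// { locateFile } so it can find sql-wasm.wasm.
const SQL = await initSqlJs();
const db = new SQL.Database(dbBytes);
const [result] = db.exec('SELECT COUNT(*) FROM page');
console.log(result?.values[0][0]);
db.close();
```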