@bdkinc/knex-ibmi 0.5.9 → 0.5.11

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,453 +1,476 @@
1
- # @bdkinc/knex-ibmi
2
-
3
- [![npm version](http://img.shields.io/npm/v/@bdkinc/knex-ibmi.svg)](https://npmjs.org/package/@bdkinc/knex-ibmi)
4
-
5
- Knex.js dialect for DB2 on IBM i (via ODBC). Built for usage with the official IBM i Access ODBC driver and tested on IBM i.
6
-
7
- For IBM i OSS docs, see https://ibmi-oss-docs.readthedocs.io/. ODBC guidance: https://ibmi-oss-docs.readthedocs.io/en/latest/odbc/README.html.
8
-
9
- > Found an issue or have a question? Please open an issue.
10
-
11
- ## Features
12
-
13
- - Query building
14
- - Query execution
15
- - Transactions
16
- - Streaming
17
- - Multi-row insert strategies (auto | sequential | disabled)
18
- - Emulated returning for UPDATE and DELETE
19
-
20
- ## Requirements
21
-
22
- - Node.js >= 16
23
- - ODBC driver (IBM i Access ODBC Driver)
24
-
25
- ## Installation
26
-
27
- ```bash
28
- npm install @bdkinc/knex-ibmi knex odbc
29
- ```
30
-
31
- ## Quick Start
32
-
33
- ```js
34
- import knex from "knex";
35
- import { DB2Dialect } from "@bdkinc/knex-ibmi";
36
-
37
- /** @type {import("@bdkinc/knex-ibmi").DB2Config} */
38
- const config = {
39
- client: DB2Dialect,
40
- connection: {
41
- host: "your-ibm-i-host",
42
- database: "*LOCAL",
43
- user: "your-username",
44
- password: "your-password",
45
- driver: "IBM i Access ODBC Driver",
46
- connectionStringParams: { DBQ: "MYLIB" },
47
- },
48
- pool: { min: 2, max: 10 },
49
- };
50
-
51
- const db = knex(config);
52
-
53
- try {
54
- const results = await db.select("*").from("MYTABLE").where({ STATUS: "A" });
55
- console.log(results);
56
- } catch (error) {
57
- console.error("Database error:", error);
58
- } finally {
59
- await db.destroy();
60
- }
61
- ```
62
-
63
- ## Usage
64
-
65
- This package can be used with CommonJS, ESM, or TypeScript.
66
-
67
- ### CommonJS
68
-
69
- ```js
70
- const knex = require("knex");
71
- const { DB2Dialect } = require("@bdkinc/knex-ibmi");
72
-
73
- const db = knex({
74
- client: DB2Dialect,
75
- connection: {
76
- host: "your-ibm-i-host",
77
- database: "*LOCAL",
78
- user: "your-username",
79
- password: "your-password",
80
- driver: "IBM i Access ODBC Driver",
81
- connectionStringParams: {
82
- ALLOWPROCCALLS: 1,
83
- CMT: 0,
84
- DBQ: "MYLIB"
85
- },
86
- },
87
- pool: { min: 2, max: 10 },
88
- });
89
-
90
- // Example query
91
- db.select("*")
92
- .from("MYTABLE")
93
- .where({ STATUS: "A" })
94
- .then(results => console.log(results))
95
- .catch(error => console.error("Database error:", error))
96
- .finally(() => db.destroy());
97
- ```
98
-
99
- ### ESM
100
-
101
- ```js
102
- import knex from "knex";
103
- import { DB2Dialect } from "@bdkinc/knex-ibmi";
104
-
105
- /** @type {import("@bdkinc/knex-ibmi").DB2Config} */
106
- const config = {
107
- client: DB2Dialect,
108
- connection: {
109
- host: "your-ibm-i-host",
110
- database: "*LOCAL",
111
- user: "your-username",
112
- password: "your-password",
113
- driver: "IBM i Access ODBC Driver",
114
- connectionStringParams: {
115
- ALLOWPROCCALLS: 1,
116
- CMT: 0,
117
- DBQ: "MYLIB"
118
- },
119
- },
120
- pool: { min: 2, max: 10 },
121
- };
122
-
123
- const db = knex(config);
124
-
125
- try {
126
- const results = await db.select("*").from("MYTABLE").where({ STATUS: "A" });
127
- console.log(results);
128
- } catch (error) {
129
- console.error("Database error:", error);
130
- } finally {
131
- await db.destroy();
132
- }
133
- ```
134
-
135
- ### TypeScript
136
-
137
- ```ts
138
- import { knex } from "knex";
139
- import { DB2Dialect, DB2Config } from "@bdkinc/knex-ibmi";
140
-
141
- const config: DB2Config = {
142
- client: DB2Dialect,
143
- connection: {
144
- host: "your-ibm-i-host",
145
- database: "*LOCAL",
146
- user: "your-username",
147
- password: "your-password",
148
- driver: "IBM i Access ODBC Driver",
149
- connectionStringParams: {
150
- ALLOWPROCCALLS: 1,
151
- CMT: 0,
152
- DBQ: "MYLIB"
153
- },
154
- },
155
- pool: { min: 2, max: 10 },
156
- };
157
-
158
- const db = knex(config);
159
-
160
- try {
161
- const results = await db.select("*").from("MYTABLE").where({ STATUS: "A" });
162
- console.log(results);
163
- } catch (error) {
164
- console.error("Database error:", error);
165
- } finally {
166
- await db.destroy();
167
- }
168
- ```
169
-
170
- ### Streaming
171
-
172
- There are two primary ways to consume a result stream: (1) classic Node stream piping with transform stages, and (2) async iteration with `for await` (which can be easier to reason about). Use a `fetchSize` to control how many rows are fetched from the driver per batch.
173
-
174
- ```ts
175
- import { knex } from "knex";
176
- import { DB2Dialect, DB2Config } from "@bdkinc/knex-ibmi";
177
- import { Transform } from "node:stream";
178
- import { finished } from "node:stream/promises";
179
-
180
- const config: DB2Config = { /* ...same as earlier examples... */ };
181
- const db = knex(config);
182
-
183
- try {
184
- const stream = await db("LARGETABLE").select("*").stream({ fetchSize: 100 });
185
-
186
- // Approach 1: Pipe through a Transform stream
187
- const transform = new Transform({
188
- objectMode: true,
189
- transform(row, _enc, cb) {
190
- // Process each row (side effects, enrichment, filtering, etc.)
191
- console.log("Transforming row id=", row.ID);
192
- cb(null, row);
193
- },
194
- });
195
- stream.pipe(transform);
196
- await finished(stream); // Wait until piping completes
197
-
198
- // Approach 2: Async iteration (recommended for simplicity)
199
- const iterStream = await db("LARGETABLE").select("*").stream({ fetchSize: 200 });
200
- for await (const row of iterStream) {
201
- console.log("Iter row id=", row.ID);
202
- }
203
- } catch (error) {
204
- console.error("Streaming error:", error);
205
- } finally {
206
- await db.destroy();
207
- }
208
- ```
209
-
210
- ## ODBC Driver Setup
211
-
212
- If you don't know the name of your installed driver, check `odbcinst.ini`. Find its path with:
213
-
214
- ```bash
215
- odbcinst -j
216
- ```
217
-
218
- Example entries:
219
-
220
- ```
221
- [IBM i Access ODBC Driver] # driver name in square brackets
222
- Description=IBM i Access for Linux ODBC Driver
223
- Driver=/opt/ibm/iaccess/lib/libcwbodbc.so
224
- Setup=/opt/ibm/iaccess/lib/libcwbodbcs.so
225
- Driver64=/opt/ibm/iaccess/lib64/libcwbodbc.so
226
- Setup64=/opt/ibm/iaccess/lib64/libcwbodbcs.so
227
- Threading=0
228
- DontDLClose=1
229
- UsageCount=1
230
-
231
- [IBM i Access ODBC Driver 64-bit]
232
- Description=IBM i Access for Linux 64-bit ODBC Driver
233
- Driver=/opt/ibm/iaccess/lib64/libcwbodbc.so
234
- Setup=/opt/ibm/iaccess/lib64/libcwbodbcs.so
235
- Threading=0
236
- DontDLClose=1
237
- UsageCount=1
238
- ```
239
-
240
- If unixODBC is using the wrong config directory (e.g., your configs are in `/etc` but it expects elsewhere), set:
241
-
242
- ```bash
243
- export ODBCINI=/etc
244
- export ODBCSYSINI=/etc
245
- ```
246
-
247
- ## Bundling with Vite
248
-
249
- If you bundle with Vite, exclude certain native deps during optimize step:
250
-
251
- ```js
252
- // vite.config.js
253
- export default {
254
- optimizeDeps: {
255
- exclude: ["@mapbox"],
256
- },
257
- };
258
- ```
259
-
260
- ## Migrations
261
-
262
- ⚠️ **Important**: Standard Knex migrations don't work reliably with IBM i DB2 due to auto-commit DDL operations and locking issues.
263
-
264
- ### Recommended: Use Built-in IBM i Migration System
265
-
266
- The knex-ibmi library includes a custom migration system that bypasses Knex's problematic locking mechanism:
267
-
268
- ```js
269
- import { createIBMiMigrationRunner } from "@bdkinc/knex-ibmi";
270
-
271
- const migrationRunner = createIBMiMigrationRunner(db, {
272
- directory: "./migrations",
273
- tableName: "KNEX_MIGRATIONS",
274
- schemaName: "MYSCHEMA"
275
- });
276
-
277
- // Run migrations
278
- await migrationRunner.latest();
279
-
280
- // Rollback
281
- await migrationRunner.rollback();
282
-
283
- // Check status
284
- const pending = await migrationRunner.listPending();
285
- ```
286
-
287
- **CLI Usage:** The package includes a built-in CLI that can be used via npm scripts or npx:
288
-
289
- ```bash
290
- # Install globally (optional)
291
- npm install -g @bdkinc/knex-ibmi
292
-
293
- # Or use via npx (recommended)
294
- npx ibmi-migrations migrate:latest # Run pending migrations
295
- npx ibmi-migrations migrate:rollback # Rollback last batch
296
- npx ibmi-migrations migrate:status # Show migration status
297
- npx ibmi-migrations migrate:make create_users_table # Create new JS migration
298
- npx ibmi-migrations migrate:make add_email_column -x ts # Create new TS migration
299
-
300
- # Or add to your package.json scripts:
301
- {
302
- "scripts": {
303
- "migrate:latest": "ibmi-migrations migrate:latest",
304
- "migrate:rollback": "ibmi-migrations migrate:rollback",
305
- "migrate:status": "ibmi-migrations migrate:status",
306
- "migrate:make": "ibmi-migrations migrate:make"
307
- }
308
- }
309
-
310
- # Then run with npm:
311
- npm run migrate:latest
312
- npm run migrate:status
313
- ```
314
-
315
- **Full CLI API (similar to Knex):**
316
- ```bash
317
- ibmi-migrations migrate:latest # Run all pending migrations
318
- ibmi-migrations migrate:rollback # Rollback last migration batch
319
- ibmi-migrations migrate:status # Show detailed migration status
320
- ibmi-migrations migrate:currentVersion # Show current migration version
321
- ibmi-migrations migrate:list # List all migrations
322
- ibmi-migrations migrate:make <name> # Create new migration file
323
-
324
- # Options:
325
- ibmi-migrations migrate:status --env production
326
- ibmi-migrations migrate:latest --knexfile ./config/knexfile.js
327
- ibmi-migrations migrate:latest --knexfile ./knexfile.ts # Use TypeScript knexfile
328
- ibmi-migrations migrate:make create_users_table
329
- ibmi-migrations migrate:make add_email_column -x ts # TypeScript migration
330
- ```
331
-
332
- 📖 **See [MIGRATIONS.md](./MIGRATIONS.md) for complete documentation**
333
-
334
- ### Alternative: Standard Knex with Transactions Disabled
335
-
336
- If you must use standard Knex migrations, disable transactions to avoid issues:
337
-
338
- ```js
339
- /** @type {import("@bdkinc/knex-ibmi").DB2Config} */
340
- const config = {
341
- client: DB2Dialect,
342
- connection: { /* your connection config */ },
343
- migrations: {
344
- disableTransactions: true, // Required for IBM i
345
- directory: './migrations',
346
- tableName: 'knex_migrations',
347
- },
348
- };
349
- ```
350
-
351
- **Warning**: Standard Knex migrations may still hang on lock operations. The built-in IBM i migration system is strongly recommended.
352
-
353
- ## Multi-Row Insert Strategies
354
-
355
- Configure via `ibmi.multiRowInsert` in the knex config:
356
-
357
- ```ts
358
- const db = knex({
359
- client: DB2Dialect,
360
- connection: { /* ... */ },
361
- ibmi: { multiRowInsert: 'auto' } // 'auto' | 'sequential' | 'disabled'
362
- });
363
- ```
364
-
365
- - `auto` (default): Generates a single INSERT with multiple VALUES lists. For `.returning('*')` or no explicit column list it returns all inserted rows (lenient fallback). Identity values are whatever DB2 ODBC surfaces for that multi-row statement.
366
- - `sequential`: Compiler shows a single-row statement (first row) but at execution time each row is inserted individually inside a loop to reliably collect identity values (using `IDENTITY_VAL_LOCAL()` per row). Suitable when you need each generated identity.
367
- - `disabled`: Falls back to legacy behavior: only the first row is inserted (others ignored). Useful for strict backward compatibility.
368
-
369
- If you specify `.returning(['COL1', 'COL2'])` with multi-row inserts, those columns are selected; otherwise `IDENTITY_VAL_LOCAL()` (single-row) or `*` (multi-row) is used as a lenient fallback.
370
-
371
- ## Returning Behavior (INSERT / UPDATE / DELETE)
372
-
373
- Native `RETURNING` is not broadly supported over ODBC on IBM i. The dialect provides pragmatic emulation:
374
-
375
- ### INSERT
376
- - `auto` multi-row: generates a single multi-values INSERT. When no explicit column list is requested it returns all inserted rows (`*`) as a lenient fallback. Some installations may see this internally wrapped using a `SELECT * FROM FINAL TABLE( INSERT ... )` pattern in logs or debug output; that wrapper is only an implementation detail to surface inserted rows.
377
- - `sequential`: inserts each row one at a time so it can reliably call `IDENTITY_VAL_LOCAL()` after each insert; builds an array of returned rows.
378
- - `disabled`: legacy single-row insert behavior; additional rows in the values array are ignored.
379
-
380
- ### UPDATE
381
- - Executes the UPDATE.
382
- - Re-selects the affected rows using the original WHERE clause when `.returning(...)` is requested.
383
-
384
- ### DELETE
385
- - Selects the rows to be deleted (capturing requested returning columns or `*`).
386
- - Executes the DELETE.
387
- - Returns the previously selected rows.
388
-
389
- ### Notes
390
- - `returning('*')` can be expensive on large result sets—limit the column list when possible.
391
- - For guaranteed, ordered identity values across many inserted rows use the `sequential` strategy.
392
-
393
- ## Configuration Summary
394
-
395
- ```ts
396
- interface IbmiDialectConfig {
397
- multiRowInsert?: 'auto' | 'sequential' | 'disabled';
398
- sequentialInsertTransactional?: boolean; // if true, wraps sequential loop in BEGIN/COMMIT
399
- preparedStatementCache?: boolean; // Enable per-connection statement caching (default: false)
400
- preparedStatementCacheSize?: number; // Max cached statements per connection (default: 100)
401
- readUncommitted?: boolean; // Append WITH UR to SELECT queries (default: false)
402
- }
403
- ```
404
-
405
- Attach under the root knex config as `ibmi`.
406
-
407
- ### Performance Tuning
408
-
409
- #### Prepared Statement Caching (v0.5.0+)
410
-
411
- Enable optional prepared statement caching to reduce parse overhead for repeated queries:
412
-
413
- ```ts
414
- const db = knex({
415
- client: DB2Dialect,
416
- connection: { /* ... */ },
417
- ibmi: {
418
- preparedStatementCache: true, // Enable caching
419
- preparedStatementCacheSize: 100, // Max statements per connection
420
- }
421
- });
422
- ```
423
-
424
- When enabled, the dialect maintains a per-connection LRU cache of prepared statements. Statements are automatically closed when evicted or when the connection is destroyed.
425
-
426
- #### Read Uncommitted Isolation (v0.5.0+)
427
-
428
- For read-heavy workloads, enable uncommitted read isolation to improve concurrency:
429
-
430
- ```ts
431
- const db = knex({
432
- client: DB2Dialect,
433
- connection: { /* ... */ },
434
- ibmi: {
435
- readUncommitted: true // Appends WITH UR to all SELECT queries
436
- }
437
- });
438
- ```
439
-
440
- This appends `WITH UR` to all SELECT queries, allowing reads without waiting for locks. Only use this if your application can tolerate reading uncommitted data.
441
-
442
- ### Transactional Sequential Inserts
443
-
444
- When `ibmi.sequentialInsertTransactional` is `true`, the dialect will attempt `BEGIN` before the per-row loop and `COMMIT` after. On commit failure it will attempt a `ROLLBACK`. If `BEGIN` is not supported, it logs a warning and continues non-transactionally.
445
-
446
- <!-- Benchmarks section intentionally removed. Benchmarking is handled in the external test harness project -->
447
-
448
- ## Links
449
-
450
- - Knex: https://knexjs.org/
451
- - Knex repo: https://github.com/knex/knex
452
- - ODBC driver: https://github.com/IBM/node-odbc
453
- - IBM i OSS docs: https://ibmi-oss-docs.readthedocs.io/
1
+ # @bdkinc/knex-ibmi
2
+
3
+ [![npm version](https://img.shields.io/npm/v/@bdkinc/knex-ibmi.svg)](https://npmjs.org/package/@bdkinc/knex-ibmi)
4
+
5
+ Knex.js dialect for DB2 on IBM i (via ODBC). Built for use with the official IBM i Access ODBC Driver and tested on IBM i.
6
+
7
+ For IBM i OSS docs, see https://ibmi-oss-docs.readthedocs.io/. ODBC guidance: https://ibmi-oss-docs.readthedocs.io/en/latest/odbc/README.html.
8
+
9
+ > Found an issue or have a question? Please open an issue.
10
+
11
+ ## Features
12
+
13
+ - Query building
14
+ - Query execution
15
+ - Transactions
16
+ - Streaming
17
+ - Multi-row insert strategies (auto | sequential | disabled)
18
+ - Emulated returning for UPDATE and DELETE
19
+
20
+ ## Requirements
21
+
22
+ - Node.js >= 16
23
+ - ODBC driver (IBM i Access ODBC Driver)
24
+
25
+ ## Installation
26
+
27
+ ```bash
28
+ npm install @bdkinc/knex-ibmi knex odbc
29
+ ```
30
+
31
+ ## Quick Start
32
+
33
+ ```js
34
+ import knex from "knex";
35
+ import { DB2Dialect } from "@bdkinc/knex-ibmi";
36
+
37
+ /** @type {import("@bdkinc/knex-ibmi").DB2Config} */
38
+ const config = {
39
+ client: DB2Dialect,
40
+ connection: {
41
+ host: "your-ibm-i-host",
42
+ database: "*LOCAL",
43
+ user: "your-username",
44
+ password: "your-password",
45
+ driver: "IBM i Access ODBC Driver",
46
+ connectionStringParams: { DBQ: "MYLIB" },
47
+ },
48
+ pool: { min: 2, max: 10 },
49
+ };
50
+
51
+ const db = knex(config);
52
+
53
+ try {
54
+ const results = await db.select("*").from("MYTABLE").where({ STATUS: "A" });
55
+ console.log(results);
56
+ } catch (error) {
57
+ console.error("Database error:", error);
58
+ } finally {
59
+ await db.destroy();
60
+ }
61
+ ```
62
+
63
+ ## Usage
64
+
65
+ This package can be used with CommonJS, ESM, or TypeScript.
66
+
67
+ ### CommonJS
68
+
69
+ ```js
70
+ const knex = require("knex");
71
+ const { DB2Dialect } = require("@bdkinc/knex-ibmi");
72
+
73
+ const db = knex({
74
+ client: DB2Dialect,
75
+ connection: {
76
+ host: "your-ibm-i-host",
77
+ database: "*LOCAL",
78
+ user: "your-username",
79
+ password: "your-password",
80
+ driver: "IBM i Access ODBC Driver",
81
+ connectionStringParams: {
82
+ ALLOWPROCCALLS: 1,
83
+ CMT: 0,
84
+ DBQ: "MYLIB",
85
+ },
86
+ },
87
+ pool: { min: 2, max: 10 },
88
+ });
89
+
90
+ // Example query
91
+ db.select("*")
92
+ .from("MYTABLE")
93
+ .where({ STATUS: "A" })
94
+ .then((results) => console.log(results))
95
+ .catch((error) => console.error("Database error:", error))
96
+ .finally(() => db.destroy());
97
+ ```
98
+
99
+ ### ESM
100
+
101
+ ```js
102
+ import knex from "knex";
103
+ import { DB2Dialect } from "@bdkinc/knex-ibmi";
104
+
105
+ /** @type {import("@bdkinc/knex-ibmi").DB2Config} */
106
+ const config = {
107
+ client: DB2Dialect,
108
+ connection: {
109
+ host: "your-ibm-i-host",
110
+ database: "*LOCAL",
111
+ user: "your-username",
112
+ password: "your-password",
113
+ driver: "IBM i Access ODBC Driver",
114
+ connectionStringParams: {
115
+ ALLOWPROCCALLS: 1,
116
+ CMT: 0,
117
+ DBQ: "MYLIB",
118
+ },
119
+ },
120
+ pool: { min: 2, max: 10 },
121
+ };
122
+
123
+ const db = knex(config);
124
+
125
+ try {
126
+ const results = await db.select("*").from("MYTABLE").where({ STATUS: "A" });
127
+ console.log(results);
128
+ } catch (error) {
129
+ console.error("Database error:", error);
130
+ } finally {
131
+ await db.destroy();
132
+ }
133
+ ```
134
+
135
+ ### TypeScript
136
+
137
+ ```ts
138
+ import { knex } from "knex";
139
+ import { DB2Dialect, DB2Config } from "@bdkinc/knex-ibmi";
140
+
141
+ const config: DB2Config = {
142
+ client: DB2Dialect,
143
+ connection: {
144
+ host: "your-ibm-i-host",
145
+ database: "*LOCAL",
146
+ user: "your-username",
147
+ password: "your-password",
148
+ driver: "IBM i Access ODBC Driver",
149
+ connectionStringParams: {
150
+ ALLOWPROCCALLS: 1,
151
+ CMT: 0,
152
+ DBQ: "MYLIB",
153
+ },
154
+ },
155
+ pool: { min: 2, max: 10 },
156
+ };
157
+
158
+ const db = knex(config);
159
+
160
+ try {
161
+ const results = await db.select("*").from("MYTABLE").where({ STATUS: "A" });
162
+ console.log(results);
163
+ } catch (error) {
164
+ console.error("Database error:", error);
165
+ } finally {
166
+ await db.destroy();
167
+ }
168
+ ```
169
+
170
+ ### Streaming
171
+
172
+ There are two primary ways to consume a result stream: (1) classic Node stream piping with transform stages, and (2) async iteration with `for await` (which can be easier to reason about). Use a `fetchSize` to control how many rows are fetched from the driver per batch.
173
+
174
+ ```ts
175
+ import { knex } from "knex";
176
+ import { DB2Dialect, DB2Config } from "@bdkinc/knex-ibmi";
177
+ import { Transform } from "node:stream";
178
+ import { finished } from "node:stream/promises";
179
+
180
+ const config: DB2Config = {
181
+ /* ...same as earlier examples... */
182
+ };
183
+ const db = knex(config);
184
+
185
+ try {
186
+ const stream = await db("LARGETABLE").select("*").stream({ fetchSize: 100 });
187
+
188
+ // Approach 1: Pipe through a Transform stream
189
+ const transform = new Transform({
190
+ objectMode: true,
191
+ transform(row, _enc, cb) {
192
+ // Process each row (side effects, enrichment, filtering, etc.)
193
+ console.log("Transforming row id=", row.ID);
194
+ cb(null, row);
195
+ },
196
+ });
197
+ stream.pipe(transform);
198
+ await finished(stream); // Wait until piping completes
199
+
200
+ // Approach 2: Async iteration (recommended for simplicity)
201
+ const iterStream = await db("LARGETABLE")
202
+ .select("*")
203
+ .stream({ fetchSize: 200 });
204
+ for await (const row of iterStream) {
205
+ console.log("Iter row id=", row.ID);
206
+ }
207
+ } catch (error) {
208
+ console.error("Streaming error:", error);
209
+ } finally {
210
+ await db.destroy();
211
+ }
212
+ ```
213
+
214
+ ## ODBC Driver Setup
215
+
216
+ If you don't know the name of your installed driver, check `odbcinst.ini`. Find its path with:
217
+
218
+ ```bash
219
+ odbcinst -j
220
+ ```
221
+
222
+ Example entries:
223
+
224
+ ```
225
+ [IBM i Access ODBC Driver] # driver name in square brackets
226
+ Description=IBM i Access for Linux ODBC Driver
227
+ Driver=/opt/ibm/iaccess/lib/libcwbodbc.so
228
+ Setup=/opt/ibm/iaccess/lib/libcwbodbcs.so
229
+ Driver64=/opt/ibm/iaccess/lib64/libcwbodbc.so
230
+ Setup64=/opt/ibm/iaccess/lib64/libcwbodbcs.so
231
+ Threading=0
232
+ DontDLClose=1
233
+ UsageCount=1
234
+
235
+ [IBM i Access ODBC Driver 64-bit]
236
+ Description=IBM i Access for Linux 64-bit ODBC Driver
237
+ Driver=/opt/ibm/iaccess/lib64/libcwbodbc.so
238
+ Setup=/opt/ibm/iaccess/lib64/libcwbodbcs.so
239
+ Threading=0
240
+ DontDLClose=1
241
+ UsageCount=1
242
+ ```
243
+
244
+ If unixODBC is looking in the wrong configuration directory (e.g., your config files live in `/etc` but unixODBC expects them elsewhere), point it there:
245
+
246
+ ```bash
247
+ export ODBCINI=/etc
248
+ export ODBCSYSINI=/etc
249
+ ```
250
+
251
+ ## Bundling with Vite
252
+
253
+ If you bundle with Vite, exclude certain native dependencies during the dependency optimization step:
254
+
255
+ ```js
256
+ // vite.config.js
257
+ export default {
258
+ optimizeDeps: {
259
+ exclude: ["@mapbox"],
260
+ },
261
+ };
262
+ ```
263
+
264
+ ## Migrations
265
+
266
+ ⚠️ **Important**: Standard Knex migrations don't work reliably with IBM i DB2 due to auto-commit DDL operations and locking issues.
267
+
268
+ ### Recommended: Use Built-in IBM i Migration System
269
+
270
+ The knex-ibmi library includes a custom migration system that bypasses Knex's problematic locking mechanism:
271
+
272
+ ```js
273
+ import { createIBMiMigrationRunner } from "@bdkinc/knex-ibmi";
274
+
275
+ const migrationRunner = createIBMiMigrationRunner(db, {
276
+ directory: "./migrations",
277
+ tableName: "KNEX_MIGRATIONS",
278
+ schemaName: "MYSCHEMA",
279
+ });
280
+
281
+ // Run migrations
282
+ await migrationRunner.latest();
283
+
284
+ // Rollback
285
+ await migrationRunner.rollback();
286
+
287
+ // Check status
288
+ const pending = await migrationRunner.listPending();
289
+ ```
290
+
291
+ **CLI Usage:** The package includes a built-in CLI that can be used via npm scripts or npx:
292
+
293
+ ```bash
294
+ # Install globally (optional)
295
+ npm install -g @bdkinc/knex-ibmi
296
+
297
+ # Or use via npx (recommended)
298
+ npx ibmi-migrations migrate:latest # Run pending migrations
299
+ npx ibmi-migrations migrate:rollback # Rollback last batch
300
+ npx ibmi-migrations migrate:status # Show migration status
301
+ npx ibmi-migrations migrate:make create_users_table # Create new JS migration
302
+ npx ibmi-migrations migrate:make add_email_column -x ts # Create new TS migration
303
+
304
+ # Or add to your package.json scripts:
305
+ {
306
+ "scripts": {
307
+ "migrate:latest": "ibmi-migrations migrate:latest",
308
+ "migrate:rollback": "ibmi-migrations migrate:rollback",
309
+ "migrate:status": "ibmi-migrations migrate:status",
310
+ "migrate:make": "ibmi-migrations migrate:make"
311
+ }
312
+ }
313
+
314
+ # Then run with npm:
315
+ npm run migrate:latest
316
+ npm run migrate:status
317
+ ```
318
+
319
+ **Full CLI API (similar to Knex):**
320
+
321
+ ```bash
322
+ ibmi-migrations migrate:latest # Run all pending migrations
323
+ ibmi-migrations migrate:rollback # Rollback last migration batch
324
+ ibmi-migrations migrate:rollback --steps 2
325
+ ibmi-migrations migrate:status # Show detailed migration status
326
+ ibmi-migrations migrate:currentVersion # Show current migration version
327
+ ibmi-migrations migrate:list # List all migrations
328
+ ibmi-migrations migrate:make <name> # Create new migration file
329
+
330
+ # Options:
331
+ ibmi-migrations migrate:status --env production
332
+ ibmi-migrations migrate:latest --knexfile ./config/knexfile.js
333
+ ibmi-migrations migrate:latest --knexfile ./knexfile.ts # Use TypeScript knexfile
334
+ ibmi-migrations migrate:make create_users_table
335
+ ibmi-migrations migrate:make add_email_column -x ts # TypeScript migration
336
+ ```
337
+
338
+ TypeScript knexfiles/migrations require running the CLI with a TypeScript-capable runtime loader
339
+ (for example: `node --import tsx ./node_modules/.bin/ibmi-migrations ...`) or precompiling to JavaScript.
340
+
341
+ Migration discovery includes `.js`, `.ts`, `.mjs`, and `.cjs` files in the migration directory.
342
+
343
+ 📖 **See [MIGRATIONS.md](./MIGRATIONS.md) for complete documentation**
344
+
345
+ ### Alternative: Standard Knex with Transactions Disabled
346
+
347
+ If you must use standard Knex migrations, disable transactions to avoid issues:
348
+
349
+ ```js
350
+ /** @type {import("@bdkinc/knex-ibmi").DB2Config} */
351
+ const config = {
352
+ client: DB2Dialect,
353
+ connection: {
354
+ /* your connection config */
355
+ },
356
+ migrations: {
357
+ disableTransactions: true, // Required for IBM i
358
+ directory: "./migrations",
359
+ tableName: "knex_migrations",
360
+ },
361
+ };
362
+ ```
363
+
364
+ **Warning**: Standard Knex migrations may still hang on lock operations. The built-in IBM i migration system is strongly recommended.
365
+
366
+ ## Multi-Row Insert Strategies
367
+
368
+ Configure via `ibmi.multiRowInsert` in the knex config:
369
+
370
+ ```ts
371
+ const db = knex({
372
+ client: DB2Dialect,
373
+ connection: {
374
+ /* ... */
375
+ },
376
+ ibmi: { multiRowInsert: "auto" }, // 'auto' | 'sequential' | 'disabled'
377
+ });
378
+ ```
379
+
380
+ - `auto` (default): Generates a single INSERT with multiple VALUES lists. For `.returning('*')` or no explicit column list it returns all inserted rows (lenient fallback). Identity values are whatever DB2 ODBC surfaces for that multi-row statement.
381
+ - `sequential`: The compiled SQL shows a single-row statement (the first row), but at execution time each row is inserted individually in a loop so identity values can be collected reliably (using `IDENTITY_VAL_LOCAL()` per row). Suitable when you need each generated identity.
382
+ - `disabled`: Falls back to legacy behavior: only the first row is inserted (others ignored). Useful for strict backward compatibility.
383
+
384
+ If you specify `.returning(['COL1', 'COL2'])` with multi-row inserts, those columns are selected; otherwise `IDENTITY_VAL_LOCAL()` (single-row) or `*` (multi-row) is used as a lenient fallback.
385
+
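For illustration, the shape of the single statement the `auto` strategy emits can be sketched with a small helper (a hypothetical simplification, not the dialect's actual compiler code):

```javascript
// Hypothetical helper showing the SQL shape of an `auto` multi-row insert.
// Not the dialect's real compiler -- just the statement pattern it produces.
function multiRowInsertSql(table, rows) {
  const cols = Object.keys(rows[0]);
  const tuple = `(${cols.map(() => "?").join(", ")})`; // one placeholder per column
  const values = rows.map(() => tuple).join(", "); // one tuple per row
  return `insert into ${table} (${cols.join(", ")}) values ${values}`;
}

console.log(
  multiRowInsertSql("MYTABLE", [
    { NAME: "a", QTY: 1 },
    { NAME: "b", QTY: 2 },
  ]),
);
// insert into MYTABLE (NAME, QTY) values (?, ?), (?, ?)
```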
386
+ ## Returning Behavior (INSERT / UPDATE / DELETE)
387
+
388
+ Native `RETURNING` is not broadly supported over ODBC on IBM i. The dialect provides pragmatic emulation:
389
+
390
+ ### INSERT
391
+
392
+ - `auto` multi-row: generates a single multi-values INSERT. When no explicit column list is requested it returns all inserted rows (`*`) as a lenient fallback. Some installations may see this internally wrapped using a `SELECT * FROM FINAL TABLE( INSERT ... )` pattern in logs or debug output; that wrapper is only an implementation detail to surface inserted rows.
393
+ - `sequential`: inserts each row one at a time so it can reliably call `IDENTITY_VAL_LOCAL()` after each insert; builds an array of returned rows.
394
+ - `disabled`: legacy single-row insert behavior; additional rows in the values array are ignored.
395
+
396
+ ### UPDATE
397
+
398
+ - Executes the UPDATE.
399
+ - Re-selects the affected rows using the original WHERE clause when `.returning(...)` is requested.
400
+
401
+ ### DELETE
402
+
403
+ - Selects the rows to be deleted (capturing requested returning columns or `*`).
404
+ - Executes the DELETE.
405
+ - Returns the previously selected rows.
406
+
407
+ ### Notes
408
+
409
+ - `returning('*')` can be expensive on large result sets—limit the column list when possible.
410
+ - For guaranteed, ordered identity values across many inserted rows use the `sequential` strategy.
411
+
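The select-before-delete ordering described above can be pictured with an in-memory stand-in (this sketches only the sequencing, not the dialect's implementation):

```javascript
// In-memory stand-in for the emulated DELETE-with-returning sequencing.
function emulateDeleteReturning(rows, predicate) {
  // 1) Select (capture) the rows that are about to be deleted.
  const returned = rows.filter(predicate);
  // 2) Execute the delete.
  const remaining = rows.filter((row) => !predicate(row));
  // 3) Hand back the previously captured rows.
  return { returned, remaining };
}

const { returned, remaining } = emulateDeleteReturning(
  [{ ID: 1, STATUS: "A" }, { ID: 2, STATUS: "X" }],
  (row) => row.STATUS === "X",
);
console.log(returned); // [{ ID: 2, STATUS: "X" }]
console.log(remaining); // [{ ID: 1, STATUS: "A" }]
```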
412
+ ## Configuration Summary
413
+
414
+ ```ts
415
+ interface IbmiDialectConfig {
416
+ multiRowInsert?: "auto" | "sequential" | "disabled";
417
+ sequentialInsertTransactional?: boolean; // if true, wraps sequential loop in BEGIN/COMMIT
418
+ preparedStatementCache?: boolean; // Enable per-connection statement caching (default: false)
419
+ preparedStatementCacheSize?: number; // Max cached statements per connection (default: 100)
420
+ readUncommitted?: boolean; // Append WITH UR to SELECT queries (default: false)
421
+ }
422
+ ```
423
+
424
+ Attach this object to the root knex config under the `ibmi` key.
425
+
426
+ ### Performance Tuning
427
+
428
+ #### Prepared Statement Caching (v0.5.0+)
429
+
430
+ Enable optional prepared statement caching to reduce parse overhead for repeated queries:
431
+
432
+ ```ts
433
+ const db = knex({
434
+ client: DB2Dialect,
435
+ connection: {
436
+ /* ... */
437
+ },
438
+ ibmi: {
439
+ preparedStatementCache: true, // Enable caching
440
+ preparedStatementCacheSize: 100, // Max statements per connection
441
+ },
442
+ });
443
+ ```
444
+
445
+ When enabled, the dialect maintains a per-connection LRU cache of prepared statements. Statements are automatically closed when evicted or when the connection is destroyed.
446
+
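The caching behavior can be pictured as a small LRU map keyed by SQL text, where evicted entries are closed. This is a simplified sketch with hypothetical `prepare`/`close` stand-ins, not the dialect's actual class:

```javascript
// Simplified per-connection LRU cache of prepared statements.
// `prepare(sql)` and `stmt.close()` are hypothetical stand-ins for ODBC calls.
class StatementCache {
  constructor(maxSize, prepare) {
    this.maxSize = maxSize;
    this.prepare = prepare;
    this.cache = new Map(); // Map preserves insertion order: oldest entry first.
  }

  get(sql) {
    if (this.cache.has(sql)) {
      // Cache hit: refresh recency by re-inserting at the end.
      const stmt = this.cache.get(sql);
      this.cache.delete(sql);
      this.cache.set(sql, stmt);
      return stmt;
    }
    const stmt = this.prepare(sql);
    this.cache.set(sql, stmt);
    if (this.cache.size > this.maxSize) {
      // Evict the least recently used statement and close it.
      const [oldestSql, oldest] = this.cache.entries().next().value;
      this.cache.delete(oldestSql);
      oldest.close();
    }
    return stmt;
  }

  destroy() {
    // Close everything when the connection is torn down.
    for (const stmt of this.cache.values()) stmt.close();
    this.cache.clear();
  }
}
```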
447
+ #### Read Uncommitted Isolation (v0.5.0+)
448
+
449
+ For read-heavy workloads, enable uncommitted read isolation to improve concurrency:
450
+
451
+ ```ts
452
+ const db = knex({
453
+ client: DB2Dialect,
454
+ connection: {
455
+ /* ... */
456
+ },
457
+ ibmi: {
458
+ readUncommitted: true, // Appends WITH UR to all SELECT queries
459
+ },
460
+ });
461
+ ```
462
+
463
+ This appends `WITH UR` to all SELECT queries, allowing reads without waiting for locks. Only use this if your application can tolerate reading uncommitted data.
464
+
465
+ ### Transactional Sequential Inserts
466
+
467
+ When `ibmi.sequentialInsertTransactional` is `true`, the dialect will attempt `BEGIN` before the per-row loop and `COMMIT` after. On commit failure it will attempt a `ROLLBACK`. If `BEGIN` is not supported, it logs a warning and continues non-transactionally.
468
+
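The described control flow amounts to the following sketch, where `exec(sql)` and `insertOne(row)` are hypothetical stand-ins for the connection calls, not real APIs of this package:

```javascript
// Sketch of the transactional sequential-insert control flow described above.
// `exec(sql)` and `insertOne(row)` are hypothetical stand-ins.
async function sequentialInsertTransactional(exec, insertOne, rows) {
  let inTransaction = false;
  try {
    await exec("BEGIN");
    inTransaction = true;
  } catch {
    // BEGIN unsupported: warn and continue non-transactionally.
    console.warn("BEGIN not supported; continuing without a transaction");
  }
  const returned = [];
  try {
    for (const row of rows) {
      // e.g. collect IDENTITY_VAL_LOCAL() per row
      returned.push(await insertOne(row));
    }
    if (inTransaction) await exec("COMMIT");
  } catch (error) {
    // On failure (including commit failure), attempt a rollback.
    if (inTransaction) await exec("ROLLBACK");
    throw error;
  }
  return returned;
}
```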
469
+ <!-- Benchmarks section intentionally removed. Benchmarking is handled in the external test harness project -->
470
+
471
+ ## Links
472
+
473
+ - Knex: https://knexjs.org/
474
+ - Knex repo: https://github.com/knex/knex
475
+ - ODBC driver: https://github.com/IBM/node-odbc
476
+ - IBM i OSS docs: https://ibmi-oss-docs.readthedocs.io/