@clickhouse/client 1.18.4 → 1.18.5-head.360dc24.1

@@ -0,0 +1,149 @@
1
+ # Custom JSON `parse` / `stringify`
2
+
3
+ > **Requires:** client `>= 1.14.0` (configurable `json.parse` and
4
+ > `json.stringify`). Earlier versions cannot swap the JSON implementation.
5
+
6
+ Backing example:
7
+ [`examples/node/coding/custom_json_handling.ts`](https://github.com/ClickHouse/clickhouse-js/blob/main/examples/node/coding/custom_json_handling.ts).
8
+
9
+ ## Answer checklist
10
+
11
+ When the user wants `UInt64`/`Int64` values back as `BigInt`:
12
+
13
+ - State that configurable `json.parse` / `json.stringify` requires
14
+ `@clickhouse/client >= 1.14.0`.
15
+ - Show the supported `createClient({ json: { parse, stringify } })` option,
16
+ usually with `json-bigint` and `useNativeBigInt: true`.
17
+ - Combine it with `output_format_json_quote_64bit_integers: 0` so the server
18
+ emits unquoted 64-bit integers that the parser can turn into `BigInt`.
19
+ - Mention that `output_format_json_quote_64bit_integers: 0` is the default
20
+ since ClickHouse `25.8`, but setting it explicitly is useful for older
21
+ servers or portable examples.
22
+ - Warn that casting to JavaScript `Number` / `parseInt` / `parseFloat` loses
23
+ precision above `Number.MAX_SAFE_INTEGER`.
24
+
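The precision pitfall in the last bullet is easy to demonstrate without a server; the rounded `Number` and the exact `BigInt` below come from the same digits:

```typescript
// Number cannot represent every integer above 2^53 - 1 (Number.MAX_SAFE_INTEGER)
const lossy = Number('250000000000000200')
console.log(lossy) // 250000000000000192 — silently rounded to the nearest float

// BigInt preserves the exact value
const exact = BigInt('250000000000000200')
console.log(exact.toString()) // '250000000000000200'
```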
25
+ ## Why customize?
26
+
27
+ The default `JSON.stringify` / `JSON.parse`:
28
+
29
+ - Throws on `BigInt`.
30
+ - Calls `Date.prototype.toJSON()` (ISO string) — fine for `DateTime` with
31
+ `date_time_input_format: 'best_effort'`, surprising in some workflows.
32
+ - Loses precision for 64-bit integers returned as numbers (a separate
33
+ issue — covered in the troubleshooting skill).
34
+
35
+ A custom `{ parse, stringify }` lets you plug in `JSONBig`,
36
+ `safe-stable-stringify`, your own `BigInt`-aware serializer, etc.
37
+
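The first failure mode can be reproduced in plain Node, with no client involved:

```typescript
// Default JSON.stringify refuses BigInt values outright
let message = ''
try {
  JSON.stringify({ id: 10n })
} catch (err) {
  message = (err as TypeError).message
}
console.log(message) // "Do not know how to serialize a BigInt" (V8's wording)
```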
38
+ ## Recipe: BigInt-safe stringify, custom Date handling
39
+
40
+ ```ts
41
+ import { createClient } from '@clickhouse/client'
42
+
43
+ const valueSerializer = (value: unknown): unknown => {
44
+ // Serialize Date as a UNIX millis number (instead of toJSON's ISO string)
45
+ if (value instanceof Date) {
46
+ return value.getTime()
47
+ }
48
+
49
+ // Serialize BigInt as a string so JSON.stringify won't throw
50
+ if (typeof value === 'bigint') {
51
+ return value.toString()
52
+ }
53
+
54
+ if (Array.isArray(value)) {
55
+ return value.map(valueSerializer)
56
+ }
57
+
58
+ if (typeof value === 'object' && value !== null) {
59
+ return Object.fromEntries(
60
+ Object.entries(value).map(([k, v]) => [k, valueSerializer(v)]),
61
+ )
62
+ }
63
+
64
+ return value
65
+ }
66
+
67
+ const client = createClient({
68
+ json: {
69
+ parse: JSON.parse,
70
+ stringify: (obj: unknown) => JSON.stringify(valueSerializer(obj)),
71
+ },
72
+ })
73
+
74
+ await client.command({
75
+ query: `
76
+ CREATE OR REPLACE TABLE inserts_custom_json_handling
77
+ (id UInt64, dt DateTime64(3, 'UTC'))
78
+ ENGINE MergeTree
79
+ ORDER BY id
80
+ `,
81
+ })
82
+
83
+ await client.insert({
84
+ table: 'inserts_custom_json_handling',
85
+ format: 'JSONEachRow',
86
+ values: [
87
+ {
88
+ id: BigInt('250000000000000200'), // string arg avoids Number rounding; serialized as a string
89
+ dt: new Date(), // serialized as ms since epoch
90
+ },
91
+ ],
92
+ })
93
+
94
+ const rows = await client.query({
95
+ query: 'SELECT * FROM inserts_custom_json_handling',
96
+ format: 'JSONEachRow',
97
+ })
98
+ console.info(await rows.json())
99
+ await client.close()
100
+ ```
101
+
102
+ > The custom `valueSerializer` runs **before** `JSON.stringify`, so values
103
+ > are transformed before the standard hooks (`Date.prototype.toJSON`,
104
+ > object `toJSON()` methods, etc.) ever run.
105
+
106
+ ## Recipe: BigInt-safe parsing for 64-bit integer columns
107
+
108
+ If you want `UInt64`/`Int64` to come back as `BigInt`s (instead of strings
109
+ or precision-lossy numbers), plug in a `BigInt`-aware parser such as
110
+ [`json-bigint`](https://www.npmjs.com/package/json-bigint):
111
+
112
+ ```ts
113
+ import { createClient } from '@clickhouse/client'
114
+ import JSONBig from 'json-bigint'
115
+
116
+ const bigJson = JSONBig({ useNativeBigInt: true })
117
+
118
+ const client = createClient({
119
+ json: {
120
+ parse: bigJson.parse,
121
+ stringify: bigJson.stringify,
122
+ },
123
+ clickhouse_settings: {
124
+ output_format_json_quote_64bit_integers: 0,
125
+ },
126
+ })
127
+ ```
128
+
129
+ This applies to **both** outgoing JSON bodies and incoming JSON-format
130
+ responses. Combine with `output_format_json_quote_64bit_integers: 0` (the
131
+ default since CH 25.8) so the server emits unquoted 64-bit integers that
132
+ `json-bigint` can parse to `BigInt`.
133
+
134
+ ## Common pitfalls
135
+
136
+ - **Setting `json.parse` only.** That only affects reading JSON responses;
137
+ outgoing JSON bodies use `json.stringify`. If you want consistent custom
138
+ handling in both directions, generally provide a matching `stringify` too.
139
+ - **Forgetting `bigint` handling in `stringify`.** Default `JSON.stringify`
140
+ throws on `BigInt`; if your data ever contains one, the insert will fail
141
+ with `TypeError: Do not know how to serialize a BigInt`.
142
+ - **Targeting client `< 1.14.0`.** The `json` option doesn't exist; you'll
143
+ need to convert values manually before calling `insert()` / `query()` (or
144
+ upgrade).
145
+ - **Casting 64-bit integers to `Number`.** JavaScript's `number` type has
146
+ only 53 bits of mantissa — values above `Number.MAX_SAFE_INTEGER` (2^53 − 1)
147
+ are silently rounded. Do **not** try to fix precision loss by calling
148
+ `Number()`, `parseInt()`, or `parseFloat()` on the value. The correct fix
149
+ is a `BigInt`-aware parser (shown above), not a lossy cast.
@@ -0,0 +1,169 @@
1
+ # Modern Data Types: Dynamic, Variant, JSON, Time, Time64
2
+
3
+ > **Applies to** (server side):
4
+ >
5
+ > - `Variant`: ClickHouse `>= 24.1`.
6
+ > - `Dynamic`: ClickHouse `>= 24.5`.
7
+ > - New `JSON` (object) type: ClickHouse `>= 24.8`.
8
+ > - All three are **no longer experimental since `25.3`**; on older servers,
9
+ > you must enable the corresponding `allow_experimental_*_type` setting.
10
+ > - `Time` / `Time64`: ClickHouse `>= 25.6` and require
11
+ > `enable_time_time64_type: 1`.
12
+
13
+ Backing examples:
14
+ [`examples/node/coding/dynamic_variant_json.ts`](https://github.com/ClickHouse/clickhouse-js/blob/main/examples/node/coding/dynamic_variant_json.ts),
15
+ [`examples/node/coding/time_time64.ts`](https://github.com/ClickHouse/clickhouse-js/blob/main/examples/node/coding/time_time64.ts).
16
+
17
+ ## Answer checklist
18
+
19
+ When answering about storing and reading JSON objects:
20
+
21
+ - Use the new `JSON` column type, introduced in ClickHouse `>= 24.8`.
22
+ - Say `JSON` is no longer experimental since ClickHouse `25.3`; on older
23
+ supported versions, enable `allow_experimental_json_type`.
24
+ - Insert real JS objects with `format: 'JSONEachRow'`; do not
25
+ `JSON.stringify()` the column value.
26
+ - Read with a JSON output format such as `JSONEachRow` and `resultSet.json()`;
27
+ `JSON` column values come back as parsed JS objects.
28
+
29
+ ## `Dynamic`, `Variant(...)`, `JSON`
30
+
31
+ ```ts
32
+ import { createClient } from '@clickhouse/client'
33
+
34
+ const client = createClient({
35
+ // Required only on ClickHouse < 25.3 — harmless to leave on
36
+ clickhouse_settings: {
37
+ allow_experimental_variant_type: 1,
38
+ allow_experimental_dynamic_type: 1,
39
+ allow_experimental_json_type: 1,
40
+ },
41
+ })
42
+
43
+ await client.command({
44
+ query: `
45
+ CREATE OR REPLACE TABLE chjs_dynamic_variant_json
46
+ (
47
+ id UInt64,
48
+ var Variant(Int64, String),
49
+ dynamic Dynamic,
50
+ json JSON
51
+ )
52
+ ENGINE MergeTree
53
+ ORDER BY id
54
+ `,
55
+ })
56
+
57
+ await client.insert({
58
+ table: 'chjs_dynamic_variant_json',
59
+ format: 'JSONEachRow',
60
+ values: [
61
+ { id: 1, var: 42, dynamic: 'foo', json: { foo: 'x' } },
62
+ { id: 2, var: 'str', dynamic: 144, json: { bar: 10 } },
63
+ ],
64
+ })
65
+
66
+ const rs = await client.query({
67
+ query: `
68
+ SELECT *,
69
+ variantType(var),
70
+ dynamicType(dynamic),
71
+ dynamicType(json.foo),
72
+ dynamicType(json.bar)
73
+ FROM chjs_dynamic_variant_json
74
+ `,
75
+ format: 'JSONEachRow',
76
+ })
77
+ console.log(await rs.json())
78
+ ```
79
+
80
+ ### Notes
81
+
82
+ - The `JSON` column type accepts a real JS object on insert and returns one
83
+ on select — no need for `JSON.stringify` / `JSON.parse` in your app code.
84
+ - A JS number written into a `Dynamic` or `Variant` column defaults to
85
+ `Int64` on the server. In JSON formats, `output_format_json_quote_64bit_integers`
86
+ controls how 64-bit integers are returned: `1` returns them as JSON strings,
87
+ while `0` returns them as JSON numbers (and `0` is the default since CH `25.8`).
88
+ In JS, large 64-bit integers returned as numbers can lose precision, so use
89
+ quoted output if you need exact integer values in application code.
90
+ - Use `variantType(...)`, `dynamicType(...)` to introspect what the server
91
+ ended up storing.
92
+
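The 64-bit note above can be seen with nothing but `JSON.parse`, which the client uses by default; the server setting only controls whether the digits arrive quoted (the literals below are illustrative):

```typescript
// Quoted output (setting = 1): exact, but arrives as a string
const quoted = JSON.parse('{"n": "250000000000000200"}').n
console.log(typeof quoted) // 'string'

// Unquoted output (setting = 0): a JSON number, silently rounded above 2^53 - 1
const unquoted = JSON.parse('{"n": 250000000000000200}').n
console.log(unquoted) // 250000000000000192
```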
93
+ ## `Time` and `Time64(p)`
94
+
95
+ `Time` stores a signed second count (`-999:59:59` … `999:59:59`). `Time64(p)` adds
96
+ sub-second precision (`p` digits, up to `9` for nanoseconds). Both require
97
+ `enable_time_time64_type: 1` on `>= 25.6`.
98
+
99
+ ```ts
100
+ const client = createClient({
101
+ clickhouse_settings: { enable_time_time64_type: 1 },
102
+ })
103
+
104
+ await client.command({
105
+ query: `
106
+ CREATE OR REPLACE TABLE chjs_time_time64
107
+ (
108
+ id UInt64,
109
+ t Time,
110
+ t64_0 Time64(0),
111
+ t64_3 Time64(3),
112
+ t64_6 Time64(6),
113
+ t64_9 Time64(9),
114
+ )
115
+ ENGINE MergeTree
116
+ ORDER BY id
117
+ `,
118
+ })
119
+
120
+ await client.insert({
121
+ table: 'chjs_time_time64',
122
+ format: 'JSONEachRow',
123
+ values: [
124
+ {
125
+ id: 1,
126
+ t: '12:34:56',
127
+ t64_0: '12:34:56',
128
+ t64_3: '12:34:56.123',
129
+ t64_6: '12:34:56.123456',
130
+ t64_9: '12:34:56.123456789',
131
+ },
132
+ {
133
+ id: 2,
134
+ t: '999:59:59',
135
+ t64_0: '999:59:59',
136
+ t64_3: '999:59:59.999',
137
+ t64_6: '999:59:59.999999',
138
+ t64_9: '999:59:59.999999999',
139
+ },
140
+ {
141
+ id: 3,
142
+ t: '-999:59:59',
143
+ t64_0: '-999:59:59',
144
+ t64_3: '-999:59:59.999',
145
+ t64_6: '-999:59:59.999999',
146
+ t64_9: '-999:59:59.999999999',
147
+ },
148
+ ],
149
+ })
150
+ ```
151
+
152
+ ### Notes
153
+
154
+ - Pass values as **strings** in the `HH:MM:SS[.fraction]` format. Negatives
155
+ are supported; the magnitude can exceed 24 hours.
156
+ - For `Time64(p)` with `p > 3`, do not use JS `Date` — it tops out at
157
+ millisecond precision and will silently truncate.
158
+
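Since `Date` can't carry sub-millisecond precision, one option is to format the strings yourself; `toTime64String` below is a hypothetical helper (not part of the client), sketched for `Time64(6)`:

```typescript
// Hypothetical helper: format a signed microsecond count as a Time64(6) string
const toTime64String = (totalMicros: bigint): string => {
  const sign = totalMicros < 0n ? '-' : ''
  const abs = totalMicros < 0n ? -totalMicros : totalMicros
  const micros = abs % 1_000_000n
  const totalSeconds = abs / 1_000_000n
  const pad = (v: bigint, width: number) => v.toString().padStart(width, '0')
  const h = totalSeconds / 3600n        // hours may exceed two digits (up to 999)
  const m = (totalSeconds / 60n) % 60n
  const s = totalSeconds % 60n
  return `${sign}${pad(h, 2)}:${pad(m, 2)}:${pad(s, 2)}.${pad(micros, 6)}`
}

console.log(toTime64String(45_296_123_456n)) // '12:34:56.123456'
console.log(toTime64String(-45_296_123_456n)) // '-12:34:56.123456'
```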
159
+ ## Common pitfalls
160
+
161
+ - **Targeting old ClickHouse servers without the `allow_experimental_*`
162
+ setting.** On `< 25.3`, `CREATE TABLE` will fail without them.
163
+ - **Expecting `JSON`-column reads to be raw strings.** They come back as
164
+ parsed objects in JSON formats.
165
+ - **Inserting `Time64(9)` from JS `Date` and losing precision.** Use a
166
+ string instead.
167
+ - **Reading a `Variant`/`Dynamic` value of type `Int64` and being surprised
168
+ it's a string.** That's the standard 64-bit-integers-in-JSON behavior;
169
+ see the troubleshooting skill if you need to change it.
@@ -0,0 +1,113 @@
1
+ # Insert into Specific Columns / Other Databases
2
+
3
+ > **Applies to:** all versions. The `columns` option (both forms) and the
4
+ > `database` config field are universally supported.
5
+
6
+ Backing examples:
7
+ [`examples/node/coding/insert_specific_columns.ts`](https://github.com/ClickHouse/clickhouse-js/blob/main/examples/node/coding/insert_specific_columns.ts),
8
+ [`examples/node/coding/insert_exclude_columns.ts`](https://github.com/ClickHouse/clickhouse-js/blob/main/examples/node/coding/insert_exclude_columns.ts),
9
+ [`examples/node/coding/insert_ephemeral_columns.ts`](https://github.com/ClickHouse/clickhouse-js/blob/main/examples/node/coding/insert_ephemeral_columns.ts),
10
+ [`examples/node/coding/insert_into_different_db.ts`](https://github.com/ClickHouse/clickhouse-js/blob/main/examples/node/coding/insert_into_different_db.ts).
11
+
12
+ ## Answer checklist
13
+
14
+ When explaining partial-column inserts:
15
+
16
+ - Show `columns: ['col_a', 'col_b']` for the allowlist form.
17
+ - Also mention the inverse `columns: { except: ['col_to_skip'] }` form so the
18
+ user knows both supported shapes.
19
+ - Explain that omitted columns receive their server-side defaults
20
+ (`DEFAULT`, `MATERIALIZED`, `ALIAS`, nullable/type defaults) and inserts can
21
+ still fail or produce surprising zero/empty values if the table definition
22
+ has no appropriate defaults.
23
+
24
+ ## Insert into specific columns
25
+
26
+ Pass `columns: string[]` to limit the `INSERT` to a subset. Omitted columns
27
+ get their declared default.
28
+
29
+ ```ts
30
+ await client.insert({
31
+ table: 'events',
32
+ format: 'JSONEachRow',
33
+ values: [{ message: 'foo' }],
34
+ columns: ['message'], // `id` will get its default (0 for UInt32)
35
+ })
36
+ ```
37
+
38
+ ## Insert excluding columns
39
+
40
+ Use `columns: { except: string[] }` for the inverse. Useful when most columns
41
+ should default but you want to name only the few to skip.
42
+
43
+ ```ts
44
+ await client.insert({
45
+ table: 'events',
46
+ format: 'JSONEachRow',
47
+ values: [{ message: 'bar' }],
48
+ columns: { except: ['id'] },
49
+ })
50
+ ```
51
+
52
+ ## Tables with EPHEMERAL columns
53
+
54
+ [Ephemeral columns](https://clickhouse.com/docs/en/sql-reference/statements/create/table#ephemeral)
55
+ are not stored — they only exist to drive `DEFAULT` expressions of other
56
+ columns. To trigger that default logic, **the ephemeral column must be in the
57
+ `columns` list**, even though no value will be persisted for it.
58
+
59
+ ```ts
60
+ await client.command({
61
+ query: `
62
+ CREATE OR REPLACE TABLE events
63
+ (
64
+ id UInt64,
65
+ message String DEFAULT message_default,
66
+ message_default String EPHEMERAL
67
+ )
68
+ ENGINE MergeTree
69
+ ORDER BY id
70
+ `,
71
+ })
72
+
73
+ await client.insert({
74
+ table: 'events',
75
+ format: 'JSONEachRow',
76
+ values: [
77
+ { id: '42', message_default: 'foo' },
78
+ { id: '144', message_default: 'bar' },
79
+ ],
80
+ // Including the ephemeral column name triggers the DEFAULT expression
81
+ columns: ['id', 'message_default'],
82
+ })
83
+ ```
84
+
85
+ ## Insert into a different database
86
+
87
+ If the client's default `database` is not the target, qualify the table name
88
+ with `db.table`:
89
+
90
+ ```ts
91
+ const client = createClient({ database: 'system' })
92
+
93
+ await client.command({ query: 'CREATE DATABASE IF NOT EXISTS analytics' })
94
+
95
+ await client.insert({
96
+ table: 'analytics.events', // fully qualified
97
+ format: 'JSONEachRow',
98
+ values: [{ id: 42, message: 'foo' }],
99
+ })
100
+ ```
101
+
102
+ There is no per-call `database` override on `insert()` / `query()` — qualify
103
+ the identifier, or create a second client with the desired `database`.
104
+
105
+ ## Common pitfalls
106
+
107
+ - **Forgetting the ephemeral column in `columns`.** If you list only the
108
+ non-ephemeral columns, the `DEFAULT` expression that depends on the
109
+ ephemeral value won't fire and you'll get empty/zero defaults instead.
110
+ - **Hoping `client.insert({ database: '…' })` works.** It doesn't — qualify
111
+ the `table` instead.
112
+ - **Mixing the two `columns` forms.** Use either `string[]` _or_
113
+ `{ except: string[] }`, not both.
@@ -0,0 +1,145 @@
1
+ # Insert Data Formats
2
+
3
+ > **Applies to:** all versions. The `JSON` type column / new JSON family is a
4
+ > ClickHouse feature; the JSON _formats_ listed here are universally supported
5
+ > by the client.
6
+
7
+ Backing examples:
8
+ [`examples/node/coding/array_json_each_row.ts`](https://github.com/ClickHouse/clickhouse-js/blob/main/examples/node/coding/array_json_each_row.ts),
9
+ [`examples/node/coding/insert_data_formats_overview.ts`](https://github.com/ClickHouse/clickhouse-js/blob/main/examples/node/coding/insert_data_formats_overview.ts).
10
+
11
+ > **Raw / binary formats (CSV, TSV, CustomSeparated, Parquet) require a Node
12
+ > stream as input.** See
13
+ > [`examples/node/performance/`](https://github.com/ClickHouse/clickhouse-js/tree/main/examples/node/performance)
14
+ > — defer if the user wants to insert from a file or `Readable`.
15
+
16
+ ## Answer checklist
17
+
18
+ When answering "what format/call should I use for an array of JS objects?":
19
+
20
+ - Use `client.insert({ table, values, format: 'JSONEachRow' })`.
21
+ - Say the array of plain objects can be passed directly as `values` for
22
+ ordinary in-memory batches, on the order of a few thousand to tens of
24
+ thousands of rows.
25
+ is already a stream/file or the task is explicitly about throughput.
26
+ - Warn not to wrap `JSONEachRow` rows in a `{ data: [...] }` envelope; that
27
+ shape belongs to single-document formats.
28
+ - Mention `JSONCompactEachRow*` as a denser alternative for larger payloads
29
+ when the caller can provide positional arrays or explicit names/types.
30
+
31
+ ## Default choice: `JSONEachRow` with an array of objects
32
+
33
+ This is the right answer for ~90% of inserts.
34
+
35
+ ```ts
36
+ import { createClient } from '@clickhouse/client'
37
+
38
+ const client = createClient()
39
+
40
+ await client.insert({
41
+ table: 'events',
42
+ format: 'JSONEachRow',
43
+ values: [
44
+ { id: 42, name: 'foo' },
45
+ { id: 43, name: 'bar' },
46
+ ],
47
+ })
48
+
49
+ await client.close()
50
+ ```
51
+
52
+ The shape of `values` must match the chosen format.
53
+
54
+ ## Streamable JSON formats (pass an array)
55
+
56
+ | Format | `values` shape |
57
+ | -------------------------------------------- | --------------------------------------------------- |
58
+ | `JSONEachRow` | `Array<{ col: value, ... }>` |
59
+ | `JSONStringsEachRow` | `Array<{ col: stringifiedValue, ... }>` |
60
+ | `JSONCompactEachRow` | `Array<[v1, v2, ...]>` |
61
+ | `JSONCompactStringsEachRow` | `Array<[stringV1, stringV2, ...]>` |
62
+ | `JSONCompactEachRowWithNames` | First row = column names, then data rows |
63
+ | `JSONCompactEachRowWithNamesAndTypes` | Row 1 = names, row 2 = types, then data |
64
+ | `JSONCompactStringsEachRowWithNames` | First row = names, then stringified data rows |
65
+ | `JSONCompactStringsEachRowWithNamesAndTypes` | Row 1 = names, row 2 = types, then stringified data |
66
+
67
+ ```ts
68
+ await client.insert({
69
+ table: 'events',
70
+ format: 'JSONCompactEachRowWithNamesAndTypes',
71
+ values: [
72
+ ['id', 'name', 'sku'],
73
+ ['UInt32', 'String', 'Array(UInt32)'],
74
+ [11, 'foo', [1, 2, 3]],
75
+ [12, 'bar', [4, 5, 6]],
76
+ ],
77
+ })
78
+ ```
79
+
80
+ These formats can be **streamed** — pass a Node stream of rows instead of an
81
+ array. See
82
+ [`examples/node/performance/`](https://github.com/ClickHouse/clickhouse-js/tree/main/examples/node/performance)
83
+ for streaming guidance.
84
+
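As a minimal sketch of the stream shape (assuming the Node.js client; the `client.insert` call is shown commented out since it needs a running server), `Readable.from` over row objects is enough — the rows never have to exist as one array:

```typescript
import { Readable } from 'node:stream'

// An object-mode stream of rows; each chunk is one JSONEachRow row
function* generateRows() {
  for (let id = 0; id < 3; id++) {
    yield { id, name: `row-${id}` }
  }
}
const stream = Readable.from(generateRows())

// Sketch: pass the stream as `values` instead of an array
// await client.insert({ table: 'events', format: 'JSONEachRow', values: stream })
console.log(stream.readableObjectMode) // true
```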
85
+ ## Single-document JSON formats (pass an object)
86
+
87
+ These cannot be streamed — the entire body is sent in one shot.
88
+
89
+ | Format | `values` shape (typed via `InputJSON<T>` / `InputJSONObjectEachRow<T>`) |
90
+ | ------------------------- | ------------------------------------------------------------------------------------------------------------------------- |
91
+ | `JSON` | `{ meta: [], data: Array<{ col: value, ... }> }` — for TypeScript/client usage, pass `meta: []` if metadata is not needed |
92
+ | `JSONCompact` | `{ meta: [{ name, type }, ...], data: Array<[v1, v2, ...]> }` |
93
+ | `JSONColumnsWithMetadata` | `{ meta: [...], data: { col1: [v, ...], col2: [v, ...] } }` |
94
+ | `JSONObjectEachRow` | `Record<string, { col: value, ... }>` (the record key labels each row but is not stored) |
95
+
96
+ ```ts
97
+ import type { InputJSON, InputJSONObjectEachRow } from '@clickhouse/client'
98
+
99
+ const meta: InputJSON['meta'] = [
100
+ { name: 'id', type: 'UInt32' },
101
+ { name: 'name', type: 'String' },
102
+ ]
103
+
104
+ await client.insert({
105
+ table: 'events',
106
+ format: 'JSONCompact',
107
+ values: {
108
+ meta,
109
+ data: [
110
+ [19, 'foo'],
111
+ [20, 'bar'],
112
+ ],
113
+ },
114
+ })
115
+
116
+ await client.insert({
117
+ table: 'events',
118
+ format: 'JSONObjectEachRow',
119
+ values: {
120
+ row_1: { id: 23, name: 'foo' },
121
+ row_2: { id: 24, name: 'bar' },
122
+ } satisfies InputJSONObjectEachRow<{ id: number; name: string }>,
123
+ })
124
+ ```
125
+
126
+ ## Quick chooser
127
+
128
+ | Use case | Format |
129
+ | -------------------------------------------- | ------------------------------------------------- |
130
+ | Insert plain JS objects | `JSONEachRow` _(default)_ |
131
+ | Insert tuples / column-positional rows | `JSONCompactEachRow` |
132
+ | Insert with explicit column ordering / types | `JSONCompactEachRow*WithNames…` |
133
+ | Insert a single document with metadata | `JSON`, `JSONCompact` |
134
+ | Insert from a CSV / TSV / Parquet file | Raw format + Node stream → `examples/node/performance/` |
135
+
136
+ ## Common pitfalls
137
+
138
+ - **Wrong shape for the format.** The most common cause of insert failures —
139
+ e.g., passing `Array<{...}>` to `JSONCompact` (which expects
140
+ `{ meta, data }`).
141
+ - **Don't wrap a `JSONEachRow` array in a `{ data: [...] }` envelope.** That
142
+ envelope only belongs to single-document formats (`JSON` / `JSONCompact` /
143
+ `JSONColumnsWithMetadata`).
144
+ - For type guidance (`Decimal` strings, `Date` objects, `BigInt`), see
145
+ `insert-values.md` and `custom-json.md`.