@noy-db/as-csv 0.1.0-pre.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 vLannaAi
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,111 @@
+ # @noy-db/as-csv
+
+ CSV plaintext export for noy-db — decrypts records from a single collection and formats them as comma-separated values with RFC 4180 escaping. Part of the `@noy-db/as-*` portable-artefact family (plaintext tier).
+
+ The **reference implementation** of the plaintext-tier shape — every other record formatter in the family (`as-json`, `as-xml`, `as-sql`, …) follows the same 3-entry-point + authorization-gate structure.
+
+ ## Install
+
+ ```bash
+ pnpm add @noy-db/as-csv
+ ```
+
+ ## Authorization (RFC #249)
+
+ Every call checks `vault.assertCanExport('plaintext', 'csv')` **before decrypting anything**. The caller's keyring must have been granted the `'csv'` format (or `'*'` wildcard) via `vault.grant({ exportCapability: { plaintext: ['csv'] } })`. Otherwise → `ExportCapabilityError`.
+
+ **Default policy:** every role (including `owner`) requires an explicit plaintext grant. Installing this package does not unlock anything; the owner's grant does. See [`docs/packages-exports.md#authorization-model`](../../docs/packages-exports.md#authorization-model) for the full policy.
+
+ ## Usage
+
+ ### `toString(vault, options)` — returns CSV string
+
+ ```ts
+ import { toString } from '@noy-db/as-csv'
+
+ const csv = await toString(vault, { collection: 'invoices' })
+ // id,client,amount,status\n
+ // inv-1,Globex,1500,paid\n
+ // inv-2,"Acme, Inc.",2400,draft\n
+ // inv-3,"Stark ""Industries""",999,overdue
+ ```
+
+ ### `download(vault, options)` — browser download (Tier 2)
+
+ ```ts
+ import { download } from '@noy-db/as-csv'
+
+ await download(vault, {
+   collection: 'invoices',
+   filename: 'invoices-2026-03.csv', // optional; defaults to '<collection>.csv'
+ })
+ ```
+
+ Wraps the CSV in a `Blob`, creates an object URL, and clicks a hidden anchor. The plaintext lives only in RAM and the end-user's Downloads folder; nothing is written to your server.
+
+ ### `write(vault, path, options)` — Node file-write (Tier 3)
+
+ ```ts
+ import { write } from '@noy-db/as-csv'
+
+ await write(vault, '/tmp/invoices.csv', {
+   collection: 'invoices',
+   acknowledgeRisks: true, // required — see below
+ })
+ ```
+
+ `acknowledgeRisks: true` is **required** at runtime (in addition to being a type requirement). It signals that the consumer has considered that the CSV will persist on disk past the current process — weigh retention, access control, and secondary-exposure risk before using it.
+
+ ### Options
+
+ ```ts
+ interface AsCSVOptions {
+   collection: string          // required
+   columns?: readonly string[] // optional — default: infer from record keys
+   eol?: '\n' | '\r\n'         // optional — default: '\n' (LF)
+ }
+ ```
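When `columns` is omitted, the header is inferred from the union of record keys in first-record-wins order. A standalone copy of that inference rule (mirroring the package's internal helper; reproduced here only so the ordering can be tried directly):

```typescript
// First-record-wins column inference: keys are appended in the order they
// are first encountered across records, so sparse fields land at the end.
function inferColumns(records: readonly unknown[]): string[] {
  const columns: string[] = []
  const seen = new Set<string>()
  for (const r of records) {
    if (r && typeof r === 'object') {
      for (const key of Object.keys(r)) {
        if (!seen.has(key)) {
          seen.add(key)
          columns.push(key)
        }
      }
    }
  }
  return columns
}

const cols = inferColumns([
  { id: 'inv-1', client: 'Globex' },
  { id: 'inv-2', amount: 2400 }, // sparse record contributes 'amount' last
])
console.log(cols.join(',')) // id,client,amount
```

Pass `columns` explicitly when you need a deterministic header regardless of record order.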
+
+ ## RFC 4180 escaping
+
+ - Strings containing `,` `"` `\r` or `\n` are wrapped in double quotes.
+ - Embedded double quotes are doubled: `She said "hi"` → `"She said ""hi"""`.
+ - `null` / `undefined` → empty field.
+ - `number` / `boolean` → stringified.
+ - `Date` → ISO 8601 string.
+ - Objects / arrays → `JSON.stringify()` (then escaped if needed).
+
+ Dates render as ISO strings rather than locale-formatted — spreadsheet consumers can re-parse if needed. For locale-formatted dates with the Thai BE calendar, pipe through `@noy-db/locale-th` before exporting.
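The escaping rules above, as a standalone sketch (mirrors the package's internal field escaper, copied out so the rules can be exercised directly):

```typescript
// RFC 4180 field escaping: quote only when the field contains a comma,
// double quote, CR, or LF; double any embedded quotes.
function escapeField(value: unknown): string {
  if (value === null || value === undefined) return ''
  if (typeof value === 'number' || typeof value === 'boolean') return String(value)
  if (value instanceof Date) return value.toISOString()
  const s = typeof value === 'string' ? value : JSON.stringify(value)
  return /[",\r\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s
}

console.log(escapeField('Acme, Inc.'))    // "Acme, Inc."
console.log(escapeField('She said "hi"')) // "She said ""hi"""
console.log(escapeField(1500))            // 1500
console.log(escapeField(null))            // (empty field)
```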
+
+ ## Audit
+
+ This package does NOT write an audit-ledger entry yet — that lands with the full vault-level gated wrappers in a #249 follow-up. For now, applications using `@noy-db/as-csv` should write their own `type: 'as-export'` ledger entry after each call:
+
+ ```ts
+ await vault.collection<AsExportEntry>('_ledger_custom').put(ulid(), {
+   type: 'as-export',
+   encrypted: false,
+   package: '@noy-db/as-csv',
+   collection: options.collection,
+   recordCount: records.length,
+   actor: currentUserId,
+   timestamp: new Date().toISOString(),
+ })
+ ```
+
+ ## Related packages
+
+ - `@noy-db/as-json` — structured JSON, schema-aware
+ - `@noy-db/as-ndjson` — newline-delimited JSON, streaming-friendly
+ - `@noy-db/as-xml` — XML for legacy systems
+ - `@noy-db/as-xlsx` — multi-sheet Excel with dictionary-label expansion
+ - `@noy-db/as-sql` — SQL dump for migration
+ - `@noy-db/as-blob` — single-attachment extraction (binary, not structured data)
+ - `@noy-db/as-zip` — composite records + attached blobs
+ - `@noy-db/as-noydb` — encrypted-tier whole-vault bundle
+
+ All share the same authorization model; see [`docs/packages-exports.md#authorization-model`](../../docs/packages-exports.md#authorization-model).
+
+ ## License
+
+ MIT
package/dist/index.cjs ADDED
@@ -0,0 +1,241 @@
+ "use strict";
+ var __create = Object.create;
+ var __defProp = Object.defineProperty;
+ var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
+ var __getOwnPropNames = Object.getOwnPropertyNames;
+ var __getProtoOf = Object.getPrototypeOf;
+ var __hasOwnProp = Object.prototype.hasOwnProperty;
+ var __export = (target, all) => {
+   for (var name in all)
+     __defProp(target, name, { get: all[name], enumerable: true });
+ };
+ var __copyProps = (to, from, except, desc) => {
+   if (from && typeof from === "object" || typeof from === "function") {
+     for (let key of __getOwnPropNames(from))
+       if (!__hasOwnProp.call(to, key) && key !== except)
+         __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
+   }
+   return to;
+ };
+ var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
+   // If the importer is in node compatibility mode or this is not an ESM
+   // file that has been converted to a CommonJS file using a Babel-
+   // compatible transform (i.e. "__esModule" has not been set), then set
+   // "default" to the CommonJS "module.exports" for node compatibility.
+   isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
+   mod
+ ));
+ var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);
+
+ // src/index.ts
+ var index_exports = {};
+ __export(index_exports, {
+   download: () => download,
+   fromString: () => fromString,
+   toString: () => toString,
+   write: () => write
+ });
+ module.exports = __toCommonJS(index_exports);
+ var import_hub = require("@noy-db/hub");
+ async function toString(vault, options) {
+   vault.assertCanExport("plaintext", "csv");
+   const eol = options.eol ?? "\n";
+   const collection = options.collection;
+   const records = [];
+   for await (const chunk of vault.exportStream({ granularity: "collection" })) {
+     if (chunk.collection === collection) {
+       records.push(...chunk.records);
+       break;
+     }
+   }
+   const columns = options.columns ?? inferColumns(records);
+   if (columns.length === 0) {
+     return "";
+   }
+   const lines = [columns.map(escapeField).join(",")];
+   for (const record of records) {
+     const row = columns.map((c) => escapeField(record[c]));
+     lines.push(row.join(","));
+   }
+   return lines.join(eol);
+ }
+ async function download(vault, options) {
+   const csv = await toString(vault, options);
+   const filename = options.filename ?? `${options.collection}.csv`;
+   const blob = new Blob([csv], { type: "text/csv;charset=utf-8" });
+   const url = URL.createObjectURL(blob);
+   const a = document.createElement("a");
+   a.href = url;
+   a.download = filename;
+   a.click();
+   URL.revokeObjectURL(url);
+ }
+ async function write(vault, path, options) {
+   if (options.acknowledgeRisks !== true) {
+     throw new Error(
+       `as-csv.write: acknowledgeRisks: true is required for on-disk plaintext output. This call creates a persistent plaintext copy of your data outside noy-db's encrypted storage \u2014 see docs/patterns/as-exports.md \xA7"The three tiers of \\"plaintext out\\""`
+     );
+   }
+   const csv = await toString(vault, options);
+   const { writeFile } = await import("fs/promises");
+   await writeFile(path, csv, "utf-8");
+ }
+ function escapeField(value) {
+   if (value === null || value === void 0) return "";
+   if (typeof value === "number" || typeof value === "boolean") return String(value);
+   if (value instanceof Date) return value.toISOString();
+   const s = typeof value === "string" ? value : JSON.stringify(value);
+   if (/[",\r\n]/.test(s)) {
+     return `"${s.replace(/"/g, '""')}"`;
+   }
+   return s;
+ }
+ function inferColumns(records) {
+   const columns = [];
+   const seen = /* @__PURE__ */ new Set();
+   for (const r of records) {
+     if (r && typeof r === "object") {
+       for (const key of Object.keys(r)) {
+         if (!seen.has(key)) {
+           seen.add(key);
+           columns.push(key);
+         }
+       }
+     }
+   }
+   return columns;
+ }
+ async function fromString(vault, csv, options) {
+   vault.assertCanImport("plaintext", "csv");
+   const policy = options.policy ?? "merge";
+   const idKey = options.idKey ?? "id";
+   const types = options.columnTypes ?? {};
+   const rows = parseCSV(csv);
+   if (rows.length === 0) {
+     return emptyPlan(vault, options.collection, policy, idKey);
+   }
+   const header = rows[0] ?? [];
+   const records = [];
+   for (let r = 1; r < rows.length; r++) {
+     const row = rows[r];
+     if (row.length === 1 && row[0] === "") continue;
+     const record = {};
+     for (let c = 0; c < header.length; c++) {
+       const col = header[c] ?? "";
+       const cell = row[c] ?? "";
+       record[col] = coerceCell(cell, types[col]);
+     }
+     records.push(record);
+   }
+   const plan = await (0, import_hub.diffVault)(vault, { [options.collection]: records }, {
+     collections: [options.collection],
+     idKey
+   });
+   return {
+     plan,
+     policy,
+     async apply() {
+       await vault.noydb.transaction((tx) => {
+         const txVault = tx.vault(vault.name);
+         for (const entry of plan.added) {
+           txVault.collection(entry.collection).put(entry.id, entry.record);
+         }
+         if (policy !== "insert-only") {
+           for (const entry of plan.modified) {
+             txVault.collection(entry.collection).put(entry.id, entry.record);
+           }
+         }
+         if (policy === "replace") {
+           for (const entry of plan.deleted) {
+             txVault.collection(entry.collection).delete(entry.id);
+           }
+         }
+       });
+     }
+   };
+ }
+ async function emptyPlan(vault, collection, policy, idKey) {
+   const plan = await (0, import_hub.diffVault)(vault, { [collection]: [] }, { collections: [collection], idKey });
+   return { plan, policy, async apply() {
+   } };
+ }
+ function coerceCell(cell, type) {
+   if (type === "number") {
+     if (cell === "") return void 0;
+     const n = Number(cell);
+     return Number.isFinite(n) ? n : cell;
+   }
+   if (type === "boolean") {
+     if (cell === "true") return true;
+     if (cell === "false") return false;
+     return cell;
+   }
+   return cell;
+ }
+ function parseCSV(input) {
+   const rows = [];
+   let row = [];
+   let field = "";
+   let inQuotes = false;
+   let i = 0;
+   while (i < input.length) {
+     const ch = input[i];
+     if (inQuotes) {
+       if (ch === '"') {
+         if (input[i + 1] === '"') {
+           field += '"';
+           i += 2;
+           continue;
+         }
+         inQuotes = false;
+         i++;
+         continue;
+       }
+       field += ch;
+       i++;
+       continue;
+     }
+     if (ch === '"') {
+       inQuotes = true;
+       i++;
+       continue;
+     }
+     if (ch === ",") {
+       row.push(field);
+       field = "";
+       i++;
+       continue;
+     }
+     if (ch === "\r" && input[i + 1] === "\n") {
+       row.push(field);
+       rows.push(row);
+       row = [];
+       field = "";
+       i += 2;
+       continue;
+     }
+     if (ch === "\n" || ch === "\r") {
+       row.push(field);
+       rows.push(row);
+       row = [];
+       field = "";
+       i++;
+       continue;
+     }
+     field += ch;
+     i++;
+   }
+   if (field !== "" || row.length > 0) {
+     row.push(field);
+     rows.push(row);
+   }
+   return rows;
+ }
+ // Annotate the CommonJS export names for ESM import in node:
+ 0 && (module.exports = {
+   download,
+   fromString,
+   toString,
+   write
+ });
+ //# sourceMappingURL=index.cjs.map
package/dist/index.cjs.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"sources":["../src/index.ts"],"sourcesContent":["/**\n * **@noy-db/as-csv** — CSV plaintext export for noy-db.\n *\n * Decrypts records from a single collection and formats them as\n * comma-separated values suitable for spreadsheet import. RFC 4180\n * escaping (quote fields containing commas, quotes, or newlines;\n * escape embedded quotes by doubling them).\n *\n * **Authorization.** Every call is gated by the invoking keyring's\n * `canExportPlaintext` capability — plaintext crossings of the\n * library boundary require an explicit grant from the vault owner\n *. The package calls `vault.assertCanExport('plaintext',\n * 'csv')` before decrypting anything.\n *\n * **Scope.** One collection per call. Multi-collection + attachments\n * → use `@noy-db/as-zip`. Structured JSON → `@noy-db/as-json`.\n * Excel with dictionary-label expansion → `@noy-db/as-xlsx`.\n *\n * See [`docs/patterns/as-exports.md`](https://github.com/vLannaAi/noy-db/blob/main/docs/patterns/as-exports.md).\n *\n * @packageDocumentation\n */\n\nimport type { Vault } from '@noy-db/hub'\n\nexport interface AsCSVOptions {\n /**\n * Collection to export. Must be in the caller's read ACL; otherwise\n * the resulting CSV will be empty (ACL-scoping applies at the\n * `exportStream` layer).\n */\n readonly collection: string\n\n /**\n * Explicit column list. When omitted, columns are inferred from\n * the union of keys across all records, in first-record-wins\n * order. Specify explicitly for deterministic exports or when the\n * source data has sparse fields.\n */\n readonly columns?: readonly string[]\n\n /**\n * Row separator. Default `'\\n'` (LF). 
Use `'\\r\\n'` for Windows-\n * friendly output (Excel prefers CRLF but accepts LF).\n */\n readonly eol?: '\\n' | '\\r\\n'\n}\n\nexport interface AsCSVWriteOptions extends AsCSVOptions {\n /**\n * Required for Node file-write calls — consumer acknowledgement\n * that plaintext bytes will persist on disk past the current\n * process lifetime (Tier 3 risk per `docs/patterns/as-exports.md`).\n */\n readonly acknowledgeRisks: true\n}\n\nexport interface AsCSVDownloadOptions extends AsCSVOptions {\n /** Filename offered to the browser. Default `'<collection>.csv'`. */\n readonly filename?: string\n}\n\n/**\n * Serialise a collection as a CSV string. Pure operation — no side\n * effects beyond the authorization check + audit ledger write.\n */\nexport async function toString(vault: Vault, options: AsCSVOptions): Promise<string> {\n vault.assertCanExport('plaintext', 'csv')\n\n const eol = options.eol ?? '\\n'\n const collection = options.collection\n\n // Pull the one collection via exportStream in collection granularity.\n const records: unknown[] = []\n for await (const chunk of vault.exportStream({ granularity: 'collection' })) {\n if (chunk.collection === collection) {\n records.push(...chunk.records)\n break\n }\n }\n\n // Determine columns.\n const columns = options.columns ?? inferColumns(records)\n if (columns.length === 0) {\n // Empty collection or no accessible records — emit header-only csv.\n return ''\n }\n\n // Build header + rows\n const lines: string[] = [columns.map(escapeField).join(',')]\n for (const record of records) {\n const row = columns.map(c => escapeField((record as Record<string, unknown>)[c]))\n lines.push(row.join(','))\n }\n return lines.join(eol)\n}\n\n/**\n * Browser download — wraps `toString()` in a `Blob` + triggers the\n * browser's download prompt. Tier 2 egress per the pattern doc.\n *\n * Requires a browser-like environment with `URL.createObjectURL` and\n * `document.createElement`. 
No-op in headless environments; use\n * `toString()` there instead.\n */\nexport async function download(vault: Vault, options: AsCSVDownloadOptions): Promise<void> {\n const csv = await toString(vault, options)\n const filename = options.filename ?? `${options.collection}.csv`\n const blob = new Blob([csv], { type: 'text/csv;charset=utf-8' })\n const url = URL.createObjectURL(blob)\n const a = document.createElement('a')\n a.href = url\n a.download = filename\n a.click()\n URL.revokeObjectURL(url)\n}\n\n/**\n * Node file-write — persists the CSV to the filesystem. Requires\n * explicit `acknowledgeRisks: true` because the plaintext file\n * outlives the current process (Tier 3 egress).\n */\nexport async function write(\n vault: Vault,\n path: string,\n options: AsCSVWriteOptions,\n): Promise<void> {\n if (options.acknowledgeRisks !== true) {\n throw new Error(\n 'as-csv.write: acknowledgeRisks: true is required for on-disk plaintext output. ' +\n 'This call creates a persistent plaintext copy of your data outside noy-db\\'s ' +\n 'encrypted storage — see docs/patterns/as-exports.md §\"The three tiers of \\\\\"plaintext out\\\\\"\"',\n )\n }\n const csv = await toString(vault, options)\n // Defer the node:fs import so this package remains browser-safe.\n const { writeFile } = await import('node:fs/promises')\n await writeFile(path, csv, 'utf-8')\n}\n\n// ── CSV formatting internals ───────────────────────────────────────────\n\n/**\n * RFC 4180 escaping: wrap a field in double quotes if it contains\n * comma, double quote, CR, or LF. Embedded double quotes become `\"\"`.\n * Other values stringify naturally.\n */\nfunction escapeField(value: unknown): string {\n if (value === null || value === undefined) return ''\n if (typeof value === 'number' || typeof value === 'boolean') return String(value)\n if (value instanceof Date) return value.toISOString()\n const s =\n typeof value === 'string' ? 
value : JSON.stringify(value)\n if (/[\",\\r\\n]/.test(s)) {\n return `\"${s.replace(/\"/g, '\"\"')}\"`\n }\n return s\n}\n\n/**\n * Derive column list from the records array, preserving first-\n * encountered-wins ordering. An explicit `options.columns` bypasses\n * this.\n */\nfunction inferColumns(records: readonly unknown[]): string[] {\n const columns: string[] = []\n const seen = new Set<string>()\n for (const r of records) {\n if (r && typeof r === 'object') {\n for (const key of Object.keys(r)) {\n if (!seen.has(key)) {\n seen.add(key)\n columns.push(key)\n }\n }\n }\n }\n return columns\n}\n\n// ─── Reader ─────────────────────────────────────────────\n\nimport { diffVault, type VaultDiff } from '@noy-db/hub'\n\nexport type ImportPolicy = 'merge' | 'replace' | 'insert-only'\n\nexport interface AsCSVImportOptions {\n /** Target collection. CSV has no native collection grouping. Required. */\n readonly collection: string\n /**\n * Optional column type hints. When omitted, every cell is parsed as\n * a string. Number / boolean cells are auto-detected when the hint\n * matches: `'1'` → `1`, `'true'` → `true`, etc.\n */\n readonly columnTypes?: Record<string, 'string' | 'number' | 'boolean'>\n /** Field on each record that carries its id. Default `'id'`. */\n readonly idKey?: string\n /** Reconciliation policy. Default `'merge'`. */\n readonly policy?: ImportPolicy\n}\n\nexport interface AsCSVImportPlan {\n readonly plan: VaultDiff\n readonly policy: ImportPolicy\n apply(): Promise<void>\n}\n\n/**\n * Parse RFC-4180 CSV into records and build an import plan for one\n * collection. The first row is the header; subsequent rows are\n * records. 
Quoted fields, embedded commas, embedded `\"\"`, and\n * CRLF line endings all round-trip with `as-csv.toString()`.\n *\n * Cells are returned as strings unless overridden via `columnTypes`.\n * For the common case of numeric ids (\"1001\" → 1001), pass\n * `columnTypes: { id: 'number' }`.\n */\nexport async function fromString(\n vault: Vault,\n csv: string,\n options: AsCSVImportOptions,\n): Promise<AsCSVImportPlan> {\n vault.assertCanImport('plaintext', 'csv')\n const policy: ImportPolicy = options.policy ?? 'merge'\n const idKey = options.idKey ?? 'id'\n const types = options.columnTypes ?? {}\n\n const rows = parseCSV(csv)\n if (rows.length === 0) {\n return emptyPlan(vault, options.collection, policy, idKey)\n }\n const header = rows[0] ?? []\n const records: Record<string, unknown>[] = []\n for (let r = 1; r < rows.length; r++) {\n const row = rows[r]!\n if (row.length === 1 && row[0] === '') continue // ignore blank lines\n const record: Record<string, unknown> = {}\n for (let c = 0; c < header.length; c++) {\n const col = header[c] ?? ''\n const cell = row[c] ?? ''\n record[col] = coerceCell(cell, types[col])\n }\n records.push(record)\n }\n\n const plan = await diffVault(vault, { [options.collection]: records }, {\n collections: [options.collection],\n idKey,\n })\n\n return {\n plan,\n policy,\n async apply(): Promise<void> {\n // Routes through the txStrategy seam — vault.noydb.transaction()\n // throws a clear error pointing at withTransactions() when the\n // strategy is not opted in. 
Atomicity ensures a partial failure\n // rolls back every executed put.\n await vault.noydb.transaction((tx) => {\n const txVault = tx.vault(vault.name)\n for (const entry of plan.added) {\n txVault.collection(entry.collection).put(entry.id, entry.record)\n }\n if (policy !== 'insert-only') {\n for (const entry of plan.modified) {\n txVault.collection(entry.collection).put(entry.id, entry.record)\n }\n }\n if (policy === 'replace') {\n for (const entry of plan.deleted) {\n txVault.collection(entry.collection).delete(entry.id)\n }\n }\n })\n },\n }\n}\n\nasync function emptyPlan(\n vault: Vault,\n collection: string,\n policy: ImportPolicy,\n idKey: string,\n): Promise<AsCSVImportPlan> {\n const plan = await diffVault(vault, { [collection]: [] }, { collections: [collection], idKey })\n return { plan, policy, async apply() { /* nothing to do */ } }\n}\n\nfunction coerceCell(cell: string, type?: 'string' | 'number' | 'boolean'): unknown {\n if (type === 'number') {\n if (cell === '') return undefined\n const n = Number(cell)\n return Number.isFinite(n) ? n : cell\n }\n if (type === 'boolean') {\n if (cell === 'true') return true\n if (cell === 'false') return false\n return cell\n }\n return cell\n}\n\n/**\n * Minimal RFC-4180 CSV parser. Recognises:\n * - Comma-separated fields\n * - Quoted fields with embedded commas, newlines, and `\"\"` escapes\n * - Both CRLF and LF row endings\n *\n * Returns a 2D string array. 
The caller maps the first row to a\n * header and the rest to records.\n */\nfunction parseCSV(input: string): string[][] {\n const rows: string[][] = []\n let row: string[] = []\n let field = ''\n let inQuotes = false\n let i = 0\n\n while (i < input.length) {\n const ch = input[i]!\n if (inQuotes) {\n if (ch === '\"') {\n if (input[i + 1] === '\"') {\n field += '\"'\n i += 2\n continue\n }\n inQuotes = false\n i++\n continue\n }\n field += ch\n i++\n continue\n }\n if (ch === '\"') {\n inQuotes = true\n i++\n continue\n }\n if (ch === ',') {\n row.push(field)\n field = ''\n i++\n continue\n }\n if (ch === '\\r' && input[i + 1] === '\\n') {\n row.push(field)\n rows.push(row)\n row = []\n field = ''\n i += 2\n continue\n }\n if (ch === '\\n' || ch === '\\r') {\n row.push(field)\n rows.push(row)\n row = []\n field = ''\n i++\n continue\n }\n field += ch\n i++\n }\n\n // Final field / row.\n if (field !== '' || row.length > 0) {\n row.push(field)\n rows.push(row)\n }\n\n return rows\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAsLA,iBAA0C;AApH1C,eAAsB,SAAS,OAAc,SAAwC;AACnF,QAAM,gBAAgB,aAAa,KAAK;AAExC,QAAM,MAAM,QAAQ,OAAO;AAC3B,QAAM,aAAa,QAAQ;AAG3B,QAAM,UAAqB,CAAC;AAC5B,mBAAiB,SAAS,MAAM,aAAa,EAAE,aAAa,aAAa,CAAC,GAAG;AAC3E,QAAI,MAAM,eAAe,YAAY;AACnC,cAAQ,KAAK,GAAG,MAAM,OAAO;AAC7B;AAAA,IACF;AAAA,EACF;AAGA,QAAM,UAAU,QAAQ,WAAW,aAAa,OAAO;AACvD,MAAI,QAAQ,WAAW,GAAG;AAExB,WAAO;AAAA,EACT;AAGA,QAAM,QAAkB,CAAC,QAAQ,IAAI,WAAW,EAAE,KAAK,GAAG,CAAC;AAC3D,aAAW,UAAU,SAAS;AAC5B,UAAM,MAAM,QAAQ,IAAI,OAAK,YAAa,OAAmC,CAAC,CAAC,CAAC;AAChF,UAAM,KAAK,IAAI,KAAK,GAAG,CAAC;AAAA,EAC1B;AACA,SAAO,MAAM,KAAK,GAAG;AACvB;AAUA,eAAsB,SAAS,OAAc,SAA8C;AACzF,QAAM,MAAM,MAAM,SAAS,OAAO,OAAO;AACzC,QAAM,WAAW,QAAQ,YAAY,GAAG,QAAQ,UAAU;AAC1D,QAAM,OAAO,IAAI,KAAK,CAAC,GAAG,GAAG,EAAE,MAAM,yBAAyB,CAAC;AAC/D,QAAM,MAAM,IAAI,gBAAgB,IAAI;AACpC,QAAM,IAAI,SAAS,cAAc,GAAG;AACpC,IAAE,OAAO;AACT,IAAE,WAAW;AACb,IAAE,MAAM;AACR,MAAI,gBAAgB,GAAG;AACzB;AAOA,eAAsB,MACpB,OACA,MACA,SACe;AACf,MAAI
,QAAQ,qBAAqB,MAAM;AACrC,UAAM,IAAI;AAAA,MACR;AAAA,IAGF;AAAA,EACF;AACA,QAAM,MAAM,MAAM,SAAS,OAAO,OAAO;AAEzC,QAAM,EAAE,UAAU,IAAI,MAAM,OAAO,aAAkB;AACrD,QAAM,UAAU,MAAM,KAAK,OAAO;AACpC;AASA,SAAS,YAAY,OAAwB;AAC3C,MAAI,UAAU,QAAQ,UAAU,OAAW,QAAO;AAClD,MAAI,OAAO,UAAU,YAAY,OAAO,UAAU,UAAW,QAAO,OAAO,KAAK;AAChF,MAAI,iBAAiB,KAAM,QAAO,MAAM,YAAY;AACpD,QAAM,IACJ,OAAO,UAAU,WAAW,QAAQ,KAAK,UAAU,KAAK;AAC1D,MAAI,WAAW,KAAK,CAAC,GAAG;AACtB,WAAO,IAAI,EAAE,QAAQ,MAAM,IAAI,CAAC;AAAA,EAClC;AACA,SAAO;AACT;AAOA,SAAS,aAAa,SAAuC;AAC3D,QAAM,UAAoB,CAAC;AAC3B,QAAM,OAAO,oBAAI,IAAY;AAC7B,aAAW,KAAK,SAAS;AACvB,QAAI,KAAK,OAAO,MAAM,UAAU;AAC9B,iBAAW,OAAO,OAAO,KAAK,CAAC,GAAG;AAChC,YAAI,CAAC,KAAK,IAAI,GAAG,GAAG;AAClB,eAAK,IAAI,GAAG;AACZ,kBAAQ,KAAK,GAAG;AAAA,QAClB;AAAA,MACF;AAAA,IACF;AAAA,EACF;AACA,SAAO;AACT;AAuCA,eAAsB,WACpB,OACA,KACA,SAC0B;AAC1B,QAAM,gBAAgB,aAAa,KAAK;AACxC,QAAM,SAAuB,QAAQ,UAAU;AAC/C,QAAM,QAAQ,QAAQ,SAAS;AAC/B,QAAM,QAAQ,QAAQ,eAAe,CAAC;AAEtC,QAAM,OAAO,SAAS,GAAG;AACzB,MAAI,KAAK,WAAW,GAAG;AACrB,WAAO,UAAU,OAAO,QAAQ,YAAY,QAAQ,KAAK;AAAA,EAC3D;AACA,QAAM,SAAS,KAAK,CAAC,KAAK,CAAC;AAC3B,QAAM,UAAqC,CAAC;AAC5C,WAAS,IAAI,GAAG,IAAI,KAAK,QAAQ,KAAK;AACpC,UAAM,MAAM,KAAK,CAAC;AAClB,QAAI,IAAI,WAAW,KAAK,IAAI,CAAC,MAAM,GAAI;AACvC,UAAM,SAAkC,CAAC;AACzC,aAAS,IAAI,GAAG,IAAI,OAAO,QAAQ,KAAK;AACtC,YAAM,MAAM,OAAO,CAAC,KAAK;AACzB,YAAM,OAAO,IAAI,CAAC,KAAK;AACvB,aAAO,GAAG,IAAI,WAAW,MAAM,MAAM,GAAG,CAAC;AAAA,IAC3C;AACA,YAAQ,KAAK,MAAM;AAAA,EACrB;AAEA,QAAM,OAAO,UAAM,sBAAU,OAAO,EAAE,CAAC,QAAQ,UAAU,GAAG,QAAQ,GAAG;AAAA,IACrE,aAAa,CAAC,QAAQ,UAAU;AAAA,IAChC;AAAA,EACF,CAAC;AAED,SAAO;AAAA,IACL;AAAA,IACA;AAAA,IACA,MAAM,QAAuB;AAK3B,YAAM,MAAM,MAAM,YAAY,CAAC,OAAO;AACpC,cAAM,UAAU,GAAG,MAAM,MAAM,IAAI;AACnC,mBAAW,SAAS,KAAK,OAAO;AAC9B,kBAAQ,WAAW,MAAM,UAAU,EAAE,IAAI,MAAM,IAAI,MAAM,MAAM;AAAA,QACjE;AACA,YAAI,WAAW,eAAe;AAC5B,qBAAW,SAAS,KAAK,UAAU;AACjC,oBAAQ,WAAW,MAAM,UAAU,EAAE,IAAI,MAAM,IAAI,MAAM,MAAM;AAAA,UACjE;AAAA,QACF;AACA,YAAI,WAAW,WAAW;AACxB,qBAAW,SAAS,KAAK,SAAS;AAChC,oBAAQ,WAAW,MAAM,UAAU,EAAE,OAAO,MAAM,EAAE;AAAA,UACtD;AAAA,QACF;AAAA,MACF,CAAC;
AAAA,IACH;AAAA,EACF;AACF;AAEA,eAAe,UACb,OACA,YACA,QACA,OAC0B;AAC1B,QAAM,OAAO,UAAM,sBAAU,OAAO,EAAE,CAAC,UAAU,GAAG,CAAC,EAAE,GAAG,EAAE,aAAa,CAAC,UAAU,GAAG,MAAM,CAAC;AAC9F,SAAO,EAAE,MAAM,QAAQ,MAAM,QAAQ;AAAA,EAAsB,EAAE;AAC/D;AAEA,SAAS,WAAW,MAAc,MAAiD;AACjF,MAAI,SAAS,UAAU;AACrB,QAAI,SAAS,GAAI,QAAO;AACxB,UAAM,IAAI,OAAO,IAAI;AACrB,WAAO,OAAO,SAAS,CAAC,IAAI,IAAI;AAAA,EAClC;AACA,MAAI,SAAS,WAAW;AACtB,QAAI,SAAS,OAAQ,QAAO;AAC5B,QAAI,SAAS,QAAS,QAAO;AAC7B,WAAO;AAAA,EACT;AACA,SAAO;AACT;AAWA,SAAS,SAAS,OAA2B;AAC3C,QAAM,OAAmB,CAAC;AAC1B,MAAI,MAAgB,CAAC;AACrB,MAAI,QAAQ;AACZ,MAAI,WAAW;AACf,MAAI,IAAI;AAER,SAAO,IAAI,MAAM,QAAQ;AACvB,UAAM,KAAK,MAAM,CAAC;AAClB,QAAI,UAAU;AACZ,UAAI,OAAO,KAAK;AACd,YAAI,MAAM,IAAI,CAAC,MAAM,KAAK;AACxB,mBAAS;AACT,eAAK;AACL;AAAA,QACF;AACA,mBAAW;AACX;AACA;AAAA,MACF;AACA,eAAS;AACT;AACA;AAAA,IACF;AACA,QAAI,OAAO,KAAK;AACd,iBAAW;AACX;AACA;AAAA,IACF;AACA,QAAI,OAAO,KAAK;AACd,UAAI,KAAK,KAAK;AACd,cAAQ;AACR;AACA;AAAA,IACF;AACA,QAAI,OAAO,QAAQ,MAAM,IAAI,CAAC,MAAM,MAAM;AACxC,UAAI,KAAK,KAAK;AACd,WAAK,KAAK,GAAG;AACb,YAAM,CAAC;AACP,cAAQ;AACR,WAAK;AACL;AAAA,IACF;AACA,QAAI,OAAO,QAAQ,OAAO,MAAM;AAC9B,UAAI,KAAK,KAAK;AACd,WAAK,KAAK,GAAG;AACb,YAAM,CAAC;AACP,cAAQ;AACR;AACA;AAAA,IACF;AACA,aAAS;AACT;AAAA,EACF;AAGA,MAAI,UAAU,MAAM,IAAI,SAAS,GAAG;AAClC,QAAI,KAAK,KAAK;AACd,SAAK,KAAK,GAAG;AAAA,EACf;AAEA,SAAO;AACT;","names":[]}
package/dist/index.d.ts ADDED
@@ -0,0 +1,111 @@
1
+ import { VaultDiff, Vault } from '@noy-db/hub';
2
+
3
+ /**
4
+ * **@noy-db/as-csv** — CSV plaintext export for noy-db.
5
+ *
6
+ * Decrypts records from a single collection and formats them as
7
+ * comma-separated values suitable for spreadsheet import. RFC 4180
8
+ * escaping (quote fields containing commas, quotes, or newlines;
9
+ * escape embedded quotes by doubling them).
10
+ *
11
+ * **Authorization.** Every call is gated by the invoking keyring's
12
+ * `canExportPlaintext` capability — plaintext crossings of the
13
+ * library boundary require an explicit grant from the vault owner
14
+ *. The package calls `vault.assertCanExport('plaintext',
15
+ * 'csv')` before decrypting anything.
16
+ *
17
+ * **Scope.** One collection per call. Multi-collection + attachments
18
+ * → use `@noy-db/as-zip`. Structured JSON → `@noy-db/as-json`.
19
+ * Excel with dictionary-label expansion → `@noy-db/as-xlsx`.
20
+ *
21
+ * See [`docs/patterns/as-exports.md`](https://github.com/vLannaAi/noy-db/blob/main/docs/patterns/as-exports.md).
22
+ *
23
+ * @packageDocumentation
24
+ */
25
+
26
+ interface AsCSVOptions {
27
+ /**
28
+ * Collection to export. Must be in the caller's read ACL; otherwise
29
+ * the resulting CSV will be empty (ACL-scoping applies at the
30
+ * `exportStream` layer).
31
+ */
32
+ readonly collection: string;
33
+ /**
34
+ * Explicit column list. When omitted, columns are inferred from
35
+ * the union of keys across all records, in first-record-wins
36
+ * order. Specify explicitly for deterministic exports or when the
37
+ * source data has sparse fields.
38
+ */
39
+ readonly columns?: readonly string[];
40
+ /**
41
+ * Row separator. Default `'\n'` (LF). Use `'\r\n'` for Windows-
42
+ * friendly output (Excel prefers CRLF but accepts LF).
43
+ */
44
+ readonly eol?: '\n' | '\r\n';
45
+ }
46
+ interface AsCSVWriteOptions extends AsCSVOptions {
47
+ /**
48
+ * Required for Node file-write calls — consumer acknowledgement
49
+ * that plaintext bytes will persist on disk past the current
50
+ * process lifetime (Tier 3 risk per `docs/patterns/as-exports.md`).
51
+ */
52
+ readonly acknowledgeRisks: true;
53
+ }
54
+ interface AsCSVDownloadOptions extends AsCSVOptions {
55
+ /** Filename offered to the browser. Default `'<collection>.csv'`. */
56
+ readonly filename?: string;
57
+ }
58
+ /**
59
+ * Serialise a collection as a CSV string. Pure operation — no side
60
+ * effects beyond the authorization check + audit ledger write.
61
+ */
62
+ declare function toString(vault: Vault, options: AsCSVOptions): Promise<string>;
63
+ /**
64
+ * Browser download — wraps `toString()` in a `Blob` + triggers the
65
+ * browser's download prompt. Tier 2 egress per the pattern doc.
66
+ *
67
+ * Requires a browser-like environment with `URL.createObjectURL` and
68
+ * `document.createElement`. Not usable in headless environments; use
69
+ * `toString()` there instead.
70
+ */
71
+ declare function download(vault: Vault, options: AsCSVDownloadOptions): Promise<void>;
72
+ /**
73
+ * Node file-write — persists the CSV to the filesystem. Requires
74
+ * explicit `acknowledgeRisks: true` because the plaintext file
75
+ * outlives the current process (Tier 3 egress).
76
+ */
77
+ declare function write(vault: Vault, path: string, options: AsCSVWriteOptions): Promise<void>;
78
+
79
+ type ImportPolicy = 'merge' | 'replace' | 'insert-only';
80
+ interface AsCSVImportOptions {
81
+ /** Target collection. CSV has no native collection grouping. Required. */
82
+ readonly collection: string;
83
+ /**
84
+ * Optional column type hints. When omitted, every cell is parsed as
85
+ * a string. Cells are coerced to number / boolean when the hint
86
+ * matches: `'1'` → `1`, `'true'` → `true`; non-matching cells stay strings.
87
+ */
88
+ readonly columnTypes?: Record<string, 'string' | 'number' | 'boolean'>;
89
+ /** Field on each record that carries its id. Default `'id'`. */
90
+ readonly idKey?: string;
91
+ /** Reconciliation policy. Default `'merge'`. */
92
+ readonly policy?: ImportPolicy;
93
+ }
94
+ interface AsCSVImportPlan {
95
+ readonly plan: VaultDiff;
96
+ readonly policy: ImportPolicy;
97
+ apply(): Promise<void>;
98
+ }
99
+ /**
100
+ * Parse RFC-4180 CSV into records and build an import plan for one
101
+ * collection. The first row is the header; subsequent rows are
102
+ * records. Quoted fields, embedded commas, embedded `""`, and
103
+ * CRLF line endings all round-trip with `as-csv.toString()`.
104
+ *
105
+ * Cells are returned as strings unless overridden via `columnTypes`.
106
+ * For the common case of numeric ids ("1001" → 1001), pass
107
+ * `columnTypes: { id: 'number' }`.
108
+ */
109
+ declare function fromString(vault: Vault, csv: string, options: AsCSVImportOptions): Promise<AsCSVImportPlan>;
110
+
111
+ export { type AsCSVDownloadOptions, type AsCSVImportOptions, type AsCSVImportPlan, type AsCSVOptions, type AsCSVWriteOptions, type ImportPolicy, download, fromString, toString, write };
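The RFC 4180 escaping described in the package doc above (quote fields containing commas, quotes, or newlines; double embedded quotes) can be sketched as a standalone helper. This is an illustrative reimplementation of the rule, not the package's internal `escapeField` export:

```javascript
// Minimal RFC 4180 field escaping: wrap the value in double quotes
// when it contains a comma, double quote, CR, or LF, and double any
// embedded quotes. Null/undefined become the empty field.
function escapeField(value) {
  if (value === null || value === undefined) return "";
  const s = String(value);
  if (/[",\r\n]/.test(s)) {
    return `"${s.replace(/"/g, '""')}"`;
  }
  return s;
}

console.log(escapeField("plain"));    // plain
console.log(escapeField("a,b"));      // "a,b"
console.log(escapeField('say "hi"')); // "say ""hi"""
```

Note the asymmetry: only fields that need quoting get quoted, which keeps typical numeric or id columns byte-identical to their source values.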
@@ -0,0 +1,111 @@
1
+ import { VaultDiff, Vault } from '@noy-db/hub';
2
+
3
+ /**
4
+ * **@noy-db/as-csv** — CSV plaintext export for noy-db.
5
+ *
6
+ * Decrypts records from a single collection and formats them as
7
+ * comma-separated values suitable for spreadsheet import. RFC 4180
8
+ * escaping (quote fields containing commas, quotes, or newlines;
9
+ * escape embedded quotes by doubling them).
10
+ *
11
+ * **Authorization.** Every call is gated by the invoking keyring's
12
+ * `canExportPlaintext` capability — plaintext crossings of the
13
+ * library boundary require an explicit grant from the vault owner.
14
+ * The package calls `vault.assertCanExport('plaintext',
15
+ * 'csv')` before decrypting anything.
16
+ *
17
+ * **Scope.** One collection per call. Multi-collection + attachments
18
+ * → use `@noy-db/as-zip`. Structured JSON → `@noy-db/as-json`.
19
+ * Excel with dictionary-label expansion → `@noy-db/as-xlsx`.
20
+ *
21
+ * See [`docs/patterns/as-exports.md`](https://github.com/vLannaAi/noy-db/blob/main/docs/patterns/as-exports.md).
22
+ *
23
+ * @packageDocumentation
24
+ */
25
+
26
+ interface AsCSVOptions {
27
+ /**
28
+ * Collection to export. Must be in the caller's read ACL; otherwise
29
+ * the resulting CSV will be empty (ACL-scoping applies at the
30
+ * `exportStream` layer).
31
+ */
32
+ readonly collection: string;
33
+ /**
34
+ * Explicit column list. When omitted, columns are inferred from
35
+ * the union of keys across all records, in first-record-wins
36
+ * order. Specify explicitly for deterministic exports or when the
37
+ * source data has sparse fields.
38
+ */
39
+ readonly columns?: readonly string[];
40
+ /**
41
+ * Row separator. Default `'\n'` (LF). Use `'\r\n'` for Windows-
42
+ * friendly output (Excel prefers CRLF but accepts LF).
43
+ */
44
+ readonly eol?: '\n' | '\r\n';
45
+ }
46
+ interface AsCSVWriteOptions extends AsCSVOptions {
47
+ /**
48
+ * Required for Node file-write calls — consumer acknowledgement
49
+ * that plaintext bytes will persist on disk past the current
50
+ * process lifetime (Tier 3 risk per `docs/patterns/as-exports.md`).
51
+ */
52
+ readonly acknowledgeRisks: true;
53
+ }
54
+ interface AsCSVDownloadOptions extends AsCSVOptions {
55
+ /** Filename offered to the browser. Default `'<collection>.csv'`. */
56
+ readonly filename?: string;
57
+ }
58
+ /**
59
+ * Serialise a collection as a CSV string. Pure operation — no side
60
+ * effects beyond the authorization check + audit ledger write.
61
+ */
62
+ declare function toString(vault: Vault, options: AsCSVOptions): Promise<string>;
63
+ /**
64
+ * Browser download — wraps `toString()` in a `Blob` + triggers the
65
+ * browser's download prompt. Tier 2 egress per the pattern doc.
66
+ *
67
+ * Requires a browser-like environment with `URL.createObjectURL` and
68
+ * `document.createElement`. Not usable in headless environments; use
69
+ * `toString()` there instead.
70
+ */
71
+ declare function download(vault: Vault, options: AsCSVDownloadOptions): Promise<void>;
72
+ /**
73
+ * Node file-write — persists the CSV to the filesystem. Requires
74
+ * explicit `acknowledgeRisks: true` because the plaintext file
75
+ * outlives the current process (Tier 3 egress).
76
+ */
77
+ declare function write(vault: Vault, path: string, options: AsCSVWriteOptions): Promise<void>;
78
+
79
+ type ImportPolicy = 'merge' | 'replace' | 'insert-only';
80
+ interface AsCSVImportOptions {
81
+ /** Target collection. CSV has no native collection grouping. Required. */
82
+ readonly collection: string;
83
+ /**
84
+ * Optional column type hints. When omitted, every cell is parsed as
85
+ * a string. Cells are coerced to number / boolean when the hint
86
+ * matches: `'1'` → `1`, `'true'` → `true`; non-matching cells stay strings.
87
+ */
88
+ readonly columnTypes?: Record<string, 'string' | 'number' | 'boolean'>;
89
+ /** Field on each record that carries its id. Default `'id'`. */
90
+ readonly idKey?: string;
91
+ /** Reconciliation policy. Default `'merge'`. */
92
+ readonly policy?: ImportPolicy;
93
+ }
94
+ interface AsCSVImportPlan {
95
+ readonly plan: VaultDiff;
96
+ readonly policy: ImportPolicy;
97
+ apply(): Promise<void>;
98
+ }
99
+ /**
100
+ * Parse RFC-4180 CSV into records and build an import plan for one
101
+ * collection. The first row is the header; subsequent rows are
102
+ * records. Quoted fields, embedded commas, embedded `""`, and
103
+ * CRLF line endings all round-trip with `as-csv.toString()`.
104
+ *
105
+ * Cells are returned as strings unless overridden via `columnTypes`.
106
+ * For the common case of numeric ids ("1001" → 1001), pass
107
+ * `columnTypes: { id: 'number' }`.
108
+ */
109
+ declare function fromString(vault: Vault, csv: string, options: AsCSVImportOptions): Promise<AsCSVImportPlan>;
110
+
111
+ export { type AsCSVDownloadOptions, type AsCSVImportOptions, type AsCSVImportPlan, type AsCSVOptions, type AsCSVWriteOptions, type ImportPolicy, download, fromString, toString, write };
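The column-inference rule documented on `AsCSVOptions.columns` (union of keys across all records, in first-record-wins order) amounts to roughly the following standalone sketch, shown here for illustration rather than as the package's export:

```javascript
// Infer CSV columns as the union of record keys, preserving the
// order in which each key is first encountered across records.
function inferColumns(records) {
  const columns = [];
  const seen = new Set();
  for (const r of records) {
    if (r && typeof r === "object") {
      for (const key of Object.keys(r)) {
        if (!seen.has(key)) {
          seen.add(key);
          columns.push(key);
        }
      }
    }
  }
  return columns;
}

// Sparse data: `email` only appears on the second record, so it
// lands after the first record's keys.
const cols = inferColumns([
  { id: 1, name: "Ada" },
  { id: 2, name: "Grace", email: "g@example.com" },
]);
console.log(cols); // [ 'id', 'name', 'email' ]
```

This is why the doc comment recommends passing `columns` explicitly for deterministic exports: with sparse records, inferred ordering depends on which record happens to come first.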
package/dist/index.js ADDED
@@ -0,0 +1,203 @@
1
+ // src/index.ts
2
+ import { diffVault } from "@noy-db/hub";
3
+ async function toString(vault, options) {
4
+ vault.assertCanExport("plaintext", "csv");
5
+ const eol = options.eol ?? "\n";
6
+ const collection = options.collection;
7
+ const records = [];
8
+ for await (const chunk of vault.exportStream({ granularity: "collection" })) {
9
+ if (chunk.collection === collection) {
10
+ records.push(...chunk.records);
11
+ break;
12
+ }
13
+ }
14
+ const columns = options.columns ?? inferColumns(records);
15
+ if (columns.length === 0) {
16
+ return "";
17
+ }
18
+ const lines = [columns.map(escapeField).join(",")];
19
+ for (const record of records) {
20
+ const row = columns.map((c) => escapeField(record[c]));
21
+ lines.push(row.join(","));
22
+ }
23
+ return lines.join(eol);
24
+ }
25
+ async function download(vault, options) {
26
+ const csv = await toString(vault, options);
27
+ const filename = options.filename ?? `${options.collection}.csv`;
28
+ const blob = new Blob([csv], { type: "text/csv;charset=utf-8" });
29
+ const url = URL.createObjectURL(blob);
30
+ const a = document.createElement("a");
31
+ a.href = url;
32
+ a.download = filename;
33
+ a.click();
34
+ URL.revokeObjectURL(url);
35
+ }
36
+ async function write(vault, path, options) {
37
+ if (options.acknowledgeRisks !== true) {
38
+ throw new Error(
39
+ `as-csv.write: acknowledgeRisks: true is required for on-disk plaintext output. This call creates a persistent plaintext copy of your data outside noy-db's encrypted storage \u2014 see docs/patterns/as-exports.md \xA7"The three tiers of \\"plaintext out\\""`
40
+ );
41
+ }
42
+ const csv = await toString(vault, options);
43
+ const { writeFile } = await import("fs/promises");
44
+ await writeFile(path, csv, "utf-8");
45
+ }
46
+ function escapeField(value) {
47
+ if (value === null || value === void 0) return "";
48
+ if (typeof value === "number" || typeof value === "boolean") return String(value);
49
+ if (value instanceof Date) return value.toISOString();
50
+ const s = typeof value === "string" ? value : JSON.stringify(value);
51
+ if (/[",\r\n]/.test(s)) {
52
+ return `"${s.replace(/"/g, '""')}"`;
53
+ }
54
+ return s;
55
+ }
56
+ function inferColumns(records) {
57
+ const columns = [];
58
+ const seen = /* @__PURE__ */ new Set();
59
+ for (const r of records) {
60
+ if (r && typeof r === "object") {
61
+ for (const key of Object.keys(r)) {
62
+ if (!seen.has(key)) {
63
+ seen.add(key);
64
+ columns.push(key);
65
+ }
66
+ }
67
+ }
68
+ }
69
+ return columns;
70
+ }
71
+ async function fromString(vault, csv, options) {
72
+ vault.assertCanImport("plaintext", "csv");
73
+ const policy = options.policy ?? "merge";
74
+ const idKey = options.idKey ?? "id";
75
+ const types = options.columnTypes ?? {};
76
+ const rows = parseCSV(csv);
77
+ if (rows.length === 0) {
78
+ return emptyPlan(vault, options.collection, policy, idKey);
79
+ }
80
+ const header = rows[0] ?? [];
81
+ const records = [];
82
+ for (let r = 1; r < rows.length; r++) {
83
+ const row = rows[r];
84
+ if (row.length === 1 && row[0] === "") continue;
85
+ const record = {};
86
+ for (let c = 0; c < header.length; c++) {
87
+ const col = header[c] ?? "";
88
+ const cell = row[c] ?? "";
89
+ record[col] = coerceCell(cell, types[col]);
90
+ }
91
+ records.push(record);
92
+ }
93
+ const plan = await diffVault(vault, { [options.collection]: records }, {
94
+ collections: [options.collection],
95
+ idKey
96
+ });
97
+ return {
98
+ plan,
99
+ policy,
100
+ async apply() {
101
+ await vault.noydb.transaction((tx) => {
102
+ const txVault = tx.vault(vault.name);
103
+ for (const entry of plan.added) {
104
+ txVault.collection(entry.collection).put(entry.id, entry.record);
105
+ }
106
+ if (policy !== "insert-only") {
107
+ for (const entry of plan.modified) {
108
+ txVault.collection(entry.collection).put(entry.id, entry.record);
109
+ }
110
+ }
111
+ if (policy === "replace") {
112
+ for (const entry of plan.deleted) {
113
+ txVault.collection(entry.collection).delete(entry.id);
114
+ }
115
+ }
116
+ });
117
+ }
118
+ };
119
+ }
120
+ async function emptyPlan(vault, collection, policy, idKey) {
121
+ const plan = await diffVault(vault, { [collection]: [] }, { collections: [collection], idKey });
122
+ return { plan, policy, async apply() {
123
+ } };
124
+ }
125
+ function coerceCell(cell, type) {
126
+ if (type === "number") {
127
+ if (cell === "") return void 0;
128
+ const n = Number(cell);
129
+ return Number.isFinite(n) ? n : cell;
130
+ }
131
+ if (type === "boolean") {
132
+ if (cell === "true") return true;
133
+ if (cell === "false") return false;
134
+ return cell;
135
+ }
136
+ return cell;
137
+ }
138
+ function parseCSV(input) {
139
+ const rows = [];
140
+ let row = [];
141
+ let field = "";
142
+ let inQuotes = false;
143
+ let i = 0;
144
+ while (i < input.length) {
145
+ const ch = input[i];
146
+ if (inQuotes) {
147
+ if (ch === '"') {
148
+ if (input[i + 1] === '"') {
149
+ field += '"';
150
+ i += 2;
151
+ continue;
152
+ }
153
+ inQuotes = false;
154
+ i++;
155
+ continue;
156
+ }
157
+ field += ch;
158
+ i++;
159
+ continue;
160
+ }
161
+ if (ch === '"') {
162
+ inQuotes = true;
163
+ i++;
164
+ continue;
165
+ }
166
+ if (ch === ",") {
167
+ row.push(field);
168
+ field = "";
169
+ i++;
170
+ continue;
171
+ }
172
+ if (ch === "\r" && input[i + 1] === "\n") {
173
+ row.push(field);
174
+ rows.push(row);
175
+ row = [];
176
+ field = "";
177
+ i += 2;
178
+ continue;
179
+ }
180
+ if (ch === "\n" || ch === "\r") {
181
+ row.push(field);
182
+ rows.push(row);
183
+ row = [];
184
+ field = "";
185
+ i++;
186
+ continue;
187
+ }
188
+ field += ch;
189
+ i++;
190
+ }
191
+ if (field !== "" || row.length > 0) {
192
+ row.push(field);
193
+ rows.push(row);
194
+ }
195
+ return rows;
196
+ }
197
+ export {
198
+ download,
199
+ fromString,
200
+ toString,
201
+ write
202
+ };
203
+ //# sourceMappingURL=index.js.map
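The `parseCSV` state machine in the compiled output above handles quoted fields, `""` escapes, and both CRLF and LF row endings. A condensed, behaviour-equivalent sketch (folded into a single loop for illustration) with a quick check:

```javascript
// Minimal RFC 4180 parser: comma-separated fields, quoted fields
// with embedded commas/newlines and "" escapes, CRLF or LF rows.
function parseCSV(input) {
  const rows = [];
  let row = [];
  let field = "";
  let inQuotes = false;
  for (let i = 0; i < input.length; i++) {
    const ch = input[i];
    if (inQuotes) {
      if (ch === '"') {
        // "" inside quotes is a literal quote; a lone " closes the field.
        if (input[i + 1] === '"') { field += '"'; i++; }
        else inQuotes = false;
      } else {
        field += ch;
      }
    } else if (ch === '"') {
      inQuotes = true;
    } else if (ch === ",") {
      row.push(field);
      field = "";
    } else if (ch === "\n" || ch === "\r") {
      if (ch === "\r" && input[i + 1] === "\n") i++; // swallow CRLF pair
      row.push(field);
      rows.push(row);
      row = [];
      field = "";
    } else {
      field += ch;
    }
  }
  // Flush the final unterminated row, if any.
  if (field !== "" || row.length > 0) {
    row.push(field);
    rows.push(row);
  }
  return rows;
}

const rows = parseCSV('id,note\r\n1,"a,b"\r\n2,"say ""hi"""\r\n');
console.log(rows); // [['id','note'], ['1','a,b'], ['2','say "hi"']]
```

Quoted commas and doubled quotes come back as single cells, which is the round-trip property the `fromString` doc comment relies on.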
@@ -0,0 +1 @@
1
+ {"version":3,"sources":["../src/index.ts"],"sourcesContent":["/**\n * **@noy-db/as-csv** — CSV plaintext export for noy-db.\n *\n * Decrypts records from a single collection and formats them as\n * comma-separated values suitable for spreadsheet import. RFC 4180\n * escaping (quote fields containing commas, quotes, or newlines;\n * escape embedded quotes by doubling them).\n *\n * **Authorization.** Every call is gated by the invoking keyring's\n * `canExportPlaintext` capability — plaintext crossings of the\n * library boundary require an explicit grant from the vault owner\n *. The package calls `vault.assertCanExport('plaintext',\n * 'csv')` before decrypting anything.\n *\n * **Scope.** One collection per call. Multi-collection + attachments\n * → use `@noy-db/as-zip`. Structured JSON → `@noy-db/as-json`.\n * Excel with dictionary-label expansion → `@noy-db/as-xlsx`.\n *\n * See [`docs/patterns/as-exports.md`](https://github.com/vLannaAi/noy-db/blob/main/docs/patterns/as-exports.md).\n *\n * @packageDocumentation\n */\n\nimport type { Vault } from '@noy-db/hub'\n\nexport interface AsCSVOptions {\n /**\n * Collection to export. Must be in the caller's read ACL; otherwise\n * the resulting CSV will be empty (ACL-scoping applies at the\n * `exportStream` layer).\n */\n readonly collection: string\n\n /**\n * Explicit column list. When omitted, columns are inferred from\n * the union of keys across all records, in first-record-wins\n * order. Specify explicitly for deterministic exports or when the\n * source data has sparse fields.\n */\n readonly columns?: readonly string[]\n\n /**\n * Row separator. Default `'\\n'` (LF). 
Use `'\\r\\n'` for Windows-\n * friendly output (Excel prefers CRLF but accepts LF).\n */\n readonly eol?: '\\n' | '\\r\\n'\n}\n\nexport interface AsCSVWriteOptions extends AsCSVOptions {\n /**\n * Required for Node file-write calls — consumer acknowledgement\n * that plaintext bytes will persist on disk past the current\n * process lifetime (Tier 3 risk per `docs/patterns/as-exports.md`).\n */\n readonly acknowledgeRisks: true\n}\n\nexport interface AsCSVDownloadOptions extends AsCSVOptions {\n /** Filename offered to the browser. Default `'<collection>.csv'`. */\n readonly filename?: string\n}\n\n/**\n * Serialise a collection as a CSV string. Pure operation — no side\n * effects beyond the authorization check + audit ledger write.\n */\nexport async function toString(vault: Vault, options: AsCSVOptions): Promise<string> {\n vault.assertCanExport('plaintext', 'csv')\n\n const eol = options.eol ?? '\\n'\n const collection = options.collection\n\n // Pull the one collection via exportStream in collection granularity.\n const records: unknown[] = []\n for await (const chunk of vault.exportStream({ granularity: 'collection' })) {\n if (chunk.collection === collection) {\n records.push(...chunk.records)\n break\n }\n }\n\n // Determine columns.\n const columns = options.columns ?? inferColumns(records)\n if (columns.length === 0) {\n // Empty collection or no accessible records — emit header-only csv.\n return ''\n }\n\n // Build header + rows\n const lines: string[] = [columns.map(escapeField).join(',')]\n for (const record of records) {\n const row = columns.map(c => escapeField((record as Record<string, unknown>)[c]))\n lines.push(row.join(','))\n }\n return lines.join(eol)\n}\n\n/**\n * Browser download — wraps `toString()` in a `Blob` + triggers the\n * browser's download prompt. Tier 2 egress per the pattern doc.\n *\n * Requires a browser-like environment with `URL.createObjectURL` and\n * `document.createElement`. 
No-op in headless environments; use\n * `toString()` there instead.\n */\nexport async function download(vault: Vault, options: AsCSVDownloadOptions): Promise<void> {\n const csv = await toString(vault, options)\n const filename = options.filename ?? `${options.collection}.csv`\n const blob = new Blob([csv], { type: 'text/csv;charset=utf-8' })\n const url = URL.createObjectURL(blob)\n const a = document.createElement('a')\n a.href = url\n a.download = filename\n a.click()\n URL.revokeObjectURL(url)\n}\n\n/**\n * Node file-write — persists the CSV to the filesystem. Requires\n * explicit `acknowledgeRisks: true` because the plaintext file\n * outlives the current process (Tier 3 egress).\n */\nexport async function write(\n vault: Vault,\n path: string,\n options: AsCSVWriteOptions,\n): Promise<void> {\n if (options.acknowledgeRisks !== true) {\n throw new Error(\n 'as-csv.write: acknowledgeRisks: true is required for on-disk plaintext output. ' +\n 'This call creates a persistent plaintext copy of your data outside noy-db\\'s ' +\n 'encrypted storage — see docs/patterns/as-exports.md §\"The three tiers of \\\\\"plaintext out\\\\\"\"',\n )\n }\n const csv = await toString(vault, options)\n // Defer the node:fs import so this package remains browser-safe.\n const { writeFile } = await import('node:fs/promises')\n await writeFile(path, csv, 'utf-8')\n}\n\n// ── CSV formatting internals ───────────────────────────────────────────\n\n/**\n * RFC 4180 escaping: wrap a field in double quotes if it contains\n * comma, double quote, CR, or LF. Embedded double quotes become `\"\"`.\n * Other values stringify naturally.\n */\nfunction escapeField(value: unknown): string {\n if (value === null || value === undefined) return ''\n if (typeof value === 'number' || typeof value === 'boolean') return String(value)\n if (value instanceof Date) return value.toISOString()\n const s =\n typeof value === 'string' ? 
value : JSON.stringify(value)\n if (/[\",\\r\\n]/.test(s)) {\n return `\"${s.replace(/\"/g, '\"\"')}\"`\n }\n return s\n}\n\n/**\n * Derive column list from the records array, preserving first-\n * encountered-wins ordering. An explicit `options.columns` bypasses\n * this.\n */\nfunction inferColumns(records: readonly unknown[]): string[] {\n const columns: string[] = []\n const seen = new Set<string>()\n for (const r of records) {\n if (r && typeof r === 'object') {\n for (const key of Object.keys(r)) {\n if (!seen.has(key)) {\n seen.add(key)\n columns.push(key)\n }\n }\n }\n }\n return columns\n}\n\n// ─── Reader ─────────────────────────────────────────────\n\nimport { diffVault, type VaultDiff } from '@noy-db/hub'\n\nexport type ImportPolicy = 'merge' | 'replace' | 'insert-only'\n\nexport interface AsCSVImportOptions {\n /** Target collection. CSV has no native collection grouping. Required. */\n readonly collection: string\n /**\n * Optional column type hints. When omitted, every cell is parsed as\n * a string. Number / boolean cells are auto-detected when the hint\n * matches: `'1'` → `1`, `'true'` → `true`, etc.\n */\n readonly columnTypes?: Record<string, 'string' | 'number' | 'boolean'>\n /** Field on each record that carries its id. Default `'id'`. */\n readonly idKey?: string\n /** Reconciliation policy. Default `'merge'`. */\n readonly policy?: ImportPolicy\n}\n\nexport interface AsCSVImportPlan {\n readonly plan: VaultDiff\n readonly policy: ImportPolicy\n apply(): Promise<void>\n}\n\n/**\n * Parse RFC-4180 CSV into records and build an import plan for one\n * collection. The first row is the header; subsequent rows are\n * records. 
Quoted fields, embedded commas, embedded `\"\"`, and\n * CRLF line endings all round-trip with `as-csv.toString()`.\n *\n * Cells are returned as strings unless overridden via `columnTypes`.\n * For the common case of numeric ids (\"1001\" → 1001), pass\n * `columnTypes: { id: 'number' }`.\n */\nexport async function fromString(\n vault: Vault,\n csv: string,\n options: AsCSVImportOptions,\n): Promise<AsCSVImportPlan> {\n vault.assertCanImport('plaintext', 'csv')\n const policy: ImportPolicy = options.policy ?? 'merge'\n const idKey = options.idKey ?? 'id'\n const types = options.columnTypes ?? {}\n\n const rows = parseCSV(csv)\n if (rows.length === 0) {\n return emptyPlan(vault, options.collection, policy, idKey)\n }\n const header = rows[0] ?? []\n const records: Record<string, unknown>[] = []\n for (let r = 1; r < rows.length; r++) {\n const row = rows[r]!\n if (row.length === 1 && row[0] === '') continue // ignore blank lines\n const record: Record<string, unknown> = {}\n for (let c = 0; c < header.length; c++) {\n const col = header[c] ?? ''\n const cell = row[c] ?? ''\n record[col] = coerceCell(cell, types[col])\n }\n records.push(record)\n }\n\n const plan = await diffVault(vault, { [options.collection]: records }, {\n collections: [options.collection],\n idKey,\n })\n\n return {\n plan,\n policy,\n async apply(): Promise<void> {\n // Routes through the txStrategy seam — vault.noydb.transaction()\n // throws a clear error pointing at withTransactions() when the\n // strategy is not opted in. 
Atomicity ensures a partial failure\n // rolls back every executed put.\n await vault.noydb.transaction((tx) => {\n const txVault = tx.vault(vault.name)\n for (const entry of plan.added) {\n txVault.collection(entry.collection).put(entry.id, entry.record)\n }\n if (policy !== 'insert-only') {\n for (const entry of plan.modified) {\n txVault.collection(entry.collection).put(entry.id, entry.record)\n }\n }\n if (policy === 'replace') {\n for (const entry of plan.deleted) {\n txVault.collection(entry.collection).delete(entry.id)\n }\n }\n })\n },\n }\n}\n\nasync function emptyPlan(\n vault: Vault,\n collection: string,\n policy: ImportPolicy,\n idKey: string,\n): Promise<AsCSVImportPlan> {\n const plan = await diffVault(vault, { [collection]: [] }, { collections: [collection], idKey })\n return { plan, policy, async apply() { /* nothing to do */ } }\n}\n\nfunction coerceCell(cell: string, type?: 'string' | 'number' | 'boolean'): unknown {\n if (type === 'number') {\n if (cell === '') return undefined\n const n = Number(cell)\n return Number.isFinite(n) ? n : cell\n }\n if (type === 'boolean') {\n if (cell === 'true') return true\n if (cell === 'false') return false\n return cell\n }\n return cell\n}\n\n/**\n * Minimal RFC-4180 CSV parser. Recognises:\n * - Comma-separated fields\n * - Quoted fields with embedded commas, newlines, and `\"\"` escapes\n * - Both CRLF and LF row endings\n *\n * Returns a 2D string array. 
The caller maps the first row to a\n * header and the rest to records.\n */\nfunction parseCSV(input: string): string[][] {\n const rows: string[][] = []\n let row: string[] = []\n let field = ''\n let inQuotes = false\n let i = 0\n\n while (i < input.length) {\n const ch = input[i]!\n if (inQuotes) {\n if (ch === '\"') {\n if (input[i + 1] === '\"') {\n field += '\"'\n i += 2\n continue\n }\n inQuotes = false\n i++\n continue\n }\n field += ch\n i++\n continue\n }\n if (ch === '\"') {\n inQuotes = true\n i++\n continue\n }\n if (ch === ',') {\n row.push(field)\n field = ''\n i++\n continue\n }\n if (ch === '\\r' && input[i + 1] === '\\n') {\n row.push(field)\n rows.push(row)\n row = []\n field = ''\n i += 2\n continue\n }\n if (ch === '\\n' || ch === '\\r') {\n row.push(field)\n rows.push(row)\n row = []\n field = ''\n i++\n continue\n }\n field += ch\n i++\n }\n\n // Final field / row.\n if (field !== '' || row.length > 0) {\n row.push(field)\n rows.push(row)\n }\n\n return rows\n}\n"],"mappings":";AAsLA,SAAS,iBAAiC;AApH1C,eAAsB,SAAS,OAAc,SAAwC;AACnF,QAAM,gBAAgB,aAAa,KAAK;AAExC,QAAM,MAAM,QAAQ,OAAO;AAC3B,QAAM,aAAa,QAAQ;AAG3B,QAAM,UAAqB,CAAC;AAC5B,mBAAiB,SAAS,MAAM,aAAa,EAAE,aAAa,aAAa,CAAC,GAAG;AAC3E,QAAI,MAAM,eAAe,YAAY;AACnC,cAAQ,KAAK,GAAG,MAAM,OAAO;AAC7B;AAAA,IACF;AAAA,EACF;AAGA,QAAM,UAAU,QAAQ,WAAW,aAAa,OAAO;AACvD,MAAI,QAAQ,WAAW,GAAG;AAExB,WAAO;AAAA,EACT;AAGA,QAAM,QAAkB,CAAC,QAAQ,IAAI,WAAW,EAAE,KAAK,GAAG,CAAC;AAC3D,aAAW,UAAU,SAAS;AAC5B,UAAM,MAAM,QAAQ,IAAI,OAAK,YAAa,OAAmC,CAAC,CAAC,CAAC;AAChF,UAAM,KAAK,IAAI,KAAK,GAAG,CAAC;AAAA,EAC1B;AACA,SAAO,MAAM,KAAK,GAAG;AACvB;AAUA,eAAsB,SAAS,OAAc,SAA8C;AACzF,QAAM,MAAM,MAAM,SAAS,OAAO,OAAO;AACzC,QAAM,WAAW,QAAQ,YAAY,GAAG,QAAQ,UAAU;AAC1D,QAAM,OAAO,IAAI,KAAK,CAAC,GAAG,GAAG,EAAE,MAAM,yBAAyB,CAAC;AAC/D,QAAM,MAAM,IAAI,gBAAgB,IAAI;AACpC,QAAM,IAAI,SAAS,cAAc,GAAG;AACpC,IAAE,OAAO;AACT,IAAE,WAAW;AACb,IAAE,MAAM;AACR,MAAI,gBAAgB,GAAG;AACzB;AAOA,eAAsB,MACpB,OACA,MACA,SACe;AACf,MAAI,QAAQ,qBAAqB,MAAM;AACrC,UAAM,IAAI;AAAA,MACR;AAAA,IAGF;AAAA,EACF;
AACA,QAAM,MAAM,MAAM,SAAS,OAAO,OAAO;AAEzC,QAAM,EAAE,UAAU,IAAI,MAAM,OAAO,aAAkB;AACrD,QAAM,UAAU,MAAM,KAAK,OAAO;AACpC;AASA,SAAS,YAAY,OAAwB;AAC3C,MAAI,UAAU,QAAQ,UAAU,OAAW,QAAO;AAClD,MAAI,OAAO,UAAU,YAAY,OAAO,UAAU,UAAW,QAAO,OAAO,KAAK;AAChF,MAAI,iBAAiB,KAAM,QAAO,MAAM,YAAY;AACpD,QAAM,IACJ,OAAO,UAAU,WAAW,QAAQ,KAAK,UAAU,KAAK;AAC1D,MAAI,WAAW,KAAK,CAAC,GAAG;AACtB,WAAO,IAAI,EAAE,QAAQ,MAAM,IAAI,CAAC;AAAA,EAClC;AACA,SAAO;AACT;AAOA,SAAS,aAAa,SAAuC;AAC3D,QAAM,UAAoB,CAAC;AAC3B,QAAM,OAAO,oBAAI,IAAY;AAC7B,aAAW,KAAK,SAAS;AACvB,QAAI,KAAK,OAAO,MAAM,UAAU;AAC9B,iBAAW,OAAO,OAAO,KAAK,CAAC,GAAG;AAChC,YAAI,CAAC,KAAK,IAAI,GAAG,GAAG;AAClB,eAAK,IAAI,GAAG;AACZ,kBAAQ,KAAK,GAAG;AAAA,QAClB;AAAA,MACF;AAAA,IACF;AAAA,EACF;AACA,SAAO;AACT;AAuCA,eAAsB,WACpB,OACA,KACA,SAC0B;AAC1B,QAAM,gBAAgB,aAAa,KAAK;AACxC,QAAM,SAAuB,QAAQ,UAAU;AAC/C,QAAM,QAAQ,QAAQ,SAAS;AAC/B,QAAM,QAAQ,QAAQ,eAAe,CAAC;AAEtC,QAAM,OAAO,SAAS,GAAG;AACzB,MAAI,KAAK,WAAW,GAAG;AACrB,WAAO,UAAU,OAAO,QAAQ,YAAY,QAAQ,KAAK;AAAA,EAC3D;AACA,QAAM,SAAS,KAAK,CAAC,KAAK,CAAC;AAC3B,QAAM,UAAqC,CAAC;AAC5C,WAAS,IAAI,GAAG,IAAI,KAAK,QAAQ,KAAK;AACpC,UAAM,MAAM,KAAK,CAAC;AAClB,QAAI,IAAI,WAAW,KAAK,IAAI,CAAC,MAAM,GAAI;AACvC,UAAM,SAAkC,CAAC;AACzC,aAAS,IAAI,GAAG,IAAI,OAAO,QAAQ,KAAK;AACtC,YAAM,MAAM,OAAO,CAAC,KAAK;AACzB,YAAM,OAAO,IAAI,CAAC,KAAK;AACvB,aAAO,GAAG,IAAI,WAAW,MAAM,MAAM,GAAG,CAAC;AAAA,IAC3C;AACA,YAAQ,KAAK,MAAM;AAAA,EACrB;AAEA,QAAM,OAAO,MAAM,UAAU,OAAO,EAAE,CAAC,QAAQ,UAAU,GAAG,QAAQ,GAAG;AAAA,IACrE,aAAa,CAAC,QAAQ,UAAU;AAAA,IAChC;AAAA,EACF,CAAC;AAED,SAAO;AAAA,IACL;AAAA,IACA;AAAA,IACA,MAAM,QAAuB;AAK3B,YAAM,MAAM,MAAM,YAAY,CAAC,OAAO;AACpC,cAAM,UAAU,GAAG,MAAM,MAAM,IAAI;AACnC,mBAAW,SAAS,KAAK,OAAO;AAC9B,kBAAQ,WAAW,MAAM,UAAU,EAAE,IAAI,MAAM,IAAI,MAAM,MAAM;AAAA,QACjE;AACA,YAAI,WAAW,eAAe;AAC5B,qBAAW,SAAS,KAAK,UAAU;AACjC,oBAAQ,WAAW,MAAM,UAAU,EAAE,IAAI,MAAM,IAAI,MAAM,MAAM;AAAA,UACjE;AAAA,QACF;AACA,YAAI,WAAW,WAAW;AACxB,qBAAW,SAAS,KAAK,SAAS;AAChC,oBAAQ,WAAW,MAAM,UAAU,EAAE,OAAO,MAAM,EAAE;AAAA,UACtD;AAAA,QACF;AAAA,MACF,CAAC;AAAA,IACH;AAAA,EACF;AACF;AAEA,eAAe,UACb,OACA,YACA,QACA,OAC0B;AAC1
B,QAAM,OAAO,MAAM,UAAU,OAAO,EAAE,CAAC,UAAU,GAAG,CAAC,EAAE,GAAG,EAAE,aAAa,CAAC,UAAU,GAAG,MAAM,CAAC;AAC9F,SAAO,EAAE,MAAM,QAAQ,MAAM,QAAQ;AAAA,EAAsB,EAAE;AAC/D;AAEA,SAAS,WAAW,MAAc,MAAiD;AACjF,MAAI,SAAS,UAAU;AACrB,QAAI,SAAS,GAAI,QAAO;AACxB,UAAM,IAAI,OAAO,IAAI;AACrB,WAAO,OAAO,SAAS,CAAC,IAAI,IAAI;AAAA,EAClC;AACA,MAAI,SAAS,WAAW;AACtB,QAAI,SAAS,OAAQ,QAAO;AAC5B,QAAI,SAAS,QAAS,QAAO;AAC7B,WAAO;AAAA,EACT;AACA,SAAO;AACT;AAWA,SAAS,SAAS,OAA2B;AAC3C,QAAM,OAAmB,CAAC;AAC1B,MAAI,MAAgB,CAAC;AACrB,MAAI,QAAQ;AACZ,MAAI,WAAW;AACf,MAAI,IAAI;AAER,SAAO,IAAI,MAAM,QAAQ;AACvB,UAAM,KAAK,MAAM,CAAC;AAClB,QAAI,UAAU;AACZ,UAAI,OAAO,KAAK;AACd,YAAI,MAAM,IAAI,CAAC,MAAM,KAAK;AACxB,mBAAS;AACT,eAAK;AACL;AAAA,QACF;AACA,mBAAW;AACX;AACA;AAAA,MACF;AACA,eAAS;AACT;AACA;AAAA,IACF;AACA,QAAI,OAAO,KAAK;AACd,iBAAW;AACX;AACA;AAAA,IACF;AACA,QAAI,OAAO,KAAK;AACd,UAAI,KAAK,KAAK;AACd,cAAQ;AACR;AACA;AAAA,IACF;AACA,QAAI,OAAO,QAAQ,MAAM,IAAI,CAAC,MAAM,MAAM;AACxC,UAAI,KAAK,KAAK;AACd,WAAK,KAAK,GAAG;AACb,YAAM,CAAC;AACP,cAAQ;AACR,WAAK;AACL;AAAA,IACF;AACA,QAAI,OAAO,QAAQ,OAAO,MAAM;AAC9B,UAAI,KAAK,KAAK;AACd,WAAK,KAAK,GAAG;AACb,YAAM,CAAC;AACP,cAAQ;AACR;AACA;AAAA,IACF;AACA,aAAS;AACT;AAAA,EACF;AAGA,MAAI,UAAU,MAAM,IAAI,SAAS,GAAG;AAClC,QAAI,KAAK,KAAK;AACd,SAAK,KAAK,GAAG;AAAA,EACf;AAEA,SAAO;AACT;","names":[]}
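The `coerceCell` helper in `dist/index.js` above is the whole `columnTypes` story: a hint opts a column into coercion, and any cell that does not cleanly coerce falls back to the raw string. A standalone copy for illustration:

```javascript
// Coerce a CSV cell per an optional column-type hint; without a hint,
// or when coercion does not apply, the raw string is kept.
function coerceCell(cell, type) {
  if (type === "number") {
    if (cell === "") return undefined; // empty cell: no value at all
    const n = Number(cell);
    return Number.isFinite(n) ? n : cell; // non-numeric falls back
  }
  if (type === "boolean") {
    if (cell === "true") return true;
    if (cell === "false") return false;
    return cell;
  }
  return cell;
}

console.log(coerceCell("1001", "number"));  // 1001
console.log(coerceCell("oops", "number"));  // 'oops'
console.log(coerceCell("true", "boolean")); // true
console.log(coerceCell("42"));              // '42' (no hint: stays a string)
```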
package/package.json ADDED
@@ -0,0 +1,67 @@
1
+ {
2
+ "name": "@noy-db/as-csv",
3
+ "version": "0.1.0-pre.3",
4
+ "description": "CSV plaintext export for noy-db — decrypts records and formats as comma-separated values. Gated by the RFC #249 `canExportPlaintext` capability bit; writes an audit-ledger entry on every call. Part of the @noy-db/as-* portable-artefact family (plaintext tier).",
5
+ "license": "MIT",
6
+ "author": "vLannaAi <vicio@lanna.ai>",
7
+ "homepage": "https://github.com/vLannaAi/noy-db/tree/main/packages/as-csv#readme",
8
+ "repository": {
9
+ "type": "git",
10
+ "url": "git+https://github.com/vLannaAi/noy-db.git",
11
+ "directory": "packages/as-csv"
12
+ },
13
+ "bugs": {
14
+ "url": "https://github.com/vLannaAi/noy-db/issues"
15
+ },
16
+ "type": "module",
17
+ "sideEffects": false,
18
+ "exports": {
19
+ ".": {
20
+ "import": {
21
+ "types": "./dist/index.d.ts",
22
+ "default": "./dist/index.js"
23
+ },
24
+ "require": {
25
+ "types": "./dist/index.d.cts",
26
+ "default": "./dist/index.cjs"
27
+ }
28
+ }
29
+ },
30
+ "main": "./dist/index.cjs",
31
+ "module": "./dist/index.js",
32
+ "types": "./dist/index.d.ts",
33
+ "files": [
34
+ "dist",
35
+ "README.md",
36
+ "LICENSE"
37
+ ],
38
+ "engines": {
39
+ "node": ">=18.0.0"
40
+ },
41
+ "peerDependencies": {
42
+ "@noy-db/hub": "0.1.0-pre.3"
43
+ },
44
+ "devDependencies": {
45
+ "@types/node": "^22.0.0",
46
+ "@noy-db/hub": "0.1.0-pre.3"
47
+ },
48
+ "keywords": [
49
+ "noy-db",
50
+ "as-csv",
51
+ "csv",
52
+ "export",
53
+ "plaintext",
54
+ "spreadsheet",
55
+ "zero-knowledge"
56
+ ],
57
+ "publishConfig": {
58
+ "access": "public",
59
+ "tag": "latest"
60
+ },
61
+ "scripts": {
62
+ "build": "tsup",
63
+ "test": "vitest run",
64
+ "lint": "eslint src/",
65
+ "typecheck": "tsc --noEmit"
66
+ }
67
+ }
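The three `ImportPolicy` values differ only in which diff buckets `apply()` replays, as seen in the `fromString` branching in `dist/index.js` above. A toy reconciliation over a `Map` makes the difference concrete; the `applyPlan` function and the `added`/`modified`/`deleted` pair-list shape are hypothetical stand-ins for `VaultDiff`, chosen only for this sketch:

```javascript
// Replay a diff into a Map under a given policy: added entries always
// land, modified entries are skipped under 'insert-only', and deleted
// entries are only removed under 'replace'.
function applyPlan(store, plan, policy) {
  for (const [id, record] of plan.added) store.set(id, record);
  if (policy !== "insert-only") {
    for (const [id, record] of plan.modified) store.set(id, record);
  }
  if (policy === "replace") {
    for (const id of plan.deleted) store.delete(id);
  }
  return store;
}

const plan = {
  added: [["3", { name: "new" }]],
  modified: [["1", { name: "changed" }]],
  deleted: ["2"],
};

const merged = applyPlan(new Map([["1", {}], ["2", {}]]), plan, "merge");
console.log(merged.size); // 3: '2' survives, '1' updated, '3' added

const replaced = applyPlan(new Map([["1", {}], ["2", {}]]), plan, "replace");
console.log(replaced.has("2")); // false
```

Under `'insert-only'` the same plan would add `'3'` but leave `'1'` untouched, which is why it is the safe default for ingesting exports from an untrusted source.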