@stackmemoryai/stackmemory 0.5.29 → 0.5.31

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (42)
  1. package/README.md +53 -32
  2. package/dist/core/database/batch-operations.js +29 -4
  3. package/dist/core/database/batch-operations.js.map +2 -2
  4. package/dist/core/database/connection-pool.js +13 -2
  5. package/dist/core/database/connection-pool.js.map +2 -2
  6. package/dist/core/database/migration-manager.js +130 -34
  7. package/dist/core/database/migration-manager.js.map +2 -2
  8. package/dist/core/database/paradedb-adapter.js +23 -7
  9. package/dist/core/database/paradedb-adapter.js.map +2 -2
  10. package/dist/core/database/query-router.js +8 -3
  11. package/dist/core/database/query-router.js.map +2 -2
  12. package/dist/core/database/sqlite-adapter.js +152 -33
  13. package/dist/core/database/sqlite-adapter.js.map +2 -2
  14. package/dist/integrations/linear/auth.js +34 -20
  15. package/dist/integrations/linear/auth.js.map +2 -2
  16. package/dist/integrations/linear/auto-sync.js +18 -8
  17. package/dist/integrations/linear/auto-sync.js.map +2 -2
  18. package/dist/integrations/linear/client.js +42 -9
  19. package/dist/integrations/linear/client.js.map +2 -2
  20. package/dist/integrations/linear/migration.js +94 -36
  21. package/dist/integrations/linear/migration.js.map +2 -2
  22. package/dist/integrations/linear/oauth-server.js +77 -34
  23. package/dist/integrations/linear/oauth-server.js.map +2 -2
  24. package/dist/integrations/linear/rest-client.js +13 -3
  25. package/dist/integrations/linear/rest-client.js.map +2 -2
  26. package/dist/integrations/linear/sync-service.js +18 -15
  27. package/dist/integrations/linear/sync-service.js.map +2 -2
  28. package/dist/integrations/linear/sync.js +12 -4
  29. package/dist/integrations/linear/sync.js.map +2 -2
  30. package/dist/integrations/linear/unified-sync.js +33 -8
  31. package/dist/integrations/linear/unified-sync.js.map +2 -2
  32. package/dist/integrations/linear/webhook-handler.js +5 -1
  33. package/dist/integrations/linear/webhook-handler.js.map +2 -2
  34. package/dist/integrations/linear/webhook-server.js +7 -7
  35. package/dist/integrations/linear/webhook-server.js.map +2 -2
  36. package/dist/integrations/linear/webhook.js +9 -2
  37. package/dist/integrations/linear/webhook.js.map +2 -2
  38. package/dist/integrations/mcp/schemas.js +147 -0
  39. package/dist/integrations/mcp/schemas.js.map +7 -0
  40. package/dist/integrations/mcp/server.js +19 -3
  41. package/dist/integrations/mcp/server.js.map +2 -2
  42. package/package.json +1 -1
package/README.md CHANGED
@@ -1,19 +1,19 @@
  # StackMemory
 
- **Lossless, project-scoped memory for AI tools** • v0.3.16
+ **Lossless, project-scoped memory for AI tools** • v0.5.30
 
  StackMemory is a **production-ready memory runtime** for AI coding tools that preserves full project context across sessions. With **Phases 1-4 complete**, it delivers:
 
- - **89-98% faster** task operations than manual tracking
- - **10,000+ frame depth** support with hierarchical organization
- - **Full Linear integration** with bidirectional sync
- - **20+ MCP tools** for Claude Code
- - **Context persistence** that survives /clear operations
- - **Two-tier storage system** with local tiers and infinite remote storage
- - **Smart compression** (LZ4/ZSTD) with 2.5-3.5x ratios
- - **Background migration** with configurable triggers
- - **296 tests passing** with improved error handling
- - **npm v0.3.16** published with production-ready improvements
+ - **89-98% faster** task operations than manual tracking
+ - **10,000+ frame depth** support with hierarchical organization
+ - **Full Linear integration** with bidirectional sync
+ - **20+ MCP tools** for Claude Code
+ - **Context persistence** that survives /clear operations
+ - **Two-tier storage system** with local tiers and infinite remote storage
+ - **Smart compression** (LZ4/ZSTD) with 2.5-3.5x ratios
+ - **Background migration** with configurable triggers
+ - **396 tests passing** with standardized error handling
+ - **npm v0.5.30** published with WhatsApp notifications and improved integrations
 
  Instead of a linear chat log, StackMemory organizes memory as a **call stack** of scoped work (frames), with intelligent LLM-driven retrieval and team collaboration features.
 
@@ -96,18 +96,18 @@ The editor never manages memory directly; it asks StackMemory for the **context
 
  ## Product Health Metrics
 
- ### Current Status (v0.3.16)
+ ### Current Status (v0.5.30)
 
  | Metric | Current | Target | Status |
  |--------|---------|--------|--------|
- | **Test Coverage** | 80% | 90% | 🟡 |
- | **Performance (p50)** | TBD | <50ms | 🔄 |
- | **Documentation** | 60% | 100% | 🟡 |
- | **Active Issues** | 13 high | 0 high | 🟡 |
- | **Code Quality** | 296 tests | 350+ | |
- | **npm Downloads** | Growing | 1K+/week | 🚀 |
+ | **Test Coverage** | 85% | 90% | In Progress |
+ | **Performance (p50)** | TBD | <50ms | Pending |
+ | **Documentation** | 70% | 100% | In Progress |
+ | **Active Issues** | 5 high | 0 high | In Progress |
+ | **Code Quality** | 396 tests | 400+ | Done |
+ | **npm Downloads** | Growing | 1K+/week | On Track |
 
- ### Quality Score: 72/100
+ ### Quality Score: 78/100
 
  **Formula:** (Test Coverage × 0.3) + (Performance × 0.3) + (Documentation × 0.2) + (Issues Resolution × 0.2)
 
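Note on the score above: the 78/100 figure matches the formula only under assumed sub-scores for the two metrics that are not yet measured. For example, treating Performance as 75 and Issues Resolution as 80 gives (85 × 0.3) + (75 × 0.3) + (70 × 0.2) + (80 × 0.2) = 25.5 + 22.5 + 14 + 16 = 78; those two sub-scores are illustrative assumptions, not values reported by the package.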
@@ -136,7 +136,7 @@ This creates a **project-scoped memory space** tied to the repo.
  ### Step 2: Install StackMemory
 
  ```bash
- npm install -g @stackmemoryai/stackmemory@0.3.16
+ npm install -g @stackmemoryai/stackmemory@0.5.30
  # or latest
  npm install -g @stackmemoryai/stackmemory@latest
  ```
@@ -455,26 +455,48 @@ stackmemory mcp-server [--port 3001]
  - Hosted: **Private beta**
  - OSS mirror: **Production ready**
  - MCP integration: **Stable**
- - CLI: **v0.3.16** - Full task, context, Linear, and storage management
+ - CLI: **v0.5.30** - Full task, context, Linear, and storage management
  - Two-tier storage: **Complete**
- - Test Suite: **296 tests passing**
+ - Test Suite: **396 tests passing**
 
  ---
 
  ## Changelog
 
+ ### v0.5.30 (2026-01-26)
+ - Standardized error handling with `IntegrationError`, `DatabaseError`, `ValidationError`
+ - Adopted error classes across Linear integration (12 files)
+ - Adopted error classes across database layer (6 files)
+ - WhatsApp notifications with session ID and interactive options
+ - 396 tests passing with improved code quality
+
+ ### v0.5.28 (2026-01-25)
+ - WhatsApp flag for claude-sm automatic notifications
+ - Incoming request queue for WhatsApp triggers
+ - SMS webhook /send endpoint for outgoing notifications
+
+ ### v0.5.26 (2026-01-24)
+ - OpenCode wrapper (opencode-sm) with context integration
+ - Discovery CLI and MCP tools
+ - Real LLM provider and retrieval audit system
+ - Linear issue management and task picker
+
+ ### v0.5.21 (2026-01-23)
+ - Claude-sm remote mode and configurable defaults
+ - Context loading command improvements
+ - Session summary features
+
  ### v0.3.16 (2026-01-15)
- - Fixed critical error handling - getFrame() returns undefined instead of throwing
- - Improved test coverage and fixed StackMemoryError constructor usage
- - Removed dangerous secret-cleaning scripts from repository
- - All tests passing, lint clean, build successful
- - ✅ Published to npm with production-ready improvements
+ - Fixed critical error handling - getFrame() returns undefined instead of throwing
+ - Improved test coverage and fixed StackMemoryError constructor usage
+ - Removed dangerous secret-cleaning scripts from repository
+ - All tests passing, lint clean, build successful
 
  ### v0.3.15 (2026-01-14)
- - Two-tier storage system implementation complete
- - Smart compression with LZ4/ZSTD support
- - Background migration with configurable triggers
- - Improved Linear integration with bidirectional sync
+ - Two-tier storage system implementation complete
+ - Smart compression with LZ4/ZSTD support
+ - Background migration with configurable triggers
+ - Improved Linear integration with bidirectional sync
 
  ---
 
@@ -508,4 +530,3 @@ stackmemory mcp-server [--port 3001]
  - [Beads Integration](./BEADS_INTEGRATION.md) - Git-native memory patterns from Beads ecosystem
 
  ---
- # Husky fix successful
package/dist/core/database/batch-operations.js CHANGED
@@ -4,6 +4,7 @@ const __filename = __fileURLToPath(import.meta.url);
  const __dirname = __pathDirname(__filename);
  import { logger } from "../monitoring/logger.js";
  import { trace } from "../trace/index.js";
+ import { ErrorCode, wrapError } from "../errors/index.js";
  class BatchOperationsManager {
  db;
  preparedStatements = /* @__PURE__ */ new Map();
@@ -92,9 +93,15 @@ class BatchOperationsManager {
  stats.successfulInserts += result.changes;
  } catch (error) {
  stats.failedInserts++;
+ const wrappedError = wrapError(
+ error,
+ "Failed to update frame digest",
+ ErrorCode.DB_UPDATE_FAILED,
+ { frameId: update.frame_id }
+ );
  logger.warn("Failed to update frame digest", {
  frameId: update.frame_id,
- error: error.message
+ error: wrappedError.message
  });
  }
  }
@@ -155,9 +162,15 @@ class BatchOperationsManager {
  stats.successfulInserts += result.changes;
  } catch (error) {
  stats.failedInserts++;
+ const wrappedError = wrapError(
+ error,
+ `Failed to insert ${table} record`,
+ ErrorCode.DB_INSERT_FAILED,
+ { table, record }
+ );
  logger.warn(`Failed to insert ${table} record`, {
  record,
- error: error.message
+ error: wrappedError.message
  });
  }
  }
@@ -209,7 +222,13 @@ class BatchOperationsManager {
  }
  } catch (error) {
  stats.failedInserts += batch.length;
- logger.error("Batch processing failed", error, {
+ const wrappedError = wrapError(
+ error,
+ "Batch processing failed",
+ ErrorCode.DB_TRANSACTION_FAILED,
+ { batchNumber: stats.batchesProcessed + 1, batchSize: batch.length }
+ );
+ logger.error("Batch processing failed", wrappedError, {
  batchNumber: stats.batchesProcessed + 1,
  batchSize: batch.length
  });
@@ -245,7 +264,13 @@ class BatchOperationsManager {
  tables: groupedOps.size
  });
  } catch (error) {
- logger.error("Batch queue processing failed", error);
+ const wrappedError = wrapError(
+ error,
+ "Batch queue processing failed",
+ ErrorCode.DB_TRANSACTION_FAILED,
+ { operationsCount: operations.length }
+ );
+ logger.error("Batch queue processing failed", wrappedError);
  } finally {
  this.isProcessing = false;
  }
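The hunks above replace direct `error.message` logging with `wrapError`, which attaches an `ErrorCode` and structured context before the logger call. The errors module itself (`src/core/errors/index.ts`) is not part of this diff, so the following is only a minimal sketch consistent with the call sites; the enum string values and property names are assumptions.

```typescript
// Sketch of the error helpers these hunks rely on. The real implementations live in
// src/core/errors/index.ts, which is not included in this diff; the shapes below are
// inferred from the call sites (wrapError(error, message, code, context)) and the
// DatabaseError constructor used in connection-pool.js. String values are assumed.
export enum ErrorCode {
  DB_INSERT_FAILED = 'DB_INSERT_FAILED',
  DB_UPDATE_FAILED = 'DB_UPDATE_FAILED',
  DB_TRANSACTION_FAILED = 'DB_TRANSACTION_FAILED',
  DB_CONNECTION_FAILED = 'DB_CONNECTION_FAILED',
}

export class DatabaseError extends Error {
  readonly code: ErrorCode;
  readonly context?: Record<string, unknown>;
  readonly original?: Error;

  constructor(
    message: string,
    code: ErrorCode,
    context?: Record<string, unknown>,
    original?: Error
  ) {
    super(message);
    this.name = 'DatabaseError';
    this.code = code;
    this.context = context;
    this.original = original;
  }
}

// Normalizes an unknown thrown value into a typed error whose .message is safe to log.
export function wrapError(
  error: unknown,
  message: string,
  code: ErrorCode,
  context?: Record<string, unknown>
): DatabaseError {
  const original = error instanceof Error ? error : undefined;
  const detail = original ? original.message : String(error);
  return new DatabaseError(`${message}: ${detail}`, code, context, original);
}
```

Whatever the real implementation looks like, the call sites in this file only require that the returned value expose a `message` string suitable for logging.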
package/dist/core/database/batch-operations.js.map CHANGED
@@ -1,7 +1,7 @@
  {
  "version": 3,
  "sources": ["../../../src/core/database/batch-operations.ts"],
- "sourcesContent": ["/**\n * Batch Database Operations\n * High-performance bulk operations with transaction management\n */\n\nimport Database from 'better-sqlite3';\nimport { getConnectionPool } from './connection-pool.js';\nimport { logger } from '../monitoring/logger.js';\nimport { trace } from '../trace/index.js';\n\nexport interface BatchOperation {\n table: string;\n operation: 'insert' | 'update' | 'delete';\n data: Record<string, any>[];\n onConflict?: 'ignore' | 'replace' | 'update';\n}\n\nexport interface BulkInsertOptions {\n batchSize?: number;\n onConflict?: 'ignore' | 'replace' | 'update';\n enableTransactions?: boolean;\n parallelTables?: boolean;\n}\n\nexport interface BatchStats {\n totalRecords: number;\n batchesProcessed: number;\n successfulInserts: number;\n failedInserts: number;\n totalTimeMs: number;\n avgBatchTimeMs: number;\n}\n\n/**\n * High-performance batch operations manager\n */\nexport class BatchOperationsManager {\n private db: Database.Database;\n private preparedStatements = new Map<string, Database.Statement>();\n private batchQueue: BatchOperation[] = [];\n private isProcessing = false;\n\n constructor(db?: Database.Database) {\n if (db) {\n this.db = db;\n this.initializePreparedStatements();\n } else {\n // Will be initialized when used with getConnectionPool().withConnection()\n this.db = undefined as any;\n }\n }\n\n /**\n * Add events in bulk with optimized batching\n */\n async bulkInsertEvents(\n events: Array<{\n frame_id: string;\n run_id: string;\n seq: number;\n event_type: string;\n payload: any;\n ts: number;\n }>,\n options: BulkInsertOptions = {}\n ): Promise<BatchStats> {\n const {\n batchSize = 100,\n onConflict = 'ignore',\n enableTransactions = true,\n } = options;\n\n return this.performBulkInsert('events', events, {\n batchSize,\n onConflict,\n enableTransactions,\n preprocessor: (event) => ({\n ...event,\n event_id: `evt_${event.frame_id}_${event.seq}_${Date.now()}`,\n payload: JSON.stringify(event.payload),\n }),\n });\n }\n\n /**\n * Add anchors in bulk\n */\n async bulkInsertAnchors(\n anchors: Array<{\n frame_id: string;\n type: string;\n text: string;\n priority: number;\n metadata: any;\n }>,\n options: BulkInsertOptions = {}\n ): Promise<BatchStats> {\n return this.performBulkInsert('anchors', anchors, {\n ...options,\n preprocessor: (anchor) => ({\n ...anchor,\n anchor_id: `anc_${anchor.frame_id}_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,\n metadata: JSON.stringify(anchor.metadata),\n created_at: Date.now(),\n }),\n });\n }\n\n /**\n * Bulk update frame digests\n */\n async bulkUpdateFrameDigests(\n updates: Array<{\n frame_id: string;\n digest_text: string;\n digest_json: any;\n closed_at?: number;\n }>,\n options: BulkInsertOptions = {}\n ): Promise<BatchStats> {\n const { batchSize = 50, enableTransactions = true } = options;\n\n return trace.traceAsync(\n 'function',\n 'bulkUpdateFrameDigests',\n { count: updates.length },\n async () => {\n const startTime = performance.now();\n const stats: BatchStats = {\n totalRecords: updates.length,\n batchesProcessed: 0,\n successfulInserts: 0,\n failedInserts: 0,\n totalTimeMs: 0,\n avgBatchTimeMs: 0,\n };\n\n if (updates.length === 0) return stats;\n\n const stmt = this.db.prepare(`\n UPDATE frames \n SET digest_text = ?, \n digest_json = ?, \n closed_at = COALESCE(?, closed_at),\n state = CASE WHEN ? 
IS NOT NULL THEN 'closed' ELSE state END\n WHERE frame_id = ?\n `);\n\n const updateFn = (batch: typeof updates) => {\n for (const update of batch) {\n try {\n const result = stmt.run(\n update.digest_text,\n JSON.stringify(update.digest_json),\n update.closed_at,\n update.closed_at,\n update.frame_id\n );\n stats.successfulInserts += result.changes;\n } catch (error: unknown) {\n stats.failedInserts++;\n logger.warn('Failed to update frame digest', {\n frameId: update.frame_id,\n error: (error as Error).message,\n });\n }\n }\n };\n\n if (enableTransactions) {\n const transaction = this.db.transaction(updateFn);\n await this.processBatches(updates, batchSize, transaction, stats);\n } else {\n await this.processBatches(updates, batchSize, updateFn, stats);\n }\n\n stats.totalTimeMs = performance.now() - startTime;\n stats.avgBatchTimeMs =\n stats.batchesProcessed > 0\n ? stats.totalTimeMs / stats.batchesProcessed\n : 0;\n\n logger.info(\n 'Bulk frame digest update completed',\n stats as unknown as Record<string, unknown>\n );\n return stats;\n }\n );\n }\n\n /**\n * Generic bulk insert with preprocessing\n */\n private async performBulkInsert<T extends Record<string, any>>(\n table: string,\n records: T[],\n options: BulkInsertOptions & {\n preprocessor?: (record: T) => Record<string, any>;\n } = {}\n ): Promise<BatchStats> {\n const {\n batchSize = 100,\n onConflict = 'ignore',\n enableTransactions = true,\n preprocessor,\n } = options;\n\n return trace.traceAsync(\n 'function',\n `bulkInsert${table}`,\n { count: records.length },\n async () => {\n const startTime = performance.now();\n const stats: BatchStats = {\n totalRecords: records.length,\n batchesProcessed: 0,\n successfulInserts: 0,\n failedInserts: 0,\n totalTimeMs: 0,\n avgBatchTimeMs: 0,\n };\n\n if (records.length === 0) return stats;\n\n // Preprocess records if needed\n const processedRecords = preprocessor\n ? records.map(preprocessor)\n : records;\n\n // Build dynamic insert statement\n const firstRecord = processedRecords[0];\n const columns = Object.keys(firstRecord);\n const placeholders = columns.map(() => '?').join(', ');\n const conflictClause = this.getConflictClause(onConflict);\n\n const insertSql = `INSERT ${conflictClause} INTO ${table} (${columns.join(', ')}) VALUES (${placeholders})`;\n const stmt = this.db.prepare(insertSql);\n\n const insertFn = (batch: typeof processedRecords) => {\n for (const record of batch) {\n try {\n const values = columns.map((col: any) => record[col]);\n const result = stmt.run(...values);\n stats.successfulInserts += result.changes;\n } catch (error: unknown) {\n stats.failedInserts++;\n logger.warn(`Failed to insert ${table} record`, {\n record,\n error: (error as Error).message,\n });\n }\n }\n };\n\n if (enableTransactions) {\n const transaction = this.db.transaction(insertFn);\n await this.processBatches(\n processedRecords,\n batchSize,\n transaction,\n stats\n );\n } else {\n await this.processBatches(\n processedRecords,\n batchSize,\n insertFn,\n stats\n );\n }\n\n stats.totalTimeMs = performance.now() - startTime;\n stats.avgBatchTimeMs =\n stats.batchesProcessed > 0\n ? 
stats.totalTimeMs / stats.batchesProcessed\n : 0;\n\n logger.info(\n `Bulk ${table} insert completed`,\n stats as unknown as Record<string, unknown>\n );\n return stats;\n }\n );\n }\n\n /**\n * Process records in batches\n */\n private async processBatches<T>(\n records: T[],\n batchSize: number,\n processFn: (batch: T[]) => void,\n stats: BatchStats\n ): Promise<void> {\n for (let i = 0; i < records.length; i += batchSize) {\n const batch = records.slice(i, i + batchSize);\n const batchStart = performance.now();\n\n try {\n processFn(batch);\n stats.batchesProcessed++;\n\n const batchTime = performance.now() - batchStart;\n logger.debug('Batch processed', {\n batchNumber: stats.batchesProcessed,\n records: batch.length,\n timeMs: batchTime.toFixed(2),\n });\n\n // Yield control to prevent blocking\n if (stats.batchesProcessed % 10 === 0) {\n await new Promise((resolve) => setImmediate(resolve));\n }\n } catch (error: unknown) {\n stats.failedInserts += batch.length;\n logger.error('Batch processing failed', error as Error, {\n batchNumber: stats.batchesProcessed + 1,\n batchSize: batch.length,\n });\n }\n }\n }\n\n /**\n * Queue batch operation for later processing\n */\n queueBatchOperation(operation: BatchOperation): void {\n this.batchQueue.push(operation);\n\n if (this.batchQueue.length >= 10 && !this.isProcessing) {\n setImmediate(() => this.processBatchQueue());\n }\n }\n\n /**\n * Process queued batch operations\n */\n async processBatchQueue(): Promise<void> {\n if (this.isProcessing || this.batchQueue.length === 0) {\n return;\n }\n\n this.isProcessing = true;\n const operations = [...this.batchQueue];\n this.batchQueue = [];\n\n try {\n const groupedOps = this.groupOperationsByTable(operations);\n\n for (const [table, tableOps] of groupedOps) {\n await this.processTableOperations(table, tableOps);\n }\n\n logger.info('Batch queue processed', {\n operations: operations.length,\n tables: groupedOps.size,\n });\n } catch (error: unknown) {\n logger.error('Batch queue processing failed', error as Error);\n } finally {\n this.isProcessing = false;\n }\n }\n\n /**\n * Flush any remaining queued operations\n */\n async flush(): Promise<void> {\n if (this.batchQueue.length > 0) {\n await this.processBatchQueue();\n }\n }\n\n /**\n * Get SQL conflict clause\n */\n private getConflictClause(onConflict: string): string {\n switch (onConflict) {\n case 'ignore':\n return 'OR IGNORE';\n case 'replace':\n return 'OR REPLACE';\n case 'update':\n return 'ON CONFLICT DO UPDATE SET';\n default:\n return '';\n }\n }\n\n /**\n * Group operations by table for efficient processing\n */\n private groupOperationsByTable(\n operations: BatchOperation[]\n ): Map<string, BatchOperation[]> {\n const grouped = new Map<string, BatchOperation[]>();\n\n for (const op of operations) {\n if (!grouped.has(op.table)) {\n grouped.set(op.table, []);\n }\n grouped.get(op.table)!.push(op);\n }\n\n return grouped;\n }\n\n /**\n * Process all operations for a specific table\n */\n private async processTableOperations(\n table: string,\n operations: BatchOperation[]\n ): Promise<void> {\n for (const op of operations) {\n switch (op.operation) {\n case 'insert':\n await this.performBulkInsert(table, op.data, {\n onConflict: op.onConflict,\n });\n break;\n // Add update and delete operations as needed\n default:\n logger.warn('Unsupported batch operation', {\n table,\n operation: op.operation,\n });\n }\n }\n }\n\n /**\n * Initialize commonly used prepared statements\n */\n private initializePreparedStatements(): void {\n 
// Event insertion\n this.preparedStatements.set(\n 'insert_event',\n this.db.prepare(`\n INSERT OR IGNORE INTO events \n (event_id, frame_id, run_id, seq, event_type, payload, ts) \n VALUES (?, ?, ?, ?, ?, ?, ?)\n `)\n );\n\n // Anchor insertion\n this.preparedStatements.set(\n 'insert_anchor',\n this.db.prepare(`\n INSERT OR IGNORE INTO anchors \n (anchor_id, frame_id, type, text, priority, metadata, created_at) \n VALUES (?, ?, ?, ?, ?, ?, ?)\n `)\n );\n\n logger.info('Batch operations prepared statements initialized');\n }\n\n /**\n * Cleanup resources\n */\n cleanup(): void {\n // Modern better-sqlite3 automatically handles cleanup\n this.preparedStatements.clear();\n }\n}\n\n// Global batch operations manager\nlet globalBatchManager: BatchOperationsManager | null = null;\n\n/**\n * Get or create global batch operations manager\n */\nexport function getBatchManager(\n db?: Database.Database\n): BatchOperationsManager {\n if (!globalBatchManager) {\n globalBatchManager = new BatchOperationsManager(db);\n }\n return globalBatchManager;\n}\n\n/**\n * Convenience function for bulk event insertion\n */\nexport async function bulkInsertEvents(\n events: any[],\n options?: BulkInsertOptions\n): Promise<BatchStats> {\n const manager = getBatchManager();\n return manager.bulkInsertEvents(events, options);\n}\n\n/**\n * Convenience function for bulk anchor insertion\n */\nexport async function bulkInsertAnchors(\n anchors: any[],\n options?: BulkInsertOptions\n): Promise<BatchStats> {\n const manager = getBatchManager();\n return manager.bulkInsertAnchors(anchors, options);\n}\n"],
- "mappings": ";;;;AAOA,SAAS,cAAc;AACvB,SAAS,aAAa;AA4Bf,MAAM,uBAAuB;AAAA,EAC1B;AAAA,EACA,qBAAqB,oBAAI,IAAgC;AAAA,EACzD,aAA+B,CAAC;AAAA,EAChC,eAAe;AAAA,EAEvB,YAAY,IAAwB;AAClC,QAAI,IAAI;AACN,WAAK,KAAK;AACV,WAAK,6BAA6B;AAAA,IACpC,OAAO;AAEL,WAAK,KAAK;AAAA,IACZ;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,iBACJ,QAQA,UAA6B,CAAC,GACT;AACrB,UAAM;AAAA,MACJ,YAAY;AAAA,MACZ,aAAa;AAAA,MACb,qBAAqB;AAAA,IACvB,IAAI;AAEJ,WAAO,KAAK,kBAAkB,UAAU,QAAQ;AAAA,MAC9C;AAAA,MACA;AAAA,MACA;AAAA,MACA,cAAc,CAAC,WAAW;AAAA,QACxB,GAAG;AAAA,QACH,UAAU,OAAO,MAAM,QAAQ,IAAI,MAAM,GAAG,IAAI,KAAK,IAAI,CAAC;AAAA,QAC1D,SAAS,KAAK,UAAU,MAAM,OAAO;AAAA,MACvC;AAAA,IACF,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,kBACJ,SAOA,UAA6B,CAAC,GACT;AACrB,WAAO,KAAK,kBAAkB,WAAW,SAAS;AAAA,MAChD,GAAG;AAAA,MACH,cAAc,CAAC,YAAY;AAAA,QACzB,GAAG;AAAA,QACH,WAAW,OAAO,OAAO,QAAQ,IAAI,KAAK,IAAI,CAAC,IAAI,KAAK,OAAO,EAAE,SAAS,EAAE,EAAE,OAAO,GAAG,CAAC,CAAC;AAAA,QAC1F,UAAU,KAAK,UAAU,OAAO,QAAQ;AAAA,QACxC,YAAY,KAAK,IAAI;AAAA,MACvB;AAAA,IACF,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,uBACJ,SAMA,UAA6B,CAAC,GACT;AACrB,UAAM,EAAE,YAAY,IAAI,qBAAqB,KAAK,IAAI;AAEtD,WAAO,MAAM;AAAA,MACX;AAAA,MACA;AAAA,MACA,EAAE,OAAO,QAAQ,OAAO;AAAA,MACxB,YAAY;AACV,cAAM,YAAY,YAAY,IAAI;AAClC,cAAM,QAAoB;AAAA,UACxB,cAAc,QAAQ;AAAA,UACtB,kBAAkB;AAAA,UAClB,mBAAmB;AAAA,UACnB,eAAe;AAAA,UACf,aAAa;AAAA,UACb,gBAAgB;AAAA,QAClB;AAEA,YAAI,QAAQ,WAAW,EAAG,QAAO;AAEjC,cAAM,OAAO,KAAK,GAAG,QAAQ;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,OAO9B;AAEC,cAAM,WAAW,CAAC,UAA0B;AAC1C,qBAAW,UAAU,OAAO;AAC1B,gBAAI;AACF,oBAAM,SAAS,KAAK;AAAA,gBAClB,OAAO;AAAA,gBACP,KAAK,UAAU,OAAO,WAAW;AAAA,gBACjC,OAAO;AAAA,gBACP,OAAO;AAAA,gBACP,OAAO;AAAA,cACT;AACA,oBAAM,qBAAqB,OAAO;AAAA,YACpC,SAAS,OAAgB;AACvB,oBAAM;AACN,qBAAO,KAAK,iCAAiC;AAAA,gBAC3C,SAAS,OAAO;AAAA,gBAChB,OAAQ,MAAgB;AAAA,cAC1B,CAAC;AAAA,YACH;AAAA,UACF;AAAA,QACF;AAEA,YAAI,oBAAoB;AACtB,gBAAM,cAAc,KAAK,GAAG,YAAY,QAAQ;AAChD,gBAAM,KAAK,eAAe,SAAS,WAAW,aAAa,KAAK;AAAA,QAClE,OAAO;AACL,gBAAM,KAAK,eAAe,SAAS,WAAW,UAAU,KAAK;AAAA,QAC/D;AAEA,cAAM,cAAc,YAAY,IAAI,IAAI;AACxC,cAAM,iBACJ,MAAM,mBAAmB,IACrB,MAAM,cAAc,MAAM,mBAC1B;AAEN,eAAO;AAAA,UACL;AAAA,UACA;AAAA,QACF;AACA,eAAO;AAAA,MACT;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAc,kBACZ,OACA,SACA,UAEI,CAAC,GACgB;AACrB,UAAM;AAAA,MACJ,YAAY;AAAA,MACZ,aAAa;AAAA,MACb,qBAAqB;AAAA,MACrB;AAAA,IACF,IAAI;AAEJ,WAAO,MAAM;AAAA,MACX;AAAA,MACA,aAAa,KAAK;AAAA,MAClB,EAAE,OAAO,QAAQ,OAAO;AAAA,MACxB,YAAY;AACV,cAAM,YAAY,YAAY,IAAI;AAClC,cAAM,QAAoB;AAAA,UACxB,cAAc,QAAQ;AAAA,UACtB,kBAAkB;AAAA,UAClB,mBAAmB;AAAA,UACnB,eAAe;AAAA,UACf,aAAa;AAAA,UACb,gBAAgB;AAAA,QAClB;AAEA,YAAI,QAAQ,WAAW,EAAG,QAAO;AAGjC,cAAM,mBAAmB,eACrB,QAAQ,IAAI,YAAY,IACxB;AAGJ,cAAM,cAAc,iBAAiB,CAAC;AACtC,cAAM,UAAU,OAAO,KAAK,WAAW;AACvC,cAAM,eAAe,QAAQ,IAAI,MAAM,GAAG,EAAE,KAAK,IAAI;AACrD,cAAM,iBAAiB,KAAK,kBAAkB,UAAU;AAExD,cAAM,YAAY,UAAU,cAAc,SAAS,KAAK,KAAK,QAAQ,KAAK,IAAI,CAAC,aAAa,YAAY;AACxG,cAAM,OAAO,KAAK,GAAG,QAAQ,SAAS;AAEtC,cAAM,WAAW,CAAC,UAAmC;AACnD,qBAAW,UAAU,OAAO;AAC1B,gBAAI;AACF,oBAAM,SAAS,QAAQ,IAAI,CAAC,QAAa,OAAO,GAAG,CAAC;AACpD,oBAAM,SAAS,KAAK,IAAI,GAAG,MAAM;AACjC,oBAAM,qBAAqB,OAAO;AAAA,YACpC,SAAS,OAAgB;AACvB,oBAAM;AACN,qBAAO,KAAK,oBAAoB,KAAK,WAAW;AAAA,gBAC9C;AAAA,gBACA,OAAQ,MAAgB;AAAA,cAC1B,CAAC;AAAA,YACH;AAAA,UACF;AAAA,QACF;AAEA,YAAI,oBAAoB;AACtB,gBAAM,cAAc,KAAK,GAAG,YAAY,QAAQ;AAChD,gBAAM,KAAK;AAAA,YACT;AAAA,YACA;AAAA,YACA;AAAA,YACA;AAAA,UACF;AAAA,QACF,OAAO;AACL,gBAAM,KAAK;AAAA,YACT;AAAA,YACA;AAAA,YACA;AAAA,YACA;AAAA,UACF;AAAA,QACF;AAEA,cAAM,cAAc,YAAY,IAAI,IAAI;AACxC,cAAM,iBACJ,MAAM,mBAAmB,IACrB,MAAM,cAAc,MAAM,mBAC1B;AAEN,eAAO;AAAA,UACL,QAAQ,KAAK;AAAA,UACb;AAAA,QACF;AACA,eAAO;AAAA,MACT;AAAA,IACF;AAAA,EACF;AAA
A;AAAA;AAAA;AAAA,EAKA,MAAc,eACZ,SACA,WACA,WACA,OACe;AACf,aAAS,IAAI,GAAG,IAAI,QAAQ,QAAQ,KAAK,WAAW;AAClD,YAAM,QAAQ,QAAQ,MAAM,GAAG,IAAI,SAAS;AAC5C,YAAM,aAAa,YAAY,IAAI;AAEnC,UAAI;AACF,kBAAU,KAAK;AACf,cAAM;AAEN,cAAM,YAAY,YAAY,IAAI,IAAI;AACtC,eAAO,MAAM,mBAAmB;AAAA,UAC9B,aAAa,MAAM;AAAA,UACnB,SAAS,MAAM;AAAA,UACf,QAAQ,UAAU,QAAQ,CAAC;AAAA,QAC7B,CAAC;AAGD,YAAI,MAAM,mBAAmB,OAAO,GAAG;AACrC,gBAAM,IAAI,QAAQ,CAAC,YAAY,aAAa,OAAO,CAAC;AAAA,QACtD;AAAA,MACF,SAAS,OAAgB;AACvB,cAAM,iBAAiB,MAAM;AAC7B,eAAO,MAAM,2BAA2B,OAAgB;AAAA,UACtD,aAAa,MAAM,mBAAmB;AAAA,UACtC,WAAW,MAAM;AAAA,QACnB,CAAC;AAAA,MACH;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,oBAAoB,WAAiC;AACnD,SAAK,WAAW,KAAK,SAAS;AAE9B,QAAI,KAAK,WAAW,UAAU,MAAM,CAAC,KAAK,cAAc;AACtD,mBAAa,MAAM,KAAK,kBAAkB,CAAC;AAAA,IAC7C;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,oBAAmC;AACvC,QAAI,KAAK,gBAAgB,KAAK,WAAW,WAAW,GAAG;AACrD;AAAA,IACF;AAEA,SAAK,eAAe;AACpB,UAAM,aAAa,CAAC,GAAG,KAAK,UAAU;AACtC,SAAK,aAAa,CAAC;AAEnB,QAAI;AACF,YAAM,aAAa,KAAK,uBAAuB,UAAU;AAEzD,iBAAW,CAAC,OAAO,QAAQ,KAAK,YAAY;AAC1C,cAAM,KAAK,uBAAuB,OAAO,QAAQ;AAAA,MACnD;AAEA,aAAO,KAAK,yBAAyB;AAAA,QACnC,YAAY,WAAW;AAAA,QACvB,QAAQ,WAAW;AAAA,MACrB,CAAC;AAAA,IACH,SAAS,OAAgB;AACvB,aAAO,MAAM,iCAAiC,KAAc;AAAA,IAC9D,UAAE;AACA,WAAK,eAAe;AAAA,IACtB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,QAAuB;AAC3B,QAAI,KAAK,WAAW,SAAS,GAAG;AAC9B,YAAM,KAAK,kBAAkB;AAAA,IAC/B;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKQ,kBAAkB,YAA4B;AACpD,YAAQ,YAAY;AAAA,MAClB,KAAK;AACH,eAAO;AAAA,MACT,KAAK;AACH,eAAO;AAAA,MACT,KAAK;AACH,eAAO;AAAA,MACT;AACE,eAAO;AAAA,IACX;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKQ,uBACN,YAC+B;AAC/B,UAAM,UAAU,oBAAI,IAA8B;AAElD,eAAW,MAAM,YAAY;AAC3B,UAAI,CAAC,QAAQ,IAAI,GAAG,KAAK,GAAG;AAC1B,gBAAQ,IAAI,GAAG,OAAO,CAAC,CAAC;AAAA,MAC1B;AACA,cAAQ,IAAI,GAAG,KAAK,EAAG,KAAK,EAAE;AAAA,IAChC;AAEA,WAAO;AAAA,EACT;AAAA;AAAA;AAAA;AAAA,EAKA,MAAc,uBACZ,OACA,YACe;AACf,eAAW,MAAM,YAAY;AAC3B,cAAQ,GAAG,WAAW;AAAA,QACpB,KAAK;AACH,gBAAM,KAAK,kBAAkB,OAAO,GAAG,MAAM;AAAA,YAC3C,YAAY,GAAG;AAAA,UACjB,CAAC;AACD;AAAA;AAAA,QAEF;AACE,iBAAO,KAAK,+BAA+B;AAAA,YACzC;AAAA,YACA,WAAW,GAAG;AAAA,UAChB,CAAC;AAAA,MACL;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKQ,+BAAqC;AAE3C,SAAK,mBAAmB;AAAA,MACtB;AAAA,MACA,KAAK,GAAG,QAAQ;AAAA;AAAA;AAAA;AAAA,OAIf;AAAA,IACH;AAGA,SAAK,mBAAmB;AAAA,MACtB;AAAA,MACA,KAAK,GAAG,QAAQ;AAAA;AAAA;AAAA;AAAA,OAIf;AAAA,IACH;AAEA,WAAO,KAAK,kDAAkD;AAAA,EAChE;AAAA;AAAA;AAAA;AAAA,EAKA,UAAgB;AAEd,SAAK,mBAAmB,MAAM;AAAA,EAChC;AACF;AAGA,IAAI,qBAAoD;AAKjD,SAAS,gBACd,IACwB;AACxB,MAAI,CAAC,oBAAoB;AACvB,yBAAqB,IAAI,uBAAuB,EAAE;AAAA,EACpD;AACA,SAAO;AACT;AAKA,eAAsB,iBACpB,QACA,SACqB;AACrB,QAAM,UAAU,gBAAgB;AAChC,SAAO,QAAQ,iBAAiB,QAAQ,OAAO;AACjD;AAKA,eAAsB,kBACpB,SACA,SACqB;AACrB,QAAM,UAAU,gBAAgB;AAChC,SAAO,QAAQ,kBAAkB,SAAS,OAAO;AACnD;",
+ "sourcesContent": ["/**\n * Batch Database Operations\n * High-performance bulk operations with transaction management\n */\n\nimport Database from 'better-sqlite3';\n// Connection pool imported when needed: getConnectionPool\nimport { logger } from '../monitoring/logger.js';\nimport { trace } from '../trace/index.js';\nimport { ErrorCode, wrapError } from '../errors/index.js';\n\nexport interface BatchOperation {\n table: string;\n operation: 'insert' | 'update' | 'delete';\n data: Record<string, any>[];\n onConflict?: 'ignore' | 'replace' | 'update';\n}\n\nexport interface BulkInsertOptions {\n batchSize?: number;\n onConflict?: 'ignore' | 'replace' | 'update';\n enableTransactions?: boolean;\n parallelTables?: boolean;\n}\n\nexport interface BatchStats {\n totalRecords: number;\n batchesProcessed: number;\n successfulInserts: number;\n failedInserts: number;\n totalTimeMs: number;\n avgBatchTimeMs: number;\n}\n\n/**\n * High-performance batch operations manager\n */\nexport class BatchOperationsManager {\n private db: Database.Database;\n private preparedStatements = new Map<string, Database.Statement>();\n private batchQueue: BatchOperation[] = [];\n private isProcessing = false;\n\n constructor(db?: Database.Database) {\n if (db) {\n this.db = db;\n this.initializePreparedStatements();\n } else {\n // Will be initialized when used with getConnectionPool().withConnection()\n this.db = undefined as any;\n }\n }\n\n /**\n * Add events in bulk with optimized batching\n */\n async bulkInsertEvents(\n events: Array<{\n frame_id: string;\n run_id: string;\n seq: number;\n event_type: string;\n payload: any;\n ts: number;\n }>,\n options: BulkInsertOptions = {}\n ): Promise<BatchStats> {\n const {\n batchSize = 100,\n onConflict = 'ignore',\n enableTransactions = true,\n } = options;\n\n return this.performBulkInsert('events', events, {\n batchSize,\n onConflict,\n enableTransactions,\n preprocessor: (event) => ({\n ...event,\n event_id: `evt_${event.frame_id}_${event.seq}_${Date.now()}`,\n payload: JSON.stringify(event.payload),\n }),\n });\n }\n\n /**\n * Add anchors in bulk\n */\n async bulkInsertAnchors(\n anchors: Array<{\n frame_id: string;\n type: string;\n text: string;\n priority: number;\n metadata: any;\n }>,\n options: BulkInsertOptions = {}\n ): Promise<BatchStats> {\n return this.performBulkInsert('anchors', anchors, {\n ...options,\n preprocessor: (anchor) => ({\n ...anchor,\n anchor_id: `anc_${anchor.frame_id}_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,\n metadata: JSON.stringify(anchor.metadata),\n created_at: Date.now(),\n }),\n });\n }\n\n /**\n * Bulk update frame digests\n */\n async bulkUpdateFrameDigests(\n updates: Array<{\n frame_id: string;\n digest_text: string;\n digest_json: any;\n closed_at?: number;\n }>,\n options: BulkInsertOptions = {}\n ): Promise<BatchStats> {\n const { batchSize = 50, enableTransactions = true } = options;\n\n return trace.traceAsync(\n 'function',\n 'bulkUpdateFrameDigests',\n { count: updates.length },\n async () => {\n const startTime = performance.now();\n const stats: BatchStats = {\n totalRecords: updates.length,\n batchesProcessed: 0,\n successfulInserts: 0,\n failedInserts: 0,\n totalTimeMs: 0,\n avgBatchTimeMs: 0,\n };\n\n if (updates.length === 0) return stats;\n\n const stmt = this.db.prepare(`\n UPDATE frames \n SET digest_text = ?, \n digest_json = ?, \n closed_at = COALESCE(?, closed_at),\n state = CASE WHEN ? 
IS NOT NULL THEN 'closed' ELSE state END\n WHERE frame_id = ?\n `);\n\n const updateFn = (batch: typeof updates) => {\n for (const update of batch) {\n try {\n const result = stmt.run(\n update.digest_text,\n JSON.stringify(update.digest_json),\n update.closed_at,\n update.closed_at,\n update.frame_id\n );\n stats.successfulInserts += result.changes;\n } catch (error: unknown) {\n stats.failedInserts++;\n const wrappedError = wrapError(\n error,\n 'Failed to update frame digest',\n ErrorCode.DB_UPDATE_FAILED,\n { frameId: update.frame_id }\n );\n logger.warn('Failed to update frame digest', {\n frameId: update.frame_id,\n error: wrappedError.message,\n });\n }\n }\n };\n\n if (enableTransactions) {\n const transaction = this.db.transaction(updateFn);\n await this.processBatches(updates, batchSize, transaction, stats);\n } else {\n await this.processBatches(updates, batchSize, updateFn, stats);\n }\n\n stats.totalTimeMs = performance.now() - startTime;\n stats.avgBatchTimeMs =\n stats.batchesProcessed > 0\n ? stats.totalTimeMs / stats.batchesProcessed\n : 0;\n\n logger.info(\n 'Bulk frame digest update completed',\n stats as unknown as Record<string, unknown>\n );\n return stats;\n }\n );\n }\n\n /**\n * Generic bulk insert with preprocessing\n */\n private async performBulkInsert<T extends Record<string, any>>(\n table: string,\n records: T[],\n options: BulkInsertOptions & {\n preprocessor?: (record: T) => Record<string, any>;\n } = {}\n ): Promise<BatchStats> {\n const {\n batchSize = 100,\n onConflict = 'ignore',\n enableTransactions = true,\n preprocessor,\n } = options;\n\n return trace.traceAsync(\n 'function',\n `bulkInsert${table}`,\n { count: records.length },\n async () => {\n const startTime = performance.now();\n const stats: BatchStats = {\n totalRecords: records.length,\n batchesProcessed: 0,\n successfulInserts: 0,\n failedInserts: 0,\n totalTimeMs: 0,\n avgBatchTimeMs: 0,\n };\n\n if (records.length === 0) return stats;\n\n // Preprocess records if needed\n const processedRecords = preprocessor\n ? records.map(preprocessor)\n : records;\n\n // Build dynamic insert statement\n const firstRecord = processedRecords[0];\n const columns = Object.keys(firstRecord);\n const placeholders = columns.map(() => '?').join(', ');\n const conflictClause = this.getConflictClause(onConflict);\n\n const insertSql = `INSERT ${conflictClause} INTO ${table} (${columns.join(', ')}) VALUES (${placeholders})`;\n const stmt = this.db.prepare(insertSql);\n\n const insertFn = (batch: typeof processedRecords) => {\n for (const record of batch) {\n try {\n const values = columns.map((col: any) => record[col]);\n const result = stmt.run(...values);\n stats.successfulInserts += result.changes;\n } catch (error: unknown) {\n stats.failedInserts++;\n const wrappedError = wrapError(\n error,\n `Failed to insert ${table} record`,\n ErrorCode.DB_INSERT_FAILED,\n { table, record }\n );\n logger.warn(`Failed to insert ${table} record`, {\n record,\n error: wrappedError.message,\n });\n }\n }\n };\n\n if (enableTransactions) {\n const transaction = this.db.transaction(insertFn);\n await this.processBatches(\n processedRecords,\n batchSize,\n transaction,\n stats\n );\n } else {\n await this.processBatches(\n processedRecords,\n batchSize,\n insertFn,\n stats\n );\n }\n\n stats.totalTimeMs = performance.now() - startTime;\n stats.avgBatchTimeMs =\n stats.batchesProcessed > 0\n ? 
stats.totalTimeMs / stats.batchesProcessed\n : 0;\n\n logger.info(\n `Bulk ${table} insert completed`,\n stats as unknown as Record<string, unknown>\n );\n return stats;\n }\n );\n }\n\n /**\n * Process records in batches\n */\n private async processBatches<T>(\n records: T[],\n batchSize: number,\n processFn: (batch: T[]) => void,\n stats: BatchStats\n ): Promise<void> {\n for (let i = 0; i < records.length; i += batchSize) {\n const batch = records.slice(i, i + batchSize);\n const batchStart = performance.now();\n\n try {\n processFn(batch);\n stats.batchesProcessed++;\n\n const batchTime = performance.now() - batchStart;\n logger.debug('Batch processed', {\n batchNumber: stats.batchesProcessed,\n records: batch.length,\n timeMs: batchTime.toFixed(2),\n });\n\n // Yield control to prevent blocking\n if (stats.batchesProcessed % 10 === 0) {\n await new Promise((resolve) => setImmediate(resolve));\n }\n } catch (error: unknown) {\n stats.failedInserts += batch.length;\n const wrappedError = wrapError(\n error,\n 'Batch processing failed',\n ErrorCode.DB_TRANSACTION_FAILED,\n { batchNumber: stats.batchesProcessed + 1, batchSize: batch.length }\n );\n logger.error('Batch processing failed', wrappedError, {\n batchNumber: stats.batchesProcessed + 1,\n batchSize: batch.length,\n });\n }\n }\n }\n\n /**\n * Queue batch operation for later processing\n */\n queueBatchOperation(operation: BatchOperation): void {\n this.batchQueue.push(operation);\n\n if (this.batchQueue.length >= 10 && !this.isProcessing) {\n setImmediate(() => this.processBatchQueue());\n }\n }\n\n /**\n * Process queued batch operations\n */\n async processBatchQueue(): Promise<void> {\n if (this.isProcessing || this.batchQueue.length === 0) {\n return;\n }\n\n this.isProcessing = true;\n const operations = [...this.batchQueue];\n this.batchQueue = [];\n\n try {\n const groupedOps = this.groupOperationsByTable(operations);\n\n for (const [table, tableOps] of groupedOps) {\n await this.processTableOperations(table, tableOps);\n }\n\n logger.info('Batch queue processed', {\n operations: operations.length,\n tables: groupedOps.size,\n });\n } catch (error: unknown) {\n const wrappedError = wrapError(\n error,\n 'Batch queue processing failed',\n ErrorCode.DB_TRANSACTION_FAILED,\n { operationsCount: operations.length }\n );\n logger.error('Batch queue processing failed', wrappedError);\n } finally {\n this.isProcessing = false;\n }\n }\n\n /**\n * Flush any remaining queued operations\n */\n async flush(): Promise<void> {\n if (this.batchQueue.length > 0) {\n await this.processBatchQueue();\n }\n }\n\n /**\n * Get SQL conflict clause\n */\n private getConflictClause(onConflict: string): string {\n switch (onConflict) {\n case 'ignore':\n return 'OR IGNORE';\n case 'replace':\n return 'OR REPLACE';\n case 'update':\n return 'ON CONFLICT DO UPDATE SET';\n default:\n return '';\n }\n }\n\n /**\n * Group operations by table for efficient processing\n */\n private groupOperationsByTable(\n operations: BatchOperation[]\n ): Map<string, BatchOperation[]> {\n const grouped = new Map<string, BatchOperation[]>();\n\n for (const op of operations) {\n if (!grouped.has(op.table)) {\n grouped.set(op.table, []);\n }\n grouped.get(op.table)!.push(op);\n }\n\n return grouped;\n }\n\n /**\n * Process all operations for a specific table\n */\n private async processTableOperations(\n table: string,\n operations: BatchOperation[]\n ): Promise<void> {\n for (const op of operations) {\n switch (op.operation) {\n case 'insert':\n await 
this.performBulkInsert(table, op.data, {\n onConflict: op.onConflict,\n });\n break;\n // Add update and delete operations as needed\n default:\n logger.warn('Unsupported batch operation', {\n table,\n operation: op.operation,\n });\n }\n }\n }\n\n /**\n * Initialize commonly used prepared statements\n */\n private initializePreparedStatements(): void {\n // Event insertion\n this.preparedStatements.set(\n 'insert_event',\n this.db.prepare(`\n INSERT OR IGNORE INTO events \n (event_id, frame_id, run_id, seq, event_type, payload, ts) \n VALUES (?, ?, ?, ?, ?, ?, ?)\n `)\n );\n\n // Anchor insertion\n this.preparedStatements.set(\n 'insert_anchor',\n this.db.prepare(`\n INSERT OR IGNORE INTO anchors \n (anchor_id, frame_id, type, text, priority, metadata, created_at) \n VALUES (?, ?, ?, ?, ?, ?, ?)\n `)\n );\n\n logger.info('Batch operations prepared statements initialized');\n }\n\n /**\n * Cleanup resources\n */\n cleanup(): void {\n // Modern better-sqlite3 automatically handles cleanup\n this.preparedStatements.clear();\n }\n}\n\n// Global batch operations manager\nlet globalBatchManager: BatchOperationsManager | null = null;\n\n/**\n * Get or create global batch operations manager\n */\nexport function getBatchManager(\n db?: Database.Database\n): BatchOperationsManager {\n if (!globalBatchManager) {\n globalBatchManager = new BatchOperationsManager(db);\n }\n return globalBatchManager;\n}\n\n/**\n * Convenience function for bulk event insertion\n */\nexport async function bulkInsertEvents(\n events: any[],\n options?: BulkInsertOptions\n): Promise<BatchStats> {\n const manager = getBatchManager();\n return manager.bulkInsertEvents(events, options);\n}\n\n/**\n * Convenience function for bulk anchor insertion\n */\nexport async function bulkInsertAnchors(\n anchors: any[],\n options?: BulkInsertOptions\n): Promise<BatchStats> {\n const manager = getBatchManager();\n return manager.bulkInsertAnchors(anchors, options);\n}\n"],
+ "mappings": ";;;;AAOA,SAAS,cAAc;AACvB,SAAS,aAAa;AACtB,SAAS,WAAW,iBAAiB;AA4B9B,MAAM,uBAAuB;AAAA,EAC1B;AAAA,EACA,qBAAqB,oBAAI,IAAgC;AAAA,EACzD,aAA+B,CAAC;AAAA,EAChC,eAAe;AAAA,EAEvB,YAAY,IAAwB;AAClC,QAAI,IAAI;AACN,WAAK,KAAK;AACV,WAAK,6BAA6B;AAAA,IACpC,OAAO;AAEL,WAAK,KAAK;AAAA,IACZ;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,iBACJ,QAQA,UAA6B,CAAC,GACT;AACrB,UAAM;AAAA,MACJ,YAAY;AAAA,MACZ,aAAa;AAAA,MACb,qBAAqB;AAAA,IACvB,IAAI;AAEJ,WAAO,KAAK,kBAAkB,UAAU,QAAQ;AAAA,MAC9C;AAAA,MACA;AAAA,MACA;AAAA,MACA,cAAc,CAAC,WAAW;AAAA,QACxB,GAAG;AAAA,QACH,UAAU,OAAO,MAAM,QAAQ,IAAI,MAAM,GAAG,IAAI,KAAK,IAAI,CAAC;AAAA,QAC1D,SAAS,KAAK,UAAU,MAAM,OAAO;AAAA,MACvC;AAAA,IACF,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,kBACJ,SAOA,UAA6B,CAAC,GACT;AACrB,WAAO,KAAK,kBAAkB,WAAW,SAAS;AAAA,MAChD,GAAG;AAAA,MACH,cAAc,CAAC,YAAY;AAAA,QACzB,GAAG;AAAA,QACH,WAAW,OAAO,OAAO,QAAQ,IAAI,KAAK,IAAI,CAAC,IAAI,KAAK,OAAO,EAAE,SAAS,EAAE,EAAE,OAAO,GAAG,CAAC,CAAC;AAAA,QAC1F,UAAU,KAAK,UAAU,OAAO,QAAQ;AAAA,QACxC,YAAY,KAAK,IAAI;AAAA,MACvB;AAAA,IACF,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,uBACJ,SAMA,UAA6B,CAAC,GACT;AACrB,UAAM,EAAE,YAAY,IAAI,qBAAqB,KAAK,IAAI;AAEtD,WAAO,MAAM;AAAA,MACX;AAAA,MACA;AAAA,MACA,EAAE,OAAO,QAAQ,OAAO;AAAA,MACxB,YAAY;AACV,cAAM,YAAY,YAAY,IAAI;AAClC,cAAM,QAAoB;AAAA,UACxB,cAAc,QAAQ;AAAA,UACtB,kBAAkB;AAAA,UAClB,mBAAmB;AAAA,UACnB,eAAe;AAAA,UACf,aAAa;AAAA,UACb,gBAAgB;AAAA,QAClB;AAEA,YAAI,QAAQ,WAAW,EAAG,QAAO;AAEjC,cAAM,OAAO,KAAK,GAAG,QAAQ;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,OAO9B;AAEC,cAAM,WAAW,CAAC,UAA0B;AAC1C,qBAAW,UAAU,OAAO;AAC1B,gBAAI;AACF,oBAAM,SAAS,KAAK;AAAA,gBAClB,OAAO;AAAA,gBACP,KAAK,UAAU,OAAO,WAAW;AAAA,gBACjC,OAAO;AAAA,gBACP,OAAO;AAAA,gBACP,OAAO;AAAA,cACT;AACA,oBAAM,qBAAqB,OAAO;AAAA,YACpC,SAAS,OAAgB;AACvB,oBAAM;AACN,oBAAM,eAAe;AAAA,gBACnB;AAAA,gBACA;AAAA,gBACA,UAAU;AAAA,gBACV,EAAE,SAAS,OAAO,SAAS;AAAA,cAC7B;AACA,qBAAO,KAAK,iCAAiC;AAAA,gBAC3C,SAAS,OAAO;AAAA,gBAChB,OAAO,aAAa;AAAA,cACtB,CAAC;AAAA,YACH;AAAA,UACF;AAAA,QACF;AAEA,YAAI,oBAAoB;AACtB,gBAAM,cAAc,KAAK,GAAG,YAAY,QAAQ;AAChD,gBAAM,KAAK,eAAe,SAAS,WAAW,aAAa,KAAK;AAAA,QAClE,OAAO;AACL,gBAAM,KAAK,eAAe,SAAS,WAAW,UAAU,KAAK;AAAA,QAC/D;AAEA,cAAM,cAAc,YAAY,IAAI,IAAI;AACxC,cAAM,iBACJ,MAAM,mBAAmB,IACrB,MAAM,cAAc,MAAM,mBAC1B;AAEN,eAAO;AAAA,UACL;AAAA,UACA;AAAA,QACF;AACA,eAAO;AAAA,MACT;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAc,kBACZ,OACA,SACA,UAEI,CAAC,GACgB;AACrB,UAAM;AAAA,MACJ,YAAY;AAAA,MACZ,aAAa;AAAA,MACb,qBAAqB;AAAA,MACrB;AAAA,IACF,IAAI;AAEJ,WAAO,MAAM;AAAA,MACX;AAAA,MACA,aAAa,KAAK;AAAA,MAClB,EAAE,OAAO,QAAQ,OAAO;AAAA,MACxB,YAAY;AACV,cAAM,YAAY,YAAY,IAAI;AAClC,cAAM,QAAoB;AAAA,UACxB,cAAc,QAAQ;AAAA,UACtB,kBAAkB;AAAA,UAClB,mBAAmB;AAAA,UACnB,eAAe;AAAA,UACf,aAAa;AAAA,UACb,gBAAgB;AAAA,QAClB;AAEA,YAAI,QAAQ,WAAW,EAAG,QAAO;AAGjC,cAAM,mBAAmB,eACrB,QAAQ,IAAI,YAAY,IACxB;AAGJ,cAAM,cAAc,iBAAiB,CAAC;AACtC,cAAM,UAAU,OAAO,KAAK,WAAW;AACvC,cAAM,eAAe,QAAQ,IAAI,MAAM,GAAG,EAAE,KAAK,IAAI;AACrD,cAAM,iBAAiB,KAAK,kBAAkB,UAAU;AAExD,cAAM,YAAY,UAAU,cAAc,SAAS,KAAK,KAAK,QAAQ,KAAK,IAAI,CAAC,aAAa,YAAY;AACxG,cAAM,OAAO,KAAK,GAAG,QAAQ,SAAS;AAEtC,cAAM,WAAW,CAAC,UAAmC;AACnD,qBAAW,UAAU,OAAO;AAC1B,gBAAI;AACF,oBAAM,SAAS,QAAQ,IAAI,CAAC,QAAa,OAAO,GAAG,CAAC;AACpD,oBAAM,SAAS,KAAK,IAAI,GAAG,MAAM;AACjC,oBAAM,qBAAqB,OAAO;AAAA,YACpC,SAAS,OAAgB;AACvB,oBAAM;AACN,oBAAM,eAAe;AAAA,gBACnB;AAAA,gBACA,oBAAoB,KAAK;AAAA,gBACzB,UAAU;AAAA,gBACV,EAAE,OAAO,OAAO;AAAA,cAClB;AACA,qBAAO,KAAK,oBAAoB,KAAK,WAAW;AAAA,gBAC9C;AAAA,gBACA,OAAO,aAAa;AAAA,cACtB,CAAC;AAAA,YACH;AAAA,UACF;AAAA,QACF;AAEA,YAAI,oBAAoB;AACtB,gBAAM,cAAc,KAAK,GAAG,YAAY,QAAQ;AAChD,gBAAM,KAAK;AAAA,YACT;AAAA,YACA;AAAA,YACA;AAAA,YACA;AAAA,UACF;AAAA,QACF,OAAO;AACL,gBAAM,KAAK;AAAA,YACT;AAAA,Y
ACA;AAAA,YACA;AAAA,YACA;AAAA,UACF;AAAA,QACF;AAEA,cAAM,cAAc,YAAY,IAAI,IAAI;AACxC,cAAM,iBACJ,MAAM,mBAAmB,IACrB,MAAM,cAAc,MAAM,mBAC1B;AAEN,eAAO;AAAA,UACL,QAAQ,KAAK;AAAA,UACb;AAAA,QACF;AACA,eAAO;AAAA,MACT;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAc,eACZ,SACA,WACA,WACA,OACe;AACf,aAAS,IAAI,GAAG,IAAI,QAAQ,QAAQ,KAAK,WAAW;AAClD,YAAM,QAAQ,QAAQ,MAAM,GAAG,IAAI,SAAS;AAC5C,YAAM,aAAa,YAAY,IAAI;AAEnC,UAAI;AACF,kBAAU,KAAK;AACf,cAAM;AAEN,cAAM,YAAY,YAAY,IAAI,IAAI;AACtC,eAAO,MAAM,mBAAmB;AAAA,UAC9B,aAAa,MAAM;AAAA,UACnB,SAAS,MAAM;AAAA,UACf,QAAQ,UAAU,QAAQ,CAAC;AAAA,QAC7B,CAAC;AAGD,YAAI,MAAM,mBAAmB,OAAO,GAAG;AACrC,gBAAM,IAAI,QAAQ,CAAC,YAAY,aAAa,OAAO,CAAC;AAAA,QACtD;AAAA,MACF,SAAS,OAAgB;AACvB,cAAM,iBAAiB,MAAM;AAC7B,cAAM,eAAe;AAAA,UACnB;AAAA,UACA;AAAA,UACA,UAAU;AAAA,UACV,EAAE,aAAa,MAAM,mBAAmB,GAAG,WAAW,MAAM,OAAO;AAAA,QACrE;AACA,eAAO,MAAM,2BAA2B,cAAc;AAAA,UACpD,aAAa,MAAM,mBAAmB;AAAA,UACtC,WAAW,MAAM;AAAA,QACnB,CAAC;AAAA,MACH;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,oBAAoB,WAAiC;AACnD,SAAK,WAAW,KAAK,SAAS;AAE9B,QAAI,KAAK,WAAW,UAAU,MAAM,CAAC,KAAK,cAAc;AACtD,mBAAa,MAAM,KAAK,kBAAkB,CAAC;AAAA,IAC7C;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,oBAAmC;AACvC,QAAI,KAAK,gBAAgB,KAAK,WAAW,WAAW,GAAG;AACrD;AAAA,IACF;AAEA,SAAK,eAAe;AACpB,UAAM,aAAa,CAAC,GAAG,KAAK,UAAU;AACtC,SAAK,aAAa,CAAC;AAEnB,QAAI;AACF,YAAM,aAAa,KAAK,uBAAuB,UAAU;AAEzD,iBAAW,CAAC,OAAO,QAAQ,KAAK,YAAY;AAC1C,cAAM,KAAK,uBAAuB,OAAO,QAAQ;AAAA,MACnD;AAEA,aAAO,KAAK,yBAAyB;AAAA,QACnC,YAAY,WAAW;AAAA,QACvB,QAAQ,WAAW;AAAA,MACrB,CAAC;AAAA,IACH,SAAS,OAAgB;AACvB,YAAM,eAAe;AAAA,QACnB;AAAA,QACA;AAAA,QACA,UAAU;AAAA,QACV,EAAE,iBAAiB,WAAW,OAAO;AAAA,MACvC;AACA,aAAO,MAAM,iCAAiC,YAAY;AAAA,IAC5D,UAAE;AACA,WAAK,eAAe;AAAA,IACtB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,QAAuB;AAC3B,QAAI,KAAK,WAAW,SAAS,GAAG;AAC9B,YAAM,KAAK,kBAAkB;AAAA,IAC/B;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKQ,kBAAkB,YAA4B;AACpD,YAAQ,YAAY;AAAA,MAClB,KAAK;AACH,eAAO;AAAA,MACT,KAAK;AACH,eAAO;AAAA,MACT,KAAK;AACH,eAAO;AAAA,MACT;AACE,eAAO;AAAA,IACX;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKQ,uBACN,YAC+B;AAC/B,UAAM,UAAU,oBAAI,IAA8B;AAElD,eAAW,MAAM,YAAY;AAC3B,UAAI,CAAC,QAAQ,IAAI,GAAG,KAAK,GAAG;AAC1B,gBAAQ,IAAI,GAAG,OAAO,CAAC,CAAC;AAAA,MAC1B;AACA,cAAQ,IAAI,GAAG,KAAK,EAAG,KAAK,EAAE;AAAA,IAChC;AAEA,WAAO;AAAA,EACT;AAAA;AAAA;AAAA;AAAA,EAKA,MAAc,uBACZ,OACA,YACe;AACf,eAAW,MAAM,YAAY;AAC3B,cAAQ,GAAG,WAAW;AAAA,QACpB,KAAK;AACH,gBAAM,KAAK,kBAAkB,OAAO,GAAG,MAAM;AAAA,YAC3C,YAAY,GAAG;AAAA,UACjB,CAAC;AACD;AAAA;AAAA,QAEF;AACE,iBAAO,KAAK,+BAA+B;AAAA,YACzC;AAAA,YACA,WAAW,GAAG;AAAA,UAChB,CAAC;AAAA,MACL;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKQ,+BAAqC;AAE3C,SAAK,mBAAmB;AAAA,MACtB;AAAA,MACA,KAAK,GAAG,QAAQ;AAAA;AAAA;AAAA;AAAA,OAIf;AAAA,IACH;AAGA,SAAK,mBAAmB;AAAA,MACtB;AAAA,MACA,KAAK,GAAG,QAAQ;AAAA;AAAA;AAAA;AAAA,OAIf;AAAA,IACH;AAEA,WAAO,KAAK,kDAAkD;AAAA,EAChE;AAAA;AAAA;AAAA;AAAA,EAKA,UAAgB;AAEd,SAAK,mBAAmB,MAAM;AAAA,EAChC;AACF;AAGA,IAAI,qBAAoD;AAKjD,SAAS,gBACd,IACwB;AACxB,MAAI,CAAC,oBAAoB;AACvB,yBAAqB,IAAI,uBAAuB,EAAE;AAAA,EACpD;AACA,SAAO;AACT;AAKA,eAAsB,iBACpB,QACA,SACqB;AACrB,QAAM,UAAU,gBAAgB;AAChC,SAAO,QAAQ,iBAAiB,QAAQ,OAAO;AACjD;AAKA,eAAsB,kBACpB,SACA,SACqB;AACrB,QAAM,UAAU,gBAAgB;AAChC,SAAO,QAAQ,kBAAkB,SAAS,OAAO;AACnD;",
  "names": []
  }
package/dist/core/database/connection-pool.js CHANGED
@@ -5,6 +5,7 @@ const __dirname = __pathDirname(__filename);
  import { Pool } from "pg";
  import { EventEmitter } from "events";
  import { logger } from "../monitoring/logger.js";
+ import { DatabaseError, ErrorCode } from "../errors/index.js";
  class ConnectionPool extends EventEmitter {
  pool;
  config;
@@ -180,7 +181,12 @@ class ConnectionPool extends EventEmitter {
  } catch (error) {
  this.metrics.totalErrors++;
  logger.error("Failed to acquire connection:", error);
- throw error;
+ throw new DatabaseError(
+ "Failed to acquire database connection",
+ ErrorCode.DB_CONNECTION_FAILED,
+ { pool: "paradedb" },
+ error instanceof Error ? error : void 0
+ );
  }
  }
  /**
@@ -307,7 +313,12 @@ class ConnectionPool extends EventEmitter {
  logger.error("Transaction rollback failed:", rollbackError);
  this.markConnectionAsBad(client);
  }
- throw error;
+ throw new DatabaseError(
+ "Transaction failed",
+ ErrorCode.DB_TRANSACTION_FAILED,
+ { operation: "transaction" },
+ error instanceof Error ? error : void 0
+ );
  } finally {
  this.release(client);
  }
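With these hunks, `acquire()` and `transaction()` throw a typed `DatabaseError` (with the underlying `pg` error passed as the fourth constructor argument) instead of rethrowing the raw error. A hypothetical caller-side sketch follows, assuming the `DatabaseError` fields from the sketch above; the `closeFrame` helper, its SQL, and the import paths are illustrative, not part of this package.

```typescript
import { ConnectionPool } from './connection-pool.js';
import { DatabaseError, ErrorCode } from '../errors/index.js';

// Hypothetical caller: closes a frame inside a pooled transaction and branches on
// the typed error code rather than inspecting a raw pg error.
async function closeFrame(pool: ConnectionPool, frameId: string): Promise<boolean> {
  try {
    await pool.transaction(async (client) => {
      await client.query('UPDATE frames SET state = $1 WHERE frame_id = $2', [
        'closed',
        frameId,
      ]);
    });
    return true;
  } catch (error) {
    if (error instanceof DatabaseError && error.code === ErrorCode.DB_TRANSACTION_FAILED) {
      // The underlying pg failure is preserved on the wrapper for logging/retry decisions.
      console.warn('Frame close transaction failed', error.context);
      return false;
    }
    throw error;
  }
}
```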
package/dist/core/database/connection-pool.js.map CHANGED
@@ -1,7 +1,7 @@
  {
  "version": 3,
  "sources": ["../../../src/core/database/connection-pool.ts"],
- "sourcesContent": ["/**\n * Connection Pool Manager for ParadeDB\n * Manages PostgreSQL connection pooling with health checks and monitoring\n */\n\nimport { Pool, PoolClient, PoolConfig } from 'pg';\nimport { EventEmitter } from 'events';\nimport { logger } from '../monitoring/logger.js';\n\nexport interface ConnectionPoolConfig extends PoolConfig {\n // Basic pool settings\n min?: number; // Minimum pool size (default: 2)\n max?: number; // Maximum pool size (default: 10)\n idleTimeoutMillis?: number; // Close idle connections after ms (default: 30000)\n connectionTimeoutMillis?: number; // Connection acquire timeout (default: 5000)\n\n // Health check settings\n healthCheckInterval?: number; // Health check frequency in ms (default: 30000)\n healthCheckQuery?: string; // Query to test connection health (default: 'SELECT 1')\n retryOnFailure?: boolean; // Retry failed connections (default: true)\n maxRetries?: number; // Max retry attempts (default: 3)\n retryDelayMs?: number; // Delay between retries (default: 1000)\n\n // Monitoring settings\n enableMetrics?: boolean; // Enable connection metrics (default: true)\n metricsInterval?: number; // Metrics collection interval (default: 60000)\n}\n\nexport interface ConnectionMetrics {\n totalConnections: number;\n idleConnections: number;\n activeConnections: number;\n waitingRequests: number;\n totalAcquired: number;\n totalReleased: number;\n totalErrors: number;\n averageAcquireTime: number;\n peakConnections: number;\n uptime: number;\n}\n\nexport interface ConnectionHealth {\n isHealthy: boolean;\n lastCheck: Date;\n consecutiveFailures: number;\n totalChecks: number;\n totalFailures: number;\n averageResponseTime: number;\n}\n\nexport class ConnectionPool extends EventEmitter {\n private pool: Pool;\n private config: Required<ConnectionPoolConfig>;\n private metrics: ConnectionMetrics;\n private health: ConnectionHealth;\n private healthCheckTimer?: NodeJS.Timeout;\n private metricsTimer?: NodeJS.Timeout;\n private startTime: Date;\n private badConnections = new Set<PoolClient>();\n private acquireTimes: number[] = [];\n\n constructor(config: ConnectionPoolConfig) {\n super();\n\n this.config = this.normalizeConfig(config);\n this.startTime = new Date();\n\n // Initialize metrics\n this.metrics = {\n totalConnections: 0,\n idleConnections: 0,\n activeConnections: 0,\n waitingRequests: 0,\n totalAcquired: 0,\n totalReleased: 0,\n totalErrors: 0,\n averageAcquireTime: 0,\n peakConnections: 0,\n uptime: 0,\n };\n\n // Initialize health\n this.health = {\n isHealthy: false,\n lastCheck: new Date(),\n consecutiveFailures: 0,\n totalChecks: 0,\n totalFailures: 0,\n averageResponseTime: 0,\n };\n\n // Create pool\n this.pool = new Pool(this.config);\n this.setupPoolEvents();\n\n // Start monitoring if enabled\n if (this.config.enableMetrics) {\n this.startMonitoring();\n }\n }\n\n private normalizeConfig(\n config: ConnectionPoolConfig\n ): Required<ConnectionPoolConfig> {\n return {\n ...config,\n min: config.min ?? 2,\n max: config.max ?? 10,\n idleTimeoutMillis: config.idleTimeoutMillis ?? 30000,\n connectionTimeoutMillis: config.connectionTimeoutMillis ?? 5000,\n healthCheckInterval: config.healthCheckInterval ?? 30000,\n healthCheckQuery: config.healthCheckQuery ?? 'SELECT 1',\n retryOnFailure: config.retryOnFailure ?? true,\n maxRetries: config.maxRetries ?? 3,\n retryDelayMs: config.retryDelayMs ?? 1000,\n enableMetrics: config.enableMetrics ?? true,\n metricsInterval: config.metricsInterval ?? 
60000,\n };\n }\n\n private setupPoolEvents(): void {\n this.pool.on('connect', (client) => {\n logger.debug('New database connection established');\n this.metrics.totalConnections++;\n this.updatePeakConnections();\n this.emit('connect', client);\n });\n\n this.pool.on('acquire', (client) => {\n this.metrics.totalAcquired++;\n this.emit('acquire', client);\n });\n\n this.pool.on('release', (client) => {\n this.metrics.totalReleased++;\n this.emit('release', client);\n });\n\n this.pool.on('remove', (client) => {\n logger.debug('Database connection removed from pool');\n this.metrics.totalConnections--;\n this.emit('remove', client);\n });\n\n this.pool.on('error', (error) => {\n logger.error('Database pool error:', error);\n this.metrics.totalErrors++;\n this.emit('error', error);\n });\n }\n\n private updatePeakConnections(): void {\n const current = this.pool.totalCount;\n if (current > this.metrics.peakConnections) {\n this.metrics.peakConnections = current;\n }\n }\n\n private startMonitoring(): void {\n // Health checks\n if (this.config.healthCheckInterval > 0) {\n this.healthCheckTimer = setInterval(() => {\n this.performHealthCheck().catch((error) => {\n logger.error('Health check failed:', error);\n });\n }, this.config.healthCheckInterval);\n }\n\n // Metrics collection\n if (this.config.metricsInterval > 0) {\n this.metricsTimer = setInterval(() => {\n this.updateMetrics();\n this.emit('metrics', this.getMetrics());\n }, this.config.metricsInterval);\n }\n\n // Initial health check\n this.performHealthCheck().catch((error) => {\n logger.warn('Initial health check failed:', error);\n });\n }\n\n private async performHealthCheck(): Promise<void> {\n const startTime = Date.now();\n let client: PoolClient | undefined;\n\n try {\n this.health.totalChecks++;\n\n client = await this.pool.connect();\n await client.query(this.config.healthCheckQuery);\n\n const responseTime = Date.now() - startTime;\n this.updateHealthMetrics(true, responseTime);\n\n logger.debug(`Health check passed in ${responseTime}ms`);\n } catch (error: unknown) {\n const responseTime = Date.now() - startTime;\n this.updateHealthMetrics(false, responseTime);\n\n logger.warn(`Health check failed after ${responseTime}ms:`, error);\n\n if (\n this.config.retryOnFailure &&\n this.health.consecutiveFailures < this.config.maxRetries\n ) {\n setTimeout(() => {\n this.performHealthCheck().catch(() => {\n // Ignore retry failures\n });\n }, this.config.retryDelayMs);\n }\n } finally {\n if (client) {\n client.release();\n }\n }\n }\n\n private updateHealthMetrics(success: boolean, responseTime: number): void {\n this.health.lastCheck = new Date();\n\n if (success) {\n this.health.isHealthy = true;\n this.health.consecutiveFailures = 0;\n } else {\n this.health.isHealthy = false;\n this.health.consecutiveFailures++;\n this.health.totalFailures++;\n }\n\n // Update average response time (simple moving average of last 10 checks)\n const weight = Math.min(this.health.totalChecks, 10);\n this.health.averageResponseTime =\n (this.health.averageResponseTime * (weight - 1) + responseTime) / weight;\n }\n\n private updateMetrics(): void {\n this.metrics.idleConnections = this.pool.idleCount;\n this.metrics.activeConnections = this.pool.totalCount - this.pool.idleCount;\n this.metrics.waitingRequests = this.pool.waitingCount;\n this.metrics.uptime = Date.now() - this.startTime.getTime();\n\n // Update average acquire time\n if (this.acquireTimes.length > 0) {\n this.metrics.averageAcquireTime =\n this.acquireTimes.reduce((sum, time) 
=> sum + time, 0) /\n this.acquireTimes.length;\n\n // Keep only recent acquire times (last 100)\n if (this.acquireTimes.length > 100) {\n this.acquireTimes = this.acquireTimes.slice(-100);\n }\n }\n }\n\n /**\n * Acquire a connection from the pool\n */\n async acquire(): Promise<PoolClient> {\n const startTime = Date.now();\n\n try {\n const client = await this.pool.connect();\n\n // Track acquire time\n const acquireTime = Date.now() - startTime;\n this.acquireTimes.push(acquireTime);\n\n // Check if connection is marked as bad\n if (this.badConnections.has(client)) {\n this.badConnections.delete(client);\n client.release(true); // Force removal\n return this.acquire(); // Try again\n }\n\n return client;\n } catch (error: unknown) {\n this.metrics.totalErrors++;\n logger.error('Failed to acquire connection:', error);\n throw error;\n }\n }\n\n /**\n * Release a connection back to the pool\n */\n release(client: PoolClient, error?: Error | boolean): void {\n if (error) {\n // Release with error - connection will be removed from pool\n client.release(true);\n } else {\n client.release();\n }\n }\n\n /**\n * Mark a connection as bad (will be removed on next acquire)\n */\n markConnectionAsBad(client: PoolClient): void {\n this.badConnections.add(client);\n logger.warn('Connection marked as bad and will be removed');\n }\n\n /**\n * Get current connection metrics\n */\n getMetrics(): ConnectionMetrics {\n this.updateMetrics();\n return { ...this.metrics };\n }\n\n /**\n * Get current health status\n */\n getHealth(): ConnectionHealth {\n return { ...this.health };\n }\n\n /**\n * Test connection to database\n */\n async ping(): Promise<boolean> {\n try {\n const client = await this.acquire();\n await client.query('SELECT 1');\n this.release(client);\n return true;\n } catch {\n return false;\n }\n }\n\n /**\n * Get pool status information\n */\n getStatus() {\n return {\n totalCount: this.pool.totalCount,\n idleCount: this.pool.idleCount,\n waitingCount: this.pool.waitingCount,\n config: {\n min: this.config.min,\n max: this.config.max,\n idleTimeoutMillis: this.config.idleTimeoutMillis,\n connectionTimeoutMillis: this.config.connectionTimeoutMillis,\n },\n health: this.getHealth(),\n metrics: this.getMetrics(),\n };\n }\n\n /**\n * Close all connections and clean up\n */\n async close(): Promise<void> {\n logger.info('Closing connection pool');\n\n // Clear timers\n if (this.healthCheckTimer) {\n clearInterval(this.healthCheckTimer);\n }\n if (this.metricsTimer) {\n clearInterval(this.metricsTimer);\n }\n\n // Close pool\n await this.pool.end();\n\n // Clear bad connections set\n this.badConnections.clear();\n\n this.emit('close');\n logger.info('Connection pool closed');\n }\n\n /**\n * Drain pool gracefully (wait for active connections to finish)\n */\n async drain(timeoutMs = 30000): Promise<void> {\n logger.info('Draining connection pool');\n\n const startTime = Date.now();\n\n while (this.pool.totalCount - this.pool.idleCount > 0) {\n if (Date.now() - startTime > timeoutMs) {\n logger.warn('Pool drain timeout reached, forcing close');\n break;\n }\n\n // Wait a bit before checking again\n await new Promise((resolve) => setTimeout(resolve, 100));\n }\n\n await this.close();\n }\n\n /**\n * Execute a query using a pooled connection\n */\n async query<T = any>(\n text: string,\n params?: any[]\n ): Promise<{ rows: T[]; rowCount: number }> {\n const client = await this.acquire();\n\n try {\n const result = await client.query(text, params);\n return {\n rows: result.rows,\n rowCount: 
result.rowCount || 0,\n };\n } finally {\n this.release(client);\n }\n }\n\n /**\n * Execute multiple queries in a transaction\n */\n async transaction<T>(\n callback: (client: PoolClient) => Promise<T>\n ): Promise<T> {\n const client = await this.acquire();\n\n try {\n await client.query('BEGIN');\n const result = await callback(client);\n await client.query('COMMIT');\n return result;\n } catch (error: unknown) {\n try {\n await client.query('ROLLBACK');\n } catch (rollbackError: unknown) {\n logger.error('Transaction rollback failed:', rollbackError);\n this.markConnectionAsBad(client);\n }\n throw error;\n } finally {\n this.release(client);\n }\n }\n}\n"],
- "mappings": ";;;;AAKA,SAAS,YAAoC;AAC7C,SAAS,oBAAoB;AAC7B,SAAS,cAAc;AA2ChB,MAAM,uBAAuB,aAAa;AAAA,EACvC;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA,iBAAiB,oBAAI,IAAgB;AAAA,EACrC,eAAyB,CAAC;AAAA,EAElC,YAAY,QAA8B;AACxC,UAAM;AAEN,SAAK,SAAS,KAAK,gBAAgB,MAAM;AACzC,SAAK,YAAY,oBAAI,KAAK;AAG1B,SAAK,UAAU;AAAA,MACb,kBAAkB;AAAA,MAClB,iBAAiB;AAAA,MACjB,mBAAmB;AAAA,MACnB,iBAAiB;AAAA,MACjB,eAAe;AAAA,MACf,eAAe;AAAA,MACf,aAAa;AAAA,MACb,oBAAoB;AAAA,MACpB,iBAAiB;AAAA,MACjB,QAAQ;AAAA,IACV;AAGA,SAAK,SAAS;AAAA,MACZ,WAAW;AAAA,MACX,WAAW,oBAAI,KAAK;AAAA,MACpB,qBAAqB;AAAA,MACrB,aAAa;AAAA,MACb,eAAe;AAAA,MACf,qBAAqB;AAAA,IACvB;AAGA,SAAK,OAAO,IAAI,KAAK,KAAK,MAAM;AAChC,SAAK,gBAAgB;AAGrB,QAAI,KAAK,OAAO,eAAe;AAC7B,WAAK,gBAAgB;AAAA,IACvB;AAAA,EACF;AAAA,EAEQ,gBACN,QACgC;AAChC,WAAO;AAAA,MACL,GAAG;AAAA,MACH,KAAK,OAAO,OAAO;AAAA,MACnB,KAAK,OAAO,OAAO;AAAA,MACnB,mBAAmB,OAAO,qBAAqB;AAAA,MAC/C,yBAAyB,OAAO,2BAA2B;AAAA,MAC3D,qBAAqB,OAAO,uBAAuB;AAAA,MACnD,kBAAkB,OAAO,oBAAoB;AAAA,MAC7C,gBAAgB,OAAO,kBAAkB;AAAA,MACzC,YAAY,OAAO,cAAc;AAAA,MACjC,cAAc,OAAO,gBAAgB;AAAA,MACrC,eAAe,OAAO,iBAAiB;AAAA,MACvC,iBAAiB,OAAO,mBAAmB;AAAA,IAC7C;AAAA,EACF;AAAA,EAEQ,kBAAwB;AAC9B,SAAK,KAAK,GAAG,WAAW,CAAC,WAAW;AAClC,aAAO,MAAM,qCAAqC;AAClD,WAAK,QAAQ;AACb,WAAK,sBAAsB;AAC3B,WAAK,KAAK,WAAW,MAAM;AAAA,IAC7B,CAAC;AAED,SAAK,KAAK,GAAG,WAAW,CAAC,WAAW;AAClC,WAAK,QAAQ;AACb,WAAK,KAAK,WAAW,MAAM;AAAA,IAC7B,CAAC;AAED,SAAK,KAAK,GAAG,WAAW,CAAC,WAAW;AAClC,WAAK,QAAQ;AACb,WAAK,KAAK,WAAW,MAAM;AAAA,IAC7B,CAAC;AAED,SAAK,KAAK,GAAG,UAAU,CAAC,WAAW;AACjC,aAAO,MAAM,uCAAuC;AACpD,WAAK,QAAQ;AACb,WAAK,KAAK,UAAU,MAAM;AAAA,IAC5B,CAAC;AAED,SAAK,KAAK,GAAG,SAAS,CAAC,UAAU;AAC/B,aAAO,MAAM,wBAAwB,KAAK;AAC1C,WAAK,QAAQ;AACb,WAAK,KAAK,SAAS,KAAK;AAAA,IAC1B,CAAC;AAAA,EACH;AAAA,EAEQ,wBAA8B;AACpC,UAAM,UAAU,KAAK,KAAK;AAC1B,QAAI,UAAU,KAAK,QAAQ,iBAAiB;AAC1C,WAAK,QAAQ,kBAAkB;AAAA,IACjC;AAAA,EACF;AAAA,EAEQ,kBAAwB;AAE9B,QAAI,KAAK,OAAO,sBAAsB,GAAG;AACvC,WAAK,mBAAmB,YAAY,MAAM;AACxC,aAAK,mBAAmB,EAAE,MAAM,CAAC,UAAU;AACzC,iBAAO,MAAM,wBAAwB,KAAK;AAAA,QAC5C,CAAC;AAAA,MACH,GAAG,KAAK,OAAO,mBAAmB;AAAA,IACpC;AAGA,QAAI,KAAK,OAAO,kBAAkB,GAAG;AACnC,WAAK,eAAe,YAAY,MAAM;AACpC,aAAK,cAAc;AACnB,aAAK,KAAK,WAAW,KAAK,WAAW,CAAC;AAAA,MACxC,GAAG,KAAK,OAAO,eAAe;AAAA,IAChC;AAGA,SAAK,mBAAmB,EAAE,MAAM,CAAC,UAAU;AACzC,aAAO,KAAK,gCAAgC,KAAK;AAAA,IACnD,CAAC;AAAA,EACH;AAAA,EAEA,MAAc,qBAAoC;AAChD,UAAM,YAAY,KAAK,IAAI;AAC3B,QAAI;AAEJ,QAAI;AACF,WAAK,OAAO;AAEZ,eAAS,MAAM,KAAK,KAAK,QAAQ;AACjC,YAAM,OAAO,MAAM,KAAK,OAAO,gBAAgB;AAE/C,YAAM,eAAe,KAAK,IAAI,IAAI;AAClC,WAAK,oBAAoB,MAAM,YAAY;AAE3C,aAAO,MAAM,0BAA0B,YAAY,IAAI;AAAA,IACzD,SAAS,OAAgB;AACvB,YAAM,eAAe,KAAK,IAAI,IAAI;AAClC,WAAK,oBAAoB,OAAO,YAAY;AAE5C,aAAO,KAAK,6BAA6B,YAAY,OAAO,KAAK;AAEjE,UACE,KAAK,OAAO,kBACZ,KAAK,OAAO,sBAAsB,KAAK,OAAO,YAC9C;AACA,mBAAW,MAAM;AACf,eAAK,mBAAmB,EAAE,MAAM,MAAM;AAAA,UAEtC,CAAC;AAAA,QACH,GAAG,KAAK,OAAO,YAAY;AAAA,MAC7B;AAAA,IACF,UAAE;AACA,UAAI,QAAQ;AACV,eAAO,QAAQ;AAAA,MACjB;AAAA,IACF;AAAA,EACF;AAAA,EAEQ,oBAAoB,SAAkB,cAA4B;AACxE,SAAK,OAAO,YAAY,oBAAI,KAAK;AAEjC,QAAI,SAAS;AACX,WAAK,OAAO,YAAY;AACxB,WAAK,OAAO,sBAAsB;AAAA,IACpC,OAAO;AACL,WAAK,OAAO,YAAY;AACxB,WAAK,OAAO;AACZ,WAAK,OAAO;AAAA,IACd;AAGA,UAAM,SAAS,KAAK,IAAI,KAAK,OAAO,aAAa,EAAE;AACnD,SAAK,OAAO,uBACT,KAAK,OAAO,uBAAuB,SAAS,KAAK,gBAAgB;AAAA,EACtE;AAAA,EAEQ,gBAAsB;AAC5B,SAAK,QAAQ,kBAAkB,KAAK,KAAK;AACzC,SAAK,QAAQ,oBAAoB,KAAK,KAAK,aAAa,KAAK,KAAK;AAClE,SAAK,QAAQ,kBAAkB,KAAK,KAAK;AACzC,SAAK,QAAQ,SAAS,KAAK,IAAI,IAAI,KAAK,UAAU,QAAQ;AAG1D,QAAI,KAAK,aAAa,SAAS,GAAG;AAChC,WAAK,QAAQ,qBACX,KAAK,aAAa,OAAO,CAAC,KAAK,SAAS,MAAM,MAAM,CAAC,IACrD,KAAK,aAAa;AAGpB,UAAI,KAAK,aAAa,SAAS,KAAK;AAClC,aAAK,eAAe,KAAK,aAAa,MAAM,IAA
I;AAAA,MAClD;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,UAA+B;AACnC,UAAM,YAAY,KAAK,IAAI;AAE3B,QAAI;AACF,YAAM,SAAS,MAAM,KAAK,KAAK,QAAQ;AAGvC,YAAM,cAAc,KAAK,IAAI,IAAI;AACjC,WAAK,aAAa,KAAK,WAAW;AAGlC,UAAI,KAAK,eAAe,IAAI,MAAM,GAAG;AACnC,aAAK,eAAe,OAAO,MAAM;AACjC,eAAO,QAAQ,IAAI;AACnB,eAAO,KAAK,QAAQ;AAAA,MACtB;AAEA,aAAO;AAAA,IACT,SAAS,OAAgB;AACvB,WAAK,QAAQ;AACb,aAAO,MAAM,iCAAiC,KAAK;AACnD,YAAM;AAAA,IACR;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,QAAQ,QAAoB,OAA+B;AACzD,QAAI,OAAO;AAET,aAAO,QAAQ,IAAI;AAAA,IACrB,OAAO;AACL,aAAO,QAAQ;AAAA,IACjB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,oBAAoB,QAA0B;AAC5C,SAAK,eAAe,IAAI,MAAM;AAC9B,WAAO,KAAK,8CAA8C;AAAA,EAC5D;AAAA;AAAA;AAAA;AAAA,EAKA,aAAgC;AAC9B,SAAK,cAAc;AACnB,WAAO,EAAE,GAAG,KAAK,QAAQ;AAAA,EAC3B;AAAA;AAAA;AAAA;AAAA,EAKA,YAA8B;AAC5B,WAAO,EAAE,GAAG,KAAK,OAAO;AAAA,EAC1B;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,OAAyB;AAC7B,QAAI;AACF,YAAM,SAAS,MAAM,KAAK,QAAQ;AAClC,YAAM,OAAO,MAAM,UAAU;AAC7B,WAAK,QAAQ,MAAM;AACnB,aAAO;AAAA,IACT,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,YAAY;AACV,WAAO;AAAA,MACL,YAAY,KAAK,KAAK;AAAA,MACtB,WAAW,KAAK,KAAK;AAAA,MACrB,cAAc,KAAK,KAAK;AAAA,MACxB,QAAQ;AAAA,QACN,KAAK,KAAK,OAAO;AAAA,QACjB,KAAK,KAAK,OAAO;AAAA,QACjB,mBAAmB,KAAK,OAAO;AAAA,QAC/B,yBAAyB,KAAK,OAAO;AAAA,MACvC;AAAA,MACA,QAAQ,KAAK,UAAU;AAAA,MACvB,SAAS,KAAK,WAAW;AAAA,IAC3B;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,QAAuB;AAC3B,WAAO,KAAK,yBAAyB;AAGrC,QAAI,KAAK,kBAAkB;AACzB,oBAAc,KAAK,gBAAgB;AAAA,IACrC;AACA,QAAI,KAAK,cAAc;AACrB,oBAAc,KAAK,YAAY;AAAA,IACjC;AAGA,UAAM,KAAK,KAAK,IAAI;AAGpB,SAAK,eAAe,MAAM;AAE1B,SAAK,KAAK,OAAO;AACjB,WAAO,KAAK,wBAAwB;AAAA,EACtC;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,MAAM,YAAY,KAAsB;AAC5C,WAAO,KAAK,0BAA0B;AAEtC,UAAM,YAAY,KAAK,IAAI;AAE3B,WAAO,KAAK,KAAK,aAAa,KAAK,KAAK,YAAY,GAAG;AACrD,UAAI,KAAK,IAAI,IAAI,YAAY,WAAW;AACtC,eAAO,KAAK,2CAA2C;AACvD;AAAA,MACF;AAGA,YAAM,IAAI,QAAQ,CAAC,YAAY,WAAW,SAAS,GAAG,CAAC;AAAA,IACzD;AAEA,UAAM,KAAK,MAAM;AAAA,EACnB;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,MACJ,MACA,QAC0C;AAC1C,UAAM,SAAS,MAAM,KAAK,QAAQ;AAElC,QAAI;AACF,YAAM,SAAS,MAAM,OAAO,MAAM,MAAM,MAAM;AAC9C,aAAO;AAAA,QACL,MAAM,OAAO;AAAA,QACb,UAAU,OAAO,YAAY;AAAA,MAC/B;AAAA,IACF,UAAE;AACA,WAAK,QAAQ,MAAM;AAAA,IACrB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,YACJ,UACY;AACZ,UAAM,SAAS,MAAM,KAAK,QAAQ;AAElC,QAAI;AACF,YAAM,OAAO,MAAM,OAAO;AAC1B,YAAM,SAAS,MAAM,SAAS,MAAM;AACpC,YAAM,OAAO,MAAM,QAAQ;AAC3B,aAAO;AAAA,IACT,SAAS,OAAgB;AACvB,UAAI;AACF,cAAM,OAAO,MAAM,UAAU;AAAA,MAC/B,SAAS,eAAwB;AAC/B,eAAO,MAAM,gCAAgC,aAAa;AAC1D,aAAK,oBAAoB,MAAM;AAAA,MACjC;AACA,YAAM;AAAA,IACR,UAAE;AACA,WAAK,QAAQ,MAAM;AAAA,IACrB;AAAA,EACF;AACF;",
+ "sourcesContent": ["/**\n * Connection Pool Manager for ParadeDB\n * Manages PostgreSQL connection pooling with health checks and monitoring\n */\n\nimport { Pool, PoolClient, PoolConfig } from 'pg';\nimport { EventEmitter } from 'events';\nimport { logger } from '../monitoring/logger.js';\nimport { DatabaseError, ErrorCode } from '../errors/index.js';\n\nexport interface ConnectionPoolConfig extends PoolConfig {\n // Basic pool settings\n min?: number; // Minimum pool size (default: 2)\n max?: number; // Maximum pool size (default: 10)\n idleTimeoutMillis?: number; // Close idle connections after ms (default: 30000)\n connectionTimeoutMillis?: number; // Connection acquire timeout (default: 5000)\n\n // Health check settings\n healthCheckInterval?: number; // Health check frequency in ms (default: 30000)\n healthCheckQuery?: string; // Query to test connection health (default: 'SELECT 1')\n retryOnFailure?: boolean; // Retry failed connections (default: true)\n maxRetries?: number; // Max retry attempts (default: 3)\n retryDelayMs?: number; // Delay between retries (default: 1000)\n\n // Monitoring settings\n enableMetrics?: boolean; // Enable connection metrics (default: true)\n metricsInterval?: number; // Metrics collection interval (default: 60000)\n}\n\nexport interface ConnectionMetrics {\n totalConnections: number;\n idleConnections: number;\n activeConnections: number;\n waitingRequests: number;\n totalAcquired: number;\n totalReleased: number;\n totalErrors: number;\n averageAcquireTime: number;\n peakConnections: number;\n uptime: number;\n}\n\nexport interface ConnectionHealth {\n isHealthy: boolean;\n lastCheck: Date;\n consecutiveFailures: number;\n totalChecks: number;\n totalFailures: number;\n averageResponseTime: number;\n}\n\nexport class ConnectionPool extends EventEmitter {\n private pool: Pool;\n private config: Required<ConnectionPoolConfig>;\n private metrics: ConnectionMetrics;\n private health: ConnectionHealth;\n private healthCheckTimer?: NodeJS.Timeout;\n private metricsTimer?: NodeJS.Timeout;\n private startTime: Date;\n private badConnections = new Set<PoolClient>();\n private acquireTimes: number[] = [];\n\n constructor(config: ConnectionPoolConfig) {\n super();\n\n this.config = this.normalizeConfig(config);\n this.startTime = new Date();\n\n // Initialize metrics\n this.metrics = {\n totalConnections: 0,\n idleConnections: 0,\n activeConnections: 0,\n waitingRequests: 0,\n totalAcquired: 0,\n totalReleased: 0,\n totalErrors: 0,\n averageAcquireTime: 0,\n peakConnections: 0,\n uptime: 0,\n };\n\n // Initialize health\n this.health = {\n isHealthy: false,\n lastCheck: new Date(),\n consecutiveFailures: 0,\n totalChecks: 0,\n totalFailures: 0,\n averageResponseTime: 0,\n };\n\n // Create pool\n this.pool = new Pool(this.config);\n this.setupPoolEvents();\n\n // Start monitoring if enabled\n if (this.config.enableMetrics) {\n this.startMonitoring();\n }\n }\n\n private normalizeConfig(\n config: ConnectionPoolConfig\n ): Required<ConnectionPoolConfig> {\n return {\n ...config,\n min: config.min ?? 2,\n max: config.max ?? 10,\n idleTimeoutMillis: config.idleTimeoutMillis ?? 30000,\n connectionTimeoutMillis: config.connectionTimeoutMillis ?? 5000,\n healthCheckInterval: config.healthCheckInterval ?? 30000,\n healthCheckQuery: config.healthCheckQuery ?? 'SELECT 1',\n retryOnFailure: config.retryOnFailure ?? true,\n maxRetries: config.maxRetries ?? 3,\n retryDelayMs: config.retryDelayMs ?? 1000,\n enableMetrics: config.enableMetrics ?? 
true,\n metricsInterval: config.metricsInterval ?? 60000,\n };\n }\n\n private setupPoolEvents(): void {\n this.pool.on('connect', (client) => {\n logger.debug('New database connection established');\n this.metrics.totalConnections++;\n this.updatePeakConnections();\n this.emit('connect', client);\n });\n\n this.pool.on('acquire', (client) => {\n this.metrics.totalAcquired++;\n this.emit('acquire', client);\n });\n\n this.pool.on('release', (client) => {\n this.metrics.totalReleased++;\n this.emit('release', client);\n });\n\n this.pool.on('remove', (client) => {\n logger.debug('Database connection removed from pool');\n this.metrics.totalConnections--;\n this.emit('remove', client);\n });\n\n this.pool.on('error', (error) => {\n logger.error('Database pool error:', error);\n this.metrics.totalErrors++;\n this.emit('error', error);\n });\n }\n\n private updatePeakConnections(): void {\n const current = this.pool.totalCount;\n if (current > this.metrics.peakConnections) {\n this.metrics.peakConnections = current;\n }\n }\n\n private startMonitoring(): void {\n // Health checks\n if (this.config.healthCheckInterval > 0) {\n this.healthCheckTimer = setInterval(() => {\n this.performHealthCheck().catch((error) => {\n logger.error('Health check failed:', error);\n });\n }, this.config.healthCheckInterval);\n }\n\n // Metrics collection\n if (this.config.metricsInterval > 0) {\n this.metricsTimer = setInterval(() => {\n this.updateMetrics();\n this.emit('metrics', this.getMetrics());\n }, this.config.metricsInterval);\n }\n\n // Initial health check\n this.performHealthCheck().catch((error) => {\n logger.warn('Initial health check failed:', error);\n });\n }\n\n private async performHealthCheck(): Promise<void> {\n const startTime = Date.now();\n let client: PoolClient | undefined;\n\n try {\n this.health.totalChecks++;\n\n client = await this.pool.connect();\n await client.query(this.config.healthCheckQuery);\n\n const responseTime = Date.now() - startTime;\n this.updateHealthMetrics(true, responseTime);\n\n logger.debug(`Health check passed in ${responseTime}ms`);\n } catch (error: unknown) {\n const responseTime = Date.now() - startTime;\n this.updateHealthMetrics(false, responseTime);\n\n logger.warn(`Health check failed after ${responseTime}ms:`, error);\n\n if (\n this.config.retryOnFailure &&\n this.health.consecutiveFailures < this.config.maxRetries\n ) {\n setTimeout(() => {\n this.performHealthCheck().catch(() => {\n // Ignore retry failures\n });\n }, this.config.retryDelayMs);\n }\n } finally {\n if (client) {\n client.release();\n }\n }\n }\n\n private updateHealthMetrics(success: boolean, responseTime: number): void {\n this.health.lastCheck = new Date();\n\n if (success) {\n this.health.isHealthy = true;\n this.health.consecutiveFailures = 0;\n } else {\n this.health.isHealthy = false;\n this.health.consecutiveFailures++;\n this.health.totalFailures++;\n }\n\n // Update average response time (simple moving average of last 10 checks)\n const weight = Math.min(this.health.totalChecks, 10);\n this.health.averageResponseTime =\n (this.health.averageResponseTime * (weight - 1) + responseTime) / weight;\n }\n\n private updateMetrics(): void {\n this.metrics.idleConnections = this.pool.idleCount;\n this.metrics.activeConnections = this.pool.totalCount - this.pool.idleCount;\n this.metrics.waitingRequests = this.pool.waitingCount;\n this.metrics.uptime = Date.now() - this.startTime.getTime();\n\n // Update average acquire time\n if (this.acquireTimes.length > 0) {\n 
this.metrics.averageAcquireTime =\n this.acquireTimes.reduce((sum, time) => sum + time, 0) /\n this.acquireTimes.length;\n\n // Keep only recent acquire times (last 100)\n if (this.acquireTimes.length > 100) {\n this.acquireTimes = this.acquireTimes.slice(-100);\n }\n }\n }\n\n /**\n * Acquire a connection from the pool\n */\n async acquire(): Promise<PoolClient> {\n const startTime = Date.now();\n\n try {\n const client = await this.pool.connect();\n\n // Track acquire time\n const acquireTime = Date.now() - startTime;\n this.acquireTimes.push(acquireTime);\n\n // Check if connection is marked as bad\n if (this.badConnections.has(client)) {\n this.badConnections.delete(client);\n client.release(true); // Force removal\n return this.acquire(); // Try again\n }\n\n return client;\n } catch (error: unknown) {\n this.metrics.totalErrors++;\n logger.error('Failed to acquire connection:', error);\n throw new DatabaseError(\n 'Failed to acquire database connection',\n ErrorCode.DB_CONNECTION_FAILED,\n { pool: 'paradedb' },\n error instanceof Error ? error : undefined\n );\n }\n }\n\n /**\n * Release a connection back to the pool\n */\n release(client: PoolClient, error?: Error | boolean): void {\n if (error) {\n // Release with error - connection will be removed from pool\n client.release(true);\n } else {\n client.release();\n }\n }\n\n /**\n * Mark a connection as bad (will be removed on next acquire)\n */\n markConnectionAsBad(client: PoolClient): void {\n this.badConnections.add(client);\n logger.warn('Connection marked as bad and will be removed');\n }\n\n /**\n * Get current connection metrics\n */\n getMetrics(): ConnectionMetrics {\n this.updateMetrics();\n return { ...this.metrics };\n }\n\n /**\n * Get current health status\n */\n getHealth(): ConnectionHealth {\n return { ...this.health };\n }\n\n /**\n * Test connection to database\n */\n async ping(): Promise<boolean> {\n try {\n const client = await this.acquire();\n await client.query('SELECT 1');\n this.release(client);\n return true;\n } catch {\n return false;\n }\n }\n\n /**\n * Get pool status information\n */\n getStatus() {\n return {\n totalCount: this.pool.totalCount,\n idleCount: this.pool.idleCount,\n waitingCount: this.pool.waitingCount,\n config: {\n min: this.config.min,\n max: this.config.max,\n idleTimeoutMillis: this.config.idleTimeoutMillis,\n connectionTimeoutMillis: this.config.connectionTimeoutMillis,\n },\n health: this.getHealth(),\n metrics: this.getMetrics(),\n };\n }\n\n /**\n * Close all connections and clean up\n */\n async close(): Promise<void> {\n logger.info('Closing connection pool');\n\n // Clear timers\n if (this.healthCheckTimer) {\n clearInterval(this.healthCheckTimer);\n }\n if (this.metricsTimer) {\n clearInterval(this.metricsTimer);\n }\n\n // Close pool\n await this.pool.end();\n\n // Clear bad connections set\n this.badConnections.clear();\n\n this.emit('close');\n logger.info('Connection pool closed');\n }\n\n /**\n * Drain pool gracefully (wait for active connections to finish)\n */\n async drain(timeoutMs = 30000): Promise<void> {\n logger.info('Draining connection pool');\n\n const startTime = Date.now();\n\n while (this.pool.totalCount - this.pool.idleCount > 0) {\n if (Date.now() - startTime > timeoutMs) {\n logger.warn('Pool drain timeout reached, forcing close');\n break;\n }\n\n // Wait a bit before checking again\n await new Promise((resolve) => setTimeout(resolve, 100));\n }\n\n await this.close();\n }\n\n /**\n * Execute a query using a pooled connection\n */\n async query<T = 
any>(\n text: string,\n params?: any[]\n ): Promise<{ rows: T[]; rowCount: number }> {\n const client = await this.acquire();\n\n try {\n const result = await client.query(text, params);\n return {\n rows: result.rows,\n rowCount: result.rowCount || 0,\n };\n } finally {\n this.release(client);\n }\n }\n\n /**\n * Execute multiple queries in a transaction\n */\n async transaction<T>(\n callback: (client: PoolClient) => Promise<T>\n ): Promise<T> {\n const client = await this.acquire();\n\n try {\n await client.query('BEGIN');\n const result = await callback(client);\n await client.query('COMMIT');\n return result;\n } catch (error: unknown) {\n try {\n await client.query('ROLLBACK');\n } catch (rollbackError: unknown) {\n logger.error('Transaction rollback failed:', rollbackError);\n this.markConnectionAsBad(client);\n }\n throw new DatabaseError(\n 'Transaction failed',\n ErrorCode.DB_TRANSACTION_FAILED,\n { operation: 'transaction' },\n error instanceof Error ? error : undefined\n );\n } finally {\n this.release(client);\n }\n }\n}\n"],
+ "mappings": ";;;;AAKA,SAAS,YAAoC;AAC7C,SAAS,oBAAoB;AAC7B,SAAS,cAAc;AACvB,SAAS,eAAe,iBAAiB;AA2ClC,MAAM,uBAAuB,aAAa;AAAA,EACvC;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA;AAAA,EACA,iBAAiB,oBAAI,IAAgB;AAAA,EACrC,eAAyB,CAAC;AAAA,EAElC,YAAY,QAA8B;AACxC,UAAM;AAEN,SAAK,SAAS,KAAK,gBAAgB,MAAM;AACzC,SAAK,YAAY,oBAAI,KAAK;AAG1B,SAAK,UAAU;AAAA,MACb,kBAAkB;AAAA,MAClB,iBAAiB;AAAA,MACjB,mBAAmB;AAAA,MACnB,iBAAiB;AAAA,MACjB,eAAe;AAAA,MACf,eAAe;AAAA,MACf,aAAa;AAAA,MACb,oBAAoB;AAAA,MACpB,iBAAiB;AAAA,MACjB,QAAQ;AAAA,IACV;AAGA,SAAK,SAAS;AAAA,MACZ,WAAW;AAAA,MACX,WAAW,oBAAI,KAAK;AAAA,MACpB,qBAAqB;AAAA,MACrB,aAAa;AAAA,MACb,eAAe;AAAA,MACf,qBAAqB;AAAA,IACvB;AAGA,SAAK,OAAO,IAAI,KAAK,KAAK,MAAM;AAChC,SAAK,gBAAgB;AAGrB,QAAI,KAAK,OAAO,eAAe;AAC7B,WAAK,gBAAgB;AAAA,IACvB;AAAA,EACF;AAAA,EAEQ,gBACN,QACgC;AAChC,WAAO;AAAA,MACL,GAAG;AAAA,MACH,KAAK,OAAO,OAAO;AAAA,MACnB,KAAK,OAAO,OAAO;AAAA,MACnB,mBAAmB,OAAO,qBAAqB;AAAA,MAC/C,yBAAyB,OAAO,2BAA2B;AAAA,MAC3D,qBAAqB,OAAO,uBAAuB;AAAA,MACnD,kBAAkB,OAAO,oBAAoB;AAAA,MAC7C,gBAAgB,OAAO,kBAAkB;AAAA,MACzC,YAAY,OAAO,cAAc;AAAA,MACjC,cAAc,OAAO,gBAAgB;AAAA,MACrC,eAAe,OAAO,iBAAiB;AAAA,MACvC,iBAAiB,OAAO,mBAAmB;AAAA,IAC7C;AAAA,EACF;AAAA,EAEQ,kBAAwB;AAC9B,SAAK,KAAK,GAAG,WAAW,CAAC,WAAW;AAClC,aAAO,MAAM,qCAAqC;AAClD,WAAK,QAAQ;AACb,WAAK,sBAAsB;AAC3B,WAAK,KAAK,WAAW,MAAM;AAAA,IAC7B,CAAC;AAED,SAAK,KAAK,GAAG,WAAW,CAAC,WAAW;AAClC,WAAK,QAAQ;AACb,WAAK,KAAK,WAAW,MAAM;AAAA,IAC7B,CAAC;AAED,SAAK,KAAK,GAAG,WAAW,CAAC,WAAW;AAClC,WAAK,QAAQ;AACb,WAAK,KAAK,WAAW,MAAM;AAAA,IAC7B,CAAC;AAED,SAAK,KAAK,GAAG,UAAU,CAAC,WAAW;AACjC,aAAO,MAAM,uCAAuC;AACpD,WAAK,QAAQ;AACb,WAAK,KAAK,UAAU,MAAM;AAAA,IAC5B,CAAC;AAED,SAAK,KAAK,GAAG,SAAS,CAAC,UAAU;AAC/B,aAAO,MAAM,wBAAwB,KAAK;AAC1C,WAAK,QAAQ;AACb,WAAK,KAAK,SAAS,KAAK;AAAA,IAC1B,CAAC;AAAA,EACH;AAAA,EAEQ,wBAA8B;AACpC,UAAM,UAAU,KAAK,KAAK;AAC1B,QAAI,UAAU,KAAK,QAAQ,iBAAiB;AAC1C,WAAK,QAAQ,kBAAkB;AAAA,IACjC;AAAA,EACF;AAAA,EAEQ,kBAAwB;AAE9B,QAAI,KAAK,OAAO,sBAAsB,GAAG;AACvC,WAAK,mBAAmB,YAAY,MAAM;AACxC,aAAK,mBAAmB,EAAE,MAAM,CAAC,UAAU;AACzC,iBAAO,MAAM,wBAAwB,KAAK;AAAA,QAC5C,CAAC;AAAA,MACH,GAAG,KAAK,OAAO,mBAAmB;AAAA,IACpC;AAGA,QAAI,KAAK,OAAO,kBAAkB,GAAG;AACnC,WAAK,eAAe,YAAY,MAAM;AACpC,aAAK,cAAc;AACnB,aAAK,KAAK,WAAW,KAAK,WAAW,CAAC;AAAA,MACxC,GAAG,KAAK,OAAO,eAAe;AAAA,IAChC;AAGA,SAAK,mBAAmB,EAAE,MAAM,CAAC,UAAU;AACzC,aAAO,KAAK,gCAAgC,KAAK;AAAA,IACnD,CAAC;AAAA,EACH;AAAA,EAEA,MAAc,qBAAoC;AAChD,UAAM,YAAY,KAAK,IAAI;AAC3B,QAAI;AAEJ,QAAI;AACF,WAAK,OAAO;AAEZ,eAAS,MAAM,KAAK,KAAK,QAAQ;AACjC,YAAM,OAAO,MAAM,KAAK,OAAO,gBAAgB;AAE/C,YAAM,eAAe,KAAK,IAAI,IAAI;AAClC,WAAK,oBAAoB,MAAM,YAAY;AAE3C,aAAO,MAAM,0BAA0B,YAAY,IAAI;AAAA,IACzD,SAAS,OAAgB;AACvB,YAAM,eAAe,KAAK,IAAI,IAAI;AAClC,WAAK,oBAAoB,OAAO,YAAY;AAE5C,aAAO,KAAK,6BAA6B,YAAY,OAAO,KAAK;AAEjE,UACE,KAAK,OAAO,kBACZ,KAAK,OAAO,sBAAsB,KAAK,OAAO,YAC9C;AACA,mBAAW,MAAM;AACf,eAAK,mBAAmB,EAAE,MAAM,MAAM;AAAA,UAEtC,CAAC;AAAA,QACH,GAAG,KAAK,OAAO,YAAY;AAAA,MAC7B;AAAA,IACF,UAAE;AACA,UAAI,QAAQ;AACV,eAAO,QAAQ;AAAA,MACjB;AAAA,IACF;AAAA,EACF;AAAA,EAEQ,oBAAoB,SAAkB,cAA4B;AACxE,SAAK,OAAO,YAAY,oBAAI,KAAK;AAEjC,QAAI,SAAS;AACX,WAAK,OAAO,YAAY;AACxB,WAAK,OAAO,sBAAsB;AAAA,IACpC,OAAO;AACL,WAAK,OAAO,YAAY;AACxB,WAAK,OAAO;AACZ,WAAK,OAAO;AAAA,IACd;AAGA,UAAM,SAAS,KAAK,IAAI,KAAK,OAAO,aAAa,EAAE;AACnD,SAAK,OAAO,uBACT,KAAK,OAAO,uBAAuB,SAAS,KAAK,gBAAgB;AAAA,EACtE;AAAA,EAEQ,gBAAsB;AAC5B,SAAK,QAAQ,kBAAkB,KAAK,KAAK;AACzC,SAAK,QAAQ,oBAAoB,KAAK,KAAK,aAAa,KAAK,KAAK;AAClE,SAAK,QAAQ,kBAAkB,KAAK,KAAK;AACzC,SAAK,QAAQ,SAAS,KAAK,IAAI,IAAI,KAAK,UAAU,QAAQ;AAG1D,QAAI,KAAK,aAAa,SAAS,GAAG;AAChC,WAAK,QAAQ,qBACX,KAAK,aAAa,OAAO,CAAC,KAAK,SAAS,MAAM,MAAM,CAAC,IACrD,KAAK,aAAa;AAGpB,UAAI,KAAK,aAAa,SAAS,KAAK;AAClC,aAAK,
eAAe,KAAK,aAAa,MAAM,IAAI;AAAA,MAClD;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,UAA+B;AACnC,UAAM,YAAY,KAAK,IAAI;AAE3B,QAAI;AACF,YAAM,SAAS,MAAM,KAAK,KAAK,QAAQ;AAGvC,YAAM,cAAc,KAAK,IAAI,IAAI;AACjC,WAAK,aAAa,KAAK,WAAW;AAGlC,UAAI,KAAK,eAAe,IAAI,MAAM,GAAG;AACnC,aAAK,eAAe,OAAO,MAAM;AACjC,eAAO,QAAQ,IAAI;AACnB,eAAO,KAAK,QAAQ;AAAA,MACtB;AAEA,aAAO;AAAA,IACT,SAAS,OAAgB;AACvB,WAAK,QAAQ;AACb,aAAO,MAAM,iCAAiC,KAAK;AACnD,YAAM,IAAI;AAAA,QACR;AAAA,QACA,UAAU;AAAA,QACV,EAAE,MAAM,WAAW;AAAA,QACnB,iBAAiB,QAAQ,QAAQ;AAAA,MACnC;AAAA,IACF;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,QAAQ,QAAoB,OAA+B;AACzD,QAAI,OAAO;AAET,aAAO,QAAQ,IAAI;AAAA,IACrB,OAAO;AACL,aAAO,QAAQ;AAAA,IACjB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,oBAAoB,QAA0B;AAC5C,SAAK,eAAe,IAAI,MAAM;AAC9B,WAAO,KAAK,8CAA8C;AAAA,EAC5D;AAAA;AAAA;AAAA;AAAA,EAKA,aAAgC;AAC9B,SAAK,cAAc;AACnB,WAAO,EAAE,GAAG,KAAK,QAAQ;AAAA,EAC3B;AAAA;AAAA;AAAA;AAAA,EAKA,YAA8B;AAC5B,WAAO,EAAE,GAAG,KAAK,OAAO;AAAA,EAC1B;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,OAAyB;AAC7B,QAAI;AACF,YAAM,SAAS,MAAM,KAAK,QAAQ;AAClC,YAAM,OAAO,MAAM,UAAU;AAC7B,WAAK,QAAQ,MAAM;AACnB,aAAO;AAAA,IACT,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,YAAY;AACV,WAAO;AAAA,MACL,YAAY,KAAK,KAAK;AAAA,MACtB,WAAW,KAAK,KAAK;AAAA,MACrB,cAAc,KAAK,KAAK;AAAA,MACxB,QAAQ;AAAA,QACN,KAAK,KAAK,OAAO;AAAA,QACjB,KAAK,KAAK,OAAO;AAAA,QACjB,mBAAmB,KAAK,OAAO;AAAA,QAC/B,yBAAyB,KAAK,OAAO;AAAA,MACvC;AAAA,MACA,QAAQ,KAAK,UAAU;AAAA,MACvB,SAAS,KAAK,WAAW;AAAA,IAC3B;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,QAAuB;AAC3B,WAAO,KAAK,yBAAyB;AAGrC,QAAI,KAAK,kBAAkB;AACzB,oBAAc,KAAK,gBAAgB;AAAA,IACrC;AACA,QAAI,KAAK,cAAc;AACrB,oBAAc,KAAK,YAAY;AAAA,IACjC;AAGA,UAAM,KAAK,KAAK,IAAI;AAGpB,SAAK,eAAe,MAAM;AAE1B,SAAK,KAAK,OAAO;AACjB,WAAO,KAAK,wBAAwB;AAAA,EACtC;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,MAAM,YAAY,KAAsB;AAC5C,WAAO,KAAK,0BAA0B;AAEtC,UAAM,YAAY,KAAK,IAAI;AAE3B,WAAO,KAAK,KAAK,aAAa,KAAK,KAAK,YAAY,GAAG;AACrD,UAAI,KAAK,IAAI,IAAI,YAAY,WAAW;AACtC,eAAO,KAAK,2CAA2C;AACvD;AAAA,MACF;AAGA,YAAM,IAAI,QAAQ,CAAC,YAAY,WAAW,SAAS,GAAG,CAAC;AAAA,IACzD;AAEA,UAAM,KAAK,MAAM;AAAA,EACnB;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,MACJ,MACA,QAC0C;AAC1C,UAAM,SAAS,MAAM,KAAK,QAAQ;AAElC,QAAI;AACF,YAAM,SAAS,MAAM,OAAO,MAAM,MAAM,MAAM;AAC9C,aAAO;AAAA,QACL,MAAM,OAAO;AAAA,QACb,UAAU,OAAO,YAAY;AAAA,MAC/B;AAAA,IACF,UAAE;AACA,WAAK,QAAQ,MAAM;AAAA,IACrB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,MAAM,YACJ,UACY;AACZ,UAAM,SAAS,MAAM,KAAK,QAAQ;AAElC,QAAI;AACF,YAAM,OAAO,MAAM,OAAO;AAC1B,YAAM,SAAS,MAAM,SAAS,MAAM;AACpC,YAAM,OAAO,MAAM,QAAQ;AAC3B,aAAO;AAAA,IACT,SAAS,OAAgB;AACvB,UAAI;AACF,cAAM,OAAO,MAAM,UAAU;AAAA,MAC/B,SAAS,eAAwB;AAC/B,eAAO,MAAM,gCAAgC,aAAa;AAC1D,aAAK,oBAAoB,MAAM;AAAA,MACjC;AACA,YAAM,IAAI;AAAA,QACR;AAAA,QACA,UAAU;AAAA,QACV,EAAE,WAAW,cAAc;AAAA,QAC3B,iBAAiB,QAAQ,QAAQ;AAAA,MACnC;AAAA,IACF,UAAE;AACA,WAAK,QAAQ,MAAM;AAAA,IACrB;AAAA,EACF;AACF;",
  "names": []
  }
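
The substantive change embedded in this source map (visible in the `sourcesContent` diff above) is that `ConnectionPool.acquire()` and `ConnectionPool.transaction()` now throw a typed `DatabaseError` with `ErrorCode.DB_CONNECTION_FAILED` or `ErrorCode.DB_TRANSACTION_FAILED` instead of rethrowing the raw `pg` error; the original error still appears to be forwarded as the last constructor argument, so it is not lost. The sketch below shows how calling code might branch on those codes. It is a minimal sketch based only on the source embedded in this map: the import paths, the `code` property on `DatabaseError`, and the `frames` table are assumptions, not a documented public API of the package.

```typescript
// Hypothetical consumer of the pooled ParadeDB connection.
// Import paths and the DatabaseError/ErrorCode shapes are assumptions
// inferred from the sourcesContent in this diff, not documented exports.
import { ConnectionPool } from './dist/core/database/connection-pool.js';
import { DatabaseError, ErrorCode } from './dist/core/errors/index.js';

const pool = new ConnectionPool({
  connectionString: process.env.DATABASE_URL, // standard pg PoolConfig option
  max: 10,
});

async function countFrames(): Promise<number> {
  try {
    // transaction() runs BEGIN/COMMIT around the callback and rolls back
    // (marking the connection bad if even ROLLBACK fails) on error.
    return await pool.transaction(async (client) => {
      // 'frames' is a hypothetical table used only for illustration.
      const result = await client.query('SELECT COUNT(*) AS n FROM frames');
      return Number(result.rows[0].n);
    });
  } catch (error) {
    if (error instanceof DatabaseError) {
      // As of 0.5.31 the pool surfaces typed codes rather than raw pg errors.
      // The `code` property name is an assumption about DatabaseError's shape.
      if (error.code === ErrorCode.DB_CONNECTION_FAILED) {
        // e.g. back off and retry, or fall back to the SQLite adapter
      } else if (error.code === ErrorCode.DB_TRANSACTION_FAILED) {
        // the transaction was rolled back; the whole unit can be retried
      }
    }
    throw error;
  } finally {
    await pool.close();
  }
}
```

Because the underlying `pg` error is passed through as the wrapped error's cause, callers that previously inspected the raw error for logging can still reach it, while routing logic can switch on the stable error codes instead of string matching.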