@docstack/pouchdb-adapter-googledrive 0.0.2 → 0.0.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -6,10 +6,13 @@ A persistent, serverless PouchDB adapter that uses Google Drive as a backend sto
6
6
 
7
7
 - **🚀 Append-Only Log**: Uses an efficient append-only log pattern for fast, conflict-free writes.
8
8
 - **⚡ Lazy Loading**: Optimizes memory and bandwidth by loading only the **Index** into memory. Document bodies are fetched on-demand.
9
- - **🛡️ Optimistic Concurrency Control**: Uses ETag-based locking on metadata to prevent race conditions and data loss during simultaneous updates.
10
- - **🔄 Replication Ready**: Fully automated support for PouchDB's `sync` and `replicate` protocols (bilateral sync).
11
- - **📦 Auto-Compaction**: Automatically merges logs into snapshots to keep performance high.
12
- - **💾 Offline/Resilient**: Retry logic with exponential backoff handles network instability and "thundering herd" scenarios.
9
+ - **🛡️ Optimistic Concurrency Control**: Uses ETag-based locking on metadata to prevent race conditions.
10
+ - **📦 Auto-Compaction**: Automatically merges change logs into snapshots to keep reads fast.
11
+ - **🌍 Universal**: Works natively in Node.js 18+, browsers, and edge environments (no `googleapis` dependency).
12
+
13
+ ## Requirements
14
+
15
+ - **Node.js 18+** (for global `fetch` support) or a modern browser.
13
16
 
14
17
  ## Installation
15
18
 
@@ -19,74 +22,50 @@ npm install @docstack/pouchdb-adapter-googledrive
19
22
 
20
23
  ## Usage
21
24
 
25
+ The adapter is initialized as a plugin with your Google Drive access token.
26
+
22
27
  ```typescript
23
28
  import PouchDB from 'pouchdb-core';
24
29
  import GoogleDriveAdapter from '@docstack/pouchdb-adapter-googledrive';
25
- import { google } from 'googleapis';
26
-
27
- // Register the adapter
28
- PouchDB.plugin(GoogleDriveAdapter);
29
-
30
- // Setup Google Drive Client (Authenticated)
31
- const oauth2Client = new google.auth.OAuth2(
32
- YOUR_CLIENT_ID,
33
- YOUR_CLIENT_SECRET,
34
- YOUR_REDIRECT_URL
35
- );
36
- oauth2Client.setCredentials({ access_token: '...' });
37
-
38
- const drive = google.drive({ version: 'v3', auth: oauth2Client });
39
-
40
- // Create Database
41
- const db = new PouchDB('my-drive-db', {
42
- adapter: 'googledrive',
43
- drive: drive, // valid googleapis Drive instance
44
- folderId: '...', // Optional: Storage Folder ID (recommended)
45
- folderName: 'my-db', // Optional: Folder name (created if not exists)
46
- pollingIntervalMs: 2000, // Optional: Check for remote changes
47
- compactionThreshold: 50, // Optional: Entries before auto-compaction
48
- cacheSize: 1000 // Optional: Number of document bodies to keep in LRU cache
30
+
31
+ // 1. Initialize the Adapter Plugin Factory
32
+ const adapterPlugin = GoogleDriveAdapter({
33
+ accessToken: 'YOUR_GOOGLE_ACCESS_TOKEN',
34
+ folderName: 'my-app-db-folder', // Root folder in Drive
35
+ pollingIntervalMs: 2000 // Optional: check for remote changes
36
+ });
37
+
38
+ // 2. Register Plugin
39
+ PouchDB.plugin(adapterPlugin);
40
+
41
+ // 3. Create Database
42
+ const db = new PouchDB('user_db', {
43
+ adapter: 'googledrive'
49
44
  });
50
45
 
51
- // Use standard PouchDB API
52
- await db.put({ _id: 'doc1', title: 'Hello Drive!' });
53
- const doc = await db.get('doc1');
46
+ await db.post({ title: 'Hello World' });
47
+ ```
48
+
49
+ ### Dynamic Tokens
50
+
51
+ If your token expires, you can provide an async function that returns a valid token:
52
+
53
+ ```typescript
54
+ const adapterPlugin = GoogleDriveAdapter({
55
+ accessToken: async () => {
56
+ const session = await getMySession();
57
+ return session.accessToken;
58
+ },
59
+ folderName: 'my-app-db'
60
+ });
54
61
  ```
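Internally, the client resolves the token immediately before every Drive request (`getToken()` in `lib/client.js`), so a refreshed token returned by the callback is picked up automatically without re-creating the database.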
55
62
 
56
63
  ## Architecture
57
64
 
58
- The adapter implements a **"Remote-First"** architecture designed for scale:
59
-
60
- ### 1. Storage Structure
61
- Inside your Google Drive folder, you will see:
62
- - `_meta.json`: The "Lock File". Tracks the sequence number and active log pointers.
63
- - `snapshot-index.json`: A lightweight map of `DocID -> { Revision, FilePointer }`. Loaded at startup.
64
- - `snapshot-data.json`: Large payload files containing document bodies. **Not loaded** until requested.
65
- - `changes-{seq}-{uuid}.ndjson`: Immutable append-only logs for recent updates.
66
-
67
- ### 2. Lazy Loading & Caching
68
- - **Startup**: The client downloads only `_meta.json` and `snapshot-index.json` (~MBs even for large DBs).
69
- - **Access**: `db.get(id)` checks a local **LRU Cache**. If missing, it fetches the specific file containing that document from Drive.
70
- - **Sync**: `db.changes()` iterates the local index, ensuring fast replication without downloading full content.
71
-
72
- ### 3. Concurrency
73
- - **Writes**: Every write creates a new unique `changes-*.ndjson` file.
74
- - **Commit**: The adapter attempts to update `_meta.json` with an ETag check (`If-Match`).
75
- - **Conflict**: If `_meta.json` was changed by another client, the write retries automatically after re-syncing the index.
76
-
77
- ## Testing
78
-
79
- To run the tests, you need to provide Google Drive API credentials.
80
-
81
- 1. Copy `.env.example` to `.env`:
82
- ```bash
83
- cp .env.example .env
84
- ```
85
- 2. Fill in your Google Cloud credentials in `.env`.
86
- 3. Run the tests:
87
- ```bash
88
- npm test
89
- ```
65
+ The adapter implements a **"Remote-First"** architecture:
66
+ - **Lazy Loading**: `db.get(id)` fetches data on-demand from Drive.
67
+ - **Caching**: The change index is held in memory, while document bodies are kept in an LRU cache.
68
+ - **Resilience**: Writes use optimistic locking to handle multi-client concurrency safely (see the sketch below).
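
To make the optimistic-locking bullet concrete, here is a minimal sketch of the commit loop this architecture implies. The `MetaStore` interface and its methods are hypothetical stand-ins for the adapter's internal Drive calls, not part of the public API:

```typescript
// Hypothetical helper shapes -- illustration only, not the adapter's actual internals.
interface Meta { seq: number; changeLogIds: string[]; }
interface MetaStore {
  readMeta(): Promise<{ meta: Meta; etag: string }>;         // download _meta.json plus its ETag
  uploadChangeLog(ndjson: string): Promise<string>;          // create an immutable changes-*.ndjson, return its file id
  writeMetaIfMatch(meta: Meta, etag: string): Promise<void>; // update _meta.json with an If-Match guard
}

async function commitWithRetry(store: MetaStore, changes: object[], maxAttempts = 5): Promise<void> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { meta, etag } = await store.readMeta();
    const logId = await store.uploadChangeLog(changes.map(c => JSON.stringify(c)).join('\n') + '\n');
    try {
      await store.writeMetaIfMatch(
        { seq: meta.seq + changes.length, changeLogIds: [...meta.changeLogIds, logId] },
        etag
      );
      return; // success: the new log file is now referenced by _meta.json
    } catch (err: any) {
      if (err?.status === 412 || err?.status === 409) continue; // another client won the race: re-read and retry
      throw err;
    }
  }
  throw new Error('Commit failed after repeated ETag conflicts');
}
```

On a 412/409 the loop simply re-reads the metadata and retries, mirroring the retry handling visible in `lib/drive.js` (`err.status === 412 || err.status === 409`).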
90
69
 
91
70
  ## License
92
71
 
package/lib/client.d.ts ADDED
@@ -0,0 +1,29 @@
1
+ export interface DriveFile {
2
+ id: string;
3
+ name: string;
4
+ mimeType: string;
5
+ parents?: string[];
6
+ etag?: string;
7
+ }
8
+ export interface DriveClientOptions {
9
+ accessToken: string | (() => Promise<string>);
10
+ }
11
+ export declare class GoogleDriveClient {
12
+ private options;
13
+ constructor(options: DriveClientOptions);
14
+ private getToken;
15
+ private fetch;
16
+ listFiles(q: string): Promise<DriveFile[]>;
17
+ getFile(fileId: string): Promise<any>;
18
+ getFileMetadata(fileId: string): Promise<DriveFile>;
19
+ createFile(name: string, parents: string[] | undefined, mimeType: string, content: string): Promise<{
20
+ id: string;
21
+ etag: string;
22
+ }>;
23
+ updateFile(fileId: string, content: string, expectedEtag?: string): Promise<{
24
+ id: string;
25
+ etag: string;
26
+ }>;
27
+ deleteFile(fileId: string): Promise<void>;
28
+ private buildMultipart;
29
+ }
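
For orientation, a brief sketch of how this internal client could be exercised directly. The deep import path, folder ID, and token are assumptions for illustration; the adapter normally constructs the client itself from the plugin options:

```typescript
// Illustration only: not a documented entry point of the package.
import { GoogleDriveClient } from '@docstack/pouchdb-adapter-googledrive/lib/client';

const client = new GoogleDriveClient({
  accessToken: async () => 'ya29.dummy-token', // a plain string token is also accepted
});

async function demo(folderId: string) {
  // List non-trashed files inside the database folder
  const files = await client.listFiles(`'${folderId}' in parents and trashed = false`);
  console.log(files.map(f => f.name));

  // Create a JSON file, then overwrite it guarded by the returned ETag
  const created = await client.createFile('example.json', [folderId], 'application/json', '{"n":1}');
  await client.updateFile(created.id, '{"n":2}', created.etag); // rejects with err.status === 412 on ETag mismatch
}
```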
package/lib/client.js ADDED
@@ -0,0 +1,119 @@
1
+ "use strict";
2
+ Object.defineProperty(exports, "__esModule", { value: true });
3
+ exports.GoogleDriveClient = void 0;
4
+ const BASE_URL = 'https://www.googleapis.com/drive/v3/files';
5
+ const UPLOAD_URL = 'https://www.googleapis.com/upload/drive/v3/files';
6
+ class GoogleDriveClient {
7
+ constructor(options) {
8
+ this.options = options;
9
+ }
10
+ async getToken() {
11
+ if (typeof this.options.accessToken === 'function') {
12
+ return await this.options.accessToken();
13
+ }
14
+ return this.options.accessToken;
15
+ }
16
+ async fetch(url, init) {
17
+ const token = await this.getToken();
18
+ const headers = new Headers(init.headers);
19
+ headers.set('Authorization', `Bearer ${token}`);
20
+ const res = await fetch(url, { ...init, headers });
21
+ if (!res.ok) {
22
+ // Basic error handling
23
+ const text = await res.text();
24
+ let errorMsg = `Drive API Error: ${res.status} ${res.statusText}`;
25
+ try {
26
+ const json = JSON.parse(text);
27
+ if (json.error && json.error.message) {
28
+ errorMsg += ` - ${json.error.message}`;
29
+ }
30
+ }
31
+ catch { }
32
+ const err = new Error(errorMsg);
33
+ err.status = res.status;
34
+ throw err;
35
+ }
36
+ return res;
37
+ }
38
+ async listFiles(q) {
39
+ const params = new URLSearchParams({
40
+ q,
41
+ fields: 'files(id, name, mimeType, parents, etag)',
42
+ spaces: 'drive',
43
+ pageSize: '1000' // Drive's maximum page size; avoids pagination for typical folders
44
+ });
45
+ const res = await this.fetch(`${BASE_URL}?${params.toString()}`, { method: 'GET' });
46
+ const data = await res.json();
47
+ return data.files || [];
48
+ }
49
+ async getFile(fileId) {
50
+ // Try getting media
51
+ try {
52
+ const params = new URLSearchParams({ alt: 'media' });
53
+ const res = await this.fetch(`${BASE_URL}/${fileId}?${params.toString()}`, { method: 'GET' });
54
+ // Callers in drive.js (downloadJson / downloadNdjson) expect either parsed JSON
+ // or raw text, so branch on the response Content-Type: parse JSON responses,
+ // return everything else (e.g. NDJSON logs) as text.
60
+ const contentType = res.headers.get('content-type');
61
+ if (contentType && contentType.includes('application/json')) {
62
+ return await res.json();
63
+ }
64
+ return await res.text();
65
+ }
66
+ catch (e) {
67
+ throw e;
68
+ }
69
+ }
70
+ // Single metadata get (for etag check)
71
+ async getFileMetadata(fileId) {
72
+ const params = new URLSearchParams({ fields: 'id, name, mimeType, parents, etag' });
73
+ const res = await this.fetch(`${BASE_URL}/${fileId}?${params.toString()}`, { method: 'GET' });
74
+ return await res.json();
75
+ }
76
+ async createFile(name, parents, mimeType, content) {
77
+ const metadata = {
78
+ name,
79
+ mimeType,
80
+ parents
81
+ };
82
+ const multipartBody = this.buildMultipart(metadata, content, mimeType);
83
+ const res = await this.fetch(`${UPLOAD_URL}?uploadType=multipart&fields=id,etag`, {
84
+ method: 'POST',
85
+ headers: {
86
+ 'Content-Type': `multipart/related; boundary=${multipartBody.boundary}`
87
+ },
88
+ body: multipartBody.body
89
+ });
90
+ return await res.json();
91
+ }
92
+ async updateFile(fileId, content, expectedEtag) {
93
+ // Media (content) update. saveMeta() uses this path, passing expectedEtag so the
+ // write only succeeds if the remote file is unchanged (If-Match).
95
+ const res = await this.fetch(`${UPLOAD_URL}/${fileId}?uploadType=media&fields=id,etag`, {
96
+ method: 'PATCH',
97
+ headers: expectedEtag ? { 'If-Match': expectedEtag, 'Content-Type': 'application/json' } : { 'Content-Type': 'application/json' },
98
+ body: content
99
+ });
100
+ return await res.json();
101
+ }
102
+ async deleteFile(fileId) {
103
+ await this.fetch(`${BASE_URL}/${fileId}`, { method: 'DELETE' });
104
+ }
105
+ buildMultipart(metadata, content, contentType) {
106
+ const boundary = '-------' + Math.random().toString(36).substring(2);
107
+ const delimiter = `\r\n--${boundary}\r\n`;
108
+ const closeDelimiter = `\r\n--${boundary}--`;
109
+ const body = delimiter +
110
+ 'Content-Type: application/json\r\n\r\n' +
111
+ JSON.stringify(metadata) +
112
+ delimiter +
113
+ `Content-Type: ${contentType}\r\n\r\n` +
114
+ content +
115
+ closeDelimiter;
116
+ return { body, boundary };
117
+ }
118
+ }
119
+ exports.GoogleDriveClient = GoogleDriveClient;
package/lib/drive.d.ts CHANGED
@@ -10,7 +10,7 @@ import { GoogleDriveAdapterOptions, ChangeEntry, IndexEntry } from './types';
10
10
  * └── changes-*.ndjson # Append logs
11
11
  */
12
12
  export declare class DriveHandler {
13
- private drive;
13
+ private client;
14
14
  private folderId;
15
15
  private folderName;
16
16
  private parents;
package/lib/drive.js CHANGED
@@ -2,6 +2,7 @@
2
2
  Object.defineProperty(exports, "__esModule", { value: true });
3
3
  exports.DriveHandler = void 0;
4
4
  const cache_1 = require("./cache");
5
+ const client_1 = require("./client");
5
6
  const DEFAULT_COMPACTION_THRESHOLD = 100; // entries
6
7
  const DEFAULT_SIZE_THRESHOLD = 1024 * 1024; // 1MB
7
8
  const DEFAULT_CACHE_SIZE = 1000; // Number of docs
@@ -32,7 +33,7 @@ class DriveHandler {
32
33
  this.currentLogSizeEstimate = 0;
33
34
  this.listeners = [];
34
35
  this.pollingInterval = null;
35
- this.drive = options.drive;
36
+ this.client = new client_1.GoogleDriveClient(options);
36
37
  this.folderId = options.folderId || null;
37
38
  this.folderName = options.folderName || dbName;
38
39
  this.parents = options.parents || [];
@@ -267,7 +268,7 @@ class DriveHandler {
267
268
  return await this.tryAppendChanges(changes);
268
269
  }
269
270
  catch (err) {
270
- if (err.code === 412 || err.code === 409) {
271
+ if (err.status === 412 || err.status === 409) {
271
272
  // Reload and RETRY
272
273
  await this.load();
273
274
  // Check conflicts against Index (Metadata sufficient)
@@ -359,16 +360,8 @@ class DriveHandler {
359
360
  });
360
361
  // 2. Upload Data File
361
362
  const dataContent = JSON.stringify(snapshotData);
362
- const dataRes = await this.drive.files.create({
363
- requestBody: {
364
- name: `snapshot-data-${Date.now()}.json`,
365
- parents: [this.folderId],
366
- mimeType: 'application/json'
367
- },
368
- media: { mimeType: 'application/json', body: dataContent },
369
- fields: 'id'
370
- });
371
- const dataFileId = dataRes.data.id;
363
+ const dataRes = await this.client.createFile(`snapshot-data-${Date.now()}.json`, [this.folderId], 'application/json', dataContent);
364
+ const dataFileId = dataRes.id;
372
365
  // 3. Create Index pointing to this Data File
373
366
  const newIndexEntries = {};
374
367
  for (const id of Object.keys(snapshotData.docs)) {
@@ -384,16 +377,8 @@ class DriveHandler {
384
377
  createdAt: Date.now()
385
378
  };
386
379
  const indexContent = JSON.stringify(snapshotIndex);
387
- const indexRes = await this.drive.files.create({
388
- requestBody: {
389
- name: `snapshot-index-${Date.now()}.json`,
390
- parents: [this.folderId],
391
- mimeType: 'application/json'
392
- },
393
- media: { mimeType: 'application/json', body: indexContent },
394
- fields: 'id'
395
- });
396
- const newIndexId = indexRes.data.id;
380
+ const indexRes = await this.client.createFile(`snapshot-index-${Date.now()}.json`, [this.folderId], 'application/json', indexContent);
381
+ const newIndexId = indexRes.id;
397
382
  // 4. Update Meta
398
383
  await this.atomicUpdateMeta((latest) => {
399
384
  const remainingLogs = latest.changeLogIds.filter(id => !oldLogIds.includes(id));
@@ -424,7 +409,7 @@ class DriveHandler {
424
409
  return;
425
410
  }
426
411
  catch (err) {
427
- if (err.code === 412 || err.code === 409) {
412
+ if (err.status === 412 || err.status === 409) {
428
413
  attempt++;
429
414
  await new Promise(r => setTimeout(r, Math.random() * 500 + 100));
430
415
  continue;
@@ -436,44 +421,33 @@ class DriveHandler {
436
421
  // Reused helpers
437
422
  async findOrCreateFolder() {
438
423
  const q = `name = '${this.folderName}' and mimeType = 'application/vnd.google-apps.folder' and trashed = false`;
439
- const res = await this.drive.files.list({ q, spaces: 'drive', fields: 'files(id)' });
440
- if (res.data.files && res.data.files.length > 0)
441
- return res.data.files[0].id;
442
- const createRes = await this.drive.files.create({
443
- requestBody: { name: this.folderName, mimeType: 'application/vnd.google-apps.folder', parents: this.parents.length ? this.parents : undefined },
444
- fields: 'id'
445
- });
446
- return createRes.data.id;
424
+ const files = await this.client.listFiles(q);
425
+ if (files.length > 0)
426
+ return files[0].id;
427
+ const createRes = await this.client.createFile(this.folderName, this.parents.length ? this.parents : undefined, 'application/vnd.google-apps.folder', '');
428
+ return createRes.id;
447
429
  }
448
430
  async findFile(name) {
449
431
  const q = `name = '${name}' and '${this.folderId}' in parents and trashed = false`;
450
- const res = await this.drive.files.list({ q, spaces: 'drive', fields: 'files(id, etag)' });
451
- if (res.data.files && res.data.files.length > 0)
452
- return { id: res.data.files[0].id, etag: res.data.files[0].etag };
432
+ const files = await this.client.listFiles(q);
433
+ if (files.length > 0)
434
+ return { id: files[0].id, etag: files[0].etag || '' };
453
435
  return null;
454
436
  }
455
437
  async downloadJson(fileId) {
456
- const res = await this.drive.files.get({ fileId, alt: 'media' });
457
- return res.data;
438
+ return await this.client.getFile(fileId);
458
439
  }
459
440
  async downloadFileAny(fileId) {
460
- const res = await this.drive.files.get({ fileId, alt: 'media' });
461
- if (typeof res.data === 'string') {
462
- // NDJSON or JSON string
463
- try {
464
- return JSON.parse(res.data);
465
- }
466
- catch {
467
- // NDJSON?
468
- const lines = res.data.trim().split('\n').filter((l) => l);
469
- return lines.map((line) => JSON.parse(line));
470
- }
471
- }
472
- return res.data;
441
+ return await this.client.getFile(fileId);
473
442
  }
474
443
  async downloadNdjson(fileId) {
475
- const res = await this.drive.files.get({ fileId, alt: 'media' });
476
- const content = typeof res.data === 'string' ? res.data : JSON.stringify(res.data);
444
+ const data = await this.client.getFile(fileId);
445
+ // data will likely be a string if NDJSON is returned and getFile sees weird content-type
446
+ // Or if getFile auto-parsed standard "application/json" but NDJSON is just text.
447
+ // Google Drive might return application/json for everything if we aren't careful?
448
+ // Actually .ndjson is separate.
449
+ // Safest: Handle string or object.
450
+ const content = typeof data === 'string' ? data : JSON.stringify(data);
477
451
  const lines = content.trim().split('\n').filter((l) => l);
478
452
  return lines.map((line) => JSON.parse(line));
479
453
  }
@@ -481,33 +455,20 @@ class DriveHandler {
481
455
  const lines = changes.map(c => JSON.stringify(c)).join('\n') + '\n';
482
456
  const startSeq = changes[0].seq;
483
457
  const name = `changes-${startSeq}-${Math.random().toString(36).substring(7)}.ndjson`;
484
- const res = await this.drive.files.create({
485
- requestBody: { name, parents: [this.folderId], mimeType: 'application/x-ndjson' },
486
- media: { mimeType: 'application/x-ndjson', body: lines },
487
- fields: 'id'
488
- });
458
+ const res = await this.client.createFile(name, [this.folderId], 'application/x-ndjson', lines);
489
459
  this.currentLogSizeEstimate += new Blob([lines]).size;
490
- return res.data.id;
460
+ return res.id;
491
461
  }
492
462
  async saveMeta(meta, expectedEtag = null) {
493
463
  const content = JSON.stringify(meta);
494
464
  const metaFile = await this.findFile('_meta.json');
495
465
  if (metaFile) {
496
- const res = await this.drive.files.update({
497
- fileId: metaFile.id,
498
- headers: expectedEtag ? { 'If-Match': expectedEtag } : undefined,
499
- media: { mimeType: 'application/json', body: content },
500
- fields: 'id, etag'
501
- });
502
- this.metaEtag = res.data.etag;
466
+ const res = await this.client.updateFile(metaFile.id, content, expectedEtag || undefined);
467
+ this.metaEtag = res.etag;
503
468
  }
504
469
  else {
505
- const res = await this.drive.files.create({
506
- requestBody: { name: '_meta.json', parents: [this.folderId], mimeType: 'application/json' },
507
- media: { mimeType: 'application/json', body: content },
508
- fields: 'id, etag'
509
- });
510
- this.metaEtag = res.data.etag;
470
+ const res = await this.client.createFile('_meta.json', [this.folderId], 'application/json', content);
471
+ this.metaEtag = res.etag;
511
472
  }
512
473
  }
513
474
  async countTotalChanges() {
@@ -521,12 +482,12 @@ class DriveHandler {
521
482
  async cleanupOldFiles(oldIndexId, oldLogIds) {
522
483
  if (oldIndexId)
523
484
  try {
524
- await this.drive.files.delete({ fileId: oldIndexId });
485
+ await this.client.deleteFile(oldIndexId);
525
486
  }
526
487
  catch { }
527
488
  for (const id of oldLogIds)
528
489
  try {
529
- await this.drive.files.delete({ fileId: id });
490
+ await this.client.deleteFile(id);
530
491
  }
531
492
  catch { }
532
493
  }
@@ -538,6 +499,7 @@ class DriveHandler {
538
499
  const metaFile = await this.findFile('_meta.json');
539
500
  if (!metaFile)
540
501
  return;
502
+ // If the remote ETag differs from ours, another client has written; reload state
541
503
  if (metaFile.etag !== this.metaEtag) {
542
504
  await this.load();
543
505
  this.notifyListeners();
@@ -567,7 +529,7 @@ class DriveHandler {
567
529
  stopPolling() { if (this.pollingInterval)
568
530
  clearInterval(this.pollingInterval); }
569
531
  async deleteFolder() { if (this.folderId)
570
- await this.drive.files.delete({ fileId: this.folderId }); }
532
+ await this.client.deleteFile(this.folderId); }
571
533
  getNextSeq() { return this.meta.seq + 1; }
572
534
  }
573
535
  exports.DriveHandler = DriveHandler;
package/lib/index.d.ts CHANGED
@@ -1 +1,10 @@
1
- export default function (PouchDB: any): void;
1
+ import { GoogleDriveAdapterOptions } from './types';
2
+ export * from './types';
3
+ /**
4
+ * Google Drive Adapter Plugin Factory
5
+ *
6
+ * Usage:
7
+ * const plugin = GoogleDriveAdapter({ accessToken: '...', ... });
8
+ * PouchDB.plugin(plugin);
9
+ */
10
+ export default function GoogleDriveAdapter(config: GoogleDriveAdapterOptions): (PouchDB: any) => void;
package/lib/index.js CHANGED
@@ -1,7 +1,55 @@
1
1
  "use strict";
2
+ var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
3
+ if (k2 === undefined) k2 = k;
4
+ var desc = Object.getOwnPropertyDescriptor(m, k);
5
+ if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
6
+ desc = { enumerable: true, get: function() { return m[k]; } };
7
+ }
8
+ Object.defineProperty(o, k2, desc);
9
+ }) : (function(o, m, k, k2) {
10
+ if (k2 === undefined) k2 = k;
11
+ o[k2] = m[k];
12
+ }));
13
+ var __exportStar = (this && this.__exportStar) || function(m, exports) {
14
+ for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
15
+ };
2
16
  Object.defineProperty(exports, "__esModule", { value: true });
3
- exports.default = default_1;
17
+ exports.default = GoogleDriveAdapter;
4
18
  const adapter_1 = require("./adapter");
5
- function default_1(PouchDB) {
6
- PouchDB.adapter('googledrive', (0, adapter_1.GoogleDriveAdapter)(PouchDB), true);
19
+ // Export types
20
+ __exportStar(require("./types"), exports);
21
+ /**
22
+ * Google Drive Adapter Plugin Factory
23
+ *
24
+ * Usage:
25
+ * const plugin = GoogleDriveAdapter({ drive: myDriveClient, ... });
26
+ * PouchDB.plugin(plugin);
27
+ */
28
+ function GoogleDriveAdapter(config) {
29
+ return function (PouchDB) {
30
+ // Get the base adapter constructor (scoped to this PouchDB instance)
31
+ const BaseAdapter = (0, adapter_1.GoogleDriveAdapter)(PouchDB);
32
+ // Create a wrapper constructor that injects the config
33
+ function ConfiguredAdapter(opts, callback) {
34
+ // Merge factory config with constructor options
35
+ // Constructor options take precedence (overrides)
36
+ const mergedOpts = Object.assign({}, config, opts);
37
+ // Call the base adapter
38
+ BaseAdapter.call(this, mergedOpts, callback);
39
+ }
40
+ // Copy static properties required by PouchDB
41
+ // @ts-ignore
42
+ ConfiguredAdapter.valid = BaseAdapter.valid;
43
+ // @ts-ignore
44
+ ConfiguredAdapter.use_prefix = BaseAdapter.use_prefix;
45
+ // Register the adapter manually
46
+ // We use PouchDB.adapters object directly to avoid using the .adapter() method
47
+ if (PouchDB.adapters) {
48
+ PouchDB.adapters['googledrive'] = ConfiguredAdapter;
49
+ }
50
+ else {
51
+ // Fallback/Warning if adapters object is somehow missing (should not happen in core)
52
+ console.warn('PouchDB.adapters not found, unable to register googledrive adapter');
53
+ }
54
+ };
7
55
  }
package/lib/types.d.ts CHANGED
@@ -1,9 +1,6 @@
1
- /** Google Drive API client type */
2
- export type DriveClient = any;
1
+ import { DriveClientOptions } from './client';
3
2
  /** Options for configuring the Google Drive adapter */
4
- export interface GoogleDriveAdapterOptions {
5
- /** Configured Google Drive client (googleapis) */
6
- drive: DriveClient;
3
+ export interface GoogleDriveAdapterOptions extends DriveClientOptions {
7
4
  /** Specific folder ID to use as the DB root */
8
5
  folderId?: string;
9
6
  /** Folder name to search/create if no ID provided */
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@docstack/pouchdb-adapter-googledrive",
3
- "version": "0.0.2",
3
+ "version": "0.0.4",
4
4
  "description": "PouchDB adapter for Google Drive",
5
5
  "main": "lib/index.js",
6
6
  "types": "lib/index.d.ts",
@@ -23,8 +23,10 @@
23
23
  "url": "https://github.com/onyx-ac/docstack-pouchdb-adapter-gdrive/issues"
24
24
  },
25
25
  "homepage": "https://onyx.ac/docstack",
26
+ "engines": {
27
+ "node": ">=18"
28
+ },
26
29
  "dependencies": {
27
- "googleapis": "^126.0.0",
28
30
  "pouchdb-core": "^7.3.1"
29
31
  },
30
32
  "devDependencies": {
package/DOCUMENTATION.md DELETED
@@ -1,54 +0,0 @@
1
- # Architecture & Design Documentation
2
-
3
- ## 1. Core Principles
4
- The `pouchdb-adapter-googledrive` implementation is built on three core pillars to ensure data integrity and performance on a file-based remote storage system.
5
-
6
- ### A. Append-Only Log (Storage)
7
- Instead of modifying a single database file (which is prone to conflicts), we use an **Append-Only** strategy.
8
- - **Changes**: Every write operation (or batch of writes) creates a **new, immutable file** (e.g., `changes-{seq}-{uuid}.ndjson`).
9
- - **Snapshots**: Periodically, the log is compacted into a `snapshot` file.
10
- - **Benefit**: Historical data is preserved until compaction, and file-write conflicts are minimized.
11
-
12
- ### B. Optimistic Concurrency Control (OCC)
13
- To prevent race conditions (two clients writing simultaneously), we use **ETag-based locking** on a single entry point: `_meta.json`.
14
- - **The Lock**: `_meta.json` holds the current Sequence Number and the list of active log files.
15
- - **The Protocol**:
16
- 1. Reader fetches `_meta.json` and its `ETag`.
17
- 2. Writer prepares a new change file and uploads it (orphaned initially).
18
- 3. Writer attempts to update `_meta.json` with the new file reference, sending `If-Match: <Old-ETag>`.
19
- 4. **Success**: The change is now officially part of the DB.
20
- 5. **Failure (412/409)**: Another client updated the DB. The writer deletes its orphaned file, pulls the new state, and retries the logical operation.
21
-
22
- ### C. Remote-First "Lazy" Loading (Memory Optimization)
23
- To support large databases without exhausting client memory, we separate **Metadata** from **Content**.
24
-
25
- #### Storage Structure
26
- - `_meta.json`: Root pointer. Small.
27
- - `snapshot-index.json`: A map of `{ docId: { rev, filePointer } }`. Medium size (~100 bytes/doc). Loaded at startup.
28
- - `snapshot-data.json`: The actual document bodies. Large. **Never fully loaded.**
29
- - `changes-*.ndjson`: Recent updates.
30
-
31
- #### Client Startup Sequence
32
- 1. **Fetch Meta**: Download `_meta.json` and get the `snapshotIndexId`.
33
- 2. **Fetch Index**: Download `snapshot-index.json`. This builds the "Revision Tree" in memory.
34
- 3. **Replay Logs**: Download and parse only the small `changes-*.ndjson` files created since the snapshot to update the in-memory Index.
35
- 4. **Ready**: The client is now ready to query keys. No document content has been downloaded yet.
36
-
37
- #### On-Demand Usage
38
- - **`db.get(id)`**:
39
- 1. Look up `id` in the **Memory Index** to find the `filePointer`.
40
- 2. Check **LRU Cache**.
41
- 3. If missing, fetch the specific file/range from Google Drive.
42
- - **`db.allDocs({ keys: [...] })`**: Efficiently looks up pointers and fetches only requested docs.
43
-
44
- ## 2. Technical Patterns
45
-
46
- ### Atomic Compaction
47
- Compaction is a critical maintenance task that merges the `snapshot-data` with recent `changes` to create a new baseline.
48
- - **Safe**: Limits memory usage by streaming/batching.
49
- - **Atomic**: Uploads the new snapshot as a new file. Swaps the pointer in `_meta.json` using OCC.
50
- - **Zero-Downtime**: Clients can continue reading/writing to the old logs while compaction runs. Writes that happen *during* compaction are detected via the ETag check, causing the compaction to abort/retry safely.
51
-
52
- ### Conflict Handling
53
- - **PouchDB Level**: Standard CouchDB revision conflicts (409) are preserved. A "winner" is chosen deterministically, but conflicting revisions are kept in the tree (requires `snapshot-index` to store the full revision tree, not just the winner).
54
- - **Adapter Level**: Drive API 409s handling (retry logic) ensures the transport layer is reliable.
package/error.log DELETED
@@ -1,21 +0,0 @@
1
- FAIL tests/adapter.test.ts
2
- ● Test suite failed to run
3
-
4
- tests/adapter.test.ts:51:13 - error TS2353: Object literal may only specify known properties, and 'drive' does not exist in type 'DatabaseConfiguration'.
5
-
6
- 51 drive: drive,
7
- ~~~~~
8
- tests/adapter.test.ts:57:21 - error TS2339: Property 'backend_adapter' does not exist on type 'DatabaseInfo'.
9
-
10
- 57 expect(info.backend_adapter).toBe('googledrive');
11
- ~~~~~~~~~~~~~~~
12
- tests/adapter.test.ts:65:24 - error TS2339: Property 'title' does not exist on type 'IdMeta & GetMeta'.
13
-
14
- 65 expect(fetched.title).toBe('Start Wars');
15
- ~~~~~
16
-
17
- Test Suites: 1 failed, 1 total
18
- Tests: 0 total
19
- Snapshots: 0 total
20
- Time: 14.401 s
21
- Ran all test suites matching tests/adapter.test.ts.