@docstack/pouchdb-adapter-googledrive 0.0.3 → 0.0.5
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/DOCUMENTATION.md +54 -0
- package/README.md +26 -16
- package/lib/client.d.ts +29 -0
- package/lib/client.js +129 -0
- package/lib/drive.d.ts +3 -1
- package/lib/drive.js +50 -78
- package/lib/types.d.ts +2 -5
- package/package.json +4 -2

package/DOCUMENTATION.md (ADDED)
@@ -0,0 +1,54 @@
+# Architecture & Design Documentation
+
+## 1. Core Principles
+The `pouchdb-adapter-googledrive` implementation is built on three core pillars to ensure data integrity and performance on a file-based remote storage system.
+
+### A. Append-Only Log (Storage)
+Instead of modifying a single database file (which is prone to conflicts), we use an **Append-Only** strategy.
+- **Changes**: Every write operation (or batch of writes) creates a **new, immutable file** (e.g., `changes-{seq}-{uuid}.ndjson`).
+- **Snapshots**: Periodically, the log is compacted into a `snapshot` file.
+- **Benefit**: Historical data is preserved until compaction, and file-write conflicts are minimized.
+
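
A rough sketch of what "append a batch of writes as a new immutable file" means in practice, mirroring `tryAppendChanges` in `lib/drive.js`; the `ChangeEntry` shape and the inline `client` type are simplified assumptions:

```typescript
interface ChangeEntry { seq: number; [key: string]: unknown } // assumed minimal shape

// Serializes a batch as NDJSON and uploads it as a brand-new log file.
async function appendChangeLog(
  client: { createFile(name: string, parents: string[] | undefined, mimeType: string, content: string): Promise<{ id: string }> },
  folderId: string,
  changes: ChangeEntry[]
): Promise<string> {
  const lines = changes.map(c => JSON.stringify(c)).join('\n') + '\n'; // one JSON document per line
  const name = `changes-${changes[0].seq}-${Math.random().toString(36).substring(7)}.ndjson`;
  const res = await client.createFile(name, [folderId], 'application/x-ndjson', lines);
  return res.id; // the new file id is then referenced from _meta.json
}
```
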
+### B. Optimistic Concurrency Control (OCC)
+To prevent race conditions (two clients writing simultaneously), we use **ETag-based locking** on a single entry point: `_meta.json`.
+- **The Lock**: `_meta.json` holds the current Sequence Number and the list of active log files.
+- **The Protocol**:
+  1. Reader fetches `_meta.json` and its `ETag`.
+  2. Writer prepares a new change file and uploads it (orphaned initially).
+  3. Writer attempts to update `_meta.json` with the new file reference, sending `If-Match: <Old-ETag>`.
+  4. **Success**: The change is now officially part of the DB.
+  5. **Failure (412/409)**: Another client updated the DB. The writer deletes its orphaned file, pulls the new state, and retries the logical operation.
+
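
A minimal sketch of this protocol as a retry loop (the published code implements it in `DriveHandler.atomicUpdateMeta`/`saveMeta`; the handler shape and helper names below are simplified assumptions):

```typescript
interface OccHandler {
  metaEtag: string;                                             // ETag from the last _meta.json read
  load(): Promise<void>;                                        // re-fetch _meta.json and its ETag
  uploadChangeFile(): Promise<string>;                          // step 2: upload the (orphaned) log file
  saveMeta(update: (meta: any) => any, ifMatch: string): Promise<void>; // step 3: PATCH with If-Match
  deleteFile(fileId: string): Promise<void>;
}

async function commitChange(h: OccHandler): Promise<void> {
  for (let attempt = 0; attempt < 5; attempt++) {
    const fileId = await h.uploadChangeFile();
    try {
      await h.saveMeta(meta => ({ ...meta, changeLogIds: [...meta.changeLogIds, fileId] }), h.metaEtag);
      return;                                                   // step 4: success
    } catch (err: any) {
      if (err.status === 412 || err.status === 409) {           // step 5: another client won
        await h.deleteFile(fileId);                             // drop the orphaned file
        await new Promise(r => setTimeout(r, Math.random() * 500 + 100)); // jittered backoff
        await h.load();                                         // pull the new state
        continue;                                               // retry the logical operation
      }
      throw err;
    }
  }
  throw new Error('Gave up committing change after repeated ETag conflicts');
}
```
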
+### C. Remote-First "Lazy" Loading (Memory Optimization)
+To support large databases without exhausting client memory, we separate **Metadata** from **Content**.
+
+#### Storage Structure
+- `_meta.json`: Root pointer. Small.
+- `snapshot-index.json`: A map of `{ docId: { rev, filePointer } }`. Medium size (~100 bytes/doc). Loaded at startup.
+- `snapshot-data.json`: The actual document bodies. Large. **Never fully loaded.**
+- `changes-*.ndjson`: Recent updates.
+
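
For orientation, the two small files at the top of this structure might look roughly as follows. Only `dbName`, `seq`, `snapshotIndexId`, `changeLogIds`, `createdAt`, `rev`, and `filePointer` appear in the adapter's own code and docs; the surrounding structure and placeholder ids are illustrative:

```typescript
// _meta.json (illustrative shape)
const meta = {
  dbName: 'user_db',
  seq: 1042,                            // current sequence number
  snapshotIndexId: '1AbC...',           // Drive file id of snapshot-index.json
  changeLogIds: ['1DeF...', '1GhI...']  // active changes-*.ndjson files
};

// snapshot-index.json (illustrative shape)
const snapshotIndex = {
  createdAt: 1719251200000,
  entries: {
    'todo:123': { rev: '3-a1b2c3', filePointer: { fileId: '1JkL...' } }
  }
};
```
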
+#### Client Startup Sequence
+1. **Fetch Meta**: Download `_meta.json` and get the `snapshotIndexId`.
+2. **Fetch Index**: Download `snapshot-index.json`. This builds the "Revision Tree" in memory.
+3. **Replay Logs**: Download and parse only the small `changes-*.ndjson` files created since the snapshot to update the in-memory Index.
+4. **Ready**: The client is now ready to query keys. No document content has been downloaded yet.
+
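
The same sequence as a compact sketch. `downloadJson`/`downloadNdjson` exist on `DriveHandler` in `lib/drive.js`; the `entries` map and the per-entry fields are assumed shapes:

```typescript
async function openDatabase(
  drive: { downloadJson(fileId: string): Promise<any>; downloadNdjson(fileId: string): Promise<any[]> },
  metaFileId: string
) {
  const meta = await drive.downloadJson(metaFileId);             // 1. Fetch Meta
  const index = await drive.downloadJson(meta.snapshotIndexId);  // 2. Fetch Index (revision tree)
  for (const logId of meta.changeLogIds) {                       // 3. Replay small change logs
    for (const entry of await drive.downloadNdjson(logId)) {
      index.entries[entry.id] = { rev: entry.rev, filePointer: { fileId: logId } };
    }
  }
  return { meta, index };                                        // 4. Ready: keys only, no doc bodies
}
```
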
+#### On-Demand Usage
+- **`db.get(id)`**:
+  1. Look up `id` in the **Memory Index** to find the `filePointer`.
+  2. Check **LRU Cache**.
+  3. If missing, fetch the specific file/range from Google Drive.
+- **`db.allDocs({ keys: [...] })`**: Efficiently looks up pointers and fetches only requested docs.
+
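
The read path in sketch form (the adapter uses its own `LRUCache` from `lib/cache`; here a plain `Map` stands in, and the index/file shapes are the illustrative ones shown above):

```typescript
async function lazyGet(id: string, ctx: {
  index: Record<string, { rev: string; filePointer: { fileId: string } }>;
  cache: Map<string, any>;                          // stand-in for the adapter's LRU cache
  downloadJson(fileId: string): Promise<any>;
}) {
  const entry = ctx.index[id];                      // 1. memory index -> file pointer
  if (!entry) throw new Error('not_found');
  if (ctx.cache.has(id)) return ctx.cache.get(id);  // 2. LRU cache hit
  const file = await ctx.downloadJson(entry.filePointer.fileId); // 3. fetch only the file that holds the body
  const doc = file.docs ? file.docs[id] : file;     // snapshot-data keeps bodies under a `docs` map (assumed)
  ctx.cache.set(id, doc);
  return doc;
}
```
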
+## 2. Technical Patterns
+
+### Atomic Compaction
+Compaction is a critical maintenance task that merges the `snapshot-data` with recent `changes` to create a new baseline.
+- **Safe**: Limits memory usage by streaming/batching.
+- **Atomic**: Uploads the new snapshot as a new file. Swaps the pointer in `_meta.json` using OCC.
+- **Zero-Downtime**: Clients can continue reading/writing to the old logs while compaction runs. Writes that happen *during* compaction are detected via the ETag check, causing the compaction to abort/retry safely.
+
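
A condensed view of that flow, mirroring the compaction path in `lib/drive.js` (the helper names and the `buildSnapshot` step are simplified assumptions):

```typescript
async function compact(h: {
  meta: { snapshotIndexId: string; changeLogIds: string[] };
  buildSnapshot(): Promise<{ docs: Record<string, any> }>;       // merge old snapshot + change logs
  createFile(name: string, mimeType: string, content: string): Promise<{ id: string }>;
  atomicUpdateMeta(update: (latest: any) => any): Promise<void>; // OCC swap guarded by If-Match
  cleanupOldFiles(oldIndexId: string, oldLogIds: string[]): Promise<void>;
}) {
  const oldIndexId = h.meta.snapshotIndexId;
  const oldLogIds = [...h.meta.changeLogIds];
  const snapshotData = await h.buildSnapshot();
  const dataFile = await h.createFile(`snapshot-data-${Date.now()}.json`, 'application/json', JSON.stringify(snapshotData));
  const indexFile = await h.createFile(`snapshot-index-${Date.now()}.json`, 'application/json',
    JSON.stringify({ dataFileId: dataFile.id, createdAt: Date.now() }));
  // Swap the root pointer; a concurrent write changes the ETag and makes this step fail, aborting the compaction.
  await h.atomicUpdateMeta(latest => ({
    ...latest,
    snapshotIndexId: indexFile.id,
    changeLogIds: latest.changeLogIds.filter((id: string) => !oldLogIds.includes(id))
  }));
  await h.cleanupOldFiles(oldIndexId, oldLogIds); // old files are only deleted after the swap succeeds
}
```
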
+### Conflict Handling
+- **PouchDB Level**: Standard CouchDB revision conflicts (409) are preserved. A "winner" is chosen deterministically, but conflicting revisions are kept in the tree (this requires `snapshot-index` to store the full revision tree, not just the winner).
+- **Adapter Level**: Retry logic around Drive API 409s keeps the transport layer reliable.
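
Because losing revisions are kept in the tree, standard PouchDB conflict inspection works as usual. A tiny illustration, assuming the `db` instance from the README usage example below:

```typescript
// `conflicts: true` is a standard PouchDB/CouchDB get() option.
async function inspectConflicts(db: any, id: string): Promise<string[]> {
  const doc = await db.get(id, { conflicts: true });
  return doc._conflicts ?? []; // losing revision ids, if any
}
```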

package/README.md (CHANGED)
@@ -8,6 +8,11 @@ A persistent, serverless PouchDB adapter that uses Google Drive as a backend sto
 - **⚡ Lazy Loading**: Optimizes memory and bandwidth by loading only the **Index** into memory. Document bodies are fetched on-demand.
 - **🛡️ Optimistic Concurrency Control**: Uses ETag-based locking on metadata to prevent race conditions.
 - **📦 Auto-Compaction**: Automatically merges logs for performance.
+- **🌍 Universal**: Works natively in Node.js 18+, Browsers, and Edge environments (no `googleapis` dependency).
+
+## Requirements
+
+- **Node.js 18+** (for global `fetch` support) or a modern browser.

 ## Installation

@@ -17,32 +22,23 @@ npm install @docstack/pouchdb-adapter-googledrive

 ## Usage

-The adapter is initialized as a plugin with your Google Drive
+The adapter is initialized as a plugin with your Google Drive access token.

 ```typescript
 import PouchDB from 'pouchdb-core';
 import GoogleDriveAdapter from '@docstack/pouchdb-adapter-googledrive';
-import { google } from 'googleapis';
-
-// 1. Setup Google Drive Client
-const oauth2Client = new google.auth.OAuth2(CLIENT_ID, SECRET, REDIRECT);
-oauth2Client.setCredentials({ access_token: '...' });
-const drive = google.drive({ version: 'v3', auth: oauth2Client });

-//
+// 1. Initialize the Adapter Plugin Factory
 const adapterPlugin = GoogleDriveAdapter({
-
-  folderName: 'my-app-db-folder', // Root folder
-  pollingIntervalMs: 2000
+  accessToken: 'YOUR_GOOGLE_ACCESS_TOKEN',
+  folderName: 'my-app-db-folder', // Root folder in Drive
+  pollingIntervalMs: 2000 // Optional: check for remote changes
 });

-//
+// 2. Register Plugin
 PouchDB.plugin(adapterPlugin);
-// Also needs replication plugin if using replicate()
-// PouchDB.plugin(require('pouchdb-replication'));

-//
-// No need to pass 'drive' here anymore!
+// 3. Create Database
 const db = new PouchDB('user_db', {
   adapter: 'googledrive'
 });
@@ -50,6 +46,20 @@ const db = new PouchDB('user_db', {
 await db.post({ title: 'Hello World' });
 ```

+### Dynamic Tokens
+
+If your token expires, you can provide an async function that returns a valid token:
+
+```typescript
+const adapterPlugin = GoogleDriveAdapter({
+  accessToken: async () => {
+    const session = await getMySession();
+    return session.accessToken;
+  },
+  folderName: 'my-app-db'
+});
+```
+
 ## Architecture

 The adapter implements a **"Remote-First"** architecture:

package/lib/client.d.ts (ADDED)
@@ -0,0 +1,29 @@
+export interface DriveFile {
+    id: string;
+    name: string;
+    mimeType: string;
+    parents?: string[];
+    etag?: string;
+}
+export interface DriveClientOptions {
+    accessToken: string | (() => Promise<string>);
+}
+export declare class GoogleDriveClient {
+    private options;
+    constructor(options: DriveClientOptions);
+    private getToken;
+    private fetch;
+    listFiles(q: string): Promise<DriveFile[]>;
+    getFile(fileId: string): Promise<any>;
+    getFileMetadata(fileId: string): Promise<DriveFile>;
+    createFile(name: string, parents: string[] | undefined, mimeType: string, content: string): Promise<{
+        id: string;
+        etag: string;
+    }>;
+    updateFile(fileId: string, content: string, expectedEtag?: string): Promise<{
+        id: string;
+        etag: string;
+    }>;
+    deleteFile(fileId: string): Promise<void>;
+    private buildMultipart;
+}
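
For context, a quick usage sketch against this new client interface (the deep import path and the literal token/query/content values are assumptions for illustration):

```typescript
import { GoogleDriveClient } from '@docstack/pouchdb-adapter-googledrive/lib/client'; // assumed subpath

const client = new GoogleDriveClient({
  accessToken: async () => 'YOUR_GOOGLE_ACCESS_TOKEN', // or a plain string
});

async function demo() {
  const folders = await client.listFiles(
    "name = 'my-app-db-folder' and mimeType = 'application/vnd.google-apps.folder' and trashed = false"
  );
  if (folders.length > 0) {
    const file = await client.createFile('_meta.json', [folders[0].id], 'application/json', '{"seq":0}');
    console.log(file.id, file.etag);
  }
}
```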

package/lib/client.js (ADDED)
@@ -0,0 +1,129 @@
+"use strict";
+Object.defineProperty(exports, "__esModule", { value: true });
+exports.GoogleDriveClient = void 0;
+const BASE_URL = 'https://www.googleapis.com/drive/v3/files';
+const UPLOAD_URL = 'https://www.googleapis.com/upload/drive/v3/files';
+class GoogleDriveClient {
+    constructor(options) {
+        this.options = options;
+    }
+    async getToken() {
+        if (typeof this.options.accessToken === 'function') {
+            return await this.options.accessToken();
+        }
+        return this.options.accessToken;
+    }
+    async fetch(url, init) {
+        const token = await this.getToken();
+        const headers = new Headers(init.headers);
+        headers.set('Authorization', `Bearer ${token}`);
+        const res = await fetch(url, { ...init, headers });
+        const method = init.method || 'GET';
+        if (!res.ok) {
+            // Basic error handling
+            const text = await res.text();
+            let errorMsg = `Drive API Error: ${res.status} ${res.statusText} (${method} ${url})`;
+            try {
+                const json = JSON.parse(text);
+                if (json.error && json.error.message) {
+                    errorMsg += ` - ${json.error.message}`;
+                }
+            }
+            catch { }
+            const err = new Error(errorMsg);
+            err.status = res.status;
+            throw err;
+        }
+        return res;
+    }
+    async listFiles(q) {
+        const params = new URLSearchParams({
+            q,
+            fields: 'files(id, name, mimeType, parents, etag)',
+            spaces: 'drive',
+            pageSize: '1000' // Ensure we get enough
+        });
+        const res = await this.fetch(`${BASE_URL}?${params.toString()}`, { method: 'GET' });
+        const data = await res.json();
+        return data.files || [];
+    }
+    async getFile(fileId) {
+        // Try getting media
+        try {
+            const params = new URLSearchParams({ alt: 'media' });
+            const res = await this.fetch(`${BASE_URL}/${fileId}?${params.toString()}`, { method: 'GET' });
+            // Standard fetch handles JSON/Text transparency?
+            // We expect JSON mostly, but sometimes we might want text.
+            // PouchDB adapter flow: downloadJson, downloadNdjson
+            // Let's rely on content-type or caller expectation?
+            // The usage in `drive.ts` expects parsed JSON/NDJSON lines.
+            // Let's return the raw Text or JSON based on Content-Type?
+            const contentType = res.headers.get('content-type');
+            if (contentType && contentType.includes('application/json')) {
+                return await res.json();
+            }
+            return await res.text();
+        }
+        catch (e) {
+            throw e;
+        }
+    }
+    // Single metadata get (for etag check)
+    async getFileMetadata(fileId) {
+        const params = new URLSearchParams({ fields: 'id, name, mimeType, parents, etag' });
+        const res = await this.fetch(`${BASE_URL}/${fileId}?${params.toString()}`, { method: 'GET' });
+        return await res.json();
+    }
+    async createFile(name, parents, mimeType, content) {
+        const metadata = {
+            name,
+            mimeType,
+            parents
+        };
+        // Folders or empty content can use simple metadata-only POST
+        if (!content && mimeType === 'application/vnd.google-apps.folder') {
+            const res = await this.fetch(`${BASE_URL}?fields=id,etag`, {
+                method: 'POST',
+                headers: { 'Content-Type': 'application/json' },
+                body: JSON.stringify(metadata)
+            });
+            return await res.json();
+        }
+        const multipartBody = this.buildMultipart(metadata, content, mimeType);
+        const res = await this.fetch(`${UPLOAD_URL}?uploadType=multipart&fields=id,etag`, {
+            method: 'POST',
+            headers: {
+                'Content-Type': `multipart/related; boundary=${multipartBody.boundary}`
+            },
+            body: multipartBody.body
+        });
+        return await res.json();
+    }
+    async updateFile(fileId, content, expectedEtag) {
+        // Update content (media) usually, but sometimes meta?
+        // In our usage (saveMeta), we update body.
+        const res = await this.fetch(`${UPLOAD_URL}/${fileId}?uploadType=media&fields=id,etag`, {
+            method: 'PATCH',
+            headers: expectedEtag ? { 'If-Match': expectedEtag, 'Content-Type': 'application/json' } : { 'Content-Type': 'application/json' },
+            body: content
+        });
+        return await res.json();
+    }
+    async deleteFile(fileId) {
+        await this.fetch(`${BASE_URL}/${fileId}`, { method: 'DELETE' });
+    }
+    buildMultipart(metadata, content, contentType) {
+        const boundary = '-------' + Math.random().toString(36).substring(2);
+        const delimiter = `\r\n--${boundary}\r\n`;
+        const closeDelimiter = `\r\n--${boundary}--`;
+        const body = delimiter +
+            'Content-Type: application/json\r\n\r\n' +
+            JSON.stringify(metadata) +
+            delimiter +
+            `Content-Type: ${contentType}\r\n\r\n` +
+            content +
+            closeDelimiter;
+        return { body, boundary };
+    }
+}
+exports.GoogleDriveClient = GoogleDriveClient;

package/lib/drive.d.ts (CHANGED)
@@ -10,7 +10,8 @@ import { GoogleDriveAdapterOptions, ChangeEntry, IndexEntry } from './types';
  * └── changes-*.ndjson  # Append logs
  */
 export declare class DriveHandler {
-    private
+    private client;
+    private options;
     private folderId;
     private folderName;
     private parents;
@@ -64,6 +65,7 @@ export declare class DriveHandler {
     private notifyListeners;
     onChange(cb: any): void;
     stopPolling(): void;
+    private escapeQuery;
     deleteFolder(): Promise<void>;
     getNextSeq(): number;
 }

package/lib/drive.js (CHANGED)
@@ -2,6 +2,7 @@
 Object.defineProperty(exports, "__esModule", { value: true });
 exports.DriveHandler = void 0;
 const cache_1 = require("./cache");
+const client_1 = require("./client");
 const DEFAULT_COMPACTION_THRESHOLD = 100; // entries
 const DEFAULT_SIZE_THRESHOLD = 1024 * 1024; // 1MB
 const DEFAULT_CACHE_SIZE = 1000; // Number of docs
@@ -32,7 +33,8 @@ class DriveHandler {
         this.currentLogSizeEstimate = 0;
         this.listeners = [];
         this.pollingInterval = null;
-        this.
+        this.client = new client_1.GoogleDriveClient(options);
+        this.options = options;
         this.folderId = options.folderId || null;
         this.folderName = options.folderName || dbName;
         this.parents = options.parents || [];
@@ -40,9 +42,7 @@ class DriveHandler {
         this.compactionSizeThreshold = options.compactionSizeThreshold || DEFAULT_SIZE_THRESHOLD;
         this.meta.dbName = dbName;
         this.docCache = new cache_1.LRUCache(options.cacheSize || DEFAULT_CACHE_SIZE);
-
-        this.startPolling(options.pollingIntervalMs);
-    }
+        // Polling will be started in load() after folderId is resolved
     }
     // Public getter for Sequence (used by adapter)
     get seq() {
@@ -108,6 +108,10 @@ class DriveHandler {
                 }
             }
         }
+        // 3. Start Polling (if enabled)
+        if (this.options.pollingIntervalMs) {
+            this.startPolling(this.options.pollingIntervalMs);
+        }
     }
     // Migration helper
     filesFromLegacySnapshot(snapshot) {
@@ -267,7 +271,7 @@
             return await this.tryAppendChanges(changes);
         }
         catch (err) {
-            if (err.
+            if (err.status === 412 || err.status === 409) {
                 // Reload and RETRY
                 await this.load();
                 // Check conflicts against Index (Metadata sufficient)
@@ -359,16 +363,8 @@
         });
         // 2. Upload Data File
         const dataContent = JSON.stringify(snapshotData);
-        const dataRes = await this.
-
-            name: `snapshot-data-${Date.now()}.json`,
-            parents: [this.folderId],
-            mimeType: 'application/json'
-        },
-        media: { mimeType: 'application/json', body: dataContent },
-        fields: 'id'
-        });
-        const dataFileId = dataRes.data.id;
+        const dataRes = await this.client.createFile(`snapshot-data-${Date.now()}.json`, [this.folderId], 'application/json', dataContent);
+        const dataFileId = dataRes.id;
         // 3. Create Index pointing to this Data File
         const newIndexEntries = {};
         for (const id of Object.keys(snapshotData.docs)) {
@@ -384,16 +380,8 @@
             createdAt: Date.now()
         };
         const indexContent = JSON.stringify(snapshotIndex);
-        const indexRes = await this.
-
-            name: `snapshot-index-${Date.now()}.json`,
-            parents: [this.folderId],
-            mimeType: 'application/json'
-        },
-        media: { mimeType: 'application/json', body: indexContent },
-        fields: 'id'
-        });
-        const newIndexId = indexRes.data.id;
+        const indexRes = await this.client.createFile(`snapshot-index-${Date.now()}.json`, [this.folderId], 'application/json', indexContent);
+        const newIndexId = indexRes.id;
         // 4. Update Meta
         await this.atomicUpdateMeta((latest) => {
             const remainingLogs = latest.changeLogIds.filter(id => !oldLogIds.includes(id));
@@ -424,7 +412,7 @@
                 return;
             }
             catch (err) {
-                if (err.
+                if (err.status === 412 || err.status === 409) {
                     attempt++;
                     await new Promise(r => setTimeout(r, Math.random() * 500 + 100));
                     continue;
@@ -435,45 +423,38 @@
     }
     // Reused helpers
     async findOrCreateFolder() {
-        const
-        const
-
-
-
-
-
-        });
-        return createRes.data.id;
+        const safeName = this.escapeQuery(this.folderName);
+        const q = `name = '${safeName}' and mimeType = 'application/vnd.google-apps.folder' and trashed = false`;
+        const files = await this.client.listFiles(q);
+        if (files.length > 0)
+            return files[0].id;
+        const createRes = await this.client.createFile(this.folderName, this.parents.length ? this.parents : undefined, 'application/vnd.google-apps.folder', '');
+        return createRes.id;
     }
     async findFile(name) {
-
-
-
-
+        if (!this.folderId)
+            return null;
+        const safeName = this.escapeQuery(name);
+        const q = `name = '${safeName}' and '${this.folderId}' in parents and trashed = false`;
+        const files = await this.client.listFiles(q);
+        if (files.length > 0)
+            return { id: files[0].id, etag: files[0].etag || '' };
         return null;
     }
    async downloadJson(fileId) {
-
-        return res.data;
+        return await this.client.getFile(fileId);
    }
    async downloadFileAny(fileId) {
-
-        if (typeof res.data === 'string') {
-            // NDJSON or JSON string
-            try {
-                return JSON.parse(res.data);
-            }
-            catch {
-                // NDJSON?
-                const lines = res.data.trim().split('\n').filter((l) => l);
-                return lines.map((line) => JSON.parse(line));
-            }
-        }
-        return res.data;
+        return await this.client.getFile(fileId);
     }
     async downloadNdjson(fileId) {
-        const
-
+        const data = await this.client.getFile(fileId);
+        // data will likely be a string if NDJSON is returned and getFile sees weird content-type
+        // Or if getFile auto-parsed standard "application/json" but NDJSON is just text.
+        // Google Drive might return application/json for everything if we aren't careful?
+        // Actually .ndjson is separate.
+        // Safest: Handle string or object.
+        const content = typeof data === 'string' ? data : JSON.stringify(data);
         const lines = content.trim().split('\n').filter((l) => l);
         return lines.map((line) => JSON.parse(line));
     }
@@ -481,33 +462,20 @@
         const lines = changes.map(c => JSON.stringify(c)).join('\n') + '\n';
         const startSeq = changes[0].seq;
         const name = `changes-${startSeq}-${Math.random().toString(36).substring(7)}.ndjson`;
-        const res = await this.
-            requestBody: { name, parents: [this.folderId], mimeType: 'application/x-ndjson' },
-            media: { mimeType: 'application/x-ndjson', body: lines },
-            fields: 'id'
-        });
+        const res = await this.client.createFile(name, [this.folderId], 'application/x-ndjson', lines);
         this.currentLogSizeEstimate += new Blob([lines]).size;
-        return res.
+        return res.id;
     }
     async saveMeta(meta, expectedEtag = null) {
         const content = JSON.stringify(meta);
         const metaFile = await this.findFile('_meta.json');
         if (metaFile) {
-            const res = await this.
-
-                headers: expectedEtag ? { 'If-Match': expectedEtag } : undefined,
-                media: { mimeType: 'application/json', body: content },
-                fields: 'id, etag'
-            });
-            this.metaEtag = res.data.etag;
+            const res = await this.client.updateFile(metaFile.id, content, expectedEtag || undefined);
+            this.metaEtag = res.etag;
         }
         else {
-            const res = await this.
-
-                media: { mimeType: 'application/json', body: content },
-                fields: 'id, etag'
-            });
-            this.metaEtag = res.data.etag;
+            const res = await this.client.createFile('_meta.json', [this.folderId], 'application/json', content);
+            this.metaEtag = res.etag;
         }
     }
     async countTotalChanges() {
@@ -521,12 +489,12 @@
     async cleanupOldFiles(oldIndexId, oldLogIds) {
         if (oldIndexId)
             try {
-                await this.
+                await this.client.deleteFile(oldIndexId);
            }
            catch { }
        for (const id of oldLogIds)
            try {
-                await this.
+                await this.client.deleteFile(id);
            }
            catch { }
     }
@@ -538,6 +506,7 @@
         const metaFile = await this.findFile('_meta.json');
         if (!metaFile)
             return;
+        // Etag check
         if (metaFile.etag !== this.metaEtag) {
             await this.load();
             this.notifyListeners();
@@ -566,8 +535,11 @@
     onChange(cb) { this.listeners.push(cb); }
     stopPolling() { if (this.pollingInterval)
         clearInterval(this.pollingInterval); }
+    escapeQuery(value) {
+        return value.replace(/'/g, "\\'");
+    }
     async deleteFolder() { if (this.folderId)
-        await this.
+        await this.client.deleteFile(this.folderId); }
     getNextSeq() { return this.meta.seq + 1; }
 }
 exports.DriveHandler = DriveHandler;

package/lib/types.d.ts (CHANGED)
@@ -1,9 +1,6 @@
-
-export type DriveClient = any;
+import { DriveClientOptions } from './client';
 /** Options for configuring the Google Drive adapter */
-export interface GoogleDriveAdapterOptions {
-    /** Configured Google Drive client (googleapis) */
-    drive: DriveClient;
+export interface GoogleDriveAdapterOptions extends DriveClientOptions {
     /** Specific folder ID to use as the DB root */
     folderId?: string;
     /** Folder name to search/create if no ID provided */

package/package.json (CHANGED)
@@ -1,6 +1,6 @@
 {
   "name": "@docstack/pouchdb-adapter-googledrive",
-  "version": "0.0.3",
+  "version": "0.0.5",
   "description": "PouchDB adapter for Google Drive",
   "main": "lib/index.js",
   "types": "lib/index.d.ts",
@@ -23,8 +23,10 @@
     "url": "https://github.com/onyx-ac/docstack-pouchdb-adapter-gdrive/issues"
   },
   "homepage": "https://onyx.ac/docstack",
+  "engines": {
+    "node": ">=18"
+  },
   "dependencies": {
-    "googleapis": "^126.0.0",
     "pouchdb-core": "^7.3.1"
   },
   "devDependencies": {