@asaidimu/utils-cache 3.0.6 → 3.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +370 -299
- package/index.d.mts +16 -16
- package/index.d.ts +16 -16
- package/package.json +2 -2
package/README.md
CHANGED
@@ -48,27 +48,27 @@ An intelligent, configurable in-memory cache library for Node.js and browser env

### Detailed Description

`@asaidimu/utils-cache` provides a robust, in-memory caching solution designed for applications that require efficient data retrieval, resilience against network failures, and state persistence across sessions or processes. It implements common caching patterns like _stale-while-revalidate_ and _Least Recently Used (LRU)_ eviction, along with advanced features such as automatic retries for failed fetches, an extensible persistence mechanism, and a comprehensive event system for real-time monitoring.

Unlike simpler caches, `Cache` manages data freshness intelligently, allowing you to serve stale data immediately while a fresh copy is being fetched in the background. Its pluggable persistence layer enables you to save and restore the cache state, making it ideal for client-side applications that need to maintain state offline or server-side applications that need rapid startup with pre-populated data. With built-in metrics and events, `@asaidimu/utils-cache` offers deep insights into cache performance and lifecycle, ensuring both speed and data integrity.

### Key Features

- **Configurable In-Memory Store**: Provides fast access to cached data with an underlying `Map` structure.
- **Stale-While-Revalidate (SWR)**: Serve existing data immediately while fetching new data in the background, minimizing perceived latency and improving user experience.
- **Automatic Retries with Exponential Backoff**: Configurable retry attempts and an exponentially increasing delay between retries for `fetchFunction` failures, enhancing resilience to transient network issues.
- **Pluggable Persistence**: Seamlessly integrates with any `SimplePersistence` implementation (e.g., LocalStorage, IndexedDB via `@asaidimu/utils-persistence`, or custom backend) to save and restore cache state across application restarts or sessions.
- **Debounced Persistence Writes**: Optimizes write frequency to the underlying persistence layer, reducing I/O operations and improving performance.
- **Remote Update Handling**: Automatically synchronizes cache state when the persistence layer is updated externally by other instances or processes.
- **Custom Serialization/Deserialization**: Provides options to serialize and deserialize complex data types (e.g., `Date`, `Map`, custom classes) for proper storage and retrieval.
- **Configurable Eviction Policies**:
  - **Time-Based (TTL)**: Automatically evicts entries that haven't been accessed for a specified `cacheTime`, managing memory efficiently.
  - **Size-Based (LRU)**: Evicts least recently used items when the `maxSize` limit is exceeded, preventing unbounded memory growth.
- **Comprehensive Event System**: Subscribe to granular, scoped cache events (e.g., `'cache:read:hit'`, `'cache:fetch:start'`, `'cache:data:set'`) for real-time logging, debugging, analytics, and advanced reactivity. Wildcard subscriptions (e.g., `'cache:read:*'`) are supported for capturing related events.
- **Performance Metrics**: Built-in tracking for `hits`, `misses`, `fetches`, `errors`, `evictions`, and `staleHits`, providing insights into cache efficiency with calculated hit rates.
- **Flexible Query Management**: Register asynchronous `fetchFunction`s for specific keys, allowing the `Cache` instance to intelligently manage their data lifecycle, including fetching, caching, and invalidation.
- **Imperative Control**: Offers direct methods for `invalidate` (making data stale), `prefetch` (loading data proactively), `refresh` (forcing a re-fetch), `setData` (manual data injection), and `remove` operations.
- **TypeScript Support**: Fully typed API for enhanced developer experience, compile-time safety, and autocompletion.

---
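The size-based eviction listed above follows the classic LRU pattern. As a rough illustration of the idea only (not the library's internal code), a `Map`'s insertion order can track recency:

```typescript
// Minimal LRU sketch (illustrative, not the library's implementation):
// a Map preserves insertion order, so re-inserting a key on access moves it
// to the "most recently used" end; eviction removes the first (oldest) key.
class TinyLRU<K, V> {
  private map = new Map<K, V>();
  constructor(private maxSize: number) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    this.map.delete(key);
    this.map.set(key, value); // mark as most recently used
    return value;
  }

  set(key: K, value: V): void {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // Evict the least recently used entry (first key in insertion order).
      this.map.delete(this.map.keys().next().value as K);
    }
  }

  has(key: K): boolean {
    return this.map.has(key);
  }
}

const lru = new TinyLRU<string, number>(2);
lru.set("a", 1);
lru.set("b", 2);
lru.get("a"); // "a" is now most recently used
lru.set("c", 3); // exceeds maxSize, evicts "b"
console.log(lru.has("b"), lru.has("a")); // false true
```

The real cache also layers time-based (`cacheTime`) eviction on top of this size-based policy.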
@@ -76,8 +76,8 @@ Unlike simpler caches, `Cache` manages data freshness intelligently, allowing yo

### Prerequisites

- Node.js (v14.x or higher)
- npm, yarn, or bun

### Installation Steps
@@ -96,33 +96,35 @@ yarn add @asaidimu/utils-cache @asaidimu/events

`Cache` is initialized with a `CacheOptions` object, allowing you to customize its behavior globally. Individual queries registered via `registerQuery` can override these options for specific data keys.

```typescript
import { Cache } from "@asaidimu/utils-cache";
// Example persistence layer (install separately, e.g., @asaidimu/utils-persistence)
import { IndexedDBPersistence } from "@asaidimu/utils-persistence"; // Example

const myCache = new Cache({
  staleTime: 5 * 60 * 1000, // Data considered stale after 5 minutes
  cacheTime: 30 * 60 * 1000, // Data evicted if not accessed for 30 minutes
  retryAttempts: 2, // Retry fetch up to 2 times on failure
  retryDelay: 2000, // 2-second initial delay between retries (doubles each attempt)
  maxSize: 500, // Maximum 500 entries in cache (LRU eviction)
  enableMetrics: true, // Enable performance tracking

  // Persistence options (optional but recommended for stateful caches)
  persistence: new IndexedDBPersistence("my-app-db"), // Plug in your persistence layer
  persistenceId: "my-app-cache-v1", // Unique ID for this cache instance in persistence
  persistenceDebounceTime: 1000, // Debounce persistence writes by 1 second

  // Custom serializers/deserializers for non-JSON-serializable data (optional)
  serializeValue: (value: any) => {
    if (value instanceof Map)
      return { _type: "Map", data: Array.from(value.entries()) };
    if (value instanceof Date)
      return { _type: "Date", data: value.toISOString() };
    return value;
  },
  deserializeValue: (value: any) => {
    if (typeof value === "object" && value !== null) {
      if (value._type === "Map") return new Map(value.data);
      if (value._type === "Date") return new Date(value.data);
    }
    return value;
  },
  // ...
});
```
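The retry timing configured above (an initial `retryDelay` that doubles on each subsequent attempt) can be sketched as a schedule. This is an illustration of the documented behavior, not code from the library:

```typescript
// Sketch of the exponential backoff schedule described above (assumption:
// the delay doubles on each retry; the library's internals may differ).
function backoffDelays(retryDelay: number, retryAttempts: number): number[] {
  const delays: number[] = [];
  for (let attempt = 0; attempt < retryAttempts; attempt++) {
    delays.push(retryDelay * 2 ** attempt);
  }
  return delays;
}

// With retryDelay: 2000 and retryAttempts: 2 as configured above:
console.log(backoffDelays(2000, 2)); // [2000, 4000]
```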
@@ -138,23 +140,26 @@ const invalidCache = new Cache({ staleTime: -100, cacheTime: -1, maxSize: -5 });

To verify that `Cache` is installed and initialized correctly, you can run a simple test:

```typescript
import { Cache } from "@asaidimu/utils-cache";

const cache = new Cache();
console.log("Cache initialized successfully!");

// Register a simple query
cache.registerQuery("hello", async () => {
  console.log('Fetching "hello" data...');
  return "world";
});

// Try to fetch data
cache
  .get("hello")
  .then((data) => {
    console.log(`Fetched 'hello': ${data}`); // Expected: Fetching "hello" data... \n Fetched 'hello': world
  })
  .catch((error) => {
    console.error("Error fetching:", error);
  });
```

---
@@ -166,46 +171,50 @@ cache.get('hello').then(data => {

The core of `Cache` involves registering queries (data fetching functions) and then retrieving data using those queries.

```typescript
import { Cache } from "@asaidimu/utils-cache";

const myCache = new Cache({
  staleTime: 5000, // Data becomes stale after 5 seconds
  cacheTime: 60000, // Data will be garbage collected if not accessed for 1 minute
});

// 1. Register a query with a unique string key and an async function to fetch the data.
myCache.registerQuery(
  "user/123",
  async () => {
    console.log("--- Fetching user data from API... ---");
    // Simulate network delay
    await new Promise((resolve) => setTimeout(resolve, 1000));
    return { id: 123, name: "Alice", email: "alice@example.com" };
  },
  { staleTime: 2000 },
); // Override staleTime for this specific query to 2 seconds

// 2. Retrieve data from the cache using `get()`.

async function getUserData(label: string) {
  console.log(`\n${label}: Requesting user/123`);
  const userData = await myCache.get("user/123"); // Default: stale-while-revalidate
  console.log(`${label}: User data received:`, userData);
}

// First call: Data is not in cache (miss). Triggers fetch.
getUserData("Initial Call");

// Subsequent calls (within staleTime): Data is returned instantly from cache. No fetch.
setTimeout(() => getUserData("Cached Call"), 500);

// Call after query's staleTime: Data is returned instantly, but a background fetch is triggered.
setTimeout(() => getUserData("Stale & Background Fetch"), 2500);

// Example of waiting for fresh data
async function getFreshUserData() {
  console.log("\n--- Requesting FRESH user data (waiting for fetch)... ---");
  try {
    const freshUserData = await myCache.get("user/123", { waitForFresh: true });
    console.log("Fresh user data received:", freshUserData);
  } catch (error) {
    console.error("Failed to get fresh user data:", error);
  }
}

setTimeout(() => getFreshUserData(), 3000);
```
@@ -220,7 +229,7 @@ setTimeout(() => getFreshUserData(), 3000);

Creates a new `Cache` instance with global default options.

```typescript
import { Cache } from "@asaidimu/utils-cache";

const cache = new Cache({
  staleTime: 5 * 60 * 1000, // 5 minutes
  // ...
});
```
@@ -233,41 +242,48 @@ const cache = new Cache({

Registers a data fetching function associated with a unique `key`. This `fetchFunction` will be called when data for the `key` is not in cache, is stale, or explicitly invalidated/refreshed.

- `key`: A unique string identifier for the data.
- `fetchFunction`: An `async` function that returns a `Promise` resolving to the data of type `T`.
- `options`: Optional `CacheOptions` to override the instance's default options for this specific query (e.g., a shorter `staleTime` for frequently changing data).

```typescript
cache.registerQuery(
  "products/featured",
  async () => {
    const response = await fetch("https://api.example.com/products/featured");
    if (!response.ok) throw new Error("Failed to fetch featured products");
    return response.json();
  },
  {
    staleTime: 60 * 1000, // This query's data is stale after 1 minute
    retryAttempts: 5, // It will retry fetching up to 5 times
  },
);
```

#### `cache.get<T>(key: string, options?: { waitForFresh?: boolean; throwOnError?: boolean }): Promise<T | undefined>`

Retrieves data for a given `key`.

- If data is fresh, returns it immediately.
- If data is stale (and `waitForFresh` is `false` or unset), returns it immediately and triggers a background refetch (stale-while-revalidate).
- If data is not in cache (miss), it triggers a fetch.
- `waitForFresh`: If `true`, the method will await the `fetchFunction` to complete and return fresh data. If `false` (default), it will return existing stale data immediately if available, otherwise `undefined` while a fetch is ongoing in the background.
- `throwOnError`: If `true`, and the `fetchFunction` fails after all retries, the promise returned by `get` will reject with the error. If `false` (default), it will return `undefined` on fetch failure, or the last successfully fetched data if available.

```typescript
// Basic usage (stale-while-revalidate)
const post = await cache.get("posts/latest");

// Wait for fresh data, throw if fetch fails
try {
  const userProfile = await cache.get("user/profile", {
    waitForFresh: true,
    throwOnError: true,
  });
  console.log("Latest user profile:", userProfile);
} catch (error) {
  console.error("Could not get fresh user profile due to an error:", error);
}
```
@@ -276,11 +292,11 @@ try {

Retrieves data from the cache without triggering any fetches, updating `lastAccessed` time, or `accessCount`. Useful for quick synchronous checks.

```typescript
const cachedValue = cache.peek("some-config-key");
if (cachedValue) {
  console.log("Value is in cache:", cachedValue);
} else {
  console.log("Value not found in cache.");
}
```
@@ -289,10 +305,10 @@ if (cachedValue) {

Checks if a non-stale, non-loading entry exists in the cache for the given `key`.

```typescript
if (cache.has("config/app")) {
  console.log("App config is ready and fresh.");
} else {
  console.log("App config is missing, stale, or currently loading.");
}
```
@@ -300,23 +316,23 @@ if (cache.has('config/app')) {

Marks a specific cache entry as stale, forcing the next `get` call for that key to trigger a refetch. Optionally triggers an immediate background refetch.

- `key`: The cache key to invalidate.
- `refetch`: If `true` (default), triggers an immediate background fetch for the invalidated key using its registered `fetchFunction`.

```typescript
// After updating a user, invalidate their profile data to ensure next fetch is fresh
await cache.invalidate("user/123/profile");

// Invalidate and don't refetch until `get` is explicitly called later
await cache.invalidate("admin/dashboard/stats", false);
```

#### `cache.invalidatePattern(pattern: RegExp, refetch = true): Promise<void>`

Invalidates all cache entries whose keys match the given regular expression. Similar to `invalidate`, it optionally triggers immediate background refetches for all matched keys.

- `pattern`: A `RegExp` object to match against cache keys.
- `refetch`: If `true` (default), triggers immediate background fetches for all matched keys.

```typescript
// Invalidate all product-related data (e.g., after a mass product update)
await cache.invalidatePattern(/^products\//); // hypothetical key prefix
```
@@ -332,8 +348,8 @@ Triggers a background fetch for a `key` if it's not already in cache or is stale

```typescript
// On application startup or route change, prefetch common data
cache.prefetch("static-content/footer");
cache.prefetch("user/notifications/unread");
```

#### `cache.refresh<T>(key: string): Promise<T | undefined>`
@@ -342,8 +358,8 @@ Forces a re-fetch of data for a given `key`, bypassing staleness checks and any

```typescript
// After an API call modifies a resource, force update its cached version
const updatedUser = await cache.refresh("user/current");
console.log("User data refreshed:", updatedUser);
```

#### `cache.setData<T>(key: string, data: T): void`
@@ -352,11 +368,11 @@ Manually sets or updates data in the cache for a given `key`. This immediately u

```typescript
// Manually update a shopping cart item count after a local UI interaction
cache.setData("cart/item-count", 5);

// Directly inject data fetched from another source or computed locally
const localConfig = { theme: "dark", fontSize: "medium" };
cache.setData("app/settings", localConfig);
```

#### `cache.remove(key: string): boolean`
@@ -365,51 +381,63 @@ Removes a specific entry from the cache. Returns `true` if an entry was found an

```typescript
// When a user logs out, remove their specific session data
cache.remove("user/session");
```

#### `cache.on<EType extends CacheEventType>(event: EType, listener: (ev: Extract<CacheEvent, { type: EType }>) => void): () => void`

Subscribes a listener function to specific cache events. Returns an `unsubscribe` function.

- `event`: The type of event to listen for (e.g., `'cache:read:hit'`, `'cache:fetch:error'`). Wildcards like `'cache:read:*'` are supported. See `CacheEventType` in `types.ts` for all available types.
- `listener`: A callback function that receives the specific event payload for the subscribed event type.

```typescript
import { Cache, CacheEvent, CacheEventType } from "@asaidimu/utils-cache";

const myCache = new Cache();

const unsubscribeHit = myCache.on("cache:read:hit", (e) => {
  console.log(`[CacheEvent] HIT for ${e.key} (isStale: ${e.isStale})`);
});

myCache.on("cache:read:miss", (e) => {
  console.log(`[CacheEvent] MISS for ${e.key}`);
});

myCache.on("cache:fetch:error", (e) => {
  console.error(
    `[CacheEvent] ERROR for ${e.key} (attempt ${e.attempt}):`,
    e.error.message,
  );
});

myCache.on("cache:persistence:save:success", (e) => {
  console.log(
    `[CacheEvent] Persistence: Cache state saved successfully for ID: ${e.key}`,
  );
});

myCache.on("cache:persistence:load:error", (e) => {
  console.error(
    `[CacheEvent] Persistence: Failed to load cache state for ID: ${e.key}`,
    e.error,
  );
});

// For demonstration, register a query and trigger events
myCache.registerQuery(
  "demo-item",
  async () => {
    console.log("--- Fetching demo-item ---");
    await new Promise((r) => setTimeout(r, 200));
    return "demo-data";
  },
  { staleTime: 100 },
);

myCache.get("demo-item"); // Triggers miss, fetch, set_data
setTimeout(() => myCache.get("demo-item"), 50); // Triggers hit
setTimeout(() => myCache.get("demo-item"), 150); // Triggers stale hit, background fetch

// To unsubscribe from a specific event later:
unsubscribeHit();
```
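As a rough sketch of how a wildcard pattern such as `'cache:read:*'` relates to concrete event types (the actual matching is handled by the event system, so `matchesEvent` here is a hypothetical helper for illustration only):

```typescript
// Illustrative wildcard matching for scoped event names like "cache:read:*".
// A trailing ":*" matches any event sharing the same "scope:" prefix.
function matchesEvent(pattern: string, eventType: string): boolean {
  if (!pattern.endsWith(":*")) return pattern === eventType;
  // Keep the trailing ":" so "cache:read:*" does not match "cache:reads:x".
  return eventType.startsWith(pattern.slice(0, -1));
}

console.log(matchesEvent("cache:read:*", "cache:read:hit")); // true
console.log(matchesEvent("cache:read:*", "cache:fetch:start")); // false
console.log(matchesEvent("cache:read:hit", "cache:read:hit")); // true
```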
@@ -419,18 +447,18 @@ unsubscribeHit();

Returns current cache statistics and detailed metrics.

- `size`: Number of active entries in the cache.
- `metrics`: An object containing raw counts (`hits`, `misses`, `fetches`, `errors`, `evictions`, `staleHits`).
- `hitRate`: Ratio of hits to total requests (hits + misses).
- `staleHitRate`: Ratio of stale hits to total hits.
- `entries`: An array of objects providing details for each cached item (key, lastAccessed, lastUpdated, accessCount, isStale, isLoading, error status).

```typescript
const stats = myCache.getStats();
console.log("Cache Size:", stats.size);
console.log("Metrics:", stats.metrics);
console.log("Overall Hit Rate:", (stats.hitRate * 100).toFixed(2) + "%");
console.log("Entries details:", stats.entries);
```

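The derived rates follow directly from the raw counters described above; a small sketch with hypothetical numbers:

```typescript
// How the derived rates relate to the raw metric counters (hypothetical counts;
// the cache computes these for you in getStats()).
const metrics = { hits: 80, misses: 20, staleHits: 16 };

const hitRate = metrics.hits / (metrics.hits + metrics.misses); // 0.8
const staleHitRate = metrics.staleHits / metrics.hits; // 0.2

console.log(hitRate, staleHitRate); // 0.8 0.2
```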
#### `cache.clear(): Promise<void>`
@@ -438,9 +466,9 @@ console.log('Entries details:', stats.entries);

Clears all data from the in-memory cache, resets metrics, and attempts to clear the associated persisted state via the `persistence` layer.

```typescript
console.log("Clearing cache...");
await myCache.clear();
console.log("Cache cleared. Current size:", myCache.getStats().size);
```

#### `cache.destroy(): void`
@@ -449,7 +477,7 @@ Shuts down the cache instance, clearing all data, stopping the automatic garbage

```typescript
myCache.destroy();
console.log("Cache instance destroyed. All timers stopped and data cleared.");
```

### Configuration Examples
@@ -457,79 +485,94 @@ console.log('Cache instance destroyed. All timers stopped and data cleared.');

The `CacheOptions` interface provides extensive control over the cache's behavior:

```typescript
import {
  CacheOptions,
  SimplePersistence,
  SerializableCacheState,
} from "@asaidimu/utils-cache";

// A mock persistence layer for demonstration purposes.
// In a real application, you'd use an actual implementation like IndexedDBPersistence.
class MockPersistence implements SimplePersistence<SerializableCacheState> {
  private store = new Map<string, SerializableCacheState>();
  private subscribers = new Map<
    string,
    Array<(data: SerializableCacheState) => void>
  >();

  async get(id: string): Promise<SerializableCacheState | undefined> {
    console.log(`[MockPersistence] Getting state for ID: ${id}`);
    return this.store.get(id);
  }

  async set(id: string, data: SerializableCacheState): Promise<void> {
    console.log(`[MockPersistence] Setting state for ID: ${id}`);
    this.store.set(id, data);
    // Simulate remote update notification to all subscribed instances
    this.subscribers.get(id)?.forEach((cb) => cb(data));
  }

  async clear(id?: string): Promise<void> {
    console.log(
      `[MockPersistence] Clearing state ${id ? "for ID: " + id : "(all)"}`,
    );
    if (id) {
      this.store.delete(id);
    } else {
      this.store.clear();
    }
  }

  subscribe(
    id: string,
    callback: (data: SerializableCacheState) => void,
  ): () => void {
    console.log(`[MockPersistence] Subscribing to ID: ${id}`);
    if (!this.subscribers.has(id)) {
      this.subscribers.set(id, []);
    }
    this.subscribers.get(id)?.push(callback);
    // Return unsubscribe function
    return () => {
      const callbacks = this.subscribers.get(id);
      if (callbacks) {
        this.subscribers.set(
          id,
          callbacks.filter((cb) => cb !== callback),
        );
      }
      console.log(`[MockPersistence] Unsubscribed from ID: ${id}`);
    };
  }
}

const fullOptions: CacheOptions = {
  staleTime: 1000 * 60 * 5, // 5 minutes: After this time, data is stale; a background fetch is considered.
  cacheTime: 1000 * 60 * 60, // 1 hour: Items idle (not accessed) for this long are eligible for garbage collection.
  retryAttempts: 3, // Max 3 fetch attempts (initial + 2 retries) on network/fetch failures.
  retryDelay: 1000, // 1 second initial delay for retries (doubles each subsequent attempt).
  maxSize: 2000, // Keep up to 2000 entries; LRU eviction kicks in beyond this limit.
  enableMetrics: true, // Enable performance tracking (hits, misses, fetches, etc.).
  persistence: new MockPersistence(), // Provide an instance of your persistence layer implementation.
  persistenceId: "my-unique-cache-instance", // A unique identifier for this cache instance within the persistence store.
  persistenceDebounceTime: 750, // Wait 750ms after a cache change before writing to persistence to batch writes.

  // Custom serializers/deserializers for data that isn't natively JSON serializable (e.g., Maps, Dates, custom classes).
  serializeValue: (value: any) => {
    // Example: Convert Date objects to ISO strings for JSON serialization
    if (value instanceof Date) {
      return { _type: "Date", data: value.toISOString() };
    }
    // Example: Convert Map objects to an array for JSON serialization
    if (value instanceof Map) {
      return { _type: "Map", data: Array.from(value.entries()) };
    }
    return value; // Return as is for other types
  },
  deserializeValue: (value: any) => {
    // Example: Convert ISO strings back to Date objects
    if (typeof value === "object" && value !== null && value._type === "Date") {
      return new Date(value.data);
    }
    // Example: Convert array back to Map objects
    if (typeof value === "object" && value !== null && value._type === "Map") {
      return new Map(value.data);
    }
    return value; // Return as is for other types
  },
  // ...
};

const configuredCache = new Cache(fullOptions);
```
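The serializers above exist to make values survive a JSON round trip. A standalone sketch of that round trip (the cache wires `serializeValue`/`deserializeValue` up internally; this only illustrates the contract):

```typescript
// Round-trip sketch for the custom serializers shown above: values pass
// through serializeValue before JSON.stringify, and through deserializeValue
// after JSON.parse. Illustrative only.
const serializeValue = (value: any) => {
  if (value instanceof Date) return { _type: "Date", data: value.toISOString() };
  if (value instanceof Map) return { _type: "Map", data: Array.from(value.entries()) };
  return value;
};

const deserializeValue = (value: any) => {
  if (typeof value === "object" && value !== null) {
    if (value._type === "Date") return new Date(value.data);
    if (value._type === "Map") return new Map(value.data);
  }
  return value;
};

const original = new Map([["a", 1]]);
const stored = JSON.stringify(serializeValue(original)); // what persistence sees
const restored = deserializeValue(JSON.parse(stored)); // what the cache gets back

console.log(restored instanceof Map, restored.get("a")); // true 1
```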
@@ -546,36 +589,42 @@ const configuredCache = new Cache(fullOptions);
This is the default and most common pattern, where you prioritize immediate responsiveness while ensuring data freshness in the background.

```typescript
import { Cache } from "@asaidimu/utils-cache";

const apiCache = new Cache({
  staleTime: 5 * 60 * 1000, // Data considered stale after 5 minutes
  cacheTime: 30 * 60 * 1000, // Idle data garbage collected after 30 minutes
  retryAttempts: 3, // Retry fetching on network failures
});

// Register a query for a list of blog posts
apiCache.registerQuery("blog/posts", async () => {
  console.log("--- Fetching ALL blog posts from API... ---");
  const response = await fetch("https://api.example.com/blog/posts");
  if (!response.ok) throw new Error("Failed to fetch blog posts");
  return response.json();
});

// Function to display blog posts
async function displayBlogPosts(source: string) {
  console.log(`\nDisplaying blog posts from: ${source}`);
  const posts = await apiCache.get("blog/posts"); // Uses SWR by default
  if (posts) {
    console.log(
      `Received ${posts.length} posts (first 2):`,
      posts.slice(0, 2).map((p: any) => p.title),
    );
  } else {
    console.log("No posts yet, waiting for initial fetch...");
  }
}

displayBlogPosts("Initial Load"); // First `get`: cache miss, triggers fetch.
setTimeout(() => displayBlogPosts("After 1 sec (cached)"), 1000); // Second `get`: cache hit, returns instantly.
setTimeout(
  () => displayBlogPosts("After 6 mins (stale & background fetch)"),
  6 * 60 * 1000,
); // After `staleTime`: returns cached, triggers background fetch.
```

#### Using `waitForFresh` for Critical Data
@@ -583,29 +632,32 @@ setTimeout(() => displayBlogPosts('After 6 mins (stale & background fetch)'), 6
For scenarios where serving outdated data is unacceptable (e.g., user permissions, critical configuration).

```typescript
import { Cache } from "@asaidimu/utils-cache";

const criticalCache = new Cache({ retryAttempts: 5, retryDelay: 1000 });

criticalCache.registerQuery("user/permissions", async () => {
  console.log("--- Fetching user permissions from API... ---");
  // Simulate potential network flakiness
  if (Math.random() > 0.7) {
    throw new Error("Network error during permission fetch!");
  }
  await new Promise((resolve) => setTimeout(resolve, 500));
  return { canEdit: true, canDelete: false, roles: ["user", "editor"] };
});

async function checkPermissionsBeforeAction() {
  console.log("\nAttempting to get FRESH user permissions...");
  try {
    // We MUST have the latest permissions before proceeding with a sensitive action
    const permissions = await criticalCache.get("user/permissions", {
      waitForFresh: true,
      throwOnError: true,
    });
    console.log("User permissions received:", permissions);
    // Proceed with action based on permissions
  } catch (error) {
    console.error("CRITICAL: Failed to load user permissions:", error);
    // Redirect to error page, show critical alert, or disable functionality
  }
}
```
@@ -620,54 +672,73 @@ setInterval(() => checkPermissionsBeforeAction(), 3000);
Utilize the comprehensive event system to log, monitor, or react to cache lifecycle events.

```typescript
import { Cache } from "@asaidimu/utils-cache";

const monitorCache = new Cache({ enableMetrics: true });

monitorCache.registerQuery(
  "stock/AAPL",
  async () => {
    const price = Math.random() * 100 + 150;
    console.log(`--- Fetching AAPL price: ${price.toFixed(2)} ---`);
    return {
      symbol: "AAPL",
      price: parseFloat(price.toFixed(2)),
      timestamp: Date.now(),
    };
  },
  { staleTime: 1000 },
); // Very short staleTime for frequent fetches

// Subscribe to various cache events
monitorCache.on("cache:fetch:start", (e) => {
  console.log(`[EVENT] Fetching ${e.key} (attempt ${e.attempt})`);
});
monitorCache.on("cache:read:hit", (e) => {
  console.log(`[EVENT] Cache hit for ${e.key}. Stale: ${e.isStale}`);
});
monitorCache.on("cache:read:miss", (e) => {
  console.log(`[EVENT] Cache miss for ${e.key}`);
});
monitorCache.on("cache:data:evict", (e) => {
  console.log(`[EVENT] Evicted ${e.key} due to ${e.reason}`);
});
monitorCache.on("cache:data:set", (e) => {
  console.log(
    `[EVENT] Data for ${e.key} manually set. Old price: ${e.oldData?.price}, New price: ${e.newData.price}`,
  );
});
monitorCache.on("cache:persistence:save:success", (e) => {
  console.log(`[EVENT] Persistence: ${e.message || "Save successful"}`);
});

// Continuously try to get data (will trigger fetches due to short staleTime)
setInterval(() => {
  monitorCache.get("stock/AAPL");
}, 500);

// Manually set data to trigger 'set_data' and 'persistence' events
setTimeout(() => {
  monitorCache.setData("stock/AAPL", {
    symbol: "AAPL",
    price: 160.0,
    timestamp: Date.now(),
  });
}, 3000);

// Log cache statistics periodically
setInterval(() => {
  const stats = monitorCache.getStats();
  console.log(`\n--- CACHE STATS ---`);
  console.log(
    `Size: ${stats.size}, Hits: ${stats.metrics.hits}, Misses: ${stats.metrics.misses}, Fetches: ${stats.metrics.fetches}`,
  );
  console.log(
    `Hit Rate: ${(stats.hitRate * 100).toFixed(2)}%, Stale Hit Rate: ${(stats.staleHitRate * 100).toFixed(2)}%`,
  );
  console.log(
    `Active entries: ${stats.entries.map((e) => `${e.key} (stale:${e.isStale})`).join(", ")}`,
  );
  console.log(`-------------------\n`);
}, 5000); // Log stats every 5 seconds
```
@@ -691,72 +762,72 @@ package.json # Package metadata and dependencies for this specific mo

### Core Components

- **`Cache` Class (`cache.ts`)**: The central component of the library. It orchestrates all caching logic, including:
  - Managing the in-memory `Map` (`this.cache`) that stores `CacheEntry` objects.
  - Handling data fetching, retries, and staleness checks.
  - Implementing time-based (TTL) and size-based (LRU) garbage collection.
  - Integrating with the pluggable persistence layer.
  - Emitting detailed cache events.
  - Tracking performance metrics.
- **`CacheOptions` (`types.ts`)**: An interface defining the configurable parameters for a `Cache` instance or individual queries. This includes `staleTime`, `cacheTime`, `retryAttempts`, `maxSize`, persistence settings, and custom serialization/deserialization functions.
- **`CacheEntry` (`types.ts`)**: Represents a single item stored within the cache. It encapsulates the actual `data`, `lastUpdated` and `lastAccessed` timestamps, `accessCount`, and flags like `isLoading` or `error` status.
- **`QueryConfig` (`types.ts`)**: Stores the `fetchFunction` and the resolved `CacheOptions` (merged with instance defaults) for each registered query, enabling tailored behavior per data key.
- **`CacheMetrics` (`types.ts`)**: Defines the structure for tracking cache performance statistics, including hits, misses, fetches, errors, and evictions.
- **`SimplePersistence<SerializableCacheState>` (from `@asaidimu/utils-persistence`)**: An external interface that `Cache` relies on for persistent storage. It requires implementations of `get()`, `set()`, `clear()`, and optionally `subscribe()` methods to handle data serialization and deserialization for the specific storage medium (e.g., IndexedDB, LocalStorage, or a remote backend).
- **`CacheEvent` / `CacheEventType` (`types.ts`)**: A union type defining all possible events emitted by the cache (e.g., `'cache:read:hit'`, `'cache:fetch:start'`, `'cache:data:evict'`). This enables a fine-grained, scoped observability model for the cache's lifecycle.
### Data Flow

1. **Initialization**:
   - The `Cache` constructor sets up global default options, initializes performance metrics, and starts the automatic garbage collection timer.
   - If a `persistence` layer is configured, it attempts to load a previously saved state using `persistence.get()`, emitting `'cache:persistence:load:success'` or `'cache:persistence:load:error'`.
   - It then subscribes to `persistence.subscribe()` (if available) to listen for remote state changes from the underlying storage, ensuring cache consistency across multiple instances or processes.

2. **`registerQuery`**:
   - When `registerQuery(key, fetchFunction, options)` is called, the `fetchFunction` and its specific `options` (merged with the global `defaultOptions`) are stored internally in the `this.queries` map. This prepares the cache to handle requests for that `key`.

3. **`get` Request**:
   - When `get(key, options)` is invoked, `Cache` first checks `this.cache` for an existing `CacheEntry` for the `key`.
   - **Cache Hit**: If an entry exists, `lastAccessed` and `accessCount` are updated, a `'cache:read:hit'` event is emitted, and metrics are incremented. The entry's staleness is evaluated based on `staleTime`.
     - If `waitForFresh` is `true` OR if the entry is stale/loading, it proceeds to `fetchAndWait`.
     - If `waitForFresh` is `false` (default) and the entry is stale, the cached data is returned immediately, and a background `fetch` is triggered to update the data.
     - If `waitForFresh` is `false` and the entry is fresh, the cached data is returned immediately.
   - **Cache Miss**: If no entry exists, a `'cache:read:miss'` event is emitted. A placeholder `CacheEntry` (marked `isLoading`) is created, and a `fetch` is immediately triggered to retrieve the data.

4. **`fetch` / `fetchAndWait`**:
   - These methods ensure that only one `fetchFunction` runs concurrently for a given `key` by tracking ongoing fetches in `this.fetching`.
   - They delegate the actual data retrieval and retry logic to `performFetchWithRetry`.

5. **`performFetchWithRetry`**:
   - This is where the registered `fetchFunction` is executed. It attempts to call the `fetchFunction` multiple times (up to `retryAttempts`) with exponential backoff (`retryDelay`).
   - Before each attempt, a `'cache:fetch:start'` event is emitted, and `fetches` metrics are updated.
   - **On Success**: The `CacheEntry` is updated with the new `data`, `lastUpdated` timestamp, and its `isLoading` status is set to `false`. The cache then calls `schedulePersistState()` to save the updated state and `enforceSizeLimit()` to maintain the `maxSize`.
   - **On Failure**: If the `fetchFunction` fails, a `'cache:fetch:error'` event is emitted, and `errors` metrics are updated. If `retryAttempts` are remaining, it waits (`delay`) and retries. After all attempts, the `CacheEntry` is updated with the last `error`, `isLoading` is set to `false`, and `schedulePersistState()` is called.

6. **`schedulePersistState`**:
   - This method debounces write operations to the `persistence` layer. It prevents excessive writes by waiting for a configurable `persistenceDebounceTime` before serializing the current cache state (using `serializeCache` and `serializeValue`) and writing it via `persistence.set()`. Appropriate persistence events (`'cache:persistence:save:success'`/`'cache:persistence:save:error'`) are emitted.

7. **`handleRemoteStateChange`**:
   - This callback is invoked by the `persistence` layer's `subscribe` mechanism when an external change to the persisted state is detected. It deserializes the `remoteState` (using `deserializeValue`) and intelligently updates the local `this.cache` to reflect these external changes, emitting a `'cache:persistence:sync'` event.

8. **`garbageCollect`**:
   - Running on a `setInterval` timer (`gcTimer`), this method periodically scans `this.cache`. It removes any `CacheEntry` that has not been `lastAccessed` for longer than its (or global) `cacheTime`, emitting `'cache:data:evict'` events.

9. **`enforceSizeLimit`**:
   - Triggered after successful data updates (`fetch` success or `setData`). If the `cache.size` exceeds `maxSize`, it evicts the Least Recently Used (LRU) entries until the `maxSize` is satisfied, emitting `'cache:data:evict'` events.
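The retry behaviour in step 5 can be sketched as follows. This is an illustration of `retryAttempts`/`retryDelay` with exponential backoff, not the library's actual `performFetchWithRetry` (event emission and metrics are omitted):

```typescript
// Illustrative retry loop with exponential backoff, in the spirit of the
// performFetchWithRetry step described above. Not the library's actual code.
async function fetchWithRetry<T>(
  fetchFunction: () => Promise<T>,
  retryAttempts: number,
  retryDelay: number,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= retryAttempts; attempt++) {
    try {
      return await fetchFunction(); // success: caller updates the entry and persists
    } catch (error) {
      lastError = error;
      if (attempt < retryAttempts) {
        // Exponential backoff: retryDelay, 2*retryDelay, 4*retryDelay, ...
        const delay = retryDelay * 2 ** (attempt - 1);
        await new Promise((resolve) => setTimeout(resolve, delay));
      }
    }
  }
  throw lastError; // all attempts exhausted: the entry records the last error
}

// Example: fail twice, then succeed on the third attempt.
let calls = 0;
const flaky = async () => {
  calls++;
  if (calls < 3) throw new Error("transient failure");
  return "ok";
};
fetchWithRetry(flaky, 5, 10).then((result) => console.log(result, calls)); // ok 3
```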
### Extension Points

The design of `@asaidimu/utils-cache` provides several powerful extension points for customization and integration:

- **`SimplePersistence` Interface**: This is the primary mechanism for integrating `Cache` with various storage backends. By implementing this interface, you can use `Cache` with `localStorage`, `IndexedDB` (e.g., via `@asaidimu/utils-persistence`), a custom database, a server-side cache, or any other persistent storage solution.
- **`serializeValue` / `deserializeValue` Options**: These functions within `CacheOptions` allow you to define custom logic for how your specific data types are converted to and from a serializable format (e.g., JSON-compatible strings or objects) before being passed to and received from the `persistence` layer. This is crucial for handling `Date` objects, `Map`s, `Set`s, or custom class instances.
- **Event Listeners (`on`)**: The comprehensive event system, powered by `@asaidimu/events`, allows you to subscribe to a wide range of cache lifecycle events. This enables powerful integrations for:
  - **Logging**: Detailed logging of cache activity (hits, misses, errors, evictions).
  - **Analytics**: Feeding cache performance metrics into an analytics platform.
  - **UI Reactivity**: Updating UI components in response to cache changes (e.g., showing a "stale data" indicator or a "refreshing" spinner).
  - **Debugging**: Gaining deep insights into cache behavior during development.
  - **External Synchronization**: Triggering side effects or synchronizing with other systems based on cache events.

---
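As an illustration of the first extension point, here is a minimal in-memory persistence layer in the `get`/`set`/`clear`/`subscribe` shape described above. The exact `SimplePersistence` signatures live in `@asaidimu/utils-persistence`, so the method signatures here are assumptions for illustration:

```typescript
// A minimal, in-memory persistence layer in the get/set/clear/subscribe shape
// described above. The real SimplePersistence interface is defined in
// @asaidimu/utils-persistence; the exact signatures here are assumptions.
type Listener<S> = (state: S) => void;

class MemoryPersistence<S> {
  private state: S | null = null;
  private listeners = new Set<Listener<S>>();

  async get(): Promise<S | null> {
    return this.state;
  }

  async set(state: S): Promise<void> {
    this.state = state;
    // Notify subscribers, mimicking cross-instance synchronization.
    for (const listener of this.listeners) listener(state);
  }

  async clear(): Promise<void> {
    this.state = null;
  }

  subscribe(listener: Listener<S>): () => void {
    this.listeners.add(listener);
    return () => void this.listeners.delete(listener); // unsubscribe function
  }
}

// Usage sketch: a subscriber observes each write.
const store = new MemoryPersistence<{ version: number }>();
const seen: number[] = [];
const unsubscribe = store.subscribe((s) => seen.push(s.version));
void store.set({ version: 1 }); // listener fires synchronously in this sketch
console.log(seen); // [ 1 ]
```

A real implementation would back `get`/`set` with durable storage and drive `subscribe` from the storage medium's own change notifications.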
@@ -798,11 +869,11 @@ To set up the development environment for `@asaidimu/utils-cache`:

The following `npm` scripts are typically available in this project's setup:

- `npm run build`: Compiles TypeScript source files from `src/` to JavaScript output in `dist/`.
- `npm run test`: Runs the test suite using `Vitest`.
- `npm run test:watch`: Runs tests in watch mode for continuous feedback during development.
- `npm run lint`: Runs ESLint to check for code style and potential errors.
- `npm run format`: Formats code using Prettier according to the project's style guidelines.

### Testing
@@ -838,11 +909,11 @@ Found a bug, have a feature request, or need clarification? Please open an issue

When reporting a bug, please include:

- A clear and concise description of the issue.
- Detailed steps to reproduce the behavior.
- The expected behavior.
- Any relevant screenshots or code snippets.
- Your environment details (Node.js version, OS, browser, package version).

---
@@ -850,29 +921,29 @@ When reporting a bug, please include:

### Troubleshooting

- **"No query registered for key: [key]" Error**:
  - **Cause**: This error occurs if you try to `get()`, `prefetch()`, or `refresh()` a `key` that has not been previously associated with a `fetchFunction` using `cache.registerQuery()`.
  - **Solution**: Ensure you call `cache.registerQuery(key, fetchFunction)` for every `key` you intend to use with the cache before attempting to retrieve data.
- **Data not persisting**:
  - **Cause**: The cache state is not being correctly saved or loaded from the underlying storage.
  - **Solution**:
    1. **`persistence` instance**: Double-check that you are passing a valid `SimplePersistence` instance to the `Cache` constructor's `persistence` option.
    2. **`persistenceId`**: Ensure you've provided a unique `persistenceId` if multiple cache instances share the same persistence layer.
    3. **Serialization**: Verify that your data types are correctly handled by the `serializeValue` and `deserializeValue` options, especially for non-JSON-serializable types like `Map`s, `Date` objects, or custom classes.
    4. **Persistence Layer**: Confirm your `SimplePersistence` implementation correctly handles `get()`, `set()`, `clear()`, and `subscribe()` operations for the specific storage medium (e.g., local storage quota, IndexedDB permissions).
    5. **Event Errors**: Check for persistence event errors in your browser's or Node.js console (e.g., `cache.on('cache:persistence:save:error', ...)`).
- **Cache not evicting data**:
  - **Cause**: Eviction policies might be disabled or configured with very long durations.
  - **Solution**:
    1. **`cacheTime`**: Ensure `cacheTime` in `CacheOptions` is set to a finite, non-zero positive number (in milliseconds). `Infinity` or `0` for `cacheTime` disables time-based garbage collection.
    2. **`maxSize`**: Ensure `maxSize` is set to a finite, non-zero positive number. `Infinity` disables size-based LRU eviction, and `0` means the cache will always be empty (evicting immediately).
    3. **Garbage Collection Interval**: The garbage collection runs periodically. While generally sufficient, verify that `cacheTime` isn't so large that you rarely hit the GC interval.
- **Event listeners not firing**:
  - **Cause**: The listener might be removed, or the expected event is not actually occurring.
  - **Solution**:
    1. **Correct Event Type**: Ensure you are subscribing to the exact `CacheEventType` you expect (e.g., `'cache:read:hit'`, `'cache:fetch:error'`).
    2. **`enableMetrics`**: If you expect metric-related events or updates, ensure `enableMetrics` is not set to `false` in your `CacheOptions`.
    3. **Unsubscribe Function**: Ensure you are not accidentally calling the `unsubscribe` function returned by `on()` prematurely.
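The size-based LRU eviction discussed above can be pictured with a toy sketch; this illustrates the eviction policy only, not the library's `enforceSizeLimit` implementation:

```typescript
// Toy illustration of LRU eviction with a Map, exploiting the fact that a
// JavaScript Map iterates its keys in insertion order. Not the library's code.
class LruSketch<V> {
  private entries = new Map<string, V>();
  constructor(private maxSize: number) {}

  get(key: string): V | undefined {
    const value = this.entries.get(key);
    if (value !== undefined) {
      // Re-insert to mark the entry as most recently used.
      this.entries.delete(key);
      this.entries.set(key, value);
    }
    return value;
  }

  set(key: string, value: V): void {
    this.entries.delete(key);
    this.entries.set(key, value);
    while (this.entries.size > this.maxSize) {
      // The first key in iteration order is the least recently used.
      const lru = this.entries.keys().next().value as string;
      this.entries.delete(lru); // a real cache would emit 'cache:data:evict' here
    }
  }

  has(key: string): boolean {
    return this.entries.has(key);
  }
}

const lru = new LruSketch<number>(2);
lru.set("a", 1);
lru.set("b", 2);
lru.get("a"); // touch "a" so "b" becomes least recently used
lru.set("c", 3); // exceeds maxSize, evicts "b"
console.log(lru.has("b"), lru.has("a")); // false true
```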
### FAQ
@@ -886,7 +957,7 @@ A: Use `waitForFresh: true` when your application absolutely needs the most up-t
A: Yes, `Cache` is designed to be environment-agnostic. Its persistence mechanism is pluggable, so you can implement a `SimplePersistence` that works within a web worker (e.g., using IndexedDB directly or communicating with the main thread via `postMessage`).

**Q: Is `Cache` thread-safe (or safe with concurrent access)?**
A: JavaScript is single-threaded. `Cache` manages its internal state with `Map`s and `Promise`s. For concurrent `get` requests to the _same key_, it ensures only one `fetchFunction` runs via the `this.fetching` map, preventing redundant fetches. Therefore, it is safe for concurrent access within a single JavaScript runtime context. For multiple JavaScript runtimes (e.g., different browser tabs or Node.js processes), the `persistence` layer's `subscribe` mechanism handles synchronization.
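The single-flight behaviour described in this answer can be sketched in a few lines; `dedupe` and `inFlight` are illustrative names standing in for the effect of the library's internal `this.fetching` map:

```typescript
// Sketch of request deduplication: concurrent callers for the same key share
// one in-flight Promise, mirroring the `this.fetching` map described above.
const inFlight = new Map<string, Promise<unknown>>();
let fetchCount = 0;

function dedupe<T>(key: string, fetchFunction: () => Promise<T>): Promise<T> {
  const existing = inFlight.get(key);
  if (existing) return existing as Promise<T>; // join the ongoing fetch
  const promise = fetchFunction().finally(() => inFlight.delete(key));
  inFlight.set(key, promise);
  return promise;
}

const slowFetch = async (): Promise<string> => {
  fetchCount++; // counts how many real fetches actually run
  await new Promise((resolve) => setTimeout(resolve, 50));
  return "payload";
};

// Three concurrent requests for one key, but only one underlying fetch.
Promise.all([
  dedupe("user/1", slowFetch),
  dedupe("user/1", slowFetch),
  dedupe("user/1", slowFetch),
]).then((results) => console.log(results.length, fetchCount)); // 3 1
```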
### Changelog/Roadmap
@@ -898,6 +969,6 @@ This project is licensed under the MIT License - see the [LICENSE](LICENSE) file

### Acknowledgments

- Inspired by modern data fetching and caching libraries like [React Query](https://react-query.tanstack.com/) and [SWR](https://swr.vercel.app/).
- Uses the `uuid` library for generating unique cache instance IDs.
- Event system powered by `@asaidimu/events`.
package/index.d.mts
CHANGED
@@ -97,51 +97,51 @@ type CacheEventBase<Type extends string, Payload = {}> = {
    key: string;
    timestamp: number;
} & Payload;
type CacheReadHitEvent<T = any> = CacheEventBase<"cache:read:hit", {
    data: T;
    isStale: boolean;
}>;
type CacheReadMissEvent = CacheEventBase<"cache:read:miss">;
type CacheFetchStartEvent = CacheEventBase<"cache:fetch:start", {
    attempt: number;
}>;
type CacheFetchSuccessEvent<T = any> = CacheEventBase<"cache:fetch:success", {
    data: T;
}>;
type CacheFetchErrorEvent = CacheEventBase<"cache:fetch:error", {
    error: Error;
    attempt: number;
}>;
type CacheDataEvictEvent = CacheEventBase<"cache:data:evict", {
    reason?: string;
}>;
type CacheDataInvalidateEvent = CacheEventBase<"cache:data:invalidate">;
type CacheDataSetEvent<T = any> = CacheEventBase<"cache:data:set", {
    newData: T;
    oldData?: T;
}>;
type CachePersistenceLoadSuccessEvent = CacheEventBase<"cache:persistence:load:success", {
    message?: string;
}>;
type CachePersistenceLoadErrorEvent = CacheEventBase<"cache:persistence:load:error", {
    message?: string;
    error?: any;
}>;
type CachePersistenceSaveSuccessEvent = CacheEventBase<"cache:persistence:save:success">;
type CachePersistenceSaveErrorEvent = CacheEventBase<"cache:persistence:save:error", {
    message?: string;
    error?: any;
}>;
type CachePersistenceClearSuccessEvent = CacheEventBase<"cache:persistence:clear:success">;
type CachePersistenceClearErrorEvent = CacheEventBase<"cache:persistence:clear:error", {
    message?: string;
    error?: any;
}>;
type CachePersistenceSyncEvent = CacheEventBase<"cache:persistence:sync", {
    message?: string;
}>;
type CacheEvent = CacheReadHitEvent | CacheReadMissEvent | CacheFetchStartEvent | CacheFetchSuccessEvent | CacheFetchErrorEvent | CacheDataEvictEvent | CacheDataInvalidateEvent | CacheDataSetEvent | CachePersistenceLoadSuccessEvent | CachePersistenceLoadErrorEvent | CachePersistenceSaveSuccessEvent | CachePersistenceSaveErrorEvent | CachePersistenceClearSuccessEvent | CachePersistenceClearErrorEvent | CachePersistenceSyncEvent;
type CacheEventType = CacheEvent["type"];

declare class QueryCache {
    private cache;
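Because every member of `CacheEvent` carries a literal `type` field, the union is discriminated, and listeners can narrow payloads with a `switch`. A sketch using simplified local copies of two of the event shapes declared above:

```typescript
// Narrowing a discriminated event union on its `type` field. These event
// shapes are simplified local copies of the declarations above.
type ReadHitEvent = { type: "cache:read:hit"; key: string; timestamp: number; data: unknown; isStale: boolean };
type FetchErrorEvent = { type: "cache:fetch:error"; key: string; timestamp: number; error: Error; attempt: number };
type Event = ReadHitEvent | FetchErrorEvent;

function describeEvent(event: Event): string {
  switch (event.type) {
    case "cache:read:hit":
      // Narrowed here: `isStale` is available on ReadHitEvent.
      return `hit ${event.key} (stale: ${event.isStale})`;
    case "cache:fetch:error":
      // Narrowed here: `error` and `attempt` are available on FetchErrorEvent.
      return `error for ${event.key} on attempt ${event.attempt}: ${event.error.message}`;
  }
}

console.log(
  describeEvent({ type: "cache:read:hit", key: "blog/posts", timestamp: Date.now(), data: [], isStale: false }),
); // hit blog/posts (stale: false)
```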
package/index.d.ts
CHANGED
@@ -97,51 +97,51 @@ type CacheEventBase<Type extends string, Payload = {}> = {
    key: string;
    timestamp: number;
} & Payload;
type CacheReadHitEvent<T = any> = CacheEventBase<"cache:read:hit", {
    data: T;
    isStale: boolean;
}>;
type CacheReadMissEvent = CacheEventBase<"cache:read:miss">;
type CacheFetchStartEvent = CacheEventBase<"cache:fetch:start", {
    attempt: number;
}>;
type CacheFetchSuccessEvent<T = any> = CacheEventBase<"cache:fetch:success", {
    data: T;
}>;
type CacheFetchErrorEvent = CacheEventBase<"cache:fetch:error", {
    error: Error;
    attempt: number;
}>;
type CacheDataEvictEvent = CacheEventBase<"cache:data:evict", {
    reason?: string;
}>;
type CacheDataInvalidateEvent = CacheEventBase<"cache:data:invalidate">;
type CacheDataSetEvent<T = any> = CacheEventBase<"cache:data:set", {
    newData: T;
    oldData?: T;
}>;
type CachePersistenceLoadSuccessEvent = CacheEventBase<"cache:persistence:load:success", {
    message?: string;
}>;
type CachePersistenceLoadErrorEvent = CacheEventBase<"cache:persistence:load:error", {
    message?: string;
    error?: any;
}>;
type CachePersistenceSaveSuccessEvent = CacheEventBase<"cache:persistence:save:success">;
type CachePersistenceSaveErrorEvent = CacheEventBase<"cache:persistence:save:error", {
    message?: string;
    error?: any;
}>;
type CachePersistenceClearSuccessEvent = CacheEventBase<"cache:persistence:clear:success">;
type CachePersistenceClearErrorEvent = CacheEventBase<"cache:persistence:clear:error", {
    message?: string;
    error?: any;
}>;
type CachePersistenceSyncEvent = CacheEventBase<"cache:persistence:sync", {
    message?: string;
}>;
type CacheEvent = CacheReadHitEvent | CacheReadMissEvent | CacheFetchStartEvent | CacheFetchSuccessEvent | CacheFetchErrorEvent | CacheDataEvictEvent | CacheDataInvalidateEvent | CacheDataSetEvent | CachePersistenceLoadSuccessEvent | CachePersistenceLoadErrorEvent | CachePersistenceSaveSuccessEvent | CachePersistenceSaveErrorEvent | CachePersistenceClearSuccessEvent | CachePersistenceClearErrorEvent | CachePersistenceSyncEvent;
type CacheEventType = CacheEvent["type"];

declare class QueryCache {
    private cache;
package/package.json
CHANGED
@@ -1,6 +1,6 @@
{
  "name": "@asaidimu/utils-cache",
  "version": "3.1.0",
  "description": "Resource and cache management utilities for @asaidimu applications.",
  "main": "index.js",
  "module": "index.mjs",
@@ -30,7 +30,7 @@
    "access": "public"
  },
  "dependencies": {
    "@asaidimu/utils-persistence": "6.0.0",
    "uuid": "^11.1.0",
    "@asaidimu/events": "^1.1.1"
  },