cacheable 1.9.0 → 1.10.1

package/README.md CHANGED
@@ -37,10 +37,14 @@
  * [Cacheable Statistics (Instance Only)](#cacheable-statistics-instance-only)
  * [Cacheable - API](#cacheable---api)
  * [CacheableMemory - In-Memory Cache](#cacheablememory---in-memory-cache)
+ * [CacheableMemory Store Hashing](#cacheablememory-store-hashing)
+ * [CacheableMemory LRU Feature](#cacheablememory-lru-feature)
+ * [CacheableMemory Performance](#cacheablememory-performance)
  * [CacheableMemory Options](#cacheablememory-options)
  * [CacheableMemory - API](#cacheablememory---api)
  * [Keyv Storage Adapter - KeyvCacheableMemory](#keyv-storage-adapter---keyvcacheablememory)
  * [Wrap / Memoization for Sync and Async Functions](#wrap--memoization-for-sync-and-async-functions)
+ * [Get Or Set Memoization Function](#get-or-set-memoization-function)
  * [How to Contribute](#how-to-contribute)
  * [License and Copyright](#license-and-copyright)
 
@@ -252,6 +256,41 @@ raws.forEach((entry, idx) => {
 
  If you want your layer 2 (secondary) store to be non-blocking, set the `nonBlocking` property to `true` in the options. The secondary store will then not be awaited when setting, deleting, or clearing data. This is useful when you want a faster response time and do not need to wait for the secondary store to respond.
 
+ # GetOrSet
+
+ The `getOrSet` method provides a convenient way to implement the cache-aside pattern. It attempts to retrieve a value from the cache and, if the value is not found, calls the provided function to compute it, stores the result, and returns it.
+
+ ```typescript
+ import { Cacheable } from 'cacheable';
+
+ // Create a new Cacheable instance
+ const cache = new Cacheable();
+
+ // Use getOrSet to fetch user data
+ async function getUserData(userId: string) {
+   return cache.getOrSet(
+     `user:${userId}`,
+     async () => {
+       // This function only runs if the data isn't in the cache
+       console.log('Fetching user from database...');
+       // Simulate a database fetch
+       return { id: userId, name: 'John Doe', email: 'john@example.com' };
+     },
+     { ttl: '30m' } // Cache for 30 minutes
+   );
+ }
+
+ // First call - fetches from the "database"
+ const user1 = await getUserData('123');
+ console.log(user1); // { id: '123', name: 'John Doe', email: 'john@example.com' }
+
+ // Second call - retrieves from the cache
+ const user2 = await getUserData('123');
+ console.log(user2); // Same data, but retrieved from cache
+ ```
+
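The cache-aside flow above can be sketched without the library. The following is a minimal, hypothetical implementation of the pattern (not the `cacheable` internals), with a simple in-flight promise map standing in for stampede protection:

```javascript
// Minimal cache-aside sketch: look in the store, compute on a miss, save the
// result, and return it. The in-flight map collapses concurrent misses for the
// same key so the compute function runs only once (simple stampede protection).
// Illustrative only -- not the cacheable library's internals.
const store = new Map();
const inFlight = new Map();

async function getOrSet(key, computeFn) {
  if (store.has(key)) return store.get(key); // cache hit
  if (inFlight.has(key)) return inFlight.get(key); // another caller is computing
  const pending = Promise.resolve().then(computeFn);
  inFlight.set(key, pending);
  try {
    const value = await pending;
    store.set(key, value);
    return value;
  } finally {
    inFlight.delete(key);
  }
}

(async () => {
  let calls = 0;
  const first = await getOrSet('user:123', async () => { calls += 1; return 'John'; });
  const second = await getOrSet('user:123', async () => { calls += 1; return 'Jane'; });
  console.log(first, second, calls); // John John 1
})();
```

The second call returns the cached value and the compute function never runs again, which is the behavior the library's `getOrSet` documents.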
  ```javascript
  import { Cacheable } from 'cacheable';
  import {KeyvRedis} from '@keyv/redis';
@@ -317,6 +356,7 @@ _This does not enable statistics for your layer 2 cache as that is a distributed
  * `deleteMany([keys])`: Deletes multiple values from the cache.
  * `clear()`: Clears the cache stores. Be careful with this as it will clear both layer 1 and layer 2.
  * `wrap(function, WrapOptions)`: Wraps an `async` function in a cache.
+ * `getOrSet(GetOrSetKey, valueFunction, GetOrSetFunctionOptions)`: Gets a value from the cache, or computes and stores it with the provided function if it is not found.
  * `disconnect()`: Disconnects from the cache stores.
  * `onHook(hook, callback)`: Sets a hook.
  * `removeHook(hook)`: Removes a hook.
@@ -351,12 +391,111 @@ This simple in-memory cache uses multiple Map objects and a with `expiration` an
 
  By default we use lazy expiration deletion, which means that on `get` and `getMany` type functions we check whether the entry is expired and delete it if so. If you want a more aggressive expiration policy, set the `checkInterval` property to a value greater than `0` to check for expired keys at that interval.
 
+ Here are some of the main features of `CacheableMemory`:
+ * High-performance in-memory cache with a robust API and feature set. 🚀
+ * Can scale past the `16,777,216 (2^24) keys` limit of a single `Map` via `storeHashSize`. The default is `16` `Map` objects.
+ * LRU (Least Recently Used) feature to limit the number of keys in the cache via `lruSize`. Limited to `16,777,216 (2^24) keys` total.
+ * Expiration policy that deletes expired keys lazily, or aggressively via `checkInterval`.
+ * `Wrap` feature to memoize `sync` and `async` functions with stampede protection.
+ * Batch operations such as `setMany`, `getMany`, `deleteMany`, and `takeMany`.
+ * Supports `raw` data retrieval with the `getRaw` and `getManyRaw` methods to get the full metadata of a cache entry.
+
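As a concrete illustration of the batch semantics above, `takeMany` pairs a read with a delete. A standalone sketch against a plain `Map` (illustrative only, not the library's implementation):

```javascript
// Sketch of "take" semantics: read each value, then delete its key, in one
// pass. This mirrors the documented behavior of takeMany conceptually; the
// library's own version operates on its internal hash stores.
function takeMany(map, keys) {
  return keys.map((key) => {
    const value = map.get(key);
    map.delete(key);
    return value;
  });
}

const store = new Map([['a', 1], ['b', 2]]);
const values = takeMany(store, ['a', 'b']);
console.log(values); // [1, 2]
console.log(store.size); // 0
```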
+ ## CacheableMemory Store Hashing
+
+ `CacheableMemory` uses `Map` objects to store keys and values. To scale past the `16,777,216 (2^24) keys` limit of a single `Map`, we hash each key and use the hash to determine which `Map` object it lives in, balancing the data across multiple `Map` objects. The default hashing algorithm is `djb2Hash`, but you can change it by setting the `storeHashAlgorithm` property in the options. By default the number of `Map` objects is `16`.
+
+ NOTE: if you are using the LRU cache feature, the cache is limited to the `16,777,216 (2^24) keys` limit of a single `Map` object no matter what `lruSize` you set. This is because the LRU cache is managed by a doubly linked list backed by a single `Map` object.
+
+ Here is an example of how to set the number of `Map` objects:
+
+ ```javascript
+ import { CacheableMemory } from 'cacheable';
+ const cache = new CacheableMemory({
+   storeHashSize: 32, // set the number of Map objects to 32
+ });
+ cache.set('key', 'value');
+ const value = cache.get('key'); // value
+ ```
+
+ Here is an example of how to use the `storeHashAlgorithm` property:
+
+ ```javascript
+ import { CacheableMemory } from 'cacheable';
+ const cache = new CacheableMemory({ storeHashAlgorithm: 'sha256' });
+ cache.set('key', 'value');
+ const value = cache.get('key'); // value
+ ```
+
+ If you want to provide your own hashing function, set the `storeHashAlgorithm` property to a function that takes the key and the number of `Map` stores and returns a `number` in that range.
+
+ ```javascript
+ import { CacheableMemory } from 'cacheable';
+ /**
+  * Custom hash function that takes a key and the size of the store
+  * and returns a number between 0 and storeHashSize - 1.
+  * @param {string} key - The key to hash.
+  * @param {number} storeHashSize - The size of the store (number of Map objects).
+  * @returns {number} - A number between 0 and storeHashSize - 1.
+  */
+ const customHash = (key, storeHashSize) => {
+   // custom hashing logic
+   return key.length % storeHashSize; // returns a number between 0 and 31 for 32 Map objects
+ };
+ const cache = new CacheableMemory({ storeHashAlgorithm: customHash, storeHashSize: 32 });
+ cache.set('key', 'value');
+ const value = cache.get('key'); // value
+ ```
+
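To make the key-to-store mapping concrete, here is a standalone sketch of djb2-style bucket selection. The implementation details are an assumption for illustration; the library's actual `djb2Hash` may differ:

```javascript
// djb2-style hash (hash = hash * 33 XOR charCode), reduced modulo the number
// of Map stores to pick a bucket. Illustrative sketch, not the library's code.
function djb2Hash(key, storeHashSize) {
  let hash = 5381;
  for (let i = 0; i < key.length; i++) {
    hash = ((hash * 33) ^ key.charCodeAt(i)) >>> 0; // keep it an unsigned 32-bit int
  }
  return hash % storeHashSize;
}

// 16 Map objects, mirroring the documented default store count
const stores = Array.from({ length: 16 }, () => new Map());
const storeFor = (key) => stores[djb2Hash(key, stores.length)];

storeFor('user:1').set('user:1', 'Ada');
console.log(storeFor('user:1').get('user:1')); // Ada
```

Because the hash is deterministic, the same key always resolves to the same `Map`, which is what lets reads find the values that writes placed.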
+ ## CacheableMemory LRU Feature
+
+ You can enable the LRU (Least Recently Used) feature in `CacheableMemory` by setting the `lruSize` property in the options. This limits the number of keys in the cache to the size you set; when the cache reaches the limit, the least recently used keys are removed. This is useful if you want to cap the memory usage of the cache.
+
+ When you set `lruSize`, we use a doubly linked list to manage the LRU cache and set `storeHashSize` to `1`, so only a single `Map` object is used. This is because the LRU cache is managed by the doubly linked list, and a single `Map` object cannot hold more than `16,777,216 (2^24) keys`.
+
+ ```javascript
+ import { CacheableMemory } from 'cacheable';
+ const cache = new CacheableMemory({ lruSize: 1 }); // limits the cache to 1 key and sets storeHashSize to 1
+ cache.set('key1', 'value1');
+ cache.set('key2', 'value2'); // evicts key1, the least recently used key
+ const value1 = cache.get('key1');
+ console.log(value1); // undefined - key1 was evicted
+ const value2 = cache.get('key2');
+ console.log(value2); // value2 - key2 is still in the cache
+ console.log(cache.size); // 1
+ ```
+
+ NOTE: if you set the `lruSize` property to `0` after it was enabled, the LRU feature is disabled and the number of keys is no longer limited. This also removes the `16,777,216 (2^24) keys` single-`Map` limit, allowing you to store more keys in the cache.
+
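The eviction behavior can be sketched with a single `Map`, using its insertion order as a stand-in for the doubly linked list (illustrative only; the library tracks recency with a real linked list):

```javascript
// Tiny LRU sketch: a Map's insertion order stands in for the linked list.
// get() re-inserts the key to mark it most recently used; set() evicts the
// oldest entry once the size limit is exceeded.
class TinyLru {
  constructor(lruSize) {
    this.lruSize = lruSize;
    this.map = new Map();
  }

  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value); // move to most-recently-used position
    return value;
  }

  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.lruSize) {
      // first key in insertion order is the least recently used
      this.map.delete(this.map.keys().next().value);
    }
  }

  get size() {
    return this.map.size;
  }
}

const lru = new TinyLru(1);
lru.set('key1', 'value1');
lru.set('key2', 'value2'); // evicts key1
console.log(lru.get('key1')); // undefined
console.log(lru.get('key2')); // value2
console.log(lru.size); // 1
```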
+ ## CacheableMemory Performance
+
+ Our goal with `cacheable` and `CacheableMemory` is to provide a high-performance caching engine that is simple to use and has a robust API. We benchmark it against other caching engines, including less feature-rich ones, to make sure there is little difference. Here are some of the benchmarks we have run:
+
+ *Memory Benchmark Results:*
+ | name | summary | ops/sec | time/op | margin | samples |
+ |------------------------------------------|:---------:|----------:|----------:|:--------:|----------:|
+ | Map (v22) - set / get | 🥇 | 117K | 9µs | ±1.29% | 110K |
+ | Cacheable Memory (v1.10.0) - set / get | -1.3% | 116K | 9µs | ±0.77% | 110K |
+ | Node Cache - set / get | -4.1% | 112K | 9µs | ±1.34% | 107K |
+ | bentocache (v1.4.0) - set / get | -45% | 65K | 17µs | ±1.10% | 100K |
+
+ *Memory LRU Benchmark Results:*
+ | name | summary | ops/sec | time/op | margin | samples |
+ |------------------------------------------|:---------:|----------:|----------:|:--------:|----------:|
+ | quick-lru (v7.0.1) - set / get | 🥇 | 118K | 9µs | ±0.85% | 112K |
+ | Map (v22) - set / get | -0.56% | 117K | 9µs | ±1.35% | 110K |
+ | lru.min (v1.1.2) - set / get | -1.7% | 116K | 9µs | ±0.90% | 110K |
+ | Cacheable Memory (v1.10.0) - set / get | -3.3% | 114K | 9µs | ±1.16% | 108K |
+
+ As the benchmarks show, `CacheableMemory` is on par with other caching engines such as `Map`, `Node Cache`, and `bentocache`, and it also performs well against LRU caching engines such as `quick-lru` and `lru.min`.
+
  ## CacheableMemory Options
 
  * `ttl`: The time to live for the cache in milliseconds. Default is `undefined`, which means values are stored indefinitely.
  * `useClones`: If the cache should use clones for the values. Default is `true`.
  * `lruSize`: The size of the LRU cache. Default is `0` which is unlimited.
  * `checkInterval`: The interval to check for expired keys in milliseconds. Default is `0` which is disabled.
+ * `storeHashSize`: The number of `Map` objects to use for the cache. Default is `16`.
+ * `storeHashAlgorithm`: The hashing algorithm to use for the cache. Default is `djb2Hash`.
 
  ## CacheableMemory - API
 
@@ -374,13 +513,19 @@ By default we use lazy expiration deletion which means on `get` and `getMany` ty
  * `takeMany([keys])`: Takes multiple values from the cache and deletes them.
  * `wrap(function, WrapSyncOptions)`: Wraps a `sync` function in a cache.
  * `clear()`: Clears the cache.
- * `size()`: The number of keys in the cache.
- * `keys()`: The keys in the cache.
- * `items()`: The items in the cache as `CacheableStoreItem` example `{ key, value, expires? }`.
+ * `ttl`: The default time to live for the cache in milliseconds. Default is `undefined`, which means no default TTL.
+ * `useClones`: If the cache should use clones for the values. Default is `true`.
+ * `lruSize`: The size of the LRU cache. Default is `0` which is unlimited.
+ * `size`: The number of keys in the cache. Read-only.
+ * `checkInterval`: The interval to check for expired keys in milliseconds. Default is `0` which is disabled.
+ * `storeHashSize`: The number of `Map` objects to use for the cache. Default is `16`.
+ * `storeHashAlgorithm`: The hashing algorithm to use for the cache. Default is `djb2Hash`.
+ * `keys`: The keys in the cache. Read-only.
+ * `items`: The items in the cache as `CacheableStoreItem`, for example `{ key, value, expires? }`. Read-only.
+ * `store`: The hash store for the cache, which is an array of `Map` objects.
  * `checkExpired()`: Checks for expired keys in the cache. This is used by the `checkInterval` property.
  * `startIntervalCheck()`: Starts the interval check for expired keys if `checkInterval` is above 0 ms.
  * `stopIntervalCheck()`: Stops the interval check for expired keys.
- * `hash(object: any, algorithm = 'sha256'): string`: Hashes an object with the algorithm. Default is `sha256`.
 
  # Keyv Storage Adapter - KeyvCacheableMemory
 
@@ -470,9 +615,70 @@ console.log(wrappedFunction()); // error
  console.log(wrappedFunction()); // error from cache
  ```
 
+ If you would like to generate your own key for the wrapped function, set the `createKey` property in the `wrap()` options. This is useful if you want to generate a key based on the arguments of the function or any other criteria.
+
+ ```typescript
+ const cache = new Cacheable();
+ const options: WrapOptions = {
+   cache,
+   keyPrefix: 'test',
+   createKey: (function_, arguments_, options: WrapOptions) => `customKey:${options?.keyPrefix}:${arguments_[0]}`,
+ };
+
+ const wrapped = wrap((argument: string) => `Result for ${argument}`, options);
+
+ const result1 = await wrapped('arg1');
+ const result2 = await wrapped('arg1'); // Should hit the cache
+
+ console.log(result1); // Result for arg1
+ console.log(result2); // Result for arg1 (from cache)
+ ```
+
+ We pass in the `function` being wrapped, the `arguments` passed to it, and the `options` used to wrap it. You can use these to generate a custom key for the cache.
+
+ # Get Or Set Memoization Function
+
+ The `getOrSet` method provides a convenient way to implement the cache-aside pattern. It attempts to retrieve a value from the cache and, if the value is not found, calls the provided function to compute it, stores the result, and returns it. Here are the options:
+
+ ```typescript
+ export type GetOrSetFunctionOptions = {
+   ttl?: number | string;
+   cacheErrors?: boolean;
+   throwErrors?: boolean;
+ };
+ ```
+
+ Here is an example of how to use the `getOrSet` method:
+
+ ```javascript
+ import { Cacheable } from 'cacheable';
+ const cache = new Cacheable();
+ const function_ = async () => Math.random() * 100;
+ const value = await cache.getOrSet('randomValue', function_, { ttl: '1h' });
+ console.log(value); // e.g. 42.123456789
+ ```
+
+ You can also use a function to compute the cache key:
+
+ ```typescript
+ import { Cacheable, GetOrSetOptions } from 'cacheable';
+ const cache = new Cacheable();
+
+ // Function to generate a key based on options
+ const generateKey = (options?: GetOrSetOptions) => {
+   return `custom_key_:${options?.cacheId || 'default'}`;
+ };
+
+ const function_ = async () => Math.random() * 100;
+ const value = await cache.getOrSet(generateKey(), function_, { ttl: '1h' });
+ ```
+
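The `ttl` values used throughout these examples accept shorthand strings such as `'30m'` and `'1h'`. A sketch of converting such shorthand to milliseconds (the parsing rules here are an assumption for illustration, not the library's exact grammar):

```javascript
// Convert a ttl given as a number (already ms) or a shorthand string
// ('500ms', '30m', '1h', '2d', ...) into milliseconds.
// Illustrative sketch of the shorthand idea only.
function ttlToMs(ttl) {
  if (typeof ttl === 'number') return ttl;
  const match = /^(\d+(?:\.\d+)?)(ms|s|m|h|d)$/.exec(ttl);
  if (!match) throw new Error(`Unsupported ttl format: ${ttl}`);
  const unitMs = { ms: 1, s: 1000, m: 60 * 1000, h: 60 * 60 * 1000, d: 24 * 60 * 60 * 1000 };
  return Number(match[1]) * unitMs[match[2]];
}

console.log(ttlToMs('30m')); // 1800000
console.log(ttlToMs('1h')); // 3600000
```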
  # How to Contribute
 
  You can contribute by forking the repo and submitting a pull request. Please make sure to add tests and update the documentation. To learn more about how to contribute go to our main README [https://github.com/jaredwray/cacheable](https://github.com/jaredwray/cacheable). This will talk about how to `Open a Pull Request`, `Ask a Question`, or `Post an Issue`.
 
  # License and Copyright
- [MIT © Jared Wray](./LICENSE)
+ [MIT © Jared Wray](./LICENSE)