@63klabs/cache-data 1.3.3 → 1.3.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -8,7 +8,29 @@ Report all vulnerabilities under the [Security menu](https://github.com/63Klabs/
 
 > Note: This project is still in beta. Even though changes are tested and breaking changes are avoided, things may break.
 
- ## 1.3.3 (2025-09-14)
+ ## v1.3.5 (2025-01-13)
+
+ ### Enhancements
+
+ - `endpoint.getDataDirectFromURI()` has been renamed to `endpoint.get()` and the old name is now a deprecated alias. This was just to simplify naming.
+
+ Example Use:
+
+ ```javascript
+ const { endpoint } = require("@63klabs/cache-data");
+ const data = await endpoint.get({host: "api.example.com", path: "data"}, { parameters: {q: "Chicago" }});
+ ```
+
+ ## v1.3.4 (2025-01-12)
+
+ ### Fixes
+
+ - `Cache.CacheableDataAccess` - The Lambda function could end before writes to S3 completed on large objects, resulting in a cached object not being found. When caching an object, `CacheableDataAccess` now waits for all writes to complete.
+ - Updated dependencies, resolving the deprecated `lodash` dependency of `sinon`. This only affected `devDependencies`. [lodash.get deprecation warning (used by sinon) #214](https://github.com/63Klabs/cache-data/issues/214)
+ - **CI:** Use npm Trusted Publishing (OIDC) for `npm publish` and remove the `NPM_TOKEN` fallback to avoid classic token deprecation warnings; the package on npm must be configured to trust this workflow to publish.
+
+
+ ## v1.3.3 (2025-09-14)
 
 ### Enhancements
 
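The v1.3.4 fix above amounts to collecting every pending storage write into a task list and awaiting them all before the function resolves, so the Lambda runtime cannot end while a large S3 write is still in flight. A minimal sketch of that pattern (`slowWrite` is a hypothetical stand-in for the real S3/DynamoDb writes, not part of the package):

```javascript
// Sketch of the v1.3.4 fix: collect all storage writes and await them
// before returning, instead of firing them off without waiting.
const slowWrite = (store, ms) =>
  new Promise((resolve) => setTimeout(() => resolve(store), ms));

async function writeCache() {
  const taskList = [];
  taskList.push(slowWrite("s3", 30));       // large object body
  taskList.push(slowWrite("dynamodb", 10)); // pointer/metadata record
  return Promise.all(taskList);             // wait for ALL writes to finish
}
```

Without the `Promise.all`, the function could resolve (and the Lambda execution end) while the slower S3 write was still pending.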
package/README.md CHANGED
@@ -4,30 +4,35 @@ A package for AWS Lambda Node.js applications to access and cache data from remo
 
 > Note: This repository and package has moved from chadkluck to 63Klabs but is still managed by the same developer.
 
- [@63klabs/cache-data on npmjs.com](https://www.npmjs.com/package/@63klabs/cache-data)
+ - [@63klabs/cache-data on npmjs.com](https://www.npmjs.com/package/@63klabs/cache-data)
+ - [@63klabs/cache-data on GitHub](https://github.com/63Klabs/cache-data)
 
 ## Description
 
- For AWS Lambda functions written in Node.js that require caching of data either of an internal process or external data source such as remote API endpoints. While out of the box it can fetch data from remote endpoint APIs, custom Data Access Objects can be written to provide caching of data from all sorts of sources including resource expensive database calls.
+ A distributed, serverless data caching solution for AWS Lambda Node.js functions. Cache data from internal processes or external data sources, such as remote API endpoints, to be shared among concurrent executions. Uses DynamoDb and S3 for caching.
 
 It has several utility functions such as `DebugAndLog`, `Timer`, and SSM Parameter Store loaders.
 
 It can be used in place of Express.js for simple web service applications as it also includes functions for handling and validating requests, routing, and client request logging.
 
- This package has been used in production for web service applications receiving over 1 million requests per week with a 75% cache-hit rate lowering latency to less than 100ms in most cases. This is a considerable improvement when faced with resource intense processes, connection pools, API rate limits, and slow endpoints.
+ This package has been used in production environments for web service applications receiving over 1 million requests per week with a 75% cache-hit rate, lowering latency to less than 100ms in most cases. This is a considerable improvement when faced with resource-intensive processes, connection pools, API rate limits, and slow endpoints.
 
 ## Getting Started
 
 ### Requirements
 
- - Node >18 runtime on Lambda
+ - Node >22 runtime on Lambda
 - AWS Lambda, S3 bucket, DynamoDb table, and SSM Parameter Store
 - A basic understanding of CloudFormation, Lambda, S3, DynamoDb, and SSM Parameters
 - A basic understanding of IAM policies, especially the Lambda Execution Role, that will allow Lambda to access S3, DynamoDb, and SSM Parameter Store
- - Lambda function should have between 512MB and 1024MB of memory allocated. (256MB minimum). See [Lambda Optimization: Memory Allocation](./docs/lambda-optimization/README.md#lambda-memory-allocation).
+ - Lambda function should have between 512MB and 2048MB of memory allocated (>1024MB recommended). See [Lambda Optimization: Memory Allocation](./docs/lambda-optimization/README.md#lambda-memory-allocation).
 
 ### Installing
 
+ The simplest way to get started is to use the [63klabs Atlantis Templates and Script platform](https://github.com/63Klabs/atlantis-cfn-configuration-repo-for-serverless-deployments) to deploy this and other ready-to-run solutions via CI/CD.
+
+ However, if you want to write your own templates and code, follow these steps:
+
 1. Generate Secret Key to Encrypt Cache:
    - Use the [key generation script](./docs/00-example-implementation/generate-put-ssm.py) during [the build](./docs/00-example-implementation/example-buildspec.yml) to establish a key to encrypt your data.
 2. Lambda CloudFormation Template:
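For context, the custom data-access function passed to `cache.CacheableDataAccess.getData()` (shown in a usage comment later in this diff) is a plain async function that fetches from the origin. A self-contained sketch of that shape, with the remote call stubbed out so it runs anywhere (`fakeRemoteCall` and the parameter names are illustrative only; consult the package documentation for the exact contract):

```javascript
// Illustrative stand-in for a remote endpoint call (NOT part of the package)
const fakeRemoteCall = async (conn, query) => ({
  statusCode: 200,
  body: JSON.stringify({ host: conn.host, q: query.q }),
});

// A data-access function of the general shape CacheableDataAccess accepts:
// it receives connection info and a query, and returns the data to cache.
async function yourFetchFunction(conn, daoQuery) {
  const response = await fakeRemoteCall(conn, daoQuery);
  return JSON.parse(response.body);
}
```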
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
 "name": "@63klabs/cache-data",
- "version": "1.3.3",
+ "version": "1.3.5",
 "description": "Cache data from an API endpoint or application process using AWS S3 and DynamoDb",
 "author": "Chad Leigh Kluck (https://chadkluck.me)",
 "license": "MIT",
@@ -13,10 +13,10 @@
 "test": "test"
 },
 "engines": {
- "node": ">=18.0.0"
+ "node": ">=20.0.0"
 },
 "dependencies": {
- "aws-xray-sdk-core": "^3.6.0",
+ "aws-xray-sdk-core": "^3.12.0",
 "moment-timezone": "^0.6.0",
 "object-hash": "^3.0.0"
 },
@@ -27,8 +27,8 @@
 "@aws-sdk/lib-dynamodb": "3.x",
 "chai": "^6.0.1",
 "chai-http": "^5.1.2",
- "mocha": "^11.7.1",
- "sinon": "^21.0.0"
+ "mocha": "^11.7.5",
+ "sinon": "^21.0.1"
 },
 "scripts": {
 "test": "mocha 'test/**/*-tests.mjs'",
@@ -118,7 +118,7 @@ class S3Cache {
 */
 static async read (idHash) {
 
- return new Promise(async (resolve, reject) => {
+ return new Promise(async (resolve) => {
 
 const objKey = `${S3Cache.getPath()}${idHash}.json`;
 const objFullLocation = `${S3Cache.getBucket()}/${objKey}`;
@@ -145,8 +145,8 @@ class S3Cache {
 resolve(item);
 
 } catch (error) {
- tools.DebugAndLog.error(`Error getting object from S3 (${objFullLocation}): ${error.message}`, error.stack);
- reject(item);
+ tools.DebugAndLog.error(`Error getting object from S3 (${objFullLocation}): ${error?.message || 'Unknown error'}`, error?.stack);
+ resolve(item);
 }
 
 });
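The recurring change in this and the following hunks replaces `reject(...)` with `resolve(...)` of a fallback value, and guards error access with optional chaining, so a storage failure degrades into a cache miss instead of an unhandled rejection. A minimal sketch of that pattern (`readFromStore` is a hypothetical stand-in, not the package's API):

```javascript
// Sketch of the 1.3.5 error-handling pattern: on failure, log and
// return a safe fallback rather than throwing/rejecting, so callers
// treat a storage error exactly like "not in cache".
async function readCacheEntry(readFromStore, idHash) {
  let item = null;
  try {
    item = await readFromStore(idHash);
  } catch (error) {
    // optional chaining guards against non-Error throw values
    console.error(`Cache read failed (${idHash}): ${error?.message || "Unknown error"}`);
  }
  return item; // null signals "not cached"; this function never throws
}
```

Callers then always `await` a usable value and fall back to the origin when it is `null`.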
@@ -165,7 +165,7 @@ class S3Cache {
 const objFullLocation = `${S3Cache.getBucket()}/${objKey}`;
 tools.DebugAndLog.debug(`Putting object to S3: ${objFullLocation}`);
 
- return new Promise( async (resolve, reject) => {
+ return new Promise( async (resolve) => {
 
 try {
 const params = {
@@ -182,8 +182,8 @@ class S3Cache {
 resolve(true);
 
 } catch (error) {
- tools.DebugAndLog.error(`Error putting object to S3. [E2] (${objFullLocation}) ${error.message}`, error.stack);
- reject(false)
+ tools.DebugAndLog.error(`Error putting object to S3. [E2] (${objFullLocation}) ${error?.message || 'Unknown error'}`, error?.stack);
+ resolve(false)
 };
 });
 
@@ -235,7 +235,7 @@ class DynamoDbCache {
 */
 static async read (idHash) {
 
- return new Promise(async (resolve, reject) => {
+ return new Promise(async (resolve) => {
 
 tools.DebugAndLog.debug(`Getting record from DynamoDb for id_hash: ${idHash}`)
 let result = {};
@@ -258,11 +258,11 @@ class DynamoDbCache {
 
 tools.DebugAndLog.debug(`Query success from DynamoDb for id_hash: ${idHash}`);
 
- resolve(result);
 } catch (error) {
- tools.DebugAndLog.error(`Unable to perform DynamoDb query. (${idHash}) ${error.message}`, error.stack);
- reject(result);
- };
+ tools.DebugAndLog.error(`Unable to perform DynamoDb query. (${idHash}) ${error?.message || 'Unknown error'}`, error?.stack);
+ } finally {
+ resolve(result);
+ }
 
 });
 
@@ -275,7 +275,7 @@ class DynamoDbCache {
 */
 static async write (item) {
 
- return new Promise( async (resolve, reject) => {
+ return new Promise( async (resolve) => {
 
 try {
 
@@ -293,9 +293,9 @@ class DynamoDbCache {
 resolve(true);
 
 } catch (error) {
- tools.DebugAndLog.error(`Write to DynamoDb failed for id_hash: ${item.id_hash} ${error.message}`, error.stack);
- reject(false)
- };
+ tools.DebugAndLog.error(`Write to DynamoDb failed for id_hash: ${item.id_hash} ${error?.message || 'Unknown error'}`, error?.stack);
+ resolve(false)
+ }
 });
 
 };
@@ -405,7 +405,7 @@ class CacheData {
 * @returns {Promise<boolean>}
 */
 static async prime() {
- return new Promise(async (resolve, reject) => {
+ return new Promise(async (resolve) => {
 try {
 let primeTasks = [];
 
@@ -417,8 +417,8 @@ class CacheData {
 
 resolve(true);
 } catch (error) {
- tools.DebugAndLog.error(`CacheData.prime() failed ${error.message}`, error.stack);
- reject(false);
+ tools.DebugAndLog.error(`CacheData.prime() failed ${error?.message || 'Unknown error'}`, error?.stack);
+ resolve(false);
 }
 });
 }
@@ -498,55 +498,52 @@ class CacheData {
 */
 static async _process(idHash, item, syncedNow, syncedLater) {
 
- return new Promise(async (resolve, reject) => {
-
- try {
-
- // Is this a pointer to data in S3?
- if ("data" in item && "info" in item.data && "objInS3" in item.data.info && item.data.info.objInS3 === true) {
- tools.DebugAndLog.debug(`Item is in S3. Fetching... (${idHash})`);
- item = await S3Cache.read(idHash); // The data is stored in S3 so get it
- tools.DebugAndLog.debug(`Item returned from S3 replaces pointer to S3 (${idHash})`, item);
- // NOTE: if this fails and returns null it will be handled as any item === null which is to say that body will be null
- }
+ try {
 
- let body = null;
- let headers = null;
- let expires = syncedLater;
- let statusCode = null;
-
- if (item !== null) {
- tools.DebugAndLog.debug(`Process data from cache (${idHash})`);
- body = item.data.body; // set the cached body data (this is what we will be the body of the response)
-
- headers = item.data.headers;
- expires = item.expires;
- statusCode = item.data.statusCode;
-
- // if the body is encrypted (because classification is private) decrypt it
- if ( item.data.info.classification === CacheData.PRIVATE ) {
- try {
- tools.DebugAndLog.debug(`Policy for (${idHash}) data is classified as PRIVATE. Decrypting body...`);
- await CacheData.prime();
- body = this._decrypt(body);
- } catch (error) {
- // Decryption failed
- body = null;
- expires = syncedNow;
- headers = null;
- statusCode = "500";
- tools.DebugAndLog.error(`Unable to decrypt cache. Ignoring it. (${idHash}) ${error.message}`, error.stack);
- }
- }
- }
-
- resolve({ body: body, headers: headers, expires: expires, statusCode: statusCode });
- } catch (error) {
- tools.DebugAndLog.error(`Error getting data from cache. (${idHash}) ${error.message}`, error.stack);
- reject( {body: null, expires: syncedNow, headers: null, statusCode: "500"} );
+ // Is this a pointer to data in S3?
+ if ("data" in item && "info" in item.data && "objInS3" in item.data.info && item.data.info.objInS3 === true) {
+ tools.DebugAndLog.debug(`Item is in S3. Fetching... (${idHash})`);
+ item = await S3Cache.read(idHash); // The data is stored in S3 so get it
+ tools.DebugAndLog.debug(`Item returned from S3 replaces pointer to S3 (${idHash})`, item);
+ // NOTE: if this fails and returns null it will be handled as any item === null which is to say that body will be null
 }
+
+ let body = null;
+ let headers = null;
+ let expires = syncedLater;
+ let statusCode = null;
+
+ if (item !== null) {
+ tools.DebugAndLog.debug(`Process data from cache (${idHash})`);
+ body = item.data.body; // set the cached body data (this is what we will be the body of the response)
+
+ headers = item.data.headers;
+ expires = item.expires;
+ statusCode = item.data.statusCode;
 
- });
+ // if the body is encrypted (because classification is private) decrypt it
+ if ( item.data.info.classification === CacheData.PRIVATE ) {
+ try {
+ tools.DebugAndLog.debug(`Policy for (${idHash}) data is classified as PRIVATE. Decrypting body...`);
+ await CacheData.prime();
+ body = this._decrypt(body);
+ } catch (error) {
+ // Decryption failed
+ body = null;
+ expires = syncedNow;
+ headers = null;
+ statusCode = "500";
+ tools.DebugAndLog.error(`Unable to decrypt cache. Ignoring it. (${idHash}) ${error?.message || 'Unknown error'}`, error?.stack);
+ }
+ }
+ }
+
+ return { body: body, headers: headers, expires: expires, statusCode: statusCode };
+ } catch (error) {
+ tools.DebugAndLog.error(`Error getting data from cache. (${idHash}) ${error?.message || 'Unknown error'}`, error?.stack);
+ return { body: null, expires: syncedNow, headers: null, statusCode: "500" };
+ }
+
 };
 
 /**
@@ -557,7 +554,7 @@ class CacheData {
 */
 static async read(idHash, syncedLater) {
 
- return new Promise(async (resolve, reject) => {
+ return new Promise(async (resolve) => {
 
 let cache = this.format(syncedLater);
 
@@ -577,10 +574,10 @@ class CacheData {
 tools.DebugAndLog.debug(`No cache found for ${idHash}`);
 }
 
- resolve(cache);
 } catch (error) {
- tools.DebugAndLog.error(`CacheData.read(${idHash}) failed ${error.message}`, error.stack);
- reject(cache);
+ tools.DebugAndLog.error(`CacheData.read(${idHash}) failed ${error?.message || 'Unknown error'}`, error?.stack);
+ } finally {
+ resolve(cache);
 };
 
 });
@@ -597,106 +594,113 @@ class CacheData {
 * @param {number} expires
 * @param {number} statusCode
 * @param {boolean} encrypt
- * @returns {CacheDataFormat}
+ * @returns {Promise<CacheDataFormat>}
 */
- static write (idHash, syncedNow, body, headers, host, path, expires, statusCode, encrypt = true) {
+ static async write (idHash, syncedNow, body, headers, host, path, expires, statusCode, encrypt = true) {
 
 let cacheData = null;
 
- try {
-
- tools.DebugAndLog.debug(`Updating Cache for ${idHash} now:${syncedNow} | host:${host} | path:${path} | expires:${expires} | statusCode:${statusCode} | encrypt:${encrypt} ... `);
+ return new Promise(async (resolve) => {
 
- if( isNaN(expires) || expires < syncedNow ) {
- expires = syncedNow + 300;
- }
+ const taskList = [];
 
- // lowercase all headers
- headers = CacheData.lowerCaseKeys(headers);
+ try {
 
- // set etag
- if ( !("etag" in headers) ) {
- headers.etag = CacheData.generateEtag(idHash, body);
- }
+ tools.DebugAndLog.debug(`Updating Cache for ${idHash} now:${syncedNow} | host:${host} | path:${path} | expires:${expires} | statusCode:${statusCode} | encrypt:${encrypt} ... `);
 
- // set last modified
- if ( !("last-modified" in headers) ) {
- headers['last-modified'] = CacheData.generateInternetFormattedDate(syncedNow);
- }
+ if( isNaN(expires) || expires < syncedNow ) {
+ expires = syncedNow + 300;
+ }
 
- // set expires in header
- if ( !("expires" in headers) ) {
- headers['expires'] = CacheData.generateInternetFormattedDate(expires);
- }
-
- cacheData = CacheData.format(expires, body, headers, statusCode);
+ // lowercase all headers
+ headers = CacheData.lowerCaseKeys(headers);
 
- const bodySize_kb = this.calculateKBytes(body);
- let bodyToStore = body;
+ // set etag
+ if ( !("etag" in headers) ) {
+ headers.etag = CacheData.generateEtag(idHash, body);
+ }
 
- // if the endpoint policy is classified as private, encrypt
- if ( encrypt ) {
- tools.DebugAndLog.debug(`Policy for (${idHash}) data is classified as PRIVATE. Encrypting body...`);
- bodyToStore = this._encrypt(body);
- }
+ // set last modified
+ if ( !("last-modified" in headers) ) {
+ headers['last-modified'] = CacheData.generateInternetFormattedDate(syncedNow);
+ }
 
- // create the (preliminary) cache record
- let item = {
- id_hash: idHash,
- expires: expires,
- purge_ts: (syncedNow + (this.#purgeExpiredCacheEntriesAfterXHours * 3600)),
- data: {
- info: {
- expires: headers.expires,
- host: host,
- path: path,
- classification: (encrypt ? CacheData.PRIVATE : CacheData.PUBLIC),
- size_kb: bodySize_kb,
- objInS3: false
- },
- headers: headers,
- body: bodyToStore,
- statusCode: statusCode
+ // set expires in header
+ if ( !("expires" in headers) ) {
+ headers['expires'] = CacheData.generateInternetFormattedDate(expires);
 }
- };
+
+ cacheData = CacheData.format(expires, body, headers, statusCode);
 
- /*
- DynamoDb has a limit of 400KB per item so we want to make sure
- the Item does not take up that much space. Also, we want
- DynamoDb to run efficiently so it is best to only store smaller
- items there and move larger items into S3.
+ const bodySize_kb = this.calculateKBytes(body);
+ let bodyToStore = body;
 
- Any items larger than the max size we set will be stored over
- in S3.
+ // if the endpoint policy is classified as private, encrypt
+ if ( encrypt ) {
+ tools.DebugAndLog.debug(`Policy for (${idHash}) data is classified as PRIVATE. Encrypting body...`);
+ bodyToStore = this._encrypt(body);
+ }
 
- What is the max size? It can be set in the Lambda Environment
- Variables and discovering the proper balance will take some trials.
- We don't want to constantly be calling S3, but we also don't want
- to make DynamoDb too heavy either.
+ // create the (preliminary) cache record
+ let item = {
+ id_hash: idHash,
+ expires: expires,
+ purge_ts: (syncedNow + (this.#purgeExpiredCacheEntriesAfterXHours * 3600)),
+ data: {
+ info: {
+ expires: headers.expires,
+ host: host,
+ path: path,
+ classification: (encrypt ? CacheData.PRIVATE : CacheData.PUBLIC),
+ size_kb: bodySize_kb,
+ objInS3: false
+ },
+ headers: headers,
+ body: bodyToStore,
+ statusCode: statusCode
+ }
+ };
 
- (In summary: Max Item size in DynamoDb is 400KB, and storing too many large
- items can have a performance impact. However constantly calling
- S3 also will have a performance impact.)
- */
+ /*
+ DynamoDb has a limit of 400KB per item so we want to make sure
+ the Item does not take up that much space. Also, we want
+ DynamoDb to run efficiently so it is best to only store smaller
+ items there and move larger items into S3.
+
+ Any items larger than the max size we set will be stored over
+ in S3.
+
+ What is the max size? It can be set in the Lambda Environment
+ Variables and discovering the proper balance will take some trials.
+ We don't want to constantly be calling S3, but we also don't want
+ to make DynamoDb too heavy either.
+
+ (In summary: Max Item size in DynamoDb is 400KB, and storing too many large
+ items can have a performance impact. However constantly calling
+ S3 also will have a performance impact.)
+ */
+
+ // do the size check
+ if (bodySize_kb > this.#dynamoDbMaxCacheSize_kb) {
+ // over max size limit set in Lambda Environment Variables
+ taskList.push(S3Cache.write(idHash, JSON.stringify(item) )); // ADD_PROMISE_COLLECTION_HERE
+ // update the Item we will pass to DynamoDb
+ let preview = (typeof item.data.body === "string") ? item.data.body.slice(0,100)+"..." : "[---ENCRYPTED---]";
+ item.data.body = "ID: "+idHash+" PREVIEW: "+preview;
+ item.data.info.objInS3 = true;
+ }
+
+ taskList.push(DynamoDbCache.write(item)); // ADD_PROMISE_COLLECTION_HERE
 
- // do the size check
- if (bodySize_kb > this.#dynamoDbMaxCacheSize_kb) {
- // over max size limit set in Lambda Environment Variables
- S3Cache.write(idHash, JSON.stringify(item) );
- // update the Item we will pass to DynamoDb
- let preview = (typeof item.data.body === "string") ? item.data.body.slice(0,100)+"..." : "[---ENCRYPTED---]";
- item.data.body = "ID: "+idHash+" PREVIEW: "+preview;
- item.data.info.objInS3 = true;
+ } catch (error) {
+ tools.DebugAndLog.error(`CacheData.write for ${idHash} FAILED now:${syncedNow} | host:${host} | path:${path} | expires:${expires} | statusCode:${statusCode} | encrypt:${encrypt} failed. ${error?.message || 'Unknown error'}`, error?.stack);
+ cacheData = CacheData.format(0);
+ } finally {
+ await Promise.all(taskList);
+ resolve(cacheData); // ADD_PROMISE_COLLECTION_HERE
 }
-
- DynamoDbCache.write(item); // we don't wait for a response
-
- } catch (error) {
- tools.DebugAndLog.error(`CacheData.write for ${idHash} FAILED now:${syncedNow} | host:${host} | path:${path} | expires:${expires} | statusCode:${statusCode} | encrypt:${encrypt} failed. ${error.message}`, error.stack);
- cacheData = CacheData.format(0);
- };
-
- return cacheData;
+ });
+
 
 };
 
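The size check added in the hunk above routes oversized bodies to S3 and leaves only a preview pointer in the DynamoDb item, since DynamoDb caps items at 400KB. A self-contained sketch of that decision (the threshold, helper, and return shape are illustrative, not the package's API):

```javascript
// Sketch of the DynamoDb-vs-S3 routing: bodies over the threshold go to
// S3 and the DynamoDb record keeps only a short preview pointer.
function routeCacheItem(idHash, body, maxKb) {
  const sizeKb = Buffer.byteLength(String(body)) / 1024;
  const item = { id_hash: idHash, body: body, objInS3: false };
  const writes = [];
  if (sizeKb > maxKb) {
    writes.push("s3"); // full serialized item goes to S3
    const preview = (typeof body === "string") ? body.slice(0, 100) + "..." : "[---ENCRYPTED---]";
    item.body = "ID: " + idHash + " PREVIEW: " + preview; // pointer only
    item.objInS3 = true;
  }
  writes.push("dynamodb"); // the DynamoDb record is always written
  return { item: item, writes: writes };
}
```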
@@ -764,7 +768,7 @@ class CacheData {
 }
 
 } catch (error) {
- tools.DebugAndLog.error(`CacheData.getSecureDataKey() failed ${error.message}`, error.stack);
+ tools.DebugAndLog.error(`CacheData.getSecureDataKey() failed ${error?.message || 'Unknown error'}`, error?.stack);
 }
 
 return buff;
@@ -873,12 +877,12 @@ class CacheData {
 */
 static lowerCaseKeys (objectWithKeys) {
 let objectWithLowerCaseKeys = {};
- if ( objectWithKeys !== null ) {
+ if ( objectWithKeys !== null && objectWithKeys !== undefined && typeof objectWithKeys === 'object' ) {
 let keys = Object.keys(objectWithKeys);
 // move each value from objectWithKeys to objectWithLowerCaseKeys
 keys.forEach( function( k ) {
 objectWithLowerCaseKeys[k.toLowerCase()] = objectWithKeys[k];
- });
+ });
 }
 return objectWithLowerCaseKeys;
 }
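The hardened guard in the hunk above makes `lowerCaseKeys()` tolerate `null`, `undefined`, and non-object input instead of throwing on `Object.keys()`. A standalone sketch mirroring that behavior:

```javascript
// Mirrors the hardened guard: only object inputs are processed;
// null, undefined, and primitives yield an empty object.
function lowerCaseKeys(objectWithKeys) {
  const objectWithLowerCaseKeys = {};
  if (objectWithKeys !== null && objectWithKeys !== undefined && typeof objectWithKeys === "object") {
    for (const k of Object.keys(objectWithKeys)) {
      objectWithLowerCaseKeys[k.toLowerCase()] = objectWithKeys[k];
    }
  }
  return objectWithLowerCaseKeys;
}
```

Note that the `typeof` check alone would not catch `null` (whose `typeof` is `"object"`), which is why the explicit `!== null` comparison is kept.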
@@ -995,9 +999,16 @@ class CacheData {
 *
 * Cache.init({parameters});
 *
- * Then you can create new objects:
+ * Then you can make a request, sending it through CacheableDataAccess:
+ *
+ * const { cache } = require("@63klabs/cache-data");
 *
- * const cacheObject = new Cache({id}, {parameters});
+ * const cacheObj = await cache.CacheableDataAccess.getData(
+ * cacheCfg,
+ * yourFetchFunction,
+ * conn,
+ * daoQuery
+ * );
 */
 class Cache {
 
@@ -1355,7 +1366,7 @@ class Cache {
 try {
 timestampInSeconds = CacheData.convertTimestampFromMilliToSeconds( Date.parse(date) );
 } catch (error) {
- tools.DebugAndLog.error(`Cannot parse date/time: ${date} ${error.message}`, error.stack);
+ tools.DebugAndLog.error(`Cannot parse date/time: ${date} ${error?.message || 'Unknown error'}`, error?.stack);
 }
 return timestampInSeconds;
 };
@@ -1446,7 +1457,7 @@ class Cache {
 */
 async read () {
 
- return new Promise(async (resolve, reject) => {
+ return new Promise(async (resolve) => {
 
 if ( this.#store !== null ) {
 resolve(this.#store);
@@ -1462,9 +1473,9 @@ class Cache {
 this.#store = CacheData.format(this.#syncedLaterTimestampInSeconds);
 this.#status = Cache.STATUS_CACHE_ERROR;
 
- tools.DebugAndLog.error(`Cache Read: Cannot read cached data for ${this.#idHash}: ${error.message}`, error.stack);
+ tools.DebugAndLog.error(`Cache Read: Cannot read cached data for ${this.#idHash}: ${error?.message || 'Unknown error'}`, error?.stack);
 
- reject(this.#store);
+ resolve(this.#store);
 };
 }
 
@@ -1630,7 +1641,7 @@ class Cache {
 try {
 bodyToReturn = (body !== null && parseBody) ? JSON.parse(body) : body;
 } catch (error) {
- tools.DebugAndLog.error(`Cache.getBody() parse error: ${error.message}`, error.stack);
+ tools.DebugAndLog.error(`Cache.getBody() parse error: ${error?.message || 'Unknown error'}`, error?.stack);
 tools.DebugAndLog.debug("Error parsing body", body);
 };
 
@@ -1756,44 +1767,56 @@ class Cache {
 * @param {string} reason Reason for extending, either Cache.STATUS_ORIGINAL_ERROR or Cache.STATUS_ORIGINAL_NOT_MODIFIED
 * @param {number} seconds
 * @param {number} errorCode
+ * @returns {Promise<boolean>}
 */
- extendExpires(reason, seconds = 0, errorCode = 0) {
+ async extendExpires(reason, seconds = 0, errorCode = 0) {
 
- try {
-
- let cache = this.#store.cache;
+ let returnStatus = false;
 
- // we will extend based on error extention if in error, we'll look at passed seconds and non-error default later
- if (seconds === 0 && reason === Cache.STATUS_ORIGINAL_ERROR) {
- seconds = this.#defaultExpirationExtensionOnErrorInSeconds;
- }
+ return new Promise(async (resolve) => {
 
- // if the cache exists, we'll extend it
- if ( cache !== null ) {
- // statusCode
- let statusCode = (cache.statusCode !== null) ? cache.statusCode : errorCode ;
+ try {
+
+ let cache = this.#store.cache;
 
- // we are going to create a new expires header, so delete it if it exists so we start from now()
- if (cache.headers !== null && "expires" in cache.headers) { delete cache.headers.expires; }
+ // we will extend based on error extention if in error, we'll look at passed seconds and non-error default later
+ if (seconds === 0 && reason === Cache.STATUS_ORIGINAL_ERROR) {
+ seconds = this.#defaultExpirationExtensionOnErrorInSeconds;
+ }
 
- // calculate the new expires based on default (seconds === 0) or now() + seconds passed to this function
- let expires = (seconds === 0) ? this.calculateDefaultExpires() : this.#syncedNowTimestampInSeconds + seconds;
+ // if the cache exists, we'll extend it
+ if ( cache !== null ) {
+ // statusCode
+ let statusCode = (cache.statusCode !== null) ? cache.statusCode : errorCode ;
 
- // if a reason was passed, use it only if it is a valid reason for extending. Otherwise null
- let status = (reason === Cache.STATUS_ORIGINAL_ERROR || reason === Cache.STATUS_ORIGINAL_NOT_MODIFIED) ? reason : null;
+ // we are going to create a new expires header, so delete it if it exists so we start from now()
+ if (cache.headers !== null && "expires" in cache.headers) { delete cache.headers.expires; }
 
- // if we received an error, add it in in case we want to evaluate further
- if (errorCode >= 400) { this.#errorCode = errorCode; }
+ // calculate the new expires based on default (seconds === 0) or now() + seconds passed to this function
+ let expires = (seconds === 0) ? this.calculateDefaultExpires() : this.#syncedNowTimestampInSeconds + seconds;
 
- // perform the update with existing info, but new expires and status
- this.update( cache.body, cache.headers, statusCode, expires, status);
- } else {
- tools.DebugAndLog.debug("Cache is null. Nothing to extend.");
- }
+ // if a reason was passed, use it only if it is a valid reason for extending. Otherwise null
+ let status = (reason === Cache.STATUS_ORIGINAL_ERROR || reason === Cache.STATUS_ORIGINAL_NOT_MODIFIED) ? reason : null;
+
+ // if we received an error, add it in in case we want to evaluate further
+ if (errorCode >= 400) { this.#errorCode = errorCode; }
+
+ // perform the update with existing info, but new expires and status
+ await this.update( cache.body, cache.headers, statusCode, expires, status);
+ } else {
+ tools.DebugAndLog.debug("Cache is null. Nothing to extend.");
+ }
+
+ returnStatus = true;
+
+ } catch (error) {
+ tools.DebugAndLog.error(`Unable to extend cache: ${error?.message || 'Unknown error'}`, error?.stack);
+ } finally {
+ resolve(returnStatus);
+ };
+
+ });
 
- } catch (error) {
- tools.DebugAndLog.error(`Unable to extend cache: ${error.message}`, error.stack);
- };
 
 };
 
@@ -1819,119 +1842,121 @@ class Cache {
  * @param {object} headers Any headers you want to pass along, including last-modified, etag, and expires. Note that if expires is included as a header here, then it will override the expires paramter passed to .update()
  * @param {number} statusCode Status code of original request
  * @param {number} expires Expiration unix timestamp in seconds
- * @returns {CacheDataFormat} Representation of data stored in cache
+ * @returns {Promise<CacheDataFormat>} Representation of data stored in cache
  */
- update (body, headers, statusCode = 200, expires = 0, status = null) {
-
- const prev = {
- eTag: this.getETag(),
- modified: this.getLastModified(),
- expired: this.isExpired(),
- empty: this.isEmpty()
- };
-
- // lowercase all the header keys so we can evaluate each
- headers = Cache.lowerCaseKeys(headers);
+ async update (body, headers, statusCode = 200, expires = 0, status = null) {
 
- /* Bring in headers
- We'll keep the etag and last-modified. Also any specified
- */
- let defaultHeadersToRetain = [
- "content-type",
- "etag",
- "last-modified",
- "ratelimit-limit",
- "ratelimit-remaining",
- "ratelimit-reset",
- "x-ratelimit-limit",
- "x-ratelimit-remaining",
- "x-ratelimit-reset",
- "retry-after"
- ];
-
- // combine the standard headers with the headers specified for endpoint in custom/policies.json
- let ptHeaders = [].concat(this.#headersToRetain, defaultHeadersToRetain);
-
- // lowercase the headers we are looking for
- let passThrough = ptHeaders.map(element => {
- return element.toLowerCase();
- });
-
- let headersForCache = {};
+ return new Promise(async (resolve) => {
+
+ const prev = {
+ eTag: this.getETag(),
+ modified: this.getLastModified(),
+ expired: this.isExpired(),
+ empty: this.isEmpty()
+ };
 
- // retain specified headers
- passThrough.forEach(function( key ) {
- if (key in headers) { headersForCache[key] = headers[key]; }
- });
+ // lowercase all the header keys so we can evaluate each
+ headers = Cache.lowerCaseKeys(headers);
 
- // we'll set the default expires, in case the expires in header does not work out, or we don't use the header expires
- if ( isNaN(expires) || expires === 0) {
- expires = this.calculateDefaultExpires();
- }
-
- // get the expires and max age (as timestamp)from headers if we don't insist on overriding
- // unlike etag and last-modified, we won't move them over, but let the expires param in .update() do the talking
- if ( !this.#overrideOriginHeaderExpiration && ("expires" in headers || ("cache-control" in headers && headers['cache-control'].includes("max-age") ))) {
-
- let age = this.#syncedNowTimestampInSeconds;
- let exp = this.#syncedNowTimestampInSeconds;
-
- if ("cache-control" in headers && headers['cache-control'].includes("max-age")) {
- // extract max-age
- let cacheControl = headers['cache-control'].split(",");
- for(const p of cacheControl) {
- if(p.trim().startsWith("max-age")) {
- let maxage = parseInt(p.trim().split("=")[1], 10);
- age = this.#syncedNowTimestampInSeconds + maxage; // convert to timestamp
- break; // break out of for
+ /* Bring in headers
+ We'll keep the etag and last-modified. Also any specified
+ */
+ let defaultHeadersToRetain = [
+ "content-type",
+ "etag",
+ "last-modified",
+ "ratelimit-limit",
+ "ratelimit-remaining",
+ "ratelimit-reset",
+ "x-ratelimit-limit",
+ "x-ratelimit-remaining",
+ "x-ratelimit-reset",
+ "retry-after"
+ ];
+
+ // combine the standard headers with the headers specified for endpoint in custom/policies.json
+ let ptHeaders = [].concat(this.#headersToRetain, defaultHeadersToRetain);
+
+ // lowercase the headers we are looking for
+ let passThrough = ptHeaders.map(element => {
+ return element.toLowerCase();
+ });
+
+ let headersForCache = {};
+
+ // retain specified headers
+ passThrough.forEach(function( key ) {
+ if (key in headers) { headersForCache[key] = headers[key]; }
+ });
+
+ // we'll set the default expires, in case the expires in header does not work out, or we don't use the header expires
+ if ( isNaN(expires) || expires === 0) {
+ expires = this.calculateDefaultExpires();
+ }
+
+ // get the expires and max age (as timestamp)from headers if we don't insist on overriding
+ // unlike etag and last-modified, we won't move them over, but let the expires param in .update() do the talking
+ if ( !this.#overrideOriginHeaderExpiration && ("expires" in headers || ("cache-control" in headers && headers['cache-control'].includes("max-age") ))) {
+
+ let age = this.#syncedNowTimestampInSeconds;
+ let exp = this.#syncedNowTimestampInSeconds;
+
+ if ("cache-control" in headers && headers['cache-control'].includes("max-age")) {
+ // extract max-age
+ let cacheControl = headers['cache-control'].split(",");
+ for(const p of cacheControl) {
+ if(p.trim().startsWith("max-age")) {
+ let maxage = parseInt(p.trim().split("=")[1], 10);
+ age = this.#syncedNowTimestampInSeconds + maxage; // convert to timestamp
+ break; // break out of for
+ }
  }
  }
- }
-
- if ("expires" in headers) {
- exp = Cache.parseToSeconds(headers.expires);
- }
 
- // we will take the greater of max-age or expires, and if they are not 0 and not past, use it as expTimestamp
- let max = ( exp > age ) ? exp : age;
- if ( max !== 0 && expires > this.#syncedNowTimestampInSeconds) { expires = max; }
+ if ("expires" in headers) {
+ exp = Cache.parseToSeconds(headers.expires);
+ }
 
- }
+ // we will take the greater of max-age or expires, and if they are not 0 and not past, use it as expTimestamp
+ let max = ( exp > age ) ? exp : age;
+ if ( max !== 0 && expires > this.#syncedNowTimestampInSeconds) { expires = max; }
 
- /* Write to Cache
- We are now ready to write to the cache
- */
- try {
- this.#store = CacheData.write(this.#idHash, this.#syncedNowTimestampInSeconds, body, headersForCache, this.#hostId, this.#pathId, expires, statusCode, this.#encrypt);
-
- if (status === null) {
- if (prev.empty) {
- status = Cache.STATUS_NO_CACHE;
- } else if (this.getETag() === prev.eTag || this.getLastModified() === prev.modified) {
- status = Cache.STATUS_CACHE_SAME;
- } else if (prev.expired) {
- status = Cache.STATUS_EXPIRED;
- } else {
- status = Cache.STATUS_FORCED;
- }
  }
 
- this.#status = status;
+ /* Write to Cache
+ We are now ready to write to the cache
+ */
+ try {
+ this.#store = await CacheData.write(this.#idHash, this.#syncedNowTimestampInSeconds, body, headersForCache, this.#hostId, this.#pathId, expires, statusCode, this.#encrypt);
+
+ if (status === null) {
+ if (prev.empty) {
+ status = Cache.STATUS_NO_CACHE;
+ } else if (this.getETag() === prev.eTag || this.getLastModified() === prev.modified) {
+ status = Cache.STATUS_CACHE_SAME;
+ } else if (prev.expired) {
+ status = Cache.STATUS_EXPIRED;
+ } else {
+ status = Cache.STATUS_FORCED;
+ }
+ }
 
- tools.DebugAndLog.debug("Cache Updated "+this.getStatus()+": "+this.#idHash);
-
- } catch (error) {
- tools.DebugAndLog.error(`Cannot copy cached data to local store for evaluation: ${this.#idHash} ${error.message}`, error.stack);
- if ( this.#store === null ) {
- this.#store = CacheData.format(this.#syncedLaterTimestampInSeconds);
+ this.#status = status;
+
+ tools.DebugAndLog.debug("Cache Updated "+this.getStatus()+": "+this.#idHash);
+
+ } catch (error) {
+ tools.DebugAndLog.error(`Cannot copy cached data to local store for evaluation: ${this.#idHash} ${error?.message || 'Unknown error'}`, error?.stack);
+ if ( this.#store === null ) {
+ this.#store = CacheData.format(this.#syncedLaterTimestampInSeconds);
+ }
+ this.#status = Cache.STATUS_CACHE_ERROR;
+ } finally {
+ resolve(this.#store);
  }
- this.#status = Cache.STATUS_CACHE_ERROR;
- };
 
- return this.#store;
-
+ });
  };
-
  };
 
  class CacheableDataAccess {
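The `update()` hunk above converts a `cache-control: max-age=N` directive into an absolute expiration timestamp. That parsing step can be sketched standalone; `nowInSeconds` stands in for the private `this.#syncedNowTimestampInSeconds`, and the function name is illustrative, not part of the package:

```javascript
// Hypothetical sketch of the max-age extraction performed in Cache.update():
// scan the comma-separated cache-control directives for max-age and convert
// the relative lifetime into an absolute unix timestamp in seconds.
function maxAgeToTimestamp(cacheControlHeader, nowInSeconds) {
  for (const part of cacheControlHeader.split(",")) {
    if (part.trim().startsWith("max-age")) {
      const maxAge = parseInt(part.trim().split("=")[1], 10);
      return nowInSeconds + maxAge; // absolute expiration timestamp
    }
  }
  return nowInSeconds; // no max-age directive found
}
```

As in the diff, the first `max-age` directive wins and other directives (`public`, `no-store`, etc.) are ignored by this step.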
@@ -1982,7 +2007,7 @@ class CacheableDataAccess {
  * @param {string} cachePolicy.pathId
  * @param {boolean} cachePolicy.encrypt
  * @param {object} apiCallFunction The function to call in order to make the request. This function can call ANY datasource (file, http endpoint, etc) as long as it returns a DAO object
- * @param {object} connection A connection object that specifies an id, location, and connectin details for the apiCallFunction to access data. If you have a Connection object pass conn.toObject()
+ * @param {object} connection A connection object that specifies an id, location, and connection details for the apiCallFunction to access data. If you have a Connection object pass conn.toObject()
  * @param {string} connection.method
  * @param {string} connection.protocol
  * @param {string} connection.host
@@ -1998,7 +2023,7 @@ class CacheableDataAccess {
  */
  static async getData(cachePolicy, apiCallFunction, connection, data = null, tags = {} ) {
 
- return new Promise(async (resolve, reject) => {
+ return new Promise(async (resolve) => {
 
  CacheData.prime(); // prime anything we'll need that may have changed since init, we'll await the result before read and write
 
@@ -2040,21 +2065,21 @@ class CacheableDataAccess {
  // check header and status for 304 not modified
  if (originalSource.statusCode === 304) {
  tools.DebugAndLog.debug("Received 304 Not Modified. Extending cache");
- cache.extendExpires(Cache.STATUS_ORIGINAL_NOT_MODIFIED, 0, originalSource.statusCode);
+ await cache.extendExpires(Cache.STATUS_ORIGINAL_NOT_MODIFIED, 0, originalSource.statusCode);
  } else {
  let body = ( typeof originalSource.body !== "object" ) ? originalSource.body : JSON.stringify(originalSource.body);
  await CacheData.prime(); // can't proceed until we have the secrets
- cache.update(body, originalSource.headers, originalSource.statusCode);
+ await cache.update(body, originalSource.headers, originalSource.statusCode);
  }
 
  } catch (error) {
- tools.DebugAndLog.error(`Not successful in creating cache: ${idHash} (${tags.path}/${tags.id}) ${error.message}`, error.stack);
+ tools.DebugAndLog.error(`Not successful in creating cache: ${idHash} (${tags.path}/${tags.id}) ${error?.message || 'Unknown error'}`, error?.stack);
  }
 
  } else {
 
  tools.DebugAndLog.error(`${originalSource.statusCode} | Not successful in getting data from original source for cache. Extending cache expires. ${idHash} (${tags.path}/${tags.id})`, originalSource);
- cache.extendExpires(Cache.STATUS_ORIGINAL_ERROR, 0, originalSource.statusCode);
+ await cache.extendExpires(Cache.STATUS_ORIGINAL_ERROR, 0, originalSource.statusCode);
 
  }
  }
@@ -2065,8 +2090,8 @@ class CacheableDataAccess {
 
  } catch (error) {
  timer.stop();
- tools.DebugAndLog.error(`Error while getting data: (${tags.path}/${tags.id}) ${error.message}`, error.stack);
- reject(cache);
+ tools.DebugAndLog.error(`Error while getting data: (${tags.path}/${tags.id}) ${error?.message || 'Unknown error'}`, error?.stack);
+ resolve(cache);
  };
  });
  };
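With `reject(cache)` replaced by `resolve(cache)` above, `getData()` now settles with the (possibly stale) cache object even when fetching fresh data fails, so callers no longer need a rejection handler around the `await`. A simplified sketch of that contract, with hypothetical names rather than the package's API:

```javascript
// Hypothetical sketch: on failure, resolve with the fallback (stale cache)
// instead of rejecting, mirroring the reject(cache) -> resolve(cache) change.
async function getDataSketch(fetchFresh, staleCache) {
  try {
    return await fetchFresh();
  } catch (error) {
    // the real code logs the error; the stale cache object is still returned
    return staleCache;
  }
}
```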
@@ -59,16 +59,19 @@ const tools = require("./tools/index.js");
  /**
  *
  * @param {ConnectionObject} connection An object with details about the connection (method, uri, host, etc)
- * @param {*} data Additional data to perform a query for the request, or transformation of the response within the DAO object. This data is not directly sent to the endpoint. It is used within the DAO object to transform the request and/or response. Any data sent to the endpoint should be in the connection or handled within the DAO
- * @returns {object} The response
+ * @param {Object} query Additional data to perform a query for the request.
+ * @returns {Object} The response
+ * @example
+ const { endpoint } = require("@63klabs/cache-data");
+ const data = await endpoint.get({host: "api.example.com", path: "data"}, { parameters: {q: "Chicago" }});
  */
- const getDataDirectFromURI = async (connection, data = null) => {
- return (new Endpoint(connection).get());
+ const get = async (connection, query = null) => {
+ return (new Endpoint(connection, query).get());
  };
 
  /**
  * A bare bones request to an endpoint. Can be used as a template to
- * create more elaboarate requests.
+ * create more elaborate requests.
  */
  class Endpoint {
 
@@ -76,10 +79,21 @@ class Endpoint {
  *
  * @param {ConnectionObject} connection An object with connection data
  */
- constructor(connection) {
+ constructor(connection, query = {}) {
 
  this.response = null;
 
+ // if query has parameters property then we will combine with connection parameters
+ if ( query !== null && "parameters" in query ) {
+ if ( !("parameters" in connection) || connection.parameters === null ) {
+ connection.parameters = {};
+ }
+
+ for ( const [key, value] of Object.entries( query.parameters ) ) {
+ connection.parameters[key] = value;
+ }
+ }
+
  this.request = {
  method: this._setRequestSetting(connection, "method", "GET"),
  uri: this._setRequestSetting(connection, "uri", ""),
@@ -182,5 +196,6 @@ class Endpoint {
  };
 
  module.exports = {
- getDataDirectFromURI
+ getDataDirectFromURI: get, // deprecated alias
+ get
  };
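The new `Endpoint` constructor above folds `query.parameters` into `connection.parameters`, with query values overwriting connection values on conflicts. A standalone sketch of that merge (the function name is illustrative, not the package's API surface):

```javascript
// Hypothetical sketch of the query-parameter merge performed by the new
// Endpoint constructor: ensure connection.parameters exists, then copy each
// query parameter over it, so query values win on conflicts.
function mergeQueryParameters(connection, query) {
  if (query !== null && "parameters" in query) {
    if (!("parameters" in connection) || connection.parameters === null) {
      connection.parameters = {};
    }
    for (const [key, value] of Object.entries(query.parameters)) {
      connection.parameters[key] = value; // query wins on conflicts
    }
  }
  return connection;
}
```

This is what lets `endpoint.get({host: "api.example.com", path: "data"}, { parameters: {q: "Chicago"} })` add `q=Chicago` on top of any parameters already defined on the connection.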
@@ -36,13 +36,13 @@ const _httpGetExecute = async function (options, requestObject, xRaySegment = xR
  /*
  Return a promise that will resolve to true or false based upon success
  */
- return new Promise ((resolve, reject) => {
+ return new Promise ((resolve) => {
 
  /*
  Functions/variables we'll use within https.get()
  We need to declare functions that we will be using within https.get()
  "locally" and refer to the requestObject to perform updates
- setResponse() and addRedirect() also performs the resolve() and reject() for the promise
+ setResponse() and addRedirect() also performs the resolve() for the promise
  */
  const setResponse = function (response) { requestObject.setResponse(response); resolve(true)};
  const addRedirect = function (uri) { requestObject.addRedirect(uri); resolve(false)};
@@ -515,7 +515,7 @@ class APIRequest {
  */
  async send_get() {
 
- return new Promise (async (resolve, reject) => {
+ return new Promise (async (resolve) => {
  // https://stackoverflow.com/questions/41470296/how-to-await-and-return-the-result-of-a-http-request-so-that-multiple-request
  // https://nodejs.org/api/https.html#https_https_request_url_options_callback
@@ -608,11 +608,11 @@ class APIRequest {
  }
  catch (error) {
  DebugAndLog.error(`Error in APIRequest call to _httpGetExecute (${this.getNote()}): ${error.message}`, error.stack);
- reject(APIRequest.responseFormat(false, 500, "Error during send request"));
+ resolve(APIRequest.responseFormat(false, 500, "Error during send request"));
  }
  } catch (error) {
  DebugAndLog.error(`API error while trying request for host ${this.getHost()} ${this.getNote()} ${error.message}`, { APIRequest: this.toObject(), trace: error.stack } );
- reject(APIRequest.responseFormat(false, 500, "Error during send request"));
+ resolve(APIRequest.responseFormat(false, 500, "Error during send request"));
 
  }
  });
@@ -101,7 +101,7 @@ class CachedParameterSecrets {
  */
  static async prime() {
 
- return new Promise(async (resolve, reject) => {
+ return new Promise(async (resolve) => {
 
  try {
  const promises = [];
@@ -115,7 +115,7 @@ class CachedParameterSecrets {
 
  } catch (error) {
  DebugAndLog.error(`CachedParameterSecrets.prime(): ${error.message}`, error.stack);
- reject(false);
+ resolve(false);
  }
 
  });
@@ -295,7 +295,7 @@ class CachedParameterSecret {
  DebugAndLog.debug(`CachedParameterSecret.refresh() Checking refresh status of ${this.name}`);
  if ( !this.isRefreshing() ) {
  this.cache.status = 0;
- this.cache.promise = new Promise(async (resolve, reject) => {
+ this.cache.promise = new Promise(async (resolve) => {
  try {
  const timer = new Timer('CachedParameterSecret_refresh', true);
  let resp = null;
@@ -316,7 +316,7 @@ class CachedParameterSecret {
  resolve(this.cache.status);
  } catch (error) {
  DebugAndLog.error(`Error Calling Secrets Manager and SSM Parameter Store Lambda Extension during refresh: ${error.message}`, error.stack);
- reject(-1);
+ resolve(-1);
  }
  });
  }
@@ -376,7 +376,7 @@ class CachedParameterSecret {
 
  async _requestSecretsFromLambdaExtension() {
 
- return new Promise(async (resolve, reject) => {
+ return new Promise(async (resolve) => {
 
  let body = "";
 
@@ -402,7 +402,7 @@ class CachedParameterSecret {
 
  } catch (error) {
  DebugAndLog.error(`CachedParameterSecret http: Error Calling Secrets Manager and SSM Parameter Store Lambda Extension: Error parsing response for ${options.path} ${error.message}`, error.stack);
- reject(null);
+ resolve(null);
  }
 
  };
@@ -426,12 +426,12 @@ class CachedParameterSecret {
 
  res.on('error', error => {
  DebugAndLog.error(`CachedParameterSecret http Error: E0 Error obtaining response for ${options.path} ${error.message}`, error.stack);
- reject(null);
+ resolve(null);
  });
 
  } catch (error) {
  DebugAndLog.error(`CachedParameterSecret http Error: E1 Error obtaining response for ${options.path} ${error.message}`, error.stack);
- reject(null);
+ resolve(null);
  }
 
  });
@@ -439,12 +439,12 @@ class CachedParameterSecret {
  req.on('timeout', () => {
  DebugAndLog.error(`CachedParameterSecret http Error: Endpoint request timeout reached for ${options.path}`);
  req.end();
- reject(null);
+ resolve(null);
  });
 
  req.on('error', error => {
  DebugAndLog.error(`CachedParameterSecret http Error: Error during request for ${options.path} ${error.message}`, error.stack);
- reject(null);
+ resolve(null);
  });
 
  req.end();
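Throughout this file the `reject(null)`, `reject(false)`, and `reject(-1)` calls become `resolve(...)` with the same sentinel values, so failure is now signaled through the resolved value rather than through a rejection. A caller-side sketch of that contract (hypothetical names, not the package's API):

```javascript
// Hypothetical sketch: a helper resolves a null sentinel on failure, and the
// caller branches on the resolved value instead of using catch/rejection.
async function requestSecretSketch(doRequest) {
  try {
    return await doRequest();
  } catch (error) {
    return null; // failure sentinel, mirroring resolve(null) above
  }
}

async function loadSecret(doRequest) {
  const secret = await requestSecretSketch(doRequest);
  return secret === null ? "failed" : secret; // check the sentinel, not a catch
}
```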