pluga-plg 0.1.8 → 0.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/.eslintrc.json ADDED
@@ -0,0 +1,24 @@
1
+ {
2
+ "env": {
3
+ "commonjs": true,
4
+ "es6": true,
5
+ "node": true,
6
+ "jest": true
7
+ },
8
+ "extends": "airbnb-base",
9
+ "globals": {
10
+ "Atomics": "readonly",
11
+ "SharedArrayBuffer": "readonly"
12
+ },
13
+ "parserOptions": {
14
+ "ecmaVersion": 2022
15
+ },
16
+ "rules": {
17
+ "func-names": "off",
18
+ "no-await-in-loop": "off",
19
+ "no-console": "off",
20
+ "no-param-reassign": "off",
21
+ "no-restricted-syntax": "off",
22
+ "prefer-arrow-callback": "off"
23
+ }
24
+ }
package/README.md CHANGED
@@ -1,37 +1,273 @@
1
+
1
2
  # pluga-plg
2
3
 
3
4
  Pluga developer platform toolbox
4
5
 
5
- ### Erros customizados:
6
- Existem tipos específicos de erros que são tratados de diferentes maneiras dentro da plataforma da Pluga. Os detalhes sobre os tipos de erros suportados e como podem ser utilizados dentro do seu código estão descritos abaixo:
6
+ ## 📦 Setup
7
+
8
+ ```bash
9
+ npm install pluga-plg
10
+ ```
11
+ ## 🔧 Configuration
12
+
13
+ If you are using the `storageService` (>= 0.2.0), define the following environment variables:
14
+
15
+ ```env
16
+ AWS_S3_BUCKET=bucket-name
17
+ AWS_REGION=region
18
+ AWS_ACCESS_KEY_ID=access-key
19
+ AWS_SECRET_ACCESS_KEY=secret-key
20
+ ```
21
+ ## plg.errors
7
22
 
8
- #### AuthError
9
- Erro de autenticação, geralmente necessita de uma ação manual do cliente. Devem ser utilizados em casos onde problema de autenticação impede o funcionamento da integração.
23
+ ### Custom Errors:
24
+ There are specific types of errors that are handled differently within the Pluga platform. Details about the supported error types and how they can be used in your code are described below:
25
+
26
+ <details>
27
+ <summary>AuthError</summary>
28
+ Authentication error, usually requires manual action from the client. Should be used in cases where an authentication issue prevents the integration from functioning.
10
29
 
11
30
  ```javascript
12
31
  plg.errors.authError(message: String)
13
32
  ```
33
+ </details>
14
34
 
15
- #### Error
16
- Tipo genérico de erro, erros desse tipo deixam o evento em estado de falha. Devem ser utilizados quando o problema na integração exija uma correção manual por parte do cliente.
35
+ <details>
36
+ <summary>Error</summary>
37
+ Generic error type. Errors of this type put the event in a failed state. Should be used when an integration issue requires manual correction by the client.
17
38
 
18
39
  ```javascript
19
40
  plg.errors.error(message: String)
20
41
  ```
42
+ </details>
21
43
 
22
- #### RateLimitError
44
+ <details>
45
+ <summary>RateLimitError</summary>
23
46
 
24
- Erros desse tipo permitem que a Pluga realize o processamento dos eventos automaticamente em um momento futuro. Devem ser utilizados quando um recurso torna-se indisponível por conta do limite de uso por exemplo. Você deve fornecer o tempo necessário (em segundos) para que o recurso esteja disponível novamente.
47
+ Errors of this type allow Pluga to process the events automatically at a later time. Should be used when a resource becomes unavailable due to usage limits, for example. You must provide the necessary time (in seconds) for the resource to become available again.
25
48
 
26
49
  ```javascript
27
50
  plg.errors.rateLimitError(message: String, remaining: Integer(seconds))
28
51
  ```
52
+ </details>
29
53
 
30
- #### TransientError
54
+ <details>
55
+ <summary>TransientError</summary>
31
56
 
32
- Erros temporários ou transitórios que podem ocorrer por instabilidades, quedas, etc, e que não exigem nenhuma ação manual para o seu correto funcionamento. Eventos com erros desse tipo são reprocessados automaticamente pela plataforma da Pluga.
57
+ Temporary or transient errors that may occur due to instabilities, outages, etc., and do not require any manual action for proper functioning. Events with this type of error are automatically reprocessed by the Pluga platform.
33
58
 
34
59
  ```javascript
35
60
  plg.errors.transientError(message: String)
36
61
  ```
62
+ </details>
63
+
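
For illustration only (this note and the snippet below are editorial, not part of the published README), a sketch of how the error constructors above might be used when mapping an API failure; it assumes the `plg.errors.*` helpers return an error object to be thrown, and the surrounding response handling is hypothetical:

```javascript
const plg = require('pluga-plg');

// Hypothetical example: mapping an HTTP response to Pluga's custom errors.
// Assumes each plg.errors.* helper returns an Error instance to be thrown.
function handleApiFailure(response) {
  if (response.status === 401) {
    throw plg.errors.authError('Invalid or expired credentials, please reconnect the account');
  }
  if (response.status === 429) {
    const retryAfterSeconds = Number(response.headers['retry-after']) || 60;
    throw plg.errors.rateLimitError('API rate limit reached', retryAfterSeconds);
  }
  if (response.status >= 500) {
    throw plg.errors.transientError('Upstream API is temporarily unavailable');
  }
  throw plg.errors.error(`Request failed with status ${response.status}`);
}
```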
64
+ ## plg.files
65
+
66
+ ### plg.files.remote
67
+
68
+ The `files.remote` module provides integration with Amazon S3 for file management.
69
+
70
+ <details>
71
+ <summary>async upload</summary>
72
+
73
+ Upload a local file to an S3 bucket.
74
+
75
+ ```
76
+ plg.files.remote.upload({ fileKey: String, filePath: String })
77
+ ```
78
+
79
+ #### Params
80
+
81
+ | Name | Type | Required | Description |
82
+ |----------|--------|-------------|-----------|
83
+ | fileKey | string | Yes | The unique key (name) for the file in the S3 bucket |
84
+ | filePath | string | Yes | The local path of the file to upload |
85
+
86
+ #### Return
87
+
88
+ ```json
89
+ {
90
+ "fileKey": "string"
91
+ }
92
+ ```
93
+
94
+ #### Errors
95
+
96
+
97
+ | type | When it occurs | Example message |
98
+ |--------------|---------------|-------------------|
99
+ | `Error` | Local path does not exist | - |
100
+ | `Error` | Internal errors in the AWS SDK | - |
101
+
102
+ </details>
103
+
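
As an editorial aside (not part of the published README), a minimal upload sketch; the key and path values are made up, and the AWS variables from the Configuration section are assumed to be set:

```javascript
const plg = require('pluga-plg');

// Hypothetical usage; AWS_S3_BUCKET, AWS_REGION and the AWS credentials
// must be defined in the environment before calling the storage service.
async function uploadExport() {
  const { fileKey } = await plg.files.remote.upload({
    fileKey: 'exports/contacts-2024-01.csv', // key under which the file is stored in S3
    filePath: '/tmp/contacts-2024-01.csv',   // local file streamed to the bucket
  });
  return fileKey;
}
```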
104
+ <details>
105
+ <summary>async download </summary>
106
+
107
+ Download a file from an Amazon S3 bucket and save it to a local path.
108
+
109
+ ```
110
+ plg.files.remote.download({
111
+ fileKey: String,
112
+ pathToWrite: String,
113
+ sizeLimit: Number
114
+ })
115
+ ```
116
+
117
+ #### Params
118
+
119
+ | Name | Type | Required | Description |
120
+ |-------------|----------|-------------|-----------|
121
+ | fileKey | string | Yes | The unique key (name) for the file in the S3 bucket |
122
+ | pathToWrite | string | Yes | Local path where the downloaded file will be saved |
123
+ | sizeLimit | number | No | Optional maximum allowed file size in bytes. If exceeded, the download will fail |
124
+
125
+ #### Return
126
+
127
+ ```json
128
+ {
129
+ "success": true
130
+ }
131
+ ```
132
+
133
+ #### Errors
134
+
135
+ | type | When it occurs | Example message |
136
+ |--------------|---------------|-------------------|
137
+ | `Error` | Local path does not exist | - |
138
+ | `Error` | Internal errors in the AWS SDK | - |
139
+ | `Error` | File exceeds size limit specified in the `sizeLimit` param | `File size limit exceeded. File size: ${fileSize} bytes, limit: ${sizeLimit} bytes.` |
140
+
141
+ </details>
142
+
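
A hedged usage sketch (editorial addition, not from the package docs); the key, local path and size limit below are hypothetical values:

```javascript
const plg = require('pluga-plg');

// Hypothetical usage; rejects the download if the object is larger than 10 MB.
async function fetchExport() {
  const result = await plg.files.remote.download({
    fileKey: 'exports/contacts-2024-01.csv',
    pathToWrite: '/tmp/contacts-2024-01.csv',
    sizeLimit: 10 * 1024 * 1024, // bytes
  });
  return result; // { success: true }
}
```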
143
+ <details>
144
+ <summary>async getSignedUrl</summary>
145
+
146
+ Generate a temporary signed URL to download a file from S3.
147
+
148
+ ```
149
+ plg.files.remote.getSignedUrl({
150
+ fileKey: String,
151
+ expiresIn: Number
152
+ })
153
+ ```
154
+
155
+ #### Params
156
+
157
+ | Name | Type | Required | Description |
158
+ |-----------|----------|-------------|-----------|
159
+ | fileKey | string | Yes | The unique key (name) for the file in the S3 bucket |
160
+ | expiresIn | number | No | Optional expiration time in seconds. Default: 1800 (30 minutes) |
161
+
162
+ #### Return
163
+
164
+ If the file exists:
165
+
166
+ ```json
167
+ {
168
+ "fileKey": "string",
169
+ "signedUrl": "string",
170
+ "expiresIn": "number"
171
+ }
172
+ ```
173
+
174
+ If the file does not exist, `null` is returned.
175
+
176
+ #### Errors
177
+
178
+ | type | When it occurs | Example message |
179
+ |--------------|---------------|-------------------|
180
+ | `Error` | Internal errors in the AWS SDK | - |
181
+
182
+ </details>
183
+
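
A short editorial sketch of generating a temporary link, assuming the hypothetical key below exists in the bucket:

```javascript
const plg = require('pluga-plg');

// Hypothetical usage; resolves to null when the key is not present in the bucket.
async function shareDownloadLink() {
  const signed = await plg.files.remote.getSignedUrl({
    fileKey: 'exports/contacts-2024-01.csv',
    expiresIn: 3600, // 1 hour instead of the 1800-second default
  });
  return signed ? signed.signedUrl : null;
}
```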
184
+ <details>
185
+ <summary>async fileExists </summary>
186
+
187
+ Checks whether a file exists in the S3 bucket.
188
+
189
+ ```
190
+ plg.files.remote.fileExists({
191
+ fileKey: String
192
+ })
193
+ ```
194
+
195
+ #### Params
196
+
197
+ | Name | Type | Required | Description |
198
+ |--------|----------|-------------|-----------|
199
+ | fileKey | string | Yes | The unique key (name) for the file in the S3 bucket |
200
+
201
+ #### Return
202
+
203
+ ```javascript
204
+ true // if the file exists
205
+ false // if the file does not exist
206
+ ```
207
+
208
+ #### Errors
209
+
210
+ | type | When it occurs | Example message |
211
+ |--------------|---------------|-------------------|
212
+ | `Error` | Internal errors in the AWS SDK, except for 404, which is treated as false | - |
213
+
214
+ </details>
215
+
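
An editorial sketch combining `fileExists` with the custom errors documented earlier; the guard shape is an assumption, not part of the package:

```javascript
const plg = require('pluga-plg');

// Hypothetical guard before further processing; fileExists resolves to a boolean.
async function ensureExportAvailable(fileKey) {
  if (!(await plg.files.remote.fileExists({ fileKey }))) {
    throw plg.errors.error(`File ${fileKey} was not found in the bucket`);
  }
}
```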
216
+ ### plg.files.local
217
+
218
+ The `files.local` module enables local file handling.
219
+
220
+ <details>
221
+ <summary><strong>async downloadStream</strong></summary>
222
+
223
+ Download a file from a URL as a stream and save it locally.
224
+
225
+ ```javascript
226
+ plg.files.local.downloadStream({
227
+ pathToWrite: String,
228
+ downloadRequestParams: {
229
+ downloadUrl: String,
230
+ headers?: Object,
231
+ },
232
+ callbacks?: {
233
+ onData?: Function,
234
+ onEnd?: Function,
235
+ },
236
+ })
237
+ ```
238
+
239
+ #### Params
240
+
241
+ | Name | Type | Required | Description |
242
+ | --------------------------------- | -------- | -------- | ------------------------------------------------------------------------------------------------- |
243
+ | pathToWrite | string | Yes | Local path where the downloaded file will be saved. |
244
+ | downloadRequestParams | object | Yes | Object containing the download URL and optional HTTP headers. |
245
+ | downloadRequestParams.downloadUrl | string | Yes | Public URL used to download the file. |
246
+ | downloadRequestParams.headers | object | No | Optional HTTP headers to be sent with the download request. |
247
+ | callbacks | object | No | Optional callbacks executed during the download lifecycle. |
248
+ | callbacks.onData | function | No | Executed for each received data chunk. Receives `(dataChunk, currentDownloadedBytesCount)`. |
249
+ | callbacks.onEnd | function | No | Executed when the download finishes. If it returns a value, the promise resolves with that value. |
250
+
251
+
252
+ #### Default return
253
+ ```json
254
+ {
255
+ "success": true
256
+ }
257
+ ```
258
+
259
+ - Note: if `callbacks.onEnd` returns a value, the promise resolves with that value instead.
260
+
261
+ #### Errors
262
+
263
+
264
+ | Type | When it occurs | Example |
265
+ | --------------------- | ------------------------------------------------------------------- | -------------------------------------------------------------- |
266
+ | `Error` | Local path to write does not exist or is not writable | — |
267
+ | `Error` | Download URL provided is not public or request fails | — |
268
+ | `onDataCallback` | Error thrown inside `callbacks.onData` | `{ "type": "onDataCallback", "error": <original error> }` |
269
+ | `onEndCallback` | Error thrown inside `callbacks.onEnd` | `{ "type": "onEndCallback", "error": <original error> }` |
270
+ | `StreamPipelineError` | Stream pipeline fails (connection drop, timeout, write error, etc.) | `{ "type": "StreamPipelineError", "error": <original error> }` |
271
+
37
272
 
273
+ </details>
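
To tie the callback parameters together, an editorial sketch (not from the published README) of a streamed download with a size cap enforced in `onData`; the URL, header and limit values are hypothetical:

```javascript
const plg = require('pluga-plg');

// Hypothetical usage: stream a remote file to disk, abort if it exceeds 50 MB,
// and resolve the promise with the value returned by onEnd.
const MAX_BYTES = 50 * 1024 * 1024;

async function mirrorFile() {
  return plg.files.local.downloadStream({
    pathToWrite: '/tmp/report.pdf',
    downloadRequestParams: {
      downloadUrl: 'https://example.com/files/report.pdf',
      headers: { Authorization: 'Bearer <token>' },
    },
    callbacks: {
      // Throwing here surfaces as a { type: 'onDataCallback', error } rejection.
      onData: (chunk, downloadedBytes) => {
        if (downloadedBytes > MAX_BYTES) {
          throw new Error('Download aborted: file is larger than 50 MB');
        }
      },
      // The returned object becomes the resolved value of downloadStream.
      onEnd: () => ({ success: true, savedTo: '/tmp/report.pdf' }),
    },
  });
}
```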
@@ -0,0 +1,45 @@
1
+ const axios = require('../../../lib/axios');
2
+ const AxiosService = require('../../../lib/files/local/axiosService');
3
+
4
+ jest.mock('../../../lib/axios');
5
+
6
+ describe('AxiosService', () => {
7
+ let service;
8
+
9
+ beforeEach(() => {
10
+ jest.clearAllMocks();
11
+ service = new AxiosService();
12
+ });
13
+
14
+ it('downloads a file as stream using axios', async () => {
15
+ const fakeStream = { fake: 'stream' };
16
+
17
+ axios.mockResolvedValue({
18
+ data: fakeStream,
19
+ });
20
+
21
+ const params = {
22
+ downloadRequestParams: {
23
+ downloadUrl: 'https://example.com/file.txt',
24
+ headers: {
25
+ Authorization: 'Bearer token',
26
+ },
27
+ },
28
+ };
29
+
30
+ const result = await service.download(params);
31
+
32
+ expect(axios).toHaveBeenCalledWith({
33
+ method: 'get',
34
+ url: 'https://example.com/file.txt',
35
+ responseType: 'stream',
36
+ headers: {
37
+ Authorization: 'Bearer token',
38
+ },
39
+ });
40
+
41
+ expect(result).toEqual({
42
+ stream: fakeStream,
43
+ });
44
+ });
45
+ });
@@ -0,0 +1,151 @@
1
+ const fs = require('fs');
2
+ const {
3
+ S3Client,
4
+ PutObjectCommand,
5
+ GetObjectCommand,
6
+ } = require('@aws-sdk/client-s3');
7
+ const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');
8
+ const { PassThrough } = require('stream');
9
+
10
+ const S3Service = require('../../../lib/files/remote/s3Service');
11
+
12
+ jest.mock('fs', () => ({
13
+ createReadStream: jest.fn(),
14
+ }));
15
+
16
+ jest.mock('@aws-sdk/client-s3', () => ({
17
+ S3Client: jest.fn(),
18
+ PutObjectCommand: jest.fn((params) => params),
19
+ GetObjectCommand: jest.fn((params) => params),
20
+ HeadObjectCommand: jest.fn((params) => params),
21
+ }));
22
+
23
+ jest.mock('@aws-sdk/s3-request-presigner', () => ({
24
+ getSignedUrl: jest.fn(),
25
+ }));
26
+
27
+ describe('S3Service', () => {
28
+ let mockSend;
29
+ let s3Service;
30
+
31
+ beforeEach(() => {
32
+ jest.clearAllMocks();
33
+
34
+ mockSend = jest.fn();
35
+ S3Client.mockImplementation(() => ({ send: mockSend }));
36
+
37
+ s3Service = new S3Service({
38
+ client: new S3Client(),
39
+ bucket: 'foo-bucket',
40
+ });
41
+ });
42
+
43
+ describe('upload', () => {
44
+ it('uploads file to S3', async () => {
45
+ const mockStream = {};
46
+ fs.createReadStream.mockReturnValue(mockStream);
47
+ mockSend.mockResolvedValue({});
48
+
49
+ const result = await s3Service.upload({
50
+ fileKey: 'foo',
51
+ filePath: '/tmp/bar',
52
+ });
53
+
54
+ expect(fs.createReadStream).toHaveBeenCalledWith('/tmp/bar');
55
+ expect(PutObjectCommand).toHaveBeenCalledWith({
56
+ Bucket: 'foo-bucket',
57
+ Key: 'foo',
58
+ Body: mockStream,
59
+ });
60
+ expect(result).toEqual({ fileKey: 'foo' });
61
+ });
62
+ });
63
+
64
+ describe('download', () => {
65
+ it('downloads and writes file to disk', async () => {
66
+ const mockReadStream = new PassThrough();
67
+ mockReadStream.end('foo bar');
68
+
69
+ mockSend.mockResolvedValue({
70
+ Body: mockReadStream,
71
+ ContentLength: 100,
72
+ });
73
+
74
+ const result = await s3Service.download({
75
+ fileKey: 'foo',
76
+ });
77
+
78
+ expect(GetObjectCommand).toHaveBeenCalledWith({
79
+ Bucket: 'foo-bucket',
80
+ Key: 'foo',
81
+ });
82
+
83
+ expect(result).toEqual({
84
+ stream: mockReadStream,
85
+ });
86
+ });
87
+
88
+ it('throws error when size exceeds limit', async () => {
89
+ mockSend.mockResolvedValue({
90
+ Body: {},
91
+ ContentLength: 9999,
92
+ });
93
+
94
+ await expect(
95
+ s3Service.download({
96
+ fileKey: 'foo',
97
+ pathToWrite: '/tmp/y',
98
+ sizeLimit: 10,
99
+ }),
100
+ ).rejects.toThrow('File size limit exceeded');
101
+ });
102
+ });
103
+
104
+ describe('fileExists', () => {
105
+ it('returns true if file exists', async () => {
106
+ mockSend.mockResolvedValue({});
107
+
108
+ const exists = await s3Service.fileExists({ fileKey: 'foo' });
109
+
110
+ expect(exists).toBe(true);
111
+ });
112
+
113
+ it('returns false if file does not exist', async () => {
114
+ const error = new Error('Not found');
115
+ error.name = 'NotFound';
116
+ mockSend.mockRejectedValue(error);
117
+
118
+ const exists = await s3Service.fileExists({ fileKey: 'foo' });
119
+
120
+ expect(exists).toBe(false);
121
+ });
122
+ });
123
+
124
+ describe('getSignedUrl', () => {
125
+ it('returns signedUrl if file exists', async () => {
126
+ mockSend.mockResolvedValue({});
127
+ getSignedUrl.mockResolvedValue('http://signed-url.com');
128
+
129
+ const result = await s3Service.getSignedUrl({
130
+ fileKey: 'foo',
131
+ expiresIn: 123,
132
+ });
133
+
134
+ expect(result).toEqual({
135
+ fileKey: 'foo',
136
+ signedUrl: 'http://signed-url.com',
137
+ expiresIn: 123,
138
+ });
139
+ });
140
+
141
+ it('returns null if file does not exist', async () => {
142
+ const error = new Error('Not found');
143
+ error.name = 'NotFound';
144
+ mockSend.mockRejectedValue(error);
145
+
146
+ const result = await s3Service.getSignedUrl({ fileKey: 'foo' });
147
+
148
+ expect(result).toBeNull();
149
+ });
150
+ });
151
+ });
@@ -0,0 +1,89 @@
1
+ const StorageService = require('../../lib/files/storageService');
2
+
3
+ describe('Storage service', () => {
4
+ let client;
5
+ let service;
6
+ const params = { foo: 'bar ' };
7
+ const successResp = { success: true };
8
+ let streamFileWriter;
9
+
10
+ beforeEach(() => {
11
+ client = {
12
+ upload: jest.fn(),
13
+ download: jest.fn(),
14
+ fileExists: jest.fn(),
15
+ getSignedUrl: jest.fn(),
16
+ };
17
+
18
+ service = new StorageService(client);
19
+ });
20
+
21
+ beforeEach(() => {
22
+ client = {
23
+ upload: jest.fn(),
24
+ download: jest.fn(),
25
+ getSignedUrl: jest.fn(),
26
+ fileExists: jest.fn(),
27
+ };
28
+
29
+ streamFileWriter = {
30
+ writeToDisk: jest.fn(),
31
+ };
32
+
33
+ service = new StorageService(client, streamFileWriter);
34
+ });
35
+
36
+ describe('upload', () => {
37
+ it('delegates upload to client', async () => {
38
+ client.upload.mockResolvedValue(successResp);
39
+
40
+ const response = await service.upload(params);
41
+
42
+ expect(client.upload).toHaveBeenCalledWith(params);
43
+
44
+ expect(response).toEqual(successResp);
45
+ });
46
+ });
47
+
48
+ describe('download', () => {
49
+ it('downloads stream from client and writes it to disk', async () => {
50
+ const stream = { fake: 'stream' };
51
+ const expectedResult = { success: true };
52
+
53
+ client.download.mockReturnValue({ stream });
54
+ streamFileWriter.writeToDisk.mockResolvedValue(expectedResult);
55
+
56
+ const result = await service.download(params);
57
+
58
+ expect(client.download).toHaveBeenCalledWith(params);
59
+ expect(streamFileWriter.writeToDisk).toHaveBeenCalledWith({
60
+ stream,
61
+ ...params,
62
+ });
63
+ expect(result).toBe(expectedResult);
64
+ });
65
+ });
66
+
67
+ describe('fileExists', () => {
68
+ it('delegates fileExists to client', async () => {
69
+ client.fileExists.mockResolvedValue(successResp);
70
+
71
+ const response = await service.fileExists(params);
72
+
73
+ expect(client.fileExists).toHaveBeenCalledWith(params);
74
+ expect(response).toBe(successResp);
75
+ });
76
+ });
77
+
78
+ describe('getSignedUrl', () => {
79
+ it('delegates getSignedUrl to client', async () => {
80
+ client.getSignedUrl.mockResolvedValue(successResp);
81
+
82
+ const response = await service.getSignedUrl(params);
83
+
84
+ expect(client.getSignedUrl).toHaveBeenCalledWith(params);
85
+
86
+ expect(response).toEqual(successResp);
87
+ });
88
+ });
89
+ });
@@ -0,0 +1,137 @@
1
+ const fs = require('fs');
2
+ const { PassThrough } = require('stream');
3
+
4
+ const StreamFileWriter = require('../../lib/files/streamFileWriter');
5
+
6
+ jest.mock('fs', () => ({
7
+ createWriteStream: jest.fn(),
8
+ }));
9
+
10
+ describe('Stream file writer', () => {
11
+ it('writes stream to disk successfully without callbacks', async () => {
12
+ const readStream = new PassThrough();
13
+ readStream.end('hello');
14
+
15
+ const writeStream = new PassThrough();
16
+ fs.createWriteStream.mockReturnValue(writeStream);
17
+
18
+ const result = await StreamFileWriter.writeToDisk({
19
+ stream: readStream,
20
+ pathToWrite: '/tmp/file.txt',
21
+ });
22
+
23
+ expect(fs.createWriteStream).toHaveBeenCalledWith('/tmp/file.txt');
24
+ expect(result).toEqual({ success: true });
25
+ });
26
+
27
+ describe('Callbacks', () => {
28
+ it('calls onData callback with accumulated bytes', async () => {
29
+ const readStream = new PassThrough();
30
+ const writeStream = new PassThrough();
31
+ fs.createWriteStream.mockReturnValue(writeStream);
32
+
33
+ const onData = jest.fn();
34
+
35
+ const promise = StreamFileWriter.writeToDisk({
36
+ stream: readStream,
37
+ pathToWrite: '/tmp/file.txt',
38
+ callbacks: { onData },
39
+ });
40
+
41
+ readStream.write(Buffer.from('foo'));
42
+ readStream.write(Buffer.from('bar'));
43
+ readStream.end();
44
+
45
+ await promise;
46
+
47
+ expect(onData).toHaveBeenNthCalledWith(1, expect.any(Buffer), 3);
48
+ expect(onData).toHaveBeenNthCalledWith(2, expect.any(Buffer), 6);
49
+ });
50
+
51
+ it('resolves with onEnd return value when provided', async () => {
52
+ const readStream = new PassThrough();
53
+ readStream.end('done');
54
+
55
+ const writeStream = new PassThrough();
56
+ fs.createWriteStream.mockReturnValue(writeStream);
57
+
58
+ const onEnd = jest.fn(() => ({ finished: true }));
59
+
60
+ const result = await StreamFileWriter.writeToDisk({
61
+ stream: readStream,
62
+ pathToWrite: '/tmp/file.txt',
63
+ callbacks: { onEnd },
64
+ });
65
+
66
+ expect(onEnd).toHaveBeenCalled();
67
+ expect(result).toEqual({ finished: true });
68
+ });
69
+ });
70
+
71
+ describe('Errors', () => {
72
+ it('throws typed error when onData throws', async () => {
73
+ const readStream = new PassThrough();
74
+ const writeStream = new PassThrough();
75
+ fs.createWriteStream.mockReturnValue(writeStream);
76
+
77
+ const onData = jest.fn(() => {
78
+ throw new Error('onData failed');
79
+ });
80
+
81
+ const promise = StreamFileWriter.writeToDisk({
82
+ stream: readStream,
83
+ pathToWrite: '/tmp/file.txt',
84
+ callbacks: { onData },
85
+ });
86
+
87
+ readStream.write(Buffer.from('boom'));
88
+
89
+ await expect(promise).rejects.toMatchObject({
90
+ type: 'onDataCallback',
91
+ });
92
+ });
93
+
94
+ it('throws typed error when onEnd throws', async () => {
95
+ const readStream = new PassThrough();
96
+ readStream.end('done');
97
+
98
+ const writeStream = new PassThrough();
99
+ fs.createWriteStream.mockReturnValue(writeStream);
100
+
101
+ const onEnd = jest.fn(() => {
102
+ throw new Error('onEnd failed');
103
+ });
104
+
105
+ await expect(
106
+ StreamFileWriter.writeToDisk({
107
+ stream: readStream,
108
+ pathToWrite: '/tmp/file.txt',
109
+ callbacks: { onEnd },
110
+ }),
111
+ ).rejects.toMatchObject({
112
+ type: 'onEndCallback',
113
+ });
114
+
115
+ expect(onEnd).toHaveBeenCalled();
116
+ });
117
+
118
+ it('wraps pipeline errors with StreamPipelineError', async () => {
119
+ const readStream = new PassThrough();
120
+
121
+ const writeStream = new PassThrough();
122
+ fs.createWriteStream.mockReturnValue(writeStream);
123
+
124
+ const pipelineError = new Error('pipeline failed');
125
+ writeStream.destroy(pipelineError);
126
+
127
+ await expect(
128
+ StreamFileWriter.writeToDisk({
129
+ stream: readStream,
130
+ pathToWrite: '/tmp/file.txt',
131
+ }),
132
+ ).rejects.toMatchObject({
133
+ type: 'StreamPipelineError',
134
+ });
135
+ });
136
+ });
137
+ });
package/index.js CHANGED
@@ -1,9 +1,11 @@
1
1
  const axios = require('./lib/axios');
2
2
  const errors = require('./lib/errors');
3
- const package = require('./package.json');
3
+ const files = require('./lib/files');
4
+ const packge = require('./package.json');
4
5
 
5
6
  module.exports = {
6
- axios: axios,
7
- errors: errors,
8
- version: package.version,
7
+ axios,
8
+ errors,
9
+ files,
10
+ version: packge.version,
9
11
  };
package/lib/files/index.js ADDED
@@ -0,0 +1,7 @@
1
+ const local = require('./local');
2
+ const remote = require('./remote');
3
+
4
+ module.exports = {
5
+ local,
6
+ remote,
7
+ };
package/lib/files/local/axiosService.js ADDED
@@ -0,0 +1,19 @@
1
+ /* eslint-disable class-methods-use-this */
2
+ const axios = require('../../axios');
3
+
4
+ class AxiosService {
5
+ async download({
6
+ downloadRequestParams: { downloadUrl, headers },
7
+ }) {
8
+ const res = await axios({
9
+ method: 'get',
10
+ url: downloadUrl,
11
+ responseType: 'stream',
12
+ headers,
13
+ });
14
+
15
+ return { stream: res.data };
16
+ }
17
+ }
18
+
19
+ module.exports = AxiosService;
package/lib/files/local/index.js ADDED
@@ -0,0 +1,10 @@
1
+ const StorageService = require('../storageService');
2
+ const AxiosService = require('./axiosService');
3
+
4
+ const service = new StorageService(new AxiosService());
5
+
6
+ async function downloadStream(params) {
7
+ return service.download(params);
8
+ }
9
+
10
+ module.exports = { downloadStream };
package/lib/files/remote/index.js ADDED
@@ -0,0 +1,7 @@
1
+ const StorageService = require('../storageService');
2
+ const S3Service = require('./s3Service');
3
+
4
+ const s3Service = new S3Service();
5
+ const storageService = new StorageService(s3Service);
6
+
7
+ module.exports = storageService;
package/lib/files/remote/s3Service.js ADDED
@@ -0,0 +1,93 @@
1
+ /* eslint-disable class-methods-use-this */
2
+ const fs = require('fs');
3
+ const {
4
+ S3Client,
5
+ PutObjectCommand,
6
+ GetObjectCommand,
7
+ HeadObjectCommand,
8
+ } = require('@aws-sdk/client-s3');
9
+ const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');
10
+
11
+ class S3Service {
12
+ #DEFAULT_URL_EXPIRATION_IN_SECONDS = 30 * 60;
13
+
14
+ #client;
15
+
16
+ #bucket;
17
+
18
+ constructor({ client = null, bucket = null } = {}) {
19
+ this.#bucket = bucket || process.env.AWS_S3_BUCKET;
20
+ this.#client = client || new S3Client({
21
+ region: process.env.AWS_REGION,
22
+ credentials: {
23
+ accessKeyId: process.env.AWS_ACCESS_KEY_ID,
24
+ secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
25
+ },
26
+ });
27
+ }
28
+
29
+ async upload({ fileKey, filePath }) {
30
+ const fileStream = fs.createReadStream(filePath);
31
+ await this.#runCommand(
32
+ new PutObjectCommand({
33
+ Bucket: this.#bucket,
34
+ Key: fileKey,
35
+ Body: fileStream,
36
+ }),
37
+ );
38
+ return { fileKey };
39
+ }
40
+
41
+ async download({ fileKey, sizeLimit }) {
42
+ const file = await this.#runCommand(
43
+ new GetObjectCommand({ Bucket: this.#bucket, Key: fileKey }),
44
+ );
45
+
46
+ this.#validateSize(file.ContentLength, sizeLimit);
47
+ return { stream: file.Body };
48
+ }
49
+
50
+ async getSignedUrl({ fileKey, expiresIn = this.#DEFAULT_URL_EXPIRATION_IN_SECONDS }) {
51
+ if (await this.fileExists({ fileKey })) {
52
+ const command = new GetObjectCommand({ Bucket: this.#bucket, Key: fileKey });
53
+ const url = await getSignedUrl(this.#client, command, { expiresIn });
54
+ return { fileKey, signedUrl: url, expiresIn };
55
+ }
56
+ return null;
57
+ }
58
+
59
+ async fileExists({ fileKey }) {
60
+ try {
61
+ await this.#runCommand(new HeadObjectCommand({ Bucket: this.#bucket, Key: fileKey }));
62
+ return true;
63
+ } catch (error) {
64
+ if (error.name === 'NotFound' || error.$metadata?.httpStatusCode === 404) {
65
+ return false;
66
+ }
67
+ throw error;
68
+ }
69
+ }
70
+
71
+ #validateSize(fileSize, sizeLimit) {
72
+ if (!sizeLimit) return;
73
+ if (fileSize != null && fileSize > sizeLimit) {
74
+ throw new Error('File size limit exceeded');
75
+ }
76
+ }
77
+
78
+ #writeStreamToDisk(readStream, pathToWrite) {
79
+ return new Promise((resolve, reject) => {
80
+ const writeStream = fs.createWriteStream(pathToWrite);
81
+ readStream
82
+ .pipe(writeStream)
83
+ .on('finish', resolve)
84
+ .on('error', reject);
85
+ });
86
+ }
87
+
88
+ async #runCommand(command) {
89
+ return this.#client.send(command);
90
+ }
91
+ }
92
+
93
+ module.exports = S3Service;
package/lib/files/storageService.js ADDED
@@ -0,0 +1,31 @@
1
+ const StreamFileWriter = require('./streamFileWriter');
2
+
3
+ class StorageService {
4
+ #client;
5
+
6
+ #streamFileWriter;
7
+
8
+ constructor(client, streamFileWriter = StreamFileWriter) {
9
+ this.#client = client;
10
+ this.#streamFileWriter = streamFileWriter;
11
+ }
12
+
13
+ async upload(params) {
14
+ return this.#client.upload(params);
15
+ }
16
+
17
+ async download(params) {
18
+ const { stream } = this.#client.download(params);
19
+ return this.#streamFileWriter.writeToDisk({ stream, ...params });
20
+ }
21
+
22
+ async getSignedUrl(params) {
23
+ return this.#client.getSignedUrl(params);
24
+ }
25
+
26
+ async fileExists(params) {
27
+ return this.#client.fileExists(params);
28
+ }
29
+ }
30
+
31
+ module.exports = StorageService;
package/lib/files/streamFileWriter.js ADDED
@@ -0,0 +1,88 @@
1
+ /* eslint-disable class-methods-use-this */
2
+ const fs = require('fs');
3
+ const { pipeline } = require('stream');
4
+ const { promisify } = require('util');
5
+
6
+ const pipelineAsync = promisify(pipeline);
7
+
8
+ const Errors = {
9
+ ON_DATA: 'onDataCallback',
10
+ ON_END: 'onEndCallback',
11
+ PIPELINE: 'StreamPipelineError',
12
+ };
13
+
14
+ class StreamFileWriter {
15
+ static async writeToDisk({ stream, pathToWrite, callbacks = {} }) {
16
+ const writableStream = fs.createWriteStream(pathToWrite);
17
+ let callbackError = null;
18
+
19
+ if (callbacks.onData) {
20
+ this.#attachOnData(stream, callbacks.onData, (err) => {
21
+ callbackError = err;
22
+ });
23
+ }
24
+
25
+ try {
26
+ await pipelineAsync(stream, writableStream);
27
+
28
+ this.#throwIfCallbackFailed(callbackError);
29
+
30
+ return callbacks.onEnd
31
+ ? this.#handleOnEnd(callbacks.onEnd)
32
+ : { success: true };
33
+ } catch (err) {
34
+ this.#throwIfCallbackFailed(callbackError);
35
+
36
+ throw this.#normalizePipelineError(err);
37
+ }
38
+ }
39
+
40
+ static #attachOnData(stream, onData, reportCallbackError) {
41
+ let downloadedBytes = 0;
42
+
43
+ const onDataHandler = (chunk) => {
44
+ downloadedBytes += chunk.length;
45
+
46
+ try {
47
+ onData(chunk, downloadedBytes);
48
+ } catch (err) {
49
+ reportCallbackError(
50
+ this.#buildError(Errors.ON_DATA, err),
51
+ );
52
+
53
+ stream.off('data', onDataHandler);
54
+ stream.destroy(err);
55
+ }
56
+ };
57
+
58
+ stream.on('data', onDataHandler);
59
+ }
60
+
61
+ static #handleOnEnd(onEnd) {
62
+ try {
63
+ return onEnd() ?? { success: true };
64
+ } catch (err) {
65
+ throw this.#buildError(Errors.ON_END, err);
66
+ }
67
+ }
68
+
69
+ static #throwIfCallbackFailed(callbackError) {
70
+ if (callbackError) {
71
+ throw callbackError;
72
+ }
73
+ }
74
+
75
+ static #normalizePipelineError(err) {
76
+ if (err?.type && err?.error) {
77
+ return err;
78
+ }
79
+
80
+ return this.#buildError(Errors.PIPELINE, err);
81
+ }
82
+
83
+ static #buildError(type, error) {
84
+ return { type, error };
85
+ }
86
+ }
87
+
88
+ module.exports = StreamFileWriter;
package/package.json CHANGED
@@ -1,16 +1,28 @@
1
1
  {
2
2
  "name": "pluga-plg",
3
- "version": "0.1.8",
3
+ "version": "0.2.0",
4
4
  "description": "Pluga developer platform toolbox",
5
5
  "main": "index.js",
6
6
  "author": "Pluga <tech@pluga.co> (pluga.co)",
7
7
  "contributors": [
8
8
  "Leonardo Camelo (github.com/leocamelo)",
9
9
  "Renan Machado (github.com/renanmac)",
10
- "Alexandre Camillo (github.com/alexandrecamillo)"
10
+ "Alexandre Camillo (github.com/alexandrecamillo)",
11
+ "Mayara Araujo (github.com/mayaraujom)"
11
12
  ],
12
13
  "license": "MIT",
13
14
  "dependencies": {
15
+ "@aws-sdk/client-s3": "^3.934.0",
16
+ "@aws-sdk/s3-request-presigner": "^3.934.0",
14
17
  "axios": "^1.7.4"
18
+ },
19
+ "scripts": {
20
+ "test": "jest"
21
+ },
22
+ "devDependencies": {
23
+ "jest": "^30.2.0",
24
+ "eslint": "8.56.0",
25
+ "eslint-config-airbnb-base": "15.0.0",
26
+ "eslint-plugin-import": "2.29.1"
15
27
  }
16
28
  }