@taqueria/plugin-ipfs-pinata 0.0.0-pr-938-c7391836
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/DESIGN.md +107 -0
- package/index.js +268 -0
- package/index.js.map +1 -0
- package/index.ts +40 -0
- package/package.json +45 -0
- package/src/file-processing.ts +132 -0
- package/src/pinata-api.ts +68 -0
- package/src/proxy.ts +97 -0
- package/src/utils.ts +45 -0
- package/tsconfig.json +101 -0
package/DESIGN.md
ADDED
@@ -0,0 +1,107 @@
# Jest plugin for Taqueria

Authors: Michael Weichert <michael.weichert@ecadlabs.com>

Date: Apr 7, 2022

Revision: 1

---

This plugin integrates automated testing into your workflow via the Jest Testing Framework.

Our Jest plugin exposes the following tasks:

`taq test init [testsDir]`

Initializes the Jest plugin and the Jest Testing Framework. Manual invocation of this task is not usually required, as uninitialized test folders are initialized automatically. However, for explicit control, `taq test init` is available as a task for your convenience.

When executed, a _.taq/jest.config.js_ file is created if one doesn't already exist, and a _jest.config.js_ is then created in the _testsDir_ specified in the positional arguments. _testsDir_ defaults to _jestTestsRootDir_ if no value was provided.

The _.taq/jest.config.js_ file is known as the global Jest configuration, and any other _jest.config.js_ files will inherit and override settings defined in the global Jest configuration.

E.g., below is an example of a _jest.config.js_ file that would get created in your _jestTestsRootDir_:
```
module.exports = {
    ...require("/.taq/jest.config.js"),
    // Set custom configuration here
}
```

This ensures that you can centralize Jest configuration settings that are global to your project, while still allowing granular, individual tweaks for particular use-cases.
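The inheritance above is plain JavaScript object spread: later keys override earlier ones. As a minimal sketch (with `globalConfig` standing in for the required global file, and with illustrative setting names that are assumptions, not plugin output), a partition-level override behaves like this:

```javascript
// Sketch of how a partition-level jest.config.js inherits from the global one.
// globalConfig stands in for require("/.taq/jest.config.js"); the values are illustrative.
const globalConfig = { testEnvironment: "node", testTimeout: 5000 };

const partitionConfig = {
    ...globalConfig,    // inherit all project-wide settings
    testTimeout: 30000, // partition-specific override wins because it comes last
};

module.exports = partitionConfig;
```

Because the spread is evaluated first, only the keys repeated after it are overridden; everything else keeps its global value.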

`taq test [testsDir] [pattern]`

The _test_ task is used to execute your automated tests using the Jest testing framework. Running `taq test` will scan the _jestTestsRootDir_ directory, which defaults to _./tests_. The _jestTestsRootDir_ is a configuration setting specified in your _.taq/config.json_, and may be changed at any time.

Running `taq test` without specifying a _testsDir_ will default to working with _jestTestsRootDir_. A pattern can be specified to pattern-match against filenames in the _testsDir_ to limit which automated tests are executed.

If the _jest.config.js_ file is missing in the specified _testsDir_, then `taq test init [testsDir]` will be invoked automatically.

## Testing Directories / Partitions

Tests can be segmented into partitions, exposed as separate directories underneath the _jestTestsRootDir_ directory. This is sometimes desired to achieve a logical, tree-like directory structure for organizational purposes. For instance, a developer may wish to keep the tests for their smart contracts separate from the tests for a web application that interacts with those contracts.

To do this, the developer could create two partitions:

`taq test contracts`

`taq test app`

This would create and initialize two partitions for writing Jest automated tests. Assuming your _jestTestsRootDir_ hasn't been customized, it will default to _./tests_ and the directory structure will look like the following:
- tests
- tests/contracts
- tests/app

To run all automated tests in the contracts partition, a developer would run:

`taq test contracts`

To run all tests, regardless of partition, a developer could run:

`taq test`

## Compatibility with other Testing Plugins

It's expected that other plugins will provide integration with testing frameworks other than Jest. For instance, the LIGO plugin is expected to provide integration with the LIGO Testing Framework, which improves the developer workflow for smart contracts authored in one of the many syntaxes available in the LIGO language.

For instance, a developer may wish to organize their tests as such:
- tests
- tests/contracts (using the LIGO testing framework)
- tests/app (using the Jest testing framework)

To do so, a developer would change their _jestTestsRootDir_ to _tests/app_ and _ligoTestsRootDir_ to _tests/contracts_.

A developer could run their tests written with LIGO with:

`taq test --plugin ligo`

Likewise, for Jest a developer would run:

`taq test --plugin jest`

## Future considerations

### End-developer defined tasks

As first mentioned in the State Architecture Design Document, a developer should have the ability to define their own tasks that can be integrated into their workflow.

End-developer defined tasks are configured in _.taq/config.json_.

This capability offers the ability to define a task which wraps existing ones.

E.g.:

```json
...,
"tasks": {
    "test": {
        "handler": "taq test --plugin ligo && taq test --plugin jest"
    }
}
```

This would expose a task that runs all tests defined using the LIGO and Jest plugins:

`taq run test`
package/index.js
ADDED
@@ -0,0 +1,268 @@
import {Plugin as $8CNkB$Plugin, Task as $8CNkB$Task, PositionalArg as $8CNkB$PositionalArg, sendAsyncErr as $8CNkB$sendAsyncErr} from "@taqueria/node-sdk";
import $8CNkB$path from "path";
import $8CNkB$fspromises from "fs/promises";
import $8CNkB$formdata from "form-data";
import $8CNkB$fs from "fs";
import $8CNkB$nodefetch from "node-fetch";






// Async generator
// https://stackoverflow.com/questions/5827612/node-js-fs-readdir-recursive-directory-search
async function* $362846282c87e345$var$getFiles(fileOrDirPath) {
    const dirInfo = await (0, $8CNkB$fspromises).stat(fileOrDirPath);
    if (dirInfo.isFile()) {
        yield fileOrDirPath;
        return;
    }
    const dirents = await (0, $8CNkB$fspromises).readdir(fileOrDirPath, {
        withFileTypes: true
    });
    for (const dirent of dirents){
        const res = (0, $8CNkB$path).resolve(fileOrDirPath, dirent.name);
        if (dirent.isDirectory()) yield* $362846282c87e345$var$getFiles(res);
        else yield res;
    }
}
const $362846282c87e345$var$createFileProvider = async ({ fileOrDirPath: fileOrDirPath , filter: filter , shouldEstimateFileCount: shouldEstimateFileCount })=>{
    fileOrDirPath = (0, $8CNkB$path).resolve(fileOrDirPath);
    const pathInfo = await (0, $8CNkB$fspromises).stat(fileOrDirPath);
    if (!pathInfo.isFile() && !pathInfo.isDirectory()) throw new Error(`The path '${fileOrDirPath}' is not a file or directory`);
    let estimateFileCount = undefined;
    if (shouldEstimateFileCount) {
        estimateFileCount = 0;
        for await (const filePath of $362846282c87e345$var$getFiles(fileOrDirPath)){
            if (filter && !filter(filePath)) continue;
            estimateFileCount++;
        }
    }
    const fileGenerator = $362846282c87e345$var$getFiles(fileOrDirPath);
    const getNextFile = async ()=>{
        let nextFile = (await fileGenerator.next()).value;
        if (!filter) return nextFile;
        while(nextFile && !filter(nextFile)){
            nextFile = await getNextFile();
            continue;
        }
        return nextFile;
    };
    return {
        getNextFile: getNextFile,
        estimateFileCount: estimateFileCount
    };
};
const $362846282c87e345$export$8f4b23801f8f7529 = async ({ fileOrDirPath: fileOrDirPath , processFile: processFile , filter: filter , parallelCount: parallelCount = 10 , onProgress: onProgress })=>{
    const { getNextFile: getNextFile , estimateFileCount: estimateFileCount } = await $362846282c87e345$var$createFileProvider({
        fileOrDirPath: fileOrDirPath,
        filter: filter,
        shouldEstimateFileCount: true
    });
    const successes = [];
    const failures = [];
    onProgress?.({
        processedFilesCount: 0,
        estimateFileCount: estimateFileCount
    });
    await Promise.all([
        ...new Array(parallelCount)
    ].map(async (x)=>{
        let fileToProcess = await getNextFile();
        while(fileToProcess){
            const progressInfo = {
                processedFilesCount: successes.length + failures.length,
                estimateFileCount: estimateFileCount
            };
            onProgress?.(progressInfo);
            try {
                const result = await processFile(fileToProcess, progressInfo);
                successes.push({
                    filePath: fileToProcess,
                    result: result
                });
                break;
            } catch (err) {
                failures.push({
                    filePath: fileToProcess,
                    error: err
                });
            }
            fileToProcess = await getNextFile();
        }
    }));
    onProgress?.({
        processedFilesCount: successes.length + failures.length,
        estimateFileCount: estimateFileCount
    });
    return {
        successes: successes,
        failures: failures
    };
};





const $8299ce3b64e7f5fc$export$77ce72def6804f3 = async ({ auth: auth , item: item })=>{
    const ipfsJsonFilePath = `${item.filePath}.ipfs.json`;
    // // Skip if already uploaded
    // try {
    // const ipfsJsonFileContent = await fs.promises.readFile(ipfsJsonFilePath, { encoding: 'utf-8' });
    // const ipfsResult = JSON.parse(ipfsJsonFileContent) as PublishFileResult;
    // if (ipfsResult.ipfsHash) {
    // console.log(`Skipping ${item.filePath}`);
    // return;
    // }
    // } catch (err) {
    // // Ignore
    // }
    const data = new (0, $8CNkB$formdata)();
    data.append("file", (0, $8CNkB$fs).createReadStream(item.filePath));
    data.append("pinataMetadata", JSON.stringify({
        name: item.name
    }));
    const response = await (0, $8CNkB$nodefetch)(`https://api.pinata.cloud/pinning/pinFileToIPFS`, {
        headers: {
            Authorization: `Bearer ${auth.pinataJwtToken}`,
            "Content-Type": `multipart/form-data; boundary=${data._boundary}`
        },
        body: data,
        method: "post"
    });
    if (!response.ok) throw new Error(`Failed to upload '${item.name}' to ipfs ${response.statusText}`);
    const uploadResult = await response.json();
    return {
        ipfsHash: uploadResult.IpfsHash
    };
};


async function $6b0ddb031a0df909$export$1391212d75b2ee65(timeout) {
    return await new Promise((resolve)=>{
        setTimeout(resolve, timeout);
    });
}
const $6b0ddb031a0df909$export$568be7ef485b9273 = ({ retryCount: retryCount = 5 })=>{
    const DELAY_MIN = 10;
    const DELAY_SUCCESS_REDUCTION = 10;
    const DELAY_FAILURE_INCREASE = 200;
    let delayTimeMs = 100;
    const processWithBackoff = async (process)=>{
        let attempt = 0;
        let lastError = undefined;
        while(attempt < retryCount){
            try {
                await $6b0ddb031a0df909$export$1391212d75b2ee65(delayTimeMs);
                const result = await process();
                delayTimeMs -= DELAY_SUCCESS_REDUCTION;
                delayTimeMs = Math.max(DELAY_MIN, delayTimeMs);
                return result;
            } catch (err) {
                lastError = err;
            }
            delayTimeMs += DELAY_FAILURE_INCREASE;
            attempt++;
        }
        // All attempts failed
        throw lastError;
    };
    return {
        processWithBackoff: processWithBackoff
    };
};


const $b297f5d0aa12bc82$var$publishToIpfs = async (fileOrDirPath, auth)=>{
    if (!fileOrDirPath) throw new Error(`path was not provided`);
    const { processWithBackoff: processWithBackoff } = (0, $6b0ddb031a0df909$export$568be7ef485b9273)({
        retryCount: 5
    });
    const result = await (0, $362846282c87e345$export$8f4b23801f8f7529)({
        fileOrDirPath: fileOrDirPath,
        parallelCount: 10,
        processFile: async (filePath)=>{
            return processWithBackoff(()=>(0, $8299ce3b64e7f5fc$export$77ce72def6804f3)({
                auth: auth,
                item: {
                    filePath: filePath,
                    name: (0, $8CNkB$path).basename(filePath)
                }
            }));
        },
        onProgress: ({ processedFilesCount: processedFilesCount , estimateFileCount: estimateFileCount })=>{
            if (estimateFileCount && processedFilesCount % 10) {
                let ratio = processedFilesCount / estimateFileCount;
                if (ratio > 1) ratio = 1;
                // TODO: Call task sdk progress
                console.log(`Progress: ${(ratio * 100).toFixed(0)}%`);
            }
        }
    });
    return {
        data: {
            fileIpfsHashes: result.successes.map((x)=>({
                filePath: x.filePath,
                ipfsHash: x.result.ipfsHash
            }))
        }
    };
};
const $b297f5d0aa12bc82$var$pinToIpfs = async (hash, auth)=>{
    if (!hash) throw new Error(`ipfs hash was not provided`);
    // TODO: Implement pinning
    throw new Error("pinToIpfs: Not Implemented");
};
const $b297f5d0aa12bc82$var$process = async (opts)=>{
    const { task: task , path: path , hash: hash , config: config , } = opts;
    const auth = {
        // TODO: get pinata api key from config
        pinataJwtToken: ""
    };
    switch(task){
        case "publish":
            return $b297f5d0aa12bc82$var$publishToIpfs(path, auth);
        case "pin":
            return $b297f5d0aa12bc82$var$pinToIpfs(hash, auth);
        default:
            throw new Error(`${task} is not an understood task by the ipfs-pinata plugin`);
    }
};
var $b297f5d0aa12bc82$export$2e2bcd8739ae039 = async (args)=>{
    const opts = args;
    try {
        return $b297f5d0aa12bc82$var$process(opts);
    } catch (err) {
        const error = err;
        if (error.message) return (0, $8CNkB$sendAsyncErr)(error.message);
    }
};


(0, $8CNkB$Plugin).create(()=>({
    schema: "0.1",
    version: "0.4.0",
    alias: "jest",
    tasks: [
        (0, $8CNkB$Task).create({
            task: "ipfs",
            command: "publish [path]",
            description: "Upload and pin files using your pinata account.",
            aliases: [],
            handler: "proxy",
            positionals: [
                (0, $8CNkB$PositionalArg).create({
                    placeholder: "path",
                    description: "Directory or file path to publish",
                    type: "string"
                }),
            ]
        }),
    ],
    proxy: $b297f5d0aa12bc82$export$2e2bcd8739ae039
}), process.argv);


//# sourceMappingURL=index.js.map
package/index.js.map
ADDED
@@ -0,0 +1 @@
{"mappings":";;;;;;;AAAA;ACAA;;ACAA;;AAIA,kBAAkB;AAClB,4FAA4F;AAC5F,gBAAgB,8BAAQ,CAAC,aAAqB,EAAyC;IACtF,MAAM,OAAO,GAAG,MAAM,CAAA,GAAA,iBAAE,CAAA,CAAC,IAAI,CAAC,aAAa,CAAC,AAAC;IAC7C,IAAI,OAAO,CAAC,MAAM,EAAE,EAAE;QACrB,MAAM,aAAa,CAAC;QACpB,OAAO;KACP;IAED,MAAM,OAAO,GAAG,MAAM,CAAA,GAAA,iBAAE,CAAA,CAAC,OAAO,CAAC,aAAa,EAAE;QAAE,aAAa,EAAE,IAAI;KAAE,CAAC,AAAC;IACzE,KAAK,MAAM,MAAM,IAAI,OAAO,CAAE;QAC7B,MAAM,GAAG,GAAG,CAAA,GAAA,WAAI,CAAA,CAAC,OAAO,CAAC,aAAa,EAAE,MAAM,CAAC,IAAI,CAAC,AAAC;QACrD,IAAI,MAAM,CAAC,WAAW,EAAE,EACvB,OAAO,8BAAQ,CAAC,GAAG,CAAC,CAAC;aAErB,MAAM,GAAG,CAAC;KAEX;CACD;AAED,MAAM,wCAAkB,GAAG,OAAO,iBACjC,aAAa,CAAA,UACb,MAAM,CAAA,2BACN,uBAAuB,CAAA,EAKvB,GAAK;IACL,aAAa,GAAG,CAAA,GAAA,WAAI,CAAA,CAAC,OAAO,CAAC,aAAa,CAAC,CAAC;IAC5C,MAAM,QAAQ,GAAG,MAAM,CAAA,GAAA,iBAAE,CAAA,CAAC,IAAI,CAAC,aAAa,CAAC,AAAC;IAC9C,IACC,CAAC,QAAQ,CAAC,MAAM,EAAE,IACf,CAAC,QAAQ,CAAC,WAAW,EAAE,EAE1B,MAAM,IAAI,KAAK,CAAC,CAAC,UAAU,EAAE,aAAa,CAAC,4BAA4B,CAAC,CAAC,CAAC;IAG3E,IAAI,iBAAiB,GAAG,SAAS,AAAsB,AAAC;IACxD,IAAI,uBAAuB,EAAE;QAC5B,iBAAiB,GAAG,CAAC,CAAC;QACtB,WAAW,MAAM,QAAQ,IAAI,8BAAQ,CAAC,aAAa,CAAC,CAAE;YACrD,IAAI,MAAM,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,EAC9B,SAAS;YAEV,iBAAiB,EAAE,CAAC;SACpB;KACD;IAED,MAAM,aAAa,GAAG,8BAAQ,CAAC,aAAa,CAAC,AAAC;IAC9C,MAAM,WAAW,GAAG,UAAY;QAC/B,IAAI,QAAQ,GAAG,AAAC,CAAA,MAAM,aAAa,CAAC,IAAI,EAAE,CAAA,CAAE,KAAK,AAAC;QAClD,IAAI,CAAC,MAAM,EACV,OAAO,QAAQ,CAAC;QAGjB,MAAO,QAAQ,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAE;YACrC,QAAQ,GAAG,MAAM,WAAW,EAAE,CAAC;YAC/B,SAAS;SACT;QAED,OAAO,QAAQ,CAAC;KAChB,AAAC;IACF,OAAO;qBACN,WAAW;2BACX,iBAAiB;KACjB,CAAC;CACF,AAAC;AAGK,MAAM,yCAAY,GAAG,OAAgB,iBAC3C,aAAa,CAAA,eACb,WAAW,CAAA,UACX,MAAM,CAAA,iBACN,aAAa,GAAG,EAAE,eAClB,UAAU,CAAA,EAOV,GAAK;IACL,MAAM,eAAE,WAAW,CAAA,qBAAE,iBAAiB,CAAA,EAAE,GAAG,MAAM,wCAAkB,CAAC;uBACnE,aAAa;gBACb,MAAM;QACN,uBAAuB,EAAE,IAAI;KAC7B,CAAC,AAAC;IAEH,MAAM,SAAS,GAAG,EAAE,AAA2C,AAAC;IAChE,MAAM,QAAQ,GAAG,EAAE,AAA0C,AAAC;IAE9D,UAAU,GAAG;QACZ,mBAAmB,EAAE,CAAC;2BACtB,iBAAiB;KACjB,CAAC,CAAC;IAEH,MAAM,OAAO,CAAC,GAAG,CAAC;WAAI,IAAI,KAAK,CAA
C,aAAa,CAAC;KAAC,CAAC,GAAG,CAAC,OAAM,CAAC,GAAI;QAC9D,IAAI,aAAa,GAAG,MAAM,WAAW,EAAE,AAAC;QACxC,MAAO,aAAa,CAAE;YACrB,MAAM,YAAY,GAAG;gBACpB,mBAAmB,EAAE,SAAS,CAAC,MAAM,GAAG,QAAQ,CAAC,MAAM;mCACvD,iBAAiB;aACjB,AAAC;YACF,UAAU,GAAG,YAAY,CAAC,CAAC;YAE3B,IAAI;gBACH,MAAM,MAAM,GAAG,MAAM,WAAW,CAAC,aAAa,EAAE,YAAY,CAAC,AAAC;gBAC9D,SAAS,CAAC,IAAI,CAAC;oBAAE,QAAQ,EAAE,aAAa;4BAAE,MAAM;iBAAE,CAAC,CAAC;gBACpD,MAAM;aACN,CAAC,OAAO,GAAG,EAAE;gBACb,QAAQ,CAAC,IAAI,CAAC;oBAAE,QAAQ,EAAE,aAAa;oBAAE,KAAK,EAAE,GAAG;iBAAE,CAAC,CAAC;aACvD;YAED,aAAa,GAAG,MAAM,WAAW,EAAE,CAAC;SACpC;KACD,CAAC,CAAC,CAAC;IAEJ,UAAU,GAAG;QACZ,mBAAmB,EAAE,SAAS,CAAC,MAAM,GAAG,QAAQ,CAAC,MAAM;2BACvD,iBAAiB;KACjB,CAAC,CAAC;IAEH,OAAO;mBACN,SAAS;kBACT,QAAQ;KACR,CAAC;CACF,AAAC;;;ACnIF;;;AAYO,MAAM,wCAAiB,GAAG,OAAO,QACvC,IAAI,CAAA,QACJ,IAAI,CAAA,EAOJ,GAAiC;IACjC,MAAM,gBAAgB,GAAG,CAAC,EAAE,IAAI,CAAC,QAAQ,CAAC,UAAU,CAAC,AAAC;IAEtD,8BAA8B;IAC9B,QAAQ;IACR,uGAAuG;IACvG,+EAA+E;IAC/E,iCAAiC;IACjC,oDAAoD;IACpD,kBAAkB;IAClB,QAAQ;IACR,kBAAkB;IAClB,gBAAgB;IAChB,IAAI;IAEJ,MAAM,IAAI,GAAG,IAAI,CAAA,GAAA,eAAQ,CAAA,EAAE,AAAC;IAC5B,IAAI,CAAC,MAAM,CAAC,MAAM,EAAE,CAAA,GAAA,SAAE,CAAA,CAAC,gBAAgB,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC;IACxD,IAAI,CAAC,MAAM,CACV,gBAAgB,EAChB,IAAI,CAAC,SAAS,CAAC;QACd,IAAI,EAAE,IAAI,CAAC,IAAI;KACf,CAAC,CACF,CAAC;IAEF,MAAM,QAAQ,GAAG,MAAM,CAAA,GAAA,gBAAK,CAAA,CAAC,CAAC,8CAA8C,CAAC,EAAE;QAC9E,OAAO,EAAE;YACR,aAAa,EAAE,CAAC,OAAO,EAAE,IAAI,CAAC,cAAc,CAAC,CAAC;YAC9C,cAAc,EAAE,CAAC,8BAA8B,EAAE,AAAC,IAAI,CAAsC,SAAS,CAAC,CAAC;SACvG;QACD,IAAI,EAAE,IAAI;QACV,MAAM,EAAE,MAAM;KACd,CAAC,AAAC;IAEH,IAAI,CAAC,QAAQ,CAAC,EAAE,EACf,MAAM,IAAI,KAAK,CAAC,CAAC,kBAAkB,EAAE,IAAI,CAAC,IAAI,CAAC,UAAU,EAAE,QAAQ,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC;IAGnF,MAAM,YAAY,GAAG,MAAM,QAAQ,CAAC,IAAI,EAAE,AAIzC,AAAC;IAEF,OAAO;QACN,QAAQ,EAAE,YAAY,CAAC,QAAQ;KAC/B,CAAC;CACF,AAAC;;;ACnEK,eAAe,yCAAK,CAAC,OAAe,EAAiB;IAC3D,OAAO,MAAM,IAAI,OAAO,CAAC,CAAA,OAAO,GAAI;QACnC,UAAU,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC;KAC7B,CAAC,CAAC;CACH;AAEM,MAAM,yCAA8B,GAAG,CAAC,cAC9C,UAAU,GAAG,CAAC,GAGd,GAAK;IAC
L,MAAM,SAAS,GAAG,EAAE,AAAC;IACrB,MAAM,uBAAuB,GAAG,EAAE,AAAC;IACnC,MAAM,sBAAsB,GAAG,GAAG,AAAC;IAEnC,IAAI,WAAW,GAAG,GAAG,AAAC;IAEtB,MAAM,kBAAkB,GAAG,OAAgB,OAA+B,GAAK;QAC9E,IAAI,OAAO,GAAG,CAAC,AAAC;QAChB,IAAI,SAAS,GAAG,SAAS,AAAW,AAAC;QACrC,MAAO,OAAO,GAAG,UAAU,CAAE;YAC5B,IAAI;gBACH,MAAM,yCAAK,CAAC,WAAW,CAAC,CAAC;gBAEzB,MAAM,MAAM,GAAG,MAAM,OAAO,EAAE,AAAC;gBAE/B,WAAW,IAAI,uBAAuB,CAAC;gBACvC,WAAW,GAAG,IAAI,CAAC,GAAG,CAAC,SAAS,EAAE,WAAW,CAAC,CAAC;gBAE/C,OAAO,MAAM,CAAC;aACd,CAAC,OAAO,GAAG,EAAE;gBACb,SAAS,GAAG,GAAG,CAAC;aAChB;YACD,WAAW,IAAI,sBAAsB,CAAC;YACtC,OAAO,EAAE,CAAC;SACV;QAED,sBAAsB;QACtB,MAAM,SAAS,CAAC;KAChB,AAAC;IAEF,OAAO;4BACN,kBAAkB;KAClB,CAAC;CACF,AAAC;;;AHhCF,MAAM,mCAAa,GAAG,OAAO,aAAiC,EAAE,IAAgB,GAA8B;IAC7G,IAAI,CAAC,aAAa,EACjB,MAAM,IAAI,KAAK,CAAC,CAAC,qBAAqB,CAAC,CAAC,CAAC;IAG1C,MAAM,sBAAE,kBAAkB,CAAA,EAAE,GAAG,CAAA,GAAA,yCAA8B,CAAA,CAAC;QAC7D,UAAU,EAAE,CAAC;KACb,CAAC,AAAC;IAEH,MAAM,MAAM,GAAG,MAAM,CAAA,GAAA,yCAAY,CAAA,CAAC;uBACjC,aAAa;QACb,aAAa,EAAE,EAAE;QACjB,WAAW,EAAE,OAAM,QAAQ,GAAI;YAC9B,OAAO,kBAAkB,CAAC,IACzB,CAAA,GAAA,wCAAiB,CAAA,CAAC;0BACjB,IAAI;oBACJ,IAAI,EAAE;kCAAE,QAAQ;wBAAE,IAAI,EAAE,CAAA,GAAA,WAAI,CAAA,CAAC,QAAQ,CAAC,QAAQ,CAAC;qBAAE;iBACjD,CAAC,CACF,CAAC;SACF;QACD,UAAU,EAAE,CAAC,uBAAE,mBAAmB,CAAA,qBAAE,iBAAiB,CAAA,EAAE,GAAK;YAC3D,IAAI,iBAAiB,IAAI,mBAAmB,GAAG,EAAE,EAAE;gBAClD,IAAI,KAAK,GAAG,mBAAmB,GAAG,iBAAiB,AAAC;gBACpD,IAAI,KAAK,GAAG,CAAC,EAAE,KAAK,GAAG,CAAC,CAAC;gBAEzB,+BAA+B;gBAC/B,OAAO,CAAC,GAAG,CAAC,CAAC,UAAU,EAAE,AAAC,CAAA,KAAK,GAAG,GAAG,CAAA,CAAE,OAAO,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;aACtD;SACD;KACD,CAAC,AAAC;IAEH,OAAO;QACN,IAAI,EAAE;YACL,cAAc,EAAE,MAAM,CAAC,SAAS,CAAC,GAAG,CAAC,CAAA,CAAC,GAAK,CAAA;oBAC1C,QAAQ,EAAE,CAAC,CAAC,QAAQ;oBACpB,QAAQ,EAAE,CAAC,CAAC,MAAM,CAAC,QAAQ;iBAC3B,CAAA,AAAC,CAAC;SACH;KACD,CAAC;CACF,AAAC;AAEF,MAAM,+BAAS,GAAG,OAAO,IAAwB,EAAE,IAAgB,GAA8B;IAChG,IAAI,CAAC,IAAI,EACR,MAAM,IAAI,KAAK,CAAC,CAAC,0BAA0B,CAAC,CAAC,CAAC;IAG/C,0BAA0B;IAC1B,MAAM,IAAI,KAAK,CAAC,4BAA4B,CAAC,CAAC;CAC9C,AAAC;AAEF,MAAM,6BAAO,GAAG,OAAO,IAAU,GAA8
B;IAC9D,MAAM,QACL,IAAI,CAAA,QACJ,IAAI,CAAA,QACJ,IAAI,CAAA,UACJ,MAAM,CAAA,IACN,GAAG,IAAI,AAAC;IAET,MAAM,IAAI,GAAe;QACxB,uCAAuC;QACvC,cAAc,EAAE,EAAE;KAClB,AAAC;IAEF,OAAQ,IAAI;QACX,KAAK,SAAS;YACb,OAAO,mCAAa,CAAC,IAAI,EAAE,IAAI,CAAC,CAAC;QAClC,KAAK,KAAK;YACT,OAAO,+BAAS,CAAC,IAAI,EAAE,IAAI,CAAC,CAAC;QAC9B;YACC,MAAM,IAAI,KAAK,CAAC,CAAC,EAAE,IAAI,CAAC,oDAAoD,CAAC,CAAC,CAAC;KAChF;CACD,AAAC;IAEF,wCAWE,GAXa,OAAO,IAAkC,GAA8B;IACrF,MAAM,IAAI,GAAG,IAAI,AAAQ,AAAC;IAE1B,IAAI;QACH,OAAO,6BAAO,CAAC,IAAI,CAAC,CAAC;KACrB,CAAC,OAAO,GAAG,EAAE;QACb,MAAM,KAAK,GAAG,GAAG,AAAS,AAAC;QAC3B,IAAI,KAAK,CAAC,OAAO,EAChB,OAAO,CAAA,GAAA,mBAAY,CAAA,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;KAEpC;CACD;;;AD7FD,CAAA,GAAA,aAAM,CAAA,CAAC,MAAM,CAAC,IAAO,CAAA;QACpB,MAAM,EAAE,KAAK;QACb,OAAO,EAAE,OAAO;QAChB,KAAK,EAAE,MAAM;QACb,KAAK,EAAE;YACN,CAAA,GAAA,WAAI,CAAA,CAAC,MAAM,CAAC;gBACX,IAAI,EAAE,MAAM;gBACZ,OAAO,EAAE,gBAAgB;gBACzB,WAAW,EAAE,iDAAiD;gBAC9D,OAAO,EAAE,EAAE;gBACX,OAAO,EAAE,OAAO;gBAChB,WAAW,EAAE;oBACZ,CAAA,GAAA,oBAAa,CAAA,CAAC,MAAM,CAAC;wBACpB,WAAW,EAAE,MAAM;wBACnB,WAAW,EAAE,mCAAmC;wBAChD,IAAI,EAAE,QAAQ;qBACd,CAAC;iBACF;aACD,CAAC;SAgBF;eACD,wCAAK;KACL,CAAA,AAAC,EAAE,OAAO,CAAC,IAAI,CAAC,CAAC","sources":["index.ts","src/proxy.ts","src/file-processing.ts","src/pinata-api.ts","src/utils.ts"],"sourcesContent":["import { Option, Plugin, PositionalArg, Task } from '@taqueria/node-sdk';\nimport proxy from './src/proxy';\n\nPlugin.create(() => ({\n\tschema: '0.1',\n\tversion: '0.4.0',\n\talias: 'jest',\n\ttasks: [\n\t\tTask.create({\n\t\t\ttask: 'ipfs',\n\t\t\tcommand: 'publish [path]',\n\t\t\tdescription: 'Upload and pin files using your pinata account.',\n\t\t\taliases: [],\n\t\t\thandler: 'proxy',\n\t\t\tpositionals: [\n\t\t\t\tPositionalArg.create({\n\t\t\t\t\tplaceholder: 'path',\n\t\t\t\t\tdescription: 'Directory or file path to publish',\n\t\t\t\t\ttype: 'string',\n\t\t\t\t}),\n\t\t\t],\n\t\t}),\n\t\t// Pinning Not Implemented Yet\n\t\t// Task.create({\n\t\t// \ttask: 'ipfs',\n\t\t// \tcommand: 'pin [hash]',\n\t\t// 
\tdescription: 'Pin a file already on ipfs with your pinata account.',\n\t\t// \taliases: [],\n\t\t// \thandler: 'proxy',\n\t\t// \tpositionals: [\n\t\t// \t\tPositionalArg.create({\n\t\t// \t\t\tplaceholder: 'hash',\n\t\t// \t\t\tdescription: 'Ipfs hash of the file or directory that is already on the ipfs network.',\n\t\t// \t\t\ttype: 'string',\n\t\t// \t\t}),\n\t\t// \t]\n\t\t// }),\n\t],\n\tproxy,\n}), process.argv);\n","import { sendAsyncErr, sendAsyncRes, sendErr } from '@taqueria/node-sdk';\nimport { LoadedConfig, PluginResponse, RequestArgs, SanitizedAbsPath } from '@taqueria/node-sdk/types';\nimport path from 'path';\nimport { processFiles } from './file-processing';\nimport { PinataAuth, publishFileToIpfs } from './pinata-api';\nimport { createProcessBackoffController } from './utils';\n\ninterface Opts extends RequestArgs.ProxyRequestArgs {\n\treadonly path?: string;\n\treadonly hash?: string;\n}\n\nconst publishToIpfs = async (fileOrDirPath: undefined | string, auth: PinataAuth): Promise<PluginResponse> => {\n\tif (!fileOrDirPath) {\n\t\tthrow new Error(`path was not provided`);\n\t}\n\n\tconst { processWithBackoff } = createProcessBackoffController({\n\t\tretryCount: 5,\n\t});\n\n\tconst result = await processFiles({\n\t\tfileOrDirPath,\n\t\tparallelCount: 10,\n\t\tprocessFile: async filePath => {\n\t\t\treturn processWithBackoff(() =>\n\t\t\t\tpublishFileToIpfs({\n\t\t\t\t\tauth,\n\t\t\t\t\titem: { filePath, name: path.basename(filePath) },\n\t\t\t\t})\n\t\t\t);\n\t\t},\n\t\tonProgress: ({ processedFilesCount, estimateFileCount }) => {\n\t\t\tif (estimateFileCount && processedFilesCount % 10) {\n\t\t\t\tlet ratio = processedFilesCount / estimateFileCount;\n\t\t\t\tif (ratio > 1) ratio = 1;\n\n\t\t\t\t// TODO: Call task sdk progress\n\t\t\t\tconsole.log(`Progress: ${(ratio * 100).toFixed(0)}%`);\n\t\t\t}\n\t\t},\n\t});\n\n\treturn {\n\t\tdata: {\n\t\t\tfileIpfsHashes: result.successes.map(x => ({\n\t\t\t\tfilePath: x.filePath,\n\t\t\t\tipfsHash: 
x.result.ipfsHash,\n\t\t\t})),\n\t\t},\n\t};\n};\n\nconst pinToIpfs = async (hash: undefined | string, auth: PinataAuth): Promise<PluginResponse> => {\n\tif (!hash) {\n\t\tthrow new Error(`ipfs hash was not provided`);\n\t}\n\n\t// TODO: Implement pinning\n\tthrow new Error('pinToIpfs: Not Implemented');\n};\n\nconst process = async (opts: Opts): Promise<PluginResponse> => {\n\tconst {\n\t\ttask,\n\t\tpath,\n\t\thash,\n\t\tconfig,\n\t} = opts;\n\n\tconst auth: PinataAuth = {\n\t\t// TODO: get pinata api key from config\n\t\tpinataJwtToken: '',\n\t};\n\n\tswitch (task) {\n\t\tcase 'publish':\n\t\t\treturn publishToIpfs(path, auth);\n\t\tcase 'pin':\n\t\t\treturn pinToIpfs(hash, auth);\n\t\tdefault:\n\t\t\tthrow new Error(`${task} is not an understood task by the ipfs-pinata plugin`);\n\t}\n};\n\nexport default async (args: RequestArgs.ProxyRequestArgs): Promise<PluginResponse> => {\n\tconst opts = args as Opts;\n\n\ttry {\n\t\treturn process(opts);\n\t} catch (err) {\n\t\tconst error = err as Error;\n\t\tif (error.message) {\n\t\t\treturn sendAsyncErr(error.message);\n\t\t}\n\t}\n};\n","import fs from 'fs/promises';\nimport path from 'path';\nimport { delay } from './utils';\n\n// Async generator\n// https://stackoverflow.com/questions/5827612/node-js-fs-readdir-recursive-directory-search\nasync function* getFiles(fileOrDirPath: string): AsyncGenerator<string, void, unknown> {\n\tconst dirInfo = await fs.stat(fileOrDirPath);\n\tif (dirInfo.isFile()) {\n\t\tyield fileOrDirPath;\n\t\treturn;\n\t}\n\n\tconst dirents = await fs.readdir(fileOrDirPath, { withFileTypes: true });\n\tfor (const dirent of dirents) {\n\t\tconst res = path.resolve(fileOrDirPath, dirent.name);\n\t\tif (dirent.isDirectory()) {\n\t\t\tyield* getFiles(res);\n\t\t} else {\n\t\t\tyield res;\n\t\t}\n\t}\n}\n\nconst createFileProvider = async ({\n\tfileOrDirPath,\n\tfilter,\n\tshouldEstimateFileCount,\n}: {\n\tfileOrDirPath: string;\n\tfilter?: (filePath: string) => 
boolean;\n\tshouldEstimateFileCount?: boolean;\n}) => {\n\tfileOrDirPath = path.resolve(fileOrDirPath);\n\tconst pathInfo = await fs.stat(fileOrDirPath);\n\tif (\n\t\t!pathInfo.isFile()\n\t\t&& !pathInfo.isDirectory()\n\t) {\n\t\tthrow new Error(`The path '${fileOrDirPath}' is not a file or directory`);\n\t}\n\n\tlet estimateFileCount = undefined as undefined | number;\n\tif (shouldEstimateFileCount) {\n\t\testimateFileCount = 0;\n\t\tfor await (const filePath of getFiles(fileOrDirPath)) {\n\t\t\tif (filter && !filter(filePath)) {\n\t\t\t\tcontinue;\n\t\t\t}\n\t\t\testimateFileCount++;\n\t\t}\n\t}\n\n\tconst fileGenerator = getFiles(fileOrDirPath);\n\tconst getNextFile = async () => {\n\t\tlet nextFile = (await fileGenerator.next()).value;\n\t\tif (!filter) {\n\t\t\treturn nextFile;\n\t\t}\n\n\t\twhile (nextFile && !filter(nextFile)) {\n\t\t\tnextFile = await getNextFile();\n\t\t\tcontinue;\n\t\t}\n\n\t\treturn nextFile;\n\t};\n\treturn {\n\t\tgetNextFile,\n\t\testimateFileCount,\n\t};\n};\n\ntype ProgressInfo = { processedFilesCount: number; estimateFileCount: undefined | number };\nexport const processFiles = async <TResult>({\n\tfileOrDirPath,\n\tprocessFile,\n\tfilter,\n\tparallelCount = 10,\n\tonProgress,\n}: {\n\tfileOrDirPath: string;\n\tprocessFile: (filePath: string, progress: ProgressInfo) => Promise<TResult>;\n\tfilter?: (filePath: string) => boolean;\n\tparallelCount?: number;\n\tonProgress?: (progress: ProgressInfo) => void;\n}) => {\n\tconst { getNextFile, estimateFileCount } = await createFileProvider({\n\t\tfileOrDirPath,\n\t\tfilter,\n\t\tshouldEstimateFileCount: true,\n\t});\n\n\tconst successes = [] as { filePath: string; result: TResult }[];\n\tconst failures = [] as { filePath: string; error: unknown }[];\n\n\tonProgress?.({\n\t\tprocessedFilesCount: 0,\n\t\testimateFileCount,\n\t});\n\n\tawait Promise.all([...new Array(parallelCount)].map(async x => {\n\t\tlet fileToProcess = await getNextFile();\n\t\twhile (fileToProcess) {\n\t\t\tconst 
progressInfo = {\n\t\t\t\tprocessedFilesCount: successes.length + failures.length,\n\t\t\t\testimateFileCount,\n\t\t\t};\n\t\t\tonProgress?.(progressInfo);\n\n\t\t\ttry {\n\t\t\t\tconst result = await processFile(fileToProcess, progressInfo);\n\t\t\t\tsuccesses.push({ filePath: fileToProcess, result });\n\t\t\t\tbreak;\n\t\t\t} catch (err) {\n\t\t\t\tfailures.push({ filePath: fileToProcess, error: err });\n\t\t\t}\n\n\t\t\tfileToProcess = await getNextFile();\n\t\t}\n\t}));\n\n\tonProgress?.({\n\t\tprocessedFilesCount: successes.length + failures.length,\n\t\testimateFileCount,\n\t});\n\n\treturn {\n\t\tsuccesses,\n\t\tfailures,\n\t};\n};\n","import FormData from 'form-data';\nimport fs from 'fs';\nimport fetch from 'node-fetch';\n\nexport type PinataAuth = {\n\tpinataJwtToken: string;\n};\n\nexport type PublishFileResult = {\n\tipfsHash: string;\n};\n\nexport const publishFileToIpfs = async ({\n\tauth,\n\titem,\n}: {\n\tauth: PinataAuth;\n\titem: {\n\t\tname: string;\n\t\tfilePath: string;\n\t};\n}): Promise<PublishFileResult> => {\n\tconst ipfsJsonFilePath = `${item.filePath}.ipfs.json`;\n\n\t// // Skip if already uploaded\n\t// try {\n\t// const ipfsJsonFileContent = await fs.promises.readFile(ipfsJsonFilePath, { encoding: 'utf-8' });\n\t// const ipfsResult = JSON.parse(ipfsJsonFileContent) as PublishFileResult;\n\t// if (ipfsResult.ipfsHash) {\n\t// console.log(`Skipping ${item.filePath}`);\n\t// return;\n\t// }\n\t// } catch (err) {\n\t// // Ignore\n\t// }\n\n\tconst data = new FormData();\n\tdata.append('file', fs.createReadStream(item.filePath));\n\tdata.append(\n\t\t'pinataMetadata',\n\t\tJSON.stringify({\n\t\t\tname: item.name,\n\t\t}),\n\t);\n\n\tconst response = await fetch(`https://api.pinata.cloud/pinning/pinFileToIPFS`, {\n\t\theaders: {\n\t\t\tAuthorization: `Bearer ${auth.pinataJwtToken}`,\n\t\t\t'Content-Type': `multipart/form-data; boundary=${(data as unknown as { _boundary: string })._boundary}`,\n\t\t},\n\t\tbody: data,\n\t\tmethod: 
'post',\n\t});\n\n\tif (!response.ok) {\n\t\tthrow new Error(`Failed to upload '${item.name}' to ipfs ${response.statusText}`);\n\t}\n\n\tconst uploadResult = await response.json() as {\n\t\tIpfsHash: string; // This is the IPFS multi-hash provided back for your content,\n\t\tPinSize: string; // This is how large (in bytes) the content you just pinned is,\n\t\tTimestamp: string; // This is the timestamp for your content pinning (represented in ISO 8601 format)\n\t};\n\n\treturn {\n\t\tipfsHash: uploadResult.IpfsHash,\n\t};\n};\n","export async function delay(timeout: number): Promise<void> {\n\treturn await new Promise(resolve => {\n\t\tsetTimeout(resolve, timeout);\n\t});\n}\n\nexport const createProcessBackoffController = ({\n\tretryCount = 5,\n}: {\n\tretryCount: number;\n}) => {\n\tconst DELAY_MIN = 10;\n\tconst DELAY_SUCCESS_REDUCTION = 10;\n\tconst DELAY_FAILURE_INCREASE = 200;\n\n\tlet delayTimeMs = 100;\n\n\tconst processWithBackoff = async <TResult>(process: () => Promise<TResult>) => {\n\t\tlet attempt = 0;\n\t\tlet lastError = undefined as unknown;\n\t\twhile (attempt < retryCount) {\n\t\t\ttry {\n\t\t\t\tawait delay(delayTimeMs);\n\n\t\t\t\tconst result = await process();\n\n\t\t\t\tdelayTimeMs -= DELAY_SUCCESS_REDUCTION;\n\t\t\t\tdelayTimeMs = Math.max(DELAY_MIN, delayTimeMs);\n\n\t\t\t\treturn result;\n\t\t\t} catch (err) {\n\t\t\t\tlastError = err;\n\t\t\t}\n\t\t\tdelayTimeMs += DELAY_FAILURE_INCREASE;\n\t\t\tattempt++;\n\t\t}\n\n\t\t// All attempts failed\n\t\tthrow lastError;\n\t};\n\n\treturn {\n\t\tprocessWithBackoff,\n\t};\n};\n"],"names":[],"version":3,"file":"index.js.map","sourceRoot":"/"}
|
package/index.ts
ADDED
@@ -0,0 +1,40 @@
+import { Option, Plugin, PositionalArg, Task } from '@taqueria/node-sdk';
+import proxy from './src/proxy';
+
+Plugin.create(() => ({
+    schema: '0.1',
+    version: '0.4.0',
+    alias: 'jest',
+    tasks: [
+        Task.create({
+            task: 'ipfs',
+            command: 'publish [path]',
+            description: 'Upload and pin files using your pinata account.',
+            aliases: [],
+            handler: 'proxy',
+            positionals: [
+                PositionalArg.create({
+                    placeholder: 'path',
+                    description: 'Directory or file path to publish',
+                    type: 'string',
+                }),
+            ],
+        }),
+        // Pinning Not Implemented Yet
+        // Task.create({
+        //     task: 'ipfs',
+        //     command: 'pin [hash]',
+        //     description: 'Pin a file already on ipfs with your pinata account.',
+        //     aliases: [],
+        //     handler: 'proxy',
+        //     positionals: [
+        //         PositionalArg.create({
+        //             placeholder: 'hash',
+        //             description: 'Ipfs hash of the file or directory that is already on the ipfs network.',
+        //             type: 'string',
+        //         }),
+        //     ]
+        // }),
+    ],
+    proxy,
+}), process.argv);
package/package.json
ADDED
@@ -0,0 +1,45 @@
+{
+    "name": "@taqueria/plugin-ipfs-pinata",
+    "version": "0.0.0-pr-938-c7391836",
+    "description": "A plugin for Taqueria providing ipfs publishing and pinning using the Pinata service",
+    "keywords": [
+        "taqueria",
+        "plugin",
+        "jest",
+        "testing",
+        "tdd",
+        "ecad",
+        "ecadlabs",
+        "tezos"
+    ],
+    "targets": {
+        "default": {
+            "source": "./index.ts",
+            "distDir": "./",
+            "context": "node",
+            "isLibrary": true,
+            "outputFormat": "esmodule"
+        }
+    },
+    "scripts": {
+        "build": "npx tsc -noEmit -p ./tsconfig.json && npx parcel build --no-cache 2>&1"
+    },
+    "author": "ECAD Labs",
+    "license": "Apache-2.0",
+    "type": "module",
+    "repository": {
+        "type": "git",
+        "url": "https://github.com/ecadlabs/taqueria.git",
+        "directory": "taqueria-plugin-ipfs-pinata"
+    },
+    "dependencies": {
+        "@taqueria/node-sdk": "^0.6.2",
+        "form-data": "^4.0.0",
+        "node-fetch": "^2.6.7"
+    },
+    "devDependencies": {
+        "@types/node-fetch": "^2.6.1",
+        "parcel": "^2.6.0",
+        "typescript": "4.7.2"
+    }
+}
package/src/file-processing.ts
ADDED
@@ -0,0 +1,132 @@
+import fs from 'fs/promises';
+import path from 'path';
+import { delay } from './utils';
+
+// Async generator
+// https://stackoverflow.com/questions/5827612/node-js-fs-readdir-recursive-directory-search
+async function* getFiles(fileOrDirPath: string): AsyncGenerator<string, void, unknown> {
+    const dirInfo = await fs.stat(fileOrDirPath);
+    if (dirInfo.isFile()) {
+        yield fileOrDirPath;
+        return;
+    }
+
+    const dirents = await fs.readdir(fileOrDirPath, { withFileTypes: true });
+    for (const dirent of dirents) {
+        const res = path.resolve(fileOrDirPath, dirent.name);
+        if (dirent.isDirectory()) {
+            yield* getFiles(res);
+        } else {
+            yield res;
+        }
+    }
+}
+
+const createFileProvider = async ({
+    fileOrDirPath,
+    filter,
+    shouldEstimateFileCount,
+}: {
+    fileOrDirPath: string;
+    filter?: (filePath: string) => boolean;
+    shouldEstimateFileCount?: boolean;
+}) => {
+    fileOrDirPath = path.resolve(fileOrDirPath);
+    const pathInfo = await fs.stat(fileOrDirPath);
+    if (
+        !pathInfo.isFile()
+        && !pathInfo.isDirectory()
+    ) {
+        throw new Error(`The path '${fileOrDirPath}' is not a file or directory`);
+    }
+
+    let estimateFileCount = undefined as undefined | number;
+    if (shouldEstimateFileCount) {
+        estimateFileCount = 0;
+        for await (const filePath of getFiles(fileOrDirPath)) {
+            if (filter && !filter(filePath)) {
+                continue;
+            }
+            estimateFileCount++;
+        }
+    }
+
+    const fileGenerator = getFiles(fileOrDirPath);
+    const getNextFile = async () => {
+        let nextFile = (await fileGenerator.next()).value;
+        if (!filter) {
+            return nextFile;
+        }
+
+        while (nextFile && !filter(nextFile)) {
+            nextFile = await getNextFile();
+            continue;
+        }
+
+        return nextFile;
+    };
+    return {
+        getNextFile,
+        estimateFileCount,
+    };
+};
+
+type ProgressInfo = { processedFilesCount: number; estimateFileCount: undefined | number };
+export const processFiles = async <TResult>({
+    fileOrDirPath,
+    processFile,
+    filter,
+    parallelCount = 10,
+    onProgress,
+}: {
+    fileOrDirPath: string;
+    processFile: (filePath: string, progress: ProgressInfo) => Promise<TResult>;
+    filter?: (filePath: string) => boolean;
+    parallelCount?: number;
+    onProgress?: (progress: ProgressInfo) => void;
+}) => {
+    const { getNextFile, estimateFileCount } = await createFileProvider({
+        fileOrDirPath,
+        filter,
+        shouldEstimateFileCount: true,
+    });
+
+    const successes = [] as { filePath: string; result: TResult }[];
+    const failures = [] as { filePath: string; error: unknown }[];
+
+    onProgress?.({
+        processedFilesCount: 0,
+        estimateFileCount,
+    });
+
+    await Promise.all([...new Array(parallelCount)].map(async x => {
+        let fileToProcess = await getNextFile();
+        while (fileToProcess) {
+            const progressInfo = {
+                processedFilesCount: successes.length + failures.length,
+                estimateFileCount,
+            };
+            onProgress?.(progressInfo);
+
+            try {
+                const result = await processFile(fileToProcess, progressInfo);
+                successes.push({ filePath: fileToProcess, result });
+                break;
+            } catch (err) {
+                failures.push({ filePath: fileToProcess, error: err });
+            }
+
+            fileToProcess = await getNextFile();
+        }
+    }));
+
+    onProgress?.({
+        processedFilesCount: successes.length + failures.length,
+        estimateFileCount,
+    });
+
+    return {
+        successes,
+        failures,
+    };
+};
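The recursive walk in `getFiles` drives everything else in this module. Its depth-first traversal can be sketched with a synchronous generator over an in-memory tree (a simplified model for illustration; `TreeNode`, `walk`, and the sample names are invented here and are not part of the package):

```typescript
// In-memory stand-in for a directory tree, so the recursive walk can be
// demonstrated without touching the filesystem.
type TreeNode = { name: string; children?: TreeNode[] };

// Mirrors the shape of getFiles: yield leaves, recurse into directories.
function* walk(node: TreeNode, prefix = ''): Generator<string> {
    const current = `${prefix}/${node.name}`;
    if (!node.children) {
        yield current; // a "file": no children
        return;
    }
    for (const child of node.children) {
        yield* walk(child, current); // a "directory": recurse
    }
}

const tree: TreeNode = {
    name: 'assets',
    children: [
        { name: 'a.png' },
        { name: 'meta', children: [{ name: 'a.json' }] },
    ],
};

const files = [...walk(tree)];
console.log(files); // depth-first: ['/assets/a.png', '/assets/meta/a.json']
```

As with the real `getFiles`, consumers pull the next path on demand, which is what lets `createFileProvider` hand one file at a time to each parallel worker.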
package/src/pinata-api.ts
ADDED
@@ -0,0 +1,68 @@
+import FormData from 'form-data';
+import fs from 'fs';
+import fetch from 'node-fetch';
+
+export type PinataAuth = {
+    pinataJwtToken: string;
+};
+
+export type PublishFileResult = {
+    ipfsHash: string;
+};
+
+export const publishFileToIpfs = async ({
+    auth,
+    item,
+}: {
+    auth: PinataAuth;
+    item: {
+        name: string;
+        filePath: string;
+    };
+}): Promise<PublishFileResult> => {
+    const ipfsJsonFilePath = `${item.filePath}.ipfs.json`;
+
+    // // Skip if already uploaded
+    // try {
+    //     const ipfsJsonFileContent = await fs.promises.readFile(ipfsJsonFilePath, { encoding: 'utf-8' });
+    //     const ipfsResult = JSON.parse(ipfsJsonFileContent) as PublishFileResult;
+    //     if (ipfsResult.ipfsHash) {
+    //         console.log(`Skipping ${item.filePath}`);
+    //         return;
+    //     }
+    // } catch (err) {
+    //     // Ignore
+    // }
+
+    const data = new FormData();
+    data.append('file', fs.createReadStream(item.filePath));
+    data.append(
+        'pinataMetadata',
+        JSON.stringify({
+            name: item.name,
+        }),
+    );
+
+    const response = await fetch(`https://api.pinata.cloud/pinning/pinFileToIPFS`, {
+        headers: {
+            Authorization: `Bearer ${auth.pinataJwtToken}`,
+            'Content-Type': `multipart/form-data; boundary=${(data as unknown as { _boundary: string })._boundary}`,
+        },
+        body: data,
+        method: 'post',
+    });
+
+    if (!response.ok) {
+        throw new Error(`Failed to upload '${item.name}' to ipfs ${response.statusText}`);
+    }
+
+    const uploadResult = await response.json() as {
+        IpfsHash: string; // This is the IPFS multi-hash provided back for your content,
+        PinSize: string; // This is how large (in bytes) the content you just pinned is,
+        Timestamp: string; // This is the timestamp for your content pinning (represented in ISO 8601 format)
+    };
+
+    return {
+        ipfsHash: uploadResult.IpfsHash,
+    };
+};
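The tail of `publishFileToIpfs` reduces the upload response to the single field the plugin cares about. That mapping can be shown in isolation without any network call (a sketch; `toPublishResult` and the sample payload values are invented for illustration and are not exports of the plugin):

```typescript
type PublishFileResult = { ipfsHash: string };

// Minimal view of the fields the plugin reads off the upload response body.
type PinFileResponse = { IpfsHash: string; PinSize: string; Timestamp: string };

const toPublishResult = (uploadResult: PinFileResponse): PublishFileResult => {
    // Defensive check added for this sketch; the plugin itself trusts the
    // response shape once response.ok has been verified.
    if (!uploadResult.IpfsHash) {
        throw new Error('response did not include an IpfsHash');
    }
    return { ipfsHash: uploadResult.IpfsHash };
};

// Invented payload for illustration only.
const sample: PinFileResponse = {
    IpfsHash: 'QmExampleHashOnly',
    PinSize: '1234',
    Timestamp: '2022-04-07T00:00:00.000Z',
};

console.log(toPublishResult(sample).ipfsHash); // QmExampleHashOnly
```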
package/src/proxy.ts
ADDED
@@ -0,0 +1,97 @@
+import { sendAsyncErr, sendAsyncRes, sendErr } from '@taqueria/node-sdk';
+import { LoadedConfig, PluginResponse, RequestArgs, SanitizedAbsPath } from '@taqueria/node-sdk/types';
+import path from 'path';
+import { processFiles } from './file-processing';
+import { PinataAuth, publishFileToIpfs } from './pinata-api';
+import { createProcessBackoffController } from './utils';
+
+interface Opts extends RequestArgs.ProxyRequestArgs {
+    readonly path?: string;
+    readonly hash?: string;
+}
+
+const publishToIpfs = async (fileOrDirPath: undefined | string, auth: PinataAuth): Promise<PluginResponse> => {
+    if (!fileOrDirPath) {
+        throw new Error(`path was not provided`);
+    }
+
+    const { processWithBackoff } = createProcessBackoffController({
+        retryCount: 5,
+    });
+
+    const result = await processFiles({
+        fileOrDirPath,
+        parallelCount: 10,
+        processFile: async filePath => {
+            return processWithBackoff(() =>
+                publishFileToIpfs({
+                    auth,
+                    item: { filePath, name: path.basename(filePath) },
+                })
+            );
+        },
+        onProgress: ({ processedFilesCount, estimateFileCount }) => {
+            if (estimateFileCount && processedFilesCount % 10) {
+                let ratio = processedFilesCount / estimateFileCount;
+                if (ratio > 1) ratio = 1;
+
+                // TODO: Call task sdk progress
+                console.log(`Progress: ${(ratio * 100).toFixed(0)}%`);
+            }
+        },
+    });
+
+    return {
+        data: {
+            fileIpfsHashes: result.successes.map(x => ({
+                filePath: x.filePath,
+                ipfsHash: x.result.ipfsHash,
+            })),
+        },
+    };
+};
+
+const pinToIpfs = async (hash: undefined | string, auth: PinataAuth): Promise<PluginResponse> => {
+    if (!hash) {
+        throw new Error(`ipfs hash was not provided`);
+    }
+
+    // TODO: Implement pinning
+    throw new Error('pinToIpfs: Not Implemented');
+};
+
+const process = async (opts: Opts): Promise<PluginResponse> => {
+    const {
+        task,
+        path,
+        hash,
+        config,
+    } = opts;
+
+    const auth: PinataAuth = {
+        // TODO: get pinata api key from config
+        pinataJwtToken: '',
+    };
+
+    switch (task) {
+        case 'publish':
+            return publishToIpfs(path, auth);
+        case 'pin':
+            return pinToIpfs(hash, auth);
+        default:
+            throw new Error(`${task} is not an understood task by the ipfs-pinata plugin`);
+    }
+};
+
+export default async (args: RequestArgs.ProxyRequestArgs): Promise<PluginResponse> => {
+    const opts = args as Opts;
+
+    try {
+        return process(opts);
+    } catch (err) {
+        const error = err as Error;
+        if (error.message) {
+            return sendAsyncErr(error.message);
+        }
+    }
+};
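The progress string logged by the `onProgress` callback above is easy to check once the ratio logic is factored out (`formatProgress` is a name introduced for this sketch, not an export of the plugin):

```typescript
// Clamp-and-format logic matching the onProgress callback: the file-count
// estimate can undercount (e.g. files created after counting), so the
// ratio is capped at 1 before formatting.
const formatProgress = (processedFilesCount: number, estimateFileCount: number): string => {
    let ratio = processedFilesCount / estimateFileCount;
    if (ratio > 1) ratio = 1;
    return `Progress: ${(ratio * 100).toFixed(0)}%`;
};

console.log(formatProgress(5, 20)); // Progress: 25%
console.log(formatProgress(30, 20)); // Progress: 100%
```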
package/src/utils.ts
ADDED
@@ -0,0 +1,45 @@
+export async function delay(timeout: number): Promise<void> {
+    return await new Promise(resolve => {
+        setTimeout(resolve, timeout);
+    });
+}
+
+export const createProcessBackoffController = ({
+    retryCount = 5,
+}: {
+    retryCount: number;
+}) => {
+    const DELAY_MIN = 10;
+    const DELAY_SUCCESS_REDUCTION = 10;
+    const DELAY_FAILURE_INCREASE = 200;
+
+    let delayTimeMs = 100;
+
+    const processWithBackoff = async <TResult>(process: () => Promise<TResult>) => {
+        let attempt = 0;
+        let lastError = undefined as unknown;
+        while (attempt < retryCount) {
+            try {
+                await delay(delayTimeMs);
+
+                const result = await process();
+
+                delayTimeMs -= DELAY_SUCCESS_REDUCTION;
+                delayTimeMs = Math.max(DELAY_MIN, delayTimeMs);
+
+                return result;
+            } catch (err) {
+                lastError = err;
+            }
+            delayTimeMs += DELAY_FAILURE_INCREASE;
+            attempt++;
+        }
+
+        // All attempts failed
+        throw lastError;
+    };
+
+    return {
+        processWithBackoff,
+    };
+};
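The adaptive delay inside `processWithBackoff` above is shared across calls: each failure lengthens it and each success shortens it, floored at a minimum. The update rule can be modelled as a pure function of the previous delay, using the same constants (`nextDelay` is introduced for this sketch only):

```typescript
const DELAY_MIN = 10;
const DELAY_SUCCESS_REDUCTION = 10;
const DELAY_FAILURE_INCREASE = 200;

// One step of the controller's delay adaptation.
const nextDelay = (currentMs: number, succeeded: boolean): number =>
    succeeded
        ? Math.max(DELAY_MIN, currentMs - DELAY_SUCCESS_REDUCTION)
        : currentMs + DELAY_FAILURE_INCREASE;

// Starting from the controller's initial 100ms delay:
let d = 100;
d = nextDelay(d, false); // a failure pushes the delay to 300
d = nextDelay(d, true);  // a success eases it back to 290
console.log(d);
```

Because failures add 200ms while successes only subtract 10ms, a burst of rate-limit errors backs the whole upload pipeline off quickly, and the delay only creeps back down once requests start succeeding again.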
package/tsconfig.json
ADDED
@@ -0,0 +1,101 @@
+{
+    "compilerOptions": {
+        /* Visit https://aka.ms/tsconfig.json to read more about this file */
+
+        /* Projects */
+        // "incremental": true, /* Enable incremental compilation */
+        // "composite": true, /* Enable constraints that allow a TypeScript project to be used with project references. */
+        // "tsBuildInfoFile": "./", /* Specify the folder for .tsbuildinfo incremental compilation files. */
+        // "disableSourceOfProjectReferenceRedirect": true, /* Disable preferring source files instead of declaration files when referencing composite projects */
+        // "disableSolutionSearching": true, /* Opt a project out of multi-project reference checking when editing. */
+        // "disableReferencedProjectLoad": true, /* Reduce the number of projects loaded automatically by TypeScript. */
+
+        /* Language and Environment */
+        "target": "ES2021", /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */
+        // "lib": [], /* Specify a set of bundled library declaration files that describe the target runtime environment. */
+        // "jsx": "preserve", /* Specify what JSX code is generated. */
+        // "experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */
+        // "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. */
+        // "jsxFactory": "", /* Specify the JSX factory function used when targeting React JSX emit, e.g. 'React.createElement' or 'h' */
+        // "jsxFragmentFactory": "", /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */
+        // "jsxImportSource": "", /* Specify module specifier used to import the JSX factory functions when using `jsx: react-jsx*`.` */
+        // "reactNamespace": "", /* Specify the object invoked for `createElement`. This only applies when targeting `react` JSX emit. */
+        // "noLib": true, /* Disable including any library files, including the default lib.d.ts. */
+        // "useDefineForClassFields": true, /* Emit ECMAScript-standard-compliant class fields. */
+
+        /* Modules */
+        "module": "CommonJS", /* Specify what module code is generated. */
+        // "rootDir": "./", /* Specify the root folder within your source files. */
+        // "moduleResolution": "node", /* Specify how TypeScript looks up a file from a given module specifier. */
+        // "baseUrl": "./", /* Specify the base directory to resolve non-relative module names. */
+        // "paths": {}, /* Specify a set of entries that re-map imports to additional lookup locations. */
+        // "rootDirs": [], /* Allow multiple folders to be treated as one when resolving modules. */
+        // "typeRoots": [], /* Specify multiple folders that act like `./node_modules/@types`. */
+        // "types": [], /* Specify type package names to be included without being referenced in a source file. */
+        // "allowUmdGlobalAccess": true, /* Allow accessing UMD globals from modules. */
+        // "resolveJsonModule": true, /* Enable importing .json files */
+        // "noResolve": true, /* Disallow `import`s, `require`s or `<reference>`s from expanding the number of files TypeScript should add to a project. */
+
+        /* JavaScript Support */
+        // "allowJs": true, /* Allow JavaScript files to be a part of your program. Use the `checkJS` option to get errors from these files. */
+        // "checkJs": true, /* Enable error reporting in type-checked JavaScript files. */
+        // "maxNodeModuleJsDepth": 1, /* Specify the maximum folder depth used for checking JavaScript files from `node_modules`. Only applicable with `allowJs`. */
+
+        /* Emit */
+        // "declaration": true, /* Generate .d.ts files from TypeScript and JavaScript files in your project. */
+        // "declarationMap": true, /* Create sourcemaps for d.ts files. */
+        // "emitDeclarationOnly": true, /* Only output d.ts files and not JavaScript files. */
+        // "sourceMap": true, /* Create source map files for emitted JavaScript files. */
+        // "outFile": "./", /* Specify a file that bundles all outputs into one JavaScript file. If `declaration` is true, also designates a file that bundles all .d.ts output. */
+        // "outDir": "./", /* Specify an output folder for all emitted files. */
+        // "removeComments": true, /* Disable emitting comments. */
+        // "noEmit": true, /* Disable emitting files from a compilation. */
+        // "importHelpers": true, /* Allow importing helper functions from tslib once per project, instead of including them per-file. */
+        // "importsNotUsedAsValues": "remove", /* Specify emit/checking behavior for imports that are only used for types */
+        // "downlevelIteration": true, /* Emit more compliant, but verbose and less performant JavaScript for iteration. */
+        // "sourceRoot": "", /* Specify the root path for debuggers to find the reference source code. */
+        // "mapRoot": "", /* Specify the location where debugger should locate map files instead of generated locations. */
+        // "inlineSourceMap": true, /* Include sourcemap files inside the emitted JavaScript. */
+        // "inlineSources": true, /* Include source code in the sourcemaps inside the emitted JavaScript. */
+        // "emitBOM": true, /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */
+        // "newLine": "crlf", /* Set the newline character for emitting files. */
+        // "stripInternal": true, /* Disable emitting declarations that have `@internal` in their JSDoc comments. */
+        // "noEmitHelpers": true, /* Disable generating custom helper functions like `__extends` in compiled output. */
+        // "noEmitOnError": true, /* Disable emitting files if any type checking errors are reported. */
+        // "preserveConstEnums": true, /* Disable erasing `const enum` declarations in generated code. */
+        // "declarationDir": "./", /* Specify the output directory for generated declaration files. */
+        // "preserveValueImports": true, /* Preserve unused imported values in the JavaScript output that would otherwise be removed. */
+
+        /* Interop Constraints */
+        "isolatedModules": true, /* Ensure that each file can be safely transpiled without relying on other imports. */
+        "allowSyntheticDefaultImports": true, /* Allow 'import x from y' when a module doesn't have a default export. */
+        "esModuleInterop": true, /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables `allowSyntheticDefaultImports` for type compatibility. */
+        // "preserveSymlinks": true, /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */
+        "forceConsistentCasingInFileNames": true, /* Ensure that casing is correct in imports. */
+
+        /* Type Checking */
+        "strict": true, /* Enable all strict type-checking options. */
+        // "noImplicitAny": true, /* Enable error reporting for expressions and declarations with an implied `any` type.. */
+        // "strictNullChecks": true, /* When type checking, take into account `null` and `undefined`. */
+        // "strictFunctionTypes": true, /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */
+        // "strictBindCallApply": true, /* Check that the arguments for `bind`, `call`, and `apply` methods match the original function. */
+        // "strictPropertyInitialization": true, /* Check for class properties that are declared but not set in the constructor. */
+        // "noImplicitThis": true, /* Enable error reporting when `this` is given the type `any`. */
+        // "useUnknownInCatchVariables": true, /* Type catch clause variables as 'unknown' instead of 'any'. */
+        // "alwaysStrict": true, /* Ensure 'use strict' is always emitted. */
+        // "noUnusedLocals": true, /* Enable error reporting when a local variables aren't read. */
+        // "noUnusedParameters": true, /* Raise an error when a function parameter isn't read */
+        // "exactOptionalPropertyTypes": true, /* Interpret optional property types as written, rather than adding 'undefined'. */
+        // "noImplicitReturns": true, /* Enable error reporting for codepaths that do not explicitly return in a function. */
+        // "noFallthroughCasesInSwitch": true, /* Enable error reporting for fallthrough cases in switch statements. */
+        // "noUncheckedIndexedAccess": true, /* Include 'undefined' in index signature results */
+        // "noImplicitOverride": true, /* Ensure overriding members in derived classes are marked with an override modifier. */
+        // "noPropertyAccessFromIndexSignature": true, /* Enforces using indexed accessors for keys declared using an indexed type */
+        // "allowUnusedLabels": true, /* Disable error reporting for unused labels. */
+        // "allowUnreachableCode": true, /* Disable error reporting for unreachable code. */
+
+        /* Completeness */
+        // "skipDefaultLibCheck": true, /* Skip type checking .d.ts files that are included with TypeScript. */
+        "skipLibCheck": true /* Skip type checking all .d.ts files. */
+    }
+}