pim-import 6.10.0 → 6.11.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,26 +1,548 @@
1
- # PIM IMPORT
1
+ # pim-import
2
2
 
3
- A set of features needed to import content from pim to Contenteful.
3
+ A TypeScript/Node.js library for bulk-importing content from a **PIM** (Product Information Management) system into **Contentful**, with integrations for **Amazon S3**, **Algolia** search, and **Imgix** image management.
4
4
 
5
- ### Services used:
5
+ ## Services
6
6
 
7
- - PIM
8
- - Contentful
9
- - Amazon S3
10
- - Algolia
7
+ | Service | Purpose |
8
+ | ---------- | -------------------------------------------- |
9
+ | PIM | Data source (products, catalogs, dictionary) |
10
+ | Contentful | CMS destination (CMA + CDA) |
11
+ | Amazon S3 | Asset and JSON snapshot storage |
12
+ | Algolia | Search index synchronization |
13
+ | Imgix | Image asset management |
11
14
 
12
- ## Settings
15
+ ---
13
16
 
14
- Depending on what you want to import you will need more or less configurations. [Read More](https://github.com/atoms-studio/pim-import/blob/main/docs/settings.md)
17
+ ## Installation
15
18
 
16
- ## How to use
19
+ ```bash
20
+ yarn add pim-import
21
+ ```
17
22
 
18
- The import process takes place through the following steps:
23
+ ---
19
24
 
20
- - [Import data from the dictionary](https://github.com/atoms-studio/pim-import/blob/main/docs/import-data-from-the-dictionary.md)
21
- - [Import taxonomies from the catalog](https://github.com/atoms-studio/pim-import/blob/main/docs/import-taxonomies-from-the-catalog.md)
22
- - [Import products](https://github.com/atoms-studio/pim-import/blob/main/docs/import-products.md)
23
- - Saving relationships between color variants
24
- - Saving relationships between accessories
25
- - Audit
26
- - Algolia indexes update
25
+ ## Configuration
26
+
27
+ All services are initialized programmatically or via environment variables (most prefixed `FPI_`).
28
+
29
+ ### PIM
30
+
31
+ ```ts
32
+ import { initPim } from "pim-import";
33
+
34
+ initPim({
35
+ baseURL: process.env.FPI_PIM_BASE_URL, // required
36
+ username: process.env.FPI_PIM_USERNAME,
37
+ password: process.env.FPI_PIM_PASSWORD,
38
+ timeout: 300, // seconds, default 300
39
+ sslverify: true, // default true
40
+ });
41
+ ```
42
+
43
+ ### Contentful
44
+
45
+ ```ts
46
+ import { initContentful } from "pim-import";
47
+
48
+ initContentful({
49
+ accessToken: process.env.FPI_CTF_CMA_ACCESS_TOKEN, // required
50
+ spaceId: process.env.FPI_CTF_SPACE_ID, // required
51
+ environmentId: process.env.FPI_CTF_ENVIRONMENT, // required
52
+ });
53
+ ```
54
+
55
+ ### Amazon S3
56
+
57
+ ```ts
58
+ import { initS3 } from "pim-import";
59
+
60
+ initS3({
61
+ accessKeyId: process.env.FPI_AWS_ACCESS_KEY, // required
62
+ secretAccessKey: process.env.FPI_AWS_SECRET_ACCESS_KEY, // required
63
+ region: process.env.FPI_AWS_REGION, // required
64
+ bucket: process.env.FPI_AWS_S3_BUCKET, // required
65
+ });
66
+ ```
67
+
68
+ ### Environment variables reference
69
+
70
+ ```bash
71
+ # PIM
72
+ FPI_PIM_BASE_URL=
73
+ FPI_PIM_USERNAME=
74
+ FPI_PIM_PASSWORD=
75
+ FPI_PIM_TIMEOUT=300
76
+ FPI_PIM_SSL_VERIFY=true
77
+
78
+ # Contentful
79
+ FPI_CTF_CMA_ACCESS_TOKEN=
80
+ FPI_CTF_SPACE_ID=
81
+ FPI_CTF_ENVIRONMENT=
82
+
83
+ # Amazon S3
84
+ FPI_AWS_ACCESS_KEY=
85
+ FPI_AWS_SECRET_ACCESS_KEY=
86
+ FPI_AWS_REGION=
87
+ FPI_AWS_S3_BUCKET=
88
+
89
+ # Algolia
90
+ FPI_ALGOLIA_APP_ID=
91
+ FPI_ALGOLIA_API_KEY=
92
+ FPI_ALGOLIA_INDEX_NAME_SUFFIX=
93
+ FPI_ALGOLIA_INDEX_PRODUCTS_NAME=
94
+
95
+ # PDF generation (AWS Lambda)
96
+ FPID_AWS_GEN_PDF_LAMBDA_URL=
97
+ FPI_TRIGGER_GEN_PDF_TASK_ID=
98
+ FPI_TRIGGER_SECRET_KEY=
99
+
100
+ # Notifications
101
+ FPI_TEAMS_NOTIFICATIONS_URL= # Microsoft Teams incoming webhook
102
+
103
+ # Monitoring
104
+ SENTRY_DSN=
105
+ SENTRY_TRACE_SAMPLE_RATE=
106
+ LOGDNA_APP_NAME=
107
+ LOGDNA_KEY=
108
+ ```
109
+
110
+ ---
111
+
112
+ ## Import flow
113
+
114
+ A complete import follows these steps in order:
115
+
116
+ 1. [Dictionary](#1-dictionary)
117
+ 2. [Taxonomies (categories, families, sub-families)](#2-taxonomies)
118
+ 3. [Products](#3-products)
119
+ 4. [Product relationships](#4-product-relationships)
120
+ 5. [Audit](#5-audit)
121
+ 6. [Algolia reindex](#6-algolia-reindex)
122
+
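Each step is an exported async function and must run strictly in this order (relationships need products, the audit needs relationships, the reindex needs everything). A minimal sketch of the sequencing, with stub step bodies standing in for the real exports documented in the sections below:

```typescript
// Illustrative pipeline runner — the `run` bodies are stubs standing in for
// the real exports (importDictionaryData, importCategories,
// importLatestProducts, setProductsRelationships, audit, reindexProducts).
type Step = { name: string; run: () => Promise<void> };

const noop = async (): Promise<void> => {};

const steps: Step[] = [
  { name: "dictionary", run: noop },
  { name: "taxonomies", run: noop },
  { name: "products", run: noop },
  { name: "relationships", run: noop },
  { name: "audit", run: noop },
  { name: "algolia-reindex", run: noop },
];

async function runImport(order: Step[] = steps): Promise<string[]> {
  const completed: string[] = [];
  for (const step of order) {
    // Steps are awaited one at a time: each depends on the previous ones.
    await step.run();
    completed.push(step.name);
  }
  return completed;
}
```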
123
+ ---
124
+
125
+ ## 1. Dictionary
126
+
127
+ Imports product field translations and icons from the `micro-service/dictionary/get/all` PIM endpoint.
128
+
129
+ ### Import field translations
130
+
131
+ Saves localized labels for all product fields to Contentful.
132
+
133
+ **Required:** `FPI_PIM_BASE_URL`, `FPI_CTF_CMA_ACCESS_TOKEN`, `FPI_CTF_SPACE_ID`, `FPI_CTF_ENVIRONMENT`
134
+
135
+ ```ts
136
+ import { importDictionaryFields } from "pim-import";
137
+
138
+ await importDictionaryFields();
139
+ ```
140
+
141
+ ### Import icons to S3
142
+
143
+ Uploads product field icons retrieved from PIM to Amazon S3.
144
+
145
+ **Required:** `FPI_PIM_BASE_URL`, `FPI_AWS_*`
146
+
147
+ ```ts
148
+ import { importDictionaryIcons } from "pim-import";
149
+
150
+ await importDictionaryIcons();
151
+ ```
152
+
153
+ ### Import product line taxonomy
154
+
155
+ ```ts
156
+ import {
157
+ importDictionaryProductLine,
158
+ importDictionaryProductSubLine,
159
+ } from "pim-import";
160
+
161
+ await importDictionaryProductLine();
162
+ await importDictionaryProductSubLine();
163
+ ```
164
+
165
+ ### Import all dictionary data at once
166
+
167
+ ```ts
168
+ import { importDictionaryData } from "pim-import";
169
+
170
+ await importDictionaryData();
171
+ ```
172
+
173
+ ---
174
+
175
+ ## 2. Taxonomies
176
+
177
+ Imports the catalog hierarchy from the `/catalogs` PIM endpoint.
178
+
179
+ Each catalog operation writes the following Contentful content types:
180
+
181
+ - `topicCategory` — category with name, code, catalog, families
182
+ - `topicSubFamily` — sub-family with name, code, catalog
183
+ - `pageContent` + `page` — CMS pages for each category and sub-family
184
+
185
+ ### Import categories
186
+
187
+ ```ts
188
+ import { importCategories } from "pim-import";
189
+
190
+ const result = await importCategories(
191
+ "ARCHITECTURAL", // AvailableCatalogs: 'ARCHITECTURAL' | 'OUTDOOR' | 'DECORATIVE'
192
+ 0, // offset (default 0)
193
+ -1, // limit, -1 = all (default -1)
194
+ "", // optional S3 path to a pre-fetched JSON snapshot
195
+ );
196
+ // result: { completed: boolean, offset, limit, total, s3FilePath }
197
+ ```
198
+
199
+ ### Import families and sub-families
200
+
201
+ ```ts
202
+ import { importFamilies, importSubFamilies } from "pim-import";
203
+
204
+ await importFamilies("ARCHITECTURAL", 0, -1);
205
+ await importSubFamilies("ARCHITECTURAL", 0, -1);
206
+ ```
207
+
208
+ ### Import models and sub-models
209
+
210
+ ```ts
211
+ import { importModels, importSubModels } from "pim-import";
212
+
213
+ await importModels("ARCHITECTURAL", 0, -1);
214
+ await importSubModels("ARCHITECTURAL", 0, -1);
215
+ ```
216
+
217
+ ---
218
+
219
+ ## 3. Products
220
+
221
+ ### Import by last modification date
222
+
223
+ Fetches products from the `/latest-products` endpoint and creates/updates Contentful entries.
224
+
225
+ Each product creates:
226
+
227
+ - `topicProduct` — product with name, code, status, categories, sub-families, product line, PIM details
228
+ - `pageContent` — page content wrapper
229
+ - `page` — public product page with slug
230
+
231
+ ```ts
232
+ import { importLatestProducts } from "pim-import";
233
+
234
+ const result = await importLatestProducts(
235
+ "ARCHITECTURAL", // catalog
236
+ "20240101T00:00:00", // lastModified — lower bound on modification date
237
+ 0, // page (default 0)
238
+ 50, // size per page (default 100)
239
+ "20240201T00:00:00", // optional lastModifiedTo upper bound
240
+ );
241
+ // result: { completed, page, size, total, totalPages }
242
+ ```
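Results are paginated, so a full window is normally drained by looping until `completed` is true. A minimal sketch of that loop, with a stub in place of the real `importLatestProducts` call (the stub simulates 3 pages; the result shape matches the one above, where `page` is already the next page index):

```typescript
interface PaginationResults {
  completed: boolean;
  page: number; // next page to request
  size: number;
  total: number;
  totalPages: number;
}

// Stub standing in for importLatestProducts — simulates 3 pages of 50 items.
async function fetchPage(page: number, size: number): Promise<PaginationResults> {
  const totalPages = 3;
  const nextPage = page + 1;
  return {
    completed: nextPage >= totalPages,
    page: nextPage,
    size,
    total: 150,
    totalPages,
  };
}

async function importAllPages(size = 50): Promise<number> {
  let page = 0;
  let pagesImported = 0;
  for (;;) {
    const result = await fetchPage(page, size);
    pagesImported += 1;
    if (result.completed) break;
    page = result.page; // the result's `page` is the next page to request
  }
  return pagesImported;
}
```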
243
+
244
+ ### Import a single product by code
245
+
246
+ ```ts
247
+ import { importProductByCode } from "pim-import";
248
+
249
+ const entry = await importProductByCode("F3030031", "ARCHITECTURAL");
250
+ ```
251
+
252
+ ### Reimport products flagged by audit
253
+
254
+ ```ts
255
+ import { reimportAuditProducts } from "pim-import";
256
+
257
+ await reimportAuditProducts();
258
+ ```
259
+
260
+ ---
261
+
262
+ ## 4. Product relationships
263
+
264
+ ### Set relationships (color variants, accessories)
265
+
266
+ ```ts
267
+ import { setProductsRelationships, setProductRelationships } from "pim-import";
268
+
269
+ // All products in a catalog
270
+ await setProductsRelationships("ARCHITECTURAL");
271
+
272
+ // Single product
273
+ await setProductRelationships("F3030031");
274
+ ```
275
+
276
+ ### Remove relationships
277
+
278
+ ```ts
279
+ import {
280
+ removeProductFromColorVariantsByProductLine,
281
+ removeAllProductModelProductRelations,
282
+ } from "pim-import";
283
+
284
+ await removeProductFromColorVariantsByProductLine();
285
+ await removeAllProductModelProductRelations();
286
+ ```
287
+
288
+ ### Populate destination fields
289
+
290
+ ```ts
291
+ import { populateDestinations } from "pim-import";
292
+
293
+ await populateDestinations();
294
+ ```
295
+
296
+ ---
297
+
298
+ ## 5. Audit
299
+
300
+ Validates product data integrity across Contentful entries.
301
+
302
+ ```ts
303
+ import { audit } from "pim-import";
304
+
305
+ const results = await audit("ARCHITECTURAL");
306
+ // results: AuditResults — per-product validation status and missing fields
307
+ ```
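Audit results feed the `reimportAuditProducts` step from section 3. The exact `AuditResults` shape is not documented here, so this sketch assumes a hypothetical per-product entry with a `valid` flag and `missingFields`, just to illustrate filtering out the codes that need a reimport:

```typescript
// Hypothetical shape — the real AuditResults type may differ.
interface AuditResult {
  code: string;
  valid: boolean;
  missingFields: string[];
}

// Collect the product codes that failed validation, ready to be fed
// back into a reimport pass.
function failedProductCodes(results: AuditResult[]): string[] {
  return results.filter((r) => !r.valid).map((r) => r.code);
}
```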
308
+
309
+ ---
310
+
311
+ ## 6. Algolia reindex
312
+
313
+ All reindex functions accept an optional catalog filter. Individual variants accept a `code` string.
314
+
315
+ ### Products
316
+
317
+ ```ts
318
+ import {
319
+ reindexProducts,
320
+ reindexProduct,
321
+ removeProductObject,
322
+ } from "pim-import";
323
+
324
+ await reindexProducts("ARCHITECTURAL");
325
+ await reindexProduct("F3030031");
326
+ await removeProductObject("F3030031");
327
+ ```
328
+
329
+ ### Families and sub-families
330
+
331
+ ```ts
332
+ import {
333
+ reindexFamilies,
334
+ reindexFamily,
335
+ removeFamilyObject,
336
+ reindexSubFamilies,
337
+ reindexSubFamily,
338
+ removeSubFamilyObject,
339
+ } from "pim-import";
340
+
341
+ await reindexFamilies("ARCHITECTURAL");
342
+ await reindexSubFamilies("ARCHITECTURAL");
343
+ ```
344
+
345
+ ### Models and sub-models
346
+
347
+ ```ts
348
+ import {
349
+ reindexModels,
350
+ reindexModel,
351
+ removeModelObject,
352
+ reindexSubModels,
353
+ reindexSubModel,
354
+ removeSubModelObject,
355
+ } from "pim-import";
356
+
357
+ await reindexModels();
358
+ await reindexSubModels();
359
+ ```
360
+
361
+ ### Editorial content
362
+
363
+ ```ts
364
+ import {
365
+ reindexDownloads,
366
+ reindexInspirations,
367
+ reindexProjects,
368
+ reindexStories,
369
+ reindexPressReleases,
370
+ reindexPressReviews,
371
+ reindexPosts,
372
+ } from "pim-import";
373
+
374
+ await reindexDownloads();
375
+ await reindexInspirations();
376
+ await reindexProjects();
377
+ await reindexStories();
378
+ await reindexPressReleases();
379
+ await reindexPressReviews();
380
+ await reindexPosts();
381
+ ```
382
+
383
+ ### Clone index settings
384
+
385
+ ```ts
386
+ import { cloneIndexSettings } from "pim-import";
387
+
388
+ await cloneIndexSettings("products_en", "products_en_staging");
389
+ ```
390
+
391
+ ---
392
+
393
+ ## Amazon S3 utilities
394
+
395
+ ```ts
396
+ import { uploadS3, saveJsonToS3, getFileFromS3, savePDFToS3 } from "pim-import";
397
+
398
+ // Upload a remote file by URL
399
+ const { Location } = await uploadS3(
400
+ "https://example.com/image.jpg",
401
+ "image.jpg",
402
+ "assets/",
403
+ );
404
+
405
+ // Save a JSON object
406
+ const s3Path = await saveJsonToS3({ key: "value" }, "data.json", "snapshots/");
407
+
408
+ // Retrieve a file (returns content as string)
409
+ const content = await getFileFromS3("snapshots/data.json");
410
+
411
+ // Save a PDF buffer
412
+ await savePDFToS3(pdfBuffer, "product.pdf", "pdf/");
413
+ ```
414
+
415
+ ---
416
+
417
+ ## PDF generation
418
+
419
+ `generateTechSpecPdf` generates a technical specification PDF via AWS Lambda and returns its URL; `generatePDFByUrl` renders an arbitrary page and returns a PDF buffer.
420
+
421
+ ```ts
422
+ import { generateTechSpecPdf, generatePDFByUrl } from "pim-import";
423
+
424
+ // From a product code
425
+ const url = await generateTechSpecPdf("F3030031");
426
+
427
+ // From an arbitrary URL
428
+ const buffer = await generatePDFByUrl("https://example.com/page", "output.pdf");
429
+ ```
430
+
431
+ ---
432
+
433
+ ## Downloads
434
+
435
+ Imports download resources from a CSV file.
436
+
437
+ ```ts
438
+ import { importDownloads } from "pim-import";
439
+
440
+ await importDownloads("/path/to/downloads.csv");
441
+ ```
442
+
443
+ ---
444
+
445
+ ## Contentful utilities
446
+
447
+ ```ts
448
+ import {
449
+ initBaseEntries,
450
+ getEntryByID,
451
+ getEntries,
452
+ getTopicPage,
453
+ getAllProductEntriesByCatalog,
454
+ publishAllProductDrafts,
455
+ deleteEntries,
456
+ deletePages,
457
+ migrateEntryFields,
458
+ checkTopicDraftAndPagePublished,
459
+ } from "pim-import";
460
+
461
+ // Publish all draft products
462
+ await publishAllProductDrafts();
463
+
464
+ // Get all product entries for a catalog
465
+ const entries = await getAllProductEntriesByCatalog("ARCHITECTURAL");
466
+
467
+ // Migrate fields between content types
468
+ await migrateEntryFields("topicProduct", { oldField: "newField" });
469
+
470
+ // Check publish status
471
+ const isReady = await checkTopicDraftAndPagePublished("topic-id-123");
472
+ ```
473
+
474
+ ---
475
+
476
+ ## Designers
477
+
478
+ ```ts
479
+ import { importDesigners, importDesigner } from "pim-import";
480
+
481
+ await importDesigners();
482
+ const entry = await importDesigner("designer-code");
483
+ ```
484
+
485
+ ---
486
+
487
+ ## Auto-descriptions
488
+
489
+ ```ts
490
+ import {
491
+ setProductsAutodescription,
492
+ getProductAutodescription,
493
+ setProductAutodescriptionByTopicId,
494
+ } from "pim-import";
495
+
496
+ await setProductsAutodescription();
497
+ const text = await getProductAutodescription("F3030031");
498
+ await setProductAutodescriptionByTopicId("topic-id-123");
499
+ ```
500
+
501
+ ---
502
+
503
+ ## Logging
504
+
505
+ ```ts
506
+ import {
507
+ log,
508
+ setLogId,
509
+ setLogPath,
510
+ setLogFilename,
511
+ getLogFolder,
512
+ } from "pim-import";
513
+
514
+ setLogPath("/var/log/pim-import");
515
+ setLogFilename("import-run");
516
+ setLogId("session-001");
517
+
518
+ log("Import started", "INFO");
519
+ // Levels: VERBOSE | DEBUG | INFO | WARN | ERROR | SILLY | HTTP
520
+ ```
521
+
522
+ ---
523
+
524
+ ## Notifications & CI
525
+
526
+ ```ts
527
+ import { notify, netlifyBuild } from "pim-import";
528
+
529
+ // Send a message to a Microsoft Teams channel (via FPI_TEAMS_NOTIFICATIONS_URL)
530
+ await notify("Import completed successfully", true);
531
+
532
+ // Trigger a Netlify build
533
+ await netlifyBuild("site-id", "build-hook-id");
534
+ ```
535
+
536
+ ---
537
+
538
+ ## Available catalogs
539
+
540
+ ```ts
541
+ type AvailableCatalogs = "ARCHITECTURAL" | "OUTDOOR" | "DECORATIVE";
542
+ ```
543
+
544
+ ---
545
+
546
+ ## License
547
+
548
+ MIT — [atoms.studio](https://atoms.studio)
@@ -46,12 +46,12 @@ exports.getCatalogTaxonomiesHierarchy = getCatalogTaxonomiesHierarchy;
46
46
  const getDictionary = async () => {
47
47
  try {
48
48
  const timeStart = new Date();
49
- const { data } = await (0, request_1.getRequest)("dictionary", {
50
- params: { onlyweb: true },
49
+ const { data } = await (0, request_1.getRequest)("micro-service/dictionary/get/all", {
50
+ params: { onlyWeb: true },
51
51
  });
52
52
  const timeEnd = new Date();
53
53
  const seconds = (0, utils_1.secondBetweenTwoDate)(timeStart, timeEnd);
54
- (0, logs_1.log)(`Request time: ${seconds} seconds - endpoint: dictionary`);
54
+ (0, logs_1.log)(`Request time: ${seconds} seconds - endpoint: micro-service/dictionary/get/all`);
55
55
  return data;
56
56
  }
57
57
  catch (err) {
@@ -261,11 +261,12 @@ const importDictionaryData = async () => {
261
261
  (0, logs_1.log)(`importDictionaryData`, "INFO");
262
262
  (0, logs_1.log)("Get dictionary data");
263
263
  const pimDictionaryData = await (0, endpoints_1.getDictionary)();
264
+ console.log("pimDictionaryData", pimDictionaryData);
264
265
  if (pimDictionaryData && Object.keys(pimDictionaryData).length) {
265
266
  const dictionaryData = {};
266
267
  for (const key in pimDictionaryData) {
267
268
  if (Array.isArray(pimDictionaryData[key])) {
268
- dictionaryData[key] = pimDictionaryData[key].map((item) => ({
269
+ dictionaryData[key] = pimDictionaryData?.[key]?.map((item) => ({
269
270
  code: item.code,
270
271
  parentName: item.parentName,
271
272
  image: item.image,
@@ -1,2 +1,2 @@
1
1
  import { AvailableCatalogs, PaginationResults } from "../../types";
2
- export declare const importLatestProducts: (catalog: AvailableCatalogs, lastModified: string, page?: number, limit?: number, lastModifiedTo?: string) => Promise<PaginationResults>;
2
+ export declare const importLatestProducts: (catalog: AvailableCatalogs, lastModified: string, page?: number, size?: number, lastModifiedTo?: string) => Promise<PaginationResults>;
@@ -5,16 +5,16 @@ const endpoints_1 = require("../endpoints");
5
5
  const logs_1 = require("../../libs/logs");
6
6
  const products_1 = require("./products");
7
7
  const utils_1 = require("../../utils");
8
- const importLatestProducts = async (catalog, lastModified, page = 0, limit = 100, lastModifiedTo = "") => {
8
+ const importLatestProducts = async (catalog, lastModified, page = 0, size = 100, lastModifiedTo = "") => {
9
9
  const timeStart = new Date();
10
10
  page = Number(page);
11
- limit = Number(limit);
12
- (0, logs_1.log)(`importLatestProducts - catalog: ${catalog} lastModified: ${lastModified} page: ${page} limit: ${limit} lastModifiedTo: ${lastModifiedTo}`, "INFO");
13
- const data = await (0, endpoints_1.getLatestProducts)(catalog, lastModified, page, limit, lastModifiedTo);
11
+ size = Number(size);
12
+ (0, logs_1.log)(`importLatestProducts - catalog: ${catalog} lastModified: ${lastModified} page: ${page} size: ${size} lastModifiedTo: ${lastModifiedTo}`, "INFO");
13
+ const data = await (0, endpoints_1.getLatestProducts)(catalog, lastModified, page, size, lastModifiedTo);
14
14
  const total = data.totalElements;
15
15
  const products = data.content;
16
16
  (0, logs_1.log)(`${total} products founded`);
17
- let count = page * limit + 1;
17
+ let count = page * size + 1;
18
18
  let current = 0;
19
19
  for (const product of products) {
20
20
  (0, logs_1.log)(`${count} of ${total}`);
@@ -30,11 +30,11 @@ const importLatestProducts = async (catalog, lastModified, page = 0, limit = 100
30
30
  const timeEnd = new Date();
31
31
  const seconds = (0, utils_1.secondBetweenTwoDate)(timeStart, timeEnd);
32
32
  (0, logs_1.log)(`Request time: ${seconds} seconds`);
33
- const completed = count >= total;
33
+ const nextPage = Number(page) + 1;
34
34
  return {
35
- nextPage: completed ? null : page + 1,
36
- limit,
37
- completed,
35
+ page: nextPage,
36
+ size,
37
+ completed: nextPage >= data.totalPages,
38
38
  total,
39
39
  totalPages: data.totalPages,
40
40
  };
@@ -0,0 +1,38 @@
1
+ export type Dictionary = Record<string, DictionaryRecord[]>;
2
+ export interface DictionaryRecord {
3
+ parentName: string;
4
+ code: string;
5
+ image?: string | null;
6
+ imageAlternative?: string | null;
7
+ value_en?: string | null;
8
+ value_it?: string | null;
9
+ value_es?: string | null;
10
+ value_de?: string | null;
11
+ value_fr?: string | null;
12
+ value_sv?: string | null;
13
+ value_no?: string | null;
14
+ value_da?: string | null;
15
+ value_ru?: string | null;
16
+ value_en_US?: string | null;
17
+ value_ja?: string | null;
18
+ value_zh?: string | null;
19
+ value_ko?: string | null;
20
+ note_en?: string | null;
21
+ note_it?: string | null;
22
+ note_es?: string | null;
23
+ note_de?: string | null;
24
+ note_fr?: string | null;
25
+ note_sv?: string | null;
26
+ note_no?: string | null;
27
+ note_da?: string | null;
28
+ note_ru?: string | null;
29
+ note_en_US?: string | null;
30
+ note_zh?: string | null;
31
+ note_ja?: string | null;
32
+ note_ko?: string | null;
33
+ imgRelUrl?: string | null;
34
+ imgAltRelUrl?: string | null;
35
+ priority?: number | null;
36
+ imageS3?: string | null;
37
+ imageAlternativeS3?: string | null;
38
+ }
@@ -0,0 +1,2 @@
1
+ "use strict";
2
+ Object.defineProperty(exports, "__esModule", { value: true });
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "pim-import",
3
- "version": "6.10.0",
3
+ "version": "6.11.0",
4
4
  "description": "",
5
5
  "main": "./dist/index.js",
6
6
  "types": "./dist/index.d.ts",
@@ -15,19 +15,15 @@
15
15
  "scripts": {
16
16
  "prepublishOnly": "npm run build",
17
17
  "publishCmd": "npm publish --access public",
18
- "build": "rm -rf dist && rm -rf coverage && tsc",
19
- "test": "jest --collect-coverage"
18
+ "build": "rm -rf dist && rm -rf coverage && tsc"
20
19
  },
21
20
  "author": "atoms.studio",
22
21
  "license": "MIT",
23
22
  "devDependencies": {
24
- "@types/jest": "^26.0.22",
25
23
  "@types/mime-types": "^2.1.0",
26
24
  "@types/node-gzip": "^1.1.0",
27
25
  "@types/remove-markdown": "^0.3.1",
28
26
  "@types/request": "^2.48.5",
29
- "jest": "^26.6.3",
30
- "ts-jest": "^26.5.5",
31
27
  "tslint": "^6.1.3",
32
28
  "tslint-language-service": "^0.9.9",
33
29
  "typescript": "^5.1.6"