@ibm-cloud/cd-tools 1.13.1 → 1.14.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +41 -6
- package/cmd/copy-toolchain.js +1 -1
- package/cmd/direct-transfer.js +208 -2
- package/cmd/utils/import-terraform.js +4 -1
- package/cmd/utils/requests.js +12 -1
- package/config.js +15 -0
- package/package.json +2 -1
package/README.md
CHANGED

````diff
@@ -35,19 +35,19 @@ brew install hashicorp/tap/terraform
 The tools are provided as an [npx](https://docs.npmjs.com/cli/commands/npx) command. [npx](https://docs.npmjs.com/cli/commands/npx) (Node Package Execute) is a utility provided with [Node.js](https://nodejs.org/) which automatically downloads a module and its dependencies, and runs it. To see the available commands, run `npx @ibm-cloud/cd-tools` on your command line.
 
 ```shell-session
-$ npx @ibm-cloud/cd-tools
+$ npx @ibm-cloud/cd-tools -h
 Usage: @ibm-cloud/cd-tools [options] [command]
 
-Tools for
+Tools and utilities for the IBM Cloud Continuous Delivery service and resources.
 
 Options:
 -V, --version output the version number
 -h, --help display help for command
 
 Commands:
-copy-project-group [options]
-copy-toolchain [options] Copies a toolchain, including tool integrations and Tekton pipelines, to another region or resource group
-export-secrets [options]
+copy-project-group [options] Copies all Git Repos and Issue Tracking projects in a group to another region.
+copy-toolchain [options] Copies a toolchain, including tool integrations and Tekton pipelines, to another region or resource group.
+export-secrets [options] Exports Toolchain stored secrets to a Secrets Manager instance
 help [command] display help for command
 ```
 
@@ -88,6 +88,7 @@ Options:
 -g, --group-id <id> The id of the group to copy from the source region (e.g. "1796019"), or the group name (e.g. "mygroup") for top-level groups. For sub-groups, a path
 is also allowed, e.g. "mygroup/subgroup"
 -n, --new-group-slug <slug> (Optional) Destination group URL slug (single path segment, e.g. "mygroup-copy"). Must be unique. Group display name remains the same as source.
+-v, --verbose Enable verbose output (debug logs + wait details)
 -h, --help display help for command
 ```
 
@@ -190,11 +191,45 @@ The command will output a collection of `.tf` files in the `terraform` directory
 
 ### Copying toolchains to a different account
 
-The `copy-toolchain` command copies a toolchain within an IBM Cloud account. However it is possible to copy a toolchain to a different account with a few extra steps. Note that any tool integrations that access services in the source account, such as [Secrets Manager](https://cloud.ibm.com/docs/secrets-manager
+The `copy-toolchain` command copies a toolchain within an IBM Cloud account. However it is possible to copy a toolchain to a different account with a few extra steps. Note that any tool integrations that access services in the source account, such as [Secrets Manager](https://cloud.ibm.com/docs/secrets-manager), [Event Notifications](https://cloud.ibm.com/docs/event-notifications), etc. are not supported for cross-account copying.
 1. Run the `copy-toolchain` command with the `-D, --dry-run` option to first generate the Terraform (.tf) files to a directory (See [Getting the Terraform code for a toolchain](#getting-the-terraform-code-for-a-toolchain)).
 2. Edit the `cd_toolchain.tf` file, replacing the `resource_group_id` with a valid resource group id in the target account. You can find the resource group id in the IBM Cloud console under [Manage > Account > Resource groups](https://cloud.ibm.com/account/resource-groups).
 3. Switch to the directory containing the Terraform files, and run `terraform init`, then `terraform apply`.
 4. When prompted for the API key, provide an API key for the target account you wish to copy the toolchain to.
 
+## export-secrets
+
+### Overview
+The `export-secrets` command copies secrets stored directly in your toolchain or Tekton pipeline into [Secrets Manager](https://cloud.ibm.com/docs/secrets-manager), and then updates the toolchain and pipeline to reference the secrets in Secrets Manager. The `copy-toolchain` command does not copy secrets stored directly in the toolchain or its Tekton pipeline environment properties or trigger properties, however [secret references](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-cd_data_security#cd_secrets_references) to secrets in a secret store such as [Secrets Manager](https://cloud.ibm.com/docs/secrets-manager) or [Key Protect](https://cloud.ibm.com/docs/key-protect) can be copied. The `export-secrets` command is useful for moving your secrets out before copying a toolchain. You can also use it to check whether a toolchain or its Tekton pipeline(s) contain any stored secrets. Storing secrets in a proper secret store like Secrets Manager is a recommended practice for added security.
+
+### Limitations
+1. The Secrets Manager instance must be in the account that owns the API key you'll be using.
+2. Only [arbitrary type](https://cloud.ibm.com/docs/secrets-manager?topic=secrets-manager-arbitrary-secrets) secrets are supported.
+3. If you opt to create a Secrets Manager tool integration while running the command, it will not automatically create an IAM authorization policy to allow the toolchain to read secrets from the Secrets Manager instance. If the tool integration in the toolchain shows an error status due to the missing authorization policy, you can click the **Create Authorization** button to create a default one.
+
+### Prerequisites
+- You must first provision a [Secrets Manager](https://cloud.ibm.com/docs/secrets-manager?topic=secrets-manager-create-instance) instance before running the command.
+- The API key you use must be from the same account as the Secrets Manager instance.
+- The API key must have IAM permission to read the toolchain and create secrets in the selected Secrets Manager instance.
+
+### Recommendations
+- After running the command, open the toolchain and verify that the tool integration shows a healthy status. If it shows an error status due to a missing authorization policy, you can reconfigure the tool integration and click the **Create Authorization** button to create a default one.
+- You can run the command as many times as you like until all secrets are exported.
+
+### Usage
+```shell-session
+$ npx @ibm-cloud/cd-tools export-secrets -h
+Usage: @ibm-cloud/cd-tools export-secrets [options]
+
+Exports Toolchain stored secrets to a Secrets Manager instance
+
+Options:
+-c, --toolchain-crn <crn> The CRN of the toolchain to check
+-a, --apikey <api_key> API key used to authenticate. Must have IAM permission to read toolchains and create secrets in Secrets Manager
+--check (Optional) Checks and lists any stored secrets in your toolchain
+-v, --verbose (Optional) Increase log output
+-h, --help display help for command
+```
+
 ## Test
 All test setup and usage instructions are documented in [test/README.md](./test/README.md).
````
package/cmd/copy-toolchain.js
CHANGED

````diff
@@ -85,7 +85,7 @@ async function main(options) {
   const verbosity = options.quiet ? 0 : options.verbose ? 2 : 1;
 
   logger.setVerbosity(verbosity);
-  if (LOG_DUMP) logger.createLogStream(`${LOGS_DIR}/copy-toolchain-${
+  if (LOG_DUMP) logger.createLogStream(`${LOGS_DIR}/copy-toolchain-${TIME_SUFFIX}.log`);
   logger.dump(`Options: ${JSON.stringify(options)}\n`);
 
   let bearer;
````
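The new log-stream line interpolates a `TIME_SUFFIX` constant whose definition sits outside this diff, so its exact format is unknown. A common way to build such a filename-safe timestamp suffix is sketched below; this is an assumption for illustration, not necessarily the package's actual implementation:

```javascript
// Hypothetical TIME_SUFFIX: ISO-8601 timestamp with characters that are
// awkward in filenames (':' and '.') replaced by '-'.
const TIME_SUFFIX = new Date().toISOString().replace(/[:.]/g, '-');

// Used to build a unique per-run log file path.
const logPath = `logs/copy-toolchain-${TIME_SUFFIX}.log`;
// e.g. logs/copy-toolchain-2025-01-30T12-00-00-000Z.log
```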
package/cmd/direct-transfer.js
CHANGED

````diff
@@ -10,8 +10,10 @@
 import { Command } from 'commander';
 import axios from 'axios';
 import { writeFile } from 'fs/promises';
-import { COPY_PROJECT_GROUP_DESC, SOURCE_REGIONS } from '../config.js';
-import { getWithRetry } from './utils/requests.js';
+import { COPY_PROJECT_GROUP_DESC, SOURCE_REGIONS, BROKER_REGIONS } from '../config.js';
+import { getWithRetry, shouldFailover } from './utils/requests.js';
+import Papa from 'papaparse';
+import fs from 'fs';
 import { logger, LOG_STAGES } from './utils/logger.js';
 import { promptUserYesNo } from './utils/utils.js';
````
````diff
@@ -158,11 +160,59 @@ class GitLabClient {
     return out;
   }
 
+  async getGroupPlaceholderCsv(groupId) {
+    const response = await this.client.get(`/groups/${groupId}/placeholder_reassignments`);
+    return response.data;
+  }
+
+  async reassignGroupPlaceholder(groupId, form) {
+    const response = await this.client.postForm(`/groups/${groupId}/placeholder_reassignments`, form);
+    return response.data;
+  }
+
   async getGroup(groupId) {
     const response = await this.client.get(`/groups/${groupId}`);
     return response.data;
   }
 
+  async syncUser(syncData) {
+    const preferred = syncData?.destRegion;
+    const candidates = [
+      ...(preferred ? [preferred] : []),
+      ...BROKER_REGIONS,
+    ];
+
+    const seen = new Set();
+    const brokerRegions = candidates.filter(r => (seen.has(r) ? false : (seen.add(r), true)));
+
+    let lastErr;
+
+    for (const brokerRegion of brokerRegions) {
+      const url = `https://otc-github-consolidated-broker.${brokerRegion}.devops.cloud.ibm.com/git-user-sync`;
+      try {
+        const response = await this.client.post(url, syncData);
+        return response.data;
+      } catch (err) {
+        lastErr = err;
+
+        if (!shouldFailover(err)) throw err;
+
+        continue;
+      }
+    }
+
+    const status = lastErr?.response?.status;
+    const msg =
+      lastErr?.response?.data?.message ||
+      lastErr?.message ||
+      'Unknown error';
+
+    throw new Error(
+      `git-user-sync failed via all brokers (destRegion=${syncData?.destRegion}). ` +
+      `Last error: ${status || 'NO_RESPONSE'} - ${msg}`
+    );
+  }
+
   async createBulkImport(importData) {
     const response = await this.client.post('/bulk_imports', importData);
     return response.data;
````
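The new `syncUser` method tries brokers in order: the preferred destination region first, then every entry of `BROKER_REGIONS`, with duplicates removed while preserving order. A minimal standalone sketch of that order-preserving dedup step (region names here are illustrative):

```javascript
// Build the broker try-order: preferred region first, then the static list,
// dropping any region already seen so the preferred region is not retried.
function orderedBrokerCandidates(preferred, brokerRegions) {
  const candidates = [...(preferred ? [preferred] : []), ...brokerRegions];
  const seen = new Set();
  return candidates.filter(r => (seen.has(r) ? false : (seen.add(r), true)));
}

// 'eu-de' appears once, at the front, even though it is also in the static list.
const order = orderedBrokerCandidates('eu-de', ['au-syd', 'eu-de', 'us-south']);
// → ['eu-de', 'au-syd', 'us-south']
```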
````diff
@@ -487,6 +537,160 @@ async function handleBulkImportConflict({ destination, destUrl, sourceGroupFullP
   }
 }
 
+async function handlePlaceholderReassignments({ destination, options, destinationGroupPath }) {
+  const SRC_COL = 'Source user identifier';
+  const DEST_COL = 'GitLab username';
+
+  logger.print();
+  logger.info('Checking for placeholder users to reassign...', LOG_STAGES.info);
+
+  let destGroup;
+  try {
+    destGroup = await destination.getGroupByFullPath(destinationGroupPath);
+  } catch (e) {
+    logger.warn(
+      `Unable to look up destination group "${destinationGroupPath}" to process placeholder reassignment. Skipping.`,
+      LOG_STAGES.import
+    );
+    logger.debug(`Group lookup error: ${e?.message || e}`, LOG_STAGES.import);
+    return;
+  }
+
+  const destGroupId = destGroup?.id;
+  if (!destGroupId) {
+    logger.warn('Destination group ID not available. Skipping placeholder reassignment.', LOG_STAGES.import);
+    return;
+  }
+
+  const csvText = await logger.withSpinner(
+    () => destination.getGroupPlaceholderCsv(destGroupId),
+    'Fetching placeholder reassignment data...',
+    'Fetched placeholder reassignment data.',
+    LOG_STAGES.request
+  );
+
+  if (!csvText) {
+    logger.info('No placeholder reassignment support detected (or no placeholders). Skipping.', LOG_STAGES.import);
+    return;
+  }
+
+  const parsed = Papa.parse(csvText, { header: true, skipEmptyLines: true });
+  const rows = Array.isArray(parsed?.data) ? parsed.data.filter(Boolean) : [];
+
+  if (parsed?.errors?.length) {
+    logger.warn(`Placeholder CSV parse warnings: ${parsed.errors.length}`, LOG_STAGES.import);
+    if (options.verbose) {
+      parsed.errors.slice(0, 5).forEach(e => logger.log(`CSV parse warning: ${JSON.stringify(e)}`, LOG_STAGES.import, true));
+    }
+  }
+
+  if (!rows.length) {
+    logger.info('No placeholder users found. Skipping reassignment.', LOG_STAGES.import);
+    return;
+  }
+
+  if (!(SRC_COL in rows[0])) {
+    logger.warn(`Placeholder CSV missing expected column "${SRC_COL}". Skipping reassignment.`, LOG_STAGES.import);
+    if (options.verbose) logger.debug(`CSV headers: ${Object.keys(rows[0] || {}).join(', ')}`, LOG_STAGES.import);
+    return;
+  }
+
+  logger.info(`Found ${rows.length} placeholder record(s). Resolving users...`, LOG_STAGES.info);
+
+  const ids = [...new Set(rows.map(r => (r?.[SRC_COL] || '').trim()).filter(Boolean))];
+
+  if (!ids.length) {
+    logger.info('No valid placeholder user identifiers found. Skipping reassignment.', LOG_STAGES.import);
+    return;
+  }
+
+  const idToUsername = new Map();
+  const failedIds = [];
+
+  const resolveUsers = async () => {
+    for (let i = 0; i < ids.length; i++) {
+      const userId = ids[i];
+      logger.updateSpinnerMsg(`Resolving placeholder users (${i + 1}/${ids.length})...`);
+
+      try {
+        const resp = await destination.syncUser({
+          sourceRegion: options.sourceRegion,
+          destRegion: options.destRegion,
+          groupId: destGroupId,
+          userId,
+        });
+
+        const username = resp?.username;
+        if (username) {
+          idToUsername.set(userId, username);
+        } else {
+          failedIds.push(userId);
+          if (options.verbose) logger.warn(`User sync returned no username for userId=${userId}`, LOG_STAGES.import);
+        }
+      } catch (e) {
+        failedIds.push(userId);
+        logger.warn(`Failed to sync userId=${userId}: ${e?.message || e}`, LOG_STAGES.request);
+      }
+    }
+    return { resolved: idToUsername.size, total: ids.length };
+  };
+
+  const resolvedSummary = await logger.withSpinner(
+    resolveUsers,
+    `Resolving placeholder users (0/${ids.length})...`,
+    `Resolved placeholder users.`,
+    LOG_STAGES.request
+  );
+
+  logger.info(`Resolved ${resolvedSummary.resolved}/${resolvedSummary.total} user(s).`, LOG_STAGES.info);
+
+  if (idToUsername.size === 0) {
+    logger.warn('No users could be resolved. Skipping placeholder reassignment upload.', LOG_STAGES.info);
+    return;
+  }
+
+  let updatedCount = 0;
+  for (const r of rows) {
+    const id = (r?.[SRC_COL] || '').trim();
+    const username = idToUsername.get(id);
+    if (username) {
+      r[DEST_COL] = username;
+      updatedCount++;
+    }
+  }
+
+  logger.info(`Prepared reassignment CSV for ${updatedCount}/${rows.length} record(s).`, LOG_STAGES.setup);
+
+  const outPath = `groupPlaceholders.csv`;
+  const csvOut = Papa.unparse(rows);
+
+  await fs.promises.writeFile(outPath, csvOut, 'utf8');
+  logger.info(`Wrote placeholder reassignment CSV: ${outPath}`, LOG_STAGES.setup);
+
+  const csvConfig = { file: fs.createReadStream(outPath) };
+
+  const reassignRes = await logger.withSpinner(
+    () => destination.reassignGroupPlaceholder(destGroupId, csvConfig),
+    'Submitting placeholder reassignment...',
+    'Placeholder reassignment submitted.',
+    LOG_STAGES.request
+  );
+
+  if (!reassignRes) {
+    logger.warn('Placeholder reassignment endpoint not supported (or returned 404). Skipping.', LOG_STAGES.import);
+    return;
+  }
+
+  if (options.verbose) logger.debug(`Placeholder reassignment response: ${JSON.stringify(reassignRes)}`, LOG_STAGES.import);
+
+  if (failedIds.length) {
+    logger.warn(`Some users could not be resolved: ${failedIds.length}/${ids.length}`, LOG_STAGES.import);
+    if (options.verbose) failedIds.slice(0, 20).forEach(id => logger.log(`Unresolved userId: ${id}`, LOG_STAGES.import, true));
+  } else {
+    logger.success('✔ Placeholder reassignment completed.', LOG_STAGES.info);
+  }
+}
+
 async function directTransfer(options) {
   const sourceUrl = validateAndConvertRegion(options.sourceRegion);
   const destUrl = validateAndConvertRegion(options.destRegion);
````
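`handlePlaceholderReassignments` parses the placeholder CSV into row objects (papaparse with `header: true` keys each row by column name), then writes each resolved username into the "GitLab username" column before re-serializing and uploading. A simplified sketch of that row-update step, using hand-built row objects in place of a parsed CSV:

```javascript
// Column names match the placeholder CSV used by the command.
const SRC_COL = 'Source user identifier';
const DEST_COL = 'GitLab username';

// Fill the destination column for every row whose source id was resolved;
// rows with unresolved ids are left untouched. Returns the update count.
function applyReassignments(rows, idToUsername) {
  let updatedCount = 0;
  for (const r of rows) {
    const id = (r?.[SRC_COL] || '').trim();
    const username = idToUsername.get(id);
    if (username) {
      r[DEST_COL] = username;
      updatedCount++;
    }
  }
  return updatedCount;
}

// Illustrative data: only the first id resolves to a username.
const rows = [
  { [SRC_COL]: 'IBMid-123', [DEST_COL]: '' },
  { [SRC_COL]: 'IBMid-456', [DEST_COL]: '' },
];
const idToUsername = new Map([['IBMid-123', 'alice']]);
const updated = applyReassignments(rows, idToUsername);
// → 1; rows[0]['GitLab username'] is now 'alice'
```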
````diff
@@ -695,6 +899,8 @@ async function directTransfer(options) {
   logger.info(`${summary.entityFailed} entities failed to copy`, LOG_STAGES.import);
   if (newGroupUrl) logger.info(`New group URL: ${newGroupUrl}`, LOG_STAGES.import);
 
+  await handlePlaceholderReassignments({destination, options, destinationGroupPath});
+
   // show failed list only in verbose (or if failures exist)
   if (summary.entityFailed > 0) {
     logger.print();
````
package/cmd/utils/import-terraform.js
CHANGED

````diff
@@ -15,9 +15,12 @@ import { jsonToTf } from 'json-to-tf';
 import { getPipelineData, getToolchainTools } from './requests.js';
 import { runTerraformPlanGenerate, setTerraformEnv } from './terraform.js';
 import { getRandChars, isSecretReference, normalizeName } from './utils.js';
+import { logger } from './logger.js';
 
 import { SECRET_KEYS_MAP, SUPPORTED_TOOLS_MAP } from '../../config.js';
 
+const DEBUG_MODE = process.env['DEBUG_MODE'] === 'true'; // when true, log extra errors for debugging
+
 export async function importTerraform(token, apiKey, region, toolchainId, toolchainName, dir, isCompact, verbosity) {
   // STEP 1/2: set up terraform file with import blocks
   const importBlocks = []; // an array of objects representing import blocks, used in importBlocksToTf
@@ -157,7 +160,7 @@ export async function importTerraform(token, apiKey, region, toolchainId, toolch
 
   // STEP 2/2: run terraform import and post-processing
   setTerraformEnv(apiKey, verbosity);
-  await runTerraformPlanGenerate(dir, 'generated/draft.tf').catch(() => { }); // temp fix for errors
+  await runTerraformPlanGenerate(dir, 'generated/draft.tf').catch((err) => { DEBUG_MODE && logger.dump(`\n[DEBUG_MODE=true] Draft errors: ${err}`) }); // temp fix for errors before post-processing
 
   const generatedFile = fs.readFileSync(`${dir}/generated/draft.tf`);
   const generatedFileJson = await tfToJson('draft.tf', generatedFile.toString());
````
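The new `DEBUG_MODE` flag follows a common Node.js pattern: read an environment variable once at module load and gate extra diagnostics on it. A standalone sketch of the pattern (the helper names here are illustrative, not from the package):

```javascript
// Only the exact string 'true' enables the flag; '1', 'yes', etc. do not.
const debugEnabled = (env) => env['DEBUG_MODE'] === 'true';

// Gated logging: the diagnostic line is only produced when the flag is set.
function logDraftErrors(err, env, log = console.error) {
  if (debugEnabled(env)) log(`[DEBUG_MODE=true] Draft errors: ${err}`);
}

debugEnabled({ DEBUG_MODE: 'true' }); // → true
debugEnabled({ DEBUG_MODE: '1' });    // → false (must be the literal string 'true')
```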
package/cmd/utils/requests.js
CHANGED

````diff
@@ -502,6 +502,16 @@ async function getWithRetry(client, path, params = {}, { retries = 3, retryDelay
   throw lastError;
 }
 
+function shouldFailover(err) {
+  if (!err?.response) return true;
+  const status = err.response.status;
+  if (status >= 500) return true;
+  if (status === 404) return true;
+  if (status === 401) return true;
+
+  return false;
+}
+
 export {
   getBearerToken,
   getAccountId,
@@ -521,5 +531,6 @@ export {
   createTool,
   getSmInstances,
   migrateToolchainSecrets,
-  getWithRetry
+  getWithRetry,
+  shouldFailover
 }
````
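The new `shouldFailover` helper classifies an axios-style error: no attached response (network error or timeout), any 5xx, 404, or 401 means try the next broker, while any other status propagates immediately. A standalone sketch of that predicate; the per-status reasons in the comments are a plausible reading, not documented behavior:

```javascript
// Decide whether a failed broker request should fall through to the next
// candidate region (true) or abort the whole failover loop (false).
function shouldFailover(err) {
  if (!err?.response) return true;   // no response at all: network error / timeout
  const status = err.response.status;
  if (status >= 500) return true;    // server-side failure in this region
  if (status === 404) return true;   // endpoint not available on this broker
  if (status === 401) return true;   // credentials not accepted by this broker
  return false;                      // e.g. 400/403/409: retrying elsewhere won't help
}

shouldFailover(new Error('ECONNRESET'));       // → true (no response attached)
shouldFailover({ response: { status: 503 } }); // → true
shouldFailover({ response: { status: 400 } }); // → false
```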
package/config.js
CHANGED

````diff
@@ -55,6 +55,20 @@ const TARGET_REGIONS = [
   'us-south'
 ];
 
+const BROKER_REGIONS = [
+  'au-syd',
+  'br-sao',
+  'ca-mon',
+  'ca-tor',
+  'eu-de',
+  'eu-es',
+  'eu-gb',
+  'jp-osa',
+  'jp-tok',
+  'us-east',
+  'us-south'
+];
+
 const TERRAFORM_REQUIRED_VERSION = '1.13.3';
 
 // see https://docs.gitlab.com/user/reserved_names/
@@ -232,6 +246,7 @@ export {
   DOCS_URL,
   SOURCE_REGIONS,
   TARGET_REGIONS,
+  BROKER_REGIONS,
   TERRAFORM_REQUIRED_VERSION,
   RESERVED_GRIT_PROJECT_NAMES,
   RESERVED_GRIT_GROUP_NAMES,
````
package/package.json
CHANGED

````diff
@@ -1,6 +1,6 @@
 {
   "name": "@ibm-cloud/cd-tools",
-  "version": "1.13.1",
+  "version": "1.14.0",
   "description": "Tools and utilities for the IBM Cloud Continuous Delivery service and resources",
   "repository": {
     "type": "git",
@@ -25,6 +25,7 @@
     "commander": "^14.0.1",
     "json-to-tf": "^0.3.1",
     "ora": "^9.0.0",
+    "papaparse": "^5.5.3",
     "strip-ansi": "^7.1.2"
   },
   "bin": {
````