@ibm-cloud/cd-tools 1.3.4 → 1.5.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +151 -23
- package/cmd/copy-toolchain.js +33 -26
- package/cmd/utils/import-terraform.js +14 -15
- package/cmd/utils/requests.js +0 -22
- package/cmd/utils/terraform.js +29 -5
- package/cmd/utils/validate.js +5 -5
- package/config.js +4 -4
- package/create-s2s-script.js +119 -0
- package/package.json +1 -1
package/README.md
CHANGED
@@ -1,32 +1,20 @@
 # Continuous Delivery tools
 
-Provides tools to work with IBM Cloud Continuous Delivery resources, including
+Provides tools to work with IBM Cloud [Continuous Delivery](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-getting-started) resources, including [Toolchains](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-toolchains-using), [Delivery Pipelines](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-tekton-pipelines), and [Git Repos and Issue Tracking](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-git_working) projects.
 
 #### Supported resources
 | Resource | Supported |
 | :- | :- |
-| Toolchains | Yes <sup>1</sup> |
-| Git Repos and Issue Tracking
-| Delivery Pipelines (Tekton) | Yes <sup>1</sup>
-| Delivery Pipelines (Classic) | No |
-| DevOps Insights | No |
-| Other Tool Integrations | Yes |
-
-#### Limitations
-1. Secrets stored directly in Toolchains or Delivery Pipelines (environment properties or trigger properties) will not be copied. A `check-secrets` tool is provided to export secrets into a Secrets Manager instance, replacing the stored secrets with secret references. Secret references are supported in the migration.
-2. Personal Access Tokens will not be copied.
-3. Pipeline run history, logs, and assets will not be copied to the new region. You can keep the original pipelines for some time to retain history.
-4. Classic pipelines are not supported.
-5. DevOps Insights is not supported.
+| [Toolchains](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-toolchains-using) | Yes <sup>[1](#limitations-1)</sup> |
+| [Git Repos and Issue Tracking](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-git_working) | Yes <sup>[2](#limitations)</sup> |
+| [Delivery Pipelines (Tekton)](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-tekton-pipelines) | Yes <sup>[3](#limitations-1)</sup> |
+| [Delivery Pipelines (Classic)](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-deliverypipeline_about) | No |
+| [DevOps Insights](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-di_working) | No |
+| [Other Tool Integrations](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-integrations) | Yes |
 
 ## Prerequisites
 - Node.js v20 (or later)
 - Terraform v1.13.3 (or later)
-- An **IBM Cloud API key** with the following IAM access permissions:
-  - **Viewer** for the source Toolchain(s) being copied
-  - **Editor** for create new Toolchains in the target region
-  - **Administrator** for other IBM Cloud service instances that have a tool integration with IAM service-to-service authorizations, such as Secrets Manager, Event Notifications, etc.
-- For Git Repos and Issue Tracking projects, Personal Access Tokens (PAT) for the source and destination regions are required, with the `api` scope.
 
 ## Install
 ### Install Node.js, Terraform
@@ -44,7 +32,7 @@ brew install hashicorp/tap/terraform
 
 ## Usage
 
-The tools are provided as an [npx](https://docs.npmjs.com/cli/commands/npx) command which automatically downloads and runs
+The tools are provided as an [npx](https://docs.npmjs.com/cli/commands/npx) command. [npx](https://docs.npmjs.com/cli/commands/npx) (Node Package Execute) is a utility provided with [Node.js](https://nodejs.org/) which automatically downloads a module and its dependencies, and runs it. To see the available commands, run `npx @ibm-cloud/cd-tools` on your command line.
 
 ```shell-session
 $ npx @ibm-cloud/cd-tools
@@ -58,10 +46,150 @@ Options:
 
 Commands:
 copy-project-group [options] Bulk migrate GitLab group projects
-
-
+copy-toolchain [options] Copies a toolchain, including tool integrations and Tekton pipelines, to another region or resource group
+export-secrets [options] Checks if you have any stored secrets in your toolchain or pipelines, and exports them to Secrets Manager
 help [command] display help for command
 ```
 
+## copy-project-group
+
+### Overview
+The `copy-project-group` command copies a [group](https://docs.gitlab.com/user/group/) of projects in IBM Cloud Continuous Delivery's [Git Repos and Issue Tracking](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-git_working) from one region to another. This includes the project group, projects, Git repositories, issues, merge requests, wiki, and most other resources. See the [full list](https://docs.gitlab.com/user/group/import/migrated_items/) of items included in the copy. In addition to copying the project group, the command will also ensure that project members exist in the destination region and are added to the newly copied project group, preserving existing permissions.
+
+### Limitations
+1. Personal projects are not supported. If you created a project under a [personal namespace](https://docs.gitlab.com/user/namespace/), you can either [move your personal project to a group](https://docs.gitlab.com/tutorials/move_personal_project_to_group/), or [convert your personal namespace into a group](https://docs.gitlab.com/tutorials/convert_personal_namespace_to_group/). It is recommended that you store projects in groups, as they allow multiple administrators and allow better continuity of a project over time.
+2. This command requests a GitLab direct transfer and is subject to the [limitations of using direct transfer](https://docs.gitlab.com/user/group/import/#known-issues).
+3. Copying large projects, or projects with large files or many resources, can take time.
+4. As each region of Git Repos and Issue Tracking is independent, your projects' users may not yet exist in the destination region. The `copy-project-group` will ensure that the users exist in the new region, however there may be user name conflicts with other users in the destination region. In the event of a user name conflict, the user name in the destination region may be changed slightly by adding a suffix.
+
+### Prerequisites
+- Personal Access Tokens (PAT) for the source and destination regions are required
+- Both PATs must have the `api` scope.
+
+### Recommendations
+- Be patient. Copying large projects may take some time. Allow the command to run to completion.
+
+### Usage
+```shell-session
+$ npx @ibm-cloud/cd-tools copy-project-group -h
+Usage: @ibm-cloud/cd-tools copy-project-group [options]
+
+Bulk migrate GitLab group projects
+
+Options:
+-s, --source-region <region> Source GitLab instance region
+-d, --dest-region <region> Destination GitLab instance region
+--st, --source-token <token> Source GitLab access token
+--dt, --dest-token <token> Destination GitLab access token
+-g, --group-id <id> Source group ID to migrate
+-n, --new-name <n> New group path (optional)
+-h, --help display help for command
+```
+
+## copy-toolchain
+
+### Overview
+The `copy-toolchain` command copies a [toolchain](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-toolchains-using), including tool integrations and Tekton pipelines, to another region or resource group, in the same account. The copy works by first serializing the existing toolchain into Terraform (.tf) files, then applying the Terraform on the destination.
+
+### Limitations
+1. [Classic pipelines](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-deliverypipeline_about) are not supported.
+2. [DevOps Insights](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-di_working) is not supported.
+3. Secrets stored directly in Toolchains or Delivery Pipelines (environment properties or trigger properties) will not be copied. An `export-secrets` command is provided to export secrets into a [Secrets Manager](https://cloud.ibm.com/docs/secrets-manager?topic=secrets-manager-getting-started) instance, replacing the stored secrets with secret references. Secret references are supported. It is recommended to store secrets in [Secrets Manager](https://cloud.ibm.com/docs/secrets-manager?topic=secrets-manager-getting-started).
+4. Tekton pipeline webhook trigger secrets will not be copied, as references are not supported for webhook trigger secrets. You will need to add the secret after copying the toolchain.
+5. Tekton pipeline run history, logs, and assets will not be copied. You can keep the original pipelines for some time to retain history.
+6. GitHub and Git Repos and Issue Tracking tool integrations configured with OAuth type authentication will automatically be converted to use the OAuth identity of the user performing the copy (the owner of the API key) rather than the original user. This is to simplify the copy operation. You can re-configure the tool integrations after copying to use a different user.
+7. Git Repos and Issue Tracking tool integrations that use Personal Access Tokens (PATs) for authentication will automatically be converted to use OAuth. You can re-configure the tool integrations after copying to use a PAT again.
+
+### Prerequisites
+- An [IBM Cloud API key](https://cloud.ibm.com/docs/account?topic=account-manapikey) with the IAM access listed below.
+- **Viewer** access for the source Toolchain(s) being copied
+- **Editor** access for creating new Toolchains in the target region
+- **Administrator** access for other IBM Cloud service instances that have a tool integration with IAM service-to-service authorizations, such as [Secrets Manager](https://cloud.ibm.com/docs/secrets-manager?topic=secrets-manager-getting-started), [Event Notifications](https://cloud.ibm.com/docs/event-notifications?topic=event-notifications-getting-started), etc.
+- Access to any GitHub or Git Repos and Issue Tracking **repositories** referenced by tool integrations in the toolchain, with permission to **read the repository** and **create webhooks**. This is required in order to create pipeline Git type triggers, which require a webhook to be added on the repository to trigger the pipeline, and for the pipeline to be able to clone the repositories during execution.
+- A [Continuous Delivery](https://cloud.ibm.com/catalog/services/continuous-delivery) service instance is required in the target region and resource group in order to properly create the toolchain copy. Note that Continuous Delivery capabilities (Delivery Pipelines, Git Repos and Issue Tracking, etc) are subject to the plan of the Continuous Delivery instance in the same region and resource group as the toolchain. [Learn more](https://cloud.ibm.com/docs/ContinuousDelivery?topic=ContinuousDelivery-limitations_usage)
+
+### Recommendations
+- Ensure that all tool integrations in the toolchains are correctly configured and showing no errors in the toolchain page before proceeding. If there are misconfigured tool integrations, the tool will prompt you before proceeding.
+
+### CRN
+
+IBM Cloud resources are uniquely identified by a [Cloud Resource Name (CRN)](https://cloud.ibm.com/docs/account?topic=account-crn). You will need the CRN of the toolchain you want to copy. You can get the CRN of a toolchain a few ways:
+
+1. Locate the toolchain in the [Platform Automation](https://cloud.ibm.com/automation) > [Toolchains](https://cloud.ibm.com/automation/toolchains) page, open the toolchain, and click **Details** to see the toolchain details, which shows the CRN.
+2. Locate the toolchain in the [Resource list](https://cloud.ibm.com/resources) page, click on the toolchain row to expand the details panel, which shows the CRN.
+3. Using the [ibmcloud cli](https://cloud.ibm.com/docs/cli?topic=cli-getting-started), you can list toolchains and their CRNs via
+```shell-session
+$ ibmcloud resource service-instances --service-name toolchain --long
+```
+4. Using the [CD Toolchain API](https://cloud.ibm.com/apidocs/toolchain).
+
+### Usage
+```shell-session
+$ npx @ibm-cloud/cd-tools copy-toolchain -h
+Usage: @ibm-cloud/cd-tools copy-toolchain [options]
+
+Copies a toolchain, including tool integrations and Tekton pipelines, to another region or resource group.
+
+Examples:
+export IBMCLOUD_API_KEY='...'
+npx @ibm-cloud/cd-migration-tools copy-toolchain -c ${TOOLCHAIN_CRN} -r us-south
+Copy a toolchain to the Dallas region with the same name, in the same resource group.
+npx @ibm-cloud/cd-migration-tools copy-toolchain -c ${TOOLCHAIN_CRN} -r eu-de -n new-toolchain-name -g new-resource-group --apikey ${APIKEY}
+Copy a toolchain to the Frankfurt region with the specified name and target resource group, using the given API key
+
+Environment Variables:
+IBMCLOUD_API_KEY API key used to authenticate. Must have IAM permission to read and create toolchains and service-to-service authorizations in source and target region / resource group
+
+Basic options:
+-c, --toolchain-crn <crn> The CRN of the source toolchain to copy
+-r, --region <region> The destination region of the copied toolchain (choices: "au-syd", "br-sao", "ca-mon", "ca-tor", "eu-de", "eu-es", "eu-gb", "jp-osa", "jp-tok", "us-east", "us-south")
+-a, --apikey <api_key> API key used to authenticate. Must have IAM permission to read and create toolchains and service-to-service authorizations in source and target region / resource group
+-n, --name <name> (Optional) The name of the copied toolchain (default: same name as original)
+-g, --resource-group <resource_group> (Optional) The name or ID of destination resource group of the copied toolchain (default: same resource group as original)
+-t, --tag <tag> (Optional) The tag to add to the copied toolchain
+-h, --help Display help for command
+
+Advanced options:
+-d, --terraform-dir <path> (Optional) The target local directory to store the generated Terraform (.tf) files
+-D, --dry-run (Optional) Skip running terraform apply; only generate the Terraform (.tf) files
+-f, --force (Optional) Force the copy toolchain command to run without user confirmation
+-S, --skip-s2s (Optional) Skip importing toolchain-generated service-to-service authorizations
+-T, --skip-disable-triggers (Optional) Skip disabling Tekton pipeline Git or timed triggers. Note: This may result in duplicate pipeline runs
+-C, --compact (Optional) Generate all resources in a single resources.tf file
+-v, --verbose (Optional) Increase log output
+-q, --quiet (Optional) Suppress non-essential output, only errors and critical warnings are displayed
+```
+
+### Retrying after errors
+
+If an error occurs while copying the toolchain, the copied toolchain may be incomplete. You may need to try the command again. To try again, you can either:
+1. Delete the partially created toolchain and run the `copy-toolchain` command again.
+2. Re-run the `terraform apply` command.<br/><br/>The `copy-toolchain` first serializes the source toolchain into Terraform (.tf) files. If you don't specify the `-d, --terraform-dir <path>`, the Terraform files will be placed in a folder in the current working directory named `output-{id}`, e.g. `output-1764100766410`. You can locate the most recent output folder and re-run `terraform apply`. This will continue where the previous command left off. When prompted for an API key, specify the same API key you used to run the `copy-toolchain` command.
+```shell-session
+$ cd output-1764102115772
+$ terraform apply
+var.ibmcloud_api_key
+Enter a value: {api_key}
+...
+```
+
+### Getting the Terraform code for a toolchain
+
+You can get the Terraform (.tf) files for a toolchain by running the `copy-toolchain` command with the `-D, --dry-run` option, and specifying the directory to store the Terraform files with the `-d, --terraform-dir <path>` option.
+
+```shell-session
+$ npx @ibm-cloud/cd-tools copy-toolchain -c ${CRN} -r us-south --dry-run --terraform-dir ./terraform
+```
+
+The command will output a collection of `.tf` files in the `terraform` directory. If you prefer to have a single file containing all the Terraform source, you can also specify the `-C, --compact` option.
+
+### Copying toolchains to a different account
+
+The `copy-toolchain` command copies a toolchain within an IBM Cloud account. However it is possible to copy a toolchain to a different account with a few extra steps. Note that any tool integrations that access services in the source account, such as [Secrets Manager](https://cloud.ibm.com/docs/secrets-manager?topic=secrets-manager-getting-started), [Event Notifications](https://cloud.ibm.com/docs/event-notifications?topic=event-notifications-getting-started), etc. are not supported for cross-account copying.
+1. Run the `copy-toolchain` command with the `-D, --dry-run` option to first generate the Terraform (.tf) files to a directory (See [Getting the Terraform code for a toolchain](#getting-the-terraform-code-for-a-toolchain)).
+2. Edit the `cd_toolchain.tf` file, replacing the `resource_group_id` with a valid resource group id in the target account. You can find the resource group id in the IBM Cloud console under [Manage > Account > Resource groups](https://cloud.ibm.com/account/resource-groups).
+3. Switch to the directory containing the Terraform files, and run `terraform init`, then `terraform apply`.
+4. When prompted for the API key, provide an API key for the target account you wish to copy the toolchain to.
+
 ## Test
-All test setup and usage instructions are documented in [test/README.md](./test/README.md).
+All test setup and usage instructions are documented in [test/README.md](./test/README.md).
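A note on the cross-account steps added above: they amount to a dry run followed by a plain Terraform workflow in the generated directory. A minimal sketch, assuming the dry run wrote its files to `./terraform` and using placeholder values:

```shell-session
$ npx @ibm-cloud/cd-tools copy-toolchain -c ${TOOLCHAIN_CRN} -r us-south --dry-run --terraform-dir ./terraform
$ cd terraform
# edit cd_toolchain.tf: set resource_group_id to a resource group id in the target account
$ terraform init
$ terraform apply
var.ibmcloud_api_key
  Enter a value: {target_account_api_key}
```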
package/cmd/copy-toolchain.js
CHANGED
@@ -8,7 +8,7 @@
 */
 
 import { exit } from 'node:process';
-import { resolve } from 'node:path'
+import { resolve } from 'node:path';
 import fs from 'node:fs';
 
 import { Command, Option } from 'commander';
@@ -16,12 +16,14 @@ import { Command, Option } from 'commander';
 import { parseEnvVar } from './utils/utils.js';
 import { logger, LOG_STAGES } from './utils/logger.js';
 import { setTerraformEnv, initProviderFile, setupTerraformFiles, runTerraformInit, getNumResourcesPlanned, runTerraformApply, getNumResourcesCreated, getNewToolchainId } from './utils/terraform.js';
-import { getAccountId, getBearerToken, getCdInstanceByRegion,
+import { getAccountId, getBearerToken, getCdInstanceByRegion, getResourceGroupIdAndName, getToolchain } from './utils/requests.js';
 import { validatePrereqsVersions, validateTag, validateToolchainId, validateToolchainName, validateTools, validateOAuth, warnDuplicateName, validateGritUrl } from './utils/validate.js';
 import { importTerraform } from './utils/import-terraform.js';
 
 import { COPY_TOOLCHAIN_DESC, DOCS_URL, TARGET_REGIONS, SOURCE_REGIONS } from '../config.js';
 
+import packageJson from '../package.json' with { type: "json" };
+
 process.on('exit', (code) => {
   if (code !== 0) logger.print(`Need help? Visit ${DOCS_URL} for more troubleshooting information.`);
 });
@@ -54,7 +56,7 @@ const command = new Command('copy-toolchain')
   .option('-d, --terraform-dir <path>', '(Optional) The target local directory to store the generated Terraform (.tf) files')
   .option('-D, --dry-run', '(Optional) Skip running terraform apply; only generate the Terraform (.tf) files')
   .option('-f, --force', '(Optional) Force the copy toolchain command to run without user confirmation')
-  .option('-S, --skip-s2s', '(Optional) Skip
+  .option('-S, --skip-s2s', '(Optional) Skip creating toolchain-generated service-to-service authorizations')
   .option('-T, --skip-disable-triggers', '(Optional) Skip disabling Tekton pipeline Git or timed triggers. Note: This may result in duplicate pipeline runs')
   .option('-C, --compact', '(Optional) Generate all resources in a single resources.tf file')
   .option('-v, --verbose', '(Optional) Increase log output')
@@ -92,13 +94,13 @@ async function main(options) {
   let targetRgId;
   let targetRgName;
   let apiKey = options.apikey;
-  let policyIds; // used to include s2s auth policies
   let moreTfResources = {};
   let gritMapping = {};
 
   // Validate arguments are valid and check if Terraform is installed appropriately
   try {
     validatePrereqsVersions();
+    logger.info(`\x1b[32m✔\x1b[0m cd-tools Version: ${packageJson.version}`, LOG_STAGES.setup);
 
     if (!apiKey) apiKey = parseEnvVar('IBMCLOUD_API_KEY');
     bearer = await getBearerToken(apiKey);
@@ -195,26 +197,6 @@ async function main(options) {
 
   collectGHE();
 
-  const collectPolicyIds = async () => {
-    moreTfResources['iam_authorization_policy'] = [];
-
-    const res = await getIamAuthPolicies(bearer, accountId);
-
-    policyIds = res['policies'].filter((p) => p.subjects[0].attributes.find(
-      (a) => a.name === 'serviceInstance' && a.value === sourceToolchainId)
-    );
-    policyIds = policyIds.map((p) => p.id);
-  };
-
-  if (includeS2S) {
-    try {
-      collectPolicyIds();
-    } catch (e) {
-      logger.error('Something went wrong while fetching service-to-service auth policies', LOG_STAGES.setup);
-      throw e;
-    }
-  }
-
   logger.info('Arguments and required packages verified, proceeding with copying toolchain...', LOG_STAGES.setup);
 
   // Set up temp folder
@@ -231,6 +213,9 @@ async function main(options) {
     exit(1);
   }
 
+  let toolchainTfName; // to target creating toolchain first
+  let s2sAuthTools; // to create s2s auth with script
+
   try {
     let nonSecretRefs;
 
@@ -242,7 +227,7 @@ async function main(options) {
       await initProviderFile(sourceRegion, TEMP_DIR);
       await runTerraformInit(TEMP_DIR, verbosity);
 
-      nonSecretRefs = await importTerraform(bearer, apiKey, sourceRegion, sourceToolchainId, targetToolchainName,
+      [toolchainTfName, nonSecretRefs, s2sAuthTools] = await importTerraform(bearer, apiKey, sourceRegion, sourceToolchainId, targetToolchainName, TEMP_DIR, isCompact, verbosity);
    };
 
    await logger.withSpinner(
@@ -286,7 +271,8 @@ async function main(options) {
       tempDir: TEMP_DIR,
       moreTfResources: moreTfResources,
       gritMapping: gritMapping,
-      skipUserConfirmation: skipUserConfirmation
+      skipUserConfirmation: skipUserConfirmation,
+      includeS2S: includeS2S
     });
   } catch (err) {
     if (err.message && err.stack) {
@@ -317,6 +303,27 @@ async function main(options) {
 
   let applyErrors = false;
 
+  if (includeS2S) {
+    const s2sRequests = s2sAuthTools.map((item) => {
+      return {
+        parameters: item['parameters'],
+        serviceId: item.tool_type_id,
+        env_id: `ibm:yp:${targetRegion}`
+      };
+    });
+    fs.writeFileSync(resolve(`${outputDir}/create-s2s.json`), JSON.stringify(s2sRequests));
+
+    // copy script
+    fs.copyFileSync(resolve('create-s2s-script.js'), resolve(`${outputDir}/create-s2s-script.js`), fs.constants.COPYFILE_EXCL);
+  }
+
+  // create toolchain, which invokes script to create s2s if applicable
+  await runTerraformApply(true, outputDir, verbosity, `ibm_cd_toolchain.${toolchainTfName}`).catch((err) => {
+    logger.error(err, LOG_STAGES.tf);
+    applyErrors = true;
+  });
+
+  // create the rest
   await runTerraformApply(skipUserConfirmation, outputDir, verbosity).catch((err) => {
     logger.error(err, LOG_STAGES.tf);
     applyErrors = true;
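In effect, the new two-phase apply in copy-toolchain.js runs a targeted `terraform apply` for the toolchain resource first, then a second apply for everything else. Roughly, with an illustrative resource name:

```shell-session
$ terraform apply -auto-approve -target="ibm_cd_toolchain.my_toolchain"   # creates the toolchain; its local-exec provisioner runs create-s2s-script.js
$ terraform apply                                                         # creates the remaining tool integrations and pipelines
```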
package/cmd/utils/import-terraform.js
CHANGED
@@ -18,7 +18,7 @@ import { getRandChars, isSecretReference, normalizeName } from './utils.js';
 
 import { SECRET_KEYS_MAP, SUPPORTED_TOOLS_MAP } from '../../config.js';
 
-export async function importTerraform(token, apiKey, region, toolchainId, toolchainName,
+export async function importTerraform(token, apiKey, region, toolchainId, toolchainName, dir, isCompact, verbosity) {
   // STEP 1/2: set up terraform file with import blocks
   const importBlocks = []; // an array of objects representing import blocks, used in importBlocksToTf
   const additionalProps = {}; // maps resource name to array of { property/param, value }, used to override terraform import
@@ -41,6 +41,14 @@ export async function importTerraform(token, apiKey, region, toolchainId, toolch
   const toolchainResName = block.name;
   let pipelineResName;
 
+  const requiresS2S = [
+    'ibm_cd_toolchain_tool_appconfig',
+    'ibm_cd_toolchain_tool_eventnotifications',
+    'ibm_cd_toolchain_tool_keyprotect',
+    'ibm_cd_toolchain_tool_secretsmanager'
+  ];
+  let s2sAuthTools = [];
+
   // get list of tools
   const allTools = await getToolchainTools(token, toolchainId, region);
   for (const tool of allTools.tools) {
@@ -55,6 +63,10 @@ export async function importTerraform(token, apiKey, region, toolchainId, toolch
 
     toolIdMap[tool.id] = { type: SUPPORTED_TOOLS_MAP[tool.tool_type_id], name: toolResName };
 
+    if (requiresS2S.includes(SUPPORTED_TOOLS_MAP[tool.tool_type_id])) {
+      s2sAuthTools.push(tool);
+    }
+
     // overwrite hard-coded id with reference
     additionalProps[block.name] = [
       { property: 'toolchain_id', value: `\${ibm_cd_toolchain.${toolchainResName}.id}` },
@@ -139,19 +151,6 @@ export async function importTerraform(token, apiKey, region, toolchainId, toolch
     }
   }
 
-  // include s2s
-  if (policyIds) {
-    for (const policyId of policyIds) {
-      block = importBlock(policyId, 'iam_authorization_policy', 'ibm_iam_authorization_policy');
-      importBlocks.push(block);
-
-      // overwrite hard-coded id with reference
-      additionalProps[block.name] = [
-        { property: 'source_resource_instance_id', value: `\${ibm_cd_toolchain.${toolchainResName}.id}` },
-      ];
-    }
-  }
-
   importBlocksToTf(importBlocks, dir);
 
   if (!fs.existsSync(`${dir}/generated`)) fs.mkdirSync(`${dir}/generated`);
@@ -313,7 +312,7 @@ export async function importTerraform(token, apiKey, region, toolchainId, toolch
   // remove draft
   if (fs.existsSync(`${dir}/generated/draft.tf`)) fs.rmSync(`${dir}/generated/draft.tf`, { recursive: true });
 
-  return nonSecretRefs;
+  return [toolchainResName, nonSecretRefs, s2sAuthTools];
 }
 
 // objects have two keys, "id" and "to"
package/cmd/utils/requests.js
CHANGED
@@ -373,27 +373,6 @@ async function getGritGroupProject(privToken, region, groupId, projectName) {
   }
 }
 
-async function getIamAuthPolicies(bearer, accountId) {
-  const apiBaseUrl = 'https://iam.cloud.ibm.com/v1';
-  const options = {
-    url: apiBaseUrl + '/policies',
-    method: 'GET',
-    headers: {
-      'Authorization': `Bearer ${bearer}`,
-      'Content-Type': 'application/json',
-    },
-    params: { account_id: accountId, type: 'authorization' },
-    validateStatus: () => true
-  };
-  const response = await axios(options);
-  switch (response.status) {
-    case 200:
-      return response.data;
-    default:
-      throw Error('Get auth policies failed');
-  }
-}
-
 async function deleteToolchain(bearer, toolchainId, region) {
   const apiBaseUrl = `https://api.${region}.devops.cloud.ibm.com/toolchain/v2`;
   const options = {
@@ -430,6 +409,5 @@ export {
   getGritUserProject,
   getGritGroup,
   getGritGroupProject,
-  getIamAuthPolicies,
   deleteToolchain
 }
package/cmd/utils/terraform.js
CHANGED
@@ -50,7 +50,7 @@ async function initProviderFile(targetRegion, dir) {
   return writeFilePromise(`${dir}/provider.tf`, jsonToTf(newProviderTfStr));
 }
 
-async function setupTerraformFiles({ token, srcRegion, targetRegion, targetTag, targetToolchainName, targetRgId, disableTriggers, isCompact, outputDir, tempDir, moreTfResources, gritMapping, skipUserConfirmation }) {
+async function setupTerraformFiles({ token, srcRegion, targetRegion, targetTag, targetToolchainName, targetRgId, disableTriggers, isCompact, outputDir, tempDir, moreTfResources, gritMapping, skipUserConfirmation, includeS2S }) {
   const promises = [];
 
   const writeProviderPromise = await initProviderFile(targetRegion, outputDir);
@@ -206,8 +206,9 @@ async function setupTerraformFiles({ token, srcRegion, targetRegion, targetTag,
 
   if (isCompact || resourceName === 'ibm_cd_toolchain') {
     if (targetTag) newTfFileObj['resource']['ibm_cd_toolchain'][newTcId]['tags'] = [
-
-
+      Array.from(new Set( // uniqueness
+        (newTfFileObj['resource']['ibm_cd_toolchain'][newTcId]['tags'] ?? []).concat([targetTag])
+      ))
     ];
     if (targetToolchainName) newTfFileObj['resource']['ibm_cd_toolchain'][newTcId]['name'] = targetToolchainName;
     if (targetRgId) newTfFileObj['resource']['ibm_cd_toolchain'][newTcId]['resource_group_id'] = targetRgId;
@@ -339,7 +340,8 @@ async function setupTerraformFiles({ token, srcRegion, targetRegion, targetTag,
   }
 
   const newTfFileObjStr = JSON.stringify(newTfFileObj);
-
+  let newTfFile = replaceDependsOn(jsonToTf(newTfFileObjStr));
+  if (includeS2S && (isCompact || resourceName === 'ibm_cd_toolchain')) newTfFile = addS2sScriptToToolchainTf(newTfFile);
   const copyResourcesPromise = writeFilePromise(`${outputDir}/${fileName}`, newTfFile);
   promises.push(copyResourcesPromise);
 }
@@ -382,11 +384,14 @@ async function getNumResourcesPlanned(dir) {
   };
 }
 
-async function runTerraformApply(skipTfConfirmation, outputDir, verbosity) {
+async function runTerraformApply(skipTfConfirmation, outputDir, verbosity, target) {
   let command = 'terraform apply';
   if (skipTfConfirmation || verbosity === 0) {
     command = 'terraform apply -auto-approve';
   }
+  if (target) {
+    command += ` -target="${target}"`
+  }
 
   const child = child_process.spawn(command, {
     cwd: `${outputDir}`,
@@ -466,6 +471,25 @@ function replaceDependsOn(str) {
   }
 }
 
+function addS2sScriptToToolchainTf(str) {
+  const provisionerStr = (tfName) => `\n\n provisioner "local-exec" {
+command = "node create-s2s-script.js"
+environment = {
+IBMCLOUD_API_KEY = var.ibmcloud_api_key
+TARGET_TOOLCHAIN_ID = ibm_cd_toolchain.${tfName}.id
+}\n }`
+  try {
+    if (typeof str === 'string') {
+      const pattern = /^resource "ibm_cd_toolchain" "([a-z0-9_-]*)" \{$\n((.|\n)*)\n^\}$/gm;
+
+      // get rid of the quotes
+      return str.replace(pattern, (match, s1, s2) => `resource "ibm_cd_toolchain" "${s1}" {\n${s2}${provisionerStr(s1)}\n}`);
+    }
+  } catch {
+    return str;
+  }
+}
+
 export {
   setTerraformEnv,
   initProviderFile,
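For reference, once addS2sScriptToToolchainTf has rewritten the generated ibm_cd_toolchain block, the injected provisioner looks roughly like the following (the resource name terraform_toolchain is illustrative, and indentation is tidied for readability):

```hcl
  provisioner "local-exec" {
    command = "node create-s2s-script.js"
    environment = {
      IBMCLOUD_API_KEY    = var.ibmcloud_api_key
      TARGET_TOOLCHAIN_ID = ibm_cd_toolchain.terraform_toolchain.id
    }
  }
```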
package/cmd/utils/validate.js
CHANGED
@@ -114,13 +114,13 @@ async function warnDuplicateName(token, accountId, tcName, srcRegion, targetRegi
     // soft warning of confusion
     logger.warn(`\nWarning! A toolchain named '${tcName}' already exists in:\n - Region: ${targetRegion}`, '', true);
   }
-  if (hasSameResourceGroup) {
-
-
-  }
+  // if (hasSameResourceGroup) {
+  //   // soft warning of confusion
+  //   logger.warn(`\nWarning! A toolchain named '${tcName}' already exists in:\n - Resource Group: ${targetResourceGroupName} (${targetResourceGroupId})`, '', true);
+  // }
 }
 
-if (hasBoth || hasSameRegion
+if (hasBoth || hasSameRegion) {
   // suggest a tag, if one not provided
   if (!targetTag) {
     if (!skipPrompt) {
package/config.js
CHANGED
@@ -167,9 +167,9 @@ const SECRET_KEYS_MAP = {
   'nexus': [
     { key: 'token', tfKey: 'token' },
   ],
-  'pagerduty': [
-
-  ],
+  // 'pagerduty': [
+  //   { key: 'service_key', tfKey: 'service_key', required: true },
+  // ],
   'private_worker': [
     { key: 'workerQueueCredentials', tfKey: 'worker_queue_credentials', required: true },
   ],
@@ -205,7 +205,7 @@ const SUPPORTED_TOOLS_MAP = {
   'keyprotect': 'ibm_cd_toolchain_tool_keyprotect',
   'nexus': 'ibm_cd_toolchain_tool_nexus',
   'customtool': 'ibm_cd_toolchain_tool_custom',
-  'pagerduty': 'ibm_cd_toolchain_tool_pagerduty',
+  // 'pagerduty': 'ibm_cd_toolchain_tool_pagerduty',
   'saucelabs': 'ibm_cd_toolchain_tool_saucelabs',
   'secretsmanager': 'ibm_cd_toolchain_tool_secretsmanager',
   'security_compliance': 'ibm_cd_toolchain_tool_securitycompliance',
package/create-s2s-script.js
ADDED
@@ -0,0 +1,119 @@
+/**
+ * Licensed Materials - Property of IBM
+ * (c) Copyright IBM Corporation 2025. All Rights Reserved.
+ *
+ * Note to U.S. Government Users Restricted Rights:
+ * Use, duplication or disclosure restricted by GSA ADP Schedule
+ * Contract with IBM Corp.
+ */
+
+import fs from 'node:fs';
+import { resolve } from 'node:path';
+
+const API_KEY = process.env['IBMCLOUD_API_KEY'];
+if (!API_KEY) throw Error(`Missing 'IBMCLOUD_API_KEY'`);
+
+const TC_ID = process.env['TARGET_TOOLCHAIN_ID'];
+if (!TC_ID) throw Error(`Missing 'TARGET_TOOLCHAIN_ID'`);
+
+const INPUT_PATH = 'create-s2s.json';
+const CLOUD_PLATFORM = 'https://cloud.ibm.com';
+const IAM_BASE_URL = 'https://iam.cloud.ibm.com';
+
+async function getBearer() {
+  const url = `${IAM_BASE_URL}/identity/token`;
+
+  const params = new URLSearchParams();
+  params.append('grant_type', 'urn:ibm:params:oauth:grant-type:apikey');
+  params.append('apikey', API_KEY);
+  params.append('response_type', 'cloud_iam');
+
+  try {
+    const response = await fetch(url, {
+      method: "POST",
+      headers: {
+        'Accept': 'application/json',
+        'Content-Type': 'application/x-www-form-urlencoded'
+      },
+      body: params
+    });
+
+    if (!response.ok) {
+      throw new Error(`Response status: ${response.status}, ${response.statusText}`);
+    }
+
+    console.log(`GETTING BEARER TOKEN... ${response.status}, ${response.statusText}`);
+
+    return (await response.json()).access_token;
+  } catch (error) {
+    console.error(error.message);
+  }
+}
+
+/* expecting item as an object with the format of:
+{
+  "parameters": {
+    "name": "",
+    "integration-status": "",
+    "instance-id-type": "",
+    "region": "",
+    "resource-group": "",
+    "instance-name": "",
+    "instance-crn": "",
+    "setup-authorization-type": ""
+  },
+  "toolchainId": "",
+  "serviceId": "",
+  "env_id": ""
+}
+*/
+
+async function createS2sAuthPolicy(item) {
+  const url = `${CLOUD_PLATFORM}/devops/setup/api/v2/s2s_authorization?${new URLSearchParams({
+    toolchainId: TC_ID,
+    serviceId: item['serviceId'],
+    env_id: item['env_id']
+  }).toString()}`;
+
+  const data = JSON.stringify({
+    'parameters': {
+      'name': item['parameters']['name'],
+      'integration-status': '',
+      'instance-id-type': item['parameters']['instance-id-type'],
+      'region': item['parameters']['region'],
+      'resource-group': item['parameters']['resource-group'],
+      'instance-name': item['parameters']['instance-name'],
+      'instance-crn': item['parameters']['instance-crn'],
+      'setup-authorization-type': 'select'
+    }
+  });
+
+  try {
+    const response = await fetch(url, {
+      method: "POST",
+      headers: {
+        'Authorization': `Bearer ${bearer}`,
+        'Content-Type': 'application/json',
+      },
+      body: data,
+    });
+
+    if (!response.ok) {
+      throw new Error(`Response status: ${response.status}, ${response.statusText}`);
+    }
+
+    console.log(`CREATING AUTH POLICY... ${response.status}, ${response.statusText}`);
+  } catch (error) {
+    console.error(error.message);
+  }
+}
+
+// main
+
+const bearer = await getBearer();
+
+const inputArr = JSON.parse(fs.readFileSync(resolve(INPUT_PATH)));
+
+inputArr.forEach(async (item) => {
+  await createS2sAuthPolicy(item);
+});
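The local-exec provisioner added in cmd/utils/terraform.js runs this script from the generated Terraform directory, which also contains the create-s2s.json written by copy-toolchain.js. If you ever need to re-run it by hand, a minimal sketch (directory name and values are placeholders):

```shell-session
$ cd output-1764102115772            # generated directory containing create-s2s.json and create-s2s-script.js
$ export IBMCLOUD_API_KEY='...'
$ export TARGET_TOOLCHAIN_ID='...'   # id of the newly created toolchain
$ node create-s2s-script.js
```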