@automattic/vip 2.32.0 → 2.32.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/.prettierignore CHANGED
@@ -1,6 +1,4 @@
- *.generated.d.ts
  /CHANGELOG.md
  /__fixtures__/
  /dist/
  /flow-typed/
- /src/graphqlTypes.d.ts
package/CHANGELOG.md CHANGED
@@ -1,5 +1,19 @@
  ## Changelog
 
+ ### 2.32.2
+
+ - #1443 On-demand Database Backup: Use a less error-prone way of detecting if we're rate limited
+ - #1434 Add more clarifying steps for making a release
+ - #1435 Remove enums from GraphQL codegen and make sure that prettier actually runs
+ - #1433 Redeploy codegen for GraphQL queries
+ - #1432 Revert "Add codegen for GraphQL queries (#1429)"
+ - #1304 Generate docs
+
+ ### 2.32.1
+
+ - #1445 chore(deps): update vulnerable dependencies to fix ReDoS in semver
+ - #1444 chore(deps): update Lando
+
  ### 2.32.0
 
  - #1411 feature(export): Add ability to generate a new backup when exporting SQL
package/CONTRIBUTING.md CHANGED
@@ -97,26 +97,23 @@ We use a custom pre-publish [script](https://github.com/Automattic/vip/blob/trun
 
  Further checks can be added to this flow as needed.
 
- ### Pre-publish Tasks
-
- As part of the publish flow, we run the `prepareConfig:publish` task on `prepack`. This copies over "production" config values to your working copy to make sure the release includes those instead of development values.
-
- We use `prepack` because:
-
- - `prepareConfig:local` runs on `npm build` and we want to make sure those values are overriden.
- - This is the latest npm event that we can run on before publishing. (Note: we tried `prepublishOnly` but files added during that step [don't get included in the build](https://github.com/Automattic/vip/commit/c7dabe1b0f73ec9e6e8c05ccff0c41281e4cd5e8)).
-
  ### New Releases
 
  Prepare the release by making sure that:
 
  1. All relevant PRs have been merged.
  1. The release has been tested across macOS, Windows, and Linux.
- 1. The [changelog](https://github.com/Automattic/vip/blob/trunk/CHANGELOG.md) has been updated on `trunk`.
- 1. All tests pass and your working directory is clean (we have pre-publish checks to catch this, just-in-case).
+ 1. All tests pass and your working directory is clean (we have pre-publish checks to catch this,
+ just in case).
+ 1. Make sure not to merge any more changes into `develop` while all the release steps below are in
+ progress.
 
  #### Changelog Generator Hint:
 
+ In the first step, you'll need to generate a changelog.
+
+ Run the following and copy the output somewhere.
+
  ```
  export LAST_RELEASE_DATE=2021-08-25T13:40:00+02
  gh pr list --search "is:merged sort:updated-desc closed:>$LAST_RELEASE_DATE" | sed -e 's/\s\+\S\+\tMERGED.*$//' -e 's/^/- #/'
@@ -124,15 +121,28 @@ gh pr list --search "is:merged sort:updated-desc closed:>$LAST_RELEASE_DATE" | s
 
  Then, let's publish:
 
- 1. Make sure trunk branch is up to date `git pull`
+ 1. Create a pull request that adds the next version's changelog into `develop`. Use the Changelog
+ Generator Hint above to generate the changelog, and refer to previous releases to ensure that your
+ format matches.
+ 1. Create a pull request that merges `develop` to `trunk`.
+ 1. Merge it after approval.
+ 1. Make sure the trunk branch is up to date: `git pull`.
+ 1. Make sure to clean all of your repositories of extra files. Run a dangerous, destructive
+ command `git clean -xfd` to do so.
+ 1. Run `npm install`.
  1. Set the version (via `npm version minor` or `npm version major` or `npm version patch`)
  1. For most regular releases, this will be `npm version minor`.
  1. Push the tag to GitHub (`git push --tags`)
  1. Push the trunk branch `git push`
  1. Make sure you're part of the Automattic organization in npm
- 1. Publish the release to npm (`npm publish --access public`) the script will do some extra checks (node version, branch, etc) to ensure everything is correct. If all looks good, the new version will be published and you can proceed.
- 1. Edit [the release on GitHub](https://github.com/Automattic/vip/releases) to include a description of the changes and publish (this can just copy the details from the changelog).
- 1. Push `trunk` changes (mostly the version bump) to `develop` (`git checkout develop && git merge trunk` )
+ 1. Publish the release to npm (`npm publish --access public`); the script will do some extra checks
+ (node version, branch, etc.) to ensure everything is correct. If all looks good, the new version
+ will be published and you can proceed.
+ 1. Edit [the release on GitHub](https://github.com/Automattic/vip/releases) to include a description
+ of the changes and publish (this can just copy the details from the changelog).
+ 1. Push `trunk` changes (mostly the version bump)
+ to `develop` (`git checkout develop && git merge trunk`). There's no need to use a pull request
+ to do so.
 
  Once released, it's worth running `npm i -g @automattic/vip` to install / upgrade the released version to make sure everything looks good.
 
@@ -161,6 +171,20 @@ Then, repeat for any additional versions that we need to patch.
 
  ### go-search-replace binaries
 
- Some unit tests require some go-search-replace executable binary files to run. Binaries files for several OS architectures can be downloaded from https://github.com/Automattic/go-search-replace/releases/
+ Some unit tests require go-search-replace executable binary files to run. Binary files for
+ several OS architectures can be downloaded
+ from https://github.com/Automattic/go-search-replace/releases/
+
+ If, for some reason, you need to compile these binaries yourself, please follow the instructions
+ at https://github.com/Automattic/go-search-replace
+
+ ### Generating the types
+
+ If you're an employee of Automattic, you can follow these steps to regenerate the GraphQL types
+ used.
 
- If, for some reason, you need to compile these binaries yourself, please follow instructions at https://github.com/Automattic/go-search-replace
+ 1. Get a hold of `schema.gql` and paste it in the project root - this is the schema of the endpoint that
+ we communicate with.
+ 2. Run `npm run typescript:codegen:install-dependencies` - this will install the codegen
+ dependencies without updating `package.json`.
+ 3. Run `npm run typescript:codegen:generate` - this will regenerate the types.
@@ -34,8 +34,6 @@ const CREATE_DB_BACKUP_JOB_MUTATION = (0, _graphqlTag.default)`
  }
  }
  `;
-
- // TODO: Replace this with the codegen
  exports.CREATE_DB_BACKUP_JOB_MUTATION = CREATE_DB_BACKUP_JOB_MUTATION;
  const DB_BACKUP_JOB_STATUS_QUERY = (0, _graphqlTag.default)`
  query AppBackupJobStatus($appId: Int!, $envId: Int!) {
@@ -63,6 +61,7 @@ const DB_BACKUP_JOB_STATUS_QUERY = (0, _graphqlTag.default)`
  `;
  exports.DB_BACKUP_JOB_STATUS_QUERY = DB_BACKUP_JOB_STATUS_QUERY;
  async function getBackupJob(appId, envId) {
+ var _app$environments, _app$environments$, _app$environments$$jo;
  const api = await (0, _api.default)();
  const response = await api.query({
  query: DB_BACKUP_JOB_STATUS_QUERY,
@@ -74,13 +73,10 @@ async function getBackupJob(appId, envId) {
  });
  const {
  data: {
- app: {
- environments
- }
+ app
  }
  } = response;
- const job = environments[0].jobs[0];
- return job || null;
+ return app === null || app === void 0 ? void 0 : (_app$environments = app.environments) === null || _app$environments === void 0 ? void 0 : (_app$environments$ = _app$environments[0]) === null || _app$environments$ === void 0 ? void 0 : (_app$environments$$jo = _app$environments$.jobs) === null || _app$environments$$jo === void 0 ? void 0 : _app$environments$$jo[0];
  }
  async function createBackupJob(appId, envId) {
  // Disable global error handling so that we can handle errors ourselves
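The compiled return statement above is transpiler output; in source terms, the `getBackupJob` change amounts to replacing a throwing lookup with an optional chain. A minimal sketch of that behavioural difference (the helper below is illustrative, not the package's actual source file):

```js
// Sketch only: the new compiled code is equivalent to an optional chain over the
// GraphQL response, so a missing environment or job yields undefined instead of throwing.
function firstBackupJob( response ) {
	const { data: { app } } = response;
	return app?.environments?.[ 0 ]?.jobs?.[ 0 ];
}

// Old behaviour: `environments[0].jobs[0]` threw a TypeError when no environments existed.
console.log( firstBackupJob( { data: { app: { environments: [] } } } ) ); // undefined
```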
@@ -138,10 +134,10 @@ class BackupDBCommand {
  this.progressTracker.stopPrinting();
  }
  async loadBackupJob() {
- var _this$job, _this$job$metadata$fi, _this$job2, _this$job2$progress, _this$job3;
- this.job = await getBackupJob(this.app.id, this.env.id);
- this.backupName = ((_this$job = this.job) === null || _this$job === void 0 ? void 0 : (_this$job$metadata$fi = _this$job.metadata.find(meta => meta.name === 'backupName')) === null || _this$job$metadata$fi === void 0 ? void 0 : _this$job$metadata$fi.value) || 'Unknown';
- this.jobStatus = (_this$job2 = this.job) === null || _this$job2 === void 0 ? void 0 : (_this$job2$progress = _this$job2.progress) === null || _this$job2$progress === void 0 ? void 0 : _this$job2$progress.status;
+ var _this$app$id, _this$env$id, _this$job$metadata$fi, _this$job, _this$job$metadata, _this$job$metadata$fi2, _this$job$progress$st, _this$job2, _this$job2$progress, _this$job3;
+ this.job = await getBackupJob((_this$app$id = this.app.id) !== null && _this$app$id !== void 0 ? _this$app$id : 0, (_this$env$id = this.env.id) !== null && _this$env$id !== void 0 ? _this$env$id : 0);
+ this.backupName = (_this$job$metadata$fi = (_this$job = this.job) === null || _this$job === void 0 ? void 0 : (_this$job$metadata = _this$job.metadata) === null || _this$job$metadata === void 0 ? void 0 : (_this$job$metadata$fi2 = _this$job$metadata.find(meta => (meta === null || meta === void 0 ? void 0 : meta.name) === 'backupName')) === null || _this$job$metadata$fi2 === void 0 ? void 0 : _this$job$metadata$fi2.value) !== null && _this$job$metadata$fi !== void 0 ? _this$job$metadata$fi : 'Unknown';
+ this.jobStatus = (_this$job$progress$st = (_this$job2 = this.job) === null || _this$job2 === void 0 ? void 0 : (_this$job2$progress = _this$job2.progress) === null || _this$job2$progress === void 0 ? void 0 : _this$job2$progress.status) !== null && _this$job$progress$st !== void 0 ? _this$job$progress$st : undefined;
  if ((_this$job3 = this.job) !== null && _this$job3 !== void 0 && _this$job3.completedAt) {
  this.jobAge = (new Date().getTime() - new Date(this.job.completedAt).getTime()) / 1000 / 60;
  } else {
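As with the previous hunk, these assignments are compiled output; roughly, `loadBackupJob` now falls back to `0` for missing IDs and null-checks each step of the metadata and progress lookups. A small illustrative sketch of those lookups (the helper names are made up for the example):

```js
// Sketch of the null-safe lookups the compiled assignments above correspond to.
const backupNameFrom = job =>
	job?.metadata?.find( meta => meta?.name === 'backupName' )?.value ?? 'Unknown';
const jobStatusFrom = job => job?.progress?.status ?? undefined;

console.log( backupNameFrom( undefined ) );         // 'Unknown'
console.log( jobStatusFrom( { progress: null } ) ); // undefined
```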
@@ -161,49 +157,53 @@ class BackupDBCommand {
  this.log('Database backup already in progress...');
  } else {
  try {
+ var _this$app$id2, _this$env$id2;
  this.log('Generating a new database backup...');
  this.progressTracker.stepRunning(this.steps.PREPARE);
  this.progressTracker.startPrinting();
- await createBackupJob(this.app.id, this.env.id);
- } catch (e) {
- var _err$message;
- const err = e;
+ await createBackupJob((_this$app$id2 = this.app.id) !== null && _this$app$id2 !== void 0 ? _this$app$id2 : 0, (_this$env$id2 = this.env.id) !== null && _this$env$id2 !== void 0 ? _this$env$id2 : 0);
+ } catch (err) {
+ var _error$graphQLErrors, _graphQLError$extensi;
+ const error = err;
+ const graphQLError = (_error$graphQLErrors = error.graphQLErrors) === null || _error$graphQLErrors === void 0 ? void 0 : _error$graphQLErrors[0];
  this.progressTracker.stepFailed(this.steps.PREPARE);
  this.stopProgressTracker();
- if ((_err$message = err.message) !== null && _err$message !== void 0 && _err$message.includes('Database backups limit reached')) {
+ if ((graphQLError === null || graphQLError === void 0 ? void 0 : (_graphQLError$extensi = graphQLError.extensions) === null || _graphQLError$extensi === void 0 ? void 0 : _graphQLError$extensi.errorHttpCode) === 429) {
+ const retryAfter = graphQLError.extensions.retryAfter;
  await this.track('error', {
  error_type: 'rate_limit_exceeded',
- error_message: `Couldn't create a new database backup job: ${err === null || err === void 0 ? void 0 : err.message}`,
- stack: err === null || err === void 0 ? void 0 : err.stack
+ error_message: `Couldn't create a new database backup job: ${error.message}`,
+ stack: error.stack
  });
- let errMessage = err.message.replace('Database backups limit reached', 'A new database backup was not generated because a recently generated backup already exists.');
- errMessage = errMessage.replace('Retry after', '\nIf you would like to run the same command, you can retry on or after:');
- errMessage += `\nAlternatively, you can export the latest existing database backup by running: ${_chalk.default.green('vip @app.env export sql')}, right away.`;
- errMessage += '\nLearn more about limitations around generating database backups: https://docs.wpvip.com/technical-references/vip-dashboard/backups/#0-limitations \n';
+ const errMessage = `A new database backup was not generated because a recently generated backup already exists.
+ If you would like to run the same command, you can retry on or after: ${retryAfter}
+ Alternatively, you can export the latest existing database backup by running: ${_chalk.default.green('vip @app.env export sql')}, right away.
+ Learn more about limitations around generating database backups: https://docs.wpvip.com/technical-references/vip-dashboard/backups/#0-limitations
+ `;
  exit.withError(errMessage);
  }
  await this.track('error', {
  error_type: 'db_backup_job_creation_failed',
- error_message: `Database Backup job creation failed: ${err === null || err === void 0 ? void 0 : err.message}`,
- stack: err === null || err === void 0 ? void 0 : err.stack
+ error_message: `Database Backup job creation failed: ${error.message}`,
+ stack: error.stack
  });
- exit.withError(`Couldn't create a new database backup job: ${err === null || err === void 0 ? void 0 : err.message}`);
+ exit.withError(`Couldn't create a new database backup job: ${error.message}`);
  }
  }
  this.progressTracker.stepSuccess(this.steps.PREPARE);
  this.progressTracker.stepRunning(this.steps.GENERATE);
  try {
  await (0, _utils.pollUntil)(this.loadBackupJob.bind(this), DB_BACKUP_PROGRESS_POLL_INTERVAL, this.isDone.bind(this));
- } catch (e) {
- const err = e;
+ } catch (err) {
+ const error = err;
  this.progressTracker.stepFailed(this.steps.GENERATE);
  this.stopProgressTracker();
  await this.track('error', {
  error_type: 'db_backup_job_failed',
- error_message: `Database Backup job failed: ${err === null || err === void 0 ? void 0 : err.message}`,
- stack: err === null || err === void 0 ? void 0 : err.stack
+ error_message: `Database Backup job failed: ${error.message}`,
+ stack: error.stack
  });
- exit.withError(`Failed to create new database backup: ${err === null || err === void 0 ? void 0 : err.message}`);
+ exit.withError(`Failed to create new database backup: ${error.message}`);
  }
  this.progressTracker.stepSuccess(this.steps.GENERATE);
  this.stopProgressTracker();
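This hunk is the compiled form of #1443's headline change: rate limiting is now detected from the first GraphQL error's `extensions.errorHttpCode` rather than by string-matching the error message, and `extensions.retryAfter` feeds the user-facing message. A short un-transpiled sketch of the check (the error shape is inferred from the compiled code above; the sample values are hypothetical):

```js
// Sketch only: detect an HTTP 429 on an Apollo-style error carrying graphQLErrors[].
function isRateLimited( error ) {
	const graphQLError = error.graphQLErrors?.[ 0 ];
	return graphQLError?.extensions?.errorHttpCode === 429;
}

// Hypothetical error object for illustration.
const sampleError = {
	message: 'Database backups limit reached',
	graphQLErrors: [ { extensions: { errorHttpCode: 429, retryAfter: '2023-01-01T00:00:00Z' } } ],
};
console.log( isRateLimited( sampleError ) ); // true
```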
@@ -51,6 +51,7 @@ var _default = async (path, options = {}) => {
  'Content-Type': 'application/json',
  ...((_options$headers = options.headers) !== null && _options$headers !== void 0 ? _options$headers : {})
  },
+ // eslint-disable-next-line @typescript-eslint/no-unsafe-assignment
  body: typeof options.body === 'object' ? JSON.stringify(options.body) : options.body
  });
  };
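For context, the line the new eslint directive covers serializes object request bodies before they are sent; the same pattern in isolation (the helper name here is illustrative, not part of the package):

```js
// Illustrative only: mirrors the body handling of the fetch wrapper above.
const serializeBody = body => ( typeof body === 'object' ? JSON.stringify( body ) : body );

console.log( serializeBody( { query: '{ apps { id } }' } ) ); // '{"query":"{ apps { id } }"}'
console.log( serializeBody( 'already a string' ) );           // 'already a string'
```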
@@ -0,0 +1 @@
+ "use strict";
@@ -0,0 +1,151 @@
+ const { spawn } = require( 'child_process' );
+ const { once } = require( 'events' );
+
+ async function runCommand( subcommands ) {
+ 	const childProcess = spawn( 'vip', subcommands );
+
+ 	let output = '';
+
+ 	for await ( const data of childProcess.stdout ) {
+ 		output += data.toString();
+ 	}
+ 	for await ( const data of childProcess.stderr ) {
+ 		output += data.toString();
+ 	}
+
+ 	const [ exitCode ] = await once( childProcess, 'exit' );
+
+ 	if ( exitCode !== 0 ) {
+ 		console.log( 'o', output );
+ 		throw new Error( `Script exited with code ${ exitCode }` );
+ 	}
+
+ 	return output.trim();
+ }
+
+ const USAGE_REGEXP = /Usage: (.*)/;
+ const COMMAND_REGEXP = /(\S+)\s+(.*)/;
+ const OPTION_REGEXP = /(-\S, --\S+)\s+(.*)/;
+
+ const SECTION_COMMAND = 'commands';
+ const SECTION_OPTIONS = 'options';
+ const SECTION_EXAMPLES = 'examples';
+
+ const parseOutput = output => {
+ 	const result = {};
+
+ 	const lines = output.split( '\n' );
+ 	let currentSection = '';
+
+ 	for ( let lineIx = 0; lineIx < lines.length; lineIx++ ) {
+ 		const line = lines[ lineIx ].trim();
+ 		if ( ! line ) {
+ 			continue;
+ 		}
+ 		if ( line.startsWith( 'Usage:' ) ) {
+ 			result.usage = line.match( USAGE_REGEXP )[ 1 ];
+ 			continue;
+ 		}
+
+ 		if ( line.startsWith( 'Commands:' ) ) {
+ 			result.commands = [];
+ 			currentSection = SECTION_COMMAND;
+ 			continue;
+ 		}
+ 		if ( line.startsWith( 'Options:' ) ) {
+ 			result.options = [];
+ 			currentSection = SECTION_OPTIONS;
+ 			continue;
+ 		}
+ 		if ( line.startsWith( 'Examples:' ) ) {
+ 			result.examples = [];
+ 			result.examplesRaw = '';
+ 			currentSection = SECTION_EXAMPLES;
+ 			continue;
+ 		}
+
+ 		if ( currentSection === SECTION_COMMAND ) {
+ 			const [ , command, description ] = line.match( COMMAND_REGEXP );
+ 			result.commands.push( {
+ 				command,
+ 				description,
+ 			} );
+ 			continue;
+ 		}
+ 		if ( currentSection === SECTION_OPTIONS ) {
+ 			if ( line.match( OPTION_REGEXP ) ) {
+ 				const [ , option, description ] = line.match( OPTION_REGEXP );
+ 				result.options.push( {
+ 					option,
+ 					description,
+ 				} );
+ 			} else {
+ 				console.log( 'Unknown option', line );
+ 			}
+ 			continue;
+ 		}
+ 		if ( currentSection === SECTION_EXAMPLES ) {
+ 			let description = '';
+ 			while ( ! lines[ lineIx ].trim().startsWith( '$' ) ) {
+ 				const descriptionLine = lines[ lineIx ].trim();
+ 				if ( description ) {
+ 					description += '\n';
+ 				}
+ 				if ( result.examplesRaw ) {
+ 					result.examplesRaw += '\n';
+ 				}
+ 				result.examplesRaw += descriptionLine;
+ 				description += descriptionLine;
+ 				lineIx++;
+ 			}
+ 			const usage = lines[ lineIx ] && lines[ lineIx ].trim();
+ 			result.examplesRaw += '\n' + usage;
+ 			result.examples.push( {
+ 				description,
+ 				usage,
+ 			} );
+ 		}
+ 	}
+
+ 	return result;
+ };
+
+ const processCommand = async subcommands => {
+ 	const fullCommand = subcommands.join( ' ' );
+ 	console.error( 'Processing', fullCommand, '...' );
+
+ 	const output = await runCommand( subcommands.concat( [ '--help' ] ) );
+ 	const parsedOutput = parseOutput( output );
+
+ 	const commandCount = parsedOutput.commands?.length || 0;
+ 	const commandOutputs = [];
+ 	for ( let commandIx = 0; commandIx < commandCount; commandIx++ ) {
+ 		const element = parsedOutput.commands[ commandIx ];
+ 		// otherwise the parallel run will randomly fail on too many requests
+ 		// eslint-disable-next-line no-await-in-loop
+ 		commandOutputs.push( await processCommand( subcommands.concat( [ element.command ] ) ) );
+ 	}
+ 	console.error( `Processed ${ fullCommand } -> ${ commandOutputs.length } subcommands` );
+ 	for ( let commandIx = 0; commandIx < commandCount; commandIx++ ) {
+ 		const element = parsedOutput.commands[ commandIx ];
+ 		const commandOutput = commandOutputs[ commandIx ];
+ 		commandOutput.name = element.command;
+ 		commandOutput.description = element.description;
+ 		parsedOutput.commands[ commandIx ] = commandOutput;
+ 	}
+
+ 	return parsedOutput;
+ };
+
+ const main = async () => {
+ 	const version = await runCommand( [ '--version' ] );
+
+ 	console.error( 'triggering command processing...' );
+ 	const result = await processCommand( [] );
+ 	console.error( 'command processing done' );
+ 	result.version = version;
+
+ 	console.log( JSON.stringify( result, null, 2 ) );
+ };
+
+ main();
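The new file above (added for #1304 "Generate docs", judging by the changelog) walks `vip --help` recursively and prints a JSON command tree to stdout. A hedged sketch of what `parseOutput` returns for a made-up help screen (the sample text and values are illustrative, not real CLI output):

```js
// Hypothetical --help text; the real vip output may differ.
const sampleHelp = [
	'Usage: vip app [options] [command]',
	'',
	'Commands:',
	'  list    List your VIP applications',
	'',
	'Options:',
	'  -h, --help  Output usage information',
].join( '\n' );

// parseOutput( sampleHelp ) would produce roughly:
// {
//   usage: 'vip app [options] [command]',
//   commands: [ { command: 'list', description: 'List your VIP applications' } ],
//   options: [ { option: '-h, --help', description: 'Output usage information' } ],
// }
```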