@automattic/vip 2.32.4 → 2.33.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CONTRIBUTING.md CHANGED
@@ -72,19 +72,39 @@ For help, see: https://nodejs.org/en/docs/inspector
 
  New libraries should generally support both CLI and web contexts, though in some cases that won't make sense (e.g. formatting for CLI output). Ensuring the libraries are useful everywhere will allow us to offer consistent experiences regardless of the interface.
 
+ ### go-search-replace binaries
+
+ Some unit tests require go-search-replace executable binaries to run. Binaries for
+ several OS architectures can be downloaded
+ from https://github.com/Automattic/go-search-replace/releases/
+
+ If, for some reason, you need to compile these binaries yourself, please follow the instructions
+ at https://github.com/Automattic/go-search-replace
+
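For example, fetching one of those binaries from a shell might look roughly like the following sketch; the release tag and asset name are placeholders, so pick the ones matching your OS/architecture from the releases page:

```sh
# Placeholder tag/asset names; substitute the real ones from the releases page.
curl -L -o go-search-replace \
  "https://github.com/Automattic/go-search-replace/releases/download/<tag>/<asset-for-your-os>"
chmod +x go-search-replace
```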
+ ### Generating the types
+
+ If you're an employee of Automattic, you can follow these steps to regenerate the GraphQL types
+ used.
+
+ 1. Get a hold of `schema.gql` and place it in the project root - this is the schema of the endpoint that
+ we communicate with.
+ 2. Run `npm run typescript:codegen:install-dependencies` - this will install the codegen
+ dependencies without updating `package.json`.
+ 3. Run `npm run typescript:codegen:generate` - this will regenerate the types.
+
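Taken together, the regeneration flow above boils down to this short shell sketch (the two npm scripts are the ones named in the steps; the source path for `schema.gql` is just a placeholder):

```sh
# Step 1: place the schema in the project root (wherever you obtained it from).
cp /path/to/schema.gql ./schema.gql

# Step 2: install the codegen dependencies without touching package.json.
npm run typescript:codegen:install-dependencies

# Step 3: regenerate the GraphQL types.
npm run typescript:codegen:generate
```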
  ## Release & Deployment Process
 
  Our release flow for VIP CLI follows this pattern:
 
- **_feature branch -> develop branch -> trunk branch -> NPM release_**
+ **_feature branch -> trunk branch -> NPM release_**
 
  - For feature branches, please follow A8C branch naming conventions (e.g.- `add/data-sync-command`, `fix/subsite-launch-command`, etc.)
  - Include a Changelog for all npm version releases, including any minor or major versions
  - This is a public repository. Please do not include any internal links in PRs, changelogs, testing instructions, etc.
- - Merge changes from your feature branch to the `develop` branch
- - If you are ready to release your changes publicly, merge your changes from the `develop` branch to the `trunk` branch. All changes that are not ready to be public should be feature flagged or stay in the `develop` branch to avoid conflicts when releasing urgent fixes (not recommended).
+ - Merge changes from your feature branch to the `trunk` branch when they are ready to be published publicly.
  - Finally, release your changes as a new minor or major NPM version. Ping in the #vip-platform channel to notify folks of a new release, but please feel free to release your changes without any blockers from the team. Any team member that is part of the Automattic NPM organization can release a new version; if you aren't a member, generic credentials are available in the Secret Store.
 
+ If you need to publish a security release, see [details below](#patching-old-releases).
  ### Changelogs
 
  Changelogs allow customers to keep up with all the changes happening across our VIP Platform. Changelogs for VIP CLI are posted to the [VIP Cloud Changelog P2](https://wpvipchangelog.wordpress.com/), along with the repository’s `README.md`.
@@ -97,6 +117,12 @@ We use a custom pre-publish [script](https://github.com/Automattic/vip/blob/trun
 
  Further checks can be added to this flow as needed.
 
+ ### Versioning Guidelines
+
+ - `patch`: for non-breaking bug fixes and small updates.
+ - `minor`: for new features, bug fixes, and other non-breaking changes.
+ - `major`: for breaking changes.
+
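As a concrete illustration of those bump sizes, using this package's own version number; `npm version` is shown here only as a generic way to apply a bump, and may not be how this repository actually performs it (the release workflow or local release steps handle that):

```sh
# Starting from 2.32.4:
npm version patch   # -> 2.32.5  (bug fixes, small updates)
npm version minor   # -> 2.33.0  (new features, non-breaking changes)
npm version major   # -> 3.0.0   (breaking changes)
```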
  ### New Releases
 
  Prepare the release by making sure that:
@@ -108,6 +134,8 @@ Prepare the release by making sure that:
  1. Make sure not to merge any more changes into `develop` while all the release steps below are in
  progress.
 
+ You can release either using GitHub Actions or locally.
+
  #### Changelog Generator Hint:
 
  In the first step, you'll need to generate a changelog.
@@ -119,14 +147,32 @@ export LAST_RELEASE_DATE=2021-08-25T13:40:00+02
 gh pr list --search "is:merged sort:updated-desc closed:>$LAST_RELEASE_DATE" | sed -e 's/\s\+\S\+\tMERGED.*$//' -e 's/^/- #/'
 ```
 
- Then, let's publish:
 
- 1. Create a pull request that adds the next version's changelog into `develop`. Use the Changelog
+ #### Publishing via GitHub Actions (preferred)
+
+ This is the preferred method for pushing out the latest release. The workflow runs a set of validations, generates a build, bumps the version and tags, publishes to npm, and then bumps to the next dev version.
+
+ 1. Initiate the [release process here](https://github.com/Automattic/vip-cli/actions/workflows/npm-prepare-release.yml).
+ 1. On the right-hand side, select "Run Workflow".
+ 1. Pick your preferred version bump.
+ 1. Click `Run Workflow`.
+ 1. Wait for a pull request to appear. The pull request will update the version number and will be assigned to you.
+ 1. When ready, merge the pull request. This will cause a new version to be [published on npmjs.com](https://www.npmjs.com/package/@automattic/vip).
+ 1. Another pull request will be created to bump to a development version, also assigned to you. Merge it to finish the process.
+
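If you prefer the terminal to the Actions UI, the same workflow can be kicked off with the GitHub CLI. This is only a sketch: the workflow's version-bump input name isn't documented here, so inspect the workflow definition before passing any `-f` inputs.

```sh
# Inspect the workflow definition (including its inputs) before triggering it.
gh workflow view npm-prepare-release.yml --yaml

# Trigger the release workflow on trunk; pass the bump via -f <input>=<value> once you
# know the input name defined by the workflow.
gh workflow run npm-prepare-release.yml --ref trunk
```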
+ #### Note on NPM token
+
+ Publishing via the GitHub Action requires that the `NPM_TOKEN` be set correctly in GitHub Actions secrets. This should be an npm token generated for a bot user on [the npm @automattic org](https://www.npmjs.com/settings/automattic) that has publish access to this repo.
+
+ #### Publishing locally
+
+ To publish locally, follow these steps:
+
+ 1. Create a pull request that adds the next version's changelog into `trunk`. Use the Changelog
 Generator Hint above to generate the changelog, and refer to previous releases to ensure that your
- format matches.
- 1. Create a pull request that merges `develop` to `trunk`.
+ format matches.
 1. Merge it after approval.
- 1. Make sure trunk branch is up-to-date `git pull`.
+ 1. Make sure the `trunk` branch is up-to-date: `git pull`.
 1. Make sure to clean all of your repositories of extra files. Run a dangerous, destructive
 command `git clean -xfd` to do so.
 1. Run `npm install`.
@@ -168,23 +214,3 @@ For these cases:
  1. Follow the release steps outlined above (as a `patch` release).
 
  Then, repeat for any additional versions that we need to patch.
-
- ### go-search-replace binaries
-
- Some unit tests require some go-search-replace executable binary files to run. Binaries files for
- several OS architectures can be downloaded
- from https://github.com/Automattic/go-search-replace/releases/
-
- If, for some reason, you need to compile these binaries yourself, please follow instructions
- at https://github.com/Automattic/go-search-replace
-
- ### Generating the types
-
- If you're an employee of Automattic, you can follow these steps to regenerate the GraphQL types
- used.
-
- 1. Get a hold of `schema.gql` and paste it in project root - this is the schema of the endpoint that
- we communicate with.
- 2. Run `npm run typescript:codegen:install-dependencies` - this will install the codegen
- dependencies without updating `package.json`
- 3. Run `npm run typescript:codegen:generate` - this will regenerate the types
@@ -0,0 +1,58 @@
+ #!/usr/bin/env node
+
+ /**
+ *
+ * @format
+ */
+
+ /**
+ * External dependencies
+ */
+
+ /**
+ * Internal dependencies
+ */
+ "use strict";
+
+ var _command = _interopRequireDefault(require("../lib/cli/command"));
+ var _tracker = require("../lib/tracker");
+ var _backupDb = require("../commands/backup-db");
+ function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
+ const examples = [{
+ usage: 'vip backup db @mysite.develop',
+ description: 'Trigger a new backup for your database of the @mysite.develop environment'
+ }];
+ const appQuery = `
+ id,
+ name,
+ type,
+ organization { id, name },
+ environments{
+ id
+ appId
+ type
+ name
+ primaryDomain { name }
+ uniqueLabel
+ }
+ `;
+ void (0, _command.default)({
+ appContext: true,
+ appQuery,
+ envContext: true,
+ module: 'backup-db',
+ requiredArgs: 0,
+ usage: 'vip backup db'
+ }).examples(examples).argv(process.argv, async (arg, {
+ app,
+ env
+ }) => {
+ const trackerFn = (0, _tracker.makeCommandTracker)('backup_db', {
+ app: app.id,
+ env: env.uniqueLabel
+ });
+ await trackerFn('execute');
+ const cmd = new _backupDb.BackupDBCommand(app, env, trackerFn);
+ await cmd.run();
+ await trackerFn('success');
+ });
@@ -0,0 +1,19 @@
+ #!/usr/bin/env node
+
+ /**
+ * External dependencies
+ */
+
+ /**
+ * Internal dependencies
+ */
+ "use strict";
+
+ var _command = _interopRequireDefault(require("../lib/cli/command"));
+ var _tracker = require("../lib/tracker");
+ function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
+ void (0, _command.default)({
+ usage: 'vip backup'
+ }).command('db', 'Trigger a new backup for your database').example('vip backup db @mysite.develop', 'Trigger a new backup for your database of the @mysite.develop environment').argv(process.argv, async () => {
+ await (0, _tracker.trackEvent)('vip_backup_command_execute');
+ });
@@ -10,9 +10,7 @@
  */
 "use strict";
 
- var _chalk = _interopRequireDefault(require("chalk"));
 var _debug = _interopRequireDefault(require("debug"));
- var _child_process = require("child_process");
 var _tracker = require("../lib/tracker");
 var _command = _interopRequireDefault(require("../lib/cli/command"));
 var _devEnvironmentCore = require("../lib/dev-environment/dev-environment-core");
@@ -24,9 +22,6 @@ function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { de
 * Internal dependencies
 */
 const debug = (0, _debug.default)('@automattic/vip:bin:dev-environment');
-
- // PowerShell command for Windows Docker patch
- const dockerWindowsPathCmd = 'wsl -d docker-desktop bash -c "sysctl -w vm.max_map_count=262144"';
 const examples = [{
 usage: `${_devEnvironment.DEV_ENVIRONMENT_FULL_COMMAND} start`,
 description: 'Starts a local dev environment'
@@ -48,20 +43,6 @@ const examples = [{
 skipWpVersionsCheck: !!opt.skipWpVersionsCheck
 };
 try {
- if (process.platform === 'win32') {
- debug('Windows platform detected. Applying Docker patch...');
- (0, _child_process.exec)(dockerWindowsPathCmd, {
- shell: 'powershell.exe'
- }, (error, stdout) => {
- if (error) {
- debug(error);
- console.log(`${_chalk.default.red('✕')} There was an error while applying the Windows Docker patch.`);
- } else {
- debug(stdout);
- console.log(`${_chalk.default.green('✓')} Docker patch for Windows applied.`);
- }
- });
- }
 await (0, _devEnvironmentCore.startEnvironment)(lando, slug, options);
 const processingTime = Math.ceil((new Date() - startProcessing) / 1000); // in seconds
 const successTrackingInfo = {
@@ -79,6 +79,9 @@ const appQuery = `
 const cmd = new _devEnvSyncSql.DevEnvSyncSQLCommand(app, env, slug, trackerFn);
 // TODO: There's a function called handleCLIException for dev-env that handles exceptions but DevEnvSyncSQLCommand has its own implementation.
 // We should probably use handleCLIException instead?
- await cmd.run();
+ const didCommandRun = await cmd.run();
+ if (!didCommandRun) {
+ console.log('Command canceled by user.');
+ }
 await trackerFn('success');
 });
@@ -34,6 +34,7 @@ const appQuery = `
 id
 appId
 type
+ name
 primaryDomain {
 id
 name
@@ -30,6 +30,7 @@ const appQuery = `
 id
 appId
 type
+ name
 primaryDomain {
 id
 name
@@ -30,6 +30,7 @@ environments{
 id
 appId
 type
+ name
 isK8sResident
 primaryDomain {
 id
package/dist/bin/vip.js CHANGED
@@ -28,7 +28,7 @@ if (_config.default && _config.default.environment !== 'production') {
 const tokenURL = 'https://dashboard.wpvip.com/me/cli/token';
 const runCmd = async function () {
 const cmd = (0, _command.default)();
- cmd.command('logout', 'Logout from your current session').command('app', 'List and modify your VIP applications').command('cache', 'Manage page cache for your VIP applications').command('config', 'Set configuration for your VIP applications').command('dev-env', 'Use local dev-environment').command('export', 'Export data from your VIP application').command('import', 'Import media or SQL files into your VIP applications').command('logs', 'Get logs from your VIP applications').command('search-replace', 'Perform search and replace tasks on files').command('slowlogs', 'Get slowlogs from your VIP applications').command('sync', 'Sync production to a development environment').command('whoami', 'Display details about the currently logged-in user').command('validate', 'Validate your VIP application and environment').command('wp', 'Run WP CLI commands against an environment');
+ cmd.command('logout', 'Logout from your current session').command('app', 'List and modify your VIP applications').command('backup', 'Generate a backup for VIP applications').command('cache', 'Manage page cache for your VIP applications').command('config', 'Set configuration for your VIP applications').command('dev-env', 'Use local dev-environment').command('export', 'Export data from your VIP application').command('import', 'Import media or SQL files into your VIP applications').command('logs', 'Get logs from your VIP applications').command('search-replace', 'Perform search and replace tasks on files').command('slowlogs', 'Get slowlogs from your VIP applications').command('sync', 'Sync production to a development environment').command('whoami', 'Display details about the currently logged-in user').command('validate', 'Validate your VIP application and environment').command('wp', 'Run WP CLI commands against an environment');
  cmd.argv(process.argv);
  };
  function doesArgvHaveAtLeastOneParam(argv, params) {
@@ -10,6 +10,7 @@ var _api = _interopRequireWildcard(require("../lib/api"));
 var exit = _interopRequireWildcard(require("../lib/cli/exit"));
 var _utils = require("../lib/utils");
 var _progress = require("../lib/cli/progress");
+ var _format = require("../lib/cli/format");
  function _getRequireWildcardCache(nodeInterop) { if (typeof WeakMap !== "function") return null; var cacheBabelInterop = new WeakMap(); var cacheNodeInterop = new WeakMap(); return (_getRequireWildcardCache = function (nodeInterop) { return nodeInterop ? cacheNodeInterop : cacheBabelInterop; })(nodeInterop); }
  function _interopRequireWildcard(obj, nodeInterop) { if (!nodeInterop && obj && obj.__esModule) { return obj; } if (obj === null || typeof obj !== "object" && typeof obj !== "function") { return { default: obj }; } var cache = _getRequireWildcardCache(nodeInterop); if (cache && cache.has(obj)) { return cache.get(obj); } var newObj = {}; var hasPropertyDescriptor = Object.defineProperty && Object.getOwnPropertyDescriptor; for (var key in obj) { if (key !== "default" && Object.prototype.hasOwnProperty.call(obj, key)) { var desc = hasPropertyDescriptor ? Object.getOwnPropertyDescriptor(obj, key) : null; if (desc && (desc.get || desc.set)) { Object.defineProperty(newObj, key, desc); } else { newObj[key] = obj[key]; } } } newObj.default = obj; if (cache) { cache.set(obj, newObj); } return newObj; }
  function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
@@ -148,10 +149,6 @@ class BackupDBCommand {
 async run(silent = false) {
 var _this$job4;
 this.silent = silent;
- let noticeMessage = `\n${_chalk.default.yellow('NOTICE: ')}`;
- noticeMessage += 'If a recent database backup does not exist, a new one will be generated for this environment. ';
- noticeMessage += 'Learn more about this: https://docs.wpvip.com/technical-references/vip-dashboard/backups/#2-download-a-full-database-backup \n';
- this.log(noticeMessage);
 await this.loadBackupJob();
 if ((_this$job4 = this.job) !== null && _this$job4 !== void 0 && _this$job4.inProgressLock) {
 this.log('Database backup already in progress...');
@@ -175,7 +172,7 @@ class BackupDBCommand {
 stack: error.stack
 });
 const errMessage = `A new database backup was not generated because a recently generated backup already exists.
- If you would like to run the same command, you can retry on or after: ${retryAfter}
+ If you would like to run the same command, you can retry in ${(0, _format.formatDuration)(new Date(), new Date(retryAfter))}
 Alternatively, you can export the latest existing database backup by running: ${_chalk.default.green('vip @app.env export sql')}, right away.
 Learn more about limitations around generating database backups: https://docs.wpvip.com/technical-references/vip-dashboard/backups/#0-limitations
 `;
@@ -24,6 +24,7 @@ var _utils = require("../lib/utils");
 var _lineByLine = require("../lib/validations/line-by-line");
 var exit = _interopRequireWildcard(require("../lib/cli/exit"));
 var _devEnvImportSql = require("./dev-env-import-sql");
+ var _backupStorageAvailability = require("../lib/backup-storage-availability/backup-storage-availability");
  function _getRequireWildcardCache(nodeInterop) { if (typeof WeakMap !== "function") return null; var cacheBabelInterop = new WeakMap(); var cacheNodeInterop = new WeakMap(); return (_getRequireWildcardCache = function (nodeInterop) { return nodeInterop ? cacheNodeInterop : cacheBabelInterop; })(nodeInterop); }
  function _interopRequireWildcard(obj, nodeInterop) { if (!nodeInterop && obj && obj.__esModule) { return obj; } if (obj === null || typeof obj !== "object" && typeof obj !== "function") { return { default: obj }; } var cache = _getRequireWildcardCache(nodeInterop); if (cache && cache.has(obj)) { return cache.get(obj); } var newObj = {}; var hasPropertyDescriptor = Object.defineProperty && Object.getOwnPropertyDescriptor; for (var key in obj) { if (key !== "default" && Object.prototype.hasOwnProperty.call(obj, key)) { var desc = hasPropertyDescriptor ? Object.getOwnPropertyDescriptor(obj, key) : null; if (desc && (desc.get || desc.set)) { Object.defineProperty(newObj, key, desc); } else { newObj[key] = obj[key]; } } } newObj.default = obj; if (cache) { cache.set(obj, newObj); } return newObj; }
  function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
@@ -92,16 +93,19 @@ class DevEnvSyncSQLCommand {
 get gzFile() {
 return `${this.tmpDir}/sql-export.sql.gz`;
 }
+ async confirmEnoughStorage(job) {
+ const storageAvailability = _backupStorageAvailability.BackupStorageAvailability.createFromDbCopyJob(job);
+ return await storageAvailability.validateAndPromptDiskSpaceWarningForDevEnvBackupImport();
+ }
 
 /**
 * Runs the SQL export command to generate the SQL export from
 * the latest backup
- *
- * @return {Promise<void>} Promise that resolves when the export is complete
 */
 async generateExport() {
 const exportCommand = new _exportSql.ExportSQLCommand(this.app, this.env, {
- outputFile: this.gzFile
+ outputFile: this.gzFile,
+ confirmEnoughStorageHook: this.confirmEnoughStorage.bind(this)
 }, this.track);
 await exportCommand.run();
 }
@@ -161,7 +165,7 @@ class DevEnvSyncSQLCommand {
 * Sequentially runs the commands to export, search-replace, and import the SQL file
 to the local environment
 *
- * @return {Promise<void>} Promise that resolves when the commands are complete
+ * @return {Promise<void>} Promise that resolves to true when the commands are complete. It will return false if the user did not continue during validation prompts.
 */
 async run() {
 try {
@@ -221,6 +225,7 @@ class DevEnvSyncSQLCommand {
 console.log('Importing the SQL file...');
 await this.runImport();
 console.log(`${_chalk.default.green('✓')} SQL file imported`);
+ return true;
 } catch (err) {
 await this.track('error', {
 error_type: 'import_sql_file',
@@ -8,12 +8,14 @@ var _graphqlTag = _interopRequireDefault(require("graphql-tag"));
 var _fs = _interopRequireDefault(require("fs"));
 var _https = _interopRequireDefault(require("https"));
 var _path = _interopRequireDefault(require("path"));
+ var _chalk = _interopRequireDefault(require("chalk"));
 var _api = _interopRequireWildcard(require("../lib/api"));
 var _format = require("../lib/cli/format");
 var _progress = require("../lib/cli/progress");
 var exit = _interopRequireWildcard(require("../lib/cli/exit"));
 var _utils = require("../lib/utils");
 var _backupDb = require("./backup-db");
+ var _backupStorageAvailability = require("../lib/backup-storage-availability/backup-storage-availability");
  function _getRequireWildcardCache(nodeInterop) { if (typeof WeakMap !== "function") return null; var cacheBabelInterop = new WeakMap(); var cacheNodeInterop = new WeakMap(); return (_getRequireWildcardCache = function (nodeInterop) { return nodeInterop ? cacheNodeInterop : cacheBabelInterop; })(nodeInterop); }
  function _interopRequireWildcard(obj, nodeInterop) { if (!nodeInterop && obj && obj.__esModule) { return obj; } if (obj === null || typeof obj !== "object" && typeof obj !== "function") { return { default: obj }; } var cache = _getRequireWildcardCache(nodeInterop); if (cache && cache.has(obj)) { return cache.get(obj); } var newObj = {}; var hasPropertyDescriptor = Object.defineProperty && Object.getOwnPropertyDescriptor; for (var key in obj) { if (key !== "default" && Object.prototype.hasOwnProperty.call(obj, key)) { var desc = hasPropertyDescriptor ? Object.getOwnPropertyDescriptor(obj, key) : null; if (desc && (desc.get || desc.set)) { Object.defineProperty(newObj, key, desc); } else { newObj[key] = obj[key]; } } } newObj.default = obj; if (cache) { cache.set(obj, newObj); } return newObj; }
  function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
@@ -189,6 +191,7 @@ class ExportSQLCommand {
 PREPARE: 'prepare',
 CREATE: 'create',
 DOWNLOAD_LINK: 'downloadLink',
+ CONFIRM_ENOUGH_STORAGE: 'confirmEnoughStorage',
 DOWNLOAD: 'download'
 };
 /**
@@ -203,6 +206,7 @@ class ExportSQLCommand {
 this.app = app;
 this.env = env;
 this.outputFile = typeof options.outputFile === 'string' ? (0, _utils.getAbsolutePath)(options.outputFile) : null;
+ this.confirmEnoughStorageHook = options.confirmEnoughStorageHook;
 this.generateBackup = options.generateBackup || false;
 this.progressTracker = new _progress.ProgressTracker([{
 id: this.steps.PREPARE,
@@ -210,6 +214,9 @@ class ExportSQLCommand {
 }, {
 id: this.steps.CREATE,
 name: 'Creating backup copy'
+ }, {
+ id: this.steps.CONFIRM_ENOUGH_STORAGE,
+ name: "Checking if there's enough storage"
 }, {
 id: this.steps.DOWNLOAD_LINK,
 name: 'Requesting download link'
@@ -313,13 +320,23 @@ class ExportSQLCommand {
 }
 async runBackupJob() {
 const cmd = new _backupDb.BackupDBCommand(this.app, this.env);
+ let noticeMessage = `\n${_chalk.default.yellow('NOTICE: ')}`;
+ noticeMessage += 'If a recent database backup does not exist, a new one will be generated for this environment. ';
+ noticeMessage += 'Learn more about this: https://docs.wpvip.com/technical-references/vip-dashboard/backups/#2-download-a-full-database-backup \n';
+ this.log(noticeMessage);
 await cmd.run(false);
 }
+ async confirmEnoughStorage(job) {
+ if (this.confirmEnoughStorageHook) {
+ return await this.confirmEnoughStorageHook(job);
+ }
+ const storageAvailability = _backupStorageAvailability.BackupStorageAvailability.createFromDbCopyJob(job);
+ return await storageAvailability.validateAndPromptDiskSpaceWarningForBackupImport();
+ }
 
 /**
 * Sequentially runs the steps of the export workflow
 *
- * @return {Promise} A promise which resolves to void
 */
 async run() {
 if (this.outputFile) {
@@ -384,6 +401,16 @@ class ExportSQLCommand {
 this.progressTracker.stepSuccess(this.steps.PREPARE);
 await (0, _utils.pollUntil)(this.getExportJob.bind(this), EXPORT_SQL_PROGRESS_POLL_INTERVAL, this.isCreated.bind(this));
 this.progressTracker.stepSuccess(this.steps.CREATE);
+ const storageConfirmed = await this.progressTracker.handleContinuePrompt(async () => {
+ return await this.confirmEnoughStorage(await this.getExportJob());
+ }, 3);
+ if (storageConfirmed) {
+ this.progressTracker.stepSuccess(this.steps.CONFIRM_ENOUGH_STORAGE);
+ } else {
+ this.progressTracker.stepFailed(this.steps.CONFIRM_ENOUGH_STORAGE);
+ this.stopProgressTracker();
+ exit.withError('Command canceled by user.');
+ }
 const url = await generateDownloadLink(this.app.id, this.env.id, latestBackup.id);
 this.progressTracker.stepSuccess(this.steps.DOWNLOAD_LINK);
 
@@ -0,0 +1,123 @@
+ "use strict";
+
+ Object.defineProperty(exports, "__esModule", {
+ value: true
+ });
+ exports.BackupStorageAvailability = void 0;
+ var _shelljs = require("shelljs");
+ var _path = _interopRequireDefault(require("path"));
+ var _xdgBasedir = _interopRequireDefault(require("xdg-basedir"));
+ var _os = _interopRequireDefault(require("os"));
+ var _checkDiskSpace = _interopRequireDefault(require("check-disk-space"));
+ var _enquirer = require("enquirer");
+ var _format = require("../cli/format");
+ var _dockerMachineNotFoundError = require("./docker-machine-not-found-error");
+ function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
+ const oneGiBInBytes = 1024 * 1024 * 1024;
+ class BackupStorageAvailability {
+ constructor(archiveSize) {
+ this.archiveSize = archiveSize;
+ }
+ static createFromDbCopyJob(job) {
+ var _job$metadata;
+ const bytesWrittenMeta = (_job$metadata = job.metadata) === null || _job$metadata === void 0 ? void 0 : _job$metadata.find(meta => (meta === null || meta === void 0 ? void 0 : meta.name) === 'bytesWritten');
+ if (!(bytesWrittenMeta !== null && bytesWrittenMeta !== void 0 && bytesWrittenMeta.value)) {
+ throw new Error('Meta not found');
+ }
+ return new BackupStorageAvailability(Number(bytesWrittenMeta.value));
+ }
+ getDockerStorageKiBRaw() {
+ return (0, _shelljs.exec)(`docker run --rm alpine df -k`, {
+ silent: true
+ }).grep(/\/dev\/vda1/).head({
+ '-n': 1
+ }).replace(/\s+/g, ' ').split(' ')[3];
+ }
+ getDockerStorageAvailable() {
+ const kiBLeft = this.getDockerStorageKiBRaw();
+ if (!kiBLeft || Number.isNaN(Number(kiBLeft))) {
+ throw new _dockerMachineNotFoundError.DockerMachineNotFoundError();
+ }
+ return Number(kiBLeft) * 1024;
+ }
+ bytesToHuman(bytes) {
+ return (0, _format.formatMetricBytes)(bytes);
+ }
+ async getStorageAvailableInVipPath() {
+ const vipDir = _path.default.join(_xdgBasedir.default.data ?? _os.default.tmpdir(), 'vip');
+ const diskSpace = await (0, _checkDiskSpace.default)(vipDir);
+ return diskSpace.free;
+ }
+ getReserveSpace() {
+ return oneGiBInBytes;
+ }
+ getSqlSize() {
+ // We estimated that it'd be about 3.5x the archive size.
+ return this.archiveSize * 3.5;
+ }
+ getArchiveSize() {
+ return this.archiveSize;
+ }
+ getStorageRequiredInMainMachine() {
+ return this.getArchiveSize() + this.getSqlSize() + this.getReserveSpace();
+ }
+ getStorageRequiredInDockerMachine() {
+ return this.getSqlSize() + this.getReserveSpace();
+ }
+ async isStorageAvailableInMainMachine() {
+ return (await this.getStorageAvailableInVipPath()) > this.getStorageRequiredInMainMachine();
+ }
+ isStorageAvailableInDockerMachine() {
+ return this.getDockerStorageAvailable() > this.getStorageRequiredInDockerMachine();
+ }
+
+ // eslint-disable-next-line id-length
+ async validateAndPromptDiskSpaceWarningForBackupImport() {
+ const isStorageAvailable = (await this.getStorageAvailableInVipPath()) > this.getArchiveSize();
+ if (!isStorageAvailable) {
+ const storageRequired = this.getArchiveSize();
+ const confirmPrompt = new _enquirer.Confirm({
+ message: `We recommend that you have at least ${this.bytesToHuman(storageRequired)} of free space in your machine to download this database backup. Do you still want to continue with downloading the database backup?`
+ });
+ return await confirmPrompt.run();
+ }
+ return true;
+ }
+
+ // eslint-disable-next-line id-length
+ async validateAndPromptDiskSpaceWarningForDevEnvBackupImport() {
+ let storageAvailableInMainMachinePrompted = false;
+ if (!(await this.isStorageAvailableInMainMachine())) {
+ const storageRequired = this.getStorageRequiredInMainMachine();
+ const storageAvailableInVipPath = this.bytesToHuman(await this.getStorageAvailableInVipPath());
+ const confirmPrompt = new _enquirer.Confirm({
+ message: `We recommend that you have at least ${this.bytesToHuman(storageRequired)} of free space in your machine to import this database backup. We estimate that you currently have ${storageAvailableInVipPath} of space in your machine.
+ Do you still want to continue with importing the database backup?
+ `
+ });
+ storageAvailableInMainMachinePrompted = await confirmPrompt.run();
+ if (!storageAvailableInMainMachinePrompted) {
+ return false;
+ }
+ }
+ try {
+ if (!this.isStorageAvailableInDockerMachine()) {
+ const storageRequired = this.getStorageRequiredInDockerMachine();
+ const storageAvailableInDockerMachine = this.bytesToHuman(this.getDockerStorageAvailable());
+ const confirmPrompt = new _enquirer.Confirm({
+ message: `We recommend that you have at least ${this.bytesToHuman(storageRequired)} of free space in your Docker machine to import this database backup. We estimate that you currently have ${storageAvailableInDockerMachine} of space in your machine.
+ Do you still want to continue with importing the database backup?`
+ });
+ return await confirmPrompt.run();
+ }
+ } catch (error) {
+ if (error instanceof _dockerMachineNotFoundError.DockerMachineNotFoundError) {
+ // skip storage available check
+ return true;
+ }
+ throw error;
+ }
+ return true;
+ }
+ }
+ exports.BackupStorageAvailability = BackupStorageAvailability;
@@ -0,0 +1,12 @@
+ "use strict";
+
+ Object.defineProperty(exports, "__esModule", {
+ value: true
+ });
+ exports.DockerMachineNotFoundError = void 0;
+ class DockerMachineNotFoundError extends Error {
+ constructor() {
+ super('Docker machine not found');
+ }
+ }
+ exports.DockerMachineNotFoundError = DockerMachineNotFoundError;