parallel_tests 3.7.3 → 4.7.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 042cf590688332180e07fcf6472cc9b88b740276df0511677cf77f7fe649fa2e
- data.tar.gz: cb4fd166b030e16574bf5d4790e5d1e0a3d57001d5474ab81c3d54613af976e2
+ metadata.gz: e432543f3ce9e19888f91f4a11e14b073305a7f3fa8fa69a31cd3db96606d187
+ data.tar.gz: 1c78c9c74dbbef031ca758d3bec296e4b5d74e7beb8108f604679a45899db070
  SHA512:
- metadata.gz: 98e679733a74273ac77e71db47bc3af3f0f5a928dab351be9419644a0e7fe2bc004a81c929928ef0bd98ea2b8e5b63d1ab9fadcf75e22d098a65c5164343fe0b
- data.tar.gz: 2ce2dd070733cd0bb61d0cb610f886a5590d5e267cc43613ab7d08fc251d989ebd4a58ad2f0a9c88478499fe55ff4cb7386fabc2eba8ee7fb3440d4899594654
+ metadata.gz: 95cf6bc8c16eca666478417a73d674b5707506c798f3418bbfee339c27de173b5284de9e53d4120ebb4be6ee559f0a042b64e515bde05e4d7e3f08ff568baf21
+ data.tar.gz: a02b65aa9fdd77cb9c42f6c1e96ecff0c84affa85b525dc273b020dafc73d3715b40a0b4b64059d6cbc69940a5c7e3604bb1c9f61ff243c416bfe0aaedec3dd4
data/Readme.md CHANGED
@@ -1,11 +1,10 @@
  # parallel_tests
 
  [![Gem Version](https://badge.fury.io/rb/parallel_tests.svg)](https://rubygems.org/gems/parallel_tests)
- [![Build Status](https://travis-ci.org/grosser/parallel_tests.svg)](https://travis-ci.org/grosser/parallel_tests/builds)
- [![Build status](https://github.com/grosser/parallel_tests/workflows/windows/badge.svg)](https://github.com/grosser/parallel_tests/actions?query=workflow%3Awindows)
+ [![Build status](https://github.com/grosser/parallel_tests/workflows/test/badge.svg)](https://github.com/grosser/parallel_tests/actions?query=workflow%3Atest&branch=master)
 
- Speedup Test::Unit + RSpec + Cucumber + Spinach by running parallel on multiple CPU cores.<br/>
- ParallelTests splits tests into even groups (by number of lines or runtime) and runs each group in a single process with its own database.
+ Speedup Minitest + RSpec + Turnip + Cucumber + Spinach by running parallel on multiple CPU cores.<br/>
+ ParallelTests splits tests into balanced groups (by number of lines or runtime) and runs each group in a process with its own database.
 
  Setup for Rails
  ===============
@@ -34,35 +33,45 @@ test:
  ### Create additional database(s)
  rake parallel:create
 
+ ### (Multi-DB) Create individual database
+ rake parallel:create:<database>
+ rake parallel:create:secondary
+
  ### Copy development schema (repeat after migrations)
  rake parallel:prepare
 
  ### Run migrations in additional database(s) (repeat after migrations)
  rake parallel:migrate
 
+ ### (Multi-DB) Run migrations in individual database
+ rake parallel:migrate:<database>
+
  ### Setup environment from scratch (create db and loads schema, useful for CI)
  rake parallel:setup
-
+
  ### Drop all test databases
  rake parallel:drop
 
+ ### (Multi-DB) Drop individual test database
+ rake parallel:drop:<database>
+
  ### Run!
- rake parallel:test # Test::Unit
+ rake parallel:test # Minitest
  rake parallel:spec # RSpec
  rake parallel:features # Cucumber
  rake parallel:features-spinach # Spinach
 
- rake parallel:test[1] --> force 1 CPU --> 86 seconds
+ rake "parallel:test[1]" --> force 1 CPU --> 86 seconds
  rake parallel:test --> got 2 CPUs? --> 47 seconds
  rake parallel:test --> got 4 CPUs? --> 26 seconds
  ...
 
  Test by pattern with Regex (e.g. use one integration server per subfolder / see if you broke any 'user'-related tests)
 
- rake parallel:test[^test/unit] # every test file in test/unit folder
- rake parallel:test[user] # run users_controller + user_helper + user tests
- rake parallel:test['user|product'] # run user and product related tests
- rake parallel:spec['spec\/(?!features)'] # run RSpec tests except the tests in spec/features
+ rake "parallel:test[^test/unit]" # every test file in test/unit folder
+ rake "parallel:test[user]" # run users_controller + user_helper + user tests
+ rake "parallel:test['user|product']" # run user and product related tests
+ rake "parallel:spec['spec\/(?!features)']" # run RSpec tests except the tests in spec/features
 
 
  ### Example output
@@ -78,13 +87,13 @@ Test by pattern with Regex (e.g. use one integration server per subfolder / see
  ```Bash
  RAILS_ENV=test parallel_test -e "rake my:custom:task"
  # or
- rake parallel:rake[my:custom:task]
+ rake "parallel:rake[my:custom:task]"
  # limited parallelism
- rake parallel:rake[my:custom:task,2]
+ rake "parallel:rake[my:custom:task,2]"
  ```
 
 
- Running things once
+ Running setup or teardown once
  ===================
 
  ```Ruby
@@ -107,11 +116,16 @@ at_exit do
  end
  ```
 
- Even test group run-times
- =========================
+ Even test group runtimes
+ ========================
+
+ Test groups will often run for different times, making the full test run as slow as the slowest group.
+
+ **Step 1**: Use these loggers (see below) to record test runtime
 
- Test groups are often not balanced and will run for different times, making everything wait for the slowest group.
- Use these loggers to record test runtime and then use the recorded runtime to balance test groups more evenly.
+ **Step 2**: The next test run will use the recorded test runtimes (use `--runtime-log <file>` if you picked a location different from below)
+
+ **Step 3**: Automate upload/download of test runtime from your CI system [example](https://github.com/grosser/parallel_rails_example/blob/master/.github/workflows/test.yml) (chunks need to be combined, an alternative is [amend](https://github.com/grosser/amend))
 
  ### RSpec
 
@@ -129,7 +143,7 @@ Add to your `test_helper.rb`:
  require 'parallel_tests/test/runtime_logger' if ENV['RECORD_RUNTIME']
  ```
 
- results will be logged to tmp/parallel_runtime_test.log when `RECORD_RUNTIME` is set,
+ results will be logged to `tmp/parallel_runtime_test.log` when `RECORD_RUNTIME` is set,
  so it is not always required or overwritten.
 
  Loggers
@@ -148,7 +162,7 @@ Add the following to your `.rspec_parallel` (or `.rspec`) :
  RSpec: FailuresLogger
  -----------------------
 
- Produce pastable command-line snippets for each failed example. For example:
+ Produce pasteable command-line snippets for each failed example. For example:
 
  ```bash
  rspec /path/to/my_spec.rb:123 # should do something
@@ -161,6 +175,24 @@ Add to `.rspec_parallel` or use as CLI flag:
 
  (Not needed to retry failures, for that pass [--only-failures](https://relishapp.com/rspec/rspec-core/docs/command-line/only-failures) to rspec)
 
+
+ RSpec: VerboseLogger
+ -----------------------
+
+ Prints a single line for starting and finishing each example, to see what is currently running in each process.
+
+ ```
+ # PID, parallel process number, spec status, example description
+ [14403] [2] [STARTED] Foo foo
+ [14402] [1] [STARTED] Bar bar
+ [14402] [1] [PASSED] Bar bar
+ ```
+
+ Add to `.rspec_parallel` or use as CLI flag:
+
+ --format ParallelTests::RSpec::VerboseLogger
+
+
  Cucumber: FailuresLogger
  -----------------------
 
@@ -193,15 +225,15 @@ Setup for non-rails
  - use `ENV['TEST_ENV_NUMBER']` inside your tests to select separate db/memcache/etc. (docker compose: expose it)
 
  - Only run a subset of files / folders:
-
+
  `parallel_test test/bar test/baz/foo_text.rb`
 
  - Pass test-options and files via `--`:
-
+
  `parallel_rspec -- -t acceptance -f progress -- spec/foo_spec.rb spec/acceptance`
-
+
  - Pass in test options, by using the -o flag (wrap everything in quotes):
-
+
  `parallel_cucumber -n 2 -o '-p foo_profile --tags @only_this_tag or @only_that_tag --format summary'`
 
  Options are:
@@ -219,8 +251,9 @@ Options are:
  -m, --multiply-processes [FLOAT] use given number as a multiplier of processes to run
  -s, --single [PATTERN] Run all matching files in the same process
  -i, --isolate Do not run any other tests in the group used by --single(-s)
- --isolate-n [PROCESSES] Use 'isolate' singles with number of processes, default: 1.
+ --isolate-n [PROCESSES] Use 'isolate' singles with number of processes, default: 1
  --highest-exit-status Exit with the highest exit status provided by test run(s)
+ --failure-exit-code [INT] Specify the exit code to use when tests fail
  --specify-groups [SPECS] Use 'specify-groups' if you want to specify multiple specs running in multiple
  processes in a specific formation. Commas indicate specs in the same process,
  pipes indicate specs in a new process. Cannot use with --single, --isolate, or
@@ -229,7 +262,8 @@ Options are:
  Process 1 will contain 1_spec.rb and 2_spec.rb
  Process 2 will contain 3_spec.rb
  Process 3 will contain all other specs
- --only-group INT[,INT]
+ --only-group INT[,INT] Only run the given group numbers.
+ Changes `--group-by` default to 'filesize'.
  -e, --exec [COMMAND] execute this code parallel and with ENV['TEST_ENV_NUMBER']
  -o, --test-options '[OPTIONS]' execute test commands with those options
  -t, --type [TYPE] test(default) / rspec / cucumber / spinach
@@ -246,12 +280,14 @@ Options are:
  --nice execute test commands with low priority.
  --runtime-log [PATH] Location of previously recorded test runtimes
  --allowed-missing [INT] Allowed percentage of missing runtimes (default = 50)
+ --allow-duplicates When detecting files to run, allow duplicates
  --unknown-runtime [FLOAT] Use given number as unknown runtime (otherwise use average time)
  --first-is-1 Use "1" as TEST_ENV_NUMBER to not reuse the default test environment
  --fail-fast Stop all groups when one group fails (best used with --test-options '--fail-fast' if supported
  --verbose Print debug output
- --verbose-process-command Displays only the command that will be executed by each process
- --verbose-rerun-command When there are failures, displays the command executed by each process that failed
+ --verbose-command Combines options --verbose-process-command and --verbose-rerun-command
+ --verbose-process-command Print the command that will be executed by each process before it begins
+ --verbose-rerun-command After a process fails, print the command executed by that process
  --quiet Print only tests output
  -v, --version Show Version
  -h, --help Show this.
@@ -402,6 +438,12 @@ inspired by [pivotal labs](https://blog.pivotal.io/labs/labs/parallelize-your-rs
  - [Vikram B Kumar](https://github.com/v-kumar)
  - [Joshua Pinter](https://github.com/joshuapinter)
  - [Zach Dennis](https://github.com/zdennis)
+ - [Jon Dufresne](https://github.com/jdufresne)
+ - [Eric Kessler](https://github.com/enkessler)
+ - [Adis Osmonov](https://github.com/adis-io)
+ - [Josh Westbrook](https://github.com/joshwestbrook)
+ - [Jay Dorsey](https://github.com/jaydorsey)
+ - [hatsu](https://github.com/hatsu38)
 
  [Michael Grosser](http://grosser.it)<br/>
  michael@grosser.it<br/>
@@ -20,7 +20,7 @@ module ParallelTests
  options[:first_is_1] ||= first_is_1?
 
  if options[:execute]
- execute_shell_command_in_parallel(options[:execute], num_processes, options)
+ execute_command_in_parallel(options[:execute], num_processes, options)
  else
  run_tests_in_parallel(num_processes, options)
  end
@@ -32,9 +32,23 @@ module ParallelTests
  @graceful_shutdown_attempted ||= false
  Kernel.exit if @graceful_shutdown_attempted
 
- # The Pid class's synchronize method can't be called directly from a trap
- # Using Thread workaround https://github.com/ddollar/foreman/issues/332
- Thread.new { ParallelTests.stop_all_processes }
+ # In a shell, all sub-processes also get an interrupt, so they shut themselves down.
+ # In a background process this does not happen and we need to do it ourselves.
+ # We cannot always send the interrupt since then the sub-processes would get interrupted twice when in foreground
+ # and that messes with interrupt handling.
+ #
+ # (can simulate detached with `(bundle exec parallel_rspec test/a_spec.rb -n 2 &)`)
+ # also the integration test "passes on int signal to child processes" is detached.
+ #
+ # On windows getpgid does not work so we resort to always killing which is the smaller bug.
+ #
+ # The ParallelTests::Pids `synchronize` method can't be called directly from a trap,
+ # using Thread workaround https://github.com/ddollar/foreman/issues/332
+ Thread.new do
+ if Gem.win_platform? || ((child_pid = ParallelTests.pids.all.first) && Process.getpgid(child_pid) != Process.pid)
+ ParallelTests.stop_all_processes
+ end
+ end
 
  @graceful_shutdown_attempted = true
  end
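The new trap handler above hinges on POSIX process groups: foreground children share the parent's group and already received the shell's interrupt, while detached ones did not. A minimal standalone sketch (Unix-only, not the gem's code, and simplified to compare the two process groups rather than the gem's pgid-vs-pid check):

```ruby
# A child spawned from a plain Ruby process inherits its parent's process
# group; that shared membership is what lets a handler distinguish
# foreground children (already interrupted by the shell) from detached ones.
child = spawn("sleep", "5")
same_group = (Process.getpgid(child) == Process.getpgid(Process.pid))
Process.kill(:TERM, child)
Process.wait(child)
p same_group # => true
```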
@@ -43,8 +57,8 @@ module ParallelTests
  Tempfile.open 'parallel_tests-lock' do |lock|
  ParallelTests.with_pid_file do
  simulate_output_for_ci options[:serialize_stdout] do
- Parallel.map(items, in_threads: num_processes) do |item|
- result = yield(item)
+ Parallel.map_with_index(items, in_threads: num_processes) do |item, index|
+ result = yield(item, index)
  reprint_output(result, lock.path) if options[:serialize_stdout]
  ParallelTests.stop_all_processes if options[:fail_fast] && result[:exit_status] != 0
  result
@@ -61,20 +75,15 @@ module ParallelTests
  groups = @runner.tests_in_groups(options[:files], num_processes, options)
  groups.reject!(&:empty?)
 
- test_results = if options[:only_group]
- groups_to_run = options[:only_group].map { |i| groups[i - 1] }.compact
- report_number_of_tests(groups_to_run) unless options[:quiet]
- execute_in_parallel(groups_to_run, groups_to_run.size, options) do |group|
- run_tests(group, groups_to_run.index(group), 1, options)
- end
- else
- report_number_of_tests(groups) unless options[:quiet]
-
- execute_in_parallel(groups, groups.size, options) do |group|
- run_tests(group, groups.index(group), num_processes, options)
- end
+ if options[:only_group]
+ groups = options[:only_group].map { |i| groups[i - 1] }.compact
+ num_processes = 1
  end
 
+ report_number_of_tests(groups) unless options[:quiet]
+ test_results = execute_in_parallel(groups, groups.size, options) do |group, index|
+ run_tests(group, index, num_processes, options)
+ end
  report_results(test_results, options) unless options[:quiet]
  end
 
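The `--only-group` selection in the hunk above can be sketched in isolation (hypothetical file names): group numbers are 1-based, and numbers beyond the last group drop out via `compact`.

```ruby
groups = [["a_spec.rb"], ["b_spec.rb"], ["c_spec.rb"]]
only_group = [1, 3, 9] # group 9 does not exist

# same map/compact shape as run_tests_in_parallel above
selected = only_group.map { |i| groups[i - 1] }.compact
p selected # => [["a_spec.rb"], ["c_spec.rb"]]
```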
@@ -87,8 +96,9 @@ module ParallelTests
  if any_test_failed?(test_results)
  warn final_fail_message
 
- # return the highest exit status to allow sub-processes to send things other than 1
- exit_status = if options[:highest_exit_status]
+ exit_status = if options[:failure_exit_code]
+ options[:failure_exit_code]
+ elsif options[:highest_exit_status]
  test_results.map { |data| data.fetch(:exit_status) }.max
  else
  1
@@ -100,7 +110,7 @@ module ParallelTests
 
  def run_tests(group, process_number, num_processes, options)
  if group.empty?
- { stdout: '', exit_status: 0, command: '', seed: nil }
+ { stdout: '', exit_status: 0, command: nil, seed: nil }
  else
  @runner.run_tests(group, process_number, num_processes, options)
  end
@@ -140,9 +150,8 @@ module ParallelTests
  puts "\n\nTests have failed for a parallel_test group. Use the following command to run the group again:\n\n"
  failing_sets.each do |failing_set|
  command = failing_set[:command]
- command = command.gsub(/;export [A-Z_]+;/, ' ') # remove ugly export statements
  command = @runner.command_with_seed(command, failing_set[:seed]) if failing_set[:seed]
- puts command
+ @runner.print_command(command, failing_set[:env] || {})
  end
  end
  end
@@ -215,12 +224,19 @@ module ParallelTests
  opts.on(
  "--isolate-n [PROCESSES]",
  Integer,
- "Use 'isolate' singles with number of processes, default: 1."
+ "Use 'isolate' singles with number of processes, default: 1"
  ) { |n| options[:isolate_count] = n }
 
- opts.on("--highest-exit-status", "Exit with the highest exit status provided by test run(s)") do
- options[:highest_exit_status] = true
- end
+ opts.on(
+ "--highest-exit-status",
+ "Exit with the highest exit status provided by test run(s)"
+ ) { options[:highest_exit_status] = true }
+
+ opts.on(
+ "--failure-exit-code [INT]",
+ Integer,
+ "Specify the exit code to use when tests fail"
+ ) { |code| options[:failure_exit_code] = code }
 
  opts.on(
  "--specify-groups [SPECS]",
@@ -236,10 +252,17 @@ module ParallelTests
  TEXT
  ) { |groups| options[:specify_groups] = groups }
 
- opts.on("--only-group INT[,INT]", Array) { |groups| options[:only_group] = groups.map(&:to_i) }
+ opts.on(
+ "--only-group INT[,INT]",
+ Array,
+ <<~TEXT.rstrip.split("\n").join("\n#{newline_padding}")
+ Only run the given group numbers.
+ Changes `--group-by` default to 'filesize'.
+ TEXT
+ ) { |groups| options[:only_group] = groups.map(&:to_i) }
 
- opts.on("-e", "--exec [COMMAND]", "execute this code parallel and with ENV['TEST_ENV_NUMBER']") { |path| options[:execute] = path }
- opts.on("-o", "--test-options '[OPTIONS]'", "execute test commands with those options") { |arg| options[:test_options] = arg.lstrip }
+ opts.on("-e", "--exec [COMMAND]", "execute this code parallel and with ENV['TEST_ENV_NUMBER']") { |arg| options[:execute] = Shellwords.shellsplit(arg) }
+ opts.on("-o", "--test-options '[OPTIONS]'", "execute test commands with those options") { |arg| options[:test_options] = Shellwords.shellsplit(arg) }
  opts.on("-t", "--type [TYPE]", "test(default) / rspec / cucumber / spinach") do |type|
  @runner = load_runner(type)
  rescue NameError, LoadError => e
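The hunk above switches `-e` and `-o` values from raw strings to argv-style arrays via `Shellwords.shellsplit` (Ruby stdlib). A quick sketch of what that parsing produces:

```ruby
require 'shellwords'

# plain options split on whitespace into argv-style tokens
p Shellwords.shellsplit("-t acceptance -f progress")
# => ["-t", "acceptance", "-f", "progress"]

# shell quoting is honored, so a quoted value stays one token
p Shellwords.shellsplit("-p 'foo profile'")
# => ["-p", "foo profile"]
```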
@@ -250,7 +273,7 @@ module ParallelTests
  "--suffix [PATTERN]",
  <<~TEXT.rstrip.split("\n").join("\n#{newline_padding}")
  override built in test file pattern (should match suffix):
- '_spec\.rb$' - matches rspec files
+ '_spec.rb$' - matches rspec files
  '_(test|spec).rb$' - matches test or spec files
  TEXT
  ) { |pattern| options[:suffix] = /#{pattern}/ }
@@ -263,12 +286,14 @@ module ParallelTests
  opts.on("--nice", "execute test commands with low priority.") { options[:nice] = true }
  opts.on("--runtime-log [PATH]", "Location of previously recorded test runtimes") { |path| options[:runtime_log] = path }
  opts.on("--allowed-missing [INT]", Integer, "Allowed percentage of missing runtimes (default = 50)") { |percent| options[:allowed_missing_percent] = percent }
+ opts.on('--allow-duplicates', 'When detecting files to run, allow duplicates') { options[:allow_duplicates] = true }
  opts.on("--unknown-runtime [FLOAT]", Float, "Use given number as unknown runtime (otherwise use average time)") { |time| options[:unknown_runtime] = time }
  opts.on("--first-is-1", "Use \"1\" as TEST_ENV_NUMBER to not reuse the default test environment") { options[:first_is_1] = true }
  opts.on("--fail-fast", "Stop all groups when one group fails (best used with --test-options '--fail-fast' if supported") { options[:fail_fast] = true }
  opts.on("--verbose", "Print debug output") { options[:verbose] = true }
- opts.on("--verbose-process-command", "Displays only the command that will be executed by each process") { options[:verbose_process_command] = true }
- opts.on("--verbose-rerun-command", "When there are failures, displays the command executed by each process that failed") { options[:verbose_rerun_command] = true }
+ opts.on("--verbose-command", "Combines options --verbose-process-command and --verbose-rerun-command") { options.merge! verbose_process_command: true, verbose_rerun_command: true }
+ opts.on("--verbose-process-command", "Print the command that will be executed by each process before it begins") { options[:verbose_process_command] = true }
+ opts.on("--verbose-rerun-command", "After a process fails, print the command executed by that process") { options[:verbose_rerun_command] = true }
  opts.on("--quiet", "Print only tests output") { options[:quiet] = true }
  opts.on("-v", "--version", "Show Version") do
  puts ParallelTests::VERSION
@@ -316,26 +341,30 @@ module ParallelTests
  raise "Can't pass --specify-groups with any of these keys: --single, --isolate, or --isolate-n"
  end
 
+ if options[:failure_exit_code] && options[:highest_exit_status]
+ raise "Can't pass --failure-exit-code and --highest-exit-status"
+ end
+
  options
  end
 
  def extract_file_paths(argv)
  dash_index = argv.rindex("--")
  file_args_at = (dash_index || -1) + 1
- [argv[file_args_at..-1], argv[0...(dash_index || 0)]]
+ [argv[file_args_at..], argv[0...(dash_index || 0)]]
  end
 
  def extract_test_options(argv)
  dash_index = argv.index("--") || -1
- argv[dash_index + 1..-1]
+ argv[dash_index + 1..]
  end
 
  def append_test_options(options, argv)
  new_opts = extract_test_options(argv)
  return if new_opts.empty?
 
- prev_and_new = [options[:test_options], new_opts.shelljoin]
- options[:test_options] = prev_and_new.compact.join(' ')
+ options[:test_options] ||= []
+ options[:test_options] += new_opts
  end
 
  def load_runner(type)
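The `..-1` to `..` changes above swap in Ruby 2.6+ endless ranges, which behave identically for this extraction. A standalone sketch of the `extract_file_paths` logic (hypothetical argv):

```ruby
# everything after the last "--" is file paths, everything before is options
argv = ["-n", "2", "--", "spec/a_spec.rb", "spec/b_spec.rb"]
dash_index = argv.rindex("--")
file_args_at = (dash_index || -1) + 1

files = argv[file_args_at..]          # endless range, same as [file_args_at..-1]
test_args = argv[0...(dash_index || 0)]
p files     # => ["spec/a_spec.rb", "spec/b_spec.rb"]
p test_args # => ["-n", "2"]
```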
@@ -345,7 +374,7 @@ module ParallelTests
  klass_name.split('::').inject(Object) { |x, y| x.const_get(y) }
  end
 
- def execute_shell_command_in_parallel(command, num_processes, options)
+ def execute_command_in_parallel(command, num_processes, options)
  runs = if options[:only_group]
  options[:only_group].map { |g| g - 1 }
  else
@@ -397,7 +426,7 @@ module ParallelTests
  def simulate_output_for_ci(simulate)
  if simulate
  progress_indicator = Thread.new do
- interval = Float(ENV.fetch('PARALLEL_TEST_HEARTBEAT_INTERVAL', 60))
+ interval = Float(ENV['PARALLEL_TEST_HEARTBEAT_INTERVAL'] || 60)
  loop do
  sleep interval
  print '.'
@@ -4,8 +4,8 @@ require "parallel_tests/gherkin/runner"
  module ParallelTests
  module Cucumber
  class Runner < ParallelTests::Gherkin::Runner
- SCENARIOS_RESULTS_BOUNDARY_REGEX = /^(Failing|Flaky) Scenarios:$/.freeze
- SCENARIO_REGEX = %r{^cucumber features/.+:\d+}.freeze
+ SCENARIOS_RESULTS_BOUNDARY_REGEX = /^(Failing|Flaky) Scenarios:$/
+ SCENARIO_REGEX = %r{^cucumber features/.+:\d+}
 
  class << self
  def name
@@ -35,8 +35,8 @@ module ParallelTests
  end
 
  def command_with_seed(cmd, seed)
- clean = cmd.sub(/\s--order\s+random(:\d+)?\b/, '')
- "#{clean} --order random:#{seed}"
+ clean = remove_command_arguments(cmd, '--order')
+ [*clean, '--order', "random:#{seed}"]
  end
  end
  end
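`command_with_seed` now operates on argv arrays instead of regex-editing a shell string. A sketch of the shape, with a hypothetical `strip_order` standing in for the gem's `remove_command_arguments` helper:

```ruby
# hypothetical stand-in for remove_command_arguments(cmd, '--order'):
# drop each '--order' flag together with the value that follows it
def strip_order(cmd)
  cmd.each_with_index.reject { |arg, i| arg == '--order' || (i > 0 && cmd[i - 1] == '--order') }.map(&:first)
end

cmd = ['cucumber', '--order', 'random:123', 'features/a.feature']
reseeded = [*strip_order(cmd), '--order', 'random:42']
p reseeded # => ["cucumber", "features/a.feature", "--order", "random:42"]
```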
@@ -27,7 +27,7 @@ module ParallelTests
  example_tags = example.tags.map(&:name)
  example_tags = scenario_tags + example_tags
  next unless matches_tags?(example_tags)
- example.rows[1..-1].each do |row|
+ example.rows[1..].each do |row|
  test_line = row.source_line
  next if line_numbers.any? && !line_numbers.include?(test_line)
 
@@ -4,7 +4,6 @@ require 'cucumber/runtime'
  require 'cucumber'
  require 'parallel_tests/cucumber/scenario_line_logger'
  require 'parallel_tests/gherkin/listener'
- require 'shellwords'
 
  begin
  gem "cuke_modeler", "~> 3.0"
@@ -20,7 +19,7 @@ module ParallelTests
  def all(files, options = {})
  # Parse tag expression from given test options and ignore tag pattern. Refer here to understand how new tag expression syntax works - https://github.com/cucumber/cucumber/tree/master/tag-expressions
  tags = []
- words = options[:test_options].to_s.shellsplit
+ words = options[:test_options] || []
  words.each_with_index { |w, i| tags << words[i + 1] if ["-t", "--tags"].include?(w) }
  if ignore = options[:ignore_tag_pattern]
  tags << "not (#{ignore})"
@@ -53,7 +52,9 @@ module ParallelTests
  feature_tags = feature.tags.map(&:name)
 
  # We loop on each children of the feature
- feature.tests.each do |test|
+ test_models = feature.tests
+ test_models += feature.rules.flat_map(&:tests) if feature.respond_to?(:rules) # cuke_modeler >= 3.2 supports rules
+ test_models.each do |test|
  # It's a scenario, we add it to the scenario_line_logger
  scenario_line_logger.visit_feature_element(document.path, test, feature_tags, line_numbers: test_lines)
  end
@@ -1,6 +1,5 @@
  # frozen_string_literal: true
  require "parallel_tests/test/runner"
- require 'shellwords'
 
  module ParallelTests
  module Gherkin
@@ -16,17 +15,13 @@ module ParallelTests
  end
  end
 
- sanitized_test_files = combined_scenarios.map { |val| WINDOWS ? "\"#{val}\"" : Shellwords.escape(val) }
-
  options[:env] ||= {}
  options[:env] = options[:env].merge({ 'AUTOTEST' => '1' }) if $stdout.tty?
 
- cmd = [
- executable,
- (runtime_logging if File.directory?(File.dirname(runtime_log))),
- *sanitized_test_files,
- cucumber_opts(options[:test_options])
- ].compact.reject(&:empty?).join(' ')
+ cmd = executable
+ cmd += runtime_logging if File.directory?(File.dirname(runtime_log))
+ cmd += combined_scenarios
+ cmd += cucumber_opts(options[:test_options])
  execute_command(cmd, process_number, num_processes, options)
  end
 
@@ -62,22 +57,22 @@ module ParallelTests
  plural = "s" if (word == group) && (number != 1)
  "#{number} #{word}#{plural}"
  end
- "#{sums[0]} (#{sums[1..-1].join(", ")})"
+ "#{sums[0]} (#{sums[1..].join(", ")})"
  end.compact.join("\n")
  end
 
  def cucumber_opts(given)
- if given =~ (/--profile/) || given =~ (/(^|\s)-p /)
+ if given&.include?('--profile') || given&.include?('-p')
  given
  else
- [given, profile_from_config].compact.join(" ")
+ [*given, *profile_from_config]
  end
  end
 
  def profile_from_config
  # copied from https://github.com/cucumber/cucumber/blob/master/lib/cucumber/cli/profile_loader.rb#L85
  config = Dir.glob("{,.config/,config/}#{name}{.yml,.yaml}").first
- "--profile parallel" if config && File.read(config) =~ /^parallel:/
+ ['--profile', 'parallel'] if config && File.read(config) =~ /^parallel:/
  end
 
  def tests_in_groups(tests, num_groups, options = {})
@@ -91,7 +86,7 @@ module ParallelTests
  end
 
  def runtime_logging
- "--format ParallelTests::Gherkin::RuntimeLogger --out #{runtime_log}"
+ ['--format', 'ParallelTests::Gherkin::RuntimeLogger', '--out', runtime_log]
  end
 
  def runtime_log
@@ -102,11 +97,11 @@ module ParallelTests
  if File.exist?("bin/#{name}")
  ParallelTests.with_ruby_binary("bin/#{name}")
  elsif ParallelTests.bundler_enabled?
- "bundle exec #{name}"
+ ["bundle", "exec", name]
  elsif File.file?("script/#{name}")
  ParallelTests.with_ruby_binary("script/#{name}")
  else
- name.to_s
+ [name.to_s]
  end
  end
  end
@@ -26,10 +26,6 @@ module ParallelTests
 
  isolate_count = isolate_count(options)
 
- if isolate_count >= num_groups
- raise 'Number of isolated processes must be less than total the number of processes'
- end
-
  if isolate_count >= num_groups
  raise 'Number of isolated processes must be >= total number of processes'
  end
@@ -38,7 +34,7 @@ module ParallelTests
  # add all files that should run in a multiple isolated processes to their own groups
  group_features_by_size(items_to_group(single_items), groups[0..(isolate_count - 1)])
  # group the non-isolated by size
- group_features_by_size(items_to_group(items), groups[isolate_count..-1])
+ group_features_by_size(items_to_group(items), groups[isolate_count..])
  else
  # add all files that should run in a single non-isolated process to first group
  single_items.each { |item, size| add_to_group(groups.first, item, size) }
@@ -43,14 +43,14 @@
 
  def read
  sync do
- contents = IO.read(file_path)
+ contents = File.read(file_path)
  return if contents.empty?
  @pids = JSON.parse(contents)
  end
  end
 
  def save
- sync { IO.write(file_path, pids.to_json) }
+ sync { File.write(file_path, pids.to_json) }
  end
 
  def sync(&block)
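The `IO.read`/`IO.write` to `File.read`/`File.write` swap above is behavior-preserving for normal paths (the `File` variants avoid `IO.read`'s special treatment of paths beginning with `|`). The pid store's JSON round-trip can be sketched as:

```ruby
require 'json'
require 'tempfile'

# write the pid list as JSON and read it back, as the Pids store does above
file = Tempfile.new('parallel_tests-pids')
File.write(file.path, [123, 456].to_json)
pids = JSON.parse(File.read(file.path))
p pids # => [123, 456]
file.close!
```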