xcpretty-bb 0.1.12.bb1

Files changed (69)
  1. checksums.yaml +7 -0
  2. data/.gitignore +19 -0
  3. data/.hound.yml +2 -0
  4. data/.kick +17 -0
  5. data/.rubocop.yml +239 -0
  6. data/.travis.yml +11 -0
  7. data/CHANGELOG.md +200 -0
  8. data/CONTRIBUTING.md +64 -0
  9. data/Gemfile +9 -0
  10. data/LICENSE.txt +61 -0
  11. data/README.md +93 -0
  12. data/Rakefile +26 -0
  13. data/assets/report.html.erb +172 -0
  14. data/bin/xcpretty +85 -0
  15. data/features/assets/RACCommandSpec, line 80, hello xcpretty.png +0 -0
  16. data/features/assets/apple_raw.png +0 -0
  17. data/features/custom_formatter.feature +15 -0
  18. data/features/fixtures/xcodebuild.log +5963 -0
  19. data/features/html_report.feature +54 -0
  20. data/features/json_compilation_database_report.feature +21 -0
  21. data/features/junit_report.feature +44 -0
  22. data/features/knock_format.feature +11 -0
  23. data/features/simple_format.feature +204 -0
  24. data/features/steps/formatting_steps.rb +330 -0
  25. data/features/steps/html_steps.rb +32 -0
  26. data/features/steps/json_steps.rb +37 -0
  27. data/features/steps/junit_steps.rb +39 -0
  28. data/features/steps/report_steps.rb +22 -0
  29. data/features/steps/xcpretty_steps.rb +31 -0
  30. data/features/support/env.rb +117 -0
  31. data/features/tap_format.feature +31 -0
  32. data/features/test_format.feature +49 -0
  33. data/features/xcpretty.feature +14 -0
  34. data/lib/xcpretty/ansi.rb +72 -0
  35. data/lib/xcpretty/formatters/formatter.rb +177 -0
  36. data/lib/xcpretty/formatters/knock.rb +35 -0
  37. data/lib/xcpretty/formatters/rspec.rb +33 -0
  38. data/lib/xcpretty/formatters/simple.rb +200 -0
  39. data/lib/xcpretty/formatters/tap.rb +40 -0
  40. data/lib/xcpretty/parser.rb +591 -0
  41. data/lib/xcpretty/printer.rb +24 -0
  42. data/lib/xcpretty/reporters/html.rb +98 -0
  43. data/lib/xcpretty/reporters/json_compilation_database.rb +62 -0
  44. data/lib/xcpretty/reporters/junit.rb +102 -0
  45. data/lib/xcpretty/snippet.rb +38 -0
  46. data/lib/xcpretty/syntax.rb +51 -0
  47. data/lib/xcpretty/term.rb +14 -0
  48. data/lib/xcpretty/version.rb +4 -0
  49. data/lib/xcpretty.rb +37 -0
  50. data/spec/fixtures/NSStringTests.m +64 -0
  51. data/spec/fixtures/constants.rb +600 -0
  52. data/spec/fixtures/custom_formatter.rb +18 -0
  53. data/spec/fixtures/oneliner.m +1 -0
  54. data/spec/fixtures/raw_kiwi_compilation_fail.txt +24 -0
  55. data/spec/fixtures/raw_kiwi_fail.txt +1896 -0
  56. data/spec/fixtures/raw_specta_fail.txt +3110 -0
  57. data/spec/spec_helper.rb +7 -0
  58. data/spec/support/matchers/colors.rb +21 -0
  59. data/spec/xcpretty/ansi_spec.rb +47 -0
  60. data/spec/xcpretty/formatters/formatter_spec.rb +140 -0
  61. data/spec/xcpretty/formatters/rspec_spec.rb +56 -0
  62. data/spec/xcpretty/formatters/simple_spec.rb +173 -0
  63. data/spec/xcpretty/parser_spec.rb +542 -0
  64. data/spec/xcpretty/printer_spec.rb +55 -0
  65. data/spec/xcpretty/snippet_spec.rb +46 -0
  66. data/spec/xcpretty/syntax_spec.rb +39 -0
  67. data/spec/xcpretty/term_spec.rb +26 -0
  68. data/xcpretty.gemspec +37 -0
  69. metadata +237 -0
data/features/html_report.feature
@@ -0,0 +1,54 @@
+ Feature: Creating an HTML test report
+
+   Background:
+     Given the tests have started running
+
+   Scenario: Showing a test suite
+     Given I have a passing test in my suite
+     When I pipe to xcpretty with "--report html"
+     Then I should see a test suite section in HTML
+
+   Scenario: Showing failed tests
+     Given I have a failing test in my suite
+     When I pipe to xcpretty with "--report html"
+     Then I should see a failed test in HTML
+     And the failure counter should show 1 test
+
+   Scenario: Showing passing tests
+     Given I have a passing test in my suite
+     When I pipe to xcpretty with "--report html"
+     Then I should see a passing test in HTML
+
+   Scenario: Counting tests
+     Given I have a passing test in my suite
+     And I have a failing test in my suite
+     And the test suite has finished
+     When I pipe to xcpretty with "--report html"
+     Then I should see 2 tests in HTML
+
+   Scenario: Having many test classes
+     Given I have tests in my suite from 2 classes
+     When I pipe to xcpretty with "--report html"
+     Then I should see 2 test suite sections in HTML
+
+   Scenario: Writing to a custom file path
+     When I pipe to xcpretty with "--report html" and specify a custom path
+     Then I should have a test report in a custom path
+
+   Scenario: Writing to multiple custom file paths
+     When I pipe to xcpretty with two custom "html" report paths
+     Then I should have test reports in two custom paths
+
+   Scenario: Showing screenshots
+     Given I have a passing test in my suite
+     And the test suite has finished
+     And I have a screenshot in the output folder
+     When I pipe to xcpretty with "--report html --screenshots"
+     Then I should see a screenshot in HTML
+
+   Scenario: Preventing unrelated images from being included in the final report
+     Given I have a passing test in my suite
+     And the test suite has finished
+     And I have an unrelated image in the output folder
+     When I pipe to xcpretty with "--report html --screenshots"
+     Then I should not see a screenshot in HTML
data/features/json_compilation_database_report.feature
@@ -0,0 +1,21 @@
+ Feature: Creating a JSON compilation database
+
+   Scenario: Showing file compilation
+     Given I have a file to compile
+     When I pipe to xcpretty with "--report json-compilation-database" and specify a custom path
+     Then the JSON compilation database should contain an entry with a command
+     And the JSON compilation database should contain an entry with a directory
+     And the JSON compilation database should contain an entry with a file
+
+   Scenario: Handling a complete xcodebuild session
+     Given some big input
+     When I pipe to xcpretty with "--report json-compilation-database" and specify a custom path
+     Then the JSON compilation database should be complete
+
+   Scenario: Writing to a custom file path
+     When I pipe to xcpretty with "--report json-compilation-database" and specify a custom path
+     Then I should have a JSON compilation database in a custom path
+
+   Scenario: Writing to multiple custom file paths
+     When I pipe to xcpretty with two custom "json-compilation-database" report paths
+     Then I should have JSON compilation databases in two custom paths
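The `json-compilation-database` reporter targets Clang's JSON compilation database format, in which each entry is an object with `directory`, `command`, and `file` keys — the three fields the first scenario above checks. A minimal sketch of such an entry (the paths and compiler invocation are made-up examples, not taken from the gem):

```ruby
require 'json'

# One entry per compiled file, following the Clang JSON compilation
# database format. Paths and the clang command line are hypothetical.
entry = {
  'directory' => '/Users/dev/MyApp',
  'command'   => 'clang -x objective-c -c MyApp/AppDelegate.m -o AppDelegate.o',
  'file'      => 'MyApp/AppDelegate.m'
}

# A compilation database is simply a JSON array of these entries.
database = JSON.pretty_generate([entry])
puts database
```

Tools such as clang-based analyzers consume this array directly, which is why the "complete session" scenario checks that every compiled file from the xcodebuild log ends up as one entry.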
data/features/junit_report.feature
@@ -0,0 +1,44 @@
+ Feature: Creating a JUnit test report
+
+   Background:
+     Given the tests have started running
+
+   Scenario: Showing a test suite
+     Given I have a passing test in my suite
+     When I pipe to xcpretty with "--report junit"
+     Then I should see a test suite node
+
+   Scenario: Showing pending test output
+     Given I have a pending test in my suite
+     When I pipe to xcpretty with "--report junit"
+     Then I should see a pending test node in my report
+
+   Scenario: Showing failed tests
+     Given I have a failing test in my suite
+     When I pipe to xcpretty with "--report junit"
+     Then I should see a failed test node in my report
+
+   Scenario: Showing passing tests
+     Given I have a passing test in my suite
+     When I pipe to xcpretty with "--report junit"
+     Then I should see a passing test node in my report
+
+   Scenario: Counting tests
+     Given I have a passing test in my suite
+     And I have a failing test in my suite
+     And the test suite has finished
+     When I pipe to xcpretty with "--report junit"
+     Then I should see 2 tests in my report
+
+   Scenario: Having many test classes
+     Given I have tests in my suite from 2 classes
+     When I pipe to xcpretty with "--report junit"
+     Then I should see 2 test suites
+
+   Scenario: Writing to a custom file path
+     When I pipe to xcpretty with "--report junit" and specify a custom path
+     Then I should have a test report in a custom path
+
+   Scenario: Writing to multiple custom file paths
+     When I pipe to xcpretty with two custom "junit" report paths
+     Then I should have test reports in two custom paths
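The "node" language in these scenarios refers to the conventional JUnit XML shape: one `<testsuite>` element per test class, one `<testcase>` per test, with a nested `<failure>` for failing tests. A sketch of that structure built with stdlib REXML — the class names, test names, and message are fabricated for illustration, and the real reporter (data/lib/xcpretty/reporters/junit.rb) may emit additional attributes:

```ruby
require 'rexml/document'

# Build a minimal JUnit-style report: suite -> testcases -> failure.
doc = REXML::Document.new
doc << REXML::XMLDecl.new
suite = doc.add_element('testsuite',
                        'name' => 'RACTupleSpec', 'tests' => '2', 'failures' => '1')
suite.add_element('testcase',
                  'classname' => 'RACTupleSpec', 'name' => 'testTupleUnpacking')
failing = suite.add_element('testcase',
                            'classname' => 'RACTupleSpec', 'name' => 'testEquality')
failing.add_element('failure', 'message' => 'expected YES, got NO')

out = String.new
doc.write(out)
puts out
```

CI servers that understand JUnit output (Jenkins, and BuildBot in this fork's case) count suites and cases from exactly these elements, which is what the counting scenarios above exercise.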
data/features/knock_format.feature
@@ -0,0 +1,11 @@
+ Feature: Showing Knock-style test output
+
+   Scenario: Showing failed tests
+     Given I have a failing test in my suite
+     When I pipe to xcpretty with "--knock"
+     Then I should see text beginning with "not ok"
+
+   Scenario: Showing passing tests
+     Given I have a passing test in my suite
+     When I pipe to xcpretty with "--knock"
+     Then I should see text beginning with "ok"
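As the two scenarios show, Knock-style output is TAP-like: one line per test, prefixed `ok` or `not ok`. A sketch of that mapping in plain Ruby — a hypothetical helper, not the gem's actual formatter (which lives in data/lib/xcpretty/formatters/knock.rb):

```ruby
# Hypothetical helper: render one Knock-style line per test result,
# matching the "ok" / "not ok" prefixes the scenarios assert on.
def knock_line(test_name, passed, failure_reason = nil)
  status = passed ? 'ok' : 'not ok'
  line = "#{status} - #{test_name}"
  line += ": #{failure_reason}" if failure_reason
  line
end

puts knock_line('testTupleUnpacking', true)
puts knock_line('testThreadedRequests', false, 'expected 3, got 2')
```

Because the prefix alone carries the pass/fail signal, the feature only needs `start_with?`-style assertions rather than parsing the whole line.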
data/features/simple_format.feature
@@ -0,0 +1,204 @@
+ Feature: Showing build output in simple format
+
+   Scenario: Showing file compilation
+     Given I have a file to compile
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful compilation message
+
+   Scenario: Showing xib compilation
+     Given I have a xib to compile
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful compilation message
+
+   Scenario: Showing storyboard compilation
+     Given I have a storyboard to compile
+     When I pipe to xcpretty with "--simple"
+     Then I should see a successful compilation message
+
+   Scenario: Showing precompilation
+     Given I have a precompiled header
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful precompilation message
+
+   Scenario: Showing phase success
+     Given I have completed a build
+     When I pipe to xcpretty with "--simple"
+     Then I should see a "build" completion message
+
+   Scenario: Showing file compilation with color
+     Given I have a file to compile
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a yellow completion icon
+
+   Scenario: Showing xib compilation with color
+     Given I have a xib to compile
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a yellow completion icon
+
+   Scenario: Showing storyboard compilation with color
+     Given I have a storyboard to compile
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a yellow completion icon
+
+   Scenario: Showing precompilation with color
+     Given I have a precompiled header
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a yellow completion icon
+
+   Scenario: Showing analyze
+     Given I have a file to analyze
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful analyze message
+
+   Scenario: Showing shallow analyze
+     Given I have a file to shallow analyze
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful analyze message
+
+   Scenario: Showing tiff file validation
+     Given I have a tiff file to validate
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful tiff validation message
+
+   Scenario: Showing touch file
+     Given I have a file to touch
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful touch message
+
+   Scenario: Showing analyze with color
+     Given I have a file to analyze
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a yellow completion icon
+
+   Scenario: Showing shallow analyze with color
+     Given I have a file to shallow analyze
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a yellow completion icon
+
+   Scenario: Showing tiff file validation with color
+     Given I have a tiff file to validate
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a yellow completion icon
+
+   Scenario: Showing touch file with color
+     Given I have a file to touch
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a yellow completion icon
+
+   Scenario: Showing the start of a test run
+     Given the tests have started running
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see that test suite has started
+
+   Scenario: Showing the start of a test suite
+     Given I start a test suite
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see the name of suite only
+
+   Scenario: Showing the end of a test suite
+     Given the test suite has finished
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see that the test suite finished
+
+   Scenario: Showing failed test output
+     Given I have a failing test in my suite
+     And the test suite has finished
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see the name of a failed test
+     And I should see the path of a failed test
+
+   Scenario: Showing successful test output
+     Given I have a passing test in my suite
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see the name of a passing test
+     And I should not see the name of the test group
+     And I should not see the path of a passing test
+
+   Scenario: Colorizing slow-ish tests in yellow
+     Given I have a slow-ish test in my suite
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see the test time in yellow
+
+   Scenario: Colorizing slow tests in red
+     Given I have a slow test in my suite
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see the test time in red
+
+   Scenario: Showing pending test output
+     Given I have a pending test in my suite
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see the name of a pending test
+
+   Scenario: Showing measuring test output
+     Given I have a measuring test in my suite
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see the name of a measuring test
+
+   Scenario: Showing failed test output with color
+     Given I have a failing test in my suite
+     And the test suite has finished
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a red failed test mark
+     And the final execution message should be red
+
+   Scenario: Showing successful test output with color
+     Given I have a passing test in my suite
+     And the test suite has finished
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a green passing test mark
+
+   Scenario: Running tests without UTF-8 support
+     Given I have a passing test in my suite
+     When I pipe to xcpretty with "--no-utf --color"
+     Then I should see a non-utf prefixed output
+
+   Scenario: Showing code signing
+     Given I have a file to code sign
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful code signing message
+
+   Scenario: Showing code signing a framework
+     Given I have a framework to code sign
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful code signing message
+
+   Scenario: Showing preprocess
+     Given I have a file to preprocess
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful preprocessing message
+
+   Scenario: Showing a PBXCp copy
+     Given I have a file to copy with PBXCp
+     When I pipe to xcpretty with "--simple --no-color"
+     Then I should see a successful copying message
+
+   Scenario: Build fails when Pod install hasn't been run
+     Given podfile.lock wasn't in sync
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a red error message
+     And I should see that sandbox is not in sync with Podfile.lock
+
+   Scenario: Compilation fails because of missing files in the project
+     Given there was a missing file
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a red error message
+     And I should see which file is missing
+
+   Scenario: Compilation fails because of syntax errors
+     Given there was a syntax error
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a red compilation error
+     And I should see a failed line
+     And I should see a cyan cursor
+
+   Scenario: Linker fails with undefined symbols
+     Given the linker has failed with undefined symbols
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see the undefined symbold message
+     And I should see the symbol and reference that caused failure
+
+   Scenario: There are build warnings
+     Given there were warnings in the code
+     When I pipe to xcpretty with "--simple --color"
+     Then I should see a yellow warning message
+
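The color scenarios above assert on ANSI escape sequences. The gem's helpers (data/lib/xcpretty/ansi.rb) wrap text in standard ANSI SGR codes; here is a self-contained sketch of the idea — the helper names mirror those used in the step definitions, but this is an illustration, not the gem's actual implementation:

```ruby
# Standard ANSI SGR foreground color codes; "\e[0m" resets attributes.
COLOR_CODES = { red: 31, green: 32, yellow: 33, cyan: 36 }.freeze

def colorize(text, color)
  "\e[#{COLOR_CODES.fetch(color)}m#{text}\e[0m"
end

def red(text);    colorize(text, :red);    end
def green(text);  colorize(text, :green);  end
def yellow(text); colorize(text, :yellow); end
def cyan(text);   colorize(text, :cyan);   end

puts yellow('▸')  # the yellow completion icon the color scenarios expect
puts green('✓')   # green passing test mark
puts red('✗')     # red failed test mark
```

With `--no-color`, the same text is emitted without the `\e[...m` wrappers, which is why each scenario exists in both a plain and a "with color" variant.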
data/features/steps/formatting_steps.rb
@@ -0,0 +1,330 @@
+ # encoding: utf-8
+ Given(/^I have a file to compile$/) do
+   add_run_input SAMPLE_COMPILE
+ end
+
+ Given(/^I have a xib to compile$/) do
+   add_run_input SAMPLE_COMPILE_XIB
+ end
+
+ Given(/^I have a storyboard to compile$/) do
+   add_run_input SAMPLE_COMPILE_STORYBOARD
+ end
+
+ Given(/^I have a precompiled header$/) do
+   add_run_input SAMPLE_PRECOMPILE
+ end
+
+ Given(/^I have a file to analyze$/) do
+   add_run_input SAMPLE_ANALYZE
+ end
+
+ Given(/^I have a file to shallow analyze$/) do
+   add_run_input SAMPLE_ANALYZE_SHALLOW
+ end
+
+ Given(/^I have a failing test in my suite$/) do
+   add_run_input SAMPLE_OLD_SPECTA_FAILURE
+ end
+
+ Given(/^all of my tests will pass in my suite$/) do
+   3.times { add_run_input SAMPLE_OCUNIT_TEST }
+ end
+
+ Given(/^I have a passing test in my suite$/) do
+   add_run_input SAMPLE_OCUNIT_TEST
+ end
+
+ Given(/^I have a slow\-ish test in my suite$/) do
+   add_run_input SAMPLE_SLOWISH_TEST
+ end
+
+ Given(/^I have a slow test in my suite$/) do
+   add_run_input SAMPLE_SLOW_TEST
+ end
+
+ Given(/^the tests have started running$/) do
+   add_run_input SAMPLE_OCUNIT_TEST_RUN_BEGINNING
+ end
+
+ Given(/^I start a test suite$/) do
+   add_run_input SAMPLE_OCUNIT_SUITE_BEGINNING
+ end
+
+ Given(/^the test suite has finished$/) do
+   add_run_input SAMPLE_OCUNIT_SUITE_COMPLETION
+   add_run_input SAMPLE_EXECUTED_TESTS
+ end
+
+ Given(/^I have a file to code sign$/) do
+   add_run_input SAMPLE_CODESIGN
+ end
+
+ Given(/^I have a framework to code sign$/) do
+   add_run_input SAMPLE_CODESIGN_FRAMEWORK
+ end
+
+ Given(/^I have a file to preprocess$/) do
+   add_run_input SAMPLE_PREPROCESS
+ end
+
+ Given(/^I have a file to copy with PBXCp/) do
+   add_run_input SAMPLE_PBXCP
+ end
+
+ Given(/^podfile.lock wasn't in sync$/) do
+   add_run_input SAMPLE_PODS_ERROR
+ end
+
+ Given(/^there was a syntax error$/) do
+   add_run_input SAMPLE_COMPILE_ERROR
+ end
+
+ Given(/^there was a missing file$/) do
+   add_run_input SAMPLE_FILE_MISSING_ERROR
+ end
+
+ Given(/^there were warnings in the code$/) do
+   add_run_input SAMPLE_FORMAT_WARNING
+ end
+
+ Given(/^the linker has failed with undefined symbols$/) do
+   add_run_input SAMPLE_UNDEFINED_SYMBOLS
+ end
+
+ Given(/^I have a pending test in my suite$/) do
+   add_run_input SAMPLE_PENDING_KIWI_TEST
+ end
+
+ Given(/^I have a measuring test in my suite$/) do
+   add_run_input SAMPLE_MEASURING_TEST
+ end
+
+ Given(/^I have a tiff file to validate$/) do
+   add_run_input SAMPLE_TIFFUTIL
+ end
+
+ Given(/^I have a file to touch$/) do
+   add_run_input SAMPLE_TOUCH
+ end
+
+ Given(/^I have a screenshot in the output folder/) do
+   copy_file_to_screenshot_dir(SAMPLE_SCREENSHOT_FILE)
+ end
+
+ Given(/^I have an unrelated image in the output folder/) do
+   copy_file_to_screenshot_dir(SAMPLE_UNRELATED_IMAGE_FILE)
+ end
+
+ Given(/^I have completed a build$/) do
+   add_run_input SAMPLE_BUILD_SUCCEEDED
+ end
+
+ Given(/^I have completed a clean$/) do
+   add_run_input SAMPLE_CLEAN_SUCCEEDED
+ end
+
+ Then(/^I should see a "(\w+)" completion message$/) do |phase|
+   run_output.should start_with("▸ #{phase.capitalize} Succeeded")
+ end
+
+ Then(/^I should see text beginning with "(.*?)"$/) do |text|
+   run_output.lines.to_a.detect { |line| line.start_with? text }.should_not be_nil
+ end
+
+ Then(/^I should see text containing "(.*?)" and beginning with "(.*?)"$/) do |inner, start|
+   run_output.lines.to_a.detect { |line| line.start_with?(start) && line.include?(inner) }.should_not be_nil
+ end
+
+ Then(/^I should (green|red) text beginning with "(.*?)"$/) do |color, text|
+   run_output.should start_with(send(color.to_sym, text))
+ end
+
+ Then(/^I should see a successful tiff validation message$/) do
+   run_output.should start_with("▸ Validating")
+ end
+
+ Then(/^I should see a successful touch message$/) do
+   run_output.should start_with("▸ Touching")
+ end
+
+ When(/^I pipe to xcpretty with "(.*?)"$/) do |flags|
+   run_xcpretty(flags)
+ end
+
+ When(/^I pipe to xcpretty with a custom formatter$/) do
+   formatter_path = File.expand_path('../../../spec/fixtures/custom_formatter.rb', __FILE__)
+   run_xcpretty("-f #{formatter_path}")
+ end
+
+ Then(/^I should see a custom compilation message$/) do
+   run_output.should start_with("😎 Compilation party time")
+ end
+
+ Then(/^I should see a successful compilation message$/) do
+   run_output.should start_with("▸ Compiling")
+ end
+
+ Then(/^I should see a successful precompilation message$/) do
+   run_output.should start_with("▸ Precompiling")
+ end
+
+ Then(/^I should see a successful analyze message$/) do
+   run_output.should start_with("▸ Analyzing")
+ end
+
+ Then(/^I should see a successful code signing message$/) do
+   run_output.should start_with("▸ Signing")
+ end
+
+ Then(/^I should see a successful preprocessing message$/) do
+   run_output.should start_with("▸ Preprocessing")
+ end
+
+ Then(/^I should see a successful copying message$/) do
+   run_output.should start_with("▸ Copying")
+ end
+
+ Then(/^I should see a yellow completion icon$/) do
+   run_output.should start_with(yellow("▸"))
+ end
+
+ Then(/^I should see a failed test icon$/) do
+   run_output.should start_with("F")
+ end
+
+ Then(/^I should see a passing test icon in ASCII$/) do
+   run_output.should start_with(".")
+ end
+
+ Then(/^I should see a red failed test icon$/) do
+   run_output.should include(red("F"))
+ end
+
+ Then(/^I should see a pending test icon in ASCII$/) do
+   run_output.should start_with("P")
+ end
+
+ Then(/^I should see a yellow pending test icon$/) do
+   run_output.should start_with(yellow("P"))
+ end
+
+ Then(/^I should see a measuring test icon in ASCII$/) do
+   run_output.should start_with('T')
+ end
+
+ Then(/^I should see a yellow measuring test icon$/) do
+   run_output.should start_with(yellow('T'))
+ end
+
+ Then(/^the final execution message should be (red|green)$/) do |color|
+   last_line = run_output.lines.to_a.last
+   last_line.should be_colored(color.to_sym)
+ end
+
+ Then(/^I should see a green passing test icon$/) do
+   run_output.should include(green("."))
+ end
+
+ Then(/^I should see the name of a failed test$/) do
+   run_output.should =~ FAILING_TEST_NAME_MATCHER
+ end
+
+ Then(/^I should see the path of a failed test$/) do
+   run_output.should =~ TEST_PATH_MATCHER
+ end
+
+ Then(/^I should see the name of a passing test$/) do
+   run_output.should =~ PASSING_TEST_NAME_MATCHER
+ end
+
+ Then(/^I should not see the path of a passing test$/) do
+   run_output.should_not =~ TEST_PATH_MATCHER
+ end
+
+ Then(/^I should see that test suite has started$/) do
+   run_output.should =~ TEST_RUN_START_MATCHER
+ end
+
+ Then(/^I should see the name of suite only$/) do
+   run_output.should =~ TEST_SUITE_START_MATCHER
+ end
+
+ Then(/^I should see that the test suite finished$/) do
+   run_output.strip.should =~ TEST_SUITE_COMPLETION_MATCHER
+ end
+
+ Then(/^I should see a red failed test mark$/) do
+   run_output.should include(red("✗"))
+ end
+
+ Then(/^I should see a green passing test mark$/) do
+   run_output.should include(green("✓"))
+ end
+
+ Then(/^I should see a non-utf prefixed output$/) do
+   run_output.should start_with(" " + green("."))
+ end
+
+ Then(/^I should not see the name of the test group$/) do
+   run_output.should_not include("RACTupleSpec")
+ end
+
+ Then(/^I should see a red error message$/) do
+   run_output.should include(red("❌ error: ")[0..-5]) # trim the trailing "\e[0m" reset
+ end
+
+ Then(/^I should see that sandbox is not in sync with Podfile.lock$/) do
+   run_output.should include("The sandbox is not in sync with the Podfile.lock")
+ end
+
+ Then(/^I should see which file is missing$/) do
+   run_output.should include(SAMPLE_FILE_MISSING_ERROR.split('directory: ')[1].delete("'"))
+ end
+
+ Then(/^I should see a yellow warning message$/) do
+   run_output.should include("#{yellow('⚠️ ')}/Users/supermarin/code/oss/ObjectiveSugar/Example/ObjectiveSugar/AppDelegate.m:19:31:")
+   run_output.should include(yellow("format specifies type 'id' but the argument has type 'int' [-Wformat]"))
+ end
+
+ Then(/^I should see a red compilation error$/) do
+   run_output.should include(red("expected identifier"))
+ end
+
+ Then(/^I should see a failed line$/) do
+   run_output.should include("[[thread.lastMessage should] equal:thread.];")
+ end
+
+ Then(/^I should see a cyan cursor$/) do
+   run_output.should include(cyan(" ^"))
+ end
+
+ Then(/^I should see the undefined symbold message$/) do
+   run_output.should include(red("❌ Undefined symbols for architecture x86_64"))
+ end
+
+ Then(/^I should see the symbol and reference that caused failure$/) do
+   run_output.should include("_OBJC_CLASS_$_CABasicAnimation")
+   run_output.should include("objc-class-ref in ATZRadialProgressControl.o")
+ end
+
+ Then(/^I should see the name of a pending test$/) do
+   run_output.should =~ PENDING_TEST_NAME_MATCHER
+ end
+
+ Then(/^I should see the name of a measuring test$/) do
+   run_output.should =~ MEASURING_TEST_NAME_MATCHER
+ end
+
+ Then(/^I should see the test time in yellow$/) do
+   run_output.should include(yellow("0.026"))
+ end
+
+ Then(/^I should see the test time in red$/) do
+   run_output.should include(red("0.101"))
+ end
+
+ Then(/^I should see text matching "(.*?)"$/) do |text|
+   run_output.lines.to_a.last.strip.should == text
+ end
+
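Most assertions in the step file above follow two patterns: scan the captured output line by line (`detect` + `start_with?`) or check the whole output (`include` / `=~`). The core matching logic can be sketched without Cucumber or RSpec — the sample output below is fabricated for illustration:

```ruby
# Fabricated sample of what captured xcpretty output might look like.
run_output = <<~OUT
  ▸ Compiling AppDelegate.m
  ▸ Build Succeeded
OUT

# "text beginning with": find any line carrying the given prefix.
def line_beginning_with(output, prefix)
  output.lines.find { |line| line.start_with?(prefix) }
end

# "text containing ... and beginning with ...": both conditions on one line.
def line_with(output, start:, inner:)
  output.lines.find { |l| l.start_with?(start) && l.include?(inner) }
end

puts line_beginning_with(run_output, '▸ Compiling')
puts line_with(run_output, start: '▸', inner: 'Succeeded')
```

Line-by-line matching keeps the steps robust against unrelated output appearing before or after the line under test, which is why only the "final execution message" step looks at `lines.to_a.last`.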
data/features/steps/html_steps.rb
@@ -0,0 +1,32 @@
+ Then(/^I should see a test suite section in HTML$/) do
+   html_test_suites.first.should_not be_nil
+ end
+
+ Then(/^I should see a failed test in HTML$/) do
+   html_report_body.get_elements("//*[contains(@class, 'test failing')]/").to_a.size.should_not == 0
+ end
+
+ Then(/^the failure counter should show (\d+) tests?$/) do |fail_count|
+   html_report_body.get_elements("//*[@id='fail-count']/").first.elements.to_a.first.text.to_i.should == fail_count.to_i
+ end
+
+ Then(/^I should see a passing test in HTML$/) do
+   html_report_body.get_elements("//*[contains(@class, 'test passing')]/").to_a.size.should_not == 0
+ end
+
+ Then(/^I should see (\d+) tests in HTML$/) do |test_count|
+   html_report_body.get_elements("//*[contains(@class, 'test ')]/").size.should == test_count.to_i
+ end
+
+ Then(/^I should see (\d+) test suite sections? in HTML$/) do |section_count|
+   html_test_suites.size.should == section_count.to_i
+ end
+
+ Then(/^I should see a screenshot in HTML$/) do
+   html_report_body.get_elements("//*[contains(@class, 'screenshot')]/").to_a.size.should_not == 0
+ end
+
+ Then(/^I should not see a screenshot in HTML$/) do
+   html_report_body.get_elements("//*[contains(@class, 'screenshot')]/").to_a.size.should == 0
+ end
+
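These steps query the generated report with REXML's `get_elements` and a `contains(@class, ...)` XPath predicate, so test state is detected purely from class-attribute substrings. A self-contained sketch of that idiom — the markup below is a tiny stand-in for the real report body, which is rendered from data/assets/report.html.erb:

```ruby
require 'rexml/document'

# Minimal stand-in for the generated report body; class names mirror
# the ones the steps above match on.
html = <<~XML
  <body>
    <section class="test-suite">
      <div class="test passing">testOne</div>
      <div class="test failing">testTwo</div>
    </section>
  </body>
XML

doc = REXML::Document.new(html)

# Same XPath idiom as the steps: match on a class-attribute substring.
failing   = doc.get_elements("//*[contains(@class, 'test failing')]")
all_tests = doc.get_elements("//*[contains(@class, 'test ')]")

puts failing.size    # 1
puts all_tests.size  # 2
```

Note that the `'test '` substring (with a trailing space) is what keeps the `test-suite` section itself out of the per-test count.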