metric_fu-Saikuro 1.1.1.0

data/README ADDED
@@ -0,0 +1,147 @@
1
+ The japgolly fork is a part of an attempt to get metric_fu working in a modern
2
+ Ruby environment, specifically compatibility with Ruby 1.9 and Bundler.
3
+
4
+ ===============================================================================
5
+
6
+ Version 0.2
7
+
8
+ Saikuro:
9
+ Saikuro is a Ruby cyclomatic complexity analyzer. When given Ruby
10
+ source code Saikuro will generate a report listing the cyclomatic
11
+ complexity of each method found. In addition, Saikuro counts the
12
+ number of lines per method and can generate a listing of the number of
13
+ tokens on each line of code.
14
+
15
+ License:
16
+ Saikuro uses the BSD license.
17
+
18
+ Installation:
19
+ Option 1: Using setup.rb
20
+ * login as root
21
+ * run "ruby setup.rb all"
22
+
23
+ Option 2: The manual way
24
+ Saikuro is a single Ruby file that is executable. You can run it where
25
+ you unpacked it or you can move it to your preferred location such as
26
+ "/usr/local/bin" or "~/bin".
27
+
28
+ Note:
29
+ Ruby 1.8.5 has a bug in ri_options that will prevent Saikuro from
30
+ running. If you are using 1.8.5, please apply this patch:
31
+ http://www.ruby-lang.org/cgi-bin/cvsweb.cgi/ruby/lib/rdoc/ri/ri_options.rb.diff?r1=1.2.2.13;r2=1.2.2.14
32
+
33
+
34
+ Usage:
35
+ Saikuro is a command line program.
36
+ Running "saikuro -h" will output a usage statement describing all
37
+ the various arguments you can pass to it.
38
+
39
+ "saikuro -c -p tests/samples.rb"
40
+
41
+ The above command is a simple example that generates a cyclomatic
42
+ complexity report on the samples.rb file, using the default filter,
43
+ warning and error settings. The report is saved in the current
44
+ directory.
45
+
46
+
47
+ A more detailed example is
48
+ "saikuro -c -t -i tests -y 0 -w 11 -e 16 -o out/"
49
+
50
+ This will analyze all Ruby files found in the "tests/" directory.
51
+ Saikuro will generate a token count report and a cyclomatic complexity
52
+ report in the "out" directory. The "-y 0" option will turn off
53
+ filtering and thus show the complexity of all methods. The "-w 11"
54
+ will mark all methods with a complexity of 11 or higher with a
55
+ warning. Finally, "-e 16" will flag all methods with a complexity of
56
+ 16 or higher with an error.
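 For reference, when the plain-text formatter is selected with "-f text",
 the report is written as one block per class and one line per method,
 roughly in the following form (the names and numbers here are only
 illustrative, not output from a real run):

     -- START Sample --
     Type:Class Name:Sample Complexity:9 Lines:42
     Type:Def Name:process Complexity:5 Lines:18
     -- END Sample --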
57
+
58
+
59
+ About Cyclomatic Complexity:
60
+
61
+ The following document provides a very good and detailed description
62
+ of cyclomatic complexity by its author.
63
+
64
+ NIST Special Publication 500-235
65
+ Structured Testing: A Testing Methodology Using the Cyclomatic
66
+ Complexity Metric
67
+
68
+ By Arthur H. Watson and Thomas J. McCabe
69
+ HTML
70
+ http://hissa.nist.gov/HHRFdata/Artifacts/ITLdoc/235/title.htm
71
+ PDF
72
+ http://www.mccabe.com/iq_research_nist.htm
73
+
74
+
75
+ How and what Saikuro counts to calculate the cyclomatic complexity:
76
+
77
+ Saikuro uses the Simplified Complexity Calculation, which is just
78
+ adding up the number of branch points in a method.
79
+
80
+ Each method starts with a complexity of 1, because there is at least
81
+ one path through the code. Then each conditional or looping operator
82
+ (if, unless, while, until, for, elsif, when) adds one point to the
83
+ complexity. Each "when" in a case statement adds one point. Also each
84
+ "rescue" statement adds one.
85
+
86
+ Saikuro also regards blocks as an addition to a method's complexity
87
+ because in many cases a block does add a path that may be traversed.
88
+ For example, invoking the "each" method of an array with a block would
89
+ only traverse the given block if the array is not empty. Thus if you
90
+ want to find the basis set to get 100% coverage of your code then a
91
+ block should add one point to the method's complexity. It is not yet
92
+ certain, however, how much accuracy is lost through this
93
+ measurement, as normal Ruby code uses blocks quite heavily and not
94
+ every block introduces a new path.
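 As a rough illustration of how these rules add up, consider the
 following method; the counts in the comments simply apply the rules
 described above (the figure is derived from those rules rather than
 from an actual Saikuro run):

     def categorize(items)
       return [] if items.nil?      # "if" modifier:  +1
       items.map do |item|          # block:          +1
         case item                  # "case" itself adds nothing
         when Integer then :number  # "when":         +1
         when String  then :text    # "when":         +1
         else :other                # "else" adds nothing
         end
       end
     rescue TypeError               # "rescue":       +1
       []
     end

 Starting from a base of 1, the five branch points above give the method
 a cyclomatic complexity of 6 under this simplified calculation.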
95
+
96
+ In addition, the short-circuiting "and" operators (&& and "and")
97
+ currently do not contribute to a method's complexity, although
98
+ McCabe's paper listed above suggests doing so.
99
+
100
+
101
+ # Example of "and" operator handling:
102
+
103
+ # Starting values for case 1 and 2
104
+ x = false
105
+ y = 15
106
+ r, q = nil
107
+
108
+ # case 1
109
+ puts "W" if ((r = x) && (q = y))
110
+ puts r # => false
111
+ puts q # => nil
112
+
113
+ # case 2
114
+ puts "W" if ((q = y) && (r = x))
115
+ puts r # => false
116
+ puts q # => 15
117
+
118
+ Case 1 illustrates why "and" operators should add to a method's
119
+ complexity: because the result of ( r = x ) is false, the if statement
120
+ stops and returns false without evaluating the ( q = y ) branch. Thus,
121
+ if total coverage of the source code is desired, one point should be
122
+ added to the method's complexity.
123
+
124
+ So why is it not added?
125
+ Mainly, because we have not gotten around to it. We are wondering if
126
+ this would increase the noise more than it should.
127
+
128
+
129
+ Tests:
130
+ The test directory contains a sample file with examples of the
131
+ various cases we examined, along with the expected
132
+ cyclomatic complexity for each. If you find mistakes or missing tests
133
+ please report them.
134
+
135
+ Contact:
136
+ Saikuro is written by
137
+ Zev Blut (zb at ubit dot com)
138
+
139
+ Acknowledgments:
140
+ Thanks to Elbert Corpuz for writing the CSS for the HTML output!
141
+
142
+ Other metric tools for Ruby:
143
+ Ryan Davis has an abc metric program as an example in his ParseTree
144
+ product: http://www.zenspider.com/ZSS/Products/ParseTree/
145
+
146
+ The PMD project has a tool called CPD that can scan Ruby source code
147
+ looking for source duplication: http://pmd.sourceforge.net/
data/bin/saikuro ADDED
@@ -0,0 +1,95 @@
1
+ #!/usr/bin/env ruby
2
+ # $Id$
3
+ # Version 0.2
4
+ # == Usage
5
+ #
6
+ # saikuro [ -h ] [-o output_directory] [-f type] [ -c, -t ]
7
+ # [ -y, -w, -e, -k, -s, -d - number ] ( -p file | -i directory )
8
+ #
9
+ # == Help
10
+ #
11
+ # -o, --output_directory (directory) : A directory to output the results in.
12
+ # The current directory is used if this option is not passed.
13
+ #
14
+ # -h, --help : This help message.
15
+ #
16
+ # -f, --formater (html | text) : The format to output the results in.
17
+ # The default is html
18
+ #
19
+ # -c, --cyclo : Compute the cyclomatic complexity of the input.
20
+ #
21
+ # -t, --token : Count the number of tokens per line of the input.
22
+ #
23
+ # -y, --filter_cyclo (number) : Filter the output to only include methods
24
+ # whose cyclomatic complexity is greater than the passed number.
25
+ #
26
+ # -w, --warn_cyclo (number) : Highlight with a warning methods whose
27
+ # cyclomatic complexity is greater than or equal to the passed number.
28
+ #
29
+ #
30
+ # -e, --error_cyclo (number) : Highlight with an error methods whose
31
+ # cyclomatic complexity is greater than or equal to the passed number.
32
+ #
33
+ #
34
+ # -k, --filter_token (number) : Filter the output to only include lines
35
+ # whose token count is greater than the passed number.
36
+ #
37
+ #
38
+ # -s, --warn_token (number) : Highlight with a warning lines whose
39
+ # token count is greater than or equal to the passed number.
40
+ #
41
+ #
42
+ # -d, --error_token (number) : Highlight with an error lines whose
43
+ # token count is greater than or equal to the passed number.
44
+ #
45
+ #
46
+ # -p, --parse_file (file) : A file to use as input.
47
+ #
48
+ # -i, --input_directory (directory) : All ruby files found recursively
49
+ # inside the directory are passed as input.
50
+
51
+ # == License
52
+ # Saikuro uses the BSD license.
53
+ #
54
+ # Copyright (c) 2005, Ubiquitous Business Technology (http://ubit.com)
55
+ # All rights reserved.
56
+ #
57
+ # Redistribution and use in source and binary forms, with or without
58
+ # modification, are permitted provided that the following conditions are
59
+ # met:
60
+ #
61
+ #
62
+ # * Redistributions of source code must retain the above copyright
63
+ # notice, this list of conditions and the following disclaimer.
64
+ #
65
+ # * Redistributions in binary form must reproduce the above
66
+ # copyright notice, this list of conditions and the following
67
+ # disclaimer in the documentation and/or other materials provided
68
+ # with the distribution.
69
+ #
70
+ # * Neither the name of Ubiquitous Business Technology nor the names
71
+ # of its contributors may be used to endorse or promote products
72
+ # derived from this software without specific prior written
73
+ # permission.
74
+ #
75
+ #
76
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
77
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
78
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
79
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
80
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
81
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
82
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
83
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
84
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
85
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
86
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
87
+ #
88
+ # == Author
89
+ # Zev Blut (zb@ubit.com)
90
+
91
+ $:.unshift File.expand_path('../../lib',__FILE__)
92
+ require 'saikuro'
93
+
94
+ SaikuroCMDLineRunner.new.run
95
+
data/lib/saikuro.rb ADDED
@@ -0,0 +1,1225 @@
1
+ # $Id$
2
+ # == Usage
3
+ #
4
+ # saikuro [ -h ] [-o output_directory] [-f type] [ -c, -t ]
5
+ # [ -y, -w, -e, -k, -s, -d - number ] ( -p file | -i directory )
6
+ #
7
+ # == Help
8
+ #
9
+ # -o, --output_directory (directory) : A directory to output the results in.
10
+ # The current directory is used if this option is not passed.
11
+ #
12
+ # -h, --help : This help message.
13
+ #
14
+ # -f, --formater (html | text) : The format to output the results in.
15
+ # The default is html
16
+ #
17
+ # -c, --cyclo : Compute the cyclomatic complexity of the input.
18
+ #
19
+ # -t, --token : Count the number of tokens per line of the input.
20
+ #
21
+ # -y, --filter_cyclo (number) : Filter the output to only include methods
22
+ # whose cyclomatic complexity is greater than the passed number.
23
+ #
24
+ # -w, --warn_cyclo (number) : Highlight with a warning methods whose
25
+ # cyclomatic complexity is greater than or equal to the passed number.
26
+ #
27
+ #
28
+ # -e, --error_cyclo (number) : Highlight with an error methods whose
29
+ # cyclomatic complexity is greater than or equal to the passed number.
30
+ #
31
+ #
32
+ # -k, --filter_token (number) : Filter the output to only include lines
33
+ # whose token count is greater than the passed number.
34
+ #
35
+ #
36
+ # -s, --warn_token (number) : Highlight with a warning lines whose
37
+ # token count is greater than or equal to the passed number.
38
+ #
39
+ #
40
+ # -d, --error_token (number) : Highlight with an error lines whose
41
+ # token count is greater than or equal to the passed number.
42
+ #
43
+ #
44
+ # -p, --parse_file (file) : A file to use as input.
45
+ #
46
+ # -i, --input_directory (directory) : All ruby files found recursively
47
+ # inside the directory are passed as input.
48
+
49
+
50
+ # Saikuro uses the BSD license.
51
+ #
52
+ # Copyright (c) 2005, Ubiquitous Business Technology (http://ubit.com)
53
+ # All rights reserved.
54
+ #
55
+ # Redistribution and use in source and binary forms, with or without
56
+ # modification, are permitted provided that the following conditions are
57
+ # met:
58
+ #
59
+ #
60
+ # * Redistributions of source code must retain the above copyright
61
+ # notice, this list of conditions and the following disclaimer.
62
+ #
63
+ # * Redistributions in binary form must reproduce the above
64
+ # copyright notice, this list of conditions and the following
65
+ # disclaimer in the documentation and/or other materials provided
66
+ # with the distribution.
67
+ #
68
+ # * Neither the name of Ubiquitous Business Technology nor the names
69
+ # of its contributors may be used to endorse or promote products
70
+ # derived from this software without specific prior written
71
+ # permission.
72
+ #
73
+ #
74
+ # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
75
+ # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
76
+ # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
77
+ # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
78
+ # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
79
+ # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
80
+ # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
81
+ # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
82
+ # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
83
+ # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
84
+ # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
85
+ #
86
+ # == Author
87
+ # Zev Blut (zb@ubit.com)
88
+
89
+ require 'irb/ruby-lex'
90
+ require 'yaml'
91
+
92
+ # States to watch for:
93
+ # once in a def, take the token after the space, because it may also
94
+ # be something like + or << for operator overloading.
95
+
96
+ # Counts the number of tokens in each line.
97
+ class TokenCounter
98
+ include RubyToken
99
+
100
+ attr_reader :current_file
101
+
102
+ def initialize
103
+ @files = Hash.new
104
+ @tokens_per_line = Hash.new(0)
105
+ @current_file = ""
106
+ end
107
+
108
+ # Mark file to associate with the token count.
109
+ def set_current_file(file)
110
+ @current_file = file
111
+ @tokens_per_line = Hash.new(0)
112
+ @files[@current_file] = @tokens_per_line
113
+ end
114
+
115
+ # Iterate through all tracked files, passing the
116
+ # provided formater the token counts.
117
+ def list_tokens_per_line(formater)
118
+ formater.start_count(@files.size)
119
+ @files.each do |fname, tok_per_line|
120
+ formater.start_file(fname)
121
+ tok_per_line.sort.each do |line,num|
122
+ formater.line_token_count(line,num)
123
+ end
124
+ formater.end_file
125
+ end
126
+ end
127
+
128
+ # Count the token for the passed line.
129
+ def count_token(line_no,token)
130
+ case token
131
+ when TkSPACE, TkNL, TkRD_COMMENT
132
+ # Do not count these as tokens
133
+ when TkCOMMENT
134
+ # Ignore this only for comments in a statement?
135
+ # Ignore TkCOLON,TkCOLON2 and operators? like "." etc..
136
+ when TkRBRACK, TkRPAREN, TkRBRACE
137
+ # Ignore the closing of an array/index/hash/paren
138
+ # The opening is counted, but no more.
139
+ # Thus [], () {} is counted as 1 token not 2.
140
+ else
141
+ # may want to filter out comments...
142
+ @tokens_per_line[line_no] += 1
143
+ end
144
+ end
145
+
146
+ end
147
+
148
+ # Main class and structure used to compute the
149
+ # cyclomatic complexity of Ruby programs.
150
+ class ParseState
151
+ include RubyToken
152
+ attr_accessor :name, :children, :complexity, :parent, :lines
153
+
154
+ @@top_state = nil
155
+ def ParseState.make_top_state()
156
+ @@top_state = ParseState.new(nil)
157
+ @@top_state.name = "__top__"
158
+ @@top_state
159
+ end
160
+
161
+ @@token_counter = TokenCounter.new
162
+ def ParseState.set_token_counter(counter)
163
+ @@token_counter = counter
164
+ end
165
+ def ParseState.get_token_counter
166
+ @@token_counter
167
+ end
168
+
169
+ def initialize(lexer,parent=nil)
170
+ @name = ""
171
+ @children = Array.new
172
+ @complexity = 0
173
+ @parent = parent
174
+ @lexer = lexer
175
+ @run = true
176
+ # To catch one-line def statements, we always have one line.
177
+ @lines = 0
178
+ @last_token_line_and_char = Array.new
179
+ end
180
+
181
+ def top_state?
182
+ self == @@top_state
183
+ end
184
+
185
+ def lexer=(lexer)
186
+ @run = true
187
+ @lexer = lexer
188
+ end
189
+
190
+ def make_state(type,parent = nil)
191
+ cstate = type.new(@lexer,self)
192
+ parent.children<< cstate
193
+ cstate
194
+ end
195
+
196
+ def calc_complexity
197
+ complexity = @complexity
198
+ children.each do |child|
199
+ complexity += child.calc_complexity
200
+ end
201
+ complexity
202
+ end
203
+
204
+ def calc_lines
205
+ lines = @lines
206
+ children.each do |child|
207
+ lines += child.calc_lines
208
+ end
209
+ lines
210
+ end
211
+
212
+ def compute_state(formater)
213
+ if top_state?
214
+ compute_state_for_global(formater)
215
+ end
216
+
217
+ @children.each do |s|
218
+ s.compute_state(formater)
219
+ end
220
+ end
221
+
222
+ def compute_state_for_global(formater)
223
+ global_def, @children = @children.partition do |s|
224
+ !s.kind_of?(ParseClass)
225
+ end
226
+ return if global_def.empty?
227
+ gx = global_def.inject(0) { |c,s| c + s.calc_complexity }
228
+ gl = global_def.inject(0) { |c,s| c + s.calc_lines }
229
+ formater.start_class_compute_state("Global", "", gx, gl)
230
+ global_def.each do |s|
231
+ s.compute_state(formater)
232
+ end
233
+ formater.end_class_compute_state("")
234
+ end
235
+
236
+ # Count the tokens parsed if true else ignore them.
237
+ def count_tokens?
238
+ true
239
+ end
240
+
241
+ def parse
242
+ while @run do
243
+ tok = @lexer.token
244
+ @run = false if tok.nil?
245
+ if lexer_loop?(tok)
246
+ STDERR.puts "Lexer loop at line : #{@lexer.line_no} char #{@lexer.char_no}."
247
+ @run = false
248
+ end
249
+ @last_token_line_and_char<< [@lexer.line_no.to_i, @lexer.char_no.to_i, tok]
250
+ if $VERBOSE
251
+ puts "DEBUG: #{@lexer.line_no} #{tok.class}:#{tok.name if tok.respond_to?(:name)}"
252
+ end
253
+ @@token_counter.count_token(@lexer.line_no, tok) if count_tokens?
254
+ parse_token(tok)
255
+ end
256
+ end
257
+
258
+ # Ruby-Lexer can go into a loop if the file does not end with a newline.
259
+ def lexer_loop?(token)
260
+ return false if @last_token_line_and_char.empty?
261
+ loop_flag = false
262
+ last = @last_token_line_and_char.last
263
+ line = last[0]
264
+ char = last[1]
265
+ ltok = last[2]
266
+
267
+ if ( (line == @lexer.line_no.to_i) &&
268
+ (char == @lexer.char_no.to_i) &&
269
+ (ltok.class == token.class) )
270
+ # We are potentially in a loop
271
+ if @last_token_line_and_char.size >= 3
272
+ loop_flag = true
273
+ end
274
+ else
275
+ # Not in a loop so clear stack
276
+ @last_token_line_and_char = Array.new
277
+ end
278
+
279
+ loop_flag
280
+ end
281
+
282
+ def do_begin_token(token)
283
+ make_state(EndableParseState, self)
284
+ end
285
+
286
+ def do_class_token(token)
287
+ make_state(ParseClass,self)
288
+ end
289
+
290
+ def do_module_token(token)
291
+ make_state(ParseModule,self)
292
+ end
293
+
294
+ def do_def_token(token)
295
+ make_state(ParseDef,self)
296
+ end
297
+
298
+ def do_constant_token(token)
299
+ nil
300
+ end
301
+
302
+ def do_identifier_token(token)
303
+ if (token.name == "__END__" && token.char_no.to_i == 0)
304
+ # The Ruby code has stopped and the rest is data so cease parsing.
305
+ @run = false
306
+ end
307
+ nil
308
+ end
309
+
310
+ def do_right_brace_token(token)
311
+ nil
312
+ end
313
+
314
+ def do_end_token(token)
315
+ end_debug
316
+ nil
317
+ end
318
+
319
+ def do_block_token(token)
320
+ make_state(ParseBlock,self)
321
+ end
322
+
323
+ def do_conditional_token(token)
324
+ make_state(ParseCond,self)
325
+ end
326
+
327
+ def do_conditional_do_control_token(token)
328
+ make_state(ParseDoCond,self)
329
+ end
330
+
331
+ def do_case_token(token)
332
+ make_state(EndableParseState, self)
333
+ end
334
+
335
+ def do_one_line_conditional_token(token)
336
+ # This is an if with no end
337
+ @complexity += 1
338
+ #STDOUT.puts "got IF_MOD: #{self.to_yaml}" if $VERBOSE
339
+ #if state.type != "class" && state.type != "def" && state.type != "cond"
340
+ #STDOUT.puts "Changing IF_MOD Parent" if $VERBOSE
341
+ #state = state.parent
342
+ #@run = false
343
+ nil
344
+ end
345
+
346
+ def do_else_token(token)
347
+ STDOUT.puts "Ignored/Unknown Token:#{token.class}" if $VERBOSE
348
+ nil
349
+ end
350
+
351
+ def do_comment_token(token)
352
+ make_state(ParseComment, self)
353
+ end
354
+
355
+ def do_symbol_token(token)
356
+ make_state(ParseSymbol, self)
357
+ end
358
+
359
+ def parse_token(token)
360
+ state = nil
361
+ case token
362
+ when TkCLASS
363
+ state = do_class_token(token)
364
+ when TkMODULE
365
+ state = do_module_token(token)
366
+ when TkDEF
367
+ state = do_def_token(token)
368
+ when TkCONSTANT
369
+ # Nothing to do with a constant at top level?
370
+ state = do_constant_token(token)
371
+ when TkIDENTIFIER,TkFID
372
+ # Nothing to do at top level?
373
+ state = do_identifier_token(token)
374
+ when TkRBRACE
375
+ # Nothing to do at top level
376
+ state = do_right_brace_token(token)
377
+ when TkEND
378
+ state = do_end_token(token)
379
+ # At top level this might be an error...
380
+ when TkDO,TkfLBRACE
381
+ state = do_block_token(token)
382
+ when TkIF,TkUNLESS
383
+ state = do_conditional_token(token)
384
+ when TkWHILE,TkUNTIL,TkFOR
385
+ state = do_conditional_do_control_token(token)
386
+ when TkELSIF #,TkELSE
387
+ @complexity += 1
388
+ when TkELSE
389
+ # Else does not increase complexity
390
+ when TkCASE
391
+ state = do_case_token(token)
392
+ when TkWHEN
393
+ @complexity += 1
394
+ when TkBEGIN
395
+ state = do_begin_token(token)
396
+ when TkRESCUE
397
+ # Maybe this should add complexity and not begin
398
+ @complexity += 1
399
+ when TkIF_MOD, TkUNLESS_MOD, TkUNTIL_MOD, TkWHILE_MOD, TkQUESTION
400
+ state = do_one_line_conditional_token(token)
401
+ when TkNL
402
+ #
403
+ @lines += 1
404
+ when TkRETURN
405
+ # Early returns do not increase complexity as the condition that
406
+ # calls the return is the one that increases it.
407
+ when TkCOMMENT
408
+ state = do_comment_token(token)
409
+ when TkSYMBEG
410
+ state = do_symbol_token(token)
411
+ when TkError
412
+ STDOUT.puts "Lexer received an error for line #{@lexer.line_no} char #{@lexer.char_no}"
413
+ else
414
+ state = do_else_token(token)
415
+ end
416
+ state.parse if state
417
+ end
418
+
419
+ def end_debug
420
+ STDOUT.puts "got an end: #{@name} in #{self.class.name}" if $VERBOSE
421
+ if @parent.nil?
422
+ STDOUT.puts "DEBUG: Line #{@lexer.line_no}"
423
+ STDOUT.puts "DEBUG: #{@name}; #{self.class}"
424
+ # to_yaml can cause an infinite loop?
425
+ #STDOUT.puts "TOP: #{@@top_state.to_yaml}"
426
+ #STDOUT.puts "TOP: #{@@top_state.inspect}"
427
+
428
+ # This may not be an error?
429
+ #exit 1
430
+ end
431
+ end
432
+
433
+ end
434
+
435
+ # Read and consume tokens in comments until a new line.
436
+ class ParseComment < ParseState
437
+
438
+ # While in a comment state do not count the tokens.
439
+ def count_tokens?
440
+ false
441
+ end
442
+
443
+ def parse_token(token)
444
+ if token.is_a?(TkNL)
445
+ @lines += 1
446
+ @run = false
447
+ end
448
+ end
449
+ end
450
+
451
+ class ParseSymbol < ParseState
452
+ def initialize(lexer, parent = nil)
453
+ super
454
+ STDOUT.puts "STARTING SYMBOL" if $VERBOSE
455
+ end
456
+
457
+ def parse_token(token)
458
+ STDOUT.puts "Symbol's token is #{token.class}" if $VERBOSE
459
+ # Consume the next token and stop
460
+ @run = false
461
+ nil
462
+ end
463
+ end
464
+
465
+ class EndableParseState < ParseState
466
+ def initialize(lexer,parent=nil)
467
+ super(lexer,parent)
468
+ STDOUT.puts "Starting #{self.class}" if $VERBOSE
469
+ end
470
+
471
+ def do_end_token(token)
472
+ end_debug
473
+ @run = false
474
+ nil
475
+ end
476
+ end
477
+
478
+ class ParseClass < EndableParseState
479
+ def initialize(lexer,parent=nil)
480
+ super(lexer,parent)
481
+ @type_name = "Class"
482
+ end
483
+
484
+ def do_constant_token(token)
485
+ @name = token.name if @name.empty?
486
+ nil
487
+ end
488
+
489
+ def compute_state(formater)
490
+ # Separate the Module and Class children out
491
+ cnm_children, @children = @children.partition do |child|
492
+ child.kind_of?(ParseClass)
493
+ end
494
+
495
+ formater.start_class_compute_state(@type_name,@name,self.calc_complexity,self.calc_lines)
496
+ super(formater)
497
+ formater.end_class_compute_state(@name)
498
+
499
+ cnm_children.each do |child|
500
+ child.name = @name + "::" + child.name
501
+ child.compute_state(formater)
502
+ end
503
+ end
504
+ end
505
+
506
+ class ParseModule < ParseClass
507
+ def initialize(lexer,parent=nil)
508
+ super(lexer,parent)
509
+ @type_name = "Module"
510
+ end
511
+ end
512
+
513
+ class ParseDef < EndableParseState
514
+
515
+ def initialize(lexer,parent=nil)
516
+ super(lexer,parent)
517
+ @complexity = 1
518
+ @looking_for_name = true
519
+ @first_space = true
520
+ end
521
+
522
+ # This way I don't need to list all possible overload
523
+ # tokens.
524
+ def create_def_name(token)
525
+ case token
526
+ when TkSPACE
527
+ # mark first space so we can stop at next space
528
+ if @first_space
529
+ @first_space = false
530
+ else
531
+ @looking_for_name = false
532
+ end
533
+ when TkNL,TkLPAREN,TkfLPAREN,TkSEMICOLON
534
+ # we can also stop at a new line or left parenthesis
535
+ @looking_for_name = false
536
+ when TkDOT
537
+ @name<< "."
538
+ when TkCOLON2
539
+ @name<< "::"
540
+ when TkASSIGN
541
+ @name<< "="
542
+ when TkfLBRACK
543
+ @name<< "["
544
+ when TkRBRACK
545
+ @name<< "]"
546
+ else
547
+ begin
548
+ @name<< token.name.to_s
549
+ rescue Exception => err
550
+ #what is this?
551
+ STDOUT.puts @@token_counter.current_file
552
+ STDOUT.puts @name
553
+ STDOUT.puts token.inspect
554
+ STDOUT.puts err.message
555
+ exit 1
556
+ end
557
+ end
558
+ end
559
+
560
+ def parse_token(token)
561
+ if @looking_for_name
562
+ create_def_name(token)
563
+ end
564
+ super(token)
565
+ end
566
+
567
+ def compute_state(formater)
568
+ formater.def_compute_state(@name, self.calc_complexity, self.calc_lines)
569
+ super(formater)
570
+ end
571
+ end
572
+
573
+ class ParseCond < EndableParseState
574
+ def initialize(lexer,parent=nil)
575
+ super(lexer,parent)
576
+ @complexity = 1
577
+ end
578
+ end
579
+
580
+ class ParseDoCond < ParseCond
581
+ def initialize(lexer,parent=nil)
582
+ super(lexer,parent)
583
+ @looking_for_new_line = true
584
+ end
585
+
586
+ # Need to consume the do that can appear at the
587
+ # end of these control structures.
588
+ def parse_token(token)
589
+ if @looking_for_new_line
590
+ if token.is_a?(TkDO)
591
+ nil
592
+ else
593
+ if token.is_a?(TkNL)
594
+ @looking_for_new_line = false
595
+ end
596
+ super(token)
597
+ end
598
+ else
599
+ super(token)
600
+ end
601
+ end
602
+
603
+ end
604
+
605
+ class ParseBlock < EndableParseState
606
+
607
+ def initialize(lexer,parent=nil)
608
+ super(lexer,parent)
609
+ @complexity = 1
610
+ @lbraces = Array.new
611
+ end
612
+
613
+ # Because the token for a block and hash right brace is the same,
614
+ # we need to track the hash left braces to determine when an end is
615
+ # encountered.
616
+ def parse_token(token)
617
+ if token.is_a?(TkLBRACE)
618
+ @lbraces.push(true)
619
+ elsif token.is_a?(TkRBRACE)
620
+ if @lbraces.empty?
621
+ do_right_brace_token(token)
622
+ #do_end_token(token)
623
+ else
624
+ @lbraces.pop
625
+ end
626
+ else
627
+ super(token)
628
+ end
629
+ end
630
+
631
+ def do_right_brace_token(token)
632
+ # we are done ? what about a hash in a block :-/
633
+ @run = false
634
+ nil
635
+ end
636
+
637
+ end
638
+
639
+ # ------------ END Analyzer logic ------------------------------------
640
+
641
+ class Filter
642
+ attr_accessor :limit, :error, :warn
643
+
644
+ def initialize(limit = -1, error = 11, warn = 8)
645
+ @limit = limit
646
+ @error = error
647
+ @warn = warn
648
+ end
649
+
650
+ def ignore?(count)
651
+ count < @limit
652
+ end
653
+
654
+ def warn?(count)
655
+ count >= @warn
656
+ end
657
+
658
+ def error?(count)
659
+ count >= @error
660
+ end
661
+
662
+ end
663
+
664
+
665
+ class BaseFormater
666
+ attr_accessor :warnings, :errors, :current
667
+
668
+ def initialize(out, filter = nil)
669
+ @out = out
670
+ @filter = filter
671
+ reset_data
672
+ end
673
+
674
+ def warn_error?(num, marker)
675
+ klass = ""
676
+
677
+ if @filter.error?(num)
678
+ klass = ' class="error"'
679
+ @errors<< [@current, marker, num]
680
+ elsif @filter.warn?(num)
681
+ klass = ' class="warning"'
682
+ @warnings<< [@current, marker, num]
683
+ end
684
+
685
+ klass
686
+ end
687
+
688
+ def reset_data
689
+ @warnings = Array.new
690
+ @errors = Array.new
691
+ @current = ""
692
+ end
693
+
694
+ end
695
+
696
+ class TokenCounterFormater < BaseFormater
697
+
698
+ def start(new_out=nil)
699
+ reset_data
700
+ @out = new_out if new_out
701
+ @out.puts "Token Count"
702
+ end
703
+
704
+ def start_count(number_of_files)
705
+ @out.puts "Counting tokens for #{number_of_files} files."
706
+ end
707
+
708
+ def start_file(file_name)
709
+ @current = file_name
710
+ @out.puts "File:#{file_name}"
711
+ end
712
+
713
+ def line_token_count(line_number,number_of_tokens)
714
+ return if @filter.ignore?(number_of_tokens)
715
+ warn_error?(number_of_tokens, line_number)
716
+ @out.puts "Line:#{line_number} ; Tokens : #{number_of_tokens}"
717
+ end
718
+
719
+ def end_file
720
+ @out.puts ""
721
+ end
722
+
723
+ def end_count
724
+ end
725
+
726
+ def end
727
+ end
728
+
729
+ end
730
+
731
+ module HTMLStyleSheet
732
+ def HTMLStyleSheet.style_sheet
733
+ out = StringIO.new
734
+
735
+ out.puts "<style>"
736
+ out.puts 'body {'
737
+ out.puts ' margin: 20px;'
738
+ out.puts ' padding: 0;'
739
+ out.puts ' font-size: 12px;'
740
+ out.puts ' font-family: bitstream vera sans, verdana, arial, sans serif;'
741
+ out.puts ' background-color: #efefef;'
742
+ out.puts '}'
743
+ out.puts ''
744
+ out.puts 'table { '
745
+ out.puts ' border-collapse: collapse;'
746
+ out.puts ' /*border-spacing: 0;*/'
747
+ out.puts ' border: 1px solid #666;'
748
+ out.puts ' background-color: #fff;'
749
+ out.puts ' margin-bottom: 20px;'
750
+ out.puts '}'
751
+ out.puts ''
752
+ out.puts 'table, th, th+th, td, td+td {'
753
+ out.puts ' border: 1px solid #ccc;'
754
+ out.puts '}'
755
+ out.puts ''
756
+ out.puts 'table th {'
757
+ out.puts ' font-size: 12px;'
758
+ out.puts ' color: #fc0;'
759
+ out.puts ' padding: 4px 0;'
760
+ out.puts ' background-color: #336;'
761
+ out.puts '}'
762
+ out.puts ''
763
+ out.puts 'th, td {'
764
+ out.puts ' padding: 4px 10px;'
765
+ out.puts '}'
766
+ out.puts ''
767
+ out.puts 'td { '
768
+ out.puts ' font-size: 13px;'
769
+ out.puts '}'
770
+ out.puts ''
771
+ out.puts '.class_name {'
772
+ out.puts ' font-size: 17px;'
773
+ out.puts ' margin: 20px 0 0;'
774
+ out.puts '}'
775
+ out.puts ''
776
+ out.puts '.class_complexity {'
777
+ out.puts 'margin: 0 auto;'
778
+ out.puts '}'
779
+ out.puts ''
780
+ out.puts '.class_complexity>.class_complexity {'
781
+ out.puts ' margin: 0;'
782
+ out.puts '}'
783
+ out.puts ''
784
+ out.puts '.class_total_complexity, .class_total_lines, .start_token_count, .file_count {'
785
+ out.puts ' font-size: 13px;'
786
+ out.puts ' font-weight: bold;'
787
+ out.puts '}'
788
+ out.puts ''
789
+ out.puts '.class_total_complexity, .class_total_lines {'
790
+ out.puts ' color: #c00;'
791
+ out.puts '}'
792
+ out.puts ''
793
+ out.puts '.start_token_count, .file_count {'
794
+ out.puts ' color: #333;'
795
+ out.puts '}'
796
+ out.puts ''
797
+ out.puts '.warning {'
798
+ out.puts ' background-color: yellow;'
799
+ out.puts '}'
800
+ out.puts ''
801
+ out.puts '.error {'
802
+ out.puts ' background-color: #f00;'
803
+ out.puts '}'
804
+ out.puts "</style>"
805
+
806
+ out.string
807
+ end
808
+
809
+ def style_sheet
810
+ HTMLStyleSheet.style_sheet
811
+ end
812
+ end
813
+
814
+
815
+ class HTMLTokenCounterFormater < TokenCounterFormater
816
+ include HTMLStyleSheet
817
+
818
+ def start(new_out=nil)
819
+ reset_data
820
+ @out = new_out if new_out
821
+ @out.puts "<html>"
822
+ @out.puts style_sheet
823
+ @out.puts "<body>"
824
+ end
825
+
826
+ def start_count(number_of_files)
827
+ @out.puts "<div class=\"start_token_count\">"
828
+ @out.puts "Number of files: #{number_of_files}"
829
+ @out.puts "</div>"
830
+ end
831
+
832
+ def start_file(file_name)
833
+ @current = file_name
834
+ @out.puts "<div class=\"file_count\">"
835
+ @out.puts "<p class=\"file_name\">"
836
+ @out.puts "File: #{file_name}"
837
+ @out.puts "</p>"
838
+ @out.puts "<table width=\"100%\" border=\"1\">"
839
+ @out.puts "<tr><th>Line</th><th>Tokens</th></tr>"
840
+ end
841
+
842
+ def line_token_count(line_number,number_of_tokens)
843
+ return if @filter.ignore?(number_of_tokens)
844
+ klass = warn_error?(number_of_tokens, line_number)
845
+ @out.puts "<tr><td>#{line_number}</td><td#{klass}>#{number_of_tokens}</td></tr>"
846
+ end
847
+
848
+ def end_file
849
+ @out.puts "</table>"
850
+ end
851
+
852
+ def end_count
853
+ end
854
+
855
+ def end
856
+ @out.puts "</body>"
857
+ @out.puts "</html>"
858
+ end
859
+ end
860
+
861
+ class ParseStateFormater < BaseFormater
862
+
863
+ def start(new_out=nil)
864
+ reset_data
865
+ @out = new_out if new_out
866
+ end
867
+
868
+ def end
869
+ end
870
+
871
+ def start_class_compute_state(type_name,name,complexity,lines)
872
+ @current = name
873
+ @out.puts "-- START #{name} --"
874
+ @out.puts "Type:#{type_name} Name:#{name} Complexity:#{complexity} Lines:#{lines}"
875
+ end
876
+
877
+ def end_class_compute_state(name)
878
+ @out.puts "-- END #{name} --"
879
+ end
880
+
881
+ def def_compute_state(name,complexity,lines)
882
+ return if @filter.ignore?(complexity)
883
+ warn_error?(complexity, name)
884
+ @out.puts "Type:Def Name:#{name} Complexity:#{complexity} Lines:#{lines}"
885
+ end
886
+
887
+ end
888
+
889
+
890
+
891
+ class StateHTMLComplexityFormater < ParseStateFormater
892
+ include HTMLStyleSheet
893
+
894
+ def start(new_out=nil)
895
+ reset_data
896
+ @out = new_out if new_out
897
+ @out.puts "<html><head><title>Cyclometric Complexity</title></head>"
898
+ @out.puts style_sheet
899
+ @out.puts "<body>"
900
+ end
901
+
902
+ def end
903
+ @out.puts "</body>"
904
+ @out.puts "</html>"
905
+ end
906
+
907
+ def start_class_compute_state(type_name,name,complexity,lines)
908
+ @current = name
909
+ @out.puts "<div class=\"class_complexity\">"
910
+ @out.puts "<h2 class=\"class_name\">#{type_name} : #{name}</h2>"
911
+ @out.puts "<div class=\"class_total_complexity\">Total Complexity: #{complexity}</div>"
912
+ @out.puts "<div class=\"class_total_lines\">Total Lines: #{lines}</div>"
913
+ @out.puts "<table width=\"100%\" border=\"1\">"
914
+ @out.puts "<tr><th>Method</th><th>Complexity</th><th># Lines</th></tr>"
915
+ end
916
+
917
+ def end_class_compute_state(name)
918
+ @out.puts "</table>"
919
+ @out.puts "</div>"
920
+ end
921
+
922
+ def def_compute_state(name, complexity, lines)
923
+ return if @filter.ignore?(complexity)
924
+ klass = warn_error?(complexity, name)
925
+ @out.puts "<tr><td>#{name}</td><td#{klass}>#{complexity}</td><td>#{lines}</td></tr>"
926
+ end
927
+
928
+ end
929
+
930
+
931
+ module ResultIndexGenerator
932
+ def summarize_errors_and_warnings(enw, header)
933
+ return "" if enw.empty?
934
+ f = StringIO.new
935
+ erval = Hash.new { |h,k| h[k] = Array.new }
936
+ wval = Hash.new { |h,k| h[k] = Array.new }
937
+
938
+ enw.each do |fname, warnings, errors|
939
+ errors.each do |c,m,v|
940
+ erval[v] << [fname, c, m]
941
+ end
942
+ warnings.each do |c,m,v|
943
+ wval[v] << [fname, c, m]
944
+ end
945
+ end
946
+
947
+ f.puts "<h2 class=\"class_name\">Errors and Warnings</h2>"
948
+ f.puts "<table width=\"100%\" border=\"1\">"
949
+ f.puts header
950
+
951
+ f.puts print_summary_table_rows(erval, "error")
952
+ f.puts print_summary_table_rows(wval, "warning")
953
+ f.puts "</table>"
954
+
955
+ f.string
956
+ end
957
+
958
+ def print_summary_table_rows(ewvals, klass_type)
959
+ f = StringIO.new
960
+ ewvals.sort { |a,b| b <=> a}.each do |v, vals|
961
+ vals.sort.each do |fname, c, m|
962
+ f.puts "<tr><td><a href=\"./#{fname}\">#{c}</a></td><td>#{m}</td>"
963
+ f.puts "<td class=\"#{klass_type}\">#{v}</td></tr>"
964
+ end
965
+ end
966
+ f.string
967
+ end
968
+
969
+ def list_analyzed_files(files)
970
+ f = StringIO.new
971
+ f.puts "<h2 class=\"class_name\">Analyzed Files</h2>"
972
+ f.puts "<ul>"
973
+ files.each do |fname, warnings, errors|
974
+ readname = fname.split("_")[0...-1].join("_")
975
+ f.puts "<li>"
976
+ f.puts "<p class=\"file_name\"><a href=\"./#{fname}\">#{readname}</a>"
977
+ f.puts "</li>"
978
+ end
979
+ f.puts "</ul>"
980
+ f.string
981
+ end
982
+
983
+ def write_index(files, filename, title, header)
984
+ return if files.empty?
985
+
986
+ File.open(filename,"w") do |f|
987
+ f.puts "<html><head><title>#{title}</title></head>"
988
+ f.puts "#{HTMLStyleSheet.style_sheet}\n<body>"
989
+ f.puts "<h1>#{title}</h1>"
990
+
991
+ enw = files.find_all { |fn,w,e| (!w.empty? || !e.empty?) }
992
+
993
+ f.puts summarize_errors_and_warnings(enw, header)
994
+
995
+ f.puts "<hr/>"
996
+ f.puts list_analyzed_files(files)
997
+ f.puts "</body></html>"
998
+ end
999
+ end
1000
+
1001
+ def write_cyclo_index(files, output_dir)
1002
+ header = "<tr><th>Class</th><th>Method</th><th>Complexity</th></tr>"
1003
+ write_index(files,
1004
+ "#{output_dir}/index_cyclo.html",
1005
+ "Index for cyclomatic complexity",
1006
+ header)
1007
+ end
1008
+
1009
+ def write_token_index(files, output_dir)
1010
+ header = "<tr><th>File</th><th>Line #</th><th>Tokens</th></tr>"
1011
+ write_index(files,
1012
+ "#{output_dir}/index_token.html",
1013
+ "Index for tokens per line",
1014
+ header)
1015
+ end
1016
+
1017
+ end
1018
+
1019
+ module Saikuro
1020
+
1021
+ # Returns the path without the file name.
1022
+ def Saikuro.seperate_file_from_path(path)
1023
+ res = path.split("/")
1024
+ if res.size == 1
1025
+ ""
1026
+ else
1027
+ res[0..res.size - 2].join("/")
1028
+ end
1029
+ end
1030
+
1031
+ def Saikuro.analyze(files, state_formater, token_count_formater, output_dir)
1032
+
1033
+ idx_states = Array.new
1034
+ idx_tokens = Array.new
1035
+
1036
+ # parse each file
1037
+ files.each do |file|
1038
+ begin
1039
+ STDOUT.puts "Parsing #{file}"
1040
+ # create top state
1041
+ top = ParseState.make_top_state
1042
+ STDOUT.puts "TOP State made" if $VERBOSE
1043
+ token_counter = TokenCounter.new
1044
+ ParseState.set_token_counter(token_counter)
1045
+ token_counter.set_current_file(file)
1046
+
1047
+ STDOUT.puts "Setting up Lexer" if $VERBOSE
1048
+ lexer = RubyLex.new
1049
+ # Turn this off, because otherwise it aborts when a syntax error is found...
1050
+ lexer.exception_on_syntax_error = false
1051
+ lexer.set_input(File.new(file,"r"))
1052
+ top.lexer = lexer
1053
+ STDOUT.puts "Parsing" if $VERBOSE
1054
+ top.parse
1055
+
1056
+
1057
+ fdir_path = seperate_file_from_path(file)
1058
+ FileUtils.makedirs("#{output_dir}/#{fdir_path}")
1059
+
1060
+ if state_formater
1061
+ # output results
1062
+ state_io = StringIO.new
1063
+ state_formater.start(state_io)
1064
+ top.compute_state(state_formater)
1065
+ state_formater.end
1066
+
1067
+ fname = "#{file}_cyclo.html"
1068
+ puts "writing cyclomatic #{file}" if $VERBOSE
1069
+ File.open("#{output_dir}/#{fname}","w") do |f|
1070
+ f.write state_io.string
1071
+ end
1072
+ idx_states<< [
1073
+ fname,
1074
+ state_formater.warnings.dup,
1075
+ state_formater.errors.dup,
1076
+ ]
1077
+ end
1078
+
1079
+ if token_count_formater
1080
+ token_io = StringIO.new
1081
+ token_count_formater.start(token_io)
1082
+ token_counter.list_tokens_per_line(token_count_formater)
1083
+ token_count_formater.end
1084
+
1085
+ fname = "#{file}_token.html"
1086
+ puts "writing token #{file}" if $VERBOSE
1087
+ File.open("#{output_dir}/#{fname}","w") do |f|
1088
+ f.write token_io.string
1089
+ end
1090
+ idx_tokens<< [
1091
+ fname,
1092
+ token_count_formater.warnings.dup,
1093
+ token_count_formater.errors.dup,
1094
+ ]
1095
+ end
1096
+
1097
+ rescue RubyLex::SyntaxError => synerr
1098
+ STDOUT.puts "Lexer error for file #{file} on line #{lexer.line_no}"
1099
+ STDOUT.puts "#{synerr.class.name} : #{synerr.message}"
1100
+ rescue StandardError => err
1101
+ STDOUT.puts "Error while parsing file : #{file}"
1102
+ STDOUT.puts err.class,err.message,err.backtrace.join("\n")
1103
+ rescue Exception => ex
1104
+ STDOUT.puts "Error while parsing file : #{file}"
1105
+ STDOUT.puts ex.class,ex.message,ex.backtrace.join("\n")
1106
+ end
1107
+ end
1108
+
1109
+ [idx_states, idx_tokens]
1110
+ end
1111
+ end
1112
+
1113
+
1114
+ # Really ugly command line runner stuff here for now
1115
+
1116
+ class SaikuroCMDLineRunner
1117
+ require 'stringio'
1118
+ require 'getoptlong'
1119
+ require 'fileutils'
1120
+ require 'find'
1121
+
1122
+ # modification to RDoc.usage that allows main_program_file to be set
1123
+ # for RDoc.usage
1124
+ require 'saikuro/usage'
1125
+ RDoc::main_program_file = __FILE__
1126
+
1127
+ include ResultIndexGenerator
1128
+
1129
+ def get_ruby_files(path)
1130
+ files = Array.new
1131
+ Find.find(path) do |f|
1132
+ if !FileTest.directory?(f)
1133
+ if f =~ /rb$/
1134
+ files<< f
1135
+ end
1136
+ end
1137
+ end
1138
+ files
1139
+ end
1140
+
1141
+ def run
1142
+ files = Array.new
1143
+ output_dir = "./"
1144
+ formater = "html"
1145
+ state_filter = Filter.new(5)
1146
+ token_filter = Filter.new(10, 25, 50)
1147
+ comp_state = comp_token = false
1148
+ begin
1149
+ opt = GetoptLong.new(
1150
+ ["-o","--output_directory", GetoptLong::REQUIRED_ARGUMENT],
1151
+ ["-h","--help", GetoptLong::NO_ARGUMENT],
1152
+ ["-f","--formater", GetoptLong::REQUIRED_ARGUMENT],
1153
+ ["-c","--cyclo", GetoptLong::NO_ARGUMENT],
1154
+ ["-t","--token", GetoptLong::NO_ARGUMENT],
1155
+ ["-y","--filter_cyclo", GetoptLong::REQUIRED_ARGUMENT],
1156
+ ["-k","--filter_token", GetoptLong::REQUIRED_ARGUMENT],
1157
+ ["-w","--warn_cyclo", GetoptLong::REQUIRED_ARGUMENT],
1158
+ ["-s","--warn_token", GetoptLong::REQUIRED_ARGUMENT],
1159
+ ["-e","--error_cyclo", GetoptLong::REQUIRED_ARGUMENT],
1160
+ ["-d","--error_token", GetoptLong::REQUIRED_ARGUMENT],
1161
+ ["-p","--parse_file", GetoptLong::REQUIRED_ARGUMENT],
1162
+ ["-i","--input_directory", GetoptLong::REQUIRED_ARGUMENT],
1163
+ ["-v","--verbose", GetoptLong::NO_ARGUMENT]
1164
+ )
1165
+
1166
+ opt.each do |arg,val|
1167
+ case arg
1168
+ when "-o"
1169
+ output_dir = val
1170
+ when "-h"
1171
+ RDoc.usage('help')
1172
+ when "-f"
1173
+ formater = val
1174
+ when "-c"
1175
+ comp_state = true
1176
+ when "-t"
1177
+ comp_token = true
1178
+ when "-k"
1179
+ token_filter.limit = val.to_i
1180
+ when "-s"
1181
+ token_filter.warn = val.to_i
1182
+ when "-d"
1183
+ token_filter.error = val.to_i
1184
+ when "-y"
1185
+ state_filter.limit = val.to_i
1186
+ when "-w"
1187
+ state_filter.warn = val.to_i
1188
+ when "-e"
1189
+ state_filter.error = val.to_i
1190
+ when "-p"
1191
+ files<< val
1192
+ when "-i"
1193
+ files.concat(get_ruby_files(val))
1194
+ when "-v"
1195
+ STDOUT.puts "Verbose mode on"
1196
+ $VERBOSE = true
1197
+ end
1198
+
1199
+ end
1200
+ RDoc.usage if !comp_state && !comp_token
1201
+ rescue => err
1202
+ RDoc.usage
1203
+ end
1204
+
1205
+ if formater =~ /html/i
1206
+ state_formater = StateHTMLComplexityFormater.new(STDOUT,state_filter)
1207
+ token_count_formater = HTMLTokenCounterFormater.new(STDOUT,token_filter)
1208
+ else
1209
+ state_formater = ParseStateFormater.new(STDOUT,state_filter)
1210
+ token_count_formater = TokenCounterFormater.new(STDOUT,token_filter)
1211
+ end
1212
+
1213
+ state_formater = nil if !comp_state
1214
+ token_count_formater = nil if !comp_token
1215
+
1216
+ idx_states, idx_tokens = Saikuro.analyze(files,
1217
+ state_formater,
1218
+ token_count_formater,
1219
+ output_dir)
1220
+
1221
+ write_cyclo_index(idx_states, output_dir)
1222
+ write_token_index(idx_tokens, output_dir)
1223
+ end
1224
+
1225
+ end