metric_fu-Saikuro 1.1.1.0 → 1.1.2

@@ -0,0 +1,7 @@
1
+ ---
2
+ SHA1:
3
+ metadata.gz: 83b81fc0c30d1d6afecf95dd70400a3d51bc0704
4
+ data.tar.gz: 54276fffd50853d8294e03447b85944be13014b0
5
+ SHA512:
6
+ metadata.gz: 0b8dc94b48469081a74eac8d486a22708d49a6cc658ae143d634ec3d4440ea522b2601bbdb54bb130d04a8f6b4c5044d3af93988bca7eb9dfce4c9bd9d17439e
7
+ data.tar.gz: 8d236ecefa996138fb04a17419529d136ddc953a0cb545d895e4e4a95c1ec6ebaa71b67c526e889f6ec6ae7a2cfcfa3192823ae70b7ea453b31e9cc920eb0a8b
data/README CHANGED
@@ -1,42 +1,25 @@
1
- The japgolly fork is a part of an attempt to get metric_fu working in a modern
2
- Ruby environment, specifically compatibility with Ruby 1.9 and Bundler.
1
+ # Saikuro:
3
2
 
4
- ===============================================================================
5
-
6
- Version 0.2
7
-
8
- Saikuro:
9
3
  Saikuro is a Ruby cyclomatic complexity analyzer. When given Ruby
10
4
  source code Saikuro will generate a report listing the cyclomatic
11
5
  complexity of each method found. In addition, Saikuro counts the
12
6
  number of lines per method and can generate a listing of the number of
13
7
  tokens on each line of code.
14
8
 
15
- License:
16
- Saikuro uses the BSD license.
17
-
18
- Installation:
19
- Option 1: Using setup.rb
20
- * login as root
21
- * run "ruby setup.rb all"
9
+ ## Installation:
22
10
 
23
- Option 2: The manual way
24
- Saikuro is a single Ruby file that is executable. You can run it where
25
- you unpacked it or you can move it your preferred location such as
26
- "/usr/local/bin" or "~/bin".
11
+ gem install metric_fu-Saikuro
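Since the changelog below notes that this fork targets Bundler compatibility, the gem can also be pulled in through a Gemfile; this is only a minimal sketch of the standard Bundler setup, not something mandated by the gem:

    # Gemfile (standard Bundler boilerplate, assumed rather than taken from the gem docs)
    source "https://rubygems.org"

    gem "metric_fu-Saikuro"

After `bundle install`, the saikuro command described below should be runnable with `bundle exec saikuro`.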
27
12
 
28
- Note:
29
- Ruby 1.8.5 has a bug in ri_options that will prevent Saikuro from
30
- running. If you are using 1.8.5 please apply this patch :
31
- http://www.ruby-lang.org/cgi-bin/cvsweb.cgi/ruby/lib/rdoc/ri/ri_options.rb.diff?r1=1.2.2.13;r2=1.2.2.14
13
+ ## Usage:
32
14
 
33
-
34
- Usage:
35
15
  Saikuro is a command line program.
36
- Running "saikuro -h" will output a usage statement describing all
16
+
17
+ Running `saikuro -h` will output a usage statement describing all
37
18
  the various arguments you can pass to it.
38
19
 
39
- "saikuro -c -p tests/samples.rb"
20
+ A simple example is:
21
+
22
+ saikuro -c -p examples/samples.rb
40
23
 
41
24
  The above command is a simple example that generates a cyclomatic
42
25
  complexity report on the samples.rb file, using the default filter,
@@ -44,10 +27,11 @@ warning and error settings. The report is saved in the current
44
27
  directory.
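For orientation, the ResultIndexGenerator code removed later in this diff writes summary index pages into the output directory, so a run like the one above should leave roughly the following alongside one report page per analyzed file:

    index_cyclo.html   # written by write_cyclo_index when -c is used
    index_token.html   # written by write_token_index, only when -t is also passed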
45
28
 
46
29
 
47
- A more detailed example is
48
- "saikuro -c -t -i tests -y 0 -w 11 -e 16 -o out/"
30
+ A more detailed example is:
31
+
32
+ saikuro -c -t -i examples -y 0 -w 11 -e 16 -o detailed/
49
33
 
50
- This will analyze all Ruby files found in the "tests/" directory.
34
+ This will analyze all Ruby files found in the "doc/examples/" directory.
51
35
  Saikuro will generate a token count report and a cyclomatic complexity
52
36
  report in the "out" directory. The "-y 0" command will turn off
53
37
  filtering and thus show the complexity of all methods. The "-w 11"
@@ -55,8 +39,15 @@ will mark all methods with a complexity of 11 or higher with a
55
39
  warning. Finally, "-e 16" will flag all methods with a complexity of
56
40
  16 or higher with an error.
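The short flags map to the long option names visible in the command-line handling removed later in this diff, so the same detailed run can be spelled out more explicitly; this is only a restatement of the command above, not additional behaviour:

    saikuro --cyclo --token --input_directory examples \
            --filter_cyclo 0 --warn_cyclo 11 --error_cyclo 16 \
            --output_directory detailed/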
57
41
 
42
+ ## Examples:
43
+
44
+ The doc/examples directory contains sample files covering the
45
+ various cases we examined, together with the expected
46
+ cyclomatic complexity result for each.
47
+
48
+ The results of running the above commands are also included.
58
49
 
59
- About Cyclomatic Complexity:
50
+ ## About Cyclomatic Complexity:
60
51
 
61
52
  The following document provides a very good and detailed description
62
53
  by the author of cyclomatic complexity.
@@ -71,8 +62,7 @@ http://hissa.nist.gov/HHRFdata/Artifacts/ITLdoc/235/title.htm
71
62
  PDF
72
63
  http://www.mccabe.com/iq_research_nist.htm
73
64
 
74
-
75
- How and what Saikuro counts to calculate the cyclomatic complexity:
65
+ ## How and what Saikuro counts to calculate the cyclomatic complexity:
76
66
 
77
67
  Saikuro uses the Simplified Complexity Calculation, which is just
78
68
  adding up the number of branch points in a method.
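As a rough illustration (the method below is invented for this example; the per-construct increments are inferred from the analyzer code shown later in this diff), Saikuro starts each def at a complexity of 1 and adds one for every branch point:

    # Hypothetical method, not from the gem:
    # the def starts at 1, the if adds 1, the elsif adds 1, the while adds 1,
    # and the else adds nothing, giving a total complexity of 4.
    def classify(n)
      if n < 0
        :negative
      elsif n.zero?
        :zero
      else
        halvings = 0
        while n > 1
          n /= 2
          halvings += 1
        end
        halvings
      end
    end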
@@ -97,23 +87,22 @@ In addition, the short-circuiting "and" operators (&& and "and")
97
87
  currently do not contribute to a method's complexity, although
98
88
  McCabe's paper listed above suggests doing so.
99
89
 
90
+ Example for "and" operator handling:
100
91
 
101
- #Example for "and" operator handling:
102
-
103
- # Starting values for case 1 and 2
104
- x = false
105
- y = 15
106
- r, q = nil
92
+ # Starting values for case 1 and 2
93
+ x = false
94
+ y = 15
95
+ r, q = nil
107
96
 
108
- # case 1
109
- puts "W" if ((r = x) && (q = y))
110
- puts r # => false
111
- puts q # => nil
97
+ # case 1
98
+ puts "W" if ((r = x) && (q = y))
99
+ puts r # => false
100
+ puts q # => nil
112
101
 
113
- # case 2
114
- puts "W" if ((q = y) && (r = x))
115
- puts r # => false
116
- puts q # => 15
102
+ # case 2
103
+ puts "W" if ((q = y) && (r = x))
104
+ puts r # => false
105
+ puts q # => 15
117
106
 
118
107
  Case 1 illustrates why "and" operators should add to a method's
119
108
  complexity, because the result of ( r = x ) is false the if statement
@@ -125,23 +114,32 @@ So why is it not added?
125
114
  Mainly, because we have not gotten around to it. We are wondering if
126
115
  this would increase the noise more than it should.
127
116
 
117
+ ## Changelog:
128
118
 
129
- Tests:
130
- In the test directory is a sample file that has examples of the
131
- various possible cases that we examined and documented the expected
132
- cyclomatic complexity result. If you find mistakes or missing tests
133
- please report them.
119
+ The japgolly fork is part of an effort to get metric_fu working in a modern
120
+ Ruby environment, specifically for compatibility with Ruby 1.9 and Bundler.
121
+
122
+ Based on Saikuro version 0.2.
123
+
124
+ ## Contact:
134
125
 
135
- Contact:
136
126
  Saikuro is written by
137
127
  Zev Blut (zb at ubit dot com)
138
128
 
139
- Acknowledgments:
129
+ If you find mistakes or missing examples, please report them in the issue tracker.
130
+
131
+ ## Acknowledgments:
132
+
140
133
  Thanks to Elbert Corpuz for writing the CSS for the HTML output!
141
134
 
142
- Other metric tools for Ruby:
135
+ ## Other metric tools for Ruby:
136
+
143
137
  Ryan Davis has an abc metric program as an example in his ParseTree
144
138
  product: http://www.zenspider.com/ZSS/Products/ParseTree/
145
139
 
146
140
  The PMD project has a tool called CPD that can scan Ruby source code
147
141
  looking for source duplication: http://pmd.sourceforge.net/
142
+
143
+ ## License:
144
+
145
+ Saikuro uses the BSD license.
@@ -1,52 +1,3 @@
1
- # $Id$
2
- # == Usage
3
- #
4
- # saikuro [ -h ] [-o output_directory] [-f type] [ -c, -t ]
5
- # [ -y, -w, -e, -k, -s, -d - number ] ( -p file | -i directory )
6
- #
7
- # == Help
8
- #
9
- # -o, --output_directory (directory) : A directory to ouput the results in.
10
- # The current directory is used if this option is not passed.
11
- #
12
- # -h, --help : This help message.
13
- #
14
- # -f, --formater (html | text) : The format to output the results in.
15
- # The default is html
16
- #
17
- # -c, --cyclo : Compute the cyclomatic complexity of the input.
18
- #
19
- # -t, --token : Count the number of tokens per line of the input.
20
- #
21
- # -y, --filter_cyclo (number) : Filter the output to only include methods
22
- # whose cyclomatic complexity are greater than the passed number.
23
- #
24
- # -w, --warn_cyclo (number) : Highlight with a warning methods whose
25
- # cyclomatic complexity are greather than or equal to the passed number.
26
- #
27
- #
28
- # -e, --error_cyclo (number) : Highligh with an error methods whose
29
- # cyclomatic complexity are greather than or equal to the passed number.
30
- #
31
- #
32
- # -k, --filter_token (number) : Filter the output to only include lines
33
- # whose token count are greater than the passed number.
34
- #
35
- #
36
- # -s, --warn_token (number) : Highlight with a warning lines whose
37
- # token count are greater than or equal to the passed number.
38
- #
39
- #
40
- # -d, --error_token (number) : Highlight with an error lines whose
41
- # token count are greater than or equal to the passed number.
42
- #
43
- #
44
- # -p, --parse_file (file) : A file to use as input.
45
- #
46
- # -i, --input_directory (directory) : All ruby files found recursively
47
- # inside the directory are passed as input.
48
-
49
-
50
1
  # Saikruo uses the BSD license.
51
2
  #
52
3
  # Copyright (c) 2005, Ubiquitous Business Technology (http://ubit.com)
@@ -93,928 +44,29 @@ require 'yaml'
93
44
  # once in def get the token after space, because it may also
94
45
  # be something like + or << for operator overloading.
95
46
 
96
- # Counts the number of tokens in each line.
97
- class TokenCounter
98
- include RubyToken
99
-
100
- attr_reader :current_file
101
-
102
- def initialize
103
- @files = Hash.new
104
- @tokens_per_line = Hash.new(0)
105
- @current_file = ""
106
- end
107
-
108
- # Mark file to associate with the token count.
109
- def set_current_file(file)
110
- @current_file = file
111
- @tokens_per_line = Hash.new(0)
112
- @files[@current_file] = @tokens_per_line
113
- end
114
-
115
- # Iterate through all tracked files, passing the
116
- # the provided formater the token counts.
117
- def list_tokens_per_line(formater)
118
- formater.start_count(@files.size)
119
- @files.each do |fname, tok_per_line|
120
- formater.start_file(fname)
121
- tok_per_line.sort.each do |line,num|
122
- formater.line_token_count(line,num)
123
- end
124
- formater.end_file
125
- end
126
- end
127
-
128
- # Count the token for the passed line.
129
- def count_token(line_no,token)
130
- case token
131
- when TkSPACE, TkNL, TkRD_COMMENT
132
- # Do not count these as tokens
133
- when TkCOMMENT
134
- # Ignore this only for comments in a statement?
135
- # Ignore TkCOLON,TkCOLON2 and operators? like "." etc..
136
- when TkRBRACK, TkRPAREN, TkRBRACE
137
- # Ignore the closing of an array/index/hash/paren
138
- # The opening is counted, but no more.
139
- # Thus [], () {} is counted as 1 token not 2.
140
- else
141
- # may want to filter out comments...
142
- @tokens_per_line[line_no] += 1
143
- end
144
- end
145
-
146
- end
147
-
148
- # Main class and structure used to compute the
149
- # cyclomatic complexity of Ruby programs.
150
- class ParseState
151
- include RubyToken
152
- attr_accessor :name, :children, :complexity, :parent, :lines
153
-
154
- @@top_state = nil
155
- def ParseState.make_top_state()
156
- @@top_state = ParseState.new(nil)
157
- @@top_state.name = "__top__"
158
- @@top_state
159
- end
160
-
161
- @@token_counter = TokenCounter.new
162
- def ParseState.set_token_counter(counter)
163
- @@token_counter = counter
164
- end
165
- def ParseState.get_token_counter
166
- @@token_counter
167
- end
168
-
169
- def initialize(lexer,parent=nil)
170
- @name = ""
171
- @children = Array.new
172
- @complexity = 0
173
- @parent = parent
174
- @lexer = lexer
175
- @run = true
176
- # To catch one line def statements, We always have one line.
177
- @lines = 0
178
- @last_token_line_and_char = Array.new
179
- end
180
-
181
- def top_state?
182
- self == @@top_state
183
- end
184
-
185
- def lexer=(lexer)
186
- @run = true
187
- @lexer = lexer
188
- end
189
-
190
- def make_state(type,parent = nil)
191
- cstate = type.new(@lexer,self)
192
- parent.children<< cstate
193
- cstate
194
- end
195
-
196
- def calc_complexity
197
- complexity = @complexity
198
- children.each do |child|
199
- complexity += child.calc_complexity
200
- end
201
- complexity
202
- end
203
-
204
- def calc_lines
205
- lines = @lines
206
- children.each do |child|
207
- lines += child.calc_lines
208
- end
209
- lines
210
- end
211
-
212
- def compute_state(formater)
213
- if top_state?
214
- compute_state_for_global(formater)
215
- end
216
-
217
- @children.each do |s|
218
- s.compute_state(formater)
219
- end
220
- end
221
-
222
- def compute_state_for_global(formater)
223
- global_def, @children = @children.partition do |s|
224
- !s.kind_of?(ParseClass)
225
- end
226
- return if global_def.empty?
227
- gx = global_def.inject(0) { |c,s| s.calc_complexity }
228
- gl = global_def.inject(0) { |c,s| s.calc_lines }
229
- formater.start_class_compute_state("Global", "", gx, gl)
230
- global_def.each do |s|
231
- s.compute_state(formater)
232
- end
233
- formater.end_class_compute_state("")
234
- end
235
-
236
- # Count the tokens parsed if true else ignore them.
237
- def count_tokens?
238
- true
239
- end
240
-
241
- def parse
242
- while @run do
243
- tok = @lexer.token
244
- @run = false if tok.nil?
245
- if lexer_loop?(tok)
246
- STDERR.puts "Lexer loop at line : #{@lexer.line_no} char #{@lexer.char_no}."
247
- @run = false
248
- end
249
- @last_token_line_and_char<< [@lexer.line_no.to_i, @lexer.char_no.to_i, tok]
250
- if $VERBOSE
251
- puts "DEBUG: #{@lexer.line_no} #{tok.class}:#{tok.name if tok.respond_to?(:name)}"
252
- end
253
- @@token_counter.count_token(@lexer.line_no, tok) if count_tokens?
254
- parse_token(tok)
255
- end
256
- end
257
-
258
- # Ruby-Lexer can go into a loop if the file does not end with a newline.
259
- def lexer_loop?(token)
260
- return false if @last_token_line_and_char.empty?
261
- loop_flag = false
262
- last = @last_token_line_and_char.last
263
- line = last[0]
264
- char = last[1]
265
- ltok = last[2]
266
-
267
- if ( (line == @lexer.line_no.to_i) &&
268
- (char == @lexer.char_no.to_i) &&
269
- (ltok.class == token.class) )
270
- # We are potentially in a loop
271
- if @last_token_line_and_char.size >= 3
272
- loop_flag = true
273
- end
274
- else
275
- # Not in a loop so clear stack
276
- @last_token_line_and_char = Array.new
277
- end
278
-
279
- loop_flag
280
- end
281
-
282
- def do_begin_token(token)
283
- make_state(EndableParseState, self)
284
- end
285
-
286
- def do_class_token(token)
287
- make_state(ParseClass,self)
288
- end
289
-
290
- def do_module_token(token)
291
- make_state(ParseModule,self)
292
- end
293
-
294
- def do_def_token(token)
295
- make_state(ParseDef,self)
296
- end
297
-
298
- def do_constant_token(token)
299
- nil
300
- end
301
-
302
- def do_identifier_token(token)
303
- if (token.name == "__END__" && token.char_no.to_i == 0)
304
- # The Ruby code has stopped and the rest is data so cease parsing.
305
- @run = false
306
- end
307
- nil
308
- end
309
-
310
- def do_right_brace_token(token)
311
- nil
312
- end
313
-
314
- def do_end_token(token)
315
- end_debug
316
- nil
317
- end
318
-
319
- def do_block_token(token)
320
- make_state(ParseBlock,self)
321
- end
322
-
323
- def do_conditional_token(token)
324
- make_state(ParseCond,self)
325
- end
326
-
327
- def do_conditional_do_control_token(token)
328
- make_state(ParseDoCond,self)
329
- end
330
-
331
- def do_case_token(token)
332
- make_state(EndableParseState, self)
333
- end
334
-
335
- def do_one_line_conditional_token(token)
336
- # This is an if with no end
337
- @complexity += 1
338
- #STDOUT.puts "got IF_MOD: #{self.to_yaml}" if $VERBOSE
339
- #if state.type != "class" && state.type != "def" && state.type != "cond"
340
- #STDOUT.puts "Changing IF_MOD Parent" if $VERBOSE
341
- #state = state.parent
342
- #@run = false
343
- nil
344
- end
345
-
346
- def do_else_token(token)
347
- STDOUT.puts "Ignored/Unknown Token:#{token.class}" if $VERBOSE
348
- nil
349
- end
350
-
351
- def do_comment_token(token)
352
- make_state(ParseComment, self)
353
- end
354
-
355
- def do_symbol_token(token)
356
- make_state(ParseSymbol, self)
357
- end
358
-
359
- def parse_token(token)
360
- state = nil
361
- case token
362
- when TkCLASS
363
- state = do_class_token(token)
364
- when TkMODULE
365
- state = do_module_token(token)
366
- when TkDEF
367
- state = do_def_token(token)
368
- when TkCONSTANT
369
- # Nothing to do with a constant at top level?
370
- state = do_constant_token(token)
371
- when TkIDENTIFIER,TkFID
372
- # Nothing to do at top level?
373
- state = do_identifier_token(token)
374
- when TkRBRACE
375
- # Nothing to do at top level
376
- state = do_right_brace_token(token)
377
- when TkEND
378
- state = do_end_token(token)
379
- # At top level this might be an error...
380
- when TkDO,TkfLBRACE
381
- state = do_block_token(token)
382
- when TkIF,TkUNLESS
383
- state = do_conditional_token(token)
384
- when TkWHILE,TkUNTIL,TkFOR
385
- state = do_conditional_do_control_token(token)
386
- when TkELSIF #,TkELSE
387
- @complexity += 1
388
- when TkELSE
389
- # Else does not increase complexity
390
- when TkCASE
391
- state = do_case_token(token)
392
- when TkWHEN
393
- @complexity += 1
394
- when TkBEGIN
395
- state = do_begin_token(token)
396
- when TkRESCUE
397
- # Maybe this should add complexity and not begin
398
- @complexity += 1
399
- when TkIF_MOD, TkUNLESS_MOD, TkUNTIL_MOD, TkWHILE_MOD, TkQUESTION
400
- state = do_one_line_conditional_token(token)
401
- when TkNL
402
- #
403
- @lines += 1
404
- when TkRETURN
405
- # Early returns do not increase complexity as the condition that
406
- # calls the return is the one that increases it.
407
- when TkCOMMENT
408
- state = do_comment_token(token)
409
- when TkSYMBEG
410
- state = do_symbol_token(token)
411
- when TkError
412
- STDOUT.puts "Lexer received an error for line #{@lexer.line_no} char #{@lexer.char_no}"
413
- else
414
- state = do_else_token(token)
415
- end
416
- state.parse if state
417
- end
418
-
419
- def end_debug
420
- STDOUT.puts "got an end: #{@name} in #{self.class.name}" if $VERBOSE
421
- if @parent.nil?
422
- STDOUT.puts "DEBUG: Line #{@lexer.line_no}"
423
- STDOUT.puts "DEBUG: #{@name}; #{self.class}"
424
- # to_yaml can cause an infinite loop?
425
- #STDOUT.puts "TOP: #{@@top_state.to_yaml}"
426
- #STDOUT.puts "TOP: #{@@top_state.inspect}"
427
-
428
- # This may not be an error?
429
- #exit 1
430
- end
431
- end
432
-
433
- end
434
-
435
- # Read and consume tokens in comments until a new line.
436
- class ParseComment < ParseState
437
-
438
- # While in a comment state do not count the tokens.
439
- def count_tokens?
440
- false
441
- end
442
-
443
- def parse_token(token)
444
- if token.is_a?(TkNL)
445
- @lines += 1
446
- @run = false
447
- end
448
- end
449
- end
450
-
451
- class ParseSymbol < ParseState
452
- def initialize(lexer, parent = nil)
453
- super
454
- STDOUT.puts "STARTING SYMBOL" if $VERBOSE
455
- end
456
-
457
- def parse_token(token)
458
- STDOUT.puts "Symbol's token is #{token.class}" if $VERBOSE
459
- # Consume the next token and stop
460
- @run = false
461
- nil
462
- end
463
- end
464
-
465
- class EndableParseState < ParseState
466
- def initialize(lexer,parent=nil)
467
- super(lexer,parent)
468
- STDOUT.puts "Starting #{self.class}" if $VERBOSE
469
- end
470
-
471
- def do_end_token(token)
472
- end_debug
473
- @run = false
474
- nil
475
- end
476
- end
477
-
478
- class ParseClass < EndableParseState
479
- def initialize(lexer,parent=nil)
480
- super(lexer,parent)
481
- @type_name = "Class"
482
- end
483
-
484
- def do_constant_token(token)
485
- @name = token.name if @name.empty?
486
- nil
487
- end
488
-
489
- def compute_state(formater)
490
- # Seperate the Module and Class Children out
491
- cnm_children, @children = @children.partition do |child|
492
- child.kind_of?(ParseClass)
493
- end
494
-
495
- formater.start_class_compute_state(@type_name,@name,self.calc_complexity,self.calc_lines)
496
- super(formater)
497
- formater.end_class_compute_state(@name)
498
-
499
- cnm_children.each do |child|
500
- child.name = @name + "::" + child.name
501
- child.compute_state(formater)
502
- end
503
- end
504
- end
505
-
506
- class ParseModule < ParseClass
507
- def initialize(lexer,parent=nil)
508
- super(lexer,parent)
509
- @type_name = "Module"
510
- end
511
- end
512
-
513
- class ParseDef < EndableParseState
514
-
515
- def initialize(lexer,parent=nil)
516
- super(lexer,parent)
517
- @complexity = 1
518
- @looking_for_name = true
519
- @first_space = true
520
- end
521
-
522
- # This way I don't need to list all possible overload
523
- # tokens.
524
- def create_def_name(token)
525
- case token
526
- when TkSPACE
527
- # mark first space so we can stop at next space
528
- if @first_space
529
- @first_space = false
530
- else
531
- @looking_for_name = false
532
- end
533
- when TkNL,TkLPAREN,TkfLPAREN,TkSEMICOLON
534
- # we can also stop at a new line or left parenthesis
535
- @looking_for_name = false
536
- when TkDOT
537
- @name<< "."
538
- when TkCOLON2
539
- @name<< "::"
540
- when TkASSIGN
541
- @name<< "="
542
- when TkfLBRACK
543
- @name<< "["
544
- when TkRBRACK
545
- @name<< "]"
546
- else
547
- begin
548
- @name<< token.name.to_s
549
- rescue Exception => err
550
- #what is this?
551
- STDOUT.puts @@token_counter.current_file
552
- STDOUT.puts @name
553
- STDOUT.puts token.inspect
554
- STDOUT.puts err.message
555
- exit 1
556
- end
557
- end
558
- end
559
-
560
- def parse_token(token)
561
- if @looking_for_name
562
- create_def_name(token)
563
- end
564
- super(token)
565
- end
566
-
567
- def compute_state(formater)
568
- formater.def_compute_state(@name, self.calc_complexity, self.calc_lines)
569
- super(formater)
570
- end
571
- end
572
-
573
- class ParseCond < EndableParseState
574
- def initialize(lexer,parent=nil)
575
- super(lexer,parent)
576
- @complexity = 1
577
- end
578
- end
579
-
580
- class ParseDoCond < ParseCond
581
- def initialize(lexer,parent=nil)
582
- super(lexer,parent)
583
- @looking_for_new_line = true
584
- end
585
-
586
- # Need to consume the do that can appear at the
587
- # end of these control structures.
588
- def parse_token(token)
589
- if @looking_for_new_line
590
- if token.is_a?(TkDO)
591
- nil
592
- else
593
- if token.is_a?(TkNL)
594
- @looking_for_new_line = false
595
- end
596
- super(token)
597
- end
598
- else
599
- super(token)
600
- end
601
- end
602
-
603
- end
604
-
605
- class ParseBlock < EndableParseState
606
-
607
- def initialize(lexer,parent=nil)
608
- super(lexer,parent)
609
- @complexity = 1
610
- @lbraces = Array.new
611
- end
612
-
613
- # Because the token for a block and hash right brace is the same,
614
- # we need to track the hash left braces to determine when an end is
615
- # encountered.
616
- def parse_token(token)
617
- if token.is_a?(TkLBRACE)
618
- @lbraces.push(true)
619
- elsif token.is_a?(TkRBRACE)
620
- if @lbraces.empty?
621
- do_right_brace_token(token)
622
- #do_end_token(token)
623
- else
624
- @lbraces.pop
625
- end
626
- else
627
- super(token)
628
- end
629
- end
630
-
631
- def do_right_brace_token(token)
632
- # we are done ? what about a hash in a block :-/
633
- @run = false
634
- nil
635
- end
636
-
637
- end
638
-
47
+ require 'saikuro/token_counter'
48
+ require 'saikuro/parse_state'
49
+ require 'saikuro/parse_comment'
50
+ require 'saikuro/parse_symbol'
51
+ require 'saikuro/endable_parse_state'
52
+ require 'saikuro/parse_class'
53
+ require 'saikuro/parse_module'
54
+ require 'saikuro/parse_def'
55
+ require 'saikuro/parse_cond'
56
+ require 'saikuro/parse_do_cond'
57
+ require 'saikuro/parse_block'
58
+ #
639
59
  # ------------ END Analyzer logic ------------------------------------
640
60
 
641
- class Filter
642
- attr_accessor :limit, :error, :warn
643
-
644
- def initialize(limit = -1, error = 11, warn = 8)
645
- @limit = limit
646
- @error = error
647
- @warn = warn
648
- end
649
-
650
- def ignore?(count)
651
- count < @limit
652
- end
653
-
654
- def warn?(count)
655
- count >= @warn
656
- end
657
-
658
- def error?(count)
659
- count >= @error
660
- end
661
-
662
- end
663
-
664
-
665
- class BaseFormater
666
- attr_accessor :warnings, :errors, :current
667
-
668
- def initialize(out, filter = nil)
669
- @out = out
670
- @filter = filter
671
- reset_data
672
- end
673
-
674
- def warn_error?(num, marker)
675
- klass = ""
676
-
677
- if @filter.error?(num)
678
- klass = ' class="error"'
679
- @errors<< [@current, marker, num]
680
- elsif @filter.warn?(num)
681
- klass = ' class="warning"'
682
- @warnings<< [@current, marker, num]
683
- end
684
-
685
- klass
686
- end
687
-
688
- def reset_data
689
- @warnings = Array.new
690
- @errors = Array.new
691
- @current = ""
692
- end
693
-
694
- end
695
-
696
- class TokenCounterFormater < BaseFormater
697
-
698
- def start(new_out=nil)
699
- reset_data
700
- @out = new_out if new_out
701
- @out.puts "Token Count"
702
- end
703
-
704
- def start_count(number_of_files)
705
- @out.puts "Counting tokens for #{number_of_files} files."
706
- end
707
-
708
- def start_file(file_name)
709
- @current = file_name
710
- @out.puts "File:#{file_name}"
711
- end
712
-
713
- def line_token_count(line_number,number_of_tokens)
714
- return if @filter.ignore?(number_of_tokens)
715
- warn_error?(number_of_tokens, line_number)
716
- @out.puts "Line:#{line_number} ; Tokens : #{number_of_tokens}"
717
- end
718
-
719
- def end_file
720
- @out.puts ""
721
- end
61
+ require 'saikuro/filter'
62
+ require 'saikuro/base_formatter'
63
+ require 'saikuro/token_counter_formatter'
64
+ require 'saikuro/html_stylesheet'
65
+ require 'saikuro/html_token_counter_formatter'
66
+ require 'saikuro/parse_state_formatter'
67
+ require 'saikuro/state_html_complexity_formatter'
68
+ require 'saikuro/result_index_generator'
722
69
 
723
- def end_count
724
- end
725
-
726
- def end
727
- end
728
-
729
- end
730
-
731
- module HTMLStyleSheet
732
- def HTMLStyleSheet.style_sheet
733
- out = StringIO.new
734
-
735
- out.puts "<style>"
736
- out.puts 'body {'
737
- out.puts ' margin: 20px;'
738
- out.puts ' padding: 0;'
739
- out.puts ' font-size: 12px;'
740
- out.puts ' font-family: bitstream vera sans, verdana, arial, sans serif;'
741
- out.puts ' background-color: #efefef;'
742
- out.puts '}'
743
- out.puts ''
744
- out.puts 'table { '
745
- out.puts ' border-collapse: collapse;'
746
- out.puts ' /*border-spacing: 0;*/'
747
- out.puts ' border: 1px solid #666;'
748
- out.puts ' background-color: #fff;'
749
- out.puts ' margin-bottom: 20px;'
750
- out.puts '}'
751
- out.puts ''
752
- out.puts 'table, th, th+th, td, td+td {'
753
- out.puts ' border: 1px solid #ccc;'
754
- out.puts '}'
755
- out.puts ''
756
- out.puts 'table th {'
757
- out.puts ' font-size: 12px;'
758
- out.puts ' color: #fc0;'
759
- out.puts ' padding: 4px 0;'
760
- out.puts ' background-color: #336;'
761
- out.puts '}'
762
- out.puts ''
763
- out.puts 'th, td {'
764
- out.puts ' padding: 4px 10px;'
765
- out.puts '}'
766
- out.puts ''
767
- out.puts 'td { '
768
- out.puts ' font-size: 13px;'
769
- out.puts '}'
770
- out.puts ''
771
- out.puts '.class_name {'
772
- out.puts ' font-size: 17px;'
773
- out.puts ' margin: 20px 0 0;'
774
- out.puts '}'
775
- out.puts ''
776
- out.puts '.class_complexity {'
777
- out.puts 'margin: 0 auto;'
778
- out.puts '}'
779
- out.puts ''
780
- out.puts '.class_complexity>.class_complexity {'
781
- out.puts ' margin: 0;'
782
- out.puts '}'
783
- out.puts ''
784
- out.puts '.class_total_complexity, .class_total_lines, .start_token_count, .file_count {'
785
- out.puts ' font-size: 13px;'
786
- out.puts ' font-weight: bold;'
787
- out.puts '}'
788
- out.puts ''
789
- out.puts '.class_total_complexity, .class_total_lines {'
790
- out.puts ' color: #c00;'
791
- out.puts '}'
792
- out.puts ''
793
- out.puts '.start_token_count, .file_count {'
794
- out.puts ' color: #333;'
795
- out.puts '}'
796
- out.puts ''
797
- out.puts '.warning {'
798
- out.puts ' background-color: yellow;'
799
- out.puts '}'
800
- out.puts ''
801
- out.puts '.error {'
802
- out.puts ' background-color: #f00;'
803
- out.puts '}'
804
- out.puts "</style>"
805
-
806
- out.string
807
- end
808
-
809
- def style_sheet
810
- HTMLStyleSheet.style_sheet
811
- end
812
- end
813
-
814
-
815
- class HTMLTokenCounterFormater < TokenCounterFormater
816
- include HTMLStyleSheet
817
-
818
- def start(new_out=nil)
819
- reset_data
820
- @out = new_out if new_out
821
- @out.puts "<html>"
822
- @out.puts style_sheet
823
- @out.puts "<body>"
824
- end
825
-
826
- def start_count(number_of_files)
827
- @out.puts "<div class=\"start_token_count\">"
828
- @out.puts "Number of files: #{number_of_files}"
829
- @out.puts "</div>"
830
- end
831
-
832
- def start_file(file_name)
833
- @current = file_name
834
- @out.puts "<div class=\"file_count\">"
835
- @out.puts "<p class=\"file_name\">"
836
- @out.puts "File: #{file_name}"
837
- @out.puts "</p>"
838
- @out.puts "<table width=\"100%\" border=\"1\">"
839
- @out.puts "<tr><th>Line</th><th>Tokens</th></tr>"
840
- end
841
-
842
- def line_token_count(line_number,number_of_tokens)
843
- return if @filter.ignore?(number_of_tokens)
844
- klass = warn_error?(number_of_tokens, line_number)
845
- @out.puts "<tr><td>#{line_number}</td><td#{klass}>#{number_of_tokens}</td></tr>"
846
- end
847
-
848
- def end_file
849
- @out.puts "</table>"
850
- end
851
-
852
- def end_count
853
- end
854
-
855
- def end
856
- @out.puts "</body>"
857
- @out.puts "</html>"
858
- end
859
- end
860
-
861
- class ParseStateFormater < BaseFormater
862
-
863
- def start(new_out=nil)
864
- reset_data
865
- @out = new_out if new_out
866
- end
867
-
868
- def end
869
- end
870
-
871
- def start_class_compute_state(type_name,name,complexity,lines)
872
- @current = name
873
- @out.puts "-- START #{name} --"
874
- @out.puts "Type:#{type_name} Name:#{name} Complexity:#{complexity} Lines:#{lines}"
875
- end
876
-
877
- def end_class_compute_state(name)
878
- @out.puts "-- END #{name} --"
879
- end
880
-
881
- def def_compute_state(name,complexity,lines)
882
- return if @filter.ignore?(complexity)
883
- warn_error?(complexity, name)
884
- @out.puts "Type:Def Name:#{name} Complexity:#{complexity} Lines:#{lines}"
885
- end
886
-
887
- end
888
-
889
-
890
-
891
- class StateHTMLComplexityFormater < ParseStateFormater
892
- include HTMLStyleSheet
893
-
894
- def start(new_out=nil)
895
- reset_data
896
- @out = new_out if new_out
897
- @out.puts "<html><head><title>Cyclometric Complexity</title></head>"
898
- @out.puts style_sheet
899
- @out.puts "<body>"
900
- end
901
-
902
- def end
903
- @out.puts "</body>"
904
- @out.puts "</html>"
905
- end
906
-
907
- def start_class_compute_state(type_name,name,complexity,lines)
908
- @current = name
909
- @out.puts "<div class=\"class_complexity\">"
910
- @out.puts "<h2 class=\"class_name\">#{type_name} : #{name}</h2>"
911
- @out.puts "<div class=\"class_total_complexity\">Total Complexity: #{complexity}</div>"
912
- @out.puts "<div class=\"class_total_lines\">Total Lines: #{lines}</div>"
913
- @out.puts "<table width=\"100%\" border=\"1\">"
914
- @out.puts "<tr><th>Method</th><th>Complexity</th><th># Lines</th></tr>"
915
- end
916
-
917
- def end_class_compute_state(name)
918
- @out.puts "</table>"
919
- @out.puts "</div>"
920
- end
921
-
922
- def def_compute_state(name, complexity, lines)
923
- return if @filter.ignore?(complexity)
924
- klass = warn_error?(complexity, name)
925
- @out.puts "<tr><td>#{name}</td><td#{klass}>#{complexity}</td><td>#{lines}</td></tr>"
926
- end
927
-
928
- end
929
-
930
-
931
- module ResultIndexGenerator
932
- def summarize_errors_and_warnings(enw, header)
933
- return "" if enw.empty?
934
- f = StringIO.new
935
- erval = Hash.new { |h,k| h[k] = Array.new }
936
- wval = Hash.new { |h,k| h[k] = Array.new }
937
-
938
- enw.each do |fname, warnings, errors|
939
- errors.each do |c,m,v|
940
- erval[v] << [fname, c, m]
941
- end
942
- warnings.each do |c,m,v|
943
- wval[v] << [fname, c, m]
944
- end
945
- end
946
-
947
- f.puts "<h2 class=\"class_name\">Errors and Warnings</h2>"
948
- f.puts "<table width=\"100%\" border=\"1\">"
949
- f.puts header
950
-
951
- f.puts print_summary_table_rows(erval, "error")
952
- f.puts print_summary_table_rows(wval, "warning")
953
- f.puts "</table>"
954
-
955
- f.string
956
- end
957
-
958
- def print_summary_table_rows(ewvals, klass_type)
959
- f = StringIO.new
960
- ewvals.sort { |a,b| b <=> a}.each do |v, vals|
961
- vals.sort.each do |fname, c, m|
962
- f.puts "<tr><td><a href=\"./#{fname}\">#{c}</a></td><td>#{m}</td>"
963
- f.puts "<td class=\"#{klass_type}\">#{v}</td></tr>"
964
- end
965
- end
966
- f.string
967
- end
968
-
969
- def list_analyzed_files(files)
970
- f = StringIO.new
971
- f.puts "<h2 class=\"class_name\">Analyzed Files</h2>"
972
- f.puts "<ul>"
973
- files.each do |fname, warnings, errors|
974
- readname = fname.split("_")[0...-1].join("_")
975
- f.puts "<li>"
976
- f.puts "<p class=\"file_name\"><a href=\"./#{fname}\">#{readname}</a>"
977
- f.puts "</li>"
978
- end
979
- f.puts "</ul>"
980
- f.string
981
- end
982
-
983
- def write_index(files, filename, title, header)
984
- return if files.empty?
985
-
986
- File.open(filename,"w") do |f|
987
- f.puts "<html><head><title>#{title}</title></head>"
988
- f.puts "#{HTMLStyleSheet.style_sheet}\n<body>"
989
- f.puts "<h1>#{title}</h1>"
990
-
991
- enw = files.find_all { |fn,w,e| (!w.empty? || !e.empty?) }
992
-
993
- f.puts summarize_errors_and_warnings(enw, header)
994
-
995
- f.puts "<hr/>"
996
- f.puts list_analyzed_files(files)
997
- f.puts "</body></html>"
998
- end
999
- end
1000
-
1001
- def write_cyclo_index(files, output_dir)
1002
- header = "<tr><th>Class</th><th>Method</th><th>Complexity</th></tr>"
1003
- write_index(files,
1004
- "#{output_dir}/index_cyclo.html",
1005
- "Index for cyclomatic complexity",
1006
- header)
1007
- end
1008
-
1009
- def write_token_index(files, output_dir)
1010
- header = "<tr><th>File</th><th>Line #</th><th>Tokens</th></tr>"
1011
- write_index(files,
1012
- "#{output_dir}/index_token.html",
1013
- "Index for tokens per line",
1014
- header)
1015
- end
1016
-
1017
- end
1018
70
 
1019
71
  module Saikuro
1020
72
 
@@ -1109,117 +161,4 @@ module Saikuro
1109
161
  [idx_states, idx_tokens]
1110
162
  end
1111
163
  end
1112
-
1113
-
1114
- # Really ugly command line runner stuff here for now
1115
-
1116
- class SaikuroCMDLineRunner
1117
- require 'stringio'
1118
- require 'getoptlong'
1119
- require 'fileutils'
1120
- require 'find'
1121
-
1122
- # modification to RDoc.usage that allows main_program_file to be set
1123
- # for RDoc.usage
1124
- require 'saikuro/usage'
1125
- RDoc::main_program_file = __FILE__
1126
-
1127
- include ResultIndexGenerator
1128
-
1129
- def get_ruby_files(path)
1130
- files = Array.new
1131
- Find.find(path) do |f|
1132
- if !FileTest.directory?(f)
1133
- if f =~ /rb$/
1134
- files<< f
1135
- end
1136
- end
1137
- end
1138
- files
1139
- end
1140
-
1141
- def run
1142
- files = Array.new
1143
- output_dir = "./"
1144
- formater = "html"
1145
- state_filter = Filter.new(5)
1146
- token_filter = Filter.new(10, 25, 50)
1147
- comp_state = comp_token = false
1148
- begin
1149
- opt = GetoptLong.new(
1150
- ["-o","--output_directory", GetoptLong::REQUIRED_ARGUMENT],
1151
- ["-h","--help", GetoptLong::NO_ARGUMENT],
1152
- ["-f","--formater", GetoptLong::REQUIRED_ARGUMENT],
1153
- ["-c","--cyclo", GetoptLong::NO_ARGUMENT],
1154
- ["-t","--token", GetoptLong::NO_ARGUMENT],
1155
- ["-y","--filter_cyclo", GetoptLong::REQUIRED_ARGUMENT],
1156
- ["-k","--filter_token", GetoptLong::REQUIRED_ARGUMENT],
1157
- ["-w","--warn_cyclo", GetoptLong::REQUIRED_ARGUMENT],
1158
- ["-s","--warn_token", GetoptLong::REQUIRED_ARGUMENT],
1159
- ["-e","--error_cyclo", GetoptLong::REQUIRED_ARGUMENT],
1160
- ["-d","--error_token", GetoptLong::REQUIRED_ARGUMENT],
1161
- ["-p","--parse_file", GetoptLong::REQUIRED_ARGUMENT],
1162
- ["-i","--input_directory", GetoptLong::REQUIRED_ARGUMENT],
1163
- ["-v","--verbose", GetoptLong::NO_ARGUMENT]
1164
- )
1165
-
1166
- opt.each do |arg,val|
1167
- case arg
1168
- when "-o"
1169
- output_dir = val
1170
- when "-h"
1171
- RDoc.usage('help')
1172
- when "-f"
1173
- formater = val
1174
- when "-c"
1175
- comp_state = true
1176
- when "-t"
1177
- comp_token = true
1178
- when "-k"
1179
- token_filter.limit = val.to_i
1180
- when "-s"
1181
- token_filter.warn = val.to_i
1182
- when "-d"
1183
- token_filter.error = val.to_i
1184
- when "-y"
1185
- state_filter.limit = val.to_i
1186
- when "-w"
1187
- state_filter.warn = val.to_i
1188
- when "-e"
1189
- state_filter.error = val.to_i
1190
- when "-p"
1191
- files<< val
1192
- when "-i"
1193
- files.concat(get_ruby_files(val))
1194
- when "-v"
1195
- STDOUT.puts "Verbose mode on"
1196
- $VERBOSE = true
1197
- end
1198
-
1199
- end
1200
- RDoc.usage if !comp_state && !comp_token
1201
- rescue => err
1202
- RDoc.usage
1203
- end
1204
-
1205
- if formater =~ /html/i
1206
- state_formater = StateHTMLComplexityFormater.new(STDOUT,state_filter)
1207
- token_count_formater = HTMLTokenCounterFormater.new(STDOUT,token_filter)
1208
- else
1209
- state_formater = ParseStateFormater.new(STDOUT,state_filter)
1210
- token_count_formater = TokenCounterFormater.new(STDOUT,token_filter)
1211
- end
1212
-
1213
- state_formater = nil if !comp_state
1214
- token_count_formater = nil if !comp_token
1215
-
1216
- idx_states, idx_tokens = Saikuro.analyze(files,
1217
- state_formater,
1218
- token_count_formater,
1219
- output_dir)
1220
-
1221
- write_cyclo_index(idx_states, output_dir)
1222
- write_token_index(idx_tokens, output_dir)
1223
- end
1224
-
1225
- end
164
+ require 'saikuro/saikuro_cmd_line_runner'