production_log_analyzer 1.3.0 → 1.5.0
- data/History.txt +34 -0
- data/{LICENSE → LICENSE.txt} +1 -1
- data/Manifest.txt +7 -3
- data/{README → README.txt} +27 -42
- data/Rakefile +11 -57
- data/bin/action_errors +46 -0
- data/lib/production_log/action_grep.rb +14 -11
- data/lib/production_log/analyzer.rb +284 -275
- data/lib/production_log/parser.rb +121 -125
- data/test/test.syslog.0.14.x.log +1 -0
- data/test/test.syslog.1.2.shortname.log +5 -0
- data/test/test.syslog.empty.log +2 -0
- data/test/test.syslog.log +3 -0
- data/test/test_action_grep.rb +1 -1
- data/test/test_analyzer.rb +251 -236
- data/test/test_parser.rb +162 -88
- metadata +62 -42
data/History.txt
ADDED
@@ -0,0 +1,34 @@
+= 1.5.0
+
+* Fixed empty log bug. Patch by Tim Lucas.
+* Fixed bug where sometimes lines would be logged before the
+  Processing line. Patch by Geoff Grosenbach.
+
+= 1.4.0
+
+* Switched to Hoe
+* Allowed action_errors to suppress routing errors with > 3 occurances
+* action_grep now works correctly with components
+* pl_analyze now works correctly with components
+* Added action_errors to extract error counts from logs
+* Retabbed to match the rest of the world
+
+= 1.3.0
+
+* Added action_grep
+* Added support for newer log format
+
+= 1.2.0
+
+* pl_analyze calculates per-action statistics
+* pl_analyze can send an email with its output
+
+= 1.1.0
+
+* RDoc
+* Other various fixes lost to time.
+
+= 1.0.0
+
+* Birthday!
+
data/{LICENSE → LICENSE.txt}
RENAMED
@@ -1,4 +1,4 @@
-Copyright 2005 Eric Hodel, The Robot Co-op. All rights reserved.
+Copyright 2005, 2007 Eric Hodel, The Robot Co-op. All rights reserved.
 
 Redistribution and use in source and binary forms, with or without
 modification, are permitted provided that the following conditions
data/Manifest.txt
CHANGED
@@ -1,13 +1,17 @@
-
-
-LICENSE
+History.txt
+LICENSE.txt
 Manifest.txt
+README.txt
+Rakefile
+bin/action_errors
 bin/action_grep
 bin/pl_analyze
 lib/production_log/action_grep.rb
 lib/production_log/analyzer.rb
 lib/production_log/parser.rb
 test/test.syslog.0.14.x.log
+test/test.syslog.1.2.shortname.log
+test/test.syslog.empty.log
 test/test.syslog.log
 test/test_action_grep.rb
 test/test_analyzer.rb
data/{README → README.txt}
RENAMED
@@ -1,63 +1,49 @@
-
+= production_log_analyzer
 
-
+production_log_analyzer lets you find out which actions on a Rails
+site are slowing you down.
 
-http://rubyforge.org/
+http://seattlerb.rubyforge.org/production_log_analyzer
 
-
+http://rubyforge.org/projects/seattlerb
 
-
-dragging you down. The PL Analyzer requires the use of SyslogLogger from
-rails_analyzer_tools because the default Logger doesn't give any way to
-associate lines logged to a request.
+Bug reports:
 
-
-that only match a single action.
-
-action_grep RssController#uber /var/log/production.log
-
-= Installing
-
-== Download
+http://rubyforge.org/tracker/?func=add&group_id=1513&atid=5921
 
-
+== About
 
-
+production_log_analyzer provides three tools to analyze log files
+created by SyslogLogger. pl_analyze for getting daily reports,
+action_grep for pulling log lines for a single action and
+action_errors to summarize errors with counts.
 
-
+The analyzer currently requires the use of SyslogLogger because the
+default Logger doesn't give any way to associate lines logged to a
+request.
 
-
-
-A syslogd that doesn't suck. This means that syslog.conf(5) shows a
-!program_name specification. (FreeBSD's syslogd doesn't suck, but OS X's
-syslogd does.)
-
-or:
+The PL Analyzer also includes action_grep which lets you grab lines from a log
+that only match a single action.
 
-
-logged. You'll also have to teach LogParser#parse about this. Feel free to
-submit patches with tests. (Patches without tests are useless to me.)
+action_grep RssController#uber /var/log/production.log
 
-==
+== Installing
 
-
+sudo gem install production_log_analyzer
 
-
-including setting up your non-sucky syslogd as directed. (SyslogLogger is
-included via the rails_analyzer_tools gem.)
+=== Setup
 
-
+First:
 
-
+Set up SyslogLogger according to the instructions here:
 
-
-can deal with. (Then send my your patches for integration.)
+http://seattlerb.rubyforge.org/SyslogLogger/
 
 Then:
 
 Set up a cronjob (or something like that) to run log files through pl_analyze.
 
-
+== Using pl_analyze
 
 To run pl_analyze simply give it the name of a log file to analyze.
 
@@ -73,7 +59,7 @@ Or, have pl_analyze email you (which is preferred, because tabs get preserved):
 
 In the future, pl_analyze will be able to read from STDIN.
 
-
+== Sample output
 
 Request Times Summary: Count Avg Std Dev Min Max
 ALL REQUESTS: 11 0.576 0.508 0.000 1.470
@@ -151,12 +137,11 @@ In the future, pl_analyze will be able to read from STDIN.
 TeamsController#progress took 0.000s
 TeamsController#progress took 0.000s
 
-
+== What's missing
 
 * More reports
 * Command line arguments including:
 * Help
 * What type of log file you've got (if somebody sends patches with tests)
 * Read from STDIN
-* Lots more
 
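The new README boils down to one command per tool; a quick sketch of typical invocations (the log path and the -r threshold below are illustrative, not required values):

  pl_analyze /var/log/production.log
  action_grep RssController#uber /var/log/production.log
  action_errors -r=3 /var/log/production.log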
data/Rakefile
CHANGED
@@ -1,63 +1,17 @@
-require '
-require 'rake'
-require 'rake/testtask'
-require 'rake/rdoctask'
-require 'rake/gempackagetask'
-require 'rake/contrib/sshpublisher'
+require 'hoe'
 
-
+$:.unshift './lib'
+require 'production_log/analyzer'
 
-
-
-
-
-
-
+Hoe.new 'production_log_analyzer', '1.5.0' do |p|
+  p.summary = p.paragraphs_of('README.txt', 1).join ' '
+  p.description = p.paragraphs_of('README.txt', 7).join ' '
+  p.author = 'Eric Hodel'
+  p.email = 'drbrain@segment7.net'
+  p.url = p.paragraphs_of('README.txt', 2).join ' '
 
-
-  s.files = File.read('Manifest.txt').split($/)
-  s.require_path = 'lib'
-  s.executables = ['pl_analyze', 'action_grep']
-  s.default_executable = 'pl_analyze'
+  p.rubyforge_name = 'seattlerb'
 
-
+  p.extra_deps << ['rails_analyzer_tools', '>= 1.4.0']
 end
 
-desc 'Run tests'
-task :default => [ :test ]
-
-Rake::TestTask.new('test') do |t|
-  t.libs << 'test'
-  t.pattern = 'test/test_*.rb'
-  t.verbose = true
-end
-
-desc 'Generate RDoc'
-Rake::RDocTask.new :rdoc do |rd|
-  rd.rdoc_dir = 'doc'
-  rd.rdoc_files.add 'lib', 'README', 'LICENSE'
-  rd.main = 'README'
-  rd.options << '-d' if `which dot` =~ /\/dot/
-end
-
-desc 'Build Gem'
-Rake::GemPackageTask.new spec do |pkg|
-  pkg.need_tar = true
-end
-
-desc 'Sends RDoc to RubyForge'
-task :send_rdoc => [ :rerdoc ] do
-  publisher = Rake::SshDirPublisher.new('drbrain@rubyforge.org',
-    '/var/www/gforge-projects/rails-analyzer/pl_analyze',
-    'doc')
-  publisher.upload
-end
-
-desc 'Clean up'
-task :clean => [ :clobber_rdoc, :clobber_package ]
-
-desc 'Clean up'
-task :clobber => [ :clean ]
-
-# vim: syntax=Ruby
-
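The rewritten Rakefile hands the old hand-rolled tasks over to Hoe, so the test, documentation, and packaging targets now come from Hoe rather than the removed task blocks. Assuming a standard Hoe installation, the rough equivalents are:

  rake test       # replaces the removed Rake::TestTask block
  rake docs       # replaces the removed Rake::RDocTask block
  rake package    # replaces the removed Rake::GemPackageTask block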
data/bin/action_errors
ADDED
@@ -0,0 +1,46 @@
+#!/usr/local/bin/ruby -ws
+
+$h ||= false
+$r ||= false
+$o ||= false
+
+$r = $r ? ($r.to_i rescue false) : false
+
+if $h then
+  $stderr.puts "Usage: #{$0} [-r=N] LOGFILE"
+  $stderr.puts "\t-r=N\tShow routing errors with N or more occurances"
+  $stderr.puts "\t-o\tShow errors with one occurance"
+  exit
+end
+
+errors = {}
+counts = Hash.new 0
+
+ARGF.each_line do |line|
+  line =~ /\]: (.*?) (.*)/
+  next if $1.nil?
+  msg = $1
+  trace = $2
+  key = msg.gsub(/\d/, '#')
+  counts[key] += 1
+  next if counts[key] > 1
+  trace = trace.split(' ')[0..-2].map { |l| l.strip }.join("\n\t")
+  error = "#{msg}\n\t#{trace}"
+  errors[key] = error
+end
+
+counts.sort_by { |_,c| -c }.each do |key, count|
+  next if count == 1 and not $o
+  error = errors[key]
+
+  if error =~ /^ActionController::RoutingError/ then
+    next unless $r
+    next if $r and count < $r
+  end
+
+  puts "count: #{count}"
+  puts "{{{"
+  puts error
+  puts "}}}"
+end
+
data/lib/production_log/action_grep.rb
CHANGED
@@ -4,31 +4,34 @@ class << ActionGrep
 
   def grep(action_name, file_name)
     unless action_name =~ /\A([A-Z][A-Za-z\d]*)(?:#([A-Za-z]\w*))?\Z/ then
-      raise ArgumentError, "Invalid action name #{action_name} expected something like
+      raise ArgumentError, "Invalid action name #{action_name} expected something like SomeController#action"
     end
 
     unless File.file? file_name and File.readable? file_name then
       raise ArgumentError, "Unable to read #{file_name}"
     end
 
-    in_component = 0
     buckets = Hash.new { |h,k| h[k] = [] }
+    comp_count = Hash.new 0
+
    File.open file_name do |fp|
      fp.each_line do |line|
-        line =~ / ([^ ]+) ([^ ]+)\[(\d+)\]: /
+        line =~ / ([^ ]+) ([^ ]+)\[(\d+)\]: (.*)/
+        next if $2.nil? or $2 == 'newsyslog'
        bucket = [$1, $2, $3].join '-'
+        data = $4
 
        buckets[bucket] << line
 
-        case
-        when
-
-        when
-
-        when
-          next unless
+        case data
+        when /^Start rendering component / then
+          comp_count[bucket] += 1
+        when /^End of component rendering$/ then
+          comp_count[bucket] -= 1
+        when /^Completed/ then
+          next unless comp_count[bucket] == 0
          action = buckets.delete bucket
-          next unless action.
+          next unless action.any? { |l| l =~ /: Processing #{action_name}/ }
          puts action.join
        end
      end
data/lib/production_log/analyzer.rb
CHANGED
@@ -4,41 +4,41 @@ require 'production_log/parser'
 
 module Enumerable
 
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+  ##
+  # Sum of all the elements of the Enumerable
+
+  def sum
+    return self.inject(0) { |acc, i| acc + i }
+  end
+
+  ##
+  # Average of all the elements of the Enumerable
+  #
+  # The Enumerable must respond to #length
+
+  def average
+    return self.sum / self.length.to_f
+  end
+
+  ##
+  # Sample variance of all the elements of the Enumerable
+  #
+  # The Enumerable must respond to #length
+
+  def sample_variance
+    avg = self.average
+    sum = self.inject(0) { |acc, i| acc + (i - avg) ** 2 }
+    return (1 / self.length.to_f * sum)
+  end
+
+  ##
+  # Standard deviation of all the elements of the Enumerable
+  #
+  # The Enumerable must respond to #length
+
+  def standard_deviation
+    return Math.sqrt(self.sample_variance)
+  end
 
 end
 
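The Enumerable helpers added above feed every statistic the Analyzer reports. A quick sanity check on a plain Array (values are made up; note sample_variance divides by length, so standard_deviation is the population form):

  times = [0.1, 0.3, 0.5]
  times.sum                 # => 0.9
  times.average             # => 0.3
  times.sample_variance     # => ~0.0267
  times.standard_deviation  # => ~0.163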
@@ -47,27 +47,27 @@ end
 
 class SizedList < Array
 
-
-
-
-
-
-
-
-
-
-  def initialize(limit, &delete_block)
-    @limit = limit
-    @delete_block = delete_block
-  end
+  ##
+  # Creates a new SizedList that can hold up to +limit+ items. Whenever
+  # adding a new item to the SizedList would make the list larger than
+  # +limit+, +delete_block+ is called.
+  #
+  # +delete_block+ is passed the list and the item being added.
+  # +delete_block+ must take action to remove an item and return true or
+  # return false if the item should not be added to the list.
 
-
-
+  def initialize(limit, &delete_block)
+    @limit = limit
+    @delete_block = delete_block
+  end
 
-
-
-
-
+  ##
+  # Attempts to add +obj+ to the list.
+
+  def <<(obj)
+    return super if self.length < @limit
+    return super if @delete_block.call self, obj
+  end
 
 end
 
@@ -89,20 +89,21 @@ end
 
 class SlowestTimes < SizedList
 
-
-
-
-
-
-
-
-
-
-
-
-
-
+  ##
+  # Creates a new SlowestTimes SizedList that holds only +limit+ time/object
+  # pairs.
+
+  def initialize(limit)
+    super limit do |arr, new_item|
+      fastest_time = arr.sort_by { |time, name| time }.first
+      if fastest_time.first < new_item.first then
+        arr.delete_at index(fastest_time)
+        true
+      else
+        false
+      end
     end
+  end
 
 end
 
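A short sketch (not part of the diff) of how SlowestTimes behaves once it reaches its limit, per the delete_block above: a new time displaces the current fastest entry only when it is slower.

  slowest = SlowestTimes.new 3
  [[0.2, 'a'], [0.9, 'b'], [0.1, 'c'], [0.5, 'd']].each { |pair| slowest << pair }
  # [0.1, 'c'] is evicted when [0.5, 'd'] arrives; only the three slowest remain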
@@ -111,277 +112,285 @@ end
 
 class Analyzer
 
-
-
+  ##
+  # The version of the production log analyzer you are using.
 
-
+  VERSION = '1.5.0'
 
-
-
+  ##
+  # The logfile being read by the Analyzer.
 
-
+  attr_reader :logfile_name
 
-
-
+  ##
+  # An Array of all the request total times for the log file.
 
-
+  attr_reader :request_times
 
-
-
+  ##
+  # An Array of all the request database times for the log file.
 
-
+  attr_reader :db_times
 
-
-
-  # way, Mail.app will behave when given tabs.
+  ##
+  # An Array of all the request render times for the log file.
 
-
-    analyzer = self.new file_name
-    analyzer.process
-    body = analyzer.report count
+  attr_reader :render_times
 
-
-
-
-    email = email.join($/) << $/
+  ##
+  # Generates and sends an email report with lots of fun stuff in it. This
+  # way, Mail.app will behave when given tabs.
 
-
+  def self.email(file_name, recipient, subject, count = 10)
+    analyzer = self.new file_name
+    analyzer.process
+    body = analyzer.report count
 
-
-
-
-
-  end
+    email = self.envelope(recipient, subject)
+    email << nil
+    email << "<pre>#{body}</pre>"
+    email = email.join($/) << $/
 
-
-    envelope = {}
-    envelope['To'] = recipient
-    envelope['Subject'] = subject || "pl_analyze"
-    envelope['Content-Type'] = "text/html"
+    return email if $TESTING
 
-
+    IO.popen("/usr/sbin/sendmail -i -t", "w+") do |sm|
+      sm.print email
+      sm.flush
     end
+  end
+
+  def self.envelope(recipient, subject = nil) # :nodoc:
+    envelope = {}
+    envelope['To'] = recipient
+    envelope['Subject'] = subject || "pl_analyze"
+    envelope['Content-Type'] = "text/html"
+
+    return envelope.map { |(k,v)| "#{k}: #{v}" }
+  end
+
+  ##
+  # Creates a new Analyzer that will read data from +logfile_name+.
+
+  def initialize(logfile_name)
+    @logfile_name = logfile_name
+    @request_times = Hash.new { |h,k| h[k] = [] }
+    @db_times = Hash.new { |h,k| h[k] = [] }
+    @render_times = Hash.new { |h,k| h[k] = [] }
+  end
+
+  ##
+  # Processes the log file collecting statistics from each found LogEntry.
+
+  def process
+    File.open @logfile_name do |fp|
+      LogParser.parse fp do |entry|
+        entry_page = entry.page
+        next if entry_page.nil?
+        @request_times[entry_page] << entry.request_time
+        @db_times[entry_page] << entry.db_time
+        @render_times[entry_page] << entry.render_time
+      end
+    end
+  end
 
-
-
+  ##
+  # The average total request time for all requests.
 
-
-
-
-    @db_times = Hash.new { |h,k| h[k] = [] }
-    @render_times = Hash.new { |h,k| h[k] = [] }
-    # @query_times = Hash.new { |h,k| h[k] = [] }
-  end
+  def average_request_time
+    return time_average(@request_times)
+  end
 
-
-
-
-  def process
-    File.open @logfile_name do |fp|
-      LogParser.parse fp do |entry|
-        entry_page = entry.page
-        @request_times[entry_page] << entry.request_time
-        @db_times[entry_page] << entry.db_time
-        @render_times[entry_page] << entry.render_time
-      end
-    end
-  end
+  ##
+  # The standard deviation of the total request time for all requests.
 
-
-
+  def request_time_std_dev
+    return time_std_dev(@request_times)
+  end
 
-
-
-  end
+  ##
+  # The +limit+ slowest total request times.
 
-
-
+  def slowest_request_times(limit = 10)
+    return slowest_times(@request_times, limit)
+  end
 
-
-
-  end
+  ##
+  # The average total database time for all requests.
 
-
-
+  def average_db_time
+    return time_average(@db_times)
+  end
 
-
-
-  end
+  ##
+  # The standard deviation of the total database time for all requests.
 
-
-
+  def db_time_std_dev
+    return time_std_dev(@db_times)
+  end
 
-
-
-  end
+  ##
+  # The +limit+ slowest total database times.
 
-
-
+  def slowest_db_times(limit = 10)
+    return slowest_times(@db_times, limit)
+  end
 
-
-
-  end
+  ##
+  # The average total render time for all requests.
 
-
-
+  def average_render_time
+    return time_average(@render_times)
+  end
 
-
-
-  end
+  ##
+  # The standard deviation of the total render time for all requests.
 
-
-
+  def render_time_std_dev
+    return time_std_dev(@render_times)
+  end
 
-
-
-  end
+  ##
+  # The +limit+ slowest total render times for all requests.
 
-
-
+  def slowest_render_times(limit = 10)
+    return slowest_times(@render_times, limit)
+  end
 
-
-
-  end
+  ##
+  # A list of count/min/max/avg/std dev for request times.
 
-
-
+  def request_times_summary
+    return summarize("Request Times", @request_times)
+  end
 
-
-
-  end
+  ##
+  # A list of count/min/max/avg/std dev for database times.
 
-
-
+  def db_times_summary
+    return summarize("DB Times", @db_times)
+  end
 
-
-
-  end
+  ##
+  # A list of count/min/max/avg/std dev for request times.
 
-
-
+  def render_times_summary
+    return summarize("Render Times", @render_times)
+  end
 
-
-
-  end
+  ##
+  # Builds a report containing +count+ slow items.
 
-
-
+  def report(count)
+    return "No requests to analyze" if request_times.empty?
 
-
-    return summarize("Render Times", @render_times)
-  end
+    text = []
 
-
-
-
-
-
-
-    text << request_times_summary
-    text << nil
-    text << "Slowest Request Times:"
-    slowest_request_times(count).each do |time, name|
-      text << "\t#{name} took #{'%0.3f' % time}s"
-    end
-    text << nil
-    text << "-" * 72
-    text << nil
-
-    text << db_times_summary
-    text << nil
-    text << "Slowest Total DB Times:"
-    slowest_db_times(count).each do |time, name|
-      text << "\t#{name} took #{'%0.3f' % time}s"
-    end
-    text << nil
-    text << "-" * 72
-    text << nil
-
-    text << render_times_summary
-    text << nil
-    text << "Slowest Total Render Times:"
-    slowest_render_times(count).each do |time, name|
-      text << "\t#{name} took #{'%0.3f' % time}s"
-    end
-    text << nil
-
-    return text.join($/)
+    text << request_times_summary
+    text << nil
+    text << "Slowest Request Times:"
+    slowest_request_times(count).each do |time, name|
+      text << "\t#{name} took #{'%0.3f' % time}s"
     end
+    text << nil
+    text << "-" * 72
+    text << nil
+
+    text << db_times_summary
+    text << nil
+    text << "Slowest Total DB Times:"
+    slowest_db_times(count).each do |time, name|
+      text << "\t#{name} took #{'%0.3f' % time}s"
+    end
+    text << nil
+    text << "-" * 72
+    text << nil
+
+    text << render_times_summary
+    text << nil
+    text << "Slowest Total Render Times:"
+    slowest_render_times(count).each do |time, name|
+      text << "\t#{name} took #{'%0.3f' % time}s"
+    end
+    text << nil
 
-
+    return text.join($/)
+  end
 
-
-    record = nil
-    list = []
+  private unless $TESTING
 
-
-
-
-    list << record.join("\t")
+  def summarize(title, records) # :nodoc:
+    record = nil
+    list = []
 
-
-
-
-
-    record.unshift [pad_request_name('ALL REQUESTS'), times.size]
-    list << record.join("\t")
+    # header
+    record = [pad_request_name("#{title} Summary"), 'Count', 'Avg', 'Std Dev',
+              'Min', 'Max']
+    list << record.join("\t")
 
-
-
+    # all requests
+    times = records.values.flatten
+    record = [times.average, times.standard_deviation, times.min, times.max]
+    record.map! { |v| "%0.3f" % v }
+    record.unshift [pad_request_name('ALL REQUESTS'), times.size]
+    list << record.join("\t")
 
-
-
-              times.min, times.max]
-    record.map! { |v| "%0.3f" % v }
-    record.unshift ["#{pad_request_name req}", times.size]
-    list << record.join("\t")
-  end
+    # spacer
+    list << nil
 
-
+    records.sort_by { |k,v| v.size}.reverse_each do |req, times|
+      record = [times.average, times.standard_deviation, times.min, times.max]
+      record.map! { |v| "%0.3f" % v }
+      record.unshift ["#{pad_request_name req}", times.size]
+      list << record.join("\t")
     end
 
-
-
+    return list.join("\n")
+  end
 
-
-
-        slowest_times << [time, name]
-      end
-    end
+  def slowest_times(records, limit) # :nodoc:
+    slowest_times = SlowestTimes.new limit
 
-
+    records.each do |name, times|
+      times.each do |time|
+        slowest_times << [time, name]
+      end
    end
 
-
-
-    times.delete 0
-    return times.average
-  end
-
-  def time_std_dev(records) # :nodoc:
-    times = records.values.flatten
-    times.delete 0
-    return times.standard_deviation
-  end
+    return slowest_times.sort_by { |time, name| time }.reverse
+  end
 
-
-
+  def time_average(records) # :nodoc:
+    times = records.values.flatten
+    times.delete 0
+    return times.average
+  end
 
-
-
-
+  def time_std_dev(records) # :nodoc:
+    times = records.values.flatten
+    times.delete 0
+    return times.standard_deviation
+  end
 
-
+  def longest_request_name # :nodoc:
+    return @longest_req if defined? @longest_req
 
-
+    names = @request_times.keys.map do |name|
+      (name||'Unknown').length + 1 # + : - HACK where does nil come from?
    end
 
-
-    name = (name||'Unknown') + ':' # HACK where does nil come from?
-    name += (' ' * (longest_request_name - name.length))
-  end
+    @longest_req = names.max
 
-
+    @longest_req = 'Unknown'.length + 1 if @longest_req.nil?
+
+    return @longest_req
+  end
 
-  #
+  def pad_request_name(name) # :nodoc:
+    name = (name||'Unknown') + ':' # HACK where does nil come from?
+    padding_width = longest_request_name - name.length
+    padding_width = 0 if padding_width < 0
+    name += (' ' * padding_width)
+  end
+
+end
 
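Pieced together from the methods above, driving the Analyzer directly looks roughly like this; the log path and recipient address are placeholders:

  require 'production_log/analyzer'

  analyzer = Analyzer.new '/var/log/production.log'
  analyzer.process
  puts analyzer.report(10)    # the Count/Avg/Std Dev/Min/Max summary shown in the README

  # or build and send the same report by mail:
  # Analyzer.email '/var/log/production.log', 'you@example.com', 'pl_analyze'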