apache_log_report 1.1.3 → 1.1.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 45396921f6e594c2b2dcce25320c292c08f4bb35ab7c19dc72fb99cec8520dd4
- data.tar.gz: 990dacd5e607a0f62a55f7248396ac095f2867f905807f91c3df1f6c06b5def0
+ metadata.gz: 3ad003b9ae4768923ebb253dc937277372ff9adee264d77eb3db103153d8bcb5
+ data.tar.gz: f843e794726853611240ca3267cfed35666e3a04c030a78426bb524a8e40c632
  SHA512:
- metadata.gz: a1c69d138504939d0c1917c7d9f278fff974fce0c27f82c64fd45b80641070924ea5f63a17b6efb4c64c63ea54f458a8b7d3d3f3e100664c70cdc3488be78950
- data.tar.gz: a675921834c7551b2b95bf983c08a71f40f94f1f16e2b24d6b4fb10c4be61f9ffd8abb9947f985f0419d1678045758934e448d7e77f4904d806f3c966b8af358
+ metadata.gz: 9a609336d73e83d15dd97939c1e1282ebe73b82d1b8e6d2cf07d2af769d4ad1e93803e825a38549c198d64f5f7e32637dcb6d3dea39eee14711e23b8adde35a0
+ data.tar.gz: 7fecc82af8cbb5debc54542f271855607479d1f142d1a725175afa51acbcc921f8955f4ba444594abc4e2cda4a6baf30421822c5d01e7d07bffcaf68eadbfe1e
data/README.org CHANGED
@@ -4,21 +4,102 @@
 
  * Introduction
 
+ ApacheLogReport generates reports and statistics from Apache web logs
+ in the =combined= format. Written in Ruby, it runs from the command
+ line, is fast, and can be installed on any system that supports Ruby.
+
+ ApacheLogReport follows in the footsteps of tools such as [[https://goaccess.io/][GoAccess]]
+ and [[https://umami.is/][Umami]], focusing on privacy and data ownership: the data
+ generated by ApacheLogReport is stored on your computer and owned by
+ you (as it should be).
+
+ ApacheLogReport is also inspired by static website generators:
+ statistics are generated from the command line and accessed as static
+ HTML files. By generating static resources, ApacheLogReport
+ significantly reduces your web server's attack surface, along with
+ installation headaches.
+
+ We have, for instance, a cron job running on our servers, generating
+ statistics at night. The generated files are then made available in a
+ private area on the web.
+
+ Statistics are generated from Apache logs in the =combined= format.
+ Reports are tailored, but not limited, to web servers serving static
+ websites. There is no need to install JavaScript code on your
+ websites: no cookies, no user tracking.
+
+ ApacheLogReport reports the following data:
+
+ - Visitors, hits, unique visitors, bandwidth used
+ - Most accessed HTML pages
+ - Most accessed resources
+ - Response statuses
+ - Referers
+ - OS, browsers, and devices
+ - Streaks: resources accessed by a given IP over time
+ - Potential attacks: accesses to resources which are not meant to be
+   served by a web server serving static websites
+
+ Command-line filters allow you to analyze specific periods and to
+ distinguish traffic generated by self polls and crawlers.
+
+ ApacheLogReport generates HTML and SQLite outputs. Moreover, it can
+ also generate reports in Org Mode format, which can then be exported
+ to various formats (including LaTeX and HTML).
+
  * Installation
 
+ #+begin_src bash
+ gem install apache_log_report
+ #+end_src
+
  * Usage
 
+ #+begin_src bash :results raw output :wrap example
+ apache_log_report --help
+ #+end_src
+
+ #+RESULTS:
+ #+begin_example
+ Usage: apache_log_report [options] [logfile]
+     -l, --limit=N                    Number of entries to show (defaults to 30)
+     -b, --begin=DATE                 Consider entries after or on DATE
+     -e, --end=DATE                   Consider entries before or on DATE
+     -i, --ignore-crawlers            Ignore crawlers
+     -p, --ignore-selfpoll            Ignore apaches self poll entries (from ::1)
+         --only-crawlers              Perform analysis on crawlers only
+     -u, --prefix=PREFIX              Prefix to add to all plots (used to run multiple analyses in the same dir)
+     -w, --suffix=SUFFIX              Suffix to add to all plots (used to run multiple analyses in the same dir)
+     -c, --code-export=WHAT           Control :export directive in Org Mode code blocks (code, results, *both*, none)
+     -f, --format=FORMAT              Output format: html, org, sqlite. Defaults to org mode
+     -v, --version                    Prints version information
+     -h, --help                       Prints this help
+ This is version 1.1.6
+ #+end_example
+
  * Change Log
 
  See the [[file:CHANGELOG.org][CHANGELOG]] file.
 
  * Todo
 
+ ** TODO Referers should only include the hostname?
  ** TODO Graphs in HTML output
- ** TODO Countries
+ ** TODO Countries/IP Lookup
+ ** TODO Light and Dark Theme
+ ** TODO Embed CSS
+ ** TODO Declare datatypes in table outputs, so that we can format data
 
  * Compatibility
 
+ ApacheLogReport should run on any system on which Ruby runs.
+
+ Concerning the outputs:
+
+ - The HTML report uses [[https://picturepan2.github.io/spectre/][Spectre.css]] and (will use) [[https://vega.github.io/vega-lite/][Vega-Lite]], which
+   are downloaded from a CDN
+ - Org Mode plots data using [[http://gnuplot.info/][Gnuplot]]
 
  * Author and Contributors
 
@@ -1,17 +1,41 @@
+
  module ApacheLogReport
  module DataCruncher
 
  #
  # take a sqlite3 database and analyze data
  #
+ # @ variables are automatically put in the returned data
+ #
 
  def self.crunch db, options = {}
- @first_day = db.execute "SELECT datetime from LogLine order by datetime limit 1"
- @last_day = db.execute "SELECT datetime from LogLine order by datetime desc limit 1"
+ first_day_s = db.execute "SELECT datetime from LogLine order by datetime limit 1"
+ last_day_s = db.execute "SELECT datetime from LogLine order by datetime desc limit 1"
+
+ # make first and last day into dates or nil
+ @first_day = first_day_s.empty? ? nil : Date.parse(first_day_s[0][0])
+ @last_day = last_day_s.empty? ? nil : Date.parse(last_day_s[0][0])
+
+ @total_days = 0
+ if @first_day and @last_day
+   @total_days = (@last_day - @first_day).to_i
+ end
+
  @log_size = db.execute "SELECT count(datetime) from LogLine"
  @crawlers_size = db.execute "SELECT count(datetime) from LogLine where bot == 1"
  @selfpolls_size = db.execute "SELECT count(datetime) from LogLine where ip == '::1'"
 
+ @first_day_requested = options[:from_date]
+ @last_day_requested = options[:to_date]
+
+ @first_day_in_analysis = date_intersect options[:from_date], @first_day, :max
+ @last_day_in_analysis = date_intersect options[:to_date], @last_day, :min
+
+ @total_days_in_analysis = 0
+ if @first_day_in_analysis and @last_day_in_analysis
+   @total_days_in_analysis = (@last_day_in_analysis - @first_day_in_analysis).to_i
+ end
+
  #
  # generate the where clause corresponding to the command line options to filter data
  #
@@ -39,12 +63,23 @@ module ApacheLogReport
  END AS size
  EOS
 
+ human_readable_day = <<-EOS
+ case cast (strftime('%w', datetime) as integer)
+ when 0 then 'Sunday'
+ when 1 then 'Monday'
+ when 2 then 'Tuesday'
+ when 3 then 'Wednesday'
+ when 4 then 'Thursday'
+ when 5 then 'Friday'
+ else 'Saturday'
+ end as dow
+ EOS
+
  @total_hits = db.execute "SELECT count(datetime) from LogLine where #{filter}"
  @total_unique_visitors = db.execute "SELECT count(distinct(unique_visitor)) from LogLine where #{filter}"
  @total_size = db.execute "SELECT #{human_readable_size} from LogLine where #{filter}"
- @total_days = (Date.parse(@last_day[0][0]) - Date.parse(@first_day[0][0])).to_i
 
- @daily_distribution = db.execute "SELECT date(datetime), count(datetime), count(distinct(unique_visitor)), #{human_readable_size} from LogLine where #{filter} group by date(datetime)"
+ @daily_distribution = db.execute "SELECT date(datetime), #{human_readable_day}, count(datetime), count(distinct(unique_visitor)), #{human_readable_size} from LogLine where #{filter} group by date(datetime)"
  @time_distribution = db.execute "SELECT strftime('%H', datetime), count(datetime), count(distinct(unique_visitor)), #{human_readable_size} from LogLine where #{filter} group by strftime('%H', datetime)"
  @most_requested_pages = db.execute "SELECT path, count(path), count(distinct(unique_visitor)), #{human_readable_size} from LogLine where extension == '.html' and #{filter} group by path order by count(path) desc limit #{options[:limit]}"
  @most_requested_resources = db.execute "SELECT path, count(path), count(distinct(unique_visitor)), #{human_readable_size} from LogLine where #{filter} group by path order by count(path) desc limit #{options[:limit]}"
@@ -80,7 +115,20 @@ module ApacheLogReport
  end
  data
  end
- end
 
+ private
+
+ def self.date_intersect date1, date2, method
+   if date1 and date2
+     [date1, date2].send(method)
+   elsif date1
+     date1
+   else
+     date2
+   end
+ end
+
+
+ end
  end
 
@@ -18,11 +18,11 @@ module ApacheLogReport
  args[:limit] = n
  end
 
- opts.on("-bDATE", "--begin=DATE", DateTime, "Consider entries after or on DATE") do |n|
+ opts.on("-bDATE", "--begin=DATE", Date, "Consider entries after or on DATE") do |n|
  args[:from_date] = n
  end
 
- opts.on("-eDATE", "--end=DATE", DateTime, "Consider entries before or on DATE") do |n|
+ opts.on("-eDATE", "--end=DATE", Date, "Consider entries before or on DATE") do |n|
  args[:to_date] = n
  end
 
@@ -11,208 +11,264 @@
  </head>
 
  <body>
- <section class="container">
- <h1>Apache Log Analysis: <%= data[:log_file] || "stdin" %></h1>
-
- <div class="columns">
- <article class="col-6 column">
- <h2>Summary</h2>
-
- <table class="table summary">
- <tr>
- <th class="hits">Hits</th>
- <td class="hits"><%= data[:total_hits][0][0] %></td>
- </tr>
- <tr>
- <th class="unique-visitors">Unique Visitors</th>
- <td class="unique-visitors"><%= data[:total_unique_visitors][0][0] %></td>
- </tr>
- <tr>
- <th class="tx">Tx</th>
- <td class="tx"><%= data[:total_size][0][0] %></td>
- </tr>
- <tr>
- <th class="period">Period</th>
- <td class="period">
- <%= data[:first_day][0][0] %>
- --
- <%= data[:last_day][0][0] %>
- </td>
- </tr>
- <tr>
- <th class="days">Days </th>
- <td class="days"><%= data[:total_days] %></td>
- </tr>
- </table>
- </article>
- <article class="column col-6">
- <h3> Log Structure</h3>
+ <div class="container">
+ <nav>
+ <ul class="nav">
+ <li class="nav-item active">
+ <a href="#">Navigation</a>
+ <ul class="nav">
+ <% [ "Summary",
+ "Log Structure",
+ "Daily Distribution",
+ "Time Distribution",
+ "Most Requested Pages",
+ "Most Requested Resources",
+ "404 on HTML Files",
+ "404 on other Resources",
+ "Attacks",
+ "Statuses",
+ "Daily Statuses",
+ "Browsers",
+ "Platforms",
+ "Referers",
+ "IPs",
+ "Command Invocation",
+ "Performance"
+ ].each do |item| %>
+ <li class="nav-item">
+ <a href="#<%= item.downcase.gsub(' ', '-') %>"><%= item %></a>
+ </li>
+ <% end %>
+ </ul>
+ </li>
+ </ul>
+ <p>
+ Generated by<br />
+ <a href="https://www.ict4g.net/gitea/adolfo/apache_log_report">
+ Apache Log Report
+ </a> <br />
+ on <%= DateTime.now.strftime("%Y-%m-%d %H:%M") %>.<br />
+ The lean log analyzer.
+ </p>
+ </nav>
 
- <table class="table log-structure">
- <tbody>
+ <section>
+ <h1>Apache Log Analysis: <%= data[:log_file] || "stdin" %></h1>
+
+ <div class="columns">
+ <article class="col-6 column">
+ <h2 id="summary">Summary</h2>
+
+ <table class="table summary">
  <tr>
  <th>Input file</th>
  <td><b><%= (data[:log_file] || "stdin") %></b></td>
  </tr>
  <tr>
- <th>Log size</th>
- <td><%= data[:log_size][0][0] %></td>
+ <th class="period">Period Analyzed</th>
+ <td class="period">
+ <%= data[:first_day_in_analysis] %>
+ --
+ <%= data[:last_day_in_analysis] %>
+ </td>
  </tr>
  <tr>
- <th>Self poll entries</th>
- <td><%= data[:selfpolls_size][0][0] %></td>
+ <th class="days">Days </th>
+ <td class="days"><%= data[:total_days_in_analysis] %></td>
  </tr>
  <tr>
- <th>Crawlers</th>
- <td><%= data[:crawlers_size][0][0] %></td>
+ <th class="hits">Hits</th>
+ <td class="hits"><%= data[:total_hits][0][0] %></td>
  </tr>
  <tr>
- <th>Entries considered</th>
- <td><%= data[:total_hits][0][0] %></td>
+ <th class="unique-visitors">Unique Visitors</th>
+ <td class="unique-visitors"><%= data[:total_unique_visitors][0][0] %></td>
  </tr>
- </tbody>
- </table>
- </article>
- </div>
-
- <% @reports = [
- { title: "Daily Distribution", header: ["Day", "Hits", "Visits", "Size"], rows: data[:daily_distribution] },
- { title: "Time Distribution", header: ["Hour", "Hits", "Visits", "Size"], rows: data[:time_distribution] },
- { title: "Most Requested Pages", header: ["Path", "Hits", "Visits", "Size"], rows: data[:most_requested_pages] },
- { title: "Most Requested Resources", header: ["Path", "Hits", "Visits", "Size"], rows: data[:most_requested_resources] },
- { title: "404 on HTML Files", header: ["Path", "Hits", "Visitors"], rows: data[:missed_pages] },
- { title: "404 on other Resources", header: ["Path", "Hits", "Visitors"], rows: data[:missed_resources] },
- { title: "Attacks", header: ["Path", "Hits", "Visitors"], rows: data[:attacks] },
- { },
- { title: "Statuses", header: ["Status", "Count"], rows: data[:statuses] },
- { title: "Daily Statuses", header: ["Status", "2xx", "3xx", "4xx"], rows: data[:statuses_by_day] },
- { title: "Browsers", header: ["Browser", "Hits", "Visitors", "Size"], rows: data[:browsers] },
- { title: "Platforms", header: ["Platform", "Hits", "Visitors", "Size"], rows: data[:platforms] },
- { title: "Referers", header: ["Referers", "Hits", "Visitors", "Size"], rows: data[:referers], col: "col-12" },
- { title: "IPs", header: ["IPs", "Hits", "Visitors", "Size"], rows: data[:ips] },
- { },
- ]
- %>
- <div class="columns">
- <% @reports.each do |report| %>
- <div class="column <%= report[:col] || "col-6" %>">
- <article>
- <% if report[:title] != nil %>
- <h2><%= report[:title] %></h2>
- <%= render "output_table", report %>
- <% end %>
- </article>
- </div>
- <% end %>
- </div>
-
- <article>
- <h2>Streaks</h2>
-
- <table class="table streaks">
- <thead>
- <tr>
- <th>IP</th>
- <th>Day and URL</th>
- </tr>
- </thead>
- <tbody>
- <% data[:streaks].group_by(&:first).each do |ip, date_urls| %>
  <tr>
- <td class="ip"><%= ip %></td>
- <td class="streaks">
- <% date_urls.group_by(&:first).each do |date, urls| %>
- <% urls.each do |url| %>
- <b><%= url[1] %>:</b> <%= url[2] %> <br />
- <% end %>
- <% end %>
- </td>
+ <th class="tx">Tx</th>
+ <td class="tx"><%= data[:total_size][0][0] %></td>
  </tr>
- <% end %>
- </tbody>
- </table>
- </article>
-
- <div class="columns">
- <div class="column col-6">
- <article>
- <h2>Command Invocation</h2>
+ </table>
+ </article>
+ <article class="column col-6">
+ <h2 id="log-structure">Log Structure</h2>
 
- <table class="table command-invocation">
+ <table class="table log-structure">
  <tbody>
- <tr>
- <th>CLI Command</th>
- <td><pre><%= data[:command] %></pre></td>
- </tr>
  <tr>
  <th>Input file</th>
- <td><code><%= (data[:log_file] || "stdin") %></code></td>
+ <td><b><%= (data[:log_file] || "stdin") %></b></td>
  </tr>
  <tr>
- <th>Ignore crawlers</th>
- <td><code><%= options[:ignore_crawlers] %></code></td></tr>
+ <th>Period in Log</th>
+ <td><%= data[:first_day] %> -- <%= data[:last_day] %></td>
+ </tr>
  <tr>
- <th>Only crawlers</th>
- <td><code><%= options[:only_crawlers] %></code></td>
+ <th>Total days</th>
+ <td><%= data[:total_days] %></td>
  </tr>
  <tr>
- <th>No selfpoll</th>
- <td><code><%= options[:no_selfpoll] %></code></td>
+ <th>Log size</th>
+ <td><%= data[:log_size][0][0] %></td>
  </tr>
  <tr>
- <th>Filter by date</th>
- <td>
- <code><%= (options[:from_date] != nil or options[:to_date] != nil) %></code>
- </td>
+ <th>Self poll entries</th>
+ <td><%= data[:selfpolls_size][0][0] %></td>
  </tr>
  <tr>
- <th>Prefix</th>
- <td><code><%= @prefix %></code></td>
+ <th>Crawlers</th>
+ <td><%= data[:crawlers_size][0][0] %></td>
  </tr>
  <tr>
- <th>Suffix</th>
- <td><code><%= @suffix %></code></td>
+ <th>Entries considered</th>
+ <td><%= data[:total_hits][0][0] %></td>
  </tr>
  </tbody>
  </table>
  </article>
  </div>
 
- <div class="column col-6">
- <article>
- <h2> Performance</h2>
+ <% @reports = [
+ { title: "Daily Distribution", header: ["Day", "DOW", "Hits", "Visits", "Size"], rows: data[:daily_distribution] },
+ { title: "Time Distribution", header: ["Hour", "Hits", "Visits", "Size"], rows: data[:time_distribution] },
+ { title: "Most Requested Pages", header: ["Path", "Hits", "Visits", "Size"], rows: data[:most_requested_pages] },
+ { title: "Most Requested Resources", header: ["Path", "Hits", "Visits", "Size"], rows: data[:most_requested_resources] },
+ { title: "404 on HTML Files", header: ["Path", "Hits", "Visitors"], rows: data[:missed_pages] },
+ { title: "404 on other Resources", header: ["Path", "Hits", "Visitors"], rows: data[:missed_resources] },
+ { title: "Attacks", header: ["Path", "Hits", "Visitors"], rows: data[:attacks] },
+ { },
+ { title: "Statuses", header: ["Status", "Count"], rows: data[:statuses] },
+ { title: "Daily Statuses", header: ["Status", "2xx", "3xx", "4xx"], rows: data[:statuses_by_day] },
+ { title: "Browsers", header: ["Browser", "Hits", "Visitors", "Size"], rows: data[:browsers] },
+ { title: "Platforms", header: ["Platform", "Hits", "Visitors", "Size"], rows: data[:platforms] },
+ { title: "Referers", header: ["Referers", "Hits", "Visitors", "Size"], rows: data[:referers], col: "col-12" },
+ { title: "IPs", header: ["IPs", "Hits", "Visitors", "Size"], rows: data[:ips] },
+ { },
+ ]
+ %>
+ <div class="columns">
+ <% @reports.each do |report| %>
+ <div class="column <%= report[:col] || "col-6" %>">
+ <article>
+ <% if report[:title] != nil %>
+ <h2 id="<%= report[:title].downcase.gsub(/ +/, '-') %>">
+ <%= report[:title] %>
+ </h2>
+ <%= render "output_table", report %>
+ <% end %>
+ </article>
+ </div>
+ <% end %>
+ </div>
 
- <table class="table performance">
- <tbody>
- <tr>
- <th>Analysis started at</th>
- <td><%= data[:started_at].to_s %></td>
- </tr>
- <tr>
- <th>Analysis ended at</th>
- <td><%= data[:ended_at].to_s %></td>
- </tr>
- <tr>
- <th>Duration (sec)</th>
- <td><%= "%.1f" % data[:duration] %></td>
- </tr>
- <tr>
- <th>Duration (min)</th>
- <td><%= "%d" % (data[:duration] / 60 ) %></td>
- </tr>
+ <article>
+ <h2 id="streaks">Streaks</h2>
+
+ <table class="table streaks">
+ <thead>
+ <tr>
+ <th>IP</th>
+ <th>Day and URL</th>
+ </tr>
+ </thead>
+ <tbody>
+ <% data[:streaks].group_by(&:first).each do |ip, date_urls| %>
  <tr>
- <th>Log size</th>
- <td><%= data[:log_size][0][0] %></td>
+ <td class="ip"><%= ip %></td>
+ <td class="streaks">
+ <% date_urls.group_by(&:first).each do |date, urls| %>
+ <% urls.each do |url| %>
+ <b><%= url[1] %>:</b> <%= url[2] %> <br />
+ <% end %>
+ <% end %>
+ </td>
  </tr>
- <tr>
- <th>Lines/sec</th>
- <td><%= "%.2f" % (data[:log_size][0][0] / data[:duration]) %></td></tr>
- </tbody>
- </table>
- </article>
+ <% end %>
+ </tbody>
+ </table>
+ </article>
+
+ <div class="columns">
+ <div class="column col-6">
+ <article>
+ <h2 id="command-invocation">Command Invocation</h2>
+
+ <table class="table command-invocation">
+ <tbody>
+ <tr>
+ <th>CLI Command</th>
+ <td><code><%= data[:command] %></code></td>
+ </tr>
+ <tr>
+ <th>Input file</th>
+ <td><code><%= (data[:log_file] || "stdin") %></code></td>
+ </tr>
+ <tr>
+ <th>Ignore crawlers</th>
+ <td><code><%= options[:ignore_crawlers] %></code></td></tr>
+ <tr>
+ <th>Only crawlers</th>
+ <td><code><%= options[:only_crawlers] %></code></td>
+ </tr>
+ <tr>
+ <th>No selfpoll</th>
+ <td><code><%= options[:no_selfpoll] %></code></td>
+ </tr>
+ <tr>
+ <th>Filter by date</th>
+ <td>
+ <code><%= (options[:from_date] != nil or options[:to_date] != nil) %></code>
+ </td>
+ </tr>
+ <tr>
+ <th>Prefix</th>
+ <td><code><%= @prefix %></code></td>
+ </tr>
+ <tr>
+ <th>Suffix</th>
+ <td><code><%= @suffix %></code></td>
+ </tr>
+ </tbody>
+ </table>
+ </article>
+ </div>
+
+ <div class="column col-6">
+ <article>
+ <h2 id="performance"> Performance</h2>
+
+ <table class="table performance">
+ <tbody>
+ <tr>
+ <th>Analysis started at</th>
+ <td><%= data[:started_at].to_s %></td>
+ </tr>
+ <tr>
+ <th>Analysis ended at</th>
+ <td><%= data[:ended_at].to_s %></td>
+ </tr>
+ <tr>
+ <th>Duration (sec)</th>
+ <td><%= "%.1f" % data[:duration] %></td>
+ </tr>
+ <tr>
+ <th>Duration (min)</th>
+ <td><%= "%d" % (data[:duration] / 60 ) %></td>
+ </tr>
+ <tr>
+ <th>Log size</th>
+ <td><%= data[:log_size][0][0] %></td>
+ </tr>
+ <tr>
+ <th>Lines/sec</th>
+ <td><%= "%.2f" % (data[:log_size][0][0] / data[:duration]) %></td></tr>
+ </tbody>
+ </table>
+ </article>
+ </div>
  </div>
- </div>
- </section>
+ </section>
+ </div>
  </body>
  </html>
 
@@ -10,7 +10,11 @@
  | Hits | <%= "%10d" % data[:total_hits][0][0] %> |
  | Unique Visitors | <%= "%10d" % data[:total_unique_visitors][0][0] %> |
  | Tx | <%= "%10s" % data[:total_size][0][0] %> |
- | Days | <%= "%10d" % data[:total_days][0][0] %> |
+ | Logged Period | <%= data[:first_day] %> -- <%= data[:last_day] %> |
+ | Days | <%= "%10d" % data[:total_days] %> |
+ | Period Requested | <%= data[:first_day_requested] %> -- <%= data[:last_day_requested] %> |
+ | Period Analyzed | <%= data[:first_day_in_analysis] %> -- <%= data[:last_day_in_analysis] %> |
+ | Days in Analysis | <%= data[:total_days_in_analysis] %> |
 
  * Daily Distribution
 
@@ -1,3 +1,3 @@
  module ApacheLogReport
- VERSION = "1.1.3"
+ VERSION = "1.1.7"
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: apache_log_report
  version: !ruby/object:Gem::Version
- version: 1.1.3
+ version: 1.1.7
  platform: ruby
  authors:
  - Adolfo Villafiorita
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2021-10-31 00:00:00.000000000 Z
+ date: 2021-11-01 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: apache_log-parser
@@ -119,7 +119,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.2.22
+ rubygems_version: 3.0.3
  signing_key:
  specification_version: 4
  summary: Generate analytics from an Apache log file.