logstash-output-chronix 0.1.0 → 0.1.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz: 809b920a229b6a68d3d15a66c006255f7130fe55
-  data.tar.gz: 243a71edf32944d0e2d11c4b481b3382c2600efb
+  metadata.gz: 79a2fa0b8712acee630806c6bacc9ff585a71555
+  data.tar.gz: c8d381ff95b73cf7bd954f825536ba4451aab845
 SHA512:
-  metadata.gz: 7051227f0486a1de7944c917fa967e043e6e45419376a35b4a0cf85bc87344465e1771c4607271d1910c297d9bdf32a73f75112354f93f0d2a1177d5b3082195
-  data.tar.gz: 0387b6a2ee030d7ee24eadceaa57400c956e729e9f7b35b1364257813aa705acb4af6b33ece4b5fa5431c16e5eedbab5769be2cb93e81a9d88eaf1866974913a
+  metadata.gz: 2a8f28c532b41215fc056821685eb0e3cfe2f313ff5c2bf0f51dee466529345dc50d86a381769a30c9ff2c19308f909b2e2f3dcea4ae109dc8a81d11825983f7
+  data.tar.gz: 25b29e31b85bdbddbcfa8c20fa0c71f3bfe1e844cff6c3c173f25fe5ee38565ad766ac6fe5d222a10599a41ccb3ff4507bcf6be94641a1a35ba29b7f9168133c
data/README.md CHANGED
@@ -2,13 +2,22 @@
 
 [![Build Status](https://travis-ci.org/ChronixDB/chronix.logstash.svg?branch=master)](https://travis-ci.org/ChronixDB/chronix.logstash)
 
-This is a plugin for [Logstash](https://github.com/elastic/logstash) to write time series to [Chronix](https://github.com/ChronixDB) released under the Apache 2.0 license.
+This is a plugin for [Logstash](https://github.com/elastic/logstash) to write time series to [Chronix](https://github.com/ChronixDB), released under the [Apache 2.0 License](LICENSE).
 
 ## Install the plugin
 
-There are two options to install the plugin. (check one the official [Logstash Repos](https://github.com/logstash-plugins/logstash-output-example#2-running-your-unpublished-plugin-in-logstash) for reference)
+There are two options to install the plugin: via `gem install` or manually (see also the [Logstash Repos](https://github.com/logstash-plugins/logstash-output-example#2-running-your-unpublished-plugin-in-logstash) for reference).
 
-### Install in a local Logstash Clone:
+### Install via gem install
+Just type
+```sh
+gem install logstash-output-chronix
+```
+to install the plugin. See the [configuration section](#configuration) for configuration options.
+
+### Install manually
+
+#### Install in a local Logstash Clone
 - Edit Logstash `Gemfile` and add the local plugin path, for example:
 ```ruby
 gem "logstash-output-chronix", :path => "/path/to/logstash-output-chronix"
@@ -22,7 +31,7 @@ bin/plugin install --no-verify
 bin/logstash -e your_config.conf
 ```
 
-### Run in an installed Logstash
+#### Run in an installed Logstash
 
 You can use the same method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory or you can build the gem and install it using:
 
@@ -54,6 +63,7 @@ chronix {
   host => "192.168.0.1" # default is 'localhost'
   port => "8983" # default is '8983'
   path => "/solr/chronix/" # default is '/solr/chronix/'
+  threshold => 10 # default is 10 (every delta with (delta - prev_delta) < threshold will be nulled)
   flush_size => 100 # default is '100' (Number of events to queue up before writing to Solr)
   idle_flush_time => 30 # default is '30' (Amount of time since the last flush before a flush is done)
 }
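The new `threshold` option drives the plugin's delta compression: each point stores a delta to its predecessor, and a delta is nulled (stored as 0) when it is almost equal to the previous delta. A simplified sketch of that rule follows; the method name is illustrative (not the plugin's API), and it omits the drift check the plugin additionally applies:

```ruby
# Illustrative sketch of threshold-based delta nulling (not the plugin's code).
# A delta is stored as 0 when it differs from the previous delta by no more
# than the threshold; otherwise the real delta is kept.
def null_small_deltas(timestamps, threshold)
  prev_delta = 0
  last = nil
  timestamps.map do |t|
    delta = last.nil? ? 0 : t - last
    last = t
    stored = (delta - prev_delta).abs <= threshold ? 0 : delta
    prev_delta = delta
    stored
  end
end

p null_small_deltas([0, 10, 20, 30, 55], 10)  # → [0, 0, 0, 0, 25]
```

The real plugin also tracks accumulated drift, so a long run of nulled deltas is periodically re-anchored with a true delta (see the `noDrift` check in the code below).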
lib/logstash/outputs/chronix.rb CHANGED
@@ -12,6 +12,7 @@ require "rubygems"
 require "stud/buffer"
 require "zlib"
 require_relative "proto/Point.rb"
+require_relative "proto/StracePoint.rb"
 
 class LogStash::Outputs::Chronix < LogStash::Outputs::Base
   include Stud::Buffer
@@ -27,6 +28,9 @@ class LogStash::Outputs::Chronix < LogStash::Outputs::Base
   # path to chronix, default: /solr/chronix/
   config :path, :validate => :string, :default => "/solr/chronix/"
 
+  # threshold for delta-calculation, every (delta - prev_delta) < threshold will be nulled
+  config :threshold, :validate => :number, :default => 10
+
   # Number of events to queue up before writing to Solr
   config :flush_size, :validate => :number, :default => 100
 
@@ -59,10 +63,27 @@ class LogStash::Outputs::Chronix < LogStash::Outputs::Base
 
   public
   def flush(events, close=false)
+    pointHash = createPointHash(events)
+
+    documents = []
+
+    # iterate through pointHash and create a solr document
+    pointHash.each { |metric, phash|
+      documents << createSolrDocument(metric, phash)
+    }
+
+    # send to chronix
+    @solr.add documents
+    @solr.update :data => '<commit/>'
+  end #def flush
+
+  # this method iterates through all events and creates a hash with different lists of points sorted by metric
+  def createPointHash(events)
     pointHash = Hash.new
 
     # add each event to our hash, sorted by metrics as key
     events.each do |event|
+
       eventData = event.to_hash()
 
       # format the timestamp to unix format
@@ -71,27 +92,37 @@ class LogStash::Outputs::Chronix < LogStash::Outputs::Base
 
       # if there is no list for the current metric -> create a new one
       if pointHash[metric] == nil
-        pointHash[metric] = {"startTime" => timestamp, "endTime" => timestamp, "points" => Chronix::Points.new}
+        pointHash[metric] = {"startTime" => timestamp, "lastTimestamp" => 0, "points" => Chronix::Points.new, "prevDelta" => 0, "timeSinceLastDelta" => 0, "lastStoredDate" => timestamp}
       end
 
-      pointHash[metric]["endTime"] = timestamp
+      if pointHash[metric]["lastTimestamp"] == 0
+        delta = 0
+      else
+        delta = timestamp - pointHash[metric]["lastTimestamp"]
+      end
 
-      # insert the current point in our list
-      pointHash[metric]["points"].p << createChronixPoint(timestamp, eventData["value"])
+      if (almostEquals(delta, pointHash[metric]["prevDelta"]) && noDrift(timestamp, pointHash[metric]["lastStoredDate"], pointHash[metric]["timeSinceLastDelta"]))
+        # insert the current point in our list
+        pointHash[metric]["points"].p << createChronixPoint(0, eventData["value"], eventData["chronix_type"])
 
-    end #end do
-
-    documents = []
+        pointHash[metric]["timeSinceLastDelta"] += 1
 
-    # iterate through pointHash and create a solr document
-    pointHash.each { |metric, phash|
-      documents << createSolrDocument(metric, phash)
-    }
+      else
+        # insert the current point in our list
+        pointHash[metric]["points"].p << createChronixPoint(delta, eventData["value"], eventData["chronix_type"])
 
-    # send to chronix
-    @solr.add documents
-    @solr.update :data => '<commit/>'
-  end #def flush
+        pointHash[metric]["timeSinceLastDelta"] = 1
+        pointHash[metric]["lastStoredDate"] = timestamp
+      end
+
+      # save current timestamp as lastTimestamp and the previousOffset
+      pointHash[metric]["lastTimestamp"] = timestamp
+      pointHash[metric]["prevDelta"] = delta
+
+    end #end do
+
+    return pointHash
+  end
 
   # this method zips and base64 encodes the list of points
   def zipAndEncode(points)
@@ -109,12 +140,32 @@ class LogStash::Outputs::Chronix < LogStash::Outputs::Base
     return Base64.strict_encode64(data)
   end
 
-  def createChronixPoint(timestamp, value)
-    return Chronix::Point.new( :t => timestamp, :v => value )
+  def createChronixPoint(delta, value, type = "")
+    if type == "strace"
+      return Chronix::StracePoint.new( :t => delta, :v => value )
+    else
+      return Chronix::Point.new( :t => delta, :v => value )
+    end
   end
 
   def createSolrDocument(metric, phash)
-    return { :metric => metric, :start => phash["startTime"], :end => phash["endTime"], :data => zipAndEncode(phash["points"]) }
+    endTime = phash["lastTimestamp"] # maybe use startTime + delta here?!
+    return { :metric => metric, :start => phash["startTime"], :end => endTime, :data => zipAndEncode(phash["points"]), :threshold => @threshold }
+  end
+
+  # checks if two offsets are almost equals
+  def almostEquals(delta, prevDelta)
+    diff = (delta - prevDelta).abs
+
+    return (diff <= @threshold)
+  end
+
+  # checks if there is a drift
+  def noDrift(timestamp, lastStoredDate, timeSinceLastDelta)
+    calcMaxOffset = @threshold * timeSinceLastDelta
+    drift = lastStoredDate + calcMaxOffset - timestamp.to_i
+
+    return (drift <= (@threshold / 2))
   end
 
 end # class LogStash::Outputs::Chronix
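`zipAndEncode` itself is unchanged by this diff: it gzips the serialized `Chronix::Points` payload and base64-encodes the result (the sample documents in the spec start with `H4sI`, the base64 of the gzip magic bytes). A round-trip sketch, using a plain string as a stand-in for the protobuf payload:

```ruby
require "zlib"
require "base64"
require "stringio"

# Gzip + strict-base64 encode, mirroring what zipAndEncode does to the
# serialized points (a plain string stands in for the protobuf bytes here).
def zip_and_encode(data)
  buf = StringIO.new
  gz = Zlib::GzipWriter.new(buf)
  gz.write(data)
  gz.close
  Base64.strict_encode64(buf.string)
end

# Inverse operation: base64-decode, then gunzip.
def decode_and_unzip(encoded)
  Zlib::GzipReader.new(StringIO.new(Base64.strict_decode64(encoded))).read
end

p decode_and_unzip(zip_and_encode("points"))  # round-trips the payload
```

Strict base64 (no line breaks) matters here because the string is stored in a single Solr field.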
lib/logstash/outputs/proto/StracePoint.proto ADDED
@@ -0,0 +1,15 @@
+syntax = "proto3";
+option optimize_for = SPEED;
+
+//Our point
+message StracePoint {
+  //The date as int64 (java long)
+  int64 t = 1;
+  string v = 2;
+}
+
+//The data of a time series is a list of points
+message StracePoints {
+  //The list of points
+  repeated StracePoint p = 1;
+}
lib/logstash/outputs/proto/StracePoint.rb ADDED
@@ -0,0 +1,14 @@
+# NOT GENERATED
+
+require 'protobuf/message'
+
+module Chronix
+  class StracePoint < ::Protobuf::Message
+    optional :int64, :t, 1
+    required :string, :v, 2
+  end
+
+  class StracePoints < ::Protobuf::Message
+    repeated ::Chronix::StracePoint, :p, 1
+  end
+end
logstash-output-chronix.gemspec CHANGED
@@ -1,6 +1,6 @@
 Gem::Specification.new do |s|
   s.name = 'logstash-output-chronix'
-  s.version = "0.1.0"
+  s.version = "0.1.1"
   s.licenses = ["Apache License (2.0)"]
   s.summary = "This output stores your logs in chronix"
   s.description = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
@@ -23,5 +23,5 @@ Gem::Specification.new do |s|
   s.add_runtime_dependency "stud"
   s.add_runtime_dependency "rsolr"
   s.add_runtime_dependency "protobuf"
-  s.add_development_dependency "logstash-devutils"
+  s.add_development_dependency "logstash-devutils", ">= 0.0.19", "< 0.0.20"
 end
spec/outputs/chronix_spec.rb CHANGED
@@ -36,11 +36,13 @@ describe LogStash::Outputs::Chronix do
   end
 
   context "test basic functions zip and encode, createPoint, createDocument" do
-    subject { LogStash::Outputs::Chronix.new( "flush_size" => 3, "idle_flush_time" => 10 ) }
+    # no flushing of the buffer needed, that's why we use 3 as flush_size here
+    subject { LogStash::Outputs::Chronix.new( "threshold" => 10, "flush_size" => 3, "idle_flush_time" => 10 ) }
 
     let(:ttimestamp) { "1459353272" }
     let(:tmetric) { "test1" }
     let(:tvalue) { "10.5" }
+    let(:svalue) { "string" }
     let(:events) { [LogStash::Event.new("metric" => tmetric, "value" => tvalue)] }
 
     it "should return a Chronix::Point" do
@@ -48,6 +50,12 @@ describe LogStash::Outputs::Chronix do
       expectedResult = Chronix::Point.new( :t => ttimestamp, :v => tvalue )
       expect(point).to eq(expectedResult)
     end
+
+    it "should return a Chronix::StracePoint" do
+      point = subject.createChronixPoint(0, svalue, "strace")
+      expectedResult = Chronix::StracePoint.new( :t => 0, :v => svalue )
+      expect(point).to eq(expectedResult)
+    end
 
     it "should return a zipped and base64 encoded string containing the data" do
       points = Chronix::Points.new
@@ -56,12 +64,20 @@ describe LogStash::Outputs::Chronix do
       expect(subject.zipAndEncode(points)).to eq(expectedResult)
     end
 
+    it "should create a correct point hash" do
+      points = Chronix::Points.new
+      points.p << subject.createChronixPoint(0, tvalue)
+      phash = {tmetric => {"startTime" => ttimestamp.to_i, "lastTimestamp" => ttimestamp.to_i, "points" => points, "prevDelta" => 0, "timeSinceLastDelta" => 1, "lastStoredDate" => ttimestamp.to_i}}
+      events = [LogStash::Event.new("metric" => tmetric, "value" => tvalue, "@timestamp" => "2016-03-30T15:54:32.172Z")]
+      expect(subject.createPointHash(events)).to eq(phash)
+    end
+
     it "should create a valid document" do
       points = Chronix::Points.new
       points.p << subject.createChronixPoint(ttimestamp, tvalue)
-      phash = {"startTime" => ttimestamp, "endTime" => ttimestamp, "points" => points}
+      phash = {"startTime" => ttimestamp, "lastTimestamp" => ttimestamp, "points" => points}
       document = subject.createSolrDocument(tmetric, phash)
-      sampleDoc = { :metric => tmetric, :start => phash["startTime"], :end => phash["endTime"], :data => "H4sIAAAAAAAA/+Pi59jx9v12VkEGMFB1AACWVOXHEQAAAA==" }
+      sampleDoc = { :metric => tmetric, :start => phash["startTime"], :end => phash["lastTimestamp"], :data => "H4sIAAAAAAAA/+Pi59jx9v12VkEGMFB1AACWVOXHEQAAAA==", :threshold => 10 }
       expect(document).to eq(sampleDoc)
     end
 
@@ -71,20 +87,85 @@ describe LogStash::Outputs::Chronix do
     end
   end
 
+  context "test delta calculation" do
+    # no flushing of the buffer needed, that's why we use 3 as flush_size here
+    subject { LogStash::Outputs::Chronix.new( "threshold" => 10, "flush_size" => 3, "idle_flush_time" => 10 ) }
+
+    let(:tmetric) { "test1" }
+    let(:tvalue) { "10.5" }
+    let(:events) { [LogStash::Event.new("metric" => tmetric, "value" => tvalue)] }
+
+    p_ev = []
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "1.0", "@timestamp" => "2016-05-10T15:00:10.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "2.0", "@timestamp" => "2016-05-10T15:00:20.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "3.0", "@timestamp" => "2016-05-10T15:00:30.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "4.0", "@timestamp" => "2016-05-10T15:00:39.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "5.0", "@timestamp" => "2016-05-10T15:00:48.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "6.0", "@timestamp" => "2016-05-10T15:00:57.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "7.0", "@timestamp" => "2016-05-10T15:01:06.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "8.0", "@timestamp" => "2016-05-10T15:01:15.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "9.0", "@timestamp" => "2016-05-10T15:01:24.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "10.0", "@timestamp" => "2016-05-10T15:01:33.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "11.0", "@timestamp" => "2016-05-10T15:01:42.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "12.0", "@timestamp" => "2016-05-10T15:01:51.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "13.0", "@timestamp" => "2016-05-10T15:02:00.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "14.0", "@timestamp" => "2016-05-10T15:02:09.000Z")
+    p_ev << LogStash::Event.new("metric" => "test1", "value" => "15.0", "@timestamp" => "2016-05-10T15:02:18.000Z")
+
+    it "delta should not be almost equals" do
+      expect(subject.almostEquals(21, 10)).to be false
+    end
+
+    it "delta should be almost equals" do
+      expect(subject.almostEquals(-18, -10)).to be true
+    end
+
+    it "should have no drift" do
+      expect(subject.noDrift(10, 5, 1)).to be true
+    end
+
+    it "should have a drift" do
+      expect(subject.noDrift(10, 5, 2)).to be false
+    end
+
+    it "should return a point hash with the correct timestamps aka delta" do
+      pointHash = subject.createPointHash(p_ev)
+
+      points = Chronix::Points.new
+      points.p << subject.createChronixPoint(0, "1.0")
+      points.p << subject.createChronixPoint(0, "2.0")
+      points.p << subject.createChronixPoint(0, "3.0")
+      points.p << subject.createChronixPoint(0, "4.0")
+      points.p << subject.createChronixPoint(0, "5.0")
+      points.p << subject.createChronixPoint(0, "6.0")
+      points.p << subject.createChronixPoint(0, "7.0")
+      points.p << subject.createChronixPoint(0, "8.0")
+      points.p << subject.createChronixPoint(9, "9.0")
+      points.p << subject.createChronixPoint(0, "10.0")
+      points.p << subject.createChronixPoint(0, "11.0")
+      points.p << subject.createChronixPoint(0, "12.0")
+      points.p << subject.createChronixPoint(0, "13.0")
+      points.p << subject.createChronixPoint(0, "14.0")
+      points.p << subject.createChronixPoint(9, "15.0")
+
+      expect(pointHash["test1"]["points"]).to eq(points)
+    end
+  end
+
   # these events are needed for the next two test-contexts
-  e1 = LogStash::Event.new("metric" => "test1", "value" => "1.5")
-  e2 = LogStash::Event.new("metric" => "test2", "value" => "2.5")
-  e3 = LogStash::Event.new("metric" => "test1", "value" => "3.5")
-  e4 = LogStash::Event.new("metric" => "test1", "value" => "4.5")
-  e5 = LogStash::Event.new("metric" => "test2", "value" => "5.5")
-  e6 = LogStash::Event.new("metric" => "test3", "value" => "6.5")
-  e7 = LogStash::Event.new("metric" => "test1", "value" => "7.5")
-  e8 = LogStash::Event.new("metric" => "test2", "value" => "8.5")
+  e21 = LogStash::Event.new("metric" => "test1", "value" => "1.5")
+  e22 = LogStash::Event.new("metric" => "test2", "value" => "2.5")
+  e23 = LogStash::Event.new("metric" => "test1", "value" => "3.5")
+  e24 = LogStash::Event.new("metric" => "test1", "value" => "4.5")
+  e25 = LogStash::Event.new("metric" => "test2", "value" => "5.5")
+  e26 = LogStash::Event.new("metric" => "test3", "value" => "6.5")
+  e27 = LogStash::Event.new("metric" => "test1", "value" => "7.5")
+  e28 = LogStash::Event.new("metric" => "test2", "value" => "8.5")
 
   context "adding and removing tests with different metrics" do
     subject { LogStash::Outputs::Chronix.new( "flush_size" => 1, "idle_flush_time" => 10 ) }
 
-    let(:events) { [e1, e2, e3, e4, e5, e6, e7, e8] }
+    let(:events) { [e21, e22, e23, e24, e25, e26, e27, e28] }
 
     it "should have 3 different metrics" do
       expect(solr.size).to eq(3)
@@ -116,9 +197,10 @@ describe LogStash::Outputs::Chronix do
   # test2[0]: 1 elem, test2[1]: 2 elem
   # test3[0]: 1 elem
   context "adding and removing tests with different metrics and buffer-settings" do
+
    subject { LogStash::Outputs::Chronix.new( "flush_size" => 4, "idle_flush_time" => 10 ) }
 
-    let(:events) { [e1, e2, e3, e4, e5, e6, e7, e8] }
+    let(:events) { [e21, e22, e23, e24, e25, e26, e27, e28] }
 
     it "should have 3 different metrics" do
       expect(solr.size).to eq(3)
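The delta-calculation spec above expects a real delta of 9 only at the 9th and 15th points: the event spacing shifts from 10 s to 9 s, each 9 s delta stays within the threshold of its predecessor, but the accumulated drift eventually exceeds `threshold / 2` and forces a true delta to be stored. A standalone re-implementation of that walk (illustrative names, mirroring the plugin's `almostEquals`/`noDrift` logic) reproduces the expectation:

```ruby
THRESHOLD = 10

# Mirrors the plugin's delta logic: a delta is nulled only while it stays
# almost equal to the previous delta AND the drift between the real timeline
# and the reconstructed one remains at most THRESHOLD / 2.
def stored_deltas(timestamps)
  last_ts = nil
  prev_delta = 0
  since_last = 0
  last_stored = timestamps.first
  timestamps.map do |t|
    delta = last_ts.nil? ? 0 : t - last_ts
    almost_equal = (delta - prev_delta).abs <= THRESHOLD
    no_drift = (last_stored + THRESHOLD * since_last - t) <= THRESHOLD / 2
    if almost_equal && no_drift
      stored = 0
      since_last += 1
    else
      stored = delta       # re-anchor with a true delta
      since_last = 1
      last_stored = t
    end
    last_ts = t
    prev_delta = delta
    stored
  end
end

# Seconds since 15:00:00 for the spec's 15 events (10 s, then 9 s spacing)
ts = [10, 20, 30, 39, 48, 57, 66, 75, 84, 93, 102, 111, 120, 129, 138]
p stored_deltas(ts)
# → [0, 0, 0, 0, 0, 0, 0, 0, 9, 0, 0, 0, 0, 0, 9]
```

Walking through it: by the 9th event the 9 s intervals have fallen 6 s behind the threshold-paced reconstruction, which exceeds `THRESHOLD / 2 = 5`, so the true delta 9 is stored and the drift counter resets; the same happens again at the 15th event.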
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: logstash-output-chronix
 version: !ruby/object:Gem::Version
-  version: 0.1.0
+  version: 0.1.1
 platform: ruby
 authors:
 - Max Jalowski
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2016-04-26 00:00:00.000000000 Z
+date: 2016-05-18 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: logstash-core
@@ -92,14 +92,20 @@ dependencies:
     requirements:
     - - '>='
       - !ruby/object:Gem::Version
-        version: '0'
+        version: 0.0.19
+    - - <
+      - !ruby/object:Gem::Version
+        version: 0.0.20
   type: :development
   prerelease: false
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
      - !ruby/object:Gem::Version
-        version: '0'
+        version: 0.0.19
+    - - <
+      - !ruby/object:Gem::Version
+        version: 0.0.20
 description: This gem is a logstash plugin required to be installed on top of the
   Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not
   a stand-alone program
@@ -111,6 +117,8 @@ files:
 - lib/logstash/outputs/chronix.rb
 - lib/logstash/outputs/proto/Point.proto
 - lib/logstash/outputs/proto/Point.rb
+- lib/logstash/outputs/proto/StracePoint.proto
+- lib/logstash/outputs/proto/StracePoint.rb
 - spec/chronix_helper.rb
 - spec/outputs/chronix_spec.rb
 - spec/spec_helper.rb