logstash-input-file_progress 2.0.0 → 3.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
-   metadata.gz: 2412e35c82a969c7c76ff488f1f703c33b266219
-   data.tar.gz: 69285a889b89e8272172d7896a4038d7cdea3d62
+   metadata.gz: 5a98b141a82a617952182b9043cba8dc852c32b0
+   data.tar.gz: fda790b78cd799b3e3414f98fd3995e9a571937f
  SHA512:
-   metadata.gz: 5963e270c4920e6fa0f87883b3ebdbd16ce320ec893a4ff39cdcb66777880941d94ffdda452124f5a1f8cdf876865fd7a0bd0b6dcb086e3e3352b980a236ae22
-   data.tar.gz: d2076be3d2a3778e8e852e9865a8261f3e230968dfae0c1d02d9385b483a96322cb0fd7b0d65dac25cafef0c6558a31548c1f6b1b2af9679e63ebf33ea5bf37e
+   metadata.gz: 66a14a04273111a463ada9617f2729b3546e983fcf83f7ed3f520dcfc0ab1ef42afea12dac5271370e7d585f5ecf2d240fe38dbf81ee3c82cfe10371267cce84
+   data.tar.gz: 6d6a255e26ff38daca119bab8f094b57d8d8bb57d2bd6df75034653b4ccabac486701ef10a42fbeb4d18a9f6ea05ac340fa3609610509a6c17d7c9855c4f4845

CHANGELOG.md CHANGED
@@ -1,3 +1,6 @@
+ ## 2.1.0
+ - Changed dependency to new logstash-core-plugin-api
+
  ## 2.0.0
  - Plugins were updated to follow the new shutdown semantic, this mainly allows Logstash to instruct input plugins to terminate gracefully,
  instead of using Thread.raise on the plugins' threads. Ref: https://github.com/elastic/logstash/pull/3895

lib/logstash/inputs/file_progress.rb CHANGED
@@ -1,97 +1,198 @@
  # encoding: utf-8
-
- require "logstash/inputs/base"
  require "logstash/namespace"
+ require "logstash/inputs/base"
+ require "logstash/codecs/identity_map_codec"
+ require "logstash/codecs/plain"

  require "pathname"
  require "socket" # for Socket.gethostname

- # Stream events from files.
+ # Stream events from files, normally by tailing them in a manner
+ # similar to `tail -0F` but optionally reading them from the
+ # beginning.
+ #
+ # By default, each event is assumed to be one line. If you would like
+ # to join multiple log lines into one event, you'll want to use the
+ # multiline codec or filter.
+ #
+ # The plugin aims to track changing files and emit new content as it's
+ # appended to each file. It's not well-suited for reading a file from
+ # beginning to end and storing all of it in a single event (not even
+ # with the multiline codec or filter).
+ #
+ # ==== Tracking of current position in watched files
+ #
+ # The plugin keeps track of the current position in each file by
+ # recording it in a separate file named sincedb. This makes it
+ # possible to stop and restart Logstash and have it pick up where it
+ # left off without missing the lines that were added to the file while
+ # Logstash was stopped.
+ #
+ # By default, the sincedb file is placed in the home directory of the
+ # user running Logstash with a filename based on the filename patterns
+ # being watched (i.e. the `path` option). Thus, changing the filename
+ # patterns will result in a new sincedb file being used and any
+ # existing current position state will be lost. If you change your
+ # patterns with any frequency it might make sense to explicitly choose
+ # a sincedb path with the `sincedb_path` option.
  #
- # By default, each event is assumed to be one line. If you
- # want to join lines, you'll want to use the multiline filter.
+ # A different `sincedb_path` must be used for each input. Using the same
+ # path will cause issues. The read checkpoints for each input must be
+ # stored in a different path so the information does not override.
  #
- # Files are followed in a manner similar to "tail -0F". File rotation
- # is detected and handled by this input.
+ # Sincedb files are text files with four columns:
  #
- # In addition to 'normal' file input we add events with sincedb-data to the pipe.
- # This can be later used to serve progress date with e.g. faye-output
+ # . The inode number (or equivalent).
+ # . The major device number of the file system (or equivalent).
+ # . The minor device number of the file system (or equivalent).
+ # . The current byte offset within the file.
+ #
+ # On non-Windows systems you can obtain the inode number of a file
+ # with e.g. `ls -li`.
+ #
+ # ==== File rotation
+ #
+ # File rotation is detected and handled by this input, regardless of
+ # whether the file is rotated via a rename or a copy operation. To
+ # support programs that write to the rotated file for some time after
+ # the rotation has taken place, include both the original filename and
+ # the rotated filename (e.g. /var/log/syslog and /var/log/syslog.1) in
+ # the filename patterns to watch (the `path` option). Note that the
+ # rotated filename will be treated as a new file so if
+ # `start_position` is set to 'beginning' the rotated file will be
+ # reprocessed.
+ #
+ # With the default value of `start_position` ('end') any messages
+ # written to the end of the file between the last read operation prior
+ # to the rotation and its reopening under the new name (an interval
+ # determined by the `stat_interval` and `discover_interval` options)
+ # will not get picked up.
+
+ class LogStash::Codecs::Base
+   # TODO - move this to core
+   if !method_defined?(:accept)
+     def accept(listener)
+       decode(listener.data) do |event|
+         listener.process_event(event)
+       end
+     end
+   end
+   if !method_defined?(:auto_flush)
+     def auto_flush(*)
+     end
+   end
+ end
+
  class LogStash::Inputs::FileProgress < LogStash::Inputs::Base
    config_name "file_progress"

-   # TODO(sissel): This should switch to use the 'line' codec by default
-   # once file following
-   default :codec, "line"
-
-   # The path to the file to use as an input.
-   # You can use globs here, such as `/var/log/*.log`
+   # The path(s) to the file(s) to use as an input.
+   # You can use filename patterns here, such as `/var/log/*.log`.
+   # If you use a pattern like `/var/log/**/*.log`, a recursive search
+   # of `/var/log` will be done for all `*.log` files.
    # Paths must be absolute and cannot be relative.
+   #
+   # You may also configure multiple paths. See an example
+   # on the <<array,Logstash configuration page>>.
    config :path, :validate => :array, :required => true

-   # Exclusions (matched against the filename, not full path). Globs
-   # are valid here, too. For example, if you have
-   #
+   # Exclusions (matched against the filename, not full path). Filename
+   # patterns are valid here, too. For example, if you have
+   # [source,ruby]
    #   path => "/var/log/*"
    #
-   # you might want to exclude gzipped files:
-   #
+   # You might want to exclude gzipped files:
+   # [source,ruby]
    #   exclude => "*.gz"
    config :exclude, :validate => :array

-   # How often we stat files to see if they have been modified. Increasing
-   # this interval will decrease the number of system calls we make, but
-   # increase the time to detect new log lines.
+   # How often (in seconds) we stat files to see if they have been modified.
+   # Increasing this interval will decrease the number of system calls we make,
+   # but increase the time to detect new log lines.
    config :stat_interval, :validate => :number, :default => 1

-   # How often we expand globs to discover new files to watch.
+   # How often (in seconds) we expand the filename patterns in the
+   # `path` option to discover new files to watch.
    config :discover_interval, :validate => :number, :default => 15

-   # Where to write the since database (keeps track of the current
-   # position of monitored log files). The default will write
-   # sincedb files to some path matching "$HOME/.sincedb*"
-   config :sincedb_path, :validate => :string, :required => true
+   # Path of the sincedb database file (keeps track of the current
+   # position of monitored log files) that will be written to disk.
+   # The default will write sincedb files to some path matching `$HOME/.sincedb*`
+   # NOTE: it must be a file path and not a directory path
+   config :sincedb_path, :validate => :string

-   # How often to write a since database with the current position of
+   # How often (in seconds) to write a since database with the current position of
    # monitored log files.
    config :sincedb_write_interval, :validate => :number, :default => 15

-   # Choose where logstash starts initially reading files - at the beginning or
+   # TODO edit description
+   # Choose where Logstash starts initially reading files: at the beginning or
    # at the end. The default behavior treats files like live streams and thus
    # starts at the end. If you have old data you want to import, set this
-   # to 'beginning'
+   # to 'beginning'.
    #
-   # This option only modifieds "first contact" situations where a file is new
-   # and not seen before. If a file has already been seen before, this option
-   # has no effect.
+   # This option only modifies "first contact" situations where a file
+   # is new and not seen before, i.e. files that don't have a current
+   # position recorded in a sincedb file read by Logstash. If a file
+   # has already been seen before, this option has no effect and the
+   # position recorded in the sincedb file will be used.
    config :start_position, :validate => [ "beginning", "end"], :default => "beginning"

-   # Should the progressdb events be send to the pipeline
-   config :progressdb, :validate => :boolean, :default => false
+   # set the new line delimiter, defaults to "\n"
+   config :delimiter, :validate => :string, :default => "\n"
+
+   # When the file input discovers a file that was last modified
+   # before the specified timespan in seconds, the file is ignored.
+   # After its discovery, if an ignored file is modified it is no
+   # longer ignored and any new data is read. The default is 1 year.
+   config :ignore_older, :validate => :number, :default => 24 * 60 * 60 * 365

-   # Should the processdb entry be deleted after file-deletion
-   config :progressdb_del, :validate => :boolean, :default => false
+   # The file input closes any files that were last read the specified
+   # timespan in seconds ago.
+   # This has different implications depending on if a file is being tailed or
+   # read. If tailing, and there is a large time gap in incoming data the file
+   # can be closed (allowing other files to be opened) but will be queued for
+   # reopening when new data is detected. If reading, the file will be closed
+   # after close_older seconds from when the last bytes were read.
+   # The default is 1 hour
+   config :close_older, :validate => :number, :default => 1 * 60 * 60

-   # Close the file when end is reached
+   # What is the maximum number of file_handles that this input consumes
+   # at any one time. Use close_older to close some files if you need to
+   # process more files than this number. This should not be set to the
+   # maximum the OS can do because file handles are needed for other
+   # LS plugins and OS processes.
+   # The default of 4095 is set in filewatch.
+   config :max_open_files, :validate => :number
+
+   # Close the file when end of file is reached
    # This makes sense when reading a file once from the beginning and you want to e.g.
    # proceed with renaming or deleting the parent folder
-   config :eof_close, :validate => :boolean, :default => false
+   config :eof_close, :validate => :boolean, :default => true
+
+   # How often (in milliseconds) to add progress-info to metadata with the current position of
+   # monitored log files.
+   config :progress_write_interval, :validate => :number, :default => 200

    public
    def register
      require "addressable/uri"
-     require "filewatch/ext/filetail"
+     require "filewatch/ext/tail"
      require "digest/md5"
      @logger.info("Registering file input", :path => @path)
+     @host = Socket.gethostname.force_encoding(Encoding::UTF_8)

      @tail_config = {
        :exclude => @exclude,
        :stat_interval => @stat_interval,
        :discover_interval => @discover_interval,
        :sincedb_write_interval => @sincedb_write_interval,
-       :logger => @logger,
-       :progressdb => @progressdb,
-       :progressdb_del => @progressdb_del,
-       :eof_close => @eof_close,
+       :delimiter => @delimiter,
+       :ignore_older => @ignore_older,
+       :close_older => @close_older,
+       :max_open_files => @max_open_files,
+       :eof_close => true,
+       :progress_write_interval => @progress_write_interval
      }

      @path.each do |path|
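Not part of the diff — an illustration of the sincedb format documented in the comments above. Each watched file gets one line with four space-separated columns (inode, major device number, minor device number, current byte offset); the values below are made up:

    270528 0 64768 1174503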
@@ -100,52 +201,153 @@ class LogStash::Inputs::FileProgress < LogStash::Inputs::Base
        end
      end

+     if @sincedb_path.nil?
+       if ENV["SINCEDB_DIR"].nil? && ENV["HOME"].nil?
+         @logger.error("No SINCEDB_DIR or HOME environment variable set, I don't know where " \
+                       "to keep track of the files I'm watching. Either set " \
+                       "HOME or SINCEDB_DIR in your environment, or set sincedb_path in " \
+ "in your Logstash config for the file input with " \
+                       "path '#{@path.inspect}'")
+         raise # TODO(sissel): HOW DO I FAIL PROPERLY YO
+       end
+
+       #pick SINCEDB_DIR if available, otherwise use HOME
+       sincedb_dir = ENV["SINCEDB_DIR"] || ENV["HOME"]
+
+       # Join by ',' to make it easy for folks to know their own sincedb
+       # generated path (vs, say, inspecting the @path array)
+       @sincedb_path = File.join(sincedb_dir, ".sincedb_" + Digest::MD5.hexdigest(@path.join(",")))
+
+       # Migrate any old .sincedb to the new file (this is for version <=1.1.1 compatibility)
+       old_sincedb = File.join(sincedb_dir, ".sincedb")
+       if File.exists?(old_sincedb)
+         @logger.info("Renaming old ~/.sincedb to new one", :old => old_sincedb,
+                      :new => @sincedb_path)
+         File.rename(old_sincedb, @sincedb_path)
+       end
+
+       @logger.info("No sincedb_path set, generating one based on the file path",
+                    :sincedb_path => @sincedb_path, :path => @path)
+     end
+
+     if File.directory?(@sincedb_path)
+       raise ArgumentError.new("The \"sincedb_path\" argument must point to a file, received a directory: \"#{@sincedb_path}\"")
+     end
+
      @tail_config[:sincedb_path] = @sincedb_path

      if @start_position == "beginning"
        @tail_config[:start_new_files_at] = :beginning
      end

-     @codec_plain = LogStash::Codecs::Plain.new
+     @codec = LogStash::Codecs::IdentityMapCodec.new(@codec)
    end # def register

-   public
-   def run(queue)
-     @tail = FileWatch::Ext::FileTail.new(@tail_config)
+   class ListenerTail
+     # use attr_reader to define noop methods
+     attr_reader :input, :path, :data, :size, :pos
+     attr_reader :deleted, :created, :error, :eof
+
+     # construct with upstream state
+     def initialize(path, input)
+       @path, @input = path, input
+     end
+
+     def timed_out
+       input.codec.evict(path)
+     end
+
+     def accept(data, size, pos)
+       # and push transient data filled dup listener downstream
+       input.log_line_received(path, data)
+       input.codec.accept(dup_adding_state(data, size, pos))
+     end
+
+     def process_event(event)
+       event["[@metadata][size]"] = size unless size.nil?
+       event["[@metadata][pos]"] = pos unless pos.nil?
+       event["[@metadata][path]"] = path
+       event["path"] = path if !event.include?("path")
+       input.post_process_this(event)
+     end
+
+     def add_state(data, size, pos)
+       @data = data
+       @size = size
+       @pos = pos
+       self
+     end
+
+     private
+
+     # duplicate and add state for downstream
+     def dup_adding_state(line, size, pos)
+       self.class.new(path, input).add_state(line, size, pos)
+     end
+   end
+
+   class FlushableListener < ListenerTail
+     attr_writer :path
+   end
+
+   def listener_for(path)
+     # path is the identity
+     ListenerTail.new(path, self)
+   end
+
+   def begin_tailing
+     # if the pipeline restarts this input,
+     # make sure previous files are closed
+     stop
+     # use observer listener api
+     @tail = FileWatch::Ext::Tail.new_observing_progress(@tail_config)
      @tail.logger = @logger
      @path.each { |path| @tail.tail(path) }
-     hostname = Socket.gethostname
+   end

-     @tail.subscribe do |path, data, type|
-       @logger.debug("Received line", :path => path, :data => data) if logger.debug?
+   def run(queue)
+     begin_tailing
+     @queue = queue
+     @tail.subscribe(self)
+     exit_flush
+   end # def run

-       if type == :log
-         @codec.decode(data) do |event|
+   def post_process_this(event)
+     event["host"] = @host if !event.include?("host")
+     decorate(event)
+     @queue << event
+   end

-           decorate(event)
-           event["host"] = hostname
-           event["path"] = path
+   def log_line_received(path, line)
+     return if !@logger.debug?
+     @logger.debug("Received line", :path => path, :text => line)
+   end

-           queue << event
-         end
-       elsif type == :progressdb
-         @codec_plain.decode(data) do |event|
-
-           decorate(event)
-           event["host"] = hostname
-           event["path"] = path
-           event["type"] = "progressdb";
-
-           queue << event
-         end
-       end # if
-     end # subscribe
-     finished
-   end # def run
+   def stop
+     # in filewatch >= 0.6.7, quit will close and forget all files
+     # but it will write their last read positions to since_db
+     # beforehand
+     if @tail
+       @codec.close
+       @tail.quit
+     end
+   end

-   public
-   def teardown
-     @tail.sincedb_write
-     @tail.quit
-   end # def teardown
- end # class LogStash::Inputs::TriggeredPackage
+   private
+
+   def exit_flush
+     listener = FlushableListener.new("none", self)
+     if @codec.identity_count.zero?
+       # using the base codec without identity/path info
+       @codec.base_codec.flush do |event|
+         begin
+           listener.process_event(event)
+         rescue => e
+           @logger.error("File Input: flush on exit downstream error", :exception => e)
+         end
+       end
+     else
+       @codec.flush_mapped(listener)
+     end
+   end
+ end # class LogStash::Inputs::FileProgress
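Not part of the diff — a minimal pipeline sketch of how the 3.0.0 options and the progress metadata set by `ListenerTail#process_event` might be used. The paths, interval values, and the progress-percentage computation in the `ruby` filter are illustrative assumptions, not taken from this gem (the snippet uses the Logstash 2.x-style event API):

[source,ruby]
    input {
      file_progress {
        path => ["/var/log/myapp/*.log"]                    # absolute filename patterns only
        start_position => "beginning"                        # this plugin's default
        sincedb_path => "/var/lib/logstash/.sincedb_myapp"   # one sincedb file per input
        ignore_older => 86400                                # seconds: skip files untouched for a day
        close_older => 3600                                  # seconds: release handles of idle files
        progress_write_interval => 200                       # milliseconds between progress updates
      }
    }
    filter {
      # [@metadata][pos] and [@metadata][size] are added to every event by the input,
      # so a rough read-progress percentage can be derived while the file is still open.
      ruby {
        code => "event['progress_pct'] = (100.0 * event['[@metadata][pos]'] / event['[@metadata][size]']).round(1) rescue nil"
      }
    }
    output {
      stdout { codec => rubydebug { metadata => true } }
    }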

logstash-input-file_progress.gemspec CHANGED
@@ -1,7 +1,7 @@
  Gem::Specification.new do |s|

    s.name = 'logstash-input-file_progress'
-   s.version = '2.0.0'
+   s.version = '3.0.0'
    s.licenses = ['Apache License (2.0)']
    s.summary = "Stream events from files with progress-events."
    s.description = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
@@ -20,13 +20,16 @@ Gem::Specification.new do |s|
    s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }

    # Gem dependencies
-   s.add_runtime_dependency "logstash-core", ">= 2.0.0.beta2", "< 3.0.0"
+   s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.20", "<= 2.99"

-   s.add_runtime_dependency 'logstash-codec-line'
+   s.add_runtime_dependency 'logstash-codec-plain'
    s.add_runtime_dependency 'addressable'
-   s.add_runtime_dependency "filewatch-ext", ["~> 0.3.0"]
-
+   s.add_runtime_dependency "filewatch-ext", ["~> 1.1.0"]
+   s.add_runtime_dependency 'logstash-codec-multiline', ['~> 2.0.7']

+   s.add_development_dependency 'stud', ['~> 0.0.19']
    s.add_development_dependency 'logstash-devutils'
+   s.add_development_dependency 'logstash-codec-json'
+   s.add_development_dependency 'rspec-sequencing'
  end

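Not part of the diff — the description above mentions installing with `$LS_HOME/bin/plugin install gemname`; a hypothetical build-and-install sketch (file names and paths are assumptions) could be:

[source,ruby]
    # Build the gem and install it into a compatible Logstash distribution
    # (one that satisfies logstash-core-plugin-api >= 1.20, <= 2.99):
    #
    #   gem build logstash-input-file_progress.gemspec
    #   $LS_HOME/bin/plugin install logstash-input-file_progress-3.0.0.gem
    #
    # Or, during development, point the Logstash Gemfile at a local checkout:
    gem "logstash-input-file_progress", "3.0.0", :path => "/path/to/logstash-input-file_progress"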
metadata CHANGED
@@ -1,42 +1,42 @@
  --- !ruby/object:Gem::Specification
  name: logstash-input-file_progress
  version: !ruby/object:Gem::Version
-   version: 2.0.0
+   version: 3.0.0
  platform: ruby
  authors:
  - Signify
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2015-11-11 00:00:00.000000000 Z
+ date: 2016-08-10 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement
      requirements:
      - - '>='
        - !ruby/object:Gem::Version
-         version: 2.0.0.beta2
-     - - <
+         version: '1.20'
+     - - <=
        - !ruby/object:Gem::Version
-         version: 3.0.0
-   name: logstash-core
+         version: '2.99'
+   name: logstash-core-plugin-api
    prerelease: false
    type: :runtime
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - '>='
        - !ruby/object:Gem::Version
-         version: 2.0.0.beta2
-     - - <
+         version: '1.20'
+     - - <=
        - !ruby/object:Gem::Version
-         version: 3.0.0
+         version: '2.99'
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement
      requirements:
      - - '>='
        - !ruby/object:Gem::Version
          version: '0'
-   name: logstash-codec-line
+   name: logstash-codec-plain
    prerelease: false
    type: :runtime
    version_requirements: !ruby/object:Gem::Requirement
@@ -63,7 +63,7 @@ dependencies:
      requirements:
      - - ~>
        - !ruby/object:Gem::Version
-         version: 0.3.0
+         version: 1.1.0
    name: filewatch-ext
    prerelease: false
    type: :runtime
@@ -71,7 +71,35 @@ dependencies:
      requirements:
      - - ~>
        - !ruby/object:Gem::Version
-         version: 0.3.0
+         version: 1.1.0
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ~>
+       - !ruby/object:Gem::Version
+         version: 2.0.7
+   name: logstash-codec-multiline
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ~>
+       - !ruby/object:Gem::Version
+         version: 2.0.7
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ~>
+       - !ruby/object:Gem::Version
+         version: 0.0.19
+   name: stud
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ~>
+       - !ruby/object:Gem::Version
+         version: 0.0.19
  - !ruby/object:Gem::Dependency
    requirement: !ruby/object:Gem::Requirement
      requirements:
@@ -86,6 +114,34 @@ dependencies:
      - - '>='
        - !ruby/object:Gem::Version
          version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-codec-json
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: rspec-sequencing
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
  description: This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program
  email: dietmar@signifydata.com
  executables: []