logstash-input-delf 3.0.3

checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+   metadata.gz: 308100753a50a0d7d3c0a9956ce41ee9c001f62d
+   data.tar.gz: c8c424269c039b12f4195a610753af6c5ef183d6
+ SHA512:
+   metadata.gz: 31c2835ee3bc1e0b218e50f2fc103ee9e757dcbcd669cea2a440932805cbb777ffa5ffab2085743d6a767b06c0a06398932758c437906ff49a9f9da29dbbfc5f
+   data.tar.gz: f516dbccce766aed7e57347625ffabd9c1a632f3058d87002f3897494761576259d94118b601bb88b7f6ef1819bc81c83c56261b80ca00567d9651412dc625ef
data/CHANGELOG.md ADDED
@@ -0,0 +1,26 @@
+ ## 3.0.3
+ - Docs: Update doc examples to use new event API syntax
+
+ ## 3.0.2
+ - Relax constraint on logstash-core-plugin-api to >= 1.60 <= 2.99
+
+ ## 3.0.1
+ - Republish all the gems under jruby.
+
+ ## 3.0.0
+ - Update the plugin to version 2.0 of the plugin API. This change is required for Logstash 5.0 compatibility. See https://github.com/elastic/logstash/issues/5141
+
+ ## 2.0.8
+ - Make `Event.from_json` return a single element instead of an array and make this plugin work under 5.0
+
+ ## 2.0.7
+ - Fix failing test caused by reverting Java Event back to Ruby Event
+
+ ## 2.0.6
+ - Fix plugin crash when LogStash::Json fails to parse a message, https://github.com/logstash-plugins/logstash-input-gelf/pull/27
+
+ ## 2.0.5
+ - Depend on logstash-core-plugin-api instead of logstash-core, removing the need to mass update plugins on major releases of logstash
+
+ ## 2.0.4
+ - New dependency requirements for logstash-core for the 5.0 release
+
+ ## 2.0.3
+ - Fix Timestamp coercion to preserve up to microsecond precision, https://github.com/logstash-plugins/logstash-input-gelf/pull/35
+
+ ## 2.0.0
+ - Plugins were updated to follow the new shutdown semantic; this mainly allows Logstash to instruct input plugins to terminate gracefully,
+   instead of using Thread.raise on the plugins' threads. Ref: https://github.com/elastic/logstash/pull/3895
+ - Dependency on logstash-core updated to 2.0
data/CONTRIBUTORS ADDED
@@ -0,0 +1,23 @@
+ The following is a list of people who have contributed ideas, code, bug
+ reports, or in general have helped logstash along its way.
+
+ Contributors:
+ * Bernd Ahlers (bernd)
+ * Chris McCoy (lofidellity)
+ * Colin Surprenant (colinsurprenant)
+ * JeremyEinfeld
+ * John E. Vincent (lusis)
+ * Jordan Sissel (jordansissel)
+ * Kurt Hurtado (kurtado)
+ * Nick Ethier (nickethier)
+ * Pete Fritchman (fetep)
+ * Pier-Hugues Pellerin (ph)
+ * Richard Pijnenburg (electrical)
+ * Suyog Rao (suyograo)
+ * joe miller (joemiller)
+ * Guy Boertje (guyboertje)
+
+ Note: If you've sent us patches, bug reports, or otherwise contributed to
+ Logstash, and you aren't on the list above and want to be, please let us know
+ and we'll make sure you're here. Contributions from folks like you are what make
+ open source awesome.
data/Gemfile ADDED
@@ -0,0 +1,4 @@
+ source 'https://rubygems.org'
+
+ # Specify your gem's dependencies in logstash-input-delf.gemspec
+ gemspec
data/LICENSE ADDED
@@ -0,0 +1,13 @@
+ Copyright (c) 2012-2016 Elasticsearch <http://www.elastic.co>
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/NOTICE.TXT ADDED
@@ -0,0 +1,5 @@
+ Elasticsearch
+ Copyright 2012-2015 Elasticsearch
+
+ This product includes software developed by The Apache Software
+ Foundation (http://www.apache.org/).
data/README.md ADDED
@@ -0,0 +1,104 @@
+ # Logstash Plugin - delf
+
+ [![Travis Build Status](https://travis-ci.org/wade-r/logstash-input-delf.svg)](https://travis-ci.org/wade-r/logstash-input-delf)
+
+ `delf` is a Docker-flavored version of the `gelf` input plugin, with multi-line support.
+
+ # Logstash Plugin
+
+ [![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-input-gelf.svg)](https://travis-ci.org/logstash-plugins/logstash-input-gelf)
+
+ This is a plugin for [Logstash](https://github.com/elastic/logstash).
+
+ It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want.
+
+ ## Documentation
+
+ Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will first be converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).
+
+ - For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
+ - For more asciidoc formatting tips, see the excellent reference at https://github.com/elastic/docs#asciidoc-guide
+
+ ## Need Help?
+
+ Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
+
+ ## Developing
+
+ ### 1. Plugin Development and Testing
+
+ #### Code
+ - To get started, you'll need JRuby with the Bundler gem installed.
+
+ - Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).
+
+ - Install dependencies
+ ```sh
+ bundle install
+ ```
+
+ #### Test
+
+ - Update your dependencies
+
+ ```sh
+ bundle install
+ ```
+
+ - Run tests
+
+ ```sh
+ bundle exec rspec
+ ```
+
+ ### 2. Running your unpublished Plugin in Logstash
+
+ #### 2.1 Run in a local Logstash clone
+
+ - Edit the Logstash `Gemfile` and add the local plugin path, for example:
+ ```ruby
+ gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
+ ```
+ - Install the plugin
+ ```sh
+ # Logstash 2.3 and higher
+ bin/logstash-plugin install --no-verify
+
+ # Prior to Logstash 2.3
+ bin/plugin install --no-verify
+ ```
+ - Run Logstash with your plugin
+ ```sh
+ bin/logstash -e 'filter {awesome {}}'
+ ```
+ At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
+
+ #### 2.2 Run in an installed Logstash
+
+ You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:
+
+ - Build your plugin gem
+ ```sh
+ gem build logstash-filter-awesome.gemspec
+ ```
+ - Install the plugin from the Logstash home
+ ```sh
+ # Logstash 2.3 and higher
+ bin/logstash-plugin install --no-verify
+
+ # Prior to Logstash 2.3
+ bin/plugin install --no-verify
+ ```
+ - Start Logstash and proceed to test the plugin
+
+ ## Contributing
+
+ All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
+
+ Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
+
+ It is more important to the community that you are able to contribute.
+
+ For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
@@ -0,0 +1,104 @@
+ :plugin: gelf
+ :type: input
+
+ ///////////////////////////////////////////
+ START - GENERATED VARIABLES, DO NOT EDIT!
+ ///////////////////////////////////////////
+ :version: %VERSION%
+ :release_date: %RELEASE_DATE%
+ :changelog_url: %CHANGELOG_URL%
+ :include_path: ../../../logstash/docs/include
+ ///////////////////////////////////////////
+ END - GENERATED VARIABLES, DO NOT EDIT!
+ ///////////////////////////////////////////
+
+ [id="plugins-{type}-{plugin}"]
+
+ === Gelf
+
+ include::{include_path}/plugin_header.asciidoc[]
+
+ ==== Description
+
+ This input will read GELF messages as events over the network,
+ making it a good choice if you already use Graylog2 today.
+
+ The main use case for this input is to leverage existing GELF
+ logging libraries such as the GELF log4j appender. A library used
+ by this plugin has a bug which prevents it from parsing uncompressed data.
+ If you use the log4j appender you need to configure it like this to force
+ gzip even for small messages:
+
+   <Socket name="logstash" protocol="udp" host="logstash.example.com" port="5001">
+     <GelfLayout compressionType="GZIP" compressionThreshold="1" />
+   </Socket>
+
+ [id="plugins-{type}s-{plugin}-options"]
+ ==== Gelf Input Configuration Options
+
+ This plugin supports the following configuration options plus the <<plugins-{type}s-common-options>> described later.
+
+ [cols="<,<,<",options="header",]
+ |=======================================================================
+ |Setting |Input type|Required
+ | <<plugins-{type}s-{plugin}-host>> |<<string,string>>|No
+ | <<plugins-{type}s-{plugin}-port>> |<<number,number>>|No
+ | <<plugins-{type}s-{plugin}-remap>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-strip_leading_underscore>> |<<boolean,boolean>>|No
+ |=======================================================================
+
+ Also see <<plugins-{type}s-common-options>> for a list of options supported by all
+ input plugins.
+
+ &nbsp;
+
+ [id="plugins-{type}s-{plugin}-host"]
+ ===== `host`
+
+ * Value type is <<string,string>>
+ * Default value is `"0.0.0.0"`
+
+ The IP address or hostname to listen on.
+
+ [id="plugins-{type}s-{plugin}-port"]
+ ===== `port`
+
+ * Value type is <<number,number>>
+ * Default value is `12201`
+
+ The port to listen on. Remember that ports less than 1024 (privileged
+ ports) may require root to use.
+
+ [id="plugins-{type}s-{plugin}-remap"]
+ ===== `remap`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `true`
+
+ Whether or not to remap the GELF message fields to Logstash event fields, or
+ leave them intact.
+
+ Remapping converts the following GELF fields to Logstash equivalents:
+
+ * `full\_message` becomes `event.get("message")`.
+ * if there is no `full\_message`, `short\_message` becomes `event.get("message")`.
+
+ [id="plugins-{type}s-{plugin}-strip_leading_underscore"]
+ ===== `strip_leading_underscore`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `true`
+
+ Whether or not to remove the leading `\_` in GELF fields or leave them
+ in place. (Logstash < 1.2 did not remove them by default.) Note that
+ GELF version 1.1 format now requires all non-standard fields to be added
+ as an "additional" field, beginning with an underscore.
+
+ e.g. `\_foo` becomes `foo`
+
+ include::{include_path}/{type}.asciidoc[]
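The remapping and underscore-stripping described above can be illustrated with a standalone Ruby sketch. This is a hypothetical helper operating on a plain hash, not the plugin's actual code path (which works on `LogStash::Event` objects); `remap_and_strip` and the sample `fields` are names invented here for illustration:

```ruby
# Sketch of the documented behavior: prefer full_message over short_message
# as "message", drop a duplicate short_message, then strip one leading
# underscore from every remaining key.
def remap_and_strip(fields)
  out = fields.dup
  # full_message wins; fall back to short_message
  message = out.delete("full_message") || out.delete("short_message")
  # short_message identical to the chosen message is redundant
  out.delete("short_message") if out["short_message"] == message
  out["message"] = message if message
  # "_container_id" -> "container_id", etc.
  out.map { |k, v| [k.sub(/\A_/, ""), v] }.to_h
end

event = remap_and_strip("_container_id" => "abc123", "short_message" => "hi")
# event == { "container_id" => "abc123", "message" => "hi" }
```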
@@ -0,0 +1,241 @@
+ # encoding: utf-8
+ require "logstash/inputs/base"
+ require "logstash/namespace"
+ require "logstash/json"
+ require "logstash/timestamp"
+ require "stud/interval"
+ require "date"
+ require "socket"
+
+ # This input will read GELF messages as events over the network,
+ # making it a good choice if you already use Graylog2 today.
+ #
+ # The main use case for this input is to leverage existing GELF
+ # logging libraries such as the GELF log4j appender. A library used
+ # by this plugin has a bug which prevents it from parsing uncompressed data.
+ # If you use the log4j appender you need to configure it like this to force
+ # gzip even for small messages:
+ #
+ #   <Socket name="logstash" protocol="udp" host="logstash.example.com" port="5001">
+ #     <GelfLayout compressionType="GZIP" compressionThreshold="1" />
+ #   </Socket>
+ #
+ class LogStash::Inputs::Delf < LogStash::Inputs::Base
+   config_name "delf"
+
+   default :codec, "plain"
+
+   # The IP address or hostname to listen on.
+   config :host, :validate => :string, :default => "0.0.0.0"
+
+   # The port to listen on. Remember that ports less than 1024 (privileged
+   # ports) may require root to use.
+   config :port, :validate => :number, :default => 12201
+
+   # The continuation mark; a line ending with this mark is considered an incomplete event
+   config :continue_mark, :validate => :string, :default => "\\"
+
+   # The field used to identify different event streams; for Docker events it's 'container_id'
+   config :track, :validate => :string, :default => 'container_id'
+
+   RECONNECT_BACKOFF_SLEEP = 5
+   TIMESTAMP_GELF_FIELD = "timestamp".freeze
+   SOURCE_HOST_FIELD = "source_host".freeze
+   MESSAGE_FIELD = "message"
+   TAGS_FIELD = "tags"
+   PARSE_FAILURE_TAG = "_jsonparsefailure"
+   PARSE_FAILURE_LOG_MESSAGE = "JSON parse failure. Falling back to plain-text"
+
+   public
+   def initialize(params)
+     super
+     BasicSocket.do_not_reverse_lookup = true
+     @incomplete_events = {}
+   end # def initialize
+
+   public
+   def register
+     require 'gelfd'
+   end # def register
+
+   public
+   def run(output_queue)
+     begin
+       # udp server
+       udp_listener(output_queue)
+     rescue => e
+       unless stop?
+         @logger.warn("delf listener died", :exception => e, :backtrace => e.backtrace)
+         Stud.stoppable_sleep(RECONNECT_BACKOFF_SLEEP) { stop? }
+         retry unless stop?
+       end
+     end # begin
+   end # def run
+
+   public
+   def stop
+     @udp.close
+   rescue IOError # the plugin is currently shutting down, so it's safe to ignore these errors
+   end
+
+   private
+   def udp_listener(output_queue)
+     @logger.info("Starting delf listener", :address => "#{@host}:#{@port}")
+
+     @udp = UDPSocket.new(Socket::AF_INET)
+     @udp.bind(@host, @port)
+
+     while !stop?
+       line, client = @udp.recvfrom(8192)
+
+       begin
+         data = Gelfd::Parser.parse(line)
+       rescue => ex
+         @logger.warn("Gelfd failed to parse a message, skipping", :exception => ex, :backtrace => ex.backtrace)
+         next
+       end
+
+       # Gelfd parser outputs null if it received and cached a non-final chunk
+       next if data.nil?
+
+       event = self.class.new_event(data, client[3])
+       next if event.nil?
+
+       remap_gelf(event)
+       strip_leading_underscore(event)
+       decorate(event)
+
+       event = handle_multiline(event)
+       next if event.nil?
+
+       output_queue << event
+     end
+   end # def udp_listener
+
+   # generate a new LogStash::Event from json input and assign host to the source_host event field.
+   # @param json_gelf [String] GELF json data
+   # @param host [String] source host of GELF data
+   # @return [LogStash::Event] new event with parsed json gelf, assigned source host and coerced timestamp
+   def self.new_event(json_gelf, host)
+     event = parse(json_gelf)
+     return if event.nil?
+
+     event.set(SOURCE_HOST_FIELD, host)
+
+     if (gelf_timestamp = event.get(TIMESTAMP_GELF_FIELD)).is_a?(Numeric)
+       event.timestamp = self.coerce_timestamp(gelf_timestamp)
+       event.remove(TIMESTAMP_GELF_FIELD)
+     end
+
+     event
+   end
+
+   # transform a given timestamp value into a proper LogStash::Timestamp, preserving microsecond precision
+   # and working around a JRuby issue with Time.at losing the fractional part with BigDecimal.
+   # @param timestamp [Numeric] a Numeric (integer, float or bigdecimal) timestamp representation
+   # @return [LogStash::Timestamp] the proper LogStash::Timestamp representation
+   def self.coerce_timestamp(timestamp)
+     # bug in JRuby prevents correctly parsing a BigDecimal fractional part, see https://github.com/elastic/logstash/issues/4565
+     timestamp.is_a?(BigDecimal) ? LogStash::Timestamp.at(timestamp.to_i, timestamp.frac * 1000000) : LogStash::Timestamp.at(timestamp)
+   end
+
+   # from_json_parse uses the Event#from_json method to deserialize and directly produce events
+   def self.from_json_parse(json)
+     # from_json will always return an array of items.
+     # in the context of gelf, the payload should be an array of size 1
+     LogStash::Event.from_json(json).first
+   rescue LogStash::Json::ParserError => e
+     logger.error(PARSE_FAILURE_LOG_MESSAGE, :error => e, :data => json)
+     LogStash::Event.new(MESSAGE_FIELD => json, TAGS_FIELD => [PARSE_FAILURE_TAG, '_fromjsonparser'])
+   end # def self.from_json_parse
+
+   # legacy_parse uses the LogStash::Json class to deserialize json
+   def self.legacy_parse(json)
+     o = LogStash::Json.load(json)
+     LogStash::Event.new(o)
+   rescue LogStash::Json::ParserError => e
+     logger.error(PARSE_FAILURE_LOG_MESSAGE, :error => e, :data => json)
+     LogStash::Event.new(MESSAGE_FIELD => json, TAGS_FIELD => [PARSE_FAILURE_TAG, '_legacyjsonparser'])
+   end # def self.legacy_parse
+
+   # keep compatibility with all v2.x distributions; only in 2.3 will the Event#from_json method be introduced
+   # and we need to keep compatibility for all v2 releases.
+   class << self
+     alias_method :parse, LogStash::Event.respond_to?(:from_json) ? :from_json_parse : :legacy_parse
+   end
+
+   private
+   def remap_gelf(event)
+     if event.get("full_message") && !event.get("full_message").empty?
+       event.set("message", event.get("full_message").dup)
+       event.remove("full_message")
+       if event.get("short_message") == event.get("message")
+         event.remove("short_message")
+       end
+     elsif event.get("short_message") && !event.get("short_message").empty?
+       event.set("message", event.get("short_message").dup)
+       event.remove("short_message")
+     end
+   end # def remap_gelf
+
+   private
+   def strip_leading_underscore(event)
+     # Map all '_foo' fields to simply 'foo'
+     event.to_hash.keys.each do |key|
+       next unless key[0,1] == "_"
+       event.set(key[1..-1], event.get(key))
+       event.remove(key)
+     end
+   end # def strip_leading_underscore
+
+   private
+   def handle_multiline(event)
+     # Ignore if no track field found
+     track_id = event.get(@track)
+     return event unless track_id.kind_of?(String)
+
+     # Ignore if no message found
+     message = event.get("message")
+     return event unless message.kind_of?(String)
+
+     # Fetch last event
+     last_event = @incomplete_events[track_id]
+
+     if message.end_with?(@continue_mark)
+       # remove the continue_mark
+       message = message.slice(0, message.length - @continue_mark.length)
+       # It's an incomplete event
+       if last_event.nil?
+         # update the message
+         event.set("message", message)
+         # cache it as a pending event
+         @incomplete_events[track_id] = event
+         return nil
+       else
+         # append content to the pending event
+         last_event.set("message", last_event.get("message") + "\n" + message)
+         # limit message length to 5000
+         if last_event.get("message").length > 5000
+           @incomplete_events[track_id] = nil
+           return last_event
+         else
+           return nil
+         end
+       end
+     else
+       # It's not an incomplete event
+       if last_event.nil?
+         # just return if there is no pending incomplete event
+         return event
+       else
+         # append content to the pending incomplete event and return it
+         last_event.set("message", last_event.get("message") + "\n" + message)
+         # clear the pending incomplete event
+         @incomplete_events[track_id] = nil
+         return last_event
+       end
+     end
+   end # def handle_multiline
+
+ end # class LogStash::Inputs::Delf
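The continuation handling in `handle_multiline` above can be sketched with plain strings and a hash instead of `LogStash::Event` objects. This is a simplified illustration, not the plugin's actual code; `feed` and `buffers` are names invented here, and the 5000-character flush limit is omitted:

```ruby
# Minimal sketch of the multi-line merge: lines ending with the continuation
# mark are buffered per stream id; a terminating line flushes the buffer.
CONTINUE_MARK = "\\"

def feed(buffers, id, message)
  if message.end_with?(CONTINUE_MARK)
    # strip the mark and buffer the chunk for this stream
    chunk = message[0, message.length - CONTINUE_MARK.length]
    buffers[id] = buffers[id] ? buffers[id] + "\n" + chunk : chunk
    nil                                   # still incomplete, emit nothing
  elsif buffers[id]
    buffers.delete(id) + "\n" + message   # flush the pending pieces
  else
    message                               # ordinary single-line event
  end
end

buffers = {}
feed(buffers, "c1", "part1\\")   # => nil (buffered under "c1")
feed(buffers, "c1", "part2")     # => "part1\npart2"
```

Because the buffer is keyed by the `track` field, interleaved streams (e.g. two Docker containers) are merged independently, which is exactly what the multi-line spec below asserts.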
@@ -0,0 +1,32 @@
+ Gem::Specification.new do |s|
+
+   s.name = 'logstash-input-delf'
+   s.version = '3.0.3'
+   s.licenses = ['Apache License (2.0)']
+   s.summary = "This input will read GELF messages as events over the network, making it a good choice if you already use Graylog2 today."
+   s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
+   s.authors = ["Elastic"]
+   s.email = 'info@elastic.co'
+   s.homepage = "http://www.elastic.co/guide/en/logstash/current/index.html"
+   s.require_paths = ["lib"]
+
+   # Files
+   s.files = Dir["lib/**/*","spec/**/*","*.gemspec","*.md","CONTRIBUTORS","Gemfile","LICENSE","NOTICE.TXT", "vendor/jar-dependencies/**/*.jar", "vendor/jar-dependencies/**/*.rb", "VERSION", "docs/**/*"]
+
+   # Tests
+   s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+   # Special flag to let us know this is actually a logstash plugin
+   s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }
+
+   # Gem dependencies
+   s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
+
+   s.add_runtime_dependency "gelfd", ["0.2.0"] # (Apache 2.0 license)
+   s.add_runtime_dependency 'logstash-codec-plain'
+   s.add_runtime_dependency "stud", "~> 0.0.22"
+
+   s.add_development_dependency 'logstash-devutils'
+   s.add_development_dependency "gelf", ["1.3.2"] # (MIT license)
+   s.add_development_dependency "flores"
+ end
@@ -0,0 +1,277 @@
+ # encoding: utf-8
+ require "logstash/devutils/rspec/spec_helper"
+ require "logstash/inputs/delf"
+ require_relative "../support/helpers"
+ require "gelf"
+ require "flores/random"
+
+ describe LogStash::Inputs::Delf do
+   context "when interrupting the plugin" do
+     let(:port) { Flores::Random.integer(1024..65535) }
+     let(:host) { "127.0.0.1" }
+     let(:chunksize) { 1420 }
+     let(:producer) { InfiniteDelfProducer.new(host, port, chunksize) }
+     let(:config) { { "host" => host, "port" => port } }
+
+     before { producer.run }
+     after { producer.stop }
+
+     it_behaves_like "an interruptible input plugin"
+   end
+
+   it "reads chunked gelf messages" do
+     port = 12209
+     host = "127.0.0.1"
+     chunksize = 1420
+     gelfclient = GELF::Notifier.new(host, port, chunksize)
+
+     conf = <<-CONFIG
+       input {
+         delf {
+           port => "#{port}"
+           host => "#{host}"
+         }
+       }
+     CONFIG
+
+     large_random = 2000.times.map{32 + rand(126 - 32)}.join("")
+
+     messages = [
+       "hello",
+       "world",
+       large_random,
+       "we survived delf!"
+     ]
+
+     events = input(conf) do |pipeline, queue|
+       # send a first message until plugin is up and receives it
+       while queue.size <= 0
+         gelfclient.notify!("short_message" => "prime")
+         sleep(0.1)
+       end
+       gelfclient.notify!("short_message" => "start")
+
+       e = queue.pop
+       while (e.get("message") != "start")
+         e = queue.pop
+       end
+
+       messages.each do |m|
+         gelfclient.notify!("short_message" => m)
+       end
+
+       messages.map{queue.pop}
+     end
+
+     events.each_with_index do |e, i|
+       insist { e.get("message") } == messages[i]
+       insist { e.get("host") } == Socket.gethostname
+     end
+   end
+
+   it "handles multi-line messages" do
+     port = 12209
+     host = "127.0.0.1"
+     chunksize = 1420
+     gelfclient = GELF::Notifier.new(host, port, chunksize)
+
+     conf = <<-CONFIG
+       input {
+         delf {
+           port => "#{port}"
+           host => "#{host}"
+         }
+       }
+     CONFIG
+
+     messages = [{
+       "_container_id" => "dummy1",
+       "short_message" => "single1"
+     },{
+       "_container_id" => "dummy2",
+       "short_message" => "single2"
+     },{
+       "_container_id" => "dummy1",
+       "short_message" => "multi1\\"
+     },{
+       "_container_id" => "dummy2",
+       "short_message" => "multi2\\"
+     },{
+       "_container_id" => "dummy1",
+       "short_message" => "multi3\\"
+     },{
+       "_container_id" => "dummy2",
+       "short_message" => "multi4\\"
+     },{
+       "_container_id" => "dummy2",
+       "short_message" => "multi5" # dummy2 multi ended first
+     },{
+       "_container_id" => "dummy1",
+       "short_message" => "multi6"
+     },{
+       "_container_id" => "dummy2",
+       "short_message" => "single3"
+     },{
+       "_container_id" => "dummy1",
+       "short_message" => "single4"
+     }]
+
+     events = input(conf) do |pipeline, queue|
+       # send a first message until plugin is up and receives it
+       while queue.size <= 0
+         gelfclient.notify!("short_message" => "prime")
+         sleep(0.1)
+       end
+       gelfclient.notify!("short_message" => "start")
+
+       e = queue.pop
+       while (e.get("message") != "start")
+         e = queue.pop
+       end
+
+       messages.each do |m|
+         gelfclient.notify!(m)
+       end
+
+       results = []
+
+       6.times do
+         results << queue.pop
+       end
+
+       results
+     end
+
+     insist { events.count } == 6
+
+     insist { events[0].get("container_id") } == "dummy1"
+     insist { events[0].get("message") } == "single1"
+     insist { events[1].get("container_id") } == "dummy2"
+     insist { events[1].get("message") } == "single2"
+     insist { events[2].get("container_id") } == "dummy2"
+     insist { events[2].get("message") } == "multi2\nmulti4\nmulti5"
+     insist { events[3].get("container_id") } == "dummy1"
+     insist { events[3].get("message") } == "multi1\nmulti3\nmulti6"
+     insist { events[4].get("container_id") } == "dummy2"
+     insist { events[4].get("message") } == "single3"
+     insist { events[5].get("container_id") } == "dummy1"
+     insist { events[5].get("message") } == "single4"
+   end
+
+   context "timestamp coercion" do
+     # these tests exercise private methods; this is advisable for now until we roll out
+     # this coercion in the Timestamp class and remove them
+
+     context "integer numeric values" do
+       it "should coerce" do
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800).to_iso8601).to eq("2000-01-01T05:00:00.000Z")
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800).usec).to eq(0)
+       end
+     end
+
+     context "float numeric values" do
+       # using an explicit and certainly useless to_f here just to leave no doubt about the numeric type involved
+
+       it "should coerce and preserve millisec precision in iso8601" do
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800.1.to_f).to_iso8601).to eq("2000-01-01T05:00:00.100Z")
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800.12.to_f).to_iso8601).to eq("2000-01-01T05:00:00.120Z")
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800.123.to_f).to_iso8601).to eq("2000-01-01T05:00:00.123Z")
+       end
+
+       it "should coerce and preserve usec precision" do
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800.1.to_f).usec).to eq(100000)
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800.12.to_f).usec).to eq(120000)
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800.123.to_f).usec).to eq(123000)
+
+         # since Java Timestamp in 2.3+ relies on JodaTime which supports only millisec precision
+         # the usec method will only be precise up to millisec.
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800.1234.to_f).usec).to be_within(1000).of(123400)
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800.12345.to_f).usec).to be_within(1000).of(123450)
+         expect(LogStash::Inputs::Delf.coerce_timestamp(946702800.123456.to_f).usec).to be_within(1000).of(123456)
+       end
+     end
+
+     context "BigDecimal numeric values" do
+       it "should coerce and preserve millisec precision in iso8601" do
+         expect(LogStash::Inputs::Delf.coerce_timestamp(BigDecimal.new("946702800.1")).to_iso8601).to eq("2000-01-01T05:00:00.100Z")
+         expect(LogStash::Inputs::Delf.coerce_timestamp(BigDecimal.new("946702800.12")).to_iso8601).to eq("2000-01-01T05:00:00.120Z")
+         expect(LogStash::Inputs::Delf.coerce_timestamp(BigDecimal.new("946702800.123")).to_iso8601).to eq("2000-01-01T05:00:00.123Z")
+       end
+
+       it "should coerce and preserve usec precision" do
+         expect(LogStash::Inputs::Delf.coerce_timestamp(BigDecimal.new("946702800.1")).usec).to eq(100000)
+         expect(LogStash::Inputs::Delf.coerce_timestamp(BigDecimal.new("946702800.12")).usec).to eq(120000)
+         expect(LogStash::Inputs::Delf.coerce_timestamp(BigDecimal.new("946702800.123")).usec).to eq(123000)
+
+         # since Java Timestamp in 2.3+ relies on JodaTime which supports only millisec precision
+         # the usec method will only be precise up to millisec.
+         expect(LogStash::Inputs::Delf.coerce_timestamp(BigDecimal.new("946702800.1234")).usec).to be_within(1000).of(123400)
+         expect(LogStash::Inputs::Delf.coerce_timestamp(BigDecimal.new("946702800.12345")).usec).to be_within(1000).of(123450)
+         expect(LogStash::Inputs::Delf.coerce_timestamp(BigDecimal.new("946702800.123456")).usec).to be_within(1000).of(123456)
+       end
+     end
+   end
+
216
+ context "json timestamp coercion" do
217
+ # these test private methods, this is advisable for now until we roll out this coercion in the Timestamp class
218
+ # and remove this
219
+
220
+    it "should coerce integer numeric json timestamp input" do
+      event = LogStash::Inputs::Delf.new_event("{\"timestamp\":946702800}", "dummy")
+      expect(event.timestamp.to_iso8601).to eq("2000-01-01T05:00:00.000Z")
+    end
+
+    it "should coerce float numeric value and preserve milliseconds precision in iso8601" do
+      event = LogStash::Inputs::Delf.new_event("{\"timestamp\":946702800.123}", "dummy")
+      expect(event.timestamp.to_iso8601).to eq("2000-01-01T05:00:00.123Z")
+    end
+
+    it "should coerce float numeric value and preserve usec precision" do
+      # Since Java Timestamp in 2.3+ relies on JodaTime, which supports only millisecond
+      # precision, the usec method will only be precise up to milliseconds.
+      event = LogStash::Inputs::Delf.new_event("{\"timestamp\":946702800.123456}", "dummy")
+      expect(event.timestamp.usec).to be_within(1000).of(123456)
+    end
+  end
+
+  context "when an invalid JSON is fed to the listener" do
+    subject { LogStash::Inputs::Delf.new_event(message, "host") }
+    let(:message) { "Invalid JSON message" }
+
+    if LogStash::Event.respond_to?(:from_json)
+      context "default :from_json parser output" do
+        it { should be_a(LogStash::Event) }
+
+        it "falls back to plain-text" do
+          expect(subject.get("message")).to eq(message)
+        end
+
+        it "tags message with _jsonparsefailure" do
+          expect(subject.get("tags")).to include("_jsonparsefailure")
+        end
+
+        it "tags message with _fromjsonparser" do
+          expect(subject.get("tags")).to include("_fromjsonparser")
+        end
+      end
+    else
+      context "legacy JSON parser output" do
+        it { should be_a(LogStash::Event) }
+
+        it "falls back to plain-text" do
+          expect(subject.get("message")).to eq(message)
+        end
+
+        it "tags message with _jsonparsefailure" do
+          expect(subject.get("tags")).to include("_jsonparsefailure")
+        end
+
+        it "tags message with _legacyjsonparser" do
+          expect(subject.get("tags")).to include("_legacyjsonparser")
+        end
+      end
+    end
+  end
+end
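The timestamp coercion these specs assert can be illustrated outside Logstash with plain Ruby. The `coerce_timestamp` helper below is hypothetical (it is not part of the plugin; it only mirrors the behavior the specs check): a numeric epoch-seconds value, integer or float, is rendered as ISO8601 with millisecond precision.

```ruby
require 'time'

# Hypothetical helper mirroring what the specs assert: numeric epoch
# seconds become an ISO8601 string with millisecond precision.
def coerce_timestamp(epoch_seconds)
  # Time.at accepts fractional epoch seconds; %L formats milliseconds.
  Time.at(epoch_seconds).utc.strftime('%Y-%m-%dT%H:%M:%S.%LZ')
end

puts coerce_timestamp(946702800)      # => "2000-01-01T05:00:00.000Z"
puts coerce_timestamp(946702800.123)  # milliseconds preserved, as in the float spec
```

Note that float input is subject to IEEE 754 rounding, which is why the usec spec above uses `be_within(1000)` rather than an exact match.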
@@ -0,0 +1,18 @@
+ # encoding: utf-8
+ class InfiniteDelfProducer
+   def initialize(host, port, chunksize)
+     @client = GELF::Notifier.new(host, port, chunksize)
+   end
+
+   def run
+     @producer = Thread.new do
+       while true
+         @client.notify!("short_message" => "hello world")
+       end
+     end
+   end
+
+   def stop
+     @producer.kill
+   end
+ end
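`InfiniteDelfProducer` depends on the `gelf` gem's `GELF::Notifier` and a running listener, so it is not runnable on its own. The underlying pattern (a background thread that produces until killed) can be sketched self-contained; here a `SizedQueue` stands in for the notifier, and all names are illustrative rather than part of the plugin:

```ruby
# Illustrative stand-in for InfiniteDelfProducer: same run/stop shape,
# but pushing into a bounded queue instead of a GELF::Notifier.
class InfiniteProducer
  def initialize(sink)
    @sink = sink
  end

  def run
    @producer = Thread.new do
      # Blocks when the queue is full, so the loop stays bounded.
      loop { @sink << "hello world" }
    end
  end

  def stop
    @producer.kill
    @producer.join
  end
end

sink = SizedQueue.new(10)
producer = InfiniteProducer.new(sink)
producer.run
sleep 0.1
producer.stop
puts sink.size >= 1  # => true: messages were produced before stop
```

As in the helper above, `stop` simply kills the producer thread; for the test-support use case that is acceptable, since no cleanup is needed beyond ending the send loop.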
metadata ADDED
@@ -0,0 +1,162 @@
+ --- !ruby/object:Gem::Specification
+ name: logstash-input-delf
+ version: !ruby/object:Gem::Version
+   version: 3.0.3
+ platform: ruby
+ authors:
+ - Elastic
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2017-05-22 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1.60'
+     - - "<="
+       - !ruby/object:Gem::Version
+         version: '2.99'
+   name: logstash-core-plugin-api
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1.60'
+     - - "<="
+       - !ruby/object:Gem::Version
+         version: '2.99'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - '='
+       - !ruby/object:Gem::Version
+         version: 0.2.0
+   name: gelfd
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - '='
+       - !ruby/object:Gem::Version
+         version: 0.2.0
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-codec-plain
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: 0.0.22
+   name: stud
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: 0.0.22
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-devutils
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - '='
+       - !ruby/object:Gem::Version
+         version: 1.3.2
+   name: gelf
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - '='
+       - !ruby/object:Gem::Version
+         version: 1.3.2
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: flores
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ description: This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program
+ email: info@elastic.co
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - CONTRIBUTORS
+ - Gemfile
+ - LICENSE
+ - NOTICE.TXT
+ - README.md
+ - docs/index.asciidoc
+ - lib/logstash/inputs/delf.rb
+ - logstash-input-delf.gemspec
+ - spec/inputs/delf_spec.rb
+ - spec/support/helpers.rb
+ homepage: http://www.elastic.co/guide/en/logstash/current/index.html
+ licenses:
+ - Apache License (2.0)
+ metadata:
+   logstash_plugin: 'true'
+   logstash_group: input
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.4.8
+ signing_key:
+ specification_version: 4
+ summary: This input will read GELF messages as events over the network, making it a good choice if you already use Graylog2 today.
+ test_files:
+ - spec/inputs/delf_spec.rb
+ - spec/support/helpers.rb