logstash-filter-augment 0.1.0

checksums.yaml.gz ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+   metadata.gz: c4299e05ab87426f7d354fc3133aa115e064e07b
+   data.tar.gz: b78aecd5b87b25e96eab92240808ccdc1221d88b
+ SHA512:
+   metadata.gz: 7dc36059f8478636395e7f12ab12b8a7b1d283c03c35434a4e9e699211de6262d35efab81c155ff443f03190e11a18e601bd5701342dc95a33d4551bae9cb22b
+   data.tar.gz: 2f4f7d31525289ac828bc17cdafd38c23e1f82a96456f8047a8fe2cac2d5ef988240d5a7f439ad590de5b9a94fb9fe384c2bd531181bc0dc77cd8d97807a6100
data/CHANGELOG.md ADDED
@@ -0,0 +1,2 @@
+ ## 0.1.0
+ - Plugin created with the logstash plugin generator
data/CONTRIBUTORS ADDED
@@ -0,0 +1,10 @@
+ The following is a list of people who have contributed ideas, code, bug
+ reports, or in general have helped logstash along its way.
+
+ Contributors:
+ * Adam Caldwell - alcanzar@gmail.com
+
+ Note: If you've sent us patches, bug reports, or otherwise contributed to
+ Logstash, and you aren't on the list above and want to be, please let us know
+ and we'll make sure you're here. Contributions from folks like you are what make
+ open source awesome.
data/Gemfile ADDED
@@ -0,0 +1,3 @@
+ source 'https://rubygems.org'
+ gemspec
+
data/LICENSE ADDED
@@ -0,0 +1,11 @@
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/README.md ADDED
@@ -0,0 +1,103 @@
+ # logstash-filter-augment Plugin
+
+ This is a plugin for [Logstash](https://github.com/elastic/logstash).
+
+ It is fully free and fully open source. The license is Apache 2.0, meaning you are free to use it however you want.
+
+ It can be used to augment events in Logstash from the config, or from CSV, JSON, or YAML files. This differs from the translate plugin in that it can add multiple fields to the event based on one lookup. For example, say you have a geocode file that maps store numbers to coordinates. Using this plugin you can add a `location.latitude` and `location.longitude` to your event based on a simple lookup.
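+
+ For instance, a hypothetical `stores.csv` could map store numbers to coordinates (the file name, columns, and field names here are illustrative, not shipped with the plugin):
+
+ ```csv
+ store,latitude,longitude
+ 1,41.878,-87.629
+ 2,34.052,-118.243
+ ```
+
+ and a sketch of a matching filter config would look like this:
+
+ ```ruby
+ filter {
+   augment {
+     field => "store"
+     dictionary_path => "/path/to/stores.csv"
+     target => "[location]"
+     # csv_remove_key defaults to true, so only latitude and longitude are
+     # copied onto the event, as [location][latitude] and [location][longitude]
+   }
+ }
+ ```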
+
+ ## Documentation
+
+ The logstash-filter-augment plugin can be configured statically like this:
+ ```ruby
+ filter {
+   augment {
+     field => "status"
+     dictionary => {
+       "200" => {
+         "color" => "green"
+         "message" => "OK"
+       }
+       "404" => {
+         "color" => "red"
+         "message" => "Missing"
+       }
+     }
+     default => {
+       "color" => "orange"
+       "message" => "not found"
+     }
+   }
+ }
+ ```
+ Then, when an event with status=200 comes in, it will have color=green and message=OK added to it.
+
+ Additionally, you can use a CSV, YAML, or JSON file to define the mapping.
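+
+ For example, the static dictionary above could instead live in a CSV file (the path below is illustrative):
+
+ ```csv
+ status,color,message
+ 200,green,OK
+ 404,red,Missing
+ ```
+
+ with a sketch of the corresponding config:
+
+ ```ruby
+ filter {
+   augment {
+     field => "status"
+     dictionary_path => "/path/to/status-codes.csv"
+     # csv_header isn't set, so the first line is treated as the header,
+     # the first column ("status") becomes the lookup key, and the file is
+     # re-read when it changes (checked at most every refresh_interval seconds)
+   }
+ }
+ ```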
+
+ ## Developing
+
+ ### 1. Plugin Development and Testing
+
+ #### Code
+ - To get started, you'll need JRuby with the Bundler gem installed.
+
+ - Install dependencies
+ ```sh
+ bundle install
+ ```
+
+ #### Test
+
+ - Update your dependencies
+
+ ```sh
+ bundle install
+ ```
+
+ - Run tests
+
+ ```sh
+ bundle exec rspec
+ ```
+
+ ### 2. Running your unpublished Plugin in Logstash
+
+ #### 2.1 Run in a local Logstash clone
+
+ - Edit Logstash `Gemfile` and add the local plugin path, for example:
+ ```ruby
+ gem "logstash-filter-augment", :path => "/your/local/logstash-filter-augment"
+ ```
+ - Install the plugin
+ ```sh
+ bin/logstash-plugin install --no-verify
+ ```
+ - Run Logstash with your plugin (`field` is required, so pass a minimal config)
+ ```sh
+ bin/logstash -e 'filter { augment { field => "status" dictionary => { "200" => { "message" => "OK" } } } }'
+ ```
+ At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
+
+ #### 2.2 Run in an installed Logstash
+
+ You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:
+
+ - Build your plugin gem
+ ```sh
+ gem build logstash-filter-augment.gemspec
+ ```
+ - Install the plugin from the Logstash home
+ ```sh
+ bin/logstash-plugin install /your/local/plugin/logstash-filter-augment.gem
+ ```
+ - Start Logstash and proceed to test the plugin
+
+ ## Contributing
+
+ All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
+
+ Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
+
+ It is more important to the community that you are able to contribute.
+
+ For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
data/lib/logstash/filters/augment.rb ADDED
@@ -0,0 +1,334 @@
+ # encoding: utf-8
+ require "logstash/filters/base"
+ require "logstash/namespace"
+ require "json"
+ require "yaml"
+ require "csv"
+
+ # This filter will allow you to augment events in logstash from
+ # an external file source
+ class LogStash::Filters::Augment < LogStash::Filters::Base
+   # [source,ruby]
+   # ----------------
+   # filter {
+   #   augment {
+   #     field => "status"
+   #     dictionary => {
+   #       "200" => {
+   #         "color" => "green"
+   #         "message" => "OK"
+   #       }
+   #       "404" => {
+   #         "color" => "red"
+   #         "message" => "Missing"
+   #       }
+   #     }
+   #     default => {
+   #       "color" => "orange"
+   #       "message" => "not found"
+   #     }
+   #   }
+   # }
+   # ----------------
+   config_name "augment"
+
+   # the field to look up in the dictionary
+   config :field, :validate => :string, :required => true
+   # dictionary_path specifies the file (or files) to load from. This can be a
+   # .csv, .yaml/.yml, or .json file
+   config :dictionary_path, :validate => :array
+   # specifies the file type ('json', 'yaml', 'yml', 'csv', and 'auto' are valid values)
+   config :dictionary_type, :validate => ['auto', 'csv', 'json', 'yaml', 'yml'], :default => 'auto'
+   # if specified this should be a hash of objects like this:
+   # [source,ruby]
+   # ----------------
+   # dictionary => {
+   #   "200" => {
+   #     "color" => "green"
+   #     "message" => "OK"
+   #   }
+   #   "404" => {
+   #     "color" => "red"
+   #     "message" => "Missing"
+   #   }
+   # }
+   # ----------------
+   config :dictionary, :validate => :hash
+   # csv_header is the list of columns of the csv file
+   config :csv_header, :validate => :array
+   # csv_first_line indicates what to do with the first line of the file
+   # - 'ignore' skips it (csv_header must be set)
+   # - 'header' reads it and populates csv_header with it (csv_header must not be set)
+   # - 'data' reads it as data (csv_header must be set)
+   # - 'auto' treats the first line as data if csv_header is set, or as the header if it isn't
+   config :csv_first_line, :validate => ["data","header","ignore","auto"], :default => "auto"
+   # csv_key determines which field of the csv file is the dictionary key.
+   # If this is not set, it will default to the first column of the csv file
+   config :csv_key, :validate => :string
+   # if csv_remove_key is set, the key column is removed from the csv fields used for augmenting.
+   # For example, say you have 200,green,ok as a line in the csv file where
+   # the fields are status,color,message and your csv_key is set to status. If csv_remove_key
+   # is false then the event will have a status=200. If csv_remove_key is true, then the event won't have
+   # a status unless it already existed in the event.
+   config :csv_remove_key, :validate => :boolean, :default => true
+   # if the json file provided is an array, this specifies which field of the
+   # array of objects is the key value
+   config :json_key, :validate => :string
+   # if json_remove_key is set and your json file is an array, it will remove the
+   # key field from the object, similar to csv_remove_key
+   config :json_remove_key, :validate => :boolean, :default => true
+   # if the yaml file provided is an array, this specifies which field of the
+   # array of objects is the key value
+   config :yaml_key, :validate => :string
+   # if yaml_remove_key is set and your yaml file is an array, it will remove the
+   # key field from the object, similar to csv_remove_key
+   config :yaml_remove_key, :validate => :boolean, :default => true
+   # augment_fields is the list of fields of the dictionary's value to augment
+   # the event with. If this is not set, then all set fields of the dictionary
+   # object are set on the event
+   config :augment_fields, :validate => :array
+   # if target is set, the augmented fields will be added to this event
+   # field instead of the root event.
+   config :target, :validate => :string, :default => ""
+   # default will be used if the key is not found
+   # for example:
+   # [source,ruby]
+   # ----------------
+   # default => {
+   #   status => 'unknown'
+   #   color => 'orange'
+   # }
+   # ----------------
+   config :default, :validate => :hash
+   # refresh_interval specifies the minimum time between file refreshes in seconds.
+   # This plugin looks at the modification time of the file and only reloads if that changes
+   config :refresh_interval, :validate => :number, :default => 60
+   # ignore_fields are the fields of the dictionary value that you want to ignore
+   config :ignore_fields, :validate => :array
+
+   public
+   def register
+     @dictionary_mtime = Hash.new
+     rw_lock = java.util.concurrent.locks.ReentrantReadWriteLock.new
+     @read_lock = rw_lock.readLock
+     @write_lock = rw_lock.writeLock
+     if !@dictionary
+       @dictionary = Hash.new
+     end
+     @dictionaries = @dictionary_path.nil? ? nil : (@dictionary_path.is_a?(Array) ? @dictionary_path : [ @dictionary_path ])
+
+     if @dictionary_path && !@dictionary.empty?
+       raise LogStash::ConfigurationError, "The configuration options 'dictionary' and 'dictionary_path' are mutually exclusive"
+     end
+
+     if @csv_first_line == 'ignore' && !@csv_header
+       raise LogStash::ConfigurationError, "The parameter csv_header is required if csv_first_line is 'ignore'"
+     end
+
+     load_or_refresh_dictionaries(true)
+
+     @exclude_keys = Hash.new
+     if @ignore_fields
+       @ignore_fields.each { |k| @exclude_keys[k] = true }
+     end
+
+     # validate the dictionary is in the right format
+     if @dictionary
+       newdic = Hash.new
+       @dictionary.each do |key,val|
+         if val.is_a?(Array)
+           newdic[key] = Hash[*val]
+         elsif val.is_a?(Hash)
+           newdic[key] = val
+         else
+           raise LogStash::ConfigurationError, "The dictionary must be a hash of string to hash. The value for key '#{key}' is a #{val.class} instead"
+         end
+       end
+       @dictionary = newdic
+     end
+
+     @logger.debug? and @logger.debug("#{self.class.name}: Dictionary - ", :dictionary => @dictionary)
+   end # def register
+
+   public
+   def filter(event)
+     load_or_refresh_dictionaries(false)
+
+     return unless event.include?(@field) # nothing to look up if the event doesn't have the source field
+
+     begin
+       # if the source field is an array, use its first value, and make sure the source value is a string
+       source = event.get(@field).is_a?(Array) ? event.get(@field).first.to_s : event.get(@field).to_s
+       row = lock_for_read { @dictionary[source] }
+       if !row
+         row = @default
+       end
+       return unless row # nothing to do if there's nothing to add
+
+       if @augment_fields
+         @augment_fields.each { |k| event.set("#{@target}[#{k}]", row[k]) if row[k] }
+       else
+         row.each { |k,v| event.set("#{@target}[#{k}]", v) unless @exclude_keys[k] }
+       end
+       filter_matched(event)
+     rescue Exception => e
+       @logger.error("Something went wrong when attempting to augment from dictionary", :exception => e, :field => @field, :event => event)
+     end
+   end # def filter
+
+
+   private
+   def lock_for_read
+     @read_lock.lock
+     begin
+       yield
+     ensure
+       @read_lock.unlock
+     end
+   end
+
+   def lock_for_write
+     @write_lock.lock
+     begin
+       yield
+     ensure
+       @write_lock.unlock
+     end
+   end
+
+   def load_dictionary(filename, raise_exception=false)
+     if !File.exist?(filename)
+       if raise_exception
+         raise "Dictionary #{filename} does not exist"
+       else
+         @logger.warn("Dictionary #{filename} does not exist")
+         return
+       end
+     end
+     if @dictionary_type == 'yaml' || @dictionary_type == 'yml' || (@dictionary_type == 'auto' && /\.ya?ml$/.match(filename))
+       load_yaml(filename, raise_exception)
+     elsif @dictionary_type == 'json' || (@dictionary_type == 'auto' && filename.end_with?(".json"))
+       load_json(filename, raise_exception)
+     elsif @dictionary_type == 'csv' || (@dictionary_type == 'auto' && filename.end_with?(".csv"))
+       load_csv(filename, raise_exception)
+     else
+       raise "#{self.class.name}: Dictionary #{filename} format not recognized from filename or dictionary_type"
+     end
+   rescue => e
+     loading_exception(e, raise_exception)
+   end
+
+   def cleanup_data(filename, tree, key_if_array, key_type, remove_key)
+     if tree.is_a?(Array)
+       if !key_if_array
+         raise LogStash::ConfigurationError, "The #{filename} file is an array, but #{key_type}_key is not set"
+       end
+       newTree = Hash.new
+       tree.each do |v|
+         newTree[v[key_if_array].to_s] = v
+         if remove_key
+           v.delete(key_if_array)
+         end
+       end
+       tree = newTree
+     end
+     # stringify keys so lookups against event values (always strings) work
+     newTree = Hash.new
+     tree.each { |k,v| newTree[k.to_s] = v }
+     tree = newTree
+     return tree
+   end
+
+   def load_yaml(filename, raise_exception=false)
+     yaml = YAML.load_file(filename)
+     yaml = cleanup_data(filename, yaml, @yaml_key, 'yaml', @yaml_remove_key)
+     merge_dictionary!(filename, yaml)
+   end
+
+   def load_json(filename, raise_exception=false)
+     json = JSON.parse(File.read(filename))
+     json = cleanup_data(filename, json, @json_key, 'json', @json_remove_key)
+     merge_dictionary!(filename, json)
+   end
+
+   def load_csv(filename, raise_exception=false)
+     if !@initialized
+       @initialized = true
+       if @csv_first_line == 'auto'
+         if @csv_header
+           @csv_first_line = 'data'
+         else
+           @csv_first_line = 'header'
+         end
+       end
+       if @csv_first_line == 'header' && @csv_header
+         raise LogStash::ConfigurationError, "csv_first_line is set to 'header' but csv_header is also set"
+       end
+       if @csv_first_line == 'ignore' && !@csv_header
+         raise LogStash::ConfigurationError, "csv_first_line is set to 'ignore' but csv_header is not set"
+       end
+     end
+     csv_lines = CSV.read(filename)
+     if @csv_first_line == 'header'
+       @csv_header = csv_lines.shift
+     elsif @csv_first_line == 'ignore'
+       csv_lines.shift
+     end
+     if @csv_key.nil?
+       @csv_key = @csv_header[0]
+     end
+     data = Hash.new
+     csv_lines.each do |line|
+       o = Hash.new
+       line.zip(@csv_header).each do |value, header|
+         o[header] = value
+       end
+       key = o[@csv_key]
+       if @csv_remove_key
+         o.delete(@csv_key)
+       end
+       data[key] = o
+     end
+     merge_dictionary!(filename, data)
+   end
+
+   def merge_dictionary!(filename, data)
+     @logger.debug("Merging data from #{filename} = #{data}")
+     @dictionary.merge!(data)
+   end
+
+   def loading_exception(e, raise_exception=false)
+     msg = "#{self.class.name}: #{e.message} when loading dictionary file"
+     if raise_exception
+       raise RuntimeError.new(msg)
+     else
+       @logger.warn("#{msg}, continuing with old dictionary")
+     end
+   end
+
+   def refresh_dictionary(filename, raise_exception)
+     mtime = File.mtime(filename)
+     if !@dictionary_mtime[filename] || @dictionary_mtime[filename] != mtime
+       @dictionary_mtime[filename] = mtime
+       @logger.info("file #{filename} has been modified, reloading")
+       load_dictionary(filename, raise_exception)
+     end
+   end
+
+   def load_or_refresh_dictionaries(raise_exception=false)
+     if !@dictionaries
+       return
+     end
+     # skip the write lock entirely if the next refresh isn't due yet
+     if @next_refresh && @next_refresh > Time.now
+       return
+     end
+     lock_for_write do
+       if !@dictionary_mtime
+         @dictionary_mtime = Hash.new
+       end
+       # re-check now that we hold the lock; another thread may have refreshed
+       if @next_refresh && @next_refresh > Time.now
+         return
+       end
+       @logger.info("checking for modified dictionary files")
+       @dictionaries.each { |filename| refresh_dictionary(filename, raise_exception) }
+       @next_refresh = Time.now + @refresh_interval
+     end
+   end
+ end # class LogStash::Filters::Augment
data/logstash-filter-augment.gemspec ADDED
@@ -0,0 +1,23 @@
+ Gem::Specification.new do |s|
+   s.name = 'logstash-filter-augment'
+   s.version = '0.1.0'
+   s.licenses = ['Apache License (2.0)']
+   s.summary = 'A logstash plugin to augment your events from data in files'
+   s.description = 'A logstash plugin that can merge data from CSV, YAML, and JSON files with events.'
+   s.homepage = 'https://github.com/alcanzar/logstash-filter-augment/'
+   s.authors = ['Adam Caldwell']
+   s.email = 'alcanzar@gmail.com'
+   s.require_paths = ['lib']
+
+   # Files
+   s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
+   # Tests
+   s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+   # Special flag to let us know this is actually a logstash plugin
+   s.metadata = { "logstash_plugin" => "true", "logstash_group" => "filter" }
+
+   # Gem dependencies
+   s.add_runtime_dependency "logstash-core-plugin-api", "~> 2.0"
+   s.add_development_dependency 'logstash-devutils'
+ end
data/spec/filters/augment_spec.rb ADDED
@@ -0,0 +1,196 @@
+ # encoding: utf-8
+ require_relative '../spec_helper'
+ require "logstash/filters/augment"
+
+ describe LogStash::Filters::Augment do
+   describe "static config" do
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary => {
+             "200" => {
+               "color" => "green"
+               "message" => "OK"
+             }
+             "404" => {
+               "color" => "red"
+               "message" => "Missing"
+             }
+           }
+         }
+       }
+     CONFIG
+     sample("status" => "200") do
+       insist { subject.get("color") } == "green"
+       insist { subject.get("message") } == "OK"
+     end
+   end
+   describe "static config with defaults" do
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary => {
+             "200" => {
+               "color" => "green"
+               "message" => "OK"
+             }
+             "404" => {
+               "color" => "red"
+               "message" => "Missing"
+             }
+           }
+           default => {
+             "color" => "orange"
+             "message" => "not found"
+           }
+         }
+       }
+     CONFIG
+     sample("status" => "201") do
+       insist { subject.get("color") } == "orange"
+       insist { subject.get("message") } == "not found"
+     end
+   end
+   describe "invalid config because dictionary isn't right" do
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary => {
+             "200" => "OK"
+             "404" => "Bogus"
+           }
+         }
+       }
+     CONFIG
+     sample("status" => "200") do
+       expect { subject }.to raise_exception LogStash::ConfigurationError
+     end
+   end
+   describe "simple csv file with header ignored" do
+     filename = File.join(File.dirname(__FILE__), "..", "fixtures", "test-with-headers.csv")
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary_path => '#{filename}'
+           csv_header => ["status","color","message"]
+           csv_first_line => "ignore"
+         }
+       }
+     CONFIG
+     sample("status" => "200") do
+       insist { subject.get("color") } == "green"
+       insist { subject.get("message") } == "ok"
+     end
+   end
+   describe "simple csv file with header, but not ignored" do
+     filename = File.join(File.dirname(__FILE__), "..", "fixtures", "test-with-headers.csv")
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary_path => '#{filename}'
+         }
+       }
+     CONFIG
+     sample("status" => "200") do
+       insist { subject.get("color") } == "green"
+       insist { subject.get("message") } == "ok"
+     end
+   end
+   describe "simple csv file with ignore_fields set" do
+     filename = File.join(File.dirname(__FILE__), "..", "fixtures", "test-with-headers.csv")
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary_path => '#{filename}'
+           ignore_fields => ["color"]
+         }
+       }
+     CONFIG
+     sample("status" => "200") do
+       insist { subject.get("color") } == nil
+       insist { subject.get("message") } == "ok"
+     end
+   end
+   describe "json-hash" do
+     filename = File.join(File.dirname(__FILE__), "..", "fixtures", "json-hash.json")
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary_path => '#{filename}'
+         }
+       }
+     CONFIG
+     sample("status" => "200") do
+       insist { subject.get("color") } == "green"
+       insist { subject.get("message") } == "ok"
+     end
+   end
+   describe "json-array no json_key" do
+     filename = File.join(File.dirname(__FILE__), "..", "fixtures", "json-array.json")
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary_path => '#{filename}'
+         }
+       }
+     CONFIG
+     sample("status" => "404") do
+       expect { subject }.to raise_exception RuntimeError
+     end
+   end
+   describe "json-array with json_key" do
+     filename = File.join(File.dirname(__FILE__), "..", "fixtures", "json-array.json")
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary_path => '#{filename}'
+           json_key => "code"
+         }
+       }
+     CONFIG
+     sample("status" => "404") do
+       insist { subject.get("color") } == "red"
+       insist { subject.get("message") } == "not found"
+     end
+   end
+   describe "yaml-array with integer key" do
+     filename = File.join(File.dirname(__FILE__), "..", "fixtures", "yaml-array.yaml")
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary_path => '#{filename}'
+           yaml_key => "code"
+         }
+       }
+     CONFIG
+     sample("status" => "404") do
+       insist { subject.get("color") } == "red"
+       insist { subject.get("message") } == "not found"
+     end
+   end
+   describe "yaml-object" do
+     filename = File.join(File.dirname(__FILE__), "..", "fixtures", "yaml-object.yaml")
+     config <<-CONFIG
+       filter {
+         augment {
+           field => "status"
+           dictionary_path => '#{filename}'
+         }
+       }
+     CONFIG
+     sample("status" => "404") do
+       insist { subject.get("color") } == "red"
+       insist { subject.get("message") } == "not found"
+     end
+   end
+ end
data/spec/fixtures/json-array-int-key.json ADDED
@@ -0,0 +1,4 @@
+ [
+   {"code": 200, "color": "green", "message": "ok"},
+   {"code": 404, "color": "red", "message": "not found"}
+ ]
data/spec/fixtures/json-array.json ADDED
@@ -0,0 +1,4 @@
+ [
+   {"code": "200", "color": "green", "message": "ok"},
+   {"code": "404", "color": "red", "message": "not found"}
+ ]
data/spec/fixtures/json-hash.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "200": { "color": "green", "message": "ok" },
+   "404": { "color": "red", "message": "not found" }
+ }
data/spec/fixtures/test-with-headers.csv ADDED
@@ -0,0 +1,3 @@
+ code,color,message
+ 200,green,ok
+ 404,red,not found
data/spec/fixtures/test-without-headers.csv ADDED
@@ -0,0 +1,2 @@
+ 200,green,ok
+ 404,red,ok
data/spec/fixtures/yaml-array.yaml ADDED
@@ -0,0 +1,6 @@
+ - code: 200
+   color: green
+   message: ok
+ - code: 404
+   color: red
+   message: not found
data/spec/fixtures/yaml-object.yaml ADDED
@@ -0,0 +1,6 @@
+ 200:
+   color: green
+   message: ok
+ 404:
+   color: red
+   message: not found
data/spec/spec_helper.rb ADDED
@@ -0,0 +1,2 @@
+ # encoding: utf-8
+ require "logstash/devutils/rspec/spec_helper"
metadata ADDED
@@ -0,0 +1,98 @@
+ --- !ruby/object:Gem::Specification
+ name: logstash-filter-augment
+ version: !ruby/object:Gem::Version
+   version: 0.1.0
+ platform: ruby
+ authors:
+ - Adam Caldwell
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2017-01-16 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ~>
+       - !ruby/object:Gem::Version
+         version: '2.0'
+   name: logstash-core-plugin-api
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ~>
+       - !ruby/object:Gem::Version
+         version: '2.0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-devutils
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
+ description: A logstash plugin that can merge data from CSV, YAML, and JSON files with events.
+ email: alcanzar@gmail.com
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - CONTRIBUTORS
+ - Gemfile
+ - LICENSE
+ - README.md
+ - lib/logstash/filters/augment.rb
+ - logstash-filter-augment.gemspec
+ - spec/filters/augment_spec.rb
+ - spec/fixtures/json-array-int-key.json
+ - spec/fixtures/json-array.json
+ - spec/fixtures/json-hash.json
+ - spec/fixtures/test-with-headers.csv
+ - spec/fixtures/test-without-headers.csv
+ - spec/fixtures/yaml-array.yaml
+ - spec/fixtures/yaml-object.yaml
+ - spec/spec_helper.rb
+ homepage: https://github.com/alcanzar/logstash-filter-augment/
+ licenses:
+ - Apache License (2.0)
+ metadata:
+   logstash_plugin: 'true'
+   logstash_group: filter
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - '>='
+     - !ruby/object:Gem::Version
+       version: '0'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - '>='
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.4.5
+ signing_key:
+ specification_version: 4
+ summary: A logstash plugin to augment your events from data in files
+ test_files:
+ - spec/filters/augment_spec.rb
+ - spec/fixtures/json-array-int-key.json
+ - spec/fixtures/json-array.json
+ - spec/fixtures/json-hash.json
+ - spec/fixtures/test-with-headers.csv
+ - spec/fixtures/test-without-headers.csv
+ - spec/fixtures/yaml-array.yaml
+ - spec/fixtures/yaml-object.yaml
+ - spec/spec_helper.rb