logstash-codec-netflow 2.0.5 → 2.1.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: 0ae020958c2421c6319d6f7b0ad7d9889329d988
- data.tar.gz: 68ed6e1be5662bf00cc4d59aae6be8315485cb33
+ metadata.gz: 918358ce2cdaac6c82befb4288a60dbf12bd582d
+ data.tar.gz: 801135d166dc217611ca6a079eee87f27df9a0d5
  SHA512:
- metadata.gz: d93c9c1608d2d4c838d9ce4f6625461c537e8009f75730c54e4fd2ccd77a867f758562b39e9aca1e17472bb00c70d8d40b7e11b22a3f745d41271dfe99e2ace6
- data.tar.gz: f388c2d88d5e9a21da0c4966d2c7c2780e10665b60d83ddff636e384c0b8b422e7651493fa7933c39ac88455bf3aacfdb08cfabdd72d89002d14a7019b4029aa
+ metadata.gz: 4eb4a32af83f1485e23aca07ad96795988980e4da059c328e3870fb346482d7d0709248ff3d9c78e559f85f7b26b40ca036cbf80a7e6d406b83dab844a725b33
+ data.tar.gz: 6e9b9e41db2f5df5483eb66b0657a58ea4ff2ce594d64bb22d04337004bd3e68cb96057a86273ce73d77136d9d71cac3d6e53d0841e7ef74f0d5dbd4c290c367
data/CHANGELOG.md CHANGED
@@ -1,11 +1,26 @@
+ ## 2.1.0
+
+ - Added IPFIX support
+ - Fixed exception if Netflow data contains MAC addresses (issue #26, issue #34)
+ - Fixed exceptions when receiving invalid Netflow v5 and v9 data (issue #17, issue #18)
+ - Fixed decoding Netflow templates from multiple (non-identical) exporters
+ - Add support for Cisco ASA fields
+ - Add support for Netflow 9 options template with scope fields
+
  # 2.0.5
+
  - Depend on logstash-core-plugin-api instead of logstash-core, removing the need to mass update plugins on major releases of logstash
+
  # 2.0.4
+
  - New dependency requirements for logstash-core for the 5.0 release
+
  ## 2.0.3
+
  - Fixed JSON compare flaw in specs

  ## 2.0.0
+
  - Plugins were updated to follow the new shutdown semantic, this mainly allows Logstash to instruct input plugins to terminate gracefully,
  instead of using Thread.raise on the plugins' threads. Ref: https://github.com/elastic/logstash/pull/3895
  - Dependency on logstash-core update to 2.0
data/CONTRIBUTORS CHANGED
@@ -3,12 +3,24 @@ reports, or in general have helped logstash along its way.

  Contributors:
  * Aaron Mildenstein (untergeek)
+ * Adam Kaminski (thimslugga)
  * Colin Surprenant (colinsurprenant)
  * Jordan Sissel (jordansissel)
+ * Jorrit Folmer (jorritfolmer)
  * Matt Dainty (bodgit)
+ * Paul Warren (pwarren)
  * Pier-Hugues Pellerin (ph)
+ * Pulkit Agrawal (propulkit)
  * Richard Pijnenburg (electrical)
+ * Salvador Ferrer (salva-ferrer)
+ * Will Rigby (wrigby)
+ * Rojuinex
  * debadair
+ * hkshirish
+ * jstopinsek
+
+ Maintainer:
+ * Jorrit Folmer (jorritfolmer)

  Note: If you've sent us patches, bug reports, or otherwise contributed to
  Logstash, and you aren't on the list above and want to be, please let us know
data/Gemfile CHANGED
@@ -1,2 +1,2 @@
  source 'https://rubygems.org'
- gemspec
+ gemspec
data/LICENSE CHANGED
@@ -1,4 +1,4 @@
- Copyright (c) 2012–2015 Elasticsearch <http://www.elastic.co>
+ Copyright (c) 2012–2016 Elasticsearch <http://www.elastic.co>

  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
data/README.md CHANGED
@@ -1,7 +1,6 @@
  # Logstash Plugin

- [![Build
- Status](http://build-eu-00.elastic.co/view/LS%20Plugins/view/LS%20Codecs/job/logstash-plugin-codec-netflow-unit/badge/icon)](http://build-eu-00.elastic.co/view/LS%20Plugins/view/LS%20Codecs/job/logstash-plugin-codec-netflow-unit/)
+ [![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-codec-netflow.svg)](https://travis-ci.org/logstash-plugins/logstash-codec-netflow)

  This is a plugin for [Logstash](https://github.com/elastic/logstash).

@@ -56,7 +55,12 @@ gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
  ```
  - Install plugin
  ```sh
+ # Logstash 2.3 and higher
+ bin/logstash-plugin install --no-verify
+
+ # Prior to Logstash 2.3
  bin/plugin install --no-verify
+
  ```
  - Run Logstash with your plugin
  ```sh
@@ -74,7 +78,12 @@ gem build logstash-filter-awesome.gemspec
  ```
  - Install the plugin from the Logstash home
  ```sh
- bin/plugin install /your/local/plugin/logstash-filter-awesome.gem
+ # Logstash 2.3 and higher
+ bin/logstash-plugin install --no-verify
+
+ # Prior to Logstash 2.3
+ bin/plugin install --no-verify
+
  ```
  - Start Logstash and proceed to test the plugin

data/lib/logstash/codecs/netflow.rb CHANGED
@@ -3,7 +3,59 @@ require "logstash/filters/base"
  require "logstash/namespace"
  require "logstash/timestamp"

- # The "netflow" codec is for decoding Netflow v5/v9 flows.
+ # The "netflow" codec is used for decoding Netflow v5/v9/v10 (IPFIX) flows.
+ #
+ # ==== Supported Netflow/IPFIX exporters
+ #
+ # The following Netflow/IPFIX exporters are known to work with the most recent version of the netflow codec:
+ #
+ # [cols="6,^2,^2,^2,12",options="header"]
+ # |===========================================================================================
+ # |Netflow exporter | v5 | v9 | IPFIX | Remarks
+ # |Softflowd | y | y | y | IPFIX supported in https://github.com/djmdjm/softflowd
+ # |nProbe | y | y | y |
+ # |ipt_NETFLOW | y | y | y |
+ # |Cisco ASA | | y | |
+ # |Cisco IOS 12.x | | y | |
+ # |fprobe | y | | |
+ # |===========================================================================================
+ #
+ # ==== Usage
+ #
+ # Example Logstash configuration:
+ #
+ # [source]
+ # -----------------------------
+ # input {
+ # udp {
+ # host => localhost
+ # port => 2055
+ # codec => netflow {
+ # versions => [5, 9]
+ # }
+ # type => netflow
+ # }
+ # udp {
+ # host => localhost
+ # port => 4739
+ # codec => netflow {
+ # versions => [10]
+ # target => ipfix
+ # }
+ # type => ipfix
+ # }
+ # tcp {
+ # host => localhost
+ # port => 4739
+ # codec => netflow {
+ # versions => [10]
+ # target => ipfix
+ # }
+ # type => ipfix
+ # }
+ # }
+ # -----------------------------
+
  class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  config_name "netflow"

@@ -14,7 +66,7 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  config :target, :validate => :string, :default => "netflow"

  # Specify which Netflow versions you will accept.
- config :versions, :validate => :array, :default => [5, 9]
+ config :versions, :validate => :array, :default => [5, 9, 10]

  # Override YAML file containing Netflow field definitions
  #
@@ -31,10 +83,36 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  # - :skip
  #
  # See <https://github.com/logstash-plugins/logstash-codec-netflow/blob/master/lib/logstash/codecs/netflow/netflow.yaml> for the base set.
- config :definitions, :validate => :path
+ config :netflow_definitions, :validate => :path
+
+ # Override YAML file containing IPFIX field definitions
+ #
+ # Very similar to the Netflow version except there is a top level Private
+ # Enterprise Number (PEN) key added:
+ #
+ # ---
+ # pen:
+ # id:
+ # - :uintN or :ip4_addr or :ip6_addr or :mac_addr or :string
+ # - :name
+ # id:
+ # - :skip
+ #
+ # There is an implicit PEN 0 for the standard fields.
+ #
+ # See <https://github.com/logstash-plugins/logstash-codec-netflow/blob/master/lib/logstash/codecs/netflow/ipfix.yaml> for the base set.
+ config :ipfix_definitions, :validate => :path

  NETFLOW5_FIELDS = ['version', 'flow_seq_num', 'engine_type', 'engine_id', 'sampling_algorithm', 'sampling_interval', 'flow_records']
  NETFLOW9_FIELDS = ['version', 'flow_seq_num']
+ NETFLOW9_SCOPES = {
+ 1 => :scope_system,
+ 2 => :scope_interface,
+ 3 => :scope_line_card,
+ 4 => :scope_netflow_cache,
+ 5 => :scope_template,
+ }
+ IPFIX_FIELDS = ['version']
  SWITCHED = /_switched$/
  FLOWSET_ID = "flowset_id"

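The skeleton in the comment above can be hard to picture, so here is a minimal sketch of what a custom `ipfix_definitions` file amounts to once loaded: a hash keyed by PEN, then by field id. PEN 9999, `:exampleCounter`, and the one-entry stand-in for `ipfix.yaml` are hypothetical and for illustration only.

```ruby
# Illustration only: the nested structure a custom ipfix_definitions file
# produces after loading. PEN 9999 and :exampleCounter are hypothetical.
override = {
  9999 => {
    1 => [:uint32, :exampleCounter],
    2 => [:skip],
  },
}

# load_definitions (added further down in this diff) shallow-merges such a
# hash over the defaults; standard IANA fields live under the implicit PEN 0.
defaults = { 0 => { 1 => [:uint32, :octetDeltaCount] } } # tiny stand-in for ipfix.yaml
p defaults.merge(override).keys # => [0, 9999]
```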
@@ -45,29 +123,19 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base

  def register
  require "logstash/codecs/netflow/util"
- @templates = Vash.new()
+ @netflow_templates = Vash.new()
+ @ipfix_templates = Vash.new()

  # Path to default Netflow v9 field definitions
  filename = ::File.expand_path('netflow/netflow.yaml', ::File.dirname(__FILE__))
+ @netflow_fields = load_definitions(filename, @netflow_definitions)

- begin
- @fields = YAML.load_file(filename)
- rescue Exception => e
- raise "#{self.class.name}: Bad syntax in definitions file #{filename}"
- end
-
- # Allow the user to augment/override/rename the supported Netflow fields
- if @definitions
- raise "#{self.class.name}: definitions file #{@definitions} does not exists" unless File.exists?(@definitions)
- begin
- @fields.merge!(YAML.load_file(@definitions))
- rescue Exception => e
- raise "#{self.class.name}: Bad syntax in definitions file #{@definitions}"
- end
- end
+ # Path to default IPFIX field definitions
+ filename = ::File.expand_path('netflow/ipfix.yaml', ::File.dirname(__FILE__))
+ @ipfix_fields = load_definitions(filename, @ipfix_definitions)
  end # def register

- def decode(payload, &block)
+ def decode(payload, metadata = nil, &block)
  header = Header.read(payload)

  unless @versions.include?(header.version)
@@ -83,11 +151,22 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  elsif header.version == 9
  flowset = Netflow9PDU.read(payload)
  flowset.records.each do |record|
- decode_netflow9(flowset, record).each{|event| yield(event)}
+ if metadata != nil
+ decode_netflow9(flowset, record, metadata).each{|event| yield(event)}
+ else
+ decode_netflow9(flowset, record).each{|event| yield(event)}
+ end
+ end
+ elsif header.version == 10
+ flowset = IpfixPDU.read(payload)
+ flowset.records.each do |record|
+ decode_ipfix(flowset, record).each { |event| yield(event) }
  end
  else
  @logger.warn("Unsupported Netflow version v#{header.version}")
  end
+ rescue BinData::ValidityError, IOError => e
+ @logger.warn("Invalid netflow packet received (#{e})")
  end

  private
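For context on the new two-argument `decode` above, here is a minimal sketch (not taken from the gem's specs) of driving the codec by hand. The capture filename, host, and port are placeholders; the metadata hash mirrors the `"host"`/`"port"` keys the v9 path looks up below, and callers that pass only the payload keep working because `metadata` defaults to `nil`.

```ruby
# Hypothetical driver for the codec; payload, host and port are placeholders.
require "logstash/codecs/netflow"

codec = LogStash::Codecs::Netflow.new("versions" => [5, 9, 10])
codec.register

payload  = File.binread("netflow9_sample.dat")        # raw datagram from a capture
metadata = { "host" => "192.0.2.10", "port" => 2055 } # as an input plugin might supply

codec.decode(payload, metadata) do |event|
  puts event.to_hash.inspect
end
```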
@@ -125,9 +204,11 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  end

  LogStash::Event.new(event)
+ rescue BinData::ValidityError, IOError => e
+ @logger.warn("Invalid netflow packet received (#{e})")
  end

- def decode_netflow9(flowset, record)
+ def decode_netflow9(flowset, record, metadata = nil)
  events = []

  case record.flowset_id
@@ -143,10 +224,14 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  end
  # We get this far, we have a list of fields
  #key = "#{flowset.source_id}|#{event["source"]}|#{template.template_id}"
- key = "#{flowset.source_id}|#{template.template_id}"
- @templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
+ if metadata != nil
+ key = "#{flowset.source_id}|#{template.template_id}|#{metadata["host"]}|#{metadata["port"]}"
+ else
+ key = "#{flowset.source_id}|#{template.template_id}"
+ end
+ @netflow_templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
  # Purge any expired templates
- @templates.cleanup!
+ @netflow_templates.cleanup!
  end
  end
  when 1
@@ -154,6 +239,9 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  record.flowset_data.templates.each do |template|
  catch (:field) do
  fields = []
+ template.scope_fields.each do |field|
+ fields << [uint_field(0, field.field_length), NETFLOW9_SCOPES[field.field_type]]
+ end
  template.option_fields.each do |field|
  entry = netflow_field_for(field.field_type, field.field_length)
  throw :field unless entry
@@ -161,17 +249,25 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  end
  # We get this far, we have a list of fields
  #key = "#{flowset.source_id}|#{event["source"]}|#{template.template_id}"
- key = "#{flowset.source_id}|#{template.template_id}"
- @templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
+ if metadata != nil
+ key = "#{flowset.source_id}|#{template.template_id}|#{metadata["host"]}|#{metadata["port"]}"
+ else
+ key = "#{flowset.source_id}|#{template.template_id}"
+ end
+ @netflow_templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
  # Purge any expired templates
- @templates.cleanup!
+ @netflow_templates.cleanup!
  end
  end
  when 256..65535
  # Data flowset
  #key = "#{flowset.source_id}|#{event["source"]}|#{record.flowset_id}"
- key = "#{flowset.source_id}|#{record.flowset_id}"
- template = @templates[key]
+ if metadata != nil
+ key = "#{flowset.source_id}|#{record.flowset_id}|#{metadata["host"]}|#{metadata["port"]}"
+ else
+ key = "#{flowset.source_id}|#{record.flowset_id}"
+ end
+ template = @netflow_templates[key]

  unless template
  #@logger.warn("No matching template for flow id #{record.flowset_id} from #{event["source"]}")
@@ -224,6 +320,158 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  end

  events
+ rescue BinData::ValidityError, IOError => e
+ @logger.warn("Invalid netflow packet received (#{e})")
+ end
+
+ def decode_ipfix(flowset, record)
+ events = []
+
+ case record.flowset_id
+ when 2
+ # Template flowset
+ record.flowset_data.templates.each do |template|
+ catch (:field) do
+ fields = []
+ template.fields.each do |field|
+ field_type = field.field_type
+ field_length = field.field_length
+ enterprise_id = field.enterprise ? field.enterprise_id : 0
+
+ if field.field_length == 0xffff
+ # FIXME
+ @logger.warn("Cowardly refusing to deal with variable length encoded field", :type => field_type, :enterprise => enterprise_id)
+ throw :field
+ end
+
+ if enterprise_id == 0
+ case field_type
+ when 291, 292, 293
+ # FIXME
+ @logger.warn("Cowardly refusing to deal with complex data types", :type => field_type, :enterprise => enterprise_id)
+ throw :field
+ end
+ end
+
+ entry = ipfix_field_for(field_type, enterprise_id, field.field_length)
+ throw :field unless entry
+ fields += entry
+ end
+ # FIXME Source IP address required in key
+ key = "#{flowset.observation_domain_id}|#{template.template_id}"
+ @ipfix_templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
+ # Purge any expired templates
+ @ipfix_templates.cleanup!
+ end
+ end
+ when 3
+ # Options template flowset
+ record.flowset_data.templates.each do |template|
+ catch (:field) do
+ fields = []
+ (template.scope_fields.to_ary + template.option_fields.to_ary).each do |field|
+ field_type = field.field_type
+ field_length = field.field_length
+ enterprise_id = field.enterprise ? field.enterprise_id : 0
+
+ if field.field_length == 0xffff
+ # FIXME
+ @logger.warn("Cowardly refusing to deal with variable length encoded field", :type => field_type, :enterprise => enterprise_id)
+ throw :field
+ end
+
+ if enterprise_id == 0
+ case field_type
+ when 291, 292, 293
+ # FIXME
+ @logger.warn("Cowardly refusing to deal with complex data types", :type => field_type, :enterprise => enterprise_id)
+ throw :field
+ end
+ end
+
+ entry = ipfix_field_for(field_type, enterprise_id, field.field_length)
+ throw :field unless entry
+ fields += entry
+ end
+ # FIXME Source IP address required in key
+ key = "#{flowset.observation_domain_id}|#{template.template_id}"
+ @ipfix_templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
+ # Purge any expired templates
+ @ipfix_templates.cleanup!
+ end
+ end
+ when 256..65535
+ # Data flowset
+ key = "#{flowset.observation_domain_id}|#{record.flowset_id}"
+ template = @ipfix_templates[key]
+
+ unless template
+ @logger.warn("No matching template for flow id #{record.flowset_id}")
+ next
+ end
+
+ array = BinData::Array.new(:type => template, :read_until => :eof)
+ records = array.read(record.flowset_data)
+
+ records.each do |r|
+ event = {
+ LogStash::Event::TIMESTAMP => LogStash::Timestamp.at(flowset.unix_sec),
+ @target => {}
+ }
+
+ IPFIX_FIELDS.each do |f|
+ event[@target][f] = flowset[f].snapshot
+ end
+
+ r.each_pair do |k, v|
+ case k.to_s
+ when /^flow(?:Start|End)Seconds$/
+ event[@target][k.to_s] = LogStash::Timestamp.at(v.snapshot).to_iso8601
+ when /^flow(?:Start|End)(Milli|Micro|Nano)seconds$/
+ divisor =
+ case $1
+ when 'Milli'
+ 1_000
+ when 'Micro'
+ 1_000_000
+ when 'Nano'
+ 1_000_000_000
+ end
+ event[@target][k.to_s] = LogStash::Timestamp.at(v.snapshot.to_f / divisor).to_iso8601
+ else
+ event[@target][k.to_s] = v.snapshot
+ end
+ end
+
+ events << LogStash::Event.new(event)
+ end
+ else
+ @logger.warn("Unsupported flowset id #{record.flowset_id}")
+ end
+
+ events
+ rescue BinData::ValidityError => e
+ @logger.warn("Invalid IPFIX packet received (#{e})")
+ end
+
+ def load_definitions(defaults, extra)
+ begin
+ fields = YAML.load_file(defaults)
+ rescue Exception => e
+ raise "#{self.class.name}: Bad syntax in definitions file #{defaults}"
+ end
+
+ # Allow the user to augment/override/rename the default fields
+ if extra
+ raise "#{self.class.name}: definitions file #{extra} does not exist" unless File.exists?(extra)
+ begin
+ fields.merge!(YAML.load_file(extra))
+ rescue Exception => e
+ raise "#{self.class.name}: Bad syntax in definitions file #{extra}"
+ end
+ end
+
+ fields
  end

  def uint_field(length, default)
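As a worked example of the millisecond branch in `decode_ipfix` above, the sketch below uses plain `Time` instead of `LogStash::Timestamp` and a made-up sample value; the Micro and Nano branches only change the divisor.

```ruby
require "time"

# flowStartMilliseconds arrives as an absolute count of ms since the UNIX epoch.
raw     = 1_462_806_000_123  # hypothetical sample value
divisor = 1_000              # 'Milli' branch; 'Micro' => 1_000_000, 'Nano' => 1_000_000_000
seconds = raw.to_f / divisor # => 1462806000.123

puts Time.at(seconds).utc.iso8601(3)
# => "2016-05-09T15:00:00.123Z"
```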
@@ -232,8 +480,8 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  end # def uint_field

  def netflow_field_for(type, length)
- if @fields.include?(type)
- field = @fields[type]
+ if @netflow_fields.include?(type)
+ field = @netflow_fields[type].clone
  if field.is_a?(Array)

  field[0] = uint_field(length, field[0]) if field[0].is_a?(Integer)
@@ -259,4 +507,38 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
  nil
  end
  end # def netflow_field_for
+
+ def ipfix_field_for(type, enterprise, length)
+ if @ipfix_fields.include?(enterprise)
+ if @ipfix_fields[enterprise].include?(type)
+ field = @ipfix_fields[enterprise][type].clone
+ else
+ @logger.warn("Unsupported enterprise field", :type => type, :enterprise => enterprise, :length => length)
+ end
+ else
+ @logger.warn("Unsupported enterprise", :enterprise => enterprise)
+ end
+
+ return nil unless field
+
+ if field.is_a?(Array)
+ case field[0]
+ when :skip
+ field += [nil, {:length => length}]
+ when :string
+ field += [{:length => length, :trim_padding => true}]
+ when :uint64
+ field[0] = uint_field(length, 8)
+ when :uint32
+ field[0] = uint_field(length, 4)
+ when :uint16
+ field[0] = uint_field(length, 2)
+ end
+
+ @logger.debug("Definition complete", :field => field)
+ [field]
+ else
+ @logger.warn("Definition should be an array", :field => field)
+ end
+ end
  end # class LogStash::Filters::Netflow
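Finally, a standalone mimic (not the gem's code) of how `ipfix_field_for` reshapes a cloned definition entry for the length advertised in the template. The `uint_field` helper here is a reimplementation consistent with how it is used in this diff (a byte length mapped to a `:uintN` symbol), and the sample entries and 4-byte length are illustrative.

```ruby
# Mimic of the field shaping above; entries and length are illustrative.
def uint_field(length, default)
  # length 0 means "use the default size in bytes"; otherwise size by wire length
  :"uint#{((length > 0) ? length : default) * 8}"
end

entry_counter = [:uint64, :octetDeltaCount] # a counter defined as 64-bit
entry_string  = [:string, :interfaceName]   # a string field

length = 4                                  # the exporter advertises 4 bytes
field = entry_counter.clone
field[0] = uint_field(length, 8)            # :uint64 is narrowed to :uint32
p [field]                                   # => [[:uint32, :octetDeltaCount]]

field = entry_string.clone + [{ :length => length, :trim_padding => true }]
p [field]                                   # string fields gain BinData length/trim options
```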