logstash-codec-netflow 3.0.0 → 3.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz: 95e7e7d16a47ec1ae4535979d9792b6051e6d36f
-  data.tar.gz: b101a6c29ebc0fa9d65ade94fb14160cd7a1df33
+  metadata.gz: 39f14894de7025536b3188dc478d47f95f63fee0
+  data.tar.gz: 10b03be93979be247036c02dc77e8dcf04e9bfed
 SHA512:
-  metadata.gz: 776e788d36ccdb7c30844b7f4b7925eb7c4fc5a5c40371b3330376bb00b659a4338a312160634813dfe4d6f14d98fbf4f9d83c9da3533ec954c8460b749a7e04
-  data.tar.gz: 135bdba032f43b7855c4bf63fcf3468b40eb8faaefbb70ef9ada0ff63e2561cc230dcc87671d7f92afac0d702b1b406cccf46192ac4f12a15a351c46bb897f80
+  metadata.gz: 4e834252bcf93d1483a7d65bb2327420401a541531861a4bc3daeebda3e082ff5d359041af2d39b31f224e9a93a2aca8ec291f30a939b66d4cf8c71c80d798f2
+  data.tar.gz: 3f3bd7d64eee91851be697ace496edce09598d02e3799f4f1fb723224aa1c6a6fcb07dde053798bc35541a2d215276a6c1e2375045614a7cb4a467b2fc004d03
data/CHANGELOG.md CHANGED
@@ -1,5 +1,14 @@
+## 3.1.0
+ - Added IPFIX support
+
+## 3.0.1
+ - Republish all the gems under jruby.
+
 ## 3.0.0
 - Update the plugin to the version 2.0 of the plugin api, this change is required for Logstash 5.0 compatibility. See https://github.com/elastic/logstash/issues/5141
+- Fixed exception if Netflow data contains MAC addresses (issue #26, issue #34)
+- Fixed exceptions when receiving invalid Netflow v5 and v9 data (issue #17, issue #18)
+- Fixed decoding Netflow templates from multiple (non-identical) exporters
+- Add support for Cisco ASA fields
+- Add support for Netflow 9 options template with scope fields
 # 2.0.5
 - Depend on logstash-core-plugin-api instead of logstash-core, removing the need to mass update plugins on major releases of logstash
 # 2.0.4
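
The 3.0.0 entry above notes that template decoding from multiple (non-identical) exporters was fixed; the codec does this by keying its template cache per exporter rather than by template id alone. A minimal sketch of that keying scheme (a plain Hash stands in for the codec's TTL-expiring Vash cache; `template_key` is an illustrative name, not the plugin's API):

```ruby
# Templates are cached under "<source_id>|<template_id>", so two exporters
# that reuse the same template id keep separate field definitions instead
# of clobbering each other.
def template_key(source_id, template_id)
  "#{source_id}|#{template_id}"
end

templates = {}
templates[template_key(1, 260)] = [:uint32, :in_bytes]  # exporter 1
templates[template_key(2, 260)] = [:uint64, :in_bytes]  # exporter 2, same template id

puts templates.size   # => 2
```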
data/CONTRIBUTORS CHANGED
@@ -3,12 +3,24 @@ reports, or in general have helped logstash along its way.

 Contributors:
 * Aaron Mildenstein (untergeek)
+* Adam Kaminski (thimslugga)
 * Colin Surprenant (colinsurprenant)
 * Jordan Sissel (jordansissel)
+* Jorrit Folmer (jorritfolmer)
 * Matt Dainty (bodgit)
+* Paul Warren (pwarren)
 * Pier-Hugues Pellerin (ph)
+* Pulkit Agrawal (propulkit)
 * Richard Pijnenburg (electrical)
+* Salvador Ferrer (salva-ferrer)
+* Will Rigby (wrigby)
+* Rojuinex
 * debadair
+* hkshirish
+* jstopinsek
+
+Maintainer:
+* Jorrit Folmer (jorritfolmer)

 Note: If you've sent us patches, bug reports, or otherwise contributed to
 Logstash, and you aren't on the list above and want to be, please let us know
@@ -3,7 +3,59 @@ require "logstash/filters/base"
 require "logstash/namespace"
 require "logstash/timestamp"

-# The "netflow" codec is for decoding Netflow v5/v9 flows.
+# The "netflow" codec is used for decoding Netflow v5/v9/v10 (IPFIX) flows.
+#
+# ==== Supported Netflow/IPFIX exporters
+#
+# The following Netflow/IPFIX exporters are known to work with the most recent version of the netflow codec:
+#
+# [cols="6,^2,^2,^2,12",options="header"]
+# |===========================================================================================
+# |Netflow exporter | v5 | v9 | IPFIX | Remarks
+# |Softflowd        | y  | y  | y     | IPFIX supported in https://github.com/djmdjm/softflowd
+# |nProbe           | y  | y  | y     |
+# |ipt_NETFLOW      | y  | y  | y     |
+# |Cisco ASA        |    | y  |       |
+# |Cisco IOS 12.x   |    | y  |       |
+# |fprobe           | y  |    |       |
+# |===========================================================================================
+#
+# ==== Usage
+#
+# Example Logstash configuration:
+#
+# [source]
+# -----------------------------
+# input {
+#   udp {
+#     host => localhost
+#     port => 2055
+#     codec => netflow {
+#       versions => [5, 9]
+#     }
+#     type => netflow
+#   }
+#   udp {
+#     host => localhost
+#     port => 4739
+#     codec => netflow {
+#       versions => [10]
+#       target => ipfix
+#     }
+#     type => ipfix
+#   }
+#   tcp {
+#     host => localhost
+#     port => 4739
+#     codec => netflow {
+#       versions => [10]
+#       target => ipfix
+#     }
+#     type => ipfix
+#   }
+# }
+# -----------------------------
+
 class LogStash::Codecs::Netflow < LogStash::Codecs::Base
   config_name "netflow"

@@ -14,7 +66,7 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
   config :target, :validate => :string, :default => "netflow"

   # Specify which Netflow versions you will accept.
-  config :versions, :validate => :array, :default => [5, 9]
+  config :versions, :validate => :array, :default => [5, 9, 10]

   # Override YAML file containing Netflow field definitions
   #
@@ -31,7 +83,25 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
   #     - :skip
   #
   # See <https://github.com/logstash-plugins/logstash-codec-netflow/blob/master/lib/logstash/codecs/netflow/netflow.yaml> for the base set.
-  config :definitions, :validate => :path
+  config :netflow_definitions, :validate => :path
+
+  # Override YAML file containing IPFIX field definitions
+  #
+  # Very similar to the Netflow version except there is a top level Private
+  # Enterprise Number (PEN) key added:
+  #
+  # ---
+  # pen:
+  #   id:
+  #     - :uintN or :ip4_addr or :ip6_addr or :mac_addr or :string
+  #     - :name
+  #   id:
+  #     - :skip
+  #
+  # There is an implicit PEN 0 for the standard fields.
+  #
+  # See <https://github.com/logstash-plugins/logstash-codec-netflow/blob/master/lib/logstash/codecs/netflow/ipfix.yaml> for the base set.
+  config :ipfix_definitions, :validate => :path

   NETFLOW5_FIELDS = ['version', 'flow_seq_num', 'engine_type', 'engine_id', 'sampling_algorithm', 'sampling_interval', 'flow_records']
   NETFLOW9_FIELDS = ['version', 'flow_seq_num']
@@ -42,6 +112,7 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
     4 => :scope_netflow_cache,
     5 => :scope_template,
   }
+  IPFIX_FIELDS = ['version']
   SWITCHED = /_switched$/
   FLOWSET_ID = "flowset_id"

@@ -52,26 +123,16 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base

   def register
     require "logstash/codecs/netflow/util"
-    @templates = Vash.new()
+    @netflow_templates = Vash.new()
+    @ipfix_templates = Vash.new()

     # Path to default Netflow v9 field definitions
     filename = ::File.expand_path('netflow/netflow.yaml', ::File.dirname(__FILE__))
+    @netflow_fields = load_definitions(filename, @netflow_definitions)

-    begin
-      @fields = YAML.load_file(filename)
-    rescue Exception => e
-      raise "#{self.class.name}: Bad syntax in definitions file #{filename}"
-    end
-
-    # Allow the user to augment/override/rename the supported Netflow fields
-    if @definitions
-      raise "#{self.class.name}: definitions file #{@definitions} does not exists" unless File.exists?(@definitions)
-      begin
-        @fields.merge!(YAML.load_file(@definitions))
-      rescue Exception => e
-        raise "#{self.class.name}: Bad syntax in definitions file #{@definitions}"
-      end
-    end
+    # Path to default IPFIX field definitions
+    filename = ::File.expand_path('netflow/ipfix.yaml', ::File.dirname(__FILE__))
+    @ipfix_fields = load_definitions(filename, @ipfix_definitions)
   end # def register

   def decode(payload, metadata = nil, &block)
@@ -96,6 +157,11 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
         decode_netflow9(flowset, record).each{|event| yield(event)}
       end
     end
+    elsif header.version == 10
+      flowset = IpfixPDU.read(payload)
+      flowset.records.each do |record|
+        decode_ipfix(flowset, record).each { |event| yield(event) }
+      end
     else
       @logger.warn("Unsupported Netflow version v#{header.version}")
     end
@@ -163,9 +229,9 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
         else
           key = "#{flowset.source_id}|#{template.template_id}"
         end
-        @templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
+        @netflow_templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
         # Purge any expired templates
-        @templates.cleanup!
+        @netflow_templates.cleanup!
       end
     end
   when 1
@@ -188,9 +254,9 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
        else
          key = "#{flowset.source_id}|#{template.template_id}"
        end
-        @templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
+        @netflow_templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
         # Purge any expired templates
-        @templates.cleanup!
+        @netflow_templates.cleanup!
       end
     end
   when 256..65535
@@ -201,7 +267,7 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
     else
       key = "#{flowset.source_id}|#{record.flowset_id}"
     end
-    template = @templates[key]
+    template = @netflow_templates[key]

     unless template
       #@logger.warn("No matching template for flow id #{record.flowset_id} from #{event["source"]}")
258
324
  @logger.warn("Invalid netflow packet received (#{e})")
259
325
  end
260
326
 
327
+ def decode_ipfix(flowset, record)
328
+ events = []
329
+
330
+ case record.flowset_id
331
+ when 2
332
+ # Template flowset
333
+ record.flowset_data.templates.each do |template|
334
+ catch (:field) do
335
+ fields = []
336
+ template.fields.each do |field|
337
+ field_type = field.field_type
338
+ field_length = field.field_length
339
+ enterprise_id = field.enterprise ? field.enterprise_id : 0
340
+
341
+ if field.field_length == 0xffff
342
+ # FIXME
343
+ @logger.warn("Cowardly refusing to deal with variable length encoded field", :type => field_type, :enterprise => enterprise_id)
344
+ throw :field
345
+ end
346
+
347
+ if enterprise_id == 0
348
+ case field_type
349
+ when 291, 292, 293
350
+ # FIXME
351
+ @logger.warn("Cowardly refusing to deal with complex data types", :type => field_type, :enterprise => enterprise_id)
352
+ throw :field
353
+ end
354
+ end
355
+
356
+ entry = ipfix_field_for(field_type, enterprise_id, field.field_length)
357
+ throw :field unless entry
358
+ fields += entry
359
+ end
360
+ # FIXME Source IP address required in key
361
+ key = "#{flowset.observation_domain_id}|#{template.template_id}"
362
+ @ipfix_templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
363
+ # Purge any expired templates
364
+ @ipfix_templates.cleanup!
365
+ end
366
+ end
367
+ when 3
368
+ # Options template flowset
369
+ record.flowset_data.templates.each do |template|
370
+ catch (:field) do
371
+ fields = []
372
+ (template.scope_fields.to_ary + template.option_fields.to_ary).each do |field|
373
+ field_type = field.field_type
374
+ field_length = field.field_length
375
+ enterprise_id = field.enterprise ? field.enterprise_id : 0
376
+
377
+ if field.field_length == 0xffff
378
+ # FIXME
379
+ @logger.warn("Cowardly refusing to deal with variable length encoded field", :type => field_type, :enterprise => enterprise_id)
380
+ throw :field
381
+ end
382
+
383
+ if enterprise_id == 0
384
+ case field_type
385
+ when 291, 292, 293
386
+ # FIXME
387
+ @logger.warn("Cowardly refusing to deal with complex data types", :type => field_type, :enterprise => enterprise_id)
388
+ throw :field
389
+ end
390
+ end
391
+
392
+ entry = ipfix_field_for(field_type, enterprise_id, field.field_length)
393
+ throw :field unless entry
394
+ fields += entry
395
+ end
396
+ # FIXME Source IP address required in key
397
+ key = "#{flowset.observation_domain_id}|#{template.template_id}"
398
+ @ipfix_templates[key, @cache_ttl] = BinData::Struct.new(:endian => :big, :fields => fields)
399
+ # Purge any expired templates
400
+ @ipfix_templates.cleanup!
401
+ end
402
+ end
403
+ when 256..65535
404
+ # Data flowset
405
+ key = "#{flowset.observation_domain_id}|#{record.flowset_id}"
406
+ template = @ipfix_templates[key]
407
+
408
+ unless template
409
+ @logger.warn("No matching template for flow id #{record.flowset_id}")
410
+ next
411
+ end
412
+
413
+ array = BinData::Array.new(:type => template, :read_until => :eof)
414
+ records = array.read(record.flowset_data)
415
+
416
+ records.each do |r|
417
+ event = {
418
+ LogStash::Event::TIMESTAMP => LogStash::Timestamp.at(flowset.unix_sec),
419
+ @target => {}
420
+ }
421
+
422
+ IPFIX_FIELDS.each do |f|
423
+ event[@target][f] = flowset[f].snapshot
424
+ end
425
+
426
+ r.each_pair do |k, v|
427
+ case k.to_s
428
+ when /^flow(?:Start|End)Seconds$/
429
+ event[@target][k.to_s] = LogStash::Timestamp.at(v.snapshot).to_iso8601
430
+ when /^flow(?:Start|End)(Milli|Micro|Nano)seconds$/
431
+ divisor =
432
+ case $1
433
+ when 'Milli'
434
+ 1_000
435
+ when 'Micro'
436
+ 1_000_000
437
+ when 'Nano'
438
+ 1_000_000_000
439
+ end
440
+ event[@target][k.to_s] = LogStash::Timestamp.at(v.snapshot.to_f / divisor).to_iso8601
441
+ else
442
+ event[@target][k.to_s] = v.snapshot
443
+ end
444
+ end
445
+
446
+ events << LogStash::Event.new(event)
447
+ end
448
+ else
449
+ @logger.warn("Unsupported flowset id #{record.flowset_id}")
450
+ end
451
+
452
+ events
453
+ rescue BinData::ValidityError => e
454
+ @logger.warn("Invalid IPFIX packet received (#{e})")
455
+ end
456
+
457
+ def load_definitions(defaults, extra)
458
+ begin
459
+ fields = YAML.load_file(defaults)
460
+ rescue Exception => e
461
+ raise "#{self.class.name}: Bad syntax in definitions file #{defaults}"
462
+ end
463
+
464
+ # Allow the user to augment/override/rename the default fields
465
+ if extra
466
+ raise "#{self.class.name}: definitions file #{extra} does not exist" unless File.exists?(extra)
467
+ begin
468
+ fields.merge!(YAML.load_file(extra))
469
+ rescue Exception => e
470
+ raise "#{self.class.name}: Bad syntax in definitions file #{extra}"
471
+ end
472
+ end
473
+
474
+ fields
475
+ end
476
+
261
477
  def uint_field(length, default)
262
478
  # If length is 4, return :uint32, etc. and use default if length is 0
263
479
  ("uint" + (((length > 0) ? length : default) * 8).to_s).to_sym
264
480
  end # def uint_field
265
481
 
266
482
  def netflow_field_for(type, length)
267
- if @fields.include?(type)
268
- field = @fields[type].clone
483
+ if @netflow_fields.include?(type)
484
+ field = @netflow_fields[type].clone
269
485
  if field.is_a?(Array)
270
486
 
271
487
  field[0] = uint_field(length, field[0]) if field[0].is_a?(Integer)
@@ -291,4 +507,38 @@ class LogStash::Codecs::Netflow < LogStash::Codecs::Base
       nil
     end
   end # def netflow_field_for
+
+  def ipfix_field_for(type, enterprise, length)
+    if @ipfix_fields.include?(enterprise)
+      if @ipfix_fields[enterprise].include?(type)
+        field = @ipfix_fields[enterprise][type].clone
+      else
+        @logger.warn("Unsupported enterprise field", :type => type, :enterprise => enterprise, :length => length)
+      end
+    else
+      @logger.warn("Unsupported enterprise", :enterprise => enterprise)
+    end
+
+    return nil unless field
+
+    if field.is_a?(Array)
+      case field[0]
+      when :skip
+        field += [nil, {:length => length}]
+      when :string
+        field += [{:length => length, :trim_padding => true}]
+      when :uint64
+        field[0] = uint_field(length, 8)
+      when :uint32
+        field[0] = uint_field(length, 4)
+      when :uint16
+        field[0] = uint_field(length, 2)
+      end
+
+      @logger.debug("Definition complete", :field => field)
+      [field]
+    else
+      @logger.warn("Definition should be an array", :field => field)
+    end
+  end
 end # class LogStash::Filters::Netflow
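
The `uint_field` helper above turns a field's wire length in bytes into a BinData type symbol, and the IPFIX data path scales `flowStart`/`flowEnd` sub-second timestamps down to seconds before building a `LogStash::Timestamp`. Both pieces are small enough to check in isolation; this is a sketch mirroring the diff, not the plugin's public API:

```ruby
# Mirrors uint_field from the diff: length in bytes -> BinData symbol,
# falling back to `default` when the template reports a zero length.
def uint_field(length, default)
  ("uint" + (((length > 0) ? length : default) * 8).to_s).to_sym
end

# Mirrors the divisor table used for flow(Start|End)(Milli|Micro|Nano)seconds.
DIVISOR = { 'Milli' => 1_000, 'Micro' => 1_000_000, 'Nano' => 1_000_000_000 }

puts uint_field(4, 8)   # => uint32
puts uint_field(0, 8)   # => uint64 (zero length falls back to default of 8 bytes)
puts 1_465_000_000_500 / DIVISOR['Milli'].to_f   # => 1465000000.5 (epoch seconds)
```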
@@ -0,0 +1,77 @@
+#!/usr/bin/env ruby
+
+require 'open-uri'
+require 'csv'
+require 'yaml'
+
+# Convert IANA types to those used by BinData or created by ourselves
+def iana2bindata(type)
+  case type
+  when /^unsigned(\d+)$/
+    return 'uint' + $1
+  when /^signed(\d+)$/
+    return 'int' + $1
+  when 'float32'
+    return 'float'
+  when 'float64'
+    return 'double'
+  when 'ipv4Address'
+    return 'ip4_addr'
+  when 'ipv6Address'
+    return 'ip6_addr'
+  when 'macAddress'
+    return 'mac_addr'
+  when 'octetArray', 'string'
+    return 'string'
+  when 'dateTimeSeconds'
+    return 'uint32'
+  when 'dateTimeMilliseconds', 'dateTimeMicroseconds', 'dateTimeNanoseconds'
+    return 'uint64'
+  when 'boolean'
+    return 'uint8'
+  when 'basicList', 'subTemplateList', 'subTemplateMultiList'
+    return 'skip'
+  else
+    raise "Unknown type #{type}"
+  end
+end
+
+def iana2hash(url)
+  fields = { 0 => {} }
+
+  # Read in IANA-registered Information Elements (PEN 0)
+  CSV.new(open(url), :headers => :first_row, :converters => :numeric).each do |line|
+    # If it's not a Fixnum it's something like 'x-y' used to mark reserved blocks
+    next if line['ElementID'].class != Fixnum
+
+    # Blacklisted ID's
+    next if [0].include?(line['ElementID'])
+
+    # Skip any elements with no name
+    next unless line['Name'] and line['Data Type']
+
+    fields[0][line['ElementID']] = [iana2bindata(line['Data Type']).to_sym]
+    if fields[0][line['ElementID']][0] != :skip
+      fields[0][line['ElementID']] << line['Name'].to_sym
+    end
+  end
+
+  # Overrides
+  fields[0][210][0] = :skip # 210 is PaddingOctets so skip them properly
+  fields[0][210].delete_at(1)
+
+  # Generate the reverse PEN (PEN 29305)
+  reversed = fields[0].reject { |k|
+    # Excluded according to RFC 5103
+    [40,41,42,130,131,137,145,148,149,163,164,165,166,167,168,173,210,211,212,213,214,215,216,217,239].include?(k)
+  }.map { |k,v|
+    [k, v.size > 1 ? [v[0], ('reverse' + v[1].to_s.slice(0,1).capitalize + v[1].to_s.slice(1..-1)).to_sym] : [v[0]]]
+  }
+  fields[29305] = Hash[reversed]
+
+  return fields
+end
+
+ipfix_fields = iana2hash('http://www.iana.org/assignments/ipfix/ipfix-information-elements.csv')
+
+puts YAML.dump(ipfix_fields)
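
The reverse-PEN mapping in the script above builds the RFC 5103 "reverse" element names (PEN 29305) by prepending `reverse` and upper-casing the first letter of the original name. That transformation is easy to verify in isolation; `reverse_name` is our illustrative helper name, not one defined in the script:

```ruby
# Sketch of the name transformation used when generating PEN 29305 above:
# 'reverse' + capitalized first letter + rest of the element name, as a symbol.
def reverse_name(name)
  ('reverse' + name.to_s.slice(0, 1).capitalize + name.to_s.slice(1..-1)).to_sym
end

puts reverse_name(:octetDeltaCount)    # => reverseOctetDeltaCount
puts reverse_name(:sourceIPv4Address)  # => reverseSourceIPv4Address
```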