fluentd 0.14.20 → 0.14.21


checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz: 419af849ffe16bdb83d13106862b94018d71dca1
-  data.tar.gz: 6f1435c4d5f9f1645267bb88f43aa86ee838075e
+  metadata.gz: e0c04447f174a283ac32276dd0dbde42bfabac9b
+  data.tar.gz: dc5da1036653db4373345239dc4486918423e479
 SHA512:
-  metadata.gz: 96ff26acaccd6870648fd36ca991e101058296abfa8585366c0b132d94a19c79ef75214527c026b3719061c943d0d0aeaec45c3bb58655b3f08f7da88d5a6aa6
-  data.tar.gz: a83370feeeabb3f9a3200f5575ce8ba9f4fe0a57349b9fe73dfd3f6b386492993483412883f91214b8903709a86dcacccac658cf663d1734aa37d5b1e4f5fa21
+  metadata.gz: 7015711b0401b911a3242e96bf90f5bd56693e1ccc62565d2cee241403354599a3c29bf487f10989297bfcf41a7d0c90c274396f0a07dc9c338fc26a66769323
+  data.tar.gz: 9cd584791a19c95166e34e325f104cb38918769c836f1779008e77e7d36367ccb786d3fa9b16381d8d9ce680ec22b903b60e9a8e0244f635954d188c6a5e2a6c
data/ChangeLog CHANGED
@@ -1,5 +1,25 @@
 # v0.14
 
+## Release v0.14.21 - 2017/09/07
+
+### New features / Enhancements
+
+* filter_parser: Support record_accessor in key_name
+  https://github.com/fluent/fluentd/pull/1654
+* buffer: Support record_accessor in chunk keys
+  https://github.com/fluent/fluentd/pull/1662
+
+### Bug fixes
+
+* compat_parameters: Support all syslog parser parameters
+  https://github.com/fluent/fluentd/pull/1650
+* filter_record_transformer: Don't create new keys if the original record doesn't have `keep_keys` keys
+  https://github.com/fluent/fluentd/pull/1663
+* in_tail: Fix the error when 'tag *' is configured
+  https://github.com/fluent/fluentd/pull/1664
+* supervisor: Clear previous worker pids when receive kill signals.
+  https://github.com/fluent/fluentd/pull/1683
+
 ## Release v0.14.20 - 2017/07/31
 
 ### New features / Enhancements
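The headline feature of this release is record_accessor support in `filter_parser`'s `key_name`. A minimal config sketch (the tag pattern and field names are hypothetical) based on the syntax exercised by this release's tests:

```aconf
# Parse a string stored in the nested field record["data"]["log"].
# "$.data.log" is the record_accessor dot notation accepted as of
# v0.14.21; a plain top-level key name still works as before.
<filter app.**>
  @type parser
  key_name $.data.log
  <parse>
    @type json
  </parse>
</filter>
```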
data/ENTERPRISE_PROVIDERS.md ADDED
@@ -0,0 +1,43 @@
+# Fluentd Enterprise Providers
+
+[Fluentd](http://www.fluentd.org) is widely adopted in the enterprise and in most of scenarios, production environments requires special support and services. The following document describe the certified service providers and further guidelines to apply to become one.
+
+## Fluentd Providers
+
+The following section lists the companies that provides Support, Consulting Services and Products around Fluentd for the Enterprise needs.
+
+- [Treasure Data](https://fluentd.treasuredata.com)
+
+### Treasure Data
+
+[Treasure Data](https://www.treasuredata.com) is one of the principal sponsors of Fluentd, it provides technical support, consulting services and [Fluentd Enterprise](https://fluentd.treasuredata.com/), a Fluentd-on steroids with enhanced security and connectors for enterprise backends such as [Apache Kafka](http://kafka.apache.org) and [Splunk](http://www.splunk.com) within others.
+
+For more details about Fluentd Enterprise provided by Treasure Data, click [here](https://fluentd.treasuredata.com/).
+
+## Apply to become a Fluentd Service Provider
+
+In order to keep a transparent involvement of companies in Fluentd growth, to be listed as a service provider your company must fulfil the following criteria:
+
+- The company offering the services or products must be fully incorporated.
+- At least two former employees must be contributors of Fluentd ecosystem (core, plugins, documentation, etc).
+- Company participate from Fluentd community activities: meetups, conferences (or webinars) or online communication channels (Slack)
+
+In order to be listed as a service provider, please send an email using your corporate account to the maintainers (listed at bottom) with the following information:
+
+- Company Information
+  - Company Name
+  - Company URL
+  - Country / Location
+- Employees
+  - Name, email and github handle of employees who contribute to Fluentd ecosystem
+  - List of projects or contributions where the company is involved
+- Services
+  - What kind of services around Fluentd do you provide ?
+- Community
+  - List of activities of your company within the Fluentd community in the last 12 months
+  - Are you a member of the [Cloud Native Computing Foundation](http://cncf.io) ?
+
+### Maintainers / Recipients
+
+- Kiyoto Tamura <kiyoto@treasure-data.com>
+- Eduardo Silva <eduardo@treasure-data.com>
data/MAINTAINERS.md CHANGED
@@ -3,3 +3,4 @@
 - Kiyoto Tamura <me@ktamura.com>
 - Kazuki Ohta <kazuki.ohta@gmail.com>
 - Satoshi "Moris" Tagomori <tagomoris@gmail.com>
+- Eduardo Silva <eduardo@treasure-data.com>
data/README.md CHANGED
@@ -2,6 +2,8 @@ Fluentd: Open-Source Log Collector
 ===================================
 
 [<img src="https://travis-ci.org/fluent/fluentd.svg" />](https://travis-ci.org/fluent/fluentd) [![Code Climate](https://codeclimate.com/github/fluent/fluentd/badges/gpa.svg)](https://codeclimate.com/github/fluent/fluentd)
+[![CII Best Practices](https://bestpractices.coreinfrastructure.org/projects/1189/badge)](https://bestpractices.coreinfrastructure.org/projects/1189)
+[![FOSSA Status](https://app.fossa.io/api/projects/git%2Bhttps%3A%2F%2Fgithub.com%2Ffluent%2Ffluentd.svg?type=shield)](https://app.fossa.io/projects/git%2Bhttps%3A%2F%2Fgithub.com%2Ffluent%2Ffluentd?ref=badge_shield)
 
 [Fluentd](http://fluentd.org/) collects events from various data sources and writes them to files, RDBMS, NoSQL, IaaS, SaaS, Hadoop and so on. Fluentd helps you unify your logging infrastructure (Learn more about the [Unified Logging Layer](http://www.fluentd.org/blog/unified-logging-layer)).
 
data/lib/fluent/config/element.rb CHANGED
@@ -76,6 +76,11 @@ module Fluent
       "name:#{@name}, arg:#{@arg}, " + attrs + ", " + @elements.inspect
     end
 
+    # Used by PP and Pry
+    def pretty_print(q)
+      q.text(inspect)
+    end
+
     # This method assumes _o_ is an Element object. Should return false for nil or other object
     def ==(o)
       self.name == o.name && self.arg == o.arg &&
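The `pretty_print` change above makes `pp` output match `inspect`. A standalone sketch of the same pattern (`Thing` is a hypothetical stand-in class, not fluentd's `Element`), mirroring how this release's own test drives it through a `PP` object:

```ruby
require 'pp'

# Delegating #pretty_print to #inspect makes `pp` (and Pry) show the
# same compact representation that `p` / #inspect produces.
class Thing
  def inspect
    "#<Thing custom>"
  end

  # Used by PP and Pry
  def pretty_print(q)
    q.text(inspect)
  end
end

q = PP.new
Thing.new.pretty_print(q)
puts q.output  # => #<Thing custom>
```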
data/lib/fluent/plugin/filter_parser.rb CHANGED
@@ -24,7 +24,7 @@ module Fluent::Plugin
   class ParserFilter < Filter
     Fluent::Plugin.register_filter('parser', self)
 
-    helpers :parser, :compat_parameters
+    helpers :parser, :record_accessor, :compat_parameters
 
     config_param :key_name, :string
     config_param :reserve_data, :bool, default: false
@@ -41,6 +41,7 @@ module Fluent::Plugin
 
       super
 
+      @accessor = record_accessor_create(@key_name)
       @parser = parser_create
     end
 
@@ -48,7 +49,7 @@ module Fluent::Plugin
     REPLACE_CHAR = '?'.freeze
 
     def filter_with_time(tag, time, record)
-      raw_value = record[@key_name]
+      raw_value = @accessor.call(record)
       if raw_value.nil?
         if @emit_invalid_record_to_error
           router.emit_error_event(tag, time, record, ArgumentError.new("#{@key_name} does not exist"))
data/lib/fluent/plugin/filter_record_transformer.rb CHANGED
@@ -28,9 +28,9 @@ module Fluent::Plugin
     Fluent::Plugin.register_filter('record_transformer', self)
 
     desc 'A comma-delimited list of keys to delete.'
-    config_param :remove_keys, :string, default: nil
+    config_param :remove_keys, :array, default: nil
     desc 'A comma-delimited list of keys to keep.'
-    config_param :keep_keys, :string, default: nil
+    config_param :keep_keys, :array, default: nil
     desc 'Create new Hash to transform incoming data'
     config_param :renew_record, :bool, default: false
     desc 'Specify field name of the record to overwrite the time of events. Its value must be unix time.'
@@ -52,13 +52,8 @@ module Fluent::Plugin
         end
       end
 
-      if @remove_keys
-        @remove_keys = @remove_keys.split(',')
-      end
-
       if @keep_keys
         raise Fluent::ConfigError, "`renew_record` must be true to use `keep_keys`" unless @renew_record
-        @keep_keys = @keep_keys.split(',')
       end
 
       placeholder_expander_params = {
@@ -129,7 +124,9 @@ module Fluent::Plugin
       placeholders = @placeholder_expander.prepare_placeholders(placeholder_values)
 
       new_record = @renew_record ? {} : record.dup
-      @keep_keys.each {|k| new_record[k] = record[k]} if @keep_keys and @renew_record
+      @keep_keys.each do |k|
+        new_record[k] = record[k] if record.has_key?(k)
+      end if @keep_keys && @renew_record
       new_record.merge!(expand_placeholders(@map, placeholders))
 
       new_record
@@ -218,7 +215,7 @@ module Fluent::Plugin
     # @param [String] str
     # @param [Boolean] force_stringify the value must be string, used for hash key
     def expand(str, placeholders, force_stringify = false)
-      if @auto_typecast and !force_stringify
+      if @auto_typecast && !force_stringify
        single_placeholder_matched = str.match(/\A(\${[^}]+}|__[A-Z_]+__)\z/)
        if single_placeholder_matched
          log_if_unknown_placeholder($1, placeholders)
@@ -261,9 +258,9 @@ module Fluent::Plugin
     def preprocess_map(value, force_stringify = false)
       new_value = nil
       if value.is_a?(String)
-        if @auto_typecast and !force_stringify
+        if @auto_typecast && !force_stringify
          num_placeholders = value.scan('${').size
-          if num_placeholders == 1 and value.start_with?('${') && value.end_with?('}')
+          if num_placeholders == 1 && value.start_with?('${') && value.end_with?('}')
            new_value = value[2..-2] # ${..} => ..
          end
        end
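The behavioral fix above: with `renew_record true`, a `keep_keys` entry that is absent from the incoming record is now skipped instead of being copied in as a nil-valued key. A minimal sketch (tag pattern and key names are hypothetical):

```aconf
# Only the listed keys survive into the renewed record; as of v0.14.21,
# keys missing from the original record are no longer created at all.
<filter app.**>
  @type record_transformer
  renew_record true
  keep_keys foo,message
</filter>
```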
data/lib/fluent/plugin/in_tail.rb CHANGED
@@ -130,6 +130,7 @@ module Fluent::Plugin
    def configure_tag
      if @tag.index('*')
        @tag_prefix, @tag_suffix = @tag.split('*')
+       @tag_prefix ||= ''
        @tag_suffix ||= ''
      else
        @tag_prefix = nil
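The one-line fix above addresses a bare `tag *`: in Ruby, `'*'.split('*')` returns an empty array, so `@tag_prefix` was left nil and tag generation failed; now both prefix and suffix default to empty strings and the tag comes entirely from the watched file's path. A sketch (the path is hypothetical):

```aconf
# With "tag *", the emitted tag is derived from each file's path.
<source>
  @type tail
  path /var/log/app/*.log
  tag *
  format none
  read_from_head true
</source>
```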
data/lib/fluent/plugin/output.rb CHANGED
@@ -15,6 +15,7 @@
 #
 
 require 'fluent/plugin/base'
+require 'fluent/plugin_helper/record_accessor'
 require 'fluent/log'
 require 'fluent/plugin_id'
 require 'fluent/plugin_helper'
@@ -36,7 +37,7 @@ module Fluent
     helpers_internal :thread, :retry_state
 
     CHUNK_KEY_PATTERN = /^[-_.@a-zA-Z0-9]+$/
-    CHUNK_KEY_PLACEHOLDER_PATTERN = /\$\{[-_.@a-zA-Z0-9]+\}/
+    CHUNK_KEY_PLACEHOLDER_PATTERN = /\$\{[-_.@$a-zA-Z0-9]+\}/
     CHUNK_TAG_PLACEHOLDER_PATTERN = /\$\{(tag(?:\[\d+\])?)\}/
 
     CHUNKING_FIELD_WARN_NUM = 4
@@ -161,7 +162,7 @@ module Fluent
     attr_reader :num_errors, :emit_count, :emit_records, :write_count, :rollback_count
 
     # for tests
-    attr_reader :buffer, :retry, :secondary, :chunk_keys, :chunk_key_time, :chunk_key_tag
+    attr_reader :buffer, :retry, :secondary, :chunk_keys, :chunk_key_accessors, :chunk_key_time, :chunk_key_tag
     attr_accessor :output_enqueue_thread_waiting, :dequeued_chunks, :dequeued_chunks_mutex
     # output_enqueue_thread_waiting: for test of output.rb itself
     attr_accessor :retry_for_error_chunk # if true, error flush will be retried even if under_plugin_development is true
@@ -203,7 +204,7 @@ module Fluent
       @output_flush_threads = nil
 
       @simple_chunking = nil
-      @chunk_keys = @chunk_key_time = @chunk_key_tag = nil
+      @chunk_keys = @chunk_key_accessors = @chunk_key_time = @chunk_key_tag = nil
       @flush_mode = nil
       @timekey_zone = nil
 
@@ -276,8 +277,25 @@ module Fluent
       @chunk_keys = @buffer_config.chunk_keys.dup
       @chunk_key_time = !!@chunk_keys.delete('time')
       @chunk_key_tag = !!@chunk_keys.delete('tag')
-      if @chunk_keys.any?{ |key| key !~ CHUNK_KEY_PATTERN }
+      if @chunk_keys.any? { |key|
+           begin
+             k = Fluent::PluginHelper::RecordAccessor::Accessor.parse_parameter(key)
+             if k.is_a?(String)
+               k !~ CHUNK_KEY_PATTERN
+             else
+               if key.start_with?('$[')
+                 raise Fluent::ConfigError, "in chunk_keys: bracket notation is not allowed"
+               else
+                 false
+               end
+             end
+           rescue => e
+             raise Fluent::ConfigError, "in chunk_keys: #{e.message}"
+           end
+         }
        raise Fluent::ConfigError, "chunk_keys specification includes invalid char"
+      else
+        @chunk_key_accessors = Hash[@chunk_keys.map { |key| [key.to_sym, Fluent::PluginHelper::RecordAccessor::Accessor.new(key)] }]
      end
 
      if @chunk_key_time
@@ -778,7 +796,7 @@ module Fluent
       else
         nil
       end
-      pairs = Hash[@chunk_keys.map{|k| [k.to_sym, record[k]]}]
+      pairs = Hash[@chunk_key_accessors.map { |k, a| [k, a.call(record)] }]
       @buffer.metadata(timekey: timekey, tag: (@chunk_key_tag ? tag : nil), variables: pairs)
     end
   end
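With the changes above, buffer chunk keys also accept record_accessor dot notation (bracket notation such as `$['key']` is rejected at configure time). A sketch mirroring the `key,$.nest.key` case from this release's tests (tag pattern, path, and field names are hypothetical):

```aconf
# Chunk by a top-level key and a nested one; both become ${...}
# placeholders usable in parameters such as path.
<match app.**>
  @type file
  path /var/log/fluent/${key}/${$.nest.key}
  <buffer key,$.nest.key>
    @type memory
  </buffer>
</match>
```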
data/lib/fluent/plugin_helper/compat_parameters.rb CHANGED
@@ -65,7 +65,9 @@ module Fluent
       "label_delimiter" => "label_delimiter", # LabeledTSVParser
       "format_firstline" => "format_firstline", # MultilineParser
       "message_key" => "message_key", # NoneParser
-      "with_priority" => "with_priority", # SyslogParser
+      "with_priority" => "with_priority", # SyslogParser
+      "message_format" => "message_format", # SyslogParser
+      "rfc5424_time_format" => "rfc5424_time_format", # SyslogParser
       # There has been no parsers which can handle timezone in v0.12
     }
 
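The mapping above means the compat_parameters helper now translates all remaining SyslogParser options, so v0.12-style flat configuration like the following sketch (port and tag are hypothetical) is converted into a v0.14 `<parse>` section:

```aconf
# v0.12-style flat parameters, handled by the compat_parameters helper
<source>
  @type syslog
  port 5140
  tag system
  format syslog
  message_format rfc5424
  with_priority true
  rfc5424_time_format %Y-%m-%dT%H:%M:%S.%L%z
</source>
```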
data/lib/fluent/supervisor.rb CHANGED
@@ -173,7 +173,9 @@ module Fluent
 
     def kill_worker
       if config[:worker_pid]
-        config[:worker_pid].each do |pid|
+        pids = config[:worker_pid].clone
+        config[:worker_pid].clear
+        pids.each do |pid|
          if Fluent.windows?
            Process.kill :KILL, pid
          else
data/lib/fluent/version.rb CHANGED
@@ -16,6 +16,6 @@
 
 module Fluent
 
-  VERSION = '0.14.20'
+  VERSION = '0.14.21'
 
 end
data/test/config/test_element.rb CHANGED
@@ -2,6 +2,7 @@ require_relative '../helper'
 require 'fluent/config/element'
 require 'fluent/config/configure_proxy'
 require 'fluent/configurable'
+require 'pp'
 
 class TestConfigElement < ::Test::Unit::TestCase
   def element(name = 'ROOT', arg = '', attrs = {}, elements = [], unused = nil)
@@ -463,4 +464,13 @@ CONF
       assert_false e.for_another_worker?
     end
   end
+
+  sub_test_case '#pretty_print' do
+    test 'prints inspect to pp object' do
+      q = PP.new
+      e = element()
+      e.pretty_print(q)
+      assert_equal e.inspect, q.output
+    end
+  end
 end
data/test/plugin/test_filter_parser.rb CHANGED
@@ -390,6 +390,29 @@ class ParserFilterTest < Test::Unit::TestCase
     assert_equal 'value"ThreeYes!', first[1]['key3']
   end
 
+  def test_filter_with_nested_record
+    d = create_driver(%[
+      key_name $.data.log
+      <parse>
+        @type csv
+        keys key1,key2,key3
+      </parse>
+    ])
+    time = @default_time.to_i
+    d.run do
+      d.feed(@tag, time, {'data' => {'log' => 'value1,"value2","value""ThreeYes!"'}, 'xxx' => 'x', 'yyy' => 'y'})
+    end
+    filtered = d.filtered
+    assert_equal 1, filtered.length
+
+    first = filtered[0]
+    assert_equal time, first[0]
+    assert_nil first[1]['data']
+    assert_equal 'value1', first[1]['key1']
+    assert_equal 'value2', first[1]['key2']
+    assert_equal 'value"ThreeYes!', first[1]['key3']
+  end
+
   CONFIG_HASH_VALUE_FIELD = %[
     key_name data
     hash_value_field parsed
data/test/plugin/test_filter_record_transformer.rb CHANGED
@@ -137,6 +137,18 @@ class RecordTransformerFilterTest < Test::Unit::TestCase
     end
   end
 
+  test 'keep_keys that are not present in the original record should not be included in the result record' do
+    config = %[renew_record true\nkeep_keys foo, bar, baz, message]
+    msgs = ['1', '2', nil]
+    filtered = filter(config, msgs)
+    filtered.each_with_index do |(_t, r), i|
+      assert_equal('bar', r['foo'])
+      assert_equal(msgs[i], r['message'])
+      assert_equal(false, r.has_key?('bar'))
+      assert_equal(false, r.has_key?('baz'))
+    end
+  end
+
   test 'enable_ruby' do
     config = %[
       enable_ruby yes
data/test/plugin/test_in_tail.rb CHANGED
@@ -972,6 +972,20 @@ class TailInputTest < Test::Unit::TestCase
     plugin.receive_lines(['foo', 'bar'], DummyWatcher.new('foo.bar.log'))
   end
 
+  def test_tag_with_only_star
+    config = config_element("", "", {
+      "tag" => "*",
+      "path" => "test/plugin/*/%Y/%m/%Y%m%d-%H%M%S.log,test/plugin/data/log/**/*.log",
+      "format" => "none",
+      "read_from_head" => true
+    })
+    d = create_driver(config, false)
+    d.run {}
+    plugin = d.instance
+    mock(plugin.router).emit_stream('foo.bar.log', anything).once
+    plugin.receive_lines(['foo', 'bar'], DummyWatcher.new('foo.bar.log'))
+  end
+
   def test_tag_prefix
     config = config_element("", "", {
       "tag" => "pre.*",
data/test/plugin/test_output.rb CHANGED
@@ -267,6 +267,18 @@ class OutputTest < Test::Unit::TestCase
     assert_equal "/mypath/%Y/%m/%d/%H-%M/${tag}/${tag[1]}/${tag[2]}/value1/value2/tail", @i.extract_placeholders(tmpl, m)
   end
 
+  test '#extract_placeholders can extract nested variables if variables are configured with dot notation' do
+    @i.configure(config_element('ROOT', '', {}, [config_element('buffer', 'key,$.nest.key', {})]))
+    assert !@i.chunk_key_time
+    assert !@i.chunk_key_tag
+    assert_equal ['key','$.nest.key'], @i.chunk_keys
+    tmpl = "/mypath/%Y/%m/%d/%H-%M/${tag}/${tag[1]}/${tag[2]}/${key}/${$.nest.key}/tail"
+    t = event_time('2016-04-11 20:30:00 +0900')
+    v = {:key => "value1", :"$.nest.key" => "value2"}
+    m = create_metadata(timekey: t, tag: 'fluentd.test.output', variables: v)
+    assert_equal "/mypath/%Y/%m/%d/%H-%M/${tag}/${tag[1]}/${tag[2]}/value1/value2/tail", @i.extract_placeholders(tmpl, m)
+  end
+
   test '#extract_placeholders can extract all chunk keys if configured' do
     @i.configure(config_element('ROOT', '', {}, [config_element('buffer', 'time,tag,key1,key2', {'timekey' => 60*30, 'timekey_zone' => "+0900"})]))
     assert @i.chunk_key_time
@@ -493,11 +505,21 @@ class OutputTest < Test::Unit::TestCase
     assert_equal ['.hidden', '0001', '@timestamp', 'a_key', 'my-domain'], @i.get_placeholders_keys("http://${my-domain}/${.hidden}/${0001}/${a_key}?timestamp=${@timestamp}")
   end
 
+  data('include space' => 'ke y',
+       'bracket notation' => "$['key']",
+       'invalid notation' => "$.ke y")
+  test 'configure checks invalid chunk keys' do |chunk_keys|
+    i = create_output(:buffered)
+    assert_raise Fluent::ConfigError do
+      i.configure(config_element('ROOT' , '', {}, [config_element('buffer', chunk_keys)]))
+    end
+  end
+
   test '#metadata returns object which contains tag/timekey/variables from records as specified in configuration' do
     tag = 'test.output'
     time = event_time('2016-04-12 15:31:23 -0700')
     timekey = event_time('2016-04-12 15:00:00 -0700')
-    record = {"key1" => "value1", "num1" => 1, "message" => "my message"}
+    record = {"key1" => "value1", "num1" => 1, "message" => "my message", "nest" => {"key" => "nested value"}}
 
     i1 = create_output(:buffered)
     i1.configure(config_element('ROOT','',{},[config_element('buffer', '')]))
@@ -530,6 +552,10 @@ class OutputTest < Test::Unit::TestCase
     i8 = create_output(:buffered)
     i8.configure(config_element('ROOT','',{},[config_element('buffer', 'time,tag,key1', {"timekey" => 3600, "timekey_zone" => "-0700"})]))
     assert_equal create_metadata(timekey: timekey, tag: tag, variables: {key1: "value1"}), i8.metadata(tag, time, record)
+
+    i9 = create_output(:buffered)
+    i9.configure(config_element('ROOT','',{},[config_element('buffer', 'key1,$.nest.key', {})]))
+    assert_equal create_metadata(variables: {:key1 => "value1", :"$.nest.key" => 'nested value'}), i9.metadata(tag, time, record)
   end
 
   test '#emit calls #process via #emit_sync for non-buffered output' do
data/test/plugin_helper/test_compat_parameters.rb CHANGED
@@ -328,4 +328,26 @@ class CompatParameterTest < Test::Unit::TestCase
       # TODO:
     end
   end
+
+  sub_test_case 'parser plugins' do
+    test 'syslog parser parameters' do
+      hash = {
+        'format' => 'syslog',
+        'message_format' => 'rfc5424',
+        'with_priority' => 'true',
+        'rfc5424_time_format' => '%Y'
+      }
+      conf = config_element('ROOT', '', hash)
+      @i = DummyI0.new
+      @i.configure(conf)
+      @i.start
+      @i.after_start
+
+      parser = @i.parser
+      assert_kind_of(Fluent::Plugin::SyslogParser, parser)
+      assert_equal :rfc5424, parser.message_format
+      assert_equal true, parser.with_priority
+      assert_equal '%Y', parser.rfc5424_time_format
+    end
+  end
 end
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: fluentd
 version: !ruby/object:Gem::Version
-  version: 0.14.20
+  version: 0.14.21
 platform: ruby
 authors:
 - Sadayuki Furuhashi
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2017-07-31 00:00:00.000000000 Z
+date: 2017-09-07 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: msgpack
@@ -322,6 +322,7 @@ files:
 - CONTRIBUTING.md
 - COPYING
 - ChangeLog
+- ENTERPRISE_PROVIDERS.md
 - Gemfile
 - MAINTAINERS.md
 - README.md