fluent-plugin-memcached 0.1.1 → 0.2.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: 369bfe5ecc11048b67015f77bcfe16e78a033aae
- data.tar.gz: 00a867ad20989e5ef7c1c1134a87813de1bd52ea
+ metadata.gz: 76053cd2a49aafb9f108310579ddd20c6b75cc12
+ data.tar.gz: 391f5f6286dab6655e725ded9c7da27cdac57fe9
  SHA512:
- metadata.gz: 916fe9ccbfca3577f64599d285b28a4be8ed1e2efdf6054d427a6dee870ae1a052c94a2e6fd35eda25a2302bf69a119a3ffb01fd58ab804d9304e865dbe7c386
- data.tar.gz: 949d12468856097b688f26e07cc9131171bccce332d4bfd0a436909ce0a8bb990e95c22b92e8c2f995a3062b571a3f49528e585073124374fa510cdf0c7195ea
+ metadata.gz: c872f7ca9bb0b5aa884f0de84a99cd55c2124a61b5e9a7598e01be5f7489549f38a3fd272569f0c55d5e31ff48168dfac94d4cdf208e6e931bb553e5c7ec7658
+ data.tar.gz: e338d70a15e5ca4e0588d1f4395f9736d384e5551dec16bb1638878d7a71b24b81f3df59303074152d3610bfdeeb44a23f76e1e264c30509828ecdd255c46723
data/README.md CHANGED
@@ -5,32 +5,151 @@
 
  Send your logs to Memcached.
 
+ ## Requirements
+
+ | fluent-plugin-memcached | fluentd | ruby |
+ |-------------------------|---------|------|
+ | >= 0.1.0 | >= v0.14.0 | >= 2.1 |
+ | < 0.1.0 | >= v0.12.0 | >= 1.9 |
+
  ## Installation
 
- ```sh
+ ```console
  $ gem install fluent-plugin-memcached
  ```
 
- ## Usage
+ ## Configuration
 
- In your Fluentd configuration, use `type memcached`.
- Default values would look like this:
+ **NOTE: Version 0.2.0 includes breaking configuration changes.** Please see [here](#for-previous-versions) if you use v0.1.1 or earlier.
+
+ In your Fluentd configuration, use `@type memcached`.
 
  ```
  <match dummy>
- type memcached
- host localhost
- port 11211
- increment false
- # value_separater " "
+ @type memcached
+ host localhost # Optional, default: localhost
+ port 11211 # Optional, default: 11211
+
+ key id # Required, the name of the field whose value is used as the memcached key
+ include_key false # Optional, default: false
+ increment false # Optional, default: false
+
+ format csv # Optional, default: csv
+ fields field1,field2 # Required, the names of the fields whose values are stored in memcached
+ delimiter " " # Optional, default: " "
+ force_quotes false # Optional, default: false
+ </match>
+ ```
 
+ ### Use cases
+
+ The following examples show what is stored when this input record arrives.
+
+ input: `{"id" => "key1", "field1" => "value1", "field2" => "value2", "field_incr" => "1"}`
+
+ #### To store data as CSV
+
+ ```
+ <match dummy>
+ @type memcached
+ key id
+ fields field1,field2
+ delimiter ,
  </match>
  ```
 
- To store values as json, like this:
+ The stored result is:
+
+ - key: `key1`
+ - value: `value1,value2`
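To make the CSV mapping above concrete, here is a minimal plain-Ruby sketch (an illustration of the behavior only, not the plugin's actual code; the variable names are invented):

```ruby
# Sketch of the CSV use case: the configured `key` field is removed from
# the record (include_key defaults to false) and the configured `fields`
# are joined with the delimiter to form the stored value.
record = { 'id' => 'key1', 'field1' => 'value1', 'field2' => 'value2', 'field_incr' => '1' }

key_name  = 'id'                # key id
fields    = %w[field1 field2]   # fields field1,field2
delimiter = ','                 # delimiter ,

key   = record.delete(key_name)
value = record.values_at(*fields).join(delimiter)

puts "#{key} => #{value}"   # key1 => value1,value2
```

Fields not listed in `fields` (such as `field_incr` here) simply do not appear in the stored value.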
+
+ #### To store data as JSON
+
+ ```
+ <match dummy>
+ @type memcached
+ key id
+ fields field1,field2
+ format json
+ </match>
+ ```
+
+ The stored result is:
+
+ - key: `key1`
+ - value: `{"field1":"value1","field2":"value2"}`
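The JSON case can be sketched the same way (again a hypothetical illustration, not the plugin's code):

```ruby
require 'json'

# Sketch of the JSON use case: the key field is extracted (include_key
# defaults to false) and the configured fields are serialized as a JSON
# object that becomes the stored value.
record = { 'id' => 'key1', 'field1' => 'value1', 'field2' => 'value2' }

key   = record.delete('id')                       # key id
value = record.slice('field1', 'field2').to_json  # fields field1,field2

puts "#{key} => #{value}"   # key1 => {"field1":"value1","field2":"value2"}
```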
+
+ #### To store data as a single incremental value
+
+ ```
+ <match dummy>
+ @type memcached
+ key id
+ increment true
+ format single_value
+ message_key field_incr
+ </match>
+ ```
+
+ The stored result is:
+
+ - key: `key1`
+ - value: `1`
+
+ When the following input arrives next,
+
+ input: `{"id" => "key1", "field1" => "value3", "field2" => "value4", "field_incr" => "2"}`
+
+ the stored result becomes:
+
+ - key: `key1`
+ - value: `3`
+
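Memcached's incr command drives the accumulation shown above. A minimal in-memory stand-in (hypothetical class; a real deployment uses the Dalli client against a running memcached server) behaves like this:

```ruby
# Hypothetical stand-in for memcached's incr semantics: create the counter
# with `initial` when the key is absent, otherwise add `amount` to it.
class FakeMemcached
  def initialize
    @store = {}
  end

  def incr(key, amount, initial)
    @store[key] = @store.key?(key) ? @store[key] + amount : initial
  end

  def get(key)
    @store[key]
  end
end

cache = FakeMemcached.new
cache.incr('key1', 1, 1)   # first record:  field_incr = 1
cache.incr('key1', 2, 2)   # second record: field_incr = 2
puts cache.get('key1')     # 3
```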
+ ### Fluentd v0.14 style
+
+ When using v0.14 style configuration, you can choose three different types of buffer behavior:
+
+ #### Simple Buffered Output
+
+ ```
+ <match dummy>
+ ...
+ <buffer>
+ @type memory
+ </buffer>
+ </match>
+ ```
+
+ #### Tag Separated Buffered Output
+
+ ```
+ <match dummy>
+ ...
+ <buffer tag>
+ @type memory
+ </buffer>
+ </match>
+ ```
+
+ #### Time Sliced Buffered Output
+
+ ```
+ <match dummy>
+ ...
+ <buffer tag, time>
+ @type memory
+ timekey 3600 # for 1 hour
+ </buffer>
+ </match>
+ ```
+
+ ### For previous versions
+
+ In v0.1.1 or earlier, to store data as JSON, configure it like this:
 
  ```
  <match dummy>
- type memcached
+ @type memcached
  host localhost
  port 11211
  value_format json
@@ -4,7 +4,7 @@ $LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
 
  Gem::Specification.new do |gem|
  gem.name = "fluent-plugin-memcached"
- gem.version = "0.1.1"
+ gem.version = "0.2.0"
  gem.authors = ["innossh"]
  gem.email = ["innossh@users.noreply.github.com"]
 
@@ -1,109 +1,81 @@
  require 'dalli'
  require 'fluent/plugin/output'
 
- class Fluent::Plugin::MemcachedOutput < Fluent::Plugin::Output
- Fluent::Plugin.register_output('memcached', self)
+ module Fluent::Plugin
+ class MemcachedOutput < Output
+ Fluent::Plugin.register_output('memcached', self)
 
- helpers :compat_parameters
+ include Fluent::SetTimeKeyMixin
+ include Fluent::SetTagKeyMixin
 
- DEFAULT_BUFFER_TYPE = "memory"
+ helpers :inject, :formatter, :compat_parameters
 
- config_param :host, :string, :default => 'localhost'
- config_param :port, :integer, :default => 11211
+ DEFAULT_BUFFER_TYPE = 'memory'
+ DEFAULT_FORMAT_TYPE = 'csv'
 
- config_param :increment, :bool, :default => false
- config_param :value_separater, :string, :default => ' '
+ config_param :host, :string, :default => 'localhost'
+ config_param :port, :integer, :default => 11211
+ config_param :key, :string
+ config_param :include_key, :bool, :default => false
+ config_param :increment, :bool, :default => false
 
- config_param :value_format, :string, :default => 'raw'
- config_param :param_names, :string, :default => nil # nil doesn't allowed for json
+ config_section :buffer do
+ config_set_default :@type, DEFAULT_BUFFER_TYPE
+ end
 
- config_section :buffer do
- config_set_default :@type, DEFAULT_BUFFER_TYPE
- end
+ config_section :format do
+ config_set_default :@type, DEFAULT_FORMAT_TYPE
+ config_set_default :delimiter, ' '
+ config_set_default :force_quotes, false
+ end
 
- attr_accessor :memcached
- attr_accessor :formatter
+ attr_accessor :memcached
+ attr_accessor :formatter
 
- def configure(conf)
- compat_parameters_convert(conf, :buffer)
- super
- if @value_format == 'json' and @param_names.nil?
- raise Fluent::ConfigError, "param_names MUST be specified in the case of json format"
- end
- @formatter = RecordValueFormatter.new(@increment, @value_separater, @value_format, @param_names)
- end
+ def configure(conf)
+ compat_parameters_convert(conf, :buffer, :inject, :formatter)
+ super
 
- def start
- super
- @memcached = Dalli::Client.new("#{@host}:#{@port}")
- end
+ @formatter = formatter_create
+ end
 
- def shutdown
- @memcached.close
- super
- end
+ def start
+ super
 
- def format(tag, time, record)
- [time, record].to_msgpack
- end
+ @memcached = Dalli::Client.new("#{@host}:#{@port}")
+ end
 
- def formatted_to_msgpack_binary?
- true
- end
+ def shutdown
+ @memcached.close
 
- def multi_workers_ready?
- true
- end
+ super
+ end
 
- def write(chunk)
- chunk.msgpack_each { |time, record|
- key = @formatter.key(record)
- value = @formatter.value(record)
- if @increment
- if @memcached.get(key) == nil
- # initialize increment value
- @memcached.incr(key, 1, nil, 0)
- end
- @memcached.incr(key, amt=value)
+ def format(tag, time, record)
+ record = inject_values_to_record(tag, time, record)
 
- else
- @memcached.set(key, value)
- end
- }
- end
+ key = @include_key ? record[@key] : record.delete(@key)
+ [time, key, @formatter.format(tag, time, record).chomp].to_msgpack
+ end
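The key handling in the new `format` method can be isolated into a small sketch (the helper name is invented; `inject_values_to_record` and the formatter are out of scope here):

```ruby
# Sketch of the include_key logic: with include_key false the key field
# is consumed from the record before formatting; with include_key true
# it is only read, so it also appears in the formatted value.
def extract_key(record, key_name, include_key)
  include_key ? record[key_name] : record.delete(key_name)
end

record = { 'id' => 'key1', 'field1' => 'value1' }
key = extract_key(record, 'id', false)
puts key                   # key1
puts record.keys.inspect   # ["field1"]
```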
 
- class RecordValueFormatter
- attr_reader :increment
- attr_reader :value_separater
- attr_reader :value_format
- attr_reader :param_names
-
- def initialize(increment, value_separater, value_format, param_names)
- @increment = increment
- @value_separater = value_separater
- @value_format = value_format
- @param_names = param_names
+ def formatted_to_msgpack_binary?
+ true
  end
 
- def key(record)
- record.values.first
+ def multi_workers_ready?
+ true
  end
 
- def value(record)
- values = record.values.drop(1)
- case @value_format
- when 'json'
- hash = {}
- @param_names.split(/\s*,\s*/).each_with_index { |param_name, i|
- hash[param_name] = (i > values.size - 1) ? nil : values[i]
- }
- hash.to_json
- else
- return values.first.to_i if @increment
-
- values.join(@value_separater)
+ def write(chunk)
+ chunk.msgpack_each do |time, key, value|
+ unless @increment
+ @memcached.set(key, value)
+ next
+ end
+
+ @memcached.incr(key, value.to_i, nil, value.to_i)
  end
  end
- end
 
+ end
  end
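The branch structure of the new `write` method can be exercised with an in-memory stand-in for the Dalli client (the classes and names below are hypothetical; a real run needs fluentd and a memcached server):

```ruby
# Stand-in for the two Dalli::Client calls write uses: set, and
# incr(key, amount, ttl, initial), which creates the counter at `initial`
# when the key is absent and otherwise adds `amount` to it.
class StubClient
  attr_reader :store

  def initialize
    @store = {}
  end

  def set(key, value)
    @store[key] = value
  end

  def incr(key, amount, _ttl, initial)
    @store[key] = @store.key?(key) ? @store[key] + amount : initial
  end
end

# Simplified version of the write loop over formatted [time, key, value] tuples.
def write_chunk(chunk, client, increment)
  chunk.each do |_time, key, value|
    if increment
      client.incr(key, value.to_i, nil, value.to_i)
    else
      client.set(key, value)
    end
  end
end

client = StubClient.new
write_chunk([[0, 'a', '1'], [0, 'a', '2']], client, true)
puts client.store['a']   # 3
```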
@@ -1,4 +1,3 @@
- require 'test/unit'
  require 'fluent/test'
  require 'fluent/test/helpers'
  require 'fluent/test/driver/output'
@@ -9,30 +8,38 @@ class MemcachedOutputTest < Test::Unit::TestCase
 
  def setup
  Fluent::Test.setup
+
+ @d = create_driver
+ # Invalidate all existing cache items before testing
+ Dalli::Client.new("#{@d.instance.host}:#{@d.instance.port}").flush_all
+ @time = event_time('2011-01-02 13:14:15 UTC')
  end
 
  CONFIG = %[
- host 127.0.0.1
- port 11211
+ key id
+ fields field1,field2
  ]
 
  CONFIG_JSON = %[
- host 127.0.0.1
- port 11211
- value_format json
- param_names param1,param2
+ key id
+ include_key true
+ format json
+ include_tag_key true
  ]
 
  CONFIG_INCREMENT = %[
- host 127.0.0.1
- port 11211
+ key id
+ format single_value
+ message_key field_incr
  increment true
  ]
 
  CONFIG_MYSQL = %[
- host 127.0.0.1
- port 11211
- value_separater |
+ include_time_key true
+ time_format %s
+ key time
+ fields metrics_name,metrics_value
+ delimiter |
  ]
 
  def create_driver(conf = CONFIG)
@@ -40,114 +47,94 @@ class MemcachedOutputTest < Test::Unit::TestCase
  end
 
  def test_configure
- d = create_driver('')
- assert_equal 'localhost', d.instance.host
- assert_equal 11211, d.instance.port
- assert_equal false, d.instance.increment
- assert_equal ' ', d.instance.value_separater
+ assert_raise(Fluent::ConfigError) {
+ create_driver('')
+ }
 
- d = create_driver
- assert_equal '127.0.0.1', d.instance.host
- assert_equal 11211, d.instance.port
+ assert_equal 'localhost', @d.instance.host
+ assert_equal 11211, @d.instance.port
+ assert_equal false, @d.instance.increment
+ assert_equal ' ', @d.instance.formatter.delimiter
+ assert_equal 'id', @d.instance.key
+ assert_equal ['field1', 'field2'], @d.instance.formatter.fields
 
  d = create_driver(CONFIG_JSON)
- assert_equal '127.0.0.1', d.instance.host
- assert_equal 11211, d.instance.port
- assert_equal 'json', d.instance.value_format
- assert_equal 'param1,param2', d.instance.param_names
-
- assert_raise(Fluent::ConfigError) {
- create_driver %[
- host 127.0.0.1
- port 11211
- value_format json
- ]
- }
+ assert_equal 'json', d.instance.formatter_configs.first[:@type]
 
  d = create_driver(CONFIG_INCREMENT)
- assert_equal '127.0.0.1', d.instance.host
- assert_equal 11211, d.instance.port
  assert_equal true, d.instance.increment
+ assert_equal 'single_value', d.instance.formatter_configs.first[:@type]
+ assert_equal 'field_incr', d.instance.formatter.message_key
 
  d = create_driver(CONFIG_MYSQL)
- assert_equal '127.0.0.1', d.instance.host
- assert_equal 11211, d.instance.port
- assert_equal '|', d.instance.value_separater
+ assert_equal 'time', d.instance.key
+ assert_equal true, d.instance.include_time_key
+ assert_equal '%s', d.instance.inject_config.time_format
+ assert_equal ['metrics_name', 'metrics_value'], d.instance.formatter.fields
+ assert_equal '|', d.instance.formatter.delimiter
  end
 
  def test_format
- d = create_driver
- time = Time.parse('2011-01-02 13:14:15 UTC').to_i
- record = {'key' => 'key', 'param1' => 'value'}
- d.run(default_tag: 'test') do
- d.feed(time, record)
+ record = {'id' => 'key', 'field1' => 'value1', 'field2' => 'value2'}
+ @d.run(default_tag: 'test') do
+ @d.feed(@time, record)
  end
- assert_equal [[time, record].to_msgpack], d.formatted
+ assert_equal [[@time.to_i, 'key', 'value1 value2'].to_msgpack], @d.formatted
  end
 
  def test_write
- d = create_driver
- time = event_time('2011-01-02 13:14:15 UTC')
- record1 = {'key' => 'a', 'param1' => '1'}
- record2 = {'key' => 'b', 'param1' => '2', 'param2' => '3'}
- d.run(default_tag: 'test') do
- d.feed(time, record1)
- d.feed(time, record2)
+ @d = create_driver
+ record1 = {'id' => 'a', 'field1' => '1'}
+ record2 = {'id' => 'b', 'field1' => '2', 'field2' => '3'}
+ @d.run(default_tag: 'test') do
+ @d.feed(@time, record1)
+ @d.feed(@time, record2)
  end
 
- assert_equal '1', d.instance.memcached.get('a')
- assert_equal '2 3', d.instance.memcached.get('b')
+ assert_equal '1 ', @d.instance.memcached.get('a')
+ assert_equal '2 3', @d.instance.memcached.get('b')
  end
 
  def test_write_json
  d = create_driver(CONFIG_JSON)
- time = event_time('2011-01-02 13:14:15 UTC')
- record1 = {'key' => 'c', 'param1' => '4'}
- record2 = {'key' => 'd', 'param1' => '5', 'param2' => '6'}
- record1_value_json = {'param1' => '4', 'param2' => nil}.to_json
- record2_value_json = {'param1' => '5', 'param2' => '6'}.to_json
+ record1 = {'id' => 'a', 'field1' => '4'}
+ record2 = {'id' => 'b', 'field1' => '5', 'field2' => '6'}
+ record1_value_json = {'id' => 'a', 'field1' => '4', 'tag' => 'test'}.to_json
+ record2_value_json = {'id' => 'b', 'field1' => '5', 'field2' => '6', 'tag' => 'test'}.to_json
  d.run(default_tag: 'test') do
- d.feed(time, record1)
- d.feed(time, record2)
+ d.feed(@time, record1)
+ d.feed(@time, record2)
  end
 
- assert_equal record1_value_json, d.instance.memcached.get('c')
- assert_equal record2_value_json, d.instance.memcached.get('d')
+ assert_equal record1_value_json, d.instance.memcached.get('a')
+ assert_equal record2_value_json, d.instance.memcached.get('b')
  end
 
- class IncrementTest < self
- def teardown
- @d.instance.memcached.flush_all
+ def test_write_increment
+ d = create_driver(CONFIG_INCREMENT)
+ record1 = {'id' => 'count1', 'field_incr' => 1}
+ record2 = {'id' => 'count2', 'field_incr' => 2}
+ record3 = {'id' => 'count1', 'field_incr' => 3}
+ record4 = {'id' => 'count2', 'field_incr' => 4}
+ d.run(default_tag: 'test') do
+ d.feed(@time, record1)
+ d.feed(@time, record2)
+ d.feed(@time, record3)
+ d.feed(@time, record4)
  end
 
- def test_write_increment
- @d = create_driver(CONFIG_INCREMENT)
- time = event_time('2011-01-02 13:14:15 UTC')
- record1 = {'key' => 'count1', 'param1' => 1}
- record2 = {'key' => 'count2', 'param1' => 2}
- record3 = {'key' => 'count1', 'param1' => 3}
- record4 = {'key' => 'count2', 'param1' => 4}
- @d.run(default_tag: 'test') do
- @d.feed(time, record1)
- @d.feed(time, record2)
- @d.feed(time, record3)
- @d.feed(time, record4)
- end
-
- assert_equal (1 + 3), @d.instance.memcached.get('count1').to_i
- assert_equal (2 + 4), @d.instance.memcached.get('count2').to_i
- end
+ assert_equal (1 + 3), d.instance.memcached.get('count1').to_i
+ assert_equal (2 + 4), d.instance.memcached.get('count2').to_i
  end
 
  def test_write_to_mysql
  d = create_driver(CONFIG_MYSQL)
- time = event_time('2011-01-02 13:14:15 UTC')
- record = {'key' => time, 'metrics_name' => 'count', 'metrics_value' => '100'}
+ record = {'metrics_name' => 'count', 'metrics_value' => '100'}
  d.run(default_tag: 'test') do
- d.feed(time, record)
+ d.feed(@time, record)
  end
 
- assert_equal 'count|100', d.instance.memcached.get(time)
+ assert_equal 'count|100', d.instance.memcached.get(@time.to_i)
  end
 
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: fluent-plugin-memcached
  version: !ruby/object:Gem::Version
- version: 0.1.1
+ version: 0.2.0
  platform: ruby
  authors:
  - innossh
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2017-06-07 00:00:00.000000000 Z
+ date: 2017-06-10 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: fluentd