logstash-output-loggly 3.0.5 → 4.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 7e0cec0e9971901b8db8203aab5425054757e7aa55419b011168fcae88917930
- data.tar.gz: f0c58f28c958a39190a380e0e311b2f8267b755084f5ec72f0602b25988e0ebe
+ metadata.gz: 1e15e34b7a8077fdb9c36ebd61ddad7d3e89adf138381cb339849f7dfe55b2ea
+ data.tar.gz: 509a2c487b8faf7821e8e83b32db90096dc46c23af35bfc7757dc8a698643901
  SHA512:
- metadata.gz: 9eb9897a67cbf461400c899e4674bb1cc4bb8b8230021152c3ec778c570241f466640704ffdf1d1b27c9b683c0eaf7fe8d21350aea793f7bd9283e15c40d05ff
- data.tar.gz: bbb3ec8a63196e7f2091b2eb03fa16b2717a7b85a771cf742ffeac304ff8d3fca98c8cdf514f9e58c27c177f40c280d39db2f6f9db687debfdcc2715df7552ab
+ metadata.gz: aef1bc454004b3978558d783ae2fc2194f715405cc3e60bc9f06b635fe722eab60a6b9b088c8c0a61fd0ac92cc4c609510e2c17198d9980db006505b66905050
+ data.tar.gz: 36dd4857187cefd3bca6fed7c65badd69b3a0adb8b0f220af085758b99f4d23a82a6a598bab479dee55d3b2179bb74d065e2af0ab5f51e8984e6038f83f2fe91
CHANGELOG.md CHANGED
@@ -1,3 +1,19 @@
+ ## 4.0.0
+ - The plugin now uses the Loggly bulk API.
+ - If you need to modify event batch sizes and max delay between flushes,
+ please adjust the Logstash settings `pipeline.batch.size` and
+ `pipeline.batch.delay` respectively.
+ - New settings: `max_event_size` and `max_payload_size`.
+ Both are currently set according to Loggly's [published API limits](https://www.loggly.com/docs/http-bulk-endpoint/).
+ They only need to be changed if Loggly changes these limits.
+ - The plugin now skips events bigger than the API limit for single event size.
+ A proper warning is logged when this happens.
+ - When interpolating `key` field, drop messages where interpolation doesn't
+ resolve (meaning we don't have the API key for the event).
+ - When interpolating `tag` field, revert to default of 'logstash' if interpolation doesn't resolve.
+ - Beef up unit tests significantly.
+ - See pull request [#29](https://github.com/logstash-plugins/logstash-output-loggly/pull/29) for all details.
+
  ## 3.0.5
  - [#24](https://github.com/logstash-plugins/logstash-output-loggly/pull/24)
  Get rid of a Ruby warning from using `timeout`.
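The 4.0.0 notes above describe batching for the bulk API. As a rough illustration, the sketch below is standalone Ruby with made-up names (`chunk_events`, `MAX_PAYLOAD_SIZE`) and is not part of the plugin; it only shows the newline-delimited JSON chunking the bulk endpoint expects. The real logic is the `build_message_bodies` method added in the source diff further down.

```ruby
require 'json'

# 5 MiB: Loggly's documented bulk payload limit (illustrative constant).
MAX_PAYLOAD_SIZE = 5 * 1024 * 1024

# Serialize events as newline-delimited JSON and split into payload-sized chunks.
def chunk_events(events, max_payload_size = MAX_PAYLOAD_SIZE)
  chunks = []
  body = ''
  events.each do |event|
    line = event.to_json
    # Start a new chunk if appending this event (plus its newline) would overflow.
    if !body.empty? && body.bytesize + 1 + line.bytesize > max_payload_size
      chunks << body
      body = ''
    end
    body << "\n" unless body.empty?
    body << line
  end
  chunks << body unless body.empty?
  chunks
end

puts chunk_events([{ message: 'hello' }, { message: 'world' }]).first
# Prints two JSON objects separated by a newline.
```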
docs/index.asciidoc CHANGED
@@ -40,6 +40,8 @@ This plugin supports the following configuration options plus the <<plugins-{typ
  | <<plugins-{type}s-{plugin}-can_retry>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-host>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-key>> |<<string,string>>|Yes
+ | <<plugins-{type}s-{plugin}-max_event_size>> |<<bytes,bytes>>|Yes
+ | <<plugins-{type}s-{plugin}-max_payload_size>> |<<bytes,bytes>>|Yes
  | <<plugins-{type}s-{plugin}-proto>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-proxy_host>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-proxy_password>> |<<password,password>>|No
@@ -64,15 +66,15 @@ Can Retry.
  Setting this value true helps user to send multiple retry attempts if the first request fails

  [id="plugins-{type}s-{plugin}-host"]
- ===== `host`
+ ===== `host`

  * Value type is <<string,string>>
  * Default value is `"logs-01.loggly.com"`

  The hostname to send logs to. This should target the loggly http input
  server which is usually "logs-01.loggly.com" (Gen2 account).
- See Loggly HTTP endpoint documentation at
- https://www.loggly.com/docs/http-endpoint/
+ See the https://www.loggly.com/docs/http-endpoint/[Loggly HTTP endpoint documentation].
+

  [id="plugins-{type}s-{plugin}-key"]
  ===== `key`
@@ -88,6 +90,34 @@ You can use `%{foo}` field lookups here if you need to pull the api key from
  the event. This is mainly aimed at multitenant hosting providers who want
  to offer shipping a customer's logs to that customer's loggly account.

+ [id="plugins-{type}s-{plugin}-max_event_size"]
+ ===== `max_event_size`
+
+ * This is a required setting.
+ * Value type is <<bytes,bytes>>
+ * Default value is 1 Mib
+
+ The Loggly API supports event size up to 1 Mib.
+
+ You should only need to change this setting if the
+ API limits have changed and you need to override the plugin's behaviour.
+
+ See the https://www.loggly.com/docs/http-bulk-endpoint/[Loggly bulk API documentation]
+
+ [id="plugins-{type}s-{plugin}-max_payload_size"]
+ ===== `max_payload_size`
+
+ * This is a required setting.
+ * Value type is <<bytes,bytes>>
+ * Default value is 5 Mib
+
+ The Loggly API supports API call payloads up to 5 Mib.
+
+ You should only need to change this setting if the
+ API limits have changed and you need to override the plugin's behaviour.
+
+ See the https://www.loggly.com/docs/http-bulk-endpoint/[Loggly bulk API documentation]
+
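To make the two size settings above concrete, here is an illustrative Ruby sketch of constructing the output with explicit limits, patterned on how the plugin's spec (further down) instantiates it. The token is a placeholder, and the values shown are simply the documented defaults, so overriding them is rarely needed.

```ruby
# Illustrative sketch only; requires a Logstash runtime, as in the plugin's spec.
require 'logstash/outputs/loggly'

output = LogStash::Outputs::Loggly.new(
  'key'              => 'YOUR_CUSTOMER_TOKEN',  # placeholder customer token
  'max_event_size'   => 1_048_576,              # 1 MiB per event (default)
  'max_payload_size' => 5_242_880               # 5 MiB per bulk API call (default)
)
output.register
```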
  [id="plugins-{type}s-{plugin}-proto"]
  ===== `proto`

@@ -145,14 +175,15 @@ It will try to submit request until retry_count and then halt
  * Value type is <<string,string>>
  * Default value is `"logstash"`

- Loggly Tag
- Tag helps you to find your logs in the Loggly dashboard easily
- You can make a search in Loggly using tag as "tag:logstash-contrib"
- or the tag set by you in the config file.
+ Loggly Tags help you to find your logs in the Loggly dashboard easily.
+ You can search for a tag in Loggly using `"tag:logstash"`.
+
+ If you need to specify multiple tags here on your events,
+ specify them as outlined in https://www.loggly.com/docs/tags/[the tag documentation].
+ E.g. `"tag" => "foo,bar,myApp"`.

- You can use %{somefield} to allow for custom tag values.
- Helpful for leveraging Loggly source groups.
- https://www.loggly.com/docs/source-groups/
+ You can also use `"tag" => "%{somefield}"` to take your tag value from `somefield` on your event.
+ Helpful for leveraging https://www.loggly.com/docs/source-groups/[Loggly source groups].
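As a concrete illustration of the interpolation rules above, the following small Ruby sketch (field names are invented) shows how `event.sprintf` resolves a `%{somefield}` tag and how the plugin falls back to the default `logstash` tag when the field is absent, mirroring `prepare_meta` in the source diff below.

```ruby
# Illustrative only; requires a Logstash runtime.
require 'logstash/event'

event = LogStash::Event.new('message' => 'hi', 'source' => 'someapp')

event.sprintf('%{source}')               # => "someapp"
tag = event.sprintf('%{no_such_field}')  # unresolved, stays "%{no_such_field}"
tag = 'logstash' if /%{\w+}/.match(tag)  # the plugin then reverts to the default tag
```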
lib/logstash/outputs/loggly.rb CHANGED
@@ -22,11 +22,12 @@ end
  # Got a loggly account? Use logstash to ship logs to Loggly!
  #
  # This is most useful so you can use logstash to parse and structure
- # your logs and ship structured, json events to your account at Loggly.
+ # your logs and ship structured, json events to your Loggly account.
  #
  # To use this, you'll need to use a Loggly input with type 'http'
  # and 'json logging' enabled.
  class LogStash::Outputs::Loggly < LogStash::Outputs::Base
+
  config_name "loggly"

  # The hostname to send logs to. This should target the loggly http input
@@ -46,17 +47,20 @@ class LogStash::Outputs::Loggly < LogStash::Outputs::Base
  # Should the log action be sent over https instead of plain http
  config :proto, :validate => :string, :default => "http"

- # Loggly Tag
- # Tag helps you to find your logs in the Loggly dashboard easily
- # You can make a search in Loggly using tag as "tag:logstash-contrib"
- # or the tag set by you in the config file.
+ # Loggly Tags help you to find your logs in the Loggly dashboard easily.
+ # You can search for a tag in Loggly using `"tag:logstash"`.
+ #
+ # If you need to specify multiple tags here on your events,
+ # specify them as outlined in the tag documentation (https://www.loggly.com/docs/tags/).
+ # E.g. `"tag" => "foo,bar,myApp"`.
  #
- # You can use %{somefield} to allow for custom tag values.
- # Helpful for leveraging Loggly source groups.
- # https://www.loggly.com/docs/source-groups/
- config :tag, :validate => :string, :default => "logstash"
+ # You can also use `"tag" => "%{somefield}"` to take your tag value from `somefield` on your event.
+ # Helpful for leveraging Loggly source groups (https://www.loggly.com/docs/source-groups/).
+
+ DEFAULT_LOGGLY_TAG = 'logstash'
+ config :tag, :validate => :string, :default => DEFAULT_LOGGLY_TAG

- # Retry count.
+ # Retry count.
  # It may be possible that the request may timeout due to slow Internet connection
  # if such condition appears, retry_count helps in retrying request for multiple times
  # It will try to submit request until retry_count and then halt
@@ -78,6 +82,19 @@ class LogStash::Outputs::Loggly < LogStash::Outputs::Base
  # Proxy Password
  config :proxy_password, :validate => :password, :default => ""

+ # The Loggly API supports event size up to 1 Mib.
+ # You should only need to change this setting if the
+ # API limits have changed and you need to override the plugin's behaviour.
+ #
+ # See https://www.loggly.com/docs/http-bulk-endpoint/
+ config :max_event_size, :validate => :bytes, :default => '1 Mib', :required => true
+
+ # The Loggly API supports API call payloads up to 5 Mib.
+ # You should only need to change this setting if the
+ # API limits have changed and you need to override the plugin's behaviour.
+ #
+ # See https://www.loggly.com/docs/http-bulk-endpoint/
+ config :max_payload_size, :validate => :bytes, :default => '5 Mib', :required => true

  # HTTP constants
  HTTP_SUCCESS = "200"
@@ -88,31 +105,114 @@ class LogStash::Outputs::Loggly < LogStash::Outputs::Base

  public
  def register
- # nothing to do
  end

  public
+ def multi_receive(events)
+ send_batch events.collect { |event| prepare_meta(event) }
+ end
+
  def receive(event)
+ send_batch [prepare_meta(event)]
+ end
+
+ private
+ # Returns one meta event {key: '...', tag: '...', event: event },
+ # or returns nil, if event's key doesn't resolve.
+ def prepare_meta(event)
  key = event.sprintf(@key)
  tag = event.sprintf(@tag)

+ if expected_field = key[/%{(.*)}/, 1]
+ @logger.warn "Skipping sending message to Loggly. No key provided (key='#{key}'). Make sure to set field '#{expected_field}'."
+ @logger.debug "Dropped message", :event => event.to_json
+ return nil
+ end
+
  # For those cases where %{somefield} doesn't exist
  # we should ship logs with the default tag value.
- tag = 'logstash' if /^%{\w+}/.match(tag)
+ tag = DEFAULT_LOGGLY_TAG if /%{\w+}/.match(tag)

- # Send event
- send_event("#{@proto}://#{@host}/inputs/#{key}/tag/#{tag}", format_message(event))
- end # def receive
+ meta_event = { key: key, tag: tag, event: event }
+ end # prepare_meta

  public
  def format_message(event)
  event.to_json
  end

+ # Takes an array of meta_events or nils. Will split the batch in appropriate
+ # sub-batches per key+tag combination (which need to be posted to different URIs).
+ def send_batch(meta_events)
+ split_batches(meta_events.compact).each_pair do |k, batch|
+ key, tag = *k
+ url = "#{@proto}://#{@host}/bulk/#{key}/tag/#{tag}"
+
+ build_message_bodies(batch) do |body|
+ perform_api_call url, body
+ end
+ end
+ end
+
+ # Gets all API calls to the same URI together in common batches.
+ #
+ # Expects an array of meta_events {key: '...', tag: '...', event: event }
+ # Outputs a hash with event batches split out by key+tag combination.
+ # { [key1, tag1] => [event1, ...],
+ # [key2, tag1] => [...],
+ # [key2, tag2] => [...],
+ # ... }
+ def split_batches(events)
+ events.reduce( Hash.new { |h,k| h[k] = [] } ) do |acc, meta_event|
+ key = meta_event[:key]
+ tag = meta_event[:tag]
+ acc[ [key, tag] ] << meta_event[:event]
+ acc
+ end
+ end
+
+ # Concatenates JSON events to build an API call body.
+ #
+ # Will yield before going over the body size limit. May yield more than once.
+ #
+ # This is also where we check that each message respects the message size,
+ # and where we skip those if they don't.
+ def build_message_bodies(events)
+ body = ''
+ event_count = 0
+
+ events.each do |event|
+ encoded_event = format_message(event)
+ event_size = encoded_event.bytesize
+
+ if event_size > @max_event_size
+ @logger.warn "Skipping event over max event size",
+ :event_size => encoded_event.bytesize, :max_event_size => @max_event_size
+ @logger.debug "Skipped event", :event => encoded_event
+ next
+ end
+
+ if body.bytesize + 1 + event_size > @max_payload_size
+ @logger.debug "Flushing events to Loggly", count: event_count, bytes: body.bytesize
+ yield body
+ body = ''
+ event_count = 0
+ end
+
+ body << "\n" unless body.bytesize.zero?
+ body << encoded_event
+ event_count += 1
+ end
+
+ if event_count > 0
+ @logger.debug "Flushing events to Loggly", count: event_count, bytes: body.bytesize
+ yield body
+ end
+ end
+
  private
- def send_event(url, message)
+ def perform_api_call(url, message)
  url = URI.parse(url)
- @logger.debug("Loggly URL", :url => url)

  http = Net::HTTP::Proxy(@proxy_host,
  @proxy_port,
@@ -139,34 +239,37 @@ class LogStash::Outputs::Loggly < LogStash::Outputs::Base
  @retry_count.times do
  begin
  response = http.request(request)
+ @logger.debug("Loggly response", code: response.code, body: response.body)
+
  case response.code

- # HTTP_SUCCESS :Code 2xx
+ # HTTP_SUCCESS :Code 2xx
  when HTTP_SUCCESS
- @logger.debug("Event sent to Loggly")
+ @logger.debug("Event batch sent successfully")

- # HTTP_FORBIDDEN :Code 403
+ # HTTP_FORBIDDEN :Code 403
  when HTTP_FORBIDDEN
  @logger.warn("User does not have privileges to execute the action.")

- # HTTP_NOT_FOUND :Code 404
+ # HTTP_NOT_FOUND :Code 404
  when HTTP_NOT_FOUND
- @logger.warn("Invalid URL. Please check URL should be http://logs-01.loggly.com/inputs/CUSTOMER_TOKEN/tag/logstash")
+ @logger.warn("Invalid URL. Please check URL should be http://logs-01.loggly.com/inputs/CUSTOMER_TOKEN/tag/TAG", :url => url.to_s)

- # HTTP_INTERNAL_SERVER_ERROR :Code 500
+ # HTTP_INTERNAL_SERVER_ERROR :Code 500
  when HTTP_INTERNAL_SERVER_ERROR
  @logger.warn("Internal Server Error")

- # HTTP_GATEWAY_TIMEOUT :Code 504
+ # HTTP_GATEWAY_TIMEOUT :Code 504
  when HTTP_GATEWAY_TIMEOUT
  @logger.warn("Gateway Time Out")
  else
  @logger.error("Unexpected response code", :code => response.code)
  end # case

- if [HTTP_SUCCESS,HTTP_FORBIDDEN,HTTP_NOT_FOUND].include?(response.code) # break the retries loop for the specified response code
+ if [HTTP_SUCCESS,HTTP_FORBIDDEN,HTTP_NOT_FOUND].include?(response.code) # break the retries loop for the specified response code
  break
  end
+
  rescue StandardError => e
  @logger.error("An unexpected error occurred", :exception => e.class.name, :error => e.to_s, :backtrace => e.backtrace)
  end # rescue
@@ -179,4 +282,5 @@ class LogStash::Outputs::Loggly < LogStash::Outputs::Base
  totalRetries = totalRetries + 1
  end #loop
  end # def send_event
+
  end # class LogStash::Outputs::Loggly
logstash-output-loggly.gemspec CHANGED
@@ -1,6 +1,6 @@
  Gem::Specification.new do |s|
  s.name = 'logstash-output-loggly'
- s.version = '3.0.5'
+ s.version = '4.0.0'
  s.licenses = ['Apache License (2.0)']
  s.summary = "Ships logs to Loggly"
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
spec/outputs/loggly_spec.rb CHANGED
@@ -2,12 +2,22 @@
  require 'logstash/devutils/rspec/spec_helper'
  require 'logstash/outputs/loggly'

+ def logger_for(plugin)
+ plugin.instance_variable_get('@logger')
+ end
+
  describe 'outputs/loggly' do
  let(:config) { { 'key' => 'abcdef123456' } }

+ let(:output) do
+ LogStash::Outputs::Loggly.new(config).tap do |output|
+ output.register
+ end
+ end
+
  let(:event) do
  LogStash::Event.new(
- 'message' => 'fanastic log entry',
+ 'message' => 'fantastic log entry',
  'source' => 'someapp',
  'type' => 'nginx',
  '@timestamp' => LogStash::Timestamp.now)
@@ -21,45 +31,168 @@ describe 'outputs/loggly' do
  end

  it 'should have default config values' do
- insist { subject.proto } == 'http'
- insist { subject.host } == 'logs-01.loggly.com'
- insist { subject.tag } == 'logstash'
+ expect(subject.proto).to eq('http')
+ expect(subject.host).to eq('logs-01.loggly.com')
+ expect(subject.tag).to eq('logstash')
+ expect(subject.max_event_size).to eq(1_048_576)
+ expect(subject.max_payload_size).to eq(5_242_880)
  end
  end

- context 'when outputting messages' do
+ context 'when sending events' do
+ it 'should set the default tag to logstash' do
+ expect(output).to receive(:send_batch).with([{event: event, key: 'abcdef123456', tag: 'logstash'}])
+ output.receive(event)
+ end
+
  it 'should support field interpolation on key' do
  # add a custom key value for Loggly config
  event.set('token', 'xxxxxxx1234567')
  config['key'] = '%{token}'

- output = LogStash::Outputs::Loggly.new(config)
- allow(output).to receive(:send_event).with('http://logs-01.loggly.com/inputs/xxxxxxx1234567/tag/logstash',
- event.to_json)
- output.receive(event)
- end
-
- it 'should set the default tag to logstash' do
- output = LogStash::Outputs::Loggly.new(config)
- allow(output).to receive(:send_event).with('http://logs-01.loggly.com/inputs/abcdef123456/tag/logstash',
- event.to_json)
+ expect(output).to receive(:send_batch).with([{event: event, key: 'xxxxxxx1234567', tag: 'logstash'}])
  output.receive(event)
  end

  it 'should support field interpolation for tag' do
  config['tag'] = '%{source}'
- output = LogStash::Outputs::Loggly.new(config)
- allow(output).to receive(:send_event).with('http://logs-01.loggly.com/inputs/abcdef123456/tag/someapp',
- event.to_json)
+ expect(output).to receive(:send_batch).with([{event: event, key: 'abcdef123456', tag: 'someapp'}])
  output.receive(event)
  end

- it 'should default tag to logstash if interpolated field does not exist' do
+ it 'should default tag to logstash if interpolated field for tag does not exist' do
  config['tag'] = '%{foobar}'
- output = LogStash::Outputs::Loggly.new(config)
- allow(output).to receive(:send_event).with('http://logs-01.loggly.com/inputs/abcdef123456/tag/logstash',
- event.to_json)
+ expect(output).to receive(:send_batch).with([{event: event, key: 'abcdef123456', tag: 'logstash'}])
  output.receive(event)
  end
+
+ it 'should drop messages where interpolated field for key does not exist' do
+ config['key'] = '%{custom_key}'
+ event.set('custom_key', 'a_key')
+ event2 = event.clone
+ event2.remove('custom_key')
+
+ expect(output).to receive(:send_batch).once.with([{event: event, key: 'a_key', tag: 'logstash'}, nil])
+ logger = logger_for(output)
+ expect(logger).to receive(:warn).with(/No key provided/)
+ expect(logger).to receive(:debug).with(/Dropped message/, kind_of(Hash))
+
+ output.multi_receive([event, event2])
+ end
+
+ context 'with different combinations of key and tag' do
+ it 'should perform one http request per batch of common key+tag' do
+ config['key'] = '%{custom_key}'
+ config['tag'] = '%{custom_tag}'
+ event.set('custom_key', 'generally_used_key')
+
+ event1 = event.clone.tap { |e| e.set('message', 'event1') }
+ event2 = event.clone.tap { |e| e.set('message', 'event2') ; e.set('custom_key', 'other_key') }
+ event3 = event.clone.tap { |e| e.set('message', 'event3') ; e.set('custom_tag', 'other_tag') }
+ event4 = event.clone.tap { |e| e.set('message', 'event4') }
+
+ expect(output).to receive(:perform_api_call) { |url, body|
+ expect(body).to match /"event1"/
+ expect(body).to match /"event4"/
+ expect(url).to eq('http://logs-01.loggly.com/bulk/generally_used_key/tag/logstash')
+ }
+ expect(output).to receive(:perform_api_call) { |url, body|
+ expect(body).to match /"event2"/
+ expect(url).to eq('http://logs-01.loggly.com/bulk/other_key/tag/logstash')
+ }
+ expect(output).to receive(:perform_api_call) { |url, body|
+ expect(body).to match /"event3"/
+ expect(url).to eq('http://logs-01.loggly.com/bulk/generally_used_key/tag/other_tag')
+ }
+ expect(output).not_to receive(:perform_api_call) # anymore
+
+ output.multi_receive([event1, event2, event3, event4])
+ end
+ end
+ end
+
+ context 'splitting batches of events' do
+ context 'when they are all with the same key+tag' do
+ it 'should return one batch' do
+ batches = output.split_batches([ {event: :event1, key: 'key1', tag: 'tag1'},
+ {event: :event2, key: 'key1', tag: 'tag1'} ])
+ expect(batches.size).to eq(1)
+ expect(batches).to eq({ ['key1', 'tag1'] => [:event1, :event2] })
+ end
+ end
+
+ context 'when messages have different key & tag' do
+ it 'should return one batch for each key+tag combination' do
+ batches = output.split_batches([
+ {event: :event1, key: 'key1', tag: 'tag1'},
+ {event: :event2, key: 'key2', tag: 'tag1'},
+ {event: :event3, key: 'key2', tag: 'tag2'},
+ {event: :event4, key: 'key1', tag: 'tag1'},
+ {event: :event5, key: 'key2', tag: 'tag1'},
+ {event: :event6, key: 'key1', tag: 'tag1'},
+ {event: :event7, key: 'key1', tag: 'tag1'},
+ ])
+ expect(batches.size).to eq(3)
+ expect(batches).to eq(
+ { ['key1', 'tag1'] => [:event1, :event4, :event6, :event7],
+ ['key2', 'tag1'] => [:event2, :event5],
+ ['key2', 'tag2'] => [:event3],
+ })
+ end
+ end
+ end
+
+ context 'when building message bodies' do
+ it 'should send only one payload when everything fits' do
+ yielded_times = 0
+ output.build_message_bodies([event] * 10) do |body|
+ expect(body.lines.count).to eq(10)
+
+ yielded_times += 1
+ end
+ expect(yielded_times).to eq(1)
+ end
+
+ it 'should skip events that are bigger than max_event_size' do
+ config['max_event_size'] = 1024
+ good_event = LogStash::Event.new('message' => 'fantastic log entry',
+ 'source' => 'someapp',
+ 'type' => 'nginx',
+ '@timestamp' => LogStash::Timestamp.now)
+ big_event = good_event.clone.tap { |e| e.set('filler', 'helloworld' * 100) }
+
+ logger = logger_for output
+ expect(logger).to receive(:warn).once.with(
+ /Skipping event/, hash_including(:event_size => 1134,
+ :max_event_size => 1024))
+ expect(logger).to receive(:debug).twice
+
+ yielded_times = 0
+ output.build_message_bodies([good_event, big_event]) do |body|
+ expect(body.lines.count).to eq(1)
+ expect(body).not_to match /helloworld/
+
+ yielded_times += 1
+ end
+ expect(yielded_times).to eq(1)
+ end
+
+ it 'should yield as many times as needed to send appropriately-sized payloads' do
+ config['max_payload_size'] = 1024
+ # Once JSON-encoded, these events are 122 bytes each.
+ # 8 of them fit in a 1024 bytes payload
+ event = LogStash::Event.new('message' => 'fantastic log entry',
+ 'source' => 'someapp',
+ 'type' => 'nginx',
+ '@timestamp' => LogStash::Timestamp.now)
+
+ payloads = []
+ output.build_message_bodies([event] * 10) do |body|
+ payloads << body
+ end
+ expect(payloads.size).to eq(2)
+ expect(payloads[0].lines.count).to eq(8)
+ expect(payloads[1].lines.count).to eq(2)
+ end
  end
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-output-loggly
  version: !ruby/object:Gem::Version
- version: 3.0.5
+ version: 4.0.0
  platform: ruby
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2018-05-09 00:00:00.000000000 Z
+ date: 2018-05-15 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement