logstash-output-cloudwatchlogs 0.9.0

@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+ metadata.gz: 7b4debc394af3e53030e42ef52215c7860062092
+ data.tar.gz: df55731be057a90a525f1e994ac41f2a6d37f594
+ SHA512:
+ metadata.gz: d1c73151697cf84d44bee88b71eb81a90f9c91e75089eecc64b6bca20bc5d8bbb6943fbaf4ba18f522a9760cebfc7e14cd92abf7e2047e3f799a1861c97e9e1a
+ data.tar.gz: 2e2813e492f5a06fb7ee6b3f95b624a6ee571a08671e5e993a73e319694a4a09b9d6828e9983e0b52e0dc1b750bb4172c4ffb6d9318aecda7995c5789ba5e8f2
@@ -0,0 +1,17 @@
+ *.gem
+ *.rbc
+ .bundle
+ .config
+ .yardoc
+ Gemfile.lock
+ InstalledFiles
+ _yardoc
+ coverage
+ doc/
+ lib/bundler/man
+ pkg
+ rdoc
+ spec/reports
+ test/tmp
+ test/version_tmp
+ tmp
data/Gemfile ADDED
@@ -0,0 +1,2 @@
+ source 'https://rubygems.org'
+ gemspec
data/LICENSE ADDED
@@ -0,0 +1,26 @@
+ Amazon Software License
+ 1. Definitions
+ "Licensor" means any person or entity that distributes its Work.
+
+ "Software" means the original work of authorship made available under this License.
+
+ "Work" means the Software and any additions to or derivative works of the Software that are made available under this License.
+
+ The terms "reproduce," "reproduction," "derivative works," and "distribution" have the meaning as provided under U.S. copyright law; provided, however, that for the purposes of this License, derivative works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work.
+
+ Works, including the Software, are "made available" under this License by including in or with the Work either (a) a copyright notice referencing the applicability of this License to the Work, or (b) a copy of this License.
+ 2. License Grants
+ 2.1 Copyright Grant. Subject to the terms and conditions of this License, each Licensor grants to you a perpetual, worldwide, non-exclusive, royalty-free, copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, sublicense and distribute its Work and any resulting derivative works in any form.
+ 2.2 Patent Grant. Subject to the terms and conditions of this License, each Licensor grants to you a perpetual, worldwide, non-exclusive, royalty-free patent license to make, have made, use, sell, offer for sale, import, and otherwise transfer its Work, in whole or in part. The foregoing license applies only to the patent claims licensable by Licensor that would be infringed by Licensor’s Work (or portion thereof) individually and excluding any combinations with any other materials or technology.
+ 3. Limitations
+ 3.1 Redistribution. You may reproduce or distribute the Work only if (a) you do so under this License, (b) you include a complete copy of this License with your distribution, and (c) you retain without modification any copyright, patent, trademark, or attribution notices that are present in the Work.
+ 3.2 Derivative Works. You may specify that additional or different terms apply to the use, reproduction, and distribution of your derivative works of the Work ("Your Terms") only if (a) Your Terms provide that the use limitation in Section 3.3 applies to your derivative works, and (b) you identify the specific derivative works that are subject to Your Terms. Notwithstanding Your Terms, this License (including the redistribution requirements in Section 3.1) will continue to apply to the Work itself.
+ 3.3 Use Limitation. The Work and any derivative works thereof only may be used or intended for use with the web services, computing platforms or applications provided by Amazon.com, Inc. or its affiliates, including Amazon Web Services, Inc.
+ 3.4 Patent Claims. If you bring or threaten to bring a patent claim against any Licensor (including any claim, cross-claim or counterclaim in a lawsuit) to enforce any patents that you allege are infringed by any Work, then your rights under this License from such Licensor (including the grants in Sections 2.1 and 2.2) will terminate immediately.
+ 3.5 Trademarks. This License does not grant any rights to use any Licensor’s or its affiliates’ names, logos, or trademarks, except as necessary to reproduce the notices described in this License.
+ 3.6 Termination. If you violate any term of this License, then your rights under this License (including the grants in Sections 2.1 and 2.2) will terminate immediately.
+ 4. Disclaimer of Warranty.
+ THE WORK IS PROVIDED "AS IS" WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING WARRANTIES OR CONDITIONS OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE OR NON-INFRINGEMENT. YOU BEAR THE RISK OF UNDERTAKING ANY ACTIVITIES UNDER THIS LICENSE. SOME STATES’ CONSUMER LAWS DO NOT ALLOW EXCLUSION OF AN IMPLIED WARRANTY, SO THIS DISCLAIMER MAY NOT APPLY TO YOU.
+ 5. Limitation of Liability.
+ EXCEPT AS PROHIBITED BY APPLICABLE LAW, IN NO EVENT AND UNDER NO LEGAL THEORY, WHETHER IN TORT (INCLUDING NEGLIGENCE), CONTRACT, OR OTHERWISE SHALL ANY LICENSOR BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING OUT OF OR RELATED TO THIS LICENSE, THE USE OR INABILITY TO USE THE WORK (INCLUDING BUT NOT LIMITED TO LOSS OF GOODWILL, BUSINESS INTERRUPTION, LOST PROFITS OR DATA, COMPUTER FAILURE OR MALFUNCTION, OR ANY OTHER COMMERCIAL DAMAGES OR LOSSES), EVEN IF THE LICENSOR HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
+ Effective Date – April 18, 2008 © 2008 Amazon.com, Inc. or its affiliates. All rights reserved.
@@ -0,0 +1,2 @@
+ logstash-output-cloudwatchlogs
+ Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
@@ -0,0 +1,150 @@
+ # logstash-output-cloudwatchlogs
+ A Logstash plugin that sends logs to the AWS CloudWatch Logs service.
+
+ ## Developing
+
+ ### 1. Plugin Development and Testing
+
+ #### Code
+ - To get started, you'll need JRuby with the Bundler gem installed.
+
+ - Clone the repository.
+
+ - Install dependencies
+ ```sh
+ bundle install
+ ```
+
+ #### Test
+
+ - Update your dependencies
+
+ ```sh
+ bundle install
+ ```
+
+ - Run tests
+
+ ```sh
+ bundle exec rspec
+ ```
+
+ ### 2. Running your unpublished Plugin in Logstash
+
+ #### 2.1 Run in a local Logstash clone
+
+ - Edit the Logstash `Gemfile` and add the local plugin path, for example:
+ ```ruby
+ gem "logstash-output-cloudwatchlogs", :path => "/your/local/logstash-output-cloudwatchlogs"
+ ```
+ - Install the plugin
+ ```sh
+ bin/plugin install --no-verify
+ ```
+ - Run Logstash with your plugin
+
+ At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
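+
+ For example, you can start Logstash with an inline pipeline that exercises the plugin. This is only a sketch: the log group and stream names are illustrative.
+ ```sh
+ bin/logstash -e 'input { stdin { } } output { cloudwatchlogs { "log_group_name" => "test-group" "log_stream_name" => "test-stream" } }'
+ ```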
+
+ #### 2.2 Run in an installed Logstash
+
+ You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:
+
+ - Build your plugin gem
+ ```sh
+ gem build logstash-output-cloudwatchlogs.gemspec
+ ```
+ - Install the plugin from the Logstash home
+ ```sh
+ bin/plugin install /your/local/plugin/logstash-output-cloudwatchlogs.gem
+ ```
+ - Start Logstash and proceed to test the plugin
+
+ ## Usage
+
+ The sample configuration below reads two log4j logs and sends them to two separate log streams.
+
+ ```
+ input {
+   file {
+     path => "/path/to/app1.log"
+     start_position => beginning
+     tags => ["app1"]
+   }
+   file {
+     path => "/path/to/app2.log"
+     start_position => beginning
+     tags => ["app2"]
+   }
+ }
+
+ filter {
+   multiline {
+     pattern => "^%{MONTHDAY} %{MONTH} %{YEAR} %{TIME}"
+     negate => true
+     what => "previous"
+   }
+   grok {
+     match => { "message" => "(?<timestamp>%{MONTHDAY} %{MONTH} %{YEAR} %{TIME})" }
+   }
+   date {
+     match => [ "timestamp", "dd MMM yyyy HH:mm:ss,SSS" ]
+     target => "@timestamp"
+   }
+ }
+
+ output {
+   if "app1" in [tags] {
+     cloudwatchlogs {
+       "log_group_name" => "app1"
+       "log_stream_name" => "host1"
+     }
+   }
+   if "app2" in [tags] {
+     cloudwatchlogs {
+       "log_group_name" => "app2"
+       "log_stream_name" => "host1"
+     }
+   }
+ }
+ ```
+
+ Here are all the supported options:
+
+ * region: string, the AWS region, defaults to us-east-1.
+ * access_key_id: string, specifies the access key.
+ * secret_access_key: string, specifies the secret access key.
+ * aws_credentials_file: string, points to a file which specifies access_key_id and secret_access_key.
+ * log_group_name: string, required, specifies the destination log group. A log group will be created automatically if it doesn't already exist.
+ * log_stream_name: string, required, specifies the destination log stream. A log stream will be created automatically if it doesn't already exist.
+ * batch_count: number, the max number of log events in a batch, up to 10000.
+ * batch_size: number, the max size of log events in a batch, in bytes, up to 1048576.
+ * buffer_duration: number, the amount of time to batch log events, in milliseconds; the minimum (and default) is 5000 milliseconds.
+ * queue_size: number, the max number of batches to buffer, defaults to 5.
+ * dry_run: boolean, prints out the log events to stdout instead of sending them to the CloudWatch Logs service.
+
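+ A minimal configuration needs only the two required options; everything else falls back to the defaults listed above. The group and stream names here are illustrative:
+
+ ```
+ output {
+   cloudwatchlogs {
+     "log_group_name" => "my-app"
+     "log_stream_name" => "my-host"
+   }
+ }
+ ```
+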
+ In addition to configuring the AWS credentials in the configuration file, credentials can also be loaded automatically from the following locations:
+
+ * ENV['AWS_ACCESS_KEY_ID'] and ENV['AWS_SECRET_ACCESS_KEY']
+ * The shared credentials ini file at ~/.aws/credentials
+ * From an instance profile when running on EC2
+
+ For example, the following output relies on automatically resolved credentials and tunes all of the batching options:
+
+ ```
+ cloudwatchlogs {
+   "log_group_name" => "lg2"
+   "log_stream_name" => "ls1"
+   "batch_count" => 1000
+   "batch_size" => 1048576
+   "buffer_duration" => 5000
+   "queue_size" => 10
+   "dry_run" => false
+ }
+ ```
+ ## Contributing
+
+ 1. Fork it
+ 2. Create your feature branch (`git checkout -b my-new-feature`)
+ 3. Commit your changes (`git commit -am 'Added some feature'`)
+ 4. Push to the branch (`git push origin my-new-feature`)
+ 5. Create a new Pull Request
@@ -0,0 +1,7 @@
+ @files=[]
+
+ task :default do
+   system("rake -T")
+ end
+
+ require "logstash/devutils/rake"
@@ -0,0 +1,387 @@
+ # encoding: utf-8
+
+ #
+ # Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+ #
+ # Licensed under the Amazon Software License (the "License").
+ # You may not use this file except in compliance with the License.
+ # A copy of the License is located at
+ #
+ # http://aws.amazon.com/asl/
+ #
+ # or in the "license" file accompanying this file. This file is distributed
+ # on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either
+ # express or implied. See the License for the specific language governing
+ # permissions and limitations under the License.
+
+ require "logstash/outputs/base"
+ require "logstash/namespace"
+ require "logstash/plugin_mixins/aws_config"
+
+ require "time"
+
+ # This output lets you send log data to the AWS CloudWatch Logs service
+ #
+ class LogStash::Outputs::CloudWatchLogs < LogStash::Outputs::Base
+
+   include LogStash::PluginMixins::AwsConfig::V2
+
+   config_name "cloudwatchlogs"
+
+   # Constants
+   LOG_GROUP_NAME = "log_group_name"
+   LOG_STREAM_NAME = "log_stream_name"
+   SEQUENCE_TOKEN = "sequence_token"
+   TIMESTAMP = "@timestamp"
+   MESSAGE = "message"
+
+   PER_EVENT_OVERHEAD = 26
+   MAX_BATCH_SIZE = 1024 * 1024
+   MAX_BATCH_COUNT = 10000
+   MAX_DISTANCE_BETWEEN_EVENTS = 86400 * 1000
+   MIN_DELAY = 0.2
+   MIN_BUFFER_DURATION = 5000
+   # Backoff up to 64 seconds upon failure
+   MAX_BACKOFF_IN_SECOND = 64
+
+   # The destination log group
+   config :log_group_name, :validate => :string, :required => true
+
+   # The destination log stream
+   config :log_stream_name, :validate => :string, :required => true
+
+   # The max number of log events in a batch.
+   config :batch_count, :validate => :number, :default => MAX_BATCH_COUNT
+
+   # The max size of log events in a batch.
+   config :batch_size, :validate => :number, :default => MAX_BATCH_SIZE
+
+   # The amount of time to batch log events, in milliseconds.
+   config :buffer_duration, :validate => :number, :default => 5000
+
+   # The max number of batches to buffer.
+   # A log event is added to the current batch first. When the batch is full or
+   # has been open for more than buffer_duration milliseconds, it is added to a
+   # queue that is consumed by a separate thread. When both the batch and the
+   # queue are full, the add operation blocks.
+   config :queue_size, :validate => :number, :default => 5
+
+   # Print out the log events to stdout
+   config :dry_run, :validate => :boolean, :default => false
+
+   attr_accessor :sequence_token, :last_flush, :cwl
+
+   # Only accessed by tests
+   attr_reader :buffer
+
+   public
+   def register
+     require "aws-sdk"
+     @cwl = Aws::CloudWatchLogs::Client.new(aws_options_hash)
+
+     if @batch_count > MAX_BATCH_COUNT
+       @logger.warn(":batch_count exceeds the max number of log events. Use #{MAX_BATCH_COUNT} instead.")
+       @batch_count = MAX_BATCH_COUNT
+     end
+     if @batch_size > MAX_BATCH_SIZE
+       @logger.warn(":batch_size exceeds the max size of log events. Use #{MAX_BATCH_SIZE} instead.")
+       @batch_size = MAX_BATCH_SIZE
+     end
+     if @buffer_duration < MIN_BUFFER_DURATION
+       @logger.warn(":buffer_duration is smaller than the min value. Use #{MIN_BUFFER_DURATION} instead.")
+       @buffer_duration = MIN_BUFFER_DURATION
+     end
+     @sequence_token = nil
+     @last_flush = Time.now.to_f
+     @buffer = Buffer.new(
+       max_batch_count: batch_count, max_batch_size: batch_size,
+       buffer_duration: @buffer_duration, out_queue_size: @queue_size, logger: @logger,
+       size_of_item_proc: Proc.new {|event| event[:message].bytesize + PER_EVENT_OVERHEAD})
+     @publisher = Thread.new do
+       @buffer.deq do |batch|
+         flush(batch)
+       end
+     end
+   end # def register
+
+   public
+   def receive(event)
+     return unless output?(event)
+
+     if event == LogStash::SHUTDOWN
+       @buffer.close
+       @publisher.join
+       @logger.info("CloudWatch Logs output plugin shutdown.")
+       finished
+       return
+     end
+     return if invalid?(event)
+
+     @buffer.enq(
+       {:timestamp => event.timestamp.time.to_f*1000,
+        :message => event[MESSAGE] })
+   end # def receive
+
+   public
+   def teardown
+     @logger.info("Going to clean up resources")
+     @buffer.close
+     @publisher.join
+     @cwl = nil
+     finished
+   end # def teardown
+
+   public
+   def flush(events)
+     return if events.nil? or events.empty?
+     log_event_batches = prepare_log_events(events)
+     log_event_batches.each do |log_events|
+       put_log_events(log_events)
+     end
+   end
+
+   private
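+   # Sends one batch to CloudWatch Logs via PutLogEvents, pacing requests with
+   # MIN_DELAY. On InvalidSequenceTokenException or DataAlreadyAcceptedException
+   # it recovers the expected sequence token from the error message; on
+   # ResourceNotFoundException it creates the log group/stream and retries; on
+   # other errors it retries with exponential backoff up to MAX_BACKOFF_IN_SECOND.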
+   def put_log_events(log_events)
+     return if log_events.nil? or log_events.empty?
+     # Shouldn't send two requests within MIN_DELAY
+     delay = MIN_DELAY - (Time.now.to_f - @last_flush)
+     sleep(delay) if delay > 0
+     backoff = 1
+     begin
+       @logger.info("Sending #{log_events.size} events to #{@log_group_name}/#{@log_stream_name}")
+       @last_flush = Time.now.to_f
+       if @dry_run
+         log_events.each do |event|
+           puts event[:message]
+         end
+         return
+       end
+       response = @cwl.put_log_events(
+         :log_group_name => @log_group_name,
+         :log_stream_name => @log_stream_name,
+         :log_events => log_events,
+         :sequence_token => @sequence_token
+       )
+       @sequence_token = response.next_sequence_token
+     rescue Aws::CloudWatchLogs::Errors::InvalidSequenceTokenException => e
+       @logger.warn(e)
+       if /sequenceToken(?:\sis)?: ([^\s]+)/ =~ e.to_s
+         if $1 == 'null'
+           @sequence_token = nil
+         else
+           @sequence_token = $1
+         end
+         @logger.info("Will retry with new sequence token #{@sequence_token}")
+         retry
+       else
+         @logger.error("Cannot find sequence token from response")
+       end
+     rescue Aws::CloudWatchLogs::Errors::DataAlreadyAcceptedException => e
+       @logger.warn(e)
+       if /sequenceToken(?:\sis)?: ([^\s]+)/ =~ e.to_s
+         if $1 == 'null'
+           @sequence_token = nil
+         else
+           @sequence_token = $1
+         end
+         @logger.info("Data already accepted and no need to resend")
+       else
+         @logger.error("Cannot find sequence token from response")
+       end
+     rescue Aws::CloudWatchLogs::Errors::ResourceNotFoundException => e
+       @logger.info("Will create log group/stream and retry")
+       begin
+         @cwl.create_log_group(:log_group_name => @log_group_name)
+       rescue Aws::CloudWatchLogs::Errors::ResourceAlreadyExistsException => e
+         @logger.info("Log group #{@log_group_name} already exists")
+       rescue Exception => e
+         @logger.error(e)
+       end
+       begin
+         @cwl.create_log_stream(:log_group_name => @log_group_name, :log_stream_name => @log_stream_name)
+       rescue Aws::CloudWatchLogs::Errors::ResourceAlreadyExistsException => e
+         @logger.info("Log stream #{@log_stream_name} already exists")
+       rescue Exception => e
+         @logger.error(e)
+       end
+       retry
+     rescue Aws::CloudWatchLogs::Errors::InvalidParameterException => e
+       # swallow exception
+       @logger.error("Skip batch due to #{e}")
+     rescue Exception => e
+       if backoff * 2 <= MAX_BACKOFF_IN_SECOND
+         backoff = backoff * 2
+       end
+       @logger.error("Will retry for #{e} after #{backoff} seconds")
+       sleep backoff
+       retry
+     end
+   end # def put_log_events
+
+   private
+   def invalid?(event)
+     status = event[TIMESTAMP].nil? || event[MESSAGE].nil?
+     if status
+       @logger.warn("Skipping invalid event #{event.to_hash}")
+     end
+     return status
+   end
+
+   private
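+   # Sorts events by timestamp and splits them into batches so that no single
+   # batch spans more than MAX_DISTANCE_BETWEEN_EVENTS (24 hours), which is the
+   # maximum time range PutLogEvents accepts in one request.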
+   def prepare_log_events(events)
+     log_events = events.sort {|e1,e2| e1[:timestamp] <=> e2[:timestamp]}
+     batches = []
+     if log_events[-1][:timestamp] - log_events[0][:timestamp] > MAX_DISTANCE_BETWEEN_EVENTS
+       temp_batch = []
+       log_events.each do |log_event|
+         if temp_batch.empty? || log_event[:timestamp] - temp_batch[0][:timestamp] <= MAX_DISTANCE_BETWEEN_EVENTS
+           temp_batch << log_event
+         else
+           batches << temp_batch
+           temp_batch = []
+           temp_batch << log_event
+         end
+       end
+       if not temp_batch.empty?
+         batches << temp_batch
+       end
+     else
+       batches << log_events
+     end
+     batches
+   end
+
+   ##
+   # This class buffers a series of single items into batches and puts the
+   # batches on a SizedQueue for consumption.
+   # A buffer consists of an ongoing batch and an out queue. An item is added
+   # to the ongoing batch first; once the batch is ready for consumption it is
+   # pushed onto the out queue and a new, empty batch is started.
+   # The ongoing batch becomes ready for consumption when adding one more item
+   # would exceed *max_batch_count* or *max_batch_size*, or when the batch has
+   # been open for more than *buffer_duration* milliseconds and holds at least
+   # one item.
+
+   class Buffer
+
+     CLOSE_BATCH = :close
+
+     attr_reader :in_batch, :in_count, :in_size, :out_queue
+
+     # Creates a new buffer
+     def initialize(options = {})
+       @max_batch_count = options.fetch(:max_batch_count)
+       @max_batch_size = options.fetch(:max_batch_size)
+       @buffer_duration = options.fetch(:buffer_duration)
+       @out_queue_size = options.fetch(:out_queue_size, 10)
+       @logger = options.fetch(:logger, nil)
+       @size_of_item_proc = options.fetch(:size_of_item_proc)
+       @in_batch = Array.new
+       @in_count = 0
+       @in_size = 0
+       @out_queue = SizedQueue.new(@out_queue_size)
+       @batch_update_mutex = Mutex.new
+       @last_batch_time = Time.now
+       if @buffer_duration > 0
+         @scheduled_batcher = Thread.new do
+           loop do
+             sleep(@buffer_duration / 1000.0)
+             enq(:scheduled)
+           end
+         end
+       end
+     end
+
+     # Enqueues an item to the buffer
+     #
+     # * If the ongoing batch is not full with this addition, adds the item to the batch.
+     # * If the ongoing batch is exactly full with this addition, adds the item to the batch and adds the batch to the out queue.
+     # * If the ongoing batch would overflow with this addition, adds the batch to the out queue first,
+     #   and then adds the item to the new batch.
+     def enq(item)
+       @batch_update_mutex.synchronize do
+         if item == :scheduled || item == :close
+           add_current_batch_to_out_queue(item)
+           return
+         end
+         status = try_add_item(item)
+         if status != 0
+           add_current_batch_to_out_queue(:add)
+           if status == -1
+             try_add_item(item)
+           end
+         end
+       end
+     end
+
+     # Closes the buffer
+     #
+     # Adds the current batch to the queue and then adds CLOSE_BATCH to the queue.
+     # Waits until the consumer completes.
+     def close
+       while @in_size != 0 do
+         enq(:close)
+         sleep(1)
+       end
+       @out_queue.enq(CLOSE_BATCH)
+     end
+
+     # Dequeues batches that are ready for consumption
+     #
+     # The caller blocks on this call until the buffer is closed.
+     def deq(&proc)
+       loop do
+         batch = @out_queue.deq
+         if batch == CLOSE_BATCH
+           break
+         end
+         proc.call(batch)
+       end
+     end
+
+     private
+     # Tries to add an item to the buffer
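+     # Returns 0 if the item was added and the batch can still take more,
+     # 1 if the item was added and the batch is now full, and -1 if the item
+     # does not fit and the current batch must be flushed first.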
+     def try_add_item(item)
+       item_size = @size_of_item_proc.call(item)
+       if @in_count + 1 == @max_batch_count ||
+           @in_size + item_size == @max_batch_size
+         # accept item, but can't accept more items
+         add_item(item)
+         return 1
+       elsif @in_size + item_size > @max_batch_size
+         # cannot accept item
+         return -1
+       else
+         add_item(item)
+         # accept item, and may accept next item
+         return 0
+       end
+     end
+
+     # Adds item to batch
+     def add_item(item)
+       @in_batch << item
+       @in_count += 1
+       @in_size += @size_of_item_proc.call(item)
+     end
+
+     # Adds batch to out queue
+     def add_current_batch_to_out_queue(from)
+       if from == :scheduled && (Time.now - @last_batch_time) * 1000 < @buffer_duration
+         return
+       end
+       if @in_batch.size == 0
+         @last_batch_time = Time.now
+         return
+       end
+       @logger.debug("Added batch with #{in_count} items in #{in_size} by #{from}") if @logger
+       @out_queue.enq(@in_batch)
+       @in_batch = Array.new
+       @in_count = 0
+       @in_size = 0
+       @last_batch_time = Time.now
+     end
+
+   end
+
+ end # class LogStash::Outputs::CloudWatchLogs