logstash-output-sqs 3.0.2 → 4.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: 5099a8853aa054d817e2a5e96d48f9d0e9f0c900
- data.tar.gz: b9220f0a03ac674117ebc0eecfd5b970051297fd
+ metadata.gz: 4a12c855c89c8c21ea8ce285b37a6634d49edc2e
+ data.tar.gz: f2f7443979136609b06cba6f4fad00a7d279b882
  SHA512:
- metadata.gz: 7558caee7aa481c71ae522900f8a000df55715553acf5e6660894ffb3f556a1aa6b067be9473662ce5e8146c62162e165e03d7ec8b1d9314a46c3bdb761ac52e
- data.tar.gz: d766b61f6235c8996552671f9ba2815b6e80240c3d389e372427d3450ba80ef360ba88be923f49338d9beec06e74f30780eaca9330ada4183b5fac419ea5f94c
+ metadata.gz: 3b0fef2f371a8d541e45690cb958461237ec7413ddee50585427e56ee9aefd42b1f4a53fb2be3b911ebf627f78f241f42fdfc444b3fbb0cad4f12b28340ec529
+ data.tar.gz: fb92b8e9f07c111a7d0cc7228bba59059a77b8bc7ab1adea5429b0586047d614cc6292d7a49408c717d780f6806d11603494bd081c75aaaeb1ddff278bd92db4
@@ -1,3 +1,13 @@
+ ## 4.0.0
+ - Add unit and integration tests.
+ - Adjust the sample IAM policy in the documentation, removing actions which are not actually required by the plugin. Specifically, the following actions are not required: `sqs:ChangeMessageVisibility`, `sqs:ChangeMessageVisibilityBatch`, `sqs:GetQueueAttributes` and `sqs:ListQueues`.
+ - Dynamically adjust the batch message size. SQS allows up to 10 messages to be published in a single batch; however, the total size of the batch is limited to 256KiB (see [Limits in Amazon SQS](http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/limits-messages.html)). This plugin will now dynamically adjust the number of events included in each batch to ensure that the total batch size does not exceed `message_max_size`. Note that any single message which exceeds the 256KiB size limit will be dropped (a standalone sketch of this splitting rule follows this changelog excerpt).
+ - Move to the new concurrency model, `:shared`.
+ - The `batch_timeout` parameter has been deprecated because it no longer has any effect.
+ - The individual (non-batch) mode of operation (i.e. `batch => false`) has been deprecated. Batch mode is vastly more performant and we do not believe that there are any use cases which require non-batch mode. You can emulate non-batch mode by setting `batch_events => 1`; with a batch size of 1 the plugin sends each event with an individual `sqs:SendMessage` call rather than `sqs:SendMessageBatch`.
+ - The plugin now implements `#multi_receive_encoded` and no longer uses `Stud::Buffer`.
+ - Update the AWS SDK to version 2.
+
  ## 3.0.2
  - Relax constraint on logstash-core-plugin-api to >= 1.60 <= 2.99

@@ -13,7 +23,7 @@
  # 2.0.3
  - New dependency requirements for logstash-core for the 5.0 release
  ## 2.0.0
- - Plugins were updated to follow the new shutdown semantic, this mainly allows Logstash to instruct input plugins to terminate gracefully,
+ - Plugins were updated to follow the new shutdown semantic, this mainly allows Logstash to instruct input plugins to terminate gracefully,
  instead of using Thread.raise on the plugins' threads. Ref: https://github.com/elastic/logstash/pull/3895
  - Dependency on logstash-core update to 2.0
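The 4.0.0 entry above describes how batches are split against the two SQS limits (at most 10 entries per `SendMessageBatch` call and at most 256KiB of total payload). The following standalone Ruby sketch illustrates that splitting rule only; the helper name `split_into_batches` and the hard-coded constants are illustrative and are not part of the plugin's API.

```ruby
# Standalone sketch of the batch-splitting rule described in the 4.0.0 notes.
# `split_into_batches` is an illustrative helper, not part of the plugin.
MAX_BATCH_ENTRIES = 10          # SQS limit on entries per SendMessageBatch call
MAX_BATCH_BYTES   = 256 * 1024  # SQS limit on the total payload of a batch

def split_into_batches(encoded_messages)
  batches = [[]]
  bytes = 0

  encoded_messages.each do |message|
    # Messages exceeding the per-message limit are dropped, as the changelog notes.
    next if message.bytesize > MAX_BATCH_BYTES

    # Start a new batch when adding this message would break either limit.
    if batches.last.size >= MAX_BATCH_ENTRIES || bytes + message.bytesize > MAX_BATCH_BYTES
      batches << []
      bytes = 0
    end

    batches.last << message
    bytes += message.bytesize
  end

  batches.reject(&:empty?)
end

# Example: 30 small messages plus one oversized message yield three batches of 10.
messages = Array.new(30) { 'x' * 100 } + ['y' * (300 * 1024)]
split_into_batches(messages).map(&:size) # => [10, 10, 10]
```

The plugin itself applies the same rule incrementally inside `#multi_receive_encoded`, flushing the current batch whenever adding the next encoded event would exceed either limit (see the source diff below).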
@@ -6,12 +6,13 @@ Contributors:
  * James Turnbull (jamtur01)
  * John Price (awesometown)
  * Jordan Sissel (jordansissel)
+ * Joshua Spence (joshuaspence)
+ * Jurgens du Toit (jrgns)
  * Kurt Hurtado (kurtado)
  * Pier-Hugues Pellerin (ph)
  * Richard Pijnenburg (electrical)
  * Sean Laurent (organicveggie)
  * Toby Collier (tobyw4n)
- * Jurgens du Toit (jrgns)

  Note: If you've sent us patches, bug reports, or otherwise contributed to
  Logstash, and you aren't on the list above and want to be, please let us know
@@ -1,136 +1,175 @@
  # encoding: utf-8
- require "logstash/outputs/base"
- require "logstash/namespace"
- require "logstash/plugin_mixins/aws_config"
- require "stud/buffer"
- require "digest/sha2"

- # Push events to an Amazon Web Services Simple Queue Service (SQS) queue.
+ require 'aws-sdk'
+ require 'logstash/errors'
+ require 'logstash/namespace'
+ require 'logstash/outputs/base'
+ require 'logstash/outputs/sqs/patch'
+ require 'logstash/plugin_mixins/aws_config'
+
+ # Forcibly load all modules marked to be lazily loaded.
  #
- # SQS is a simple, scalable queue system that is part of the
- # Amazon Web Services suite of tools.
+ # It is recommended that this is called prior to launching threads. See
+ # https://aws.amazon.com/blogs/developer/threading-with-the-aws-sdk-for-ruby/.
+ Aws.eager_autoload!
+
+ # Push events to an Amazon Web Services (AWS) Simple Queue Service (SQS) queue.
  #
- # Although SQS is similar to other queuing systems like AMQP, it
- # uses a custom API and requires that you have an AWS account.
- # See http://aws.amazon.com/sqs/ for more details on how SQS works,
- # what the pricing schedule looks like and how to setup a queue.
+ # SQS is a simple, scalable queue system that is part of the Amazon Web
+ # Services suite of tools. Although SQS is similar to other queuing systems
+ # such as Advanced Message Queuing Protocol (AMQP), it uses a custom API and
+ # requires that you have an AWS account. See http://aws.amazon.com/sqs/ for
+ # more details on how SQS works, what the pricing schedule looks like and how
+ # to set up a queue.
  #
- # To use this plugin, you *must*:
+ # The "consumer" identity must have the following permissions on the queue:
  #
- # * Have an AWS account
- # * Setup an SQS queue
- # * Create an identify that has access to publish messages to the queue.
+ # * `sqs:GetQueueUrl`
+ # * `sqs:SendMessage`
+ # * `sqs:SendMessageBatch`
  #
- # The "consumer" identity must have the following permissions on the queue:
+ # Typically, you should set up an IAM policy, create a user and apply the IAM
+ # policy to the user. See http://aws.amazon.com/iam/ for more details on
+ # setting up AWS identities. A sample policy is as follows:
+ #
+ # [source,json]
+ # {
+ #   "Version": "2012-10-17",
+ #   "Statement": [
+ #     {
+ #       "Effect": "Allow",
+ #       "Action": [
+ #         "sqs:GetQueueUrl",
+ #         "sqs:SendMessage",
+ #         "sqs:SendMessageBatch"
+ #       ],
+ #       "Resource": "arn:aws:sqs:us-east-1:123456789012:my-sqs-queue"
+ #     }
+ #   ]
+ # }
  #
- # * sqs:ChangeMessageVisibility
- # * sqs:ChangeMessageVisibilityBatch
- # * sqs:GetQueueAttributes
- # * sqs:GetQueueUrl
- # * sqs:ListQueues
- # * sqs:SendMessage
- # * sqs:SendMessageBatch
+ # = Batch Publishing
+ # This output publishes messages to SQS in batches in order to optimize event
+ # throughput and increase performance. This is done using the
+ # [`SendMessageBatch`](http://docs.aws.amazon.com/AWSSimpleQueueService/latest/APIReference/API_SendMessageBatch.html)
+ # API. When publishing messages to SQS in batches, the following service limits
+ # must be respected (see
+ # [Limits in Amazon SQS](http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/limits-messages.html)):
  #
- # Typically, you should setup an IAM policy, create a user and apply the IAM policy to the user.
- # A sample policy is as follows:
- # [source,ruby]
- # {
- #   "Statement": [
- #     {
- #       "Sid": "Stmt1347986764948",
- #       "Action": [
- #         "sqs:ChangeMessageVisibility",
- #         "sqs:ChangeMessageVisibilityBatch",
- #         "sqs:GetQueueAttributes",
- #         "sqs:GetQueueUrl",
- #         "sqs:ListQueues",
- #         "sqs:SendMessage",
- #         "sqs:SendMessageBatch"
- #       ],
- #       "Effect": "Allow",
- #       "Resource": [
- #         "arn:aws:sqs:us-east-1:200850199751:Logstash"
- #       ]
- #     }
- #   ]
- # }
+ # * The maximum allowed individual message size is 256KiB.
+ # * The maximum total payload size (i.e. the sum of the sizes of all
+ #   individual messages within a batch) is also 256KiB.
  #
- # See http://aws.amazon.com/iam/ for more details on setting up AWS identities.
+ # This plugin will dynamically adjust the size of the batch published to SQS in
+ # order to ensure that the total payload size does not exceed 256KiB.
+ #
+ # WARNING: This output cannot currently handle messages larger than 256KiB. Any
+ # single message exceeding this size will be dropped.
  #
  class LogStash::Outputs::SQS < LogStash::Outputs::Base
-   include LogStash::PluginMixins::AwsConfig
-   include Stud::Buffer
+   include LogStash::PluginMixins::AwsConfig::V2

-   config_name "sqs"
+   config_name 'sqs'
+   default :codec, 'json'

-   # Name of SQS queue to push messages into. Note that this is just the name of the queue, not the URL or ARN.
-   config :queue, :validate => :string, :required => true
+   concurrency :shared

-   # Set to true if you want send messages to SQS in batches with `batch_send`
-   # from the amazon sdk
-   config :batch, :validate => :boolean, :default => true
+   # Set to `true` to send messages to SQS in batches (with the
+   # `SendMessageBatch` API) or `false` to send messages to SQS individually
+   # (with the `SendMessage` API). The size of the batch is configurable via
+   # `batch_events`.
+   config :batch, :validate => :boolean, :default => true, :deprecated => true

-   # If `batch` is set to true, the number of events we queue up for a `batch_send`.
+   # The number of events to be sent in each batch. Set this to `1` to disable
+   # the batch sending of messages.
    config :batch_events, :validate => :number, :default => 10

-   # If `batch` is set to true, the maximum amount of time between `batch_send` commands when there are pending events to flush.
-   config :batch_timeout, :validate => :number, :default => 5
+   config :batch_timeout, :validate => :number, :deprecated => 'This setting no longer has any effect.'

-   public
-   def aws_service_endpoint(region)
-     return {
-       :sqs_endpoint => "sqs.#{region}.amazonaws.com"
-     }
-   end
+   # The maximum number of bytes for any message sent to SQS. Messages exceeding
+   # this size will be dropped. See
+   # http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/limits-messages.html.
+   config :message_max_size, :validate => :bytes, :default => '256KiB'

-   public
+   # The name of the target SQS queue. Note that this is just the name of the
+   # queue, not the URL or ARN.
+   config :queue, :validate => :string, :required => true
+
+   public
    def register
-     require "aws-sdk"
-
-     @sqs = AWS::SQS.new(aws_options_hash)
-
-     if @batch
-       if @batch_events > 10
-         raise RuntimeError.new(
-           "AWS only allows a batch_events parameter of 10 or less"
-         )
-       elsif @batch_events <= 1
-         raise RuntimeError.new(
-           "batch_events parameter must be greater than 1 (or its not a batch)"
-         )
-       end
-       buffer_initialize(
-         :max_items => @batch_events,
-         :max_interval => @batch_timeout,
-         :logger => @logger
-       )
-     end
+     @sqs = Aws::SQS::Client.new(aws_options_hash)
+
+     if @batch_events > 10
+       raise LogStash::ConfigurationError, 'The maximum batch size is 10 events'
+     elsif @batch_events < 1
+       raise LogStash::ConfigurationError, 'The batch size must be greater than 0'
+     end

      begin
-       @logger.debug("Connecting to AWS SQS queue '#{@queue}'...")
-       @sqs_queue = @sqs.queues.named(@queue)
-       @logger.info("Connected to AWS SQS queue '#{@queue}' successfully.")
-     rescue Exception => e
-       @logger.error("Unable to access SQS queue '#{@queue}': #{e.to_s}")
-     end # begin/rescue
-   end # def register
+       @logger.debug('Connecting to SQS queue', :queue => @queue, :region => region)
+       @queue_url = @sqs.get_queue_url(:queue_name => @queue)[:queue_url]
+       @logger.info('Connected to SQS queue successfully', :queue => @queue, :region => region)
+     rescue Aws::SQS::Errors::ServiceError => e
+       @logger.error('Failed to connect to SQS', :error => e)
+       raise LogStash::ConfigurationError, 'Verify the SQS queue name and your credentials'
+     end
+   end

    public
-   def receive(event)
-     if @batch
-       buffer_receive(event.to_json)
-       return
+   def multi_receive_encoded(encoded_events)
+     if @batch and @batch_events > 1
+       multi_receive_encoded_batch(encoded_events)
+     else
+       multi_receive_encoded_single(encoded_events)
      end
-     @sqs_queue.send_message(event.to_json)
-   end # def receive
+   end

-   # called from Stud::Buffer#buffer_flush when there are events to flush
-   def flush(events, close=false)
-     @sqs_queue.batch_send(events)
+   private
+   def multi_receive_encoded_batch(encoded_events)
+     bytes = 0
+     entries = []
+
+     # Split the events into multiple batches to ensure that no single batch
+     # exceeds `@message_max_size` bytes.
+     encoded_events.each_with_index do |encoded_event, index|
+       event, encoded = encoded_event
+
+       if encoded.bytesize > @message_max_size
+         @logger.warn('Message exceeds maximum length and will be dropped', :message_size => encoded.bytesize)
+         next
+       end
+
+       if entries.size >= @batch_events or (bytes + encoded.bytesize) > @message_max_size
+         send_message_batch(entries)
+
+         bytes = 0
+         entries = []
+       end
+
+       bytes += encoded.bytesize
+       entries.push(:id => index.to_s, :message_body => encoded)
+     end
+
+     send_message_batch(entries) unless entries.empty?
    end

-   public
-   def close
-     buffer_flush(:final => true)
-   end # def close
+   private
+   def multi_receive_encoded_single(encoded_events)
+     encoded_events.each do |encoded_event|
+       event, encoded = encoded_event
+
+       if encoded.bytesize > @message_max_size
+         @logger.warn('Message exceeds maximum length and will be dropped', :message_size => encoded.bytesize)
+         next
+       end
+
+       @sqs.send_message(:queue_url => @queue_url, :message_body => encoded)
+     end
+   end
+
+   private
+   def send_message_batch(entries)
+     @logger.debug("Publishing #{entries.size} messages to SQS", :queue_url => @queue_url, :entries => entries)
+     @sqs.send_message_batch(:queue_url => @queue_url, :entries => entries)
+   end
  end
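For reference, the unit specs further below drive this class directly rather than through a full Logstash pipeline. A minimal sketch of that style of usage follows; the queue name and region are placeholders, standard AWS credential resolution is assumed, and `batch_events => 1` is used to emulate the deprecated non-batch mode.

```ruby
# Minimal driver sketch (not part of the plugin); assumes AWS credentials are
# available via the usual environment or instance-profile mechanisms.
require 'logstash/outputs/sqs'
require 'logstash/event'
require 'logstash/json'

output = LogStash::Outputs::SQS.new(
  'queue'        => 'my-sqs-queue',  # queue name only, not the URL or ARN
  'region'       => 'us-east-1',
  'batch_events' => 1                # single-message sends via sqs:SendMessage
)
output.register  # resolves the queue URL with sqs:GetQueueUrl

event = LogStash::Event.new('message' => 'hello from logstash')
# The pipeline normally supplies [event, encoded] pairs; here we encode by hand.
output.multi_receive_encoded([[event, LogStash::Json.dump(event)]])
output.close
```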
@@ -0,0 +1,21 @@
+ # This patch was stolen from logstash-plugins/logstash-output-s3#102.
+ #
+ # This patch is a workaround for a JRuby issue which has been fixed in JRuby
+ # 9000, but not in JRuby 1.7. See https://github.com/jruby/jruby/issues/3645
+ # and https://github.com/jruby/jruby/issues/3920. This is necessary because the
+ # `aws-sdk` is doing tricky name discovery to generate the correct error class.
+ #
+ # As per https://github.com/aws/aws-sdk-ruby/issues/1301#issuecomment-261115960,
+ # this patch may be short-lived anyway.
+ require 'aws-sdk'
+
+ begin
+   old_stderr = $stderr
+   $stderr = StringIO.new
+
+   module Aws
+     const_set(:SQS, Aws::SQS)
+   end
+ ensure
+   $stderr = old_stderr
+ end
@@ -1,7 +1,7 @@
  Gem::Specification.new do |s|

    s.name = 'logstash-output-sqs'
-   s.version = '3.0.2'
+   s.version = '4.0.0'
    s.licenses = ['Apache License (2.0)']
    s.summary = "Push events to an Amazon Web Services Simple Queue Service (SQS) queue."
    s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
@@ -20,12 +20,10 @@ Gem::Specification.new do |s|
    s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }

    # Gem dependencies
-   s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
-   s.add_runtime_dependency "logstash-mixin-aws", ">= 1.0.0"
-
-   s.add_runtime_dependency 'aws-sdk'
-   s.add_runtime_dependency 'stud'
+   s.add_runtime_dependency 'logstash-core-plugin-api', '>= 1.60', '<= 2.99'
+   s.add_runtime_dependency 'logstash-mixin-aws', '>= 1.0.0'

+   s.add_development_dependency 'logstash-codec-json'
    s.add_development_dependency 'logstash-devutils'
  end

@@ -0,0 +1,92 @@
+ # encoding: utf-8
+
+ require_relative '../../spec_helper'
+ require 'logstash/event'
+ require 'logstash/json'
+ require 'securerandom'
+
+ describe LogStash::Outputs::SQS, :integration => true do
+   let(:config) do
+     {
+       'queue' => @queue_name,
+     }
+   end
+   subject { described_class.new(config) }
+
+   # Create an SQS queue with a random name.
+   before(:all) do
+     @sqs = Aws::SQS::Client.new
+     @queue_name = "logstash-output-sqs-#{SecureRandom.hex}"
+     @queue_url = @sqs.create_queue(:queue_name => @queue_name)[:queue_url]
+   end
+
+   # Destroy the SQS queue which was created in `before(:all)`.
+   after(:all) do
+     @sqs.delete_queue(:queue_url => @queue_url)
+   end
+
+   describe '#register' do
+     context 'with invalid credentials' do
+       let(:config) do
+         super.merge({
+           'access_key_id' => 'bad_access',
+           'secret_access_key' => 'bad_secret_key',
+         })
+       end
+
+       it 'raises a configuration error' do
+         expect { subject.register }.to raise_error(LogStash::ConfigurationError)
+       end
+     end
+
+     context 'with a nonexistent queue' do
+       let(:config) { super.merge('queue' => 'nonexistent-queue') }
+
+       it 'raises a configuration error' do
+         expect { subject.register }.to raise_error(LogStash::ConfigurationError)
+       end
+     end
+
+     context 'with valid configuration' do
+       it 'does not raise an error' do
+         expect { subject.register }.not_to raise_error
+       end
+     end
+   end
+
+   describe '#multi_receive_encoded' do
+     let(:sample_count) { 10 }
+     let(:sample_event) { LogStash::Event.new('message' => 'This is a message') }
+     let(:sample_events) do
+       sample_count.times.map do
+         [sample_event, LogStash::Json.dump(sample_event)]
+       end
+     end
+
+     before do
+       subject.register
+     end
+
+     after do
+       subject.close
+     end
+
+     context 'with batching disabled' do
+       let(:config) { super.merge('batch' => false) }
+
+       it 'publishes to SQS' do
+         subject.multi_receive_encoded(sample_events)
+         expect(receive_all_messages(@queue_url).count).to eq(sample_events.count)
+       end
+     end
+
+     context 'with batching enabled' do
+       let(:config) { super.merge('batch' => true) }
+
+       it 'publishes to SQS' do
+         subject.multi_receive_encoded(sample_events)
+         expect(receive_all_messages(@queue_url).count).to eq(sample_events.count)
+       end
+     end
+   end
+ end
@@ -0,0 +1,8 @@
+ # encoding: utf-8
+
+ require 'aws-sdk'
+ require 'logstash/devutils/rspec/spec_helper'
+ require 'logstash/outputs/sqs'
+ require_relative 'supports/helpers'
+
+ LogStash::Logging::Logger::configure_logging('debug') if ENV['DEBUG']
@@ -0,0 +1,20 @@
+ require 'aws-sdk'
+
+ # Retrieve all available messages from the specified queue.
+ #
+ # Rather than utilizing `Aws::SQS::QueuePoller` directly in order to poll an
+ # SQS queue for messages, this method retrieves and returns all messages that
+ # are able to be received from the specified SQS queue.
+ def receive_all_messages(queue_url, options = {})
+   options[:idle_timeout] ||= 0
+   options[:max_number_of_messages] ||= 10
+
+   messages = []
+   poller = Aws::SQS::QueuePoller.new(queue_url, options)
+
+   poller.poll do |received_messages|
+     messages.concat(received_messages)
+   end
+
+   messages
+ end
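The helper above is shared by the integration specs. A hypothetical standalone usage might look like the following; the region, queue name and credentials are assumptions, and the helper file above is assumed to have been loaded already.

```ruby
# Hypothetical use of the receive_all_messages helper defined above; the queue
# name, region and credentials are placeholders.
require 'aws-sdk'

sqs = Aws::SQS::Client.new(:region => 'us-east-1')
queue_url = sqs.get_queue_url(:queue_name => 'my-sqs-queue')[:queue_url]

# Drain whatever is currently visible on the queue and print the message bodies.
receive_all_messages(queue_url, :client => sqs, :max_number_of_messages => 10).each do |message|
  puts message.body
end
```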
@@ -0,0 +1,263 @@
+ # encoding: utf-8
+
+ require_relative '../../spec_helper'
+ require 'logstash/errors'
+ require 'logstash/event'
+ require 'logstash/json'
+
+ describe LogStash::Outputs::SQS do
+   let(:config) do
+     {
+       'queue' => queue_name,
+       'region' => region,
+     }
+   end
+   let(:queue_name) { 'my-sqs-queue' }
+   let(:queue_url) { "https://sqs.#{region}.amazonaws.com/123456789012/#{queue_name}" }
+   let(:region) { 'us-east-1' }
+
+   let(:sqs) { Aws::SQS::Client.new(:stub_responses => true) }
+   subject { described_class.new(config) }
+
+   describe '#register' do
+     context 'with a batch size that is too large' do
+       let(:config) { super.merge('batch_events' => 100) }
+
+       before do
+         allow(Aws::SQS::Client).to receive(:new).and_return(sqs)
+       end
+
+       it 'raises a configuration error' do
+         expect { subject.register }.to raise_error(LogStash::ConfigurationError)
+       end
+     end
+
+     context 'with a batch size that is too small' do
+       let(:config) { super.merge('batch_events' => 0) }
+
+       before do
+         allow(Aws::SQS::Client).to receive(:new).and_return(sqs)
+       end
+
+       it 'raises a configuration error' do
+         expect { subject.register }.to raise_error(LogStash::ConfigurationError)
+       end
+     end
+
+     context 'without a queue' do
+       let(:config) { Hash.new }
+
+       it 'raises a configuration error' do
+         expect { subject.register }.to raise_error(LogStash::ConfigurationError)
+       end
+     end
+
+     context 'with a nonexistent queue' do
+       before do
+         expect(Aws::SQS::Client).to receive(:new).and_return(sqs)
+         expect(sqs).to receive(:get_queue_url).with(:queue_name => queue_name) do
+           raise Aws::SQS::Errors::NonExistentQueue.new(nil, 'The specified queue does not exist for this wsdl version.')
+         end
+       end
+
+       it 'raises a configuration error' do
+         expect { subject.register }.to raise_error(LogStash::ConfigurationError)
+       end
+     end
+
+     context 'with a valid queue' do
+       before do
+         expect(Aws::SQS::Client).to receive(:new).and_return(sqs)
+         expect(sqs).to receive(:get_queue_url).with(:queue_name => queue_name).and_return(:queue_url => queue_url)
+       end
+
+       it 'does not raise an error' do
+         expect { subject.register }.not_to raise_error
+       end
+     end
+   end
+
+   describe '#multi_receive_encoded' do
+     before do
+       expect(Aws::SQS::Client).to receive(:new).and_return(sqs)
+       expect(sqs).to receive(:get_queue_url).with(:queue_name => queue_name).and_return(:queue_url => queue_url)
+       subject.register
+     end
+
+     after do
+       subject.close
+     end
+
+     let(:sample_count) { 10 }
+     let(:sample_event) { LogStash::Event.new('message' => 'This is a message') }
+     let(:sample_event_encoded) { LogStash::Json.dump(sample_event) }
+     let(:sample_events) do
+       sample_count.times.map do
+         [sample_event, sample_event_encoded]
+       end
+     end
+
+     context 'with batching disabled using the `batch` parameter' do
+       let(:config) { super.merge('batch' => false) }
+
+       it 'should call send_message' do
+         expect(sqs).to receive(:send_message).with(:queue_url => queue_url, :message_body => sample_event_encoded).exactly(sample_count).times
+         subject.multi_receive_encoded(sample_events)
+       end
+
+       it 'should not call send_message_batch' do
+         expect(sqs).not_to receive(:send_message_batch)
+         subject.multi_receive_encoded(sample_events)
+       end
+     end
+
+     context 'with batching disabled' do
+       let(:config) do
+         super.merge({
+           'batch' => true,
+           'batch_events' => 1,
+         })
+       end
+
+       it 'should call send_message' do
+         expect(sqs).to receive(:send_message).with(:queue_url => queue_url, :message_body => sample_event_encoded).exactly(sample_count).times
+         subject.multi_receive_encoded(sample_events)
+       end
+
+       it 'should not call send_message_batch' do
+         expect(sqs).not_to receive(:send_message_batch)
+         subject.multi_receive_encoded(sample_events)
+       end
+     end
+
+     context 'with batching enabled' do
+       let(:batch_events) { 3 }
+       let(:config) do
+         super.merge({
+           'batch' => true,
+           'batch_events' => batch_events,
+         })
+       end
+
+       let(:sample_batches) do
+         sample_events.each_slice(batch_events).each_with_index.map do |sample_batch, batch_index|
+           sample_batch.each_with_index.map do |encoded_event, index|
+             event, encoded = encoded_event
+             {
+               :id => (batch_index * batch_events + index).to_s,
+               :message_body => encoded,
+             }
+           end
+         end
+       end
+
+       it 'should call send_message_batch' do
+         expect(sqs).to receive(:send_message_batch).at_least(:once)
+         subject.multi_receive_encoded(sample_events)
+       end
+
+       it 'should batch events' do
+         sample_batches.each do |batch_entries|
+           expect(sqs).to receive(:send_message_batch).with(:queue_url => queue_url, :entries => batch_entries)
+         end
+
+         subject.multi_receive_encoded(sample_events)
+       end
+     end
+
+     context 'with empty payload' do
+       let(:sample_count) { 0 }
+
+       it 'does not raise an error' do
+         expect { subject.multi_receive_encoded(sample_events) }.not_to raise_error
+       end
+
+       it 'should not send any messages' do
+         expect(sqs).not_to receive(:send_message)
+         expect(sqs).not_to receive(:send_message_batch)
+         subject.multi_receive_encoded(sample_events)
+       end
+     end
+
+     context 'with event exceeding maximum size' do
+       let(:config) { super.merge('message_max_size' => message_max_size) }
+       let(:message_max_size) { 1024 }
+
+       let(:sample_count) { 1 }
+       let(:sample_event) { LogStash::Event.new('message' => '.' * message_max_size) }
+
+       it 'should drop event' do
+         expect(sqs).not_to receive(:send_message)
+         expect(sqs).not_to receive(:send_message_batch)
+         subject.multi_receive_encoded(sample_events)
+       end
+     end
+
+     context 'with large batch' do
+       let(:batch_events) { 4 }
+       let(:config) do
+         super.merge({
+           'batch_events' => batch_events,
+           'message_max_size' => message_max_size,
+         })
+       end
+       let(:message_max_size) { 1024 }
+
+       let(:sample_events) do
+         # This is the overhead associated with transmitting each message. The
+         # overhead is caused by metadata (such as the `message` field name and
+         # the `@timestamp` field) as well as additional characters as a result
+         # of encoding the event.
+         overhead = 69
+
+         events = [
+           LogStash::Event.new('message' => 'a' * (0.6 * message_max_size - overhead)),
+           LogStash::Event.new('message' => 'b' * (0.5 * message_max_size - overhead)),
+           LogStash::Event.new('message' => 'c' * (0.5 * message_max_size - overhead)),
+           LogStash::Event.new('message' => 'd' * (0.4 * message_max_size - overhead)),
+         ]
+
+         events.map do |event|
+           [event, LogStash::Json.dump(event)]
+         end
+       end
+
+       let(:sample_batches) do
+         [
+           [
+             {
+               :id => '0',
+               :message_body => sample_events[0][1],
+             },
+           ],
+           [
+             {
+               :id => '1',
+               :message_body => sample_events[1][1],
+             },
+             {
+               :id => '2',
+               :message_body => sample_events[2][1],
+             },
+           ],
+           [
+             {
+               :id => '3',
+               :message_body => sample_events[3][1],
+             },
+           ],
+         ]
+       end
+
+       it 'should split events into smaller batches' do
+         sample_batches.each do |entries|
+           expect(sqs).to receive(:send_message_batch).with(:queue_url => queue_url, :entries => entries)
+         end
+
+         subject.multi_receive_encoded(sample_events)
+       end
+     end
+   end
+ end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-output-sqs
  version: !ruby/object:Gem::Version
- version: 3.0.2
+ version: 4.0.0
  platform: ruby
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2016-07-14 00:00:00.000000000 Z
+ date: 2017-01-10 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
@@ -50,23 +50,9 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
- name: aws-sdk
+ name: logstash-codec-json
  prerelease: false
- type: :runtime
- version_requirements: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: '0'
- - !ruby/object:Gem::Dependency
- requirement: !ruby/object:Gem::Requirement
- requirements:
- - - ">="
- - !ruby/object:Gem::Version
- version: '0'
- name: stud
- prerelease: false
- type: :runtime
+ type: :development
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
@@ -99,8 +85,12 @@ files:
  - NOTICE.TXT
  - README.md
  - lib/logstash/outputs/sqs.rb
+ - lib/logstash/outputs/sqs/patch.rb
  - logstash-output-sqs.gemspec
- - spec/outputs/sqs_spec.rb
+ - spec/integration/outputs/sqs_spec.rb
+ - spec/spec_helper.rb
+ - spec/supports/helpers.rb
+ - spec/unit/outputs/sqs_spec.rb
  homepage: http://www.elastic.co/guide/en/logstash/current/index.html
  licenses:
  - Apache License (2.0)
@@ -123,9 +113,12 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.6.3
+ rubygems_version: 2.4.8
  signing_key:
  specification_version: 4
  summary: Push events to an Amazon Web Services Simple Queue Service (SQS) queue.
  test_files:
- - spec/outputs/sqs_spec.rb
+ - spec/integration/outputs/sqs_spec.rb
+ - spec/spec_helper.rb
+ - spec/supports/helpers.rb
+ - spec/unit/outputs/sqs_spec.rb
@@ -1,5 +0,0 @@
- # encoding: utf-8
- require "logstash/devutils/rspec/spec_helper"
- require 'logstash/outputs/sqs'
-
-