logstash-input-aws-s3-sns-sqs 1.1.2

checksums.yaml.gz ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+   metadata.gz: d021893574e85e66adfaf76e5204ad66420e7143
+   data.tar.gz: 4ef9767ee60fe866c9cb7b3537e0e2fec8a5f7ef
+ SHA512:
+   metadata.gz: 40179397b04cdcd863485224e801c83e64b2038f5e898ab6518706cfc53c155b2a80c814f461ad9c4bc9651f6e0f0f5c027ef4b1f5fccbce57f12ea644e7e448
+   data.tar.gz: c115323d783a5cdd8c77861ed63a6bb77b4ac3d29ccc9a7dd4845b3ca53766810d1aaef67724d357f202cc532173ea4d28965c2b05e995013645141a82fd90ee
data/CHANGELOG.md ADDED
@@ -0,0 +1,22 @@
+ ## 1.1.2
+ - Fix a bug in the S3 key generation
+ - Enable shipping through an SNS topic (needs an additional JSON decode of the SNS envelope)
+ ## 1.1.1
+ - Added the ability to remove objects from S3 after processing.
+ - Work around an issue with the Ruby autoload that causes "uninitialized constant `Aws::Client::Errors`" errors.
+
+ ## 1.1.0
+ - Logstash 5 compatibility
+
+ ## 1.0.3
+ - Added some metadata to the event (bucket and object name, as contributed by joshuaspence)
+ - Also try to unzip files ending with ".gz" (ALB logs are gzipped but not marked with a proper Content-Encoding)
+
+ ## 1.0.2
+ - Fix for broken UTF-8, so we won't lose a whole S3 log file because of a single invalid line (Ruby's split dies on those)
+
+ ## 1.0.1
+ - Same as 1.0.0 (re-released because of a botched rubygems.org release)
+
+ ## 1.0.0
+ - Initial release
data/CONTRIBUTORS ADDED
@@ -0,0 +1,13 @@
+ The following is a list of people who have contributed ideas, code, bug
+ reports, or in general have helped logstash along its way.
+
+ Contributors:
+ * cherweg (this fork + some bug fixes)
+ * joshuaspence (event metadata)
+ * Heiko-san (initial contributor)
+ * logstash-input-sqs plugin as code base
+
+ Note: If you've sent us patches, bug reports, or otherwise contributed to
+ Logstash, and you aren't on the list above and want to be, please let us know
+ and we'll make sure you're here. Contributions from folks like you are what make
+ open source awesome.
data/Gemfile ADDED
@@ -0,0 +1,2 @@
+ source 'https://rubygems.org'
+ gemspec
data/LICENSE ADDED
@@ -0,0 +1,13 @@
+ Copyright (c) 2012-2015 Elasticsearch <http://www.elastic.co>
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/NOTICE.TXT ADDED
@@ -0,0 +1,5 @@
+ Elasticsearch
+ Copyright 2012-2015 Elasticsearch
+
+ This product includes software developed by The Apache Software
+ Foundation (http://www.apache.org/).
data/README.md ADDED
@@ -0,0 +1,147 @@
+ # Logstash Plugin
+
+ This is a plugin for [Logstash](https://github.com/elastic/logstash).
+
+ It is fully free and fully open source. The license is Apache 2.0.
+
+ ## Documentation
+
+ Get logs from AWS S3 buckets, as announced by object-created events delivered via SQS.
+
+ This plugin is based on the logstash-input-sqs plugin but doesn't log the SQS event itself.
+ Instead it assumes that the event is an S3 object-created event and will then download
+ and process the referenced file.
+
+ Some issues of logstash-input-sqs, like Logstash not shutting down properly, have been
+ fixed in this plugin.
+
+ In contrast to logstash-input-sqs, this plugin uses the "Receive Message Wait Time"
+ configured for the SQS queue in question; a good value is something like 10 seconds,
+ to ensure a reasonable shutdown time for Logstash.
+ Also use a "Default Visibility Timeout" that is high enough for log files to be downloaded
+ and processed (5-10 minutes should be a good value for most use cases). The plugin will
+ avoid removing the event from the queue if the associated log file couldn't be correctly
+ passed to the processing stage of Logstash (e.g. the downloaded content size doesn't match the SQS event).
+
+ This plugin is meant for high-availability setups: in contrast to logstash-input-s3 you can safely
+ use multiple Logstash nodes, since the use of SQS ensures that each log file is processed
+ only once and no file gets lost on node failure or downscaling of auto-scaling groups.
+ (You should use a "Message Retention Period" >= 4 days for your SQS queue to ensure you can survive
+ a weekend of faulty log file processing.)
+ The plugin will not delete objects from S3 buckets by default, so make sure to configure a reasonable "Lifecycle"
+ for your buckets, which should keep the files for at least "Message Retention Period" days.
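+
+ For illustration, these queue settings could be applied with the AWS CLI (the queue URL is a placeholder; 10 s long polling, 10 min visibility timeout, 4 days retention):
+ ```sh
+ aws sqs set-queue-attributes \
+   --queue-url https://sqs.us-east-1.amazonaws.com/123456789012/my-elb-log-queue \
+   --attributes '{"ReceiveMessageWaitTimeSeconds":"10","VisibilityTimeout":"600","MessageRetentionPeriod":"345600"}'
+ ```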
+
+ A typical setup will contain some S3 buckets containing ELB, CloudTrail or other log files.
+ These will be configured to send object-created events to an SQS queue, which will be configured
+ as the source queue for this plugin, as shown in the sketch below.
+ (The plugin supports gzipped content if it is marked with "Content-Encoding: gzip", as is the
+ case for CloudTrail logs.)
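+
+ As a sketch, a bucket can be wired to the queue with the AWS CLI (bucket name and queue ARN are placeholders; when shipping through an SNS topic instead, point the bucket notification at the topic and subscribe the queue to it):
+ ```sh
+ aws s3api put-bucket-notification-configuration \
+   --bucket my-elb-logs \
+   --notification-configuration '{
+     "QueueConfigurations": [
+       { "QueueArn": "arn:aws:sqs:us-east-1:123456789012:my-elb-log-queue",
+         "Events": ["s3:ObjectCreated:*"] }
+     ]
+   }'
+ ```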
+
+ The Logstash node therefore needs SQS permissions plus the permissions to download objects
+ from the S3 buckets that send events to the queue.
+ (If Logstash nodes are running on EC2 you should use an IAM instance role to provide permissions.)
+ ```json
+ {
+   "Version": "2012-10-17",
+   "Statement": [
+     {
+       "Effect": "Allow",
+       "Action": [
+         "sqs:Get*",
+         "sqs:List*",
+         "sqs:ReceiveMessage",
+         "sqs:ChangeMessageVisibility*",
+         "sqs:DeleteMessage*"
+       ],
+       "Resource": [
+         "arn:aws:sqs:us-east-1:123456789012:my-elb-log-queue"
+       ]
+     },
+     {
+       "Effect": "Allow",
+       "Action": [
+         "s3:Get*",
+         "s3:List*",
+         "s3:DeleteObject"
+       ],
+       "Resource": [
+         "arn:aws:s3:::my-elb-logs",
+         "arn:aws:s3:::my-elb-logs/*"
+       ]
+     }
+   ]
+ }
+ ```
+
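+ A minimal pipeline configuration sketch for this input (queue name and region are placeholders; `region` and credentials come via the standard Logstash AWS mixin, and `delete_on_success` is optional, defaulting to false):
+ ```
+ input {
+   s3sqs {
+     queue             => "my-elb-log-queue"   # SQS queue name (not the URL or ARN)
+     region            => "us-east-1"          # provided by the AWS config mixin
+     delete_on_success => false                # keep objects in S3 after processing
+   }
+ }
+ ```
+ Each decoded line becomes an event carrying `[@metadata][s3_bucket_name]` and `[@metadata][s3_object_key]`, which can be referenced in later filter or output stages.
+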
+ ## Need Help?
+
+ Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
+
+ ## Developing
+
+ ### 1. Plugin Development and Testing
+
+ #### Code
+ - To get started, you'll need JRuby with the Bundler gem installed.
+
+ - Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).
+
+ - Install dependencies
+ ```sh
+ bundle install
+ ```
+
+ #### Test
+
+ - Update your dependencies
+
+ ```sh
+ bundle install
+ ```
+
+ - Run tests
+
+ ```sh
+ bundle exec rspec
+ ```
+
+ ### 2. Running your unpublished Plugin in Logstash
+
+ #### 2.1 Run in a local Logstash clone
+
+ - Edit Logstash `Gemfile` and add the local plugin path, for example:
+ ```ruby
+ gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
+ ```
+ - Install plugin
+ ```sh
+ bin/plugin install --no-verify
+ ```
+ - Run Logstash with your plugin
+ ```sh
+ bin/logstash -e 'filter {awesome {}}'
+ ```
+ At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
+
+ #### 2.2 Run in an installed Logstash
+
+ You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory, or you can build the gem and install it using:
+
+ - Build your plugin gem
+ ```sh
+ gem build logstash-filter-awesome.gemspec
+ ```
+ - Install the plugin from the Logstash home
+ ```sh
+ bin/plugin install /your/local/plugin/logstash-filter-awesome.gem
+ ```
+ - Start Logstash and proceed to test the plugin
+
+ ## Contributing
+
+ All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
+
+ Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
+
+ It is more important to the community that you are able to contribute.
+
+ For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
data/lib/logstash/inputs/s3sqs.rb ADDED
@@ -0,0 +1,244 @@
+ # encoding: utf-8
+ #
+ require "logstash/inputs/threadable"
+ require "logstash/namespace"
+ require "logstash/timestamp"
+ require "logstash/plugin_mixins/aws_config"
+ require "logstash/errors"
+ require 'logstash/inputs/s3sqs/patch'
+ require 'json' # JSON.parse is used in handle_message
+ require 'zlib' # Zlib::GzipReader is used for gzipped S3 objects
+
+ Aws.eager_autoload!
+
+ # Get logs from AWS S3 buckets, as announced by object-created events delivered via SQS.
+ #
+ # This plugin is based on the logstash-input-sqs plugin but doesn't log the SQS event itself.
+ # Instead it assumes that the event is an S3 object-created event and will then download
+ # and process the referenced file.
+ #
+ # Some issues of logstash-input-sqs, like Logstash not shutting down properly, have been
+ # fixed in this plugin.
+ #
+ # In contrast to logstash-input-sqs, this plugin uses the "Receive Message Wait Time"
+ # configured for the SQS queue in question; a good value is something like 10 seconds,
+ # to ensure a reasonable shutdown time for Logstash.
+ # Also use a "Default Visibility Timeout" that is high enough for log files to be downloaded
+ # and processed (5-10 minutes should be a good value for most use cases). The plugin will
+ # avoid removing the event from the queue if the associated log file couldn't be correctly
+ # passed to the processing stage of Logstash (e.g. the downloaded content size doesn't match the SQS event).
+ #
+ # This plugin is meant for high-availability setups: in contrast to logstash-input-s3 you can safely
+ # use multiple Logstash nodes, since the use of SQS ensures that each log file is processed
+ # only once and no file gets lost on node failure or downscaling of auto-scaling groups.
+ # (You should use a "Message Retention Period" >= 4 days for your SQS queue to ensure you can survive
+ # a weekend of faulty log file processing.)
+ # The plugin will not delete objects from S3 buckets by default, so make sure to configure a reasonable "Lifecycle"
+ # for your buckets, which should keep the files for at least "Message Retention Period" days.
+ #
+ # A typical setup will contain some S3 buckets containing ELB, CloudTrail or other log files.
+ # These will be configured to send object-created events to an SQS queue, which will be configured
+ # as the source queue for this plugin.
+ # (The plugin supports gzipped content if it is marked with "Content-Encoding: gzip", as is the
+ # case for CloudTrail logs.)
+ #
+ # The Logstash node therefore needs SQS permissions plus the permissions to download objects
+ # from the S3 buckets that send events to the queue.
+ # (If Logstash nodes are running on EC2 you should use an IAM instance role to provide permissions.)
+ # [source,json]
+ # {
+ #   "Version": "2012-10-17",
+ #   "Statement": [
+ #     {
+ #       "Effect": "Allow",
+ #       "Action": [
+ #         "sqs:Get*",
+ #         "sqs:List*",
+ #         "sqs:ReceiveMessage",
+ #         "sqs:ChangeMessageVisibility*",
+ #         "sqs:DeleteMessage*"
+ #       ],
+ #       "Resource": [
+ #         "arn:aws:sqs:us-east-1:123456789012:my-elb-log-queue"
+ #       ]
+ #     },
+ #     {
+ #       "Effect": "Allow",
+ #       "Action": [
+ #         "s3:Get*",
+ #         "s3:List*",
+ #         "s3:DeleteObject"
+ #       ],
+ #       "Resource": [
+ #         "arn:aws:s3:::my-elb-logs",
+ #         "arn:aws:s3:::my-elb-logs/*"
+ #       ]
+ #     }
+ #   ]
+ # }
+ #
+ class LogStash::Inputs::S3SQS < LogStash::Inputs::Threadable
+   include LogStash::PluginMixins::AwsConfig::V2
+
+   BACKOFF_SLEEP_TIME = 1
+   BACKOFF_FACTOR = 2
+   MAX_TIME_BEFORE_GIVING_UP = 60
+   EVENT_SOURCE = 'aws:s3'
+   EVENT_TYPE = 'ObjectCreated'
+
+   config_name "s3sqs"
+
+   default :codec, "plain"
+
+   # Name of the SQS Queue to pull messages from. Note that this is just the name of the queue, not the URL or ARN.
+   config :queue, :validate => :string, :required => true
+
+   # Whether to delete files from S3 after processing.
+   config :delete_on_success, :validate => :boolean, :default => false
+
+   attr_reader :poller
+   attr_reader :s3
+
+   def register
+     require "aws-sdk"
+     require 'cgi'
+     @logger.info("Registering SQS input", :queue => @queue)
+     setup_queue
+   end
+
+   def setup_queue
+     aws_sqs_client = Aws::SQS::Client.new(aws_options_hash)
+     queue_url = aws_sqs_client.get_queue_url(:queue_name => @queue)[:queue_url]
+     @poller = Aws::SQS::QueuePoller.new(queue_url, :client => aws_sqs_client)
+     @s3 = Aws::S3::Client.new(aws_options_hash)
+   rescue Aws::SQS::Errors::ServiceError => e
+     @logger.error("Cannot establish connection to Amazon SQS", :error => e)
+     raise LogStash::ConfigurationError, "Verify the SQS queue name and your credentials"
+   end
+
+   def polling_options
+     {
+       # we will query 1 message at a time, so we can ensure correct error handling if we can't download a single file correctly
+       # (we will throw :skip_delete if the download size isn't correct, so that the event is processed again later
+       # -> set a reasonable "Default Visibility Timeout" for your queue, so that there's enough time to process the log files)
+       :max_number_of_messages => 1,
+       # we will use the queue's setting, a good value is 10 seconds
+       # (to ensure fast logstash shutdown on the one hand and few api calls on the other hand)
+       :wait_time_seconds => nil,
+     }
+   end
+
+   def handle_message(message, queue)
+     hash = JSON.parse message.body
+     @logger.debug("handle_message", :hash => hash, :message => message)
+     # there may be test events sent from the s3 bucket which won't contain a Records array;
+     # we will skip those events and remove them from the queue
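+     # Note: when the S3 event arrives via an SNS topic, the SQS body is an SNS envelope whose
+     # "Message" field carries the S3 event JSON as a string (roughly
+     # {"Type":"Notification","Message":"{\"Records\":[...]}"}), hence the second
+     # JSON.parse below; the envelope shape shown here is illustrative.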
+     if hash['Message'] then
+       # typically there will be only 1 record per event, but since it is an array we will
+       # treat it as if there could be more records
+       JSON.parse(hash['Message'])['Records'].each do |record|
+         @logger.info("We found a record", :record => record)
+         # in case there are any events with Records that aren't s3 object-created events and therefore can't be
+         # processed by this plugin, we will skip them and remove them from the queue
+         if record['eventSource'] == EVENT_SOURCE and record['eventName'].start_with?(EVENT_TYPE) then
+           @logger.debug("It is a valid record")
+           bucket = CGI.unescape(record['s3']['bucket']['name'])
+           key = CGI.unescape(record['s3']['object']['key'])
+
+           # try to download, and throw :skip_delete if it fails
+           begin
+             response = @s3.get_object(
+               bucket: bucket,
+               key: key,
+             )
+           rescue => e
+             @logger.warn("issuing :skip_delete on failed download", :bucket => bucket, :object => key, :error => e)
+             throw :skip_delete
+           end
+
+           # verify downloaded content size
+           if response.content_length == record['s3']['object']['size'] then
+             body = response.body
+             # if necessary, unzip
+             if response.content_encoding == "gzip" or record['s3']['object']['key'].end_with?(".gz") then
+               @logger.debug("Trying to unzip gzipped content")
+               begin
+                 temp = Zlib::GzipReader.new(body)
+               rescue => e
+                 @logger.warn("content is marked to be gzipped but we can't unzip it, assuming plain text", :bucket => bucket, :object => key, :error => e)
+                 temp = body
+               end
+               body = temp
+             end
+             # process the plain text content
+             begin
+               lines = body.read.encode('UTF-8', 'binary', invalid: :replace, undef: :replace, replace: "\u2370").split(/\n/)
+               lines.each do |line|
+                 @logger.debug("Decorating the event")
+                 @codec.decode(line) do |event|
+                   decorate(event)
+
+                   event.set('[@metadata][s3_bucket_name]', record['s3']['bucket']['name'])
+                   event.set('[@metadata][s3_object_key]', record['s3']['object']['key'])
+
+                   queue << event
+                 end
+               end
+             rescue => e
+               @logger.warn("issuing :skip_delete on failed plain text processing", :bucket => bucket, :object => key, :error => e)
+               throw :skip_delete
+             end
+
+             # delete the object from S3 (only if delete_on_success is set)
+             begin
+               @s3.delete_object(bucket: bucket, key: key) if @delete_on_success
+             rescue => e
+               @logger.warn("Failed to delete S3 object", :bucket => bucket, :object => key, :error => e)
+             end
+           # otherwise try again later
+           else
+             @logger.warn("issuing :skip_delete on wrong download content size", :bucket => bucket, :object => key,
+                          :download_size => response.content_length, :expected => record['s3']['object']['size'])
+             throw :skip_delete
+           end
+         end
+       end
+     end
+   end
+
+   def run(queue)
+     # ensure we can stop logstash correctly
+     poller.before_request do |stats|
+       if stop? then
+         @logger.warn("issuing :stop_polling on stop?", :queue => @queue)
+         # this can take up to "Receive Message Wait Time" (of the sqs queue) seconds to be recognized
+         throw :stop_polling
+       end
+     end
+     # poll a message and process it
+     run_with_backoff do
+       poller.poll(polling_options) do |message|
+         handle_message(message, queue)
+       end
+     end
+   end
+
+   private
+
+   # Runs an AWS request inside a Ruby block with an exponential backoff in case
+   # we experience a ServiceError.
+   #
+   # @param [Integer] max_time maximum amount of time to sleep before giving up.
+   # @param [Integer] sleep_time the initial amount of time to sleep before retrying.
+   # @param [Block] block Ruby code block to execute.
+   def run_with_backoff(max_time = MAX_TIME_BEFORE_GIVING_UP, sleep_time = BACKOFF_SLEEP_TIME, &block)
+     next_sleep = sleep_time
+     begin
+       block.call
+       next_sleep = sleep_time
+     rescue Aws::SQS::Errors::ServiceError => e
+       @logger.warn("Aws::SQS::Errors::ServiceError ... retrying SQS request with exponential backoff", :queue => @queue, :sleep_time => next_sleep, :error => e)
+       sleep(next_sleep)
+       # double the sleep time on each consecutive failure, resetting once it exceeds max_time
+       next_sleep = next_sleep > max_time ? sleep_time : next_sleep * BACKOFF_FACTOR
+       retry
+     end
+   end
+
+ end # class
data/lib/logstash/inputs/s3sqs/patch.rb ADDED
@@ -0,0 +1,22 @@
+ # This patch was stolen from logstash-plugins/logstash-output-s3#102.
+ #
+ # This patch is a workaround for a JRuby issue which has been fixed in JRuby
+ # 9000, but not in JRuby 1.7. See https://github.com/jruby/jruby/issues/3645
+ # and https://github.com/jruby/jruby/issues/3920. This is necessary because the
+ # `aws-sdk` is doing tricky name discovery to generate the correct error class.
+ #
+ # As per https://github.com/aws/aws-sdk-ruby/issues/1301#issuecomment-261115960,
+ # this patch may be short-lived anyway.
+ require 'aws-sdk'
+ require 'stringio' # StringIO is used below to swallow warnings on $stderr
+
+ begin
+   old_stderr = $stderr
+   $stderr = StringIO.new
+
+   module Aws
+     const_set(:S3, Aws::S3)
+     const_set(:SQS, Aws::SQS)
+   end
+ ensure
+   $stderr = old_stderr
+ end
data/logstash-input-aws-s3-sns-sqs.gemspec ADDED
@@ -0,0 +1,29 @@
+ Gem::Specification.new do |s|
+   s.name          = 'logstash-input-aws-s3-sns-sqs'
+   s.version       = '1.1.2'
+   s.licenses      = ['Apache License (2.0)']
+   s.summary       = "Get logs from AWS s3 buckets as issued by an object-created event via sns -> sqs."
+   s.description   = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
+   s.authors       = ["Christian Herweg"]
+   s.email         = 'christian.herweg@gmail.com'
+   s.homepage      = "https://github.com/cherweg/logstash-input-s3-sns-sqs"
+   s.require_paths = ["lib"]
+
+   # Files
+   s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
+
+   # Tests
+   s.test_files = s.files.grep(%r{^(test|spec|features)/})
+
+   # Special flag to let us know this is actually a logstash plugin
+   s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }
+
+   # Gem dependencies
+   s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
+
+   s.add_runtime_dependency 'logstash-codec-json'
+   s.add_runtime_dependency "logstash-mixin-aws", ">= 1.0.0"
+
+   s.add_development_dependency 'logstash-devutils'
+ end
+
data/spec/inputs/s3sqs_spec.rb ADDED
@@ -0,0 +1,9 @@
+ # encoding: utf-8
+ require "logstash/devutils/rspec/spec_helper"
+ require "logstash/inputs/s3sqs"
+
+ describe LogStash::Inputs::S3SQS do
+   it "loads the plugin class" do
+     expect(true).to be_truthy
+   end
+ end
data/spec/spec_helper.rb ADDED
@@ -0,0 +1,2 @@
+ # encoding: utf-8
+ require "logstash/devutils/rspec/spec_helper"
metadata ADDED
@@ -0,0 +1,120 @@
+ --- !ruby/object:Gem::Specification
+ name: logstash-input-aws-s3-sns-sqs
+ version: !ruby/object:Gem::Version
+   version: 1.1.2
+ platform: ruby
+ authors:
+ - Christian Herweg
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2017-11-10 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '1.60'
+     - - <=
+       - !ruby/object:Gem::Version
+         version: '2.99'
+   name: logstash-core-plugin-api
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '1.60'
+     - - <=
+       - !ruby/object:Gem::Version
+         version: '2.99'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-codec-json
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: 1.0.0
+   name: logstash-mixin-aws
+   prerelease: false
+   type: :runtime
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: 1.0.0
+ - !ruby/object:Gem::Dependency
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
+   name: logstash-devutils
+   prerelease: false
+   type: :development
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - '>='
+       - !ruby/object:Gem::Version
+         version: '0'
+ description: This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program
+ email: christian.herweg@gmail.com
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - CHANGELOG.md
+ - CONTRIBUTORS
+ - Gemfile
+ - LICENSE
+ - NOTICE.TXT
+ - README.md
+ - lib/logstash/inputs/s3sqs.rb
+ - lib/logstash/inputs/s3sqs/patch.rb
+ - logstash-input-aws-s3-sns-sqs.gemspec
+ - spec/inputs/s3sqs_spec.rb
+ - spec/spec_helper.rb
+ homepage: https://github.com/cherweg/logstash-input-s3-sns-sqs
+ licenses:
+ - Apache License (2.0)
+ metadata:
+   logstash_plugin: 'true'
+   logstash_group: input
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - '>='
+     - !ruby/object:Gem::Version
+       version: '0'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - '>='
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.4.5
+ signing_key:
+ specification_version: 4
+ summary: Get logs from AWS s3 buckets as issued by an object-created event via sns -> sqs.
+ test_files:
+ - spec/inputs/s3sqs_spec.rb
+ - spec/spec_helper.rb