logstash-input-s3 1.0.0 → 2.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz: 97fc564bb487d52387e701d636f9a3dc9e680664
-  data.tar.gz: f913210e99472100a84fb118c22aa1dbd8322e41
+  metadata.gz: 091d07a422c428d7e3a286487f0105fa74061871
+  data.tar.gz: 13f4efb4b72e68feab947f88a6188d4a253847c7
 SHA512:
-  metadata.gz: 35cd954ada60910705cb28bea4f15a99a361cc7c1c4a366ce695989410496214a7aba99c906ad7d7ab70cebac82406ff3cb7d801f4c32cdb7566af321976252a
-  data.tar.gz: eae0811f559d08279afaa248a54b6aa2e7578a4fd5a1f24430f5b2e3054d779fef41ba7b6dd229a865526dcd9577d926680345a3b10efe262dc20e8c6e67dc52
+  metadata.gz: 4031253ceb6d2df1a99dbc55017012efe0b599953ab47717c9a2335b55c8b486ed3de00ade5b005541f6e45c9c16daacc5aee776e7e19c5d123e33d84d4b1fd8
+  data.tar.gz: 32261a2c04084c6ac9674277070c6e965bf4c78d5fb2099855909f8e847ee3ae9ded5a904fabe65fa33382331daa3cfbadd8df6f540a87b94b3f3ab3904e285b
data/README.md CHANGED
@@ -1,15 +1,33 @@
 # Logstash Plugin
 
-This is a plugin for [Logstash](https://github.com/elasticsearch/logstash).
+This is a plugin for [Logstash](https://github.com/elastic/logstash).
 
 It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
 
+## Required S3 Permissions
+
+This plugin reads from your S3 bucket and requires the following
+permissions in the AWS IAM policy being used:
+
+* `s3:ListBucket` to check if the S3 bucket exists and list objects in it.
+* `s3:GetObject` to check object metadata and download objects from S3 buckets.
+
+You might also need `s3:DeleteObject` when setting the S3 input to delete on read,
+and the `s3:CreateBucket` permission to create a backup bucket if it does not
+already exist.
+
+For buckets that have versioning enabled, you might need to add additional
+permissions.
+
+More information about S3 permissions can be found at
+http://docs.aws.amazon.com/AmazonS3/latest/dev/using-with-s3-actions.html
+
 ## Documentation
 
-Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation are placed under one [central location](http://www.elasticsearch.org/guide/en/logstash/current/).
+Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will first be converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).
 
 - For formatting code or config example, you can use the asciidoc `[source,ruby]` directive
-- For more asciidoc formatting tips, see the excellent reference here https://github.com/elasticsearch/docs#asciidoc-guide
+- For more asciidoc formatting tips, see the excellent reference here https://github.com/elastic/docs#asciidoc-guide
 
 ## Need Help?
 
@@ -83,4 +101,4 @@ Programming is not a required skill. Whatever you've seen about open source and
 
 It is more important to the community that you are able to contribute.
 
-For more information about contributing, see the [CONTRIBUTING](https://github.com/elasticsearch/logstash/blob/master/CONTRIBUTING.md) file.
+For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
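The permissions listed in the new README section map directly onto the bucket operations the plugin performs. As a rough, hypothetical sketch (the bucket name and prefix are invented; this uses the aws-sdk v1 API the plugin builds on, not code from the plugin itself):

```ruby
require 'aws-sdk-v1'

s3     = AWS::S3.new(:region => 'us-east-1')
bucket = s3.buckets['logstash-test']   # needs s3:ListBucket (bucket exists? / list keys)
bucket.objects.with_prefix('logs/').each do |object|
  data = object.read                   # needs s3:GetObject
  object.delete                        # needs s3:DeleteObject (only with "delete" => true)
end
```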
data/lib/logstash/inputs/s3.rb CHANGED
@@ -97,6 +97,7 @@ class LogStash::Inputs::S3 < LogStash::Inputs::Base
 
   public
   def run(queue)
+    @current_thread = Thread.current
     Stud.interval(@interval) do
       process_files(queue)
     end
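The new `@current_thread = Thread.current` line exists because `Stud.interval` blocks the thread running `#run`, while Logstash calls `#stop` from a different thread. A minimal standalone sketch of that pattern, assuming a stud release that ships `Stud.stop!` (the gemspec's `~> 0.0.18` constraint allows one):

```ruby
require 'stud/interval'

worker = Thread.new do
  Stud.interval(5) do          # blocks this thread, like the plugin's #run
    puts 'polling S3...'       # stand-in for process_files(queue)
  end
end

sleep 12
Stud.stop!(worker)             # from another thread, like the plugin's #stop;
worker.join                    # wakes the sleeping interval so it can return
```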
@@ -143,18 +144,40 @@ class LogStash::Inputs::S3 < LogStash::Inputs::Base
     objects = list_new_files
 
     objects.each do |key|
-      @logger.debug("S3 input processing", :bucket => @bucket, :key => key)
-
-      lastmod = @s3bucket.objects[key].last_modified
-
-      process_log(queue, key)
-
-      sincedb.write(lastmod)
+      if stop?
+        break
+      else
+        @logger.debug("S3 input processing", :bucket => @bucket, :key => key)
+        process_log(queue, key)
+      end
     end
   end # def process_files
 
+  public
+  def stop
+    # @current_thread is initialized in the `#run` method.
+    # This variable is needed because `#stop` is called from a different thread
+    # than `#run`, requiring us to call `Stud.stop!` with an explicit thread.
+    Stud.stop!(@current_thread)
+  end
+
+  public
+  def aws_service_endpoint(region)
+    region_to_use = get_region
+
+    return {
+      :s3_endpoint => region_to_use == 'us-east-1' ?
+        's3.amazonaws.com' : "s3-#{region_to_use}.amazonaws.com"
+    }
+  end
 
   private
+
+  # Read the content of the local file
+  #
+  # @param [Queue] Where to push the event
+  # @param [String] Which file to read from
+  # @return [Boolean] True if the file was completely read, false otherwise.
   def process_local_log(queue, filename)
     @logger.debug('Processing file', :filename => filename)
 
@@ -163,6 +186,11 @@ class LogStash::Inputs::S3 < LogStash::Inputs::Base
     # So all IO stuff: decompression, reading need to be done in the actual
     # input and send as bytes to the codecs.
     read_file(filename) do |line|
+      if stop?
+        @logger.warn("Logstash S3 input: stopped reading in the middle of the file; it will be read again when Logstash restarts")
+        return false
+      end
+
       @codec.decode(line) do |event|
         # We are making an assumption concerning cloudfront
         # log format, the user will use the plain or the line codec
@@ -186,6 +214,8 @@ class LogStash::Inputs::S3 < LogStash::Inputs::Base
         end
       end
     end
+
+    return true
   end # def process_local_log
 
   private
@@ -287,26 +317,40 @@ class LogStash::Inputs::S3 < LogStash::Inputs::Base
     object = @s3bucket.objects[key]
 
     filename = File.join(temporary_directory, File.basename(key))
-
-    download_remote_file(object, filename)
-
-    process_local_log(queue, filename)
-
-    backup_to_bucket(object, key)
-    backup_to_dir(filename)
-
-    delete_file_from_bucket(object)
-    FileUtils.remove_entry_secure(filename, true)
+
+    if download_remote_file(object, filename)
+      if process_local_log(queue, filename)
+        backup_to_bucket(object, key)
+        backup_to_dir(filename)
+        delete_file_from_bucket(object)
+        FileUtils.remove_entry_secure(filename, true)
+        lastmod = object.last_modified
+        sincedb.write(lastmod)
+      end
+    else
+      FileUtils.remove_entry_secure(filename, true)
+    end
   end
 
   private
+  # Stream the remote file to the local disk
+  #
+  # @param [S3Object] Reference to the remote S3 object to download
+  # @param [String] The temporary filename to stream to.
+  # @return [Boolean] True if the file was completely downloaded
   def download_remote_file(remote_object, local_filename)
+    completed = false
+
     @logger.debug("S3 input: Download remote file", :remote_key => remote_object.key, :local_filename => local_filename)
     File.open(local_filename, 'wb') do |s3file|
       remote_object.read do |chunk|
+        return completed if stop?
         s3file.write(chunk)
       end
     end
+    completed = true
+
+    return completed
   end
 
   private
@@ -350,22 +394,10 @@ class LogStash::Inputs::S3 < LogStash::Inputs::Base
       @secret_access_key = @credentials[1]
     end
 
-    if @credentials
-      s3 = AWS::S3.new(
-        :access_key_id => @access_key_id,
-        :secret_access_key => @secret_access_key,
-        :region => @region
-      )
-    else
-      s3 = AWS::S3.new(aws_options_hash)
-    end
+    s3 = AWS::S3.new(aws_options_hash)
   end
 
   private
-  def aws_service_endpoint(region)
-    return { :s3_endpoint => region }
-  end
-
   module SinceDB
     class File
       def initialize(file)
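Note the endpoint change: the removed private `aws_service_endpoint` returned the raw region string as `:s3_endpoint`, while the new public version (added earlier in this file) maps the region to an actual S3 hostname. Inlining that ternary as a worked example, for illustration only:

```ruby
# Same mapping as the new aws_service_endpoint, extracted into a plain method.
def s3_endpoint_for(region)
  region == 'us-east-1' ? 's3.amazonaws.com' : "s3-#{region}.amazonaws.com"
end

s3_endpoint_for('us-east-1') # => "s3.amazonaws.com"
s3_endpoint_for('eu-west-1') # => "s3-eu-west-1.amazonaws.com"
```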
data/logstash-input-s3.gemspec CHANGED
@@ -1,7 +1,7 @@
 Gem::Specification.new do |s|
 
   s.name = 'logstash-input-s3'
-  s.version = '1.0.0'
+  s.version = '2.0.0'
   s.licenses = ['Apache License (2.0)']
   s.summary = "Stream events from files from a S3 bucket."
   s.description = "This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program"
@@ -11,7 +11,7 @@ Gem::Specification.new do |s|
   s.require_paths = ["lib"]
 
   # Files
-  s.files = `git ls-files`.split($\)+::Dir.glob('vendor/*')
+  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
 
   # Tests
   s.test_files = s.files.grep(%r{^(test|spec|features)/})
@@ -20,7 +20,7 @@ Gem::Specification.new do |s|
   s.metadata = { "logstash_plugin" => "true", "logstash_group" => "input" }
 
   # Gem dependencies
-  s.add_runtime_dependency "logstash-core", '>= 1.4.0', '< 2.0.0'
+  s.add_runtime_dependency "logstash-core", "~> 2.0.0.snapshot"
   s.add_runtime_dependency 'logstash-mixin-aws'
   s.add_runtime_dependency 'stud', '~> 0.0.18'
data/spec/inputs/s3_spec.rb CHANGED
@@ -9,38 +9,90 @@ require "fileutils"
 
 describe LogStash::Inputs::S3 do
   let(:temporary_directory) { Stud::Temporary.pathname }
+  let(:sincedb_path) { Stud::Temporary.pathname }
   let(:day) { 3600 * 24 }
-  let(:settings) {
+  let(:config) {
     {
       "access_key_id" => "1234",
       "secret_access_key" => "secret",
       "bucket" => "logstash-test",
-      "temporary_directory" => temporary_directory
+      "temporary_directory" => temporary_directory,
+      "sincedb_path" => File.join(sincedb_path, ".sincedb")
     }
   }
 
   before do
+    FileUtils.mkdir_p(sincedb_path)
     AWS.stub!
     Thread.abort_on_exception = true
   end
 
+  context "when interrupting the plugin" do
+    let(:config) { super.merge({ "interval" => 5 }) }
+
+    before do
+      expect_any_instance_of(LogStash::Inputs::S3).to receive(:list_new_files).and_return(TestInfiniteS3Object.new)
+    end
+
+    it_behaves_like "an interruptible input plugin"
+  end
+
   describe "#register" do
-    subject { LogStash::Inputs::S3.new(settings) }
+    subject { LogStash::Inputs::S3.new(config) }
 
     context "with temporary directory" do
+      let(:temporary_directory) { Stud::Temporary.pathname }
+
+      it "creates the directory when it doesn't exist" do
+        expect { subject.register }.to change { Dir.exist?(temporary_directory) }.from(false).to(true)
+      end
+    end
+  end
+
+  describe '#get_s3object' do
+    subject { LogStash::Inputs::S3.new(settings) }
+
+    context 'with deprecated credentials option' do
+      let(:settings) {
+        {
+          "credentials" => ["1234", "secret"],
+          "proxy_uri" => "http://example.com",
+          "bucket" => "logstash-test",
+        }
+      }
+
+      it 'should instantiate AWS::S3 clients with a proxy set' do
+        expect(AWS::S3).to receive(:new).with({
+          :access_key_id => "1234",
+          :secret_access_key => "secret",
+          :proxy_uri => 'http://example.com',
+          :use_ssl => subject.use_ssl,
+        }.merge(subject.aws_service_endpoint(subject.region)))
+
+        subject.send(:get_s3object)
+      end
+    end
+
+    context 'with modern access key options' do
      let(:settings) {
        {
          "access_key_id" => "1234",
          "secret_access_key" => "secret",
+         "proxy_uri" => "http://example.com",
          "bucket" => "logstash-test",
-         "temporary_directory" => temporary_directory
        }
      }
 
-     let(:temporary_directory) { Stud::Temporary.pathname }
+     it 'should instantiate AWS::S3 clients with a proxy set' do
+       expect(AWS::S3).to receive(:new).with({
+         :access_key_id => "1234",
+         :secret_access_key => "secret",
+         :proxy_uri => 'http://example.com',
+         :use_ssl => subject.use_ssl,
+       }.merge(subject.aws_service_endpoint(subject.region)))
+
 
-     it "creates the direct when it doesn't exist" do
-       expect { subject.register }.to change { Dir.exist?(temporary_directory) }.from(false).to(true)
+       subject.send(:get_s3object)
      end
    end
  end
@@ -58,15 +110,15 @@ describe LogStash::Inputs::S3 do
   }
 
   it 'should allow user to exclude files from the s3 bucket' do
-    config = LogStash::Inputs::S3.new(settings.merge({ "exclude_pattern" => "^exclude" }))
-    config.register
-    expect(config.list_new_files).to eq([present_object.key])
+    plugin = LogStash::Inputs::S3.new(config.merge({ "exclude_pattern" => "^exclude" }))
+    plugin.register
+    expect(plugin.list_new_files).to eq([present_object.key])
   end
 
   it 'should support not providing a exclude pattern' do
-    config = LogStash::Inputs::S3.new(settings)
-    config.register
-    expect(config.list_new_files).to eq(objects_list.map(&:key))
+    plugin = LogStash::Inputs::S3.new(config)
+    plugin.register
+    expect(plugin.list_new_files).to eq(objects_list.map(&:key))
   end
 
   context "If the bucket is the same as the backup bucket" do
@@ -78,20 +130,20 @@ describe LogStash::Inputs::S3 do
 
       allow_any_instance_of(AWS::S3::ObjectCollection).to receive(:with_prefix).with(nil) { objects_list }
 
-      config = LogStash::Inputs::S3.new(settings.merge({ 'backup_add_prefix' => 'mybackup',
-                                                         'backup_to_bucket' => settings['bucket']}))
-      config.register
-      expect(config.list_new_files).to eq([present_object.key])
+      plugin = LogStash::Inputs::S3.new(config.merge({ 'backup_add_prefix' => 'mybackup',
+                                                       'backup_to_bucket' => config['bucket']}))
+      plugin.register
+      expect(plugin.list_new_files).to eq([present_object.key])
     end
   end
 
   it 'should ignore files older than X' do
-    config = LogStash::Inputs::S3.new(settings.merge({ 'backup_add_prefix' => 'exclude-this-file'}))
+    plugin = LogStash::Inputs::S3.new(config.merge({ 'backup_add_prefix' => 'exclude-this-file'}))
 
     expect_any_instance_of(LogStash::Inputs::S3::SinceDB::File).to receive(:read).exactly(objects_list.size) { Time.now - day }
-    config.register
+    plugin.register
 
-    expect(config.list_new_files).to eq([present_object.key])
+    expect(plugin.list_new_files).to eq([present_object.key])
   end
 
   it 'should ignore file if the file match the prefix' do
@@ -104,9 +156,9 @@ describe LogStash::Inputs::S3 do
 
     allow_any_instance_of(AWS::S3::ObjectCollection).to receive(:with_prefix).with(prefix) { objects_list }
 
-    config = LogStash::Inputs::S3.new(settings.merge({ 'prefix' => prefix }))
-    config.register
-    expect(config.list_new_files).to eq([present_object.key])
+    plugin = LogStash::Inputs::S3.new(config.merge({ 'prefix' => prefix }))
+    plugin.register
+    expect(plugin.list_new_files).to eq([present_object.key])
   end
 
   it 'should sort return object sorted by last_modification date with older first' do
@@ -119,41 +171,41 @@ describe LogStash::Inputs::S3 do
     allow_any_instance_of(AWS::S3::ObjectCollection).to receive(:with_prefix).with(nil) { objects }
 
 
-    config = LogStash::Inputs::S3.new(settings)
-    config.register
-    expect(config.list_new_files).to eq(['TWO_DAYS_AGO', 'YESTERDAY', 'TODAY'])
+    plugin = LogStash::Inputs::S3.new(config)
+    plugin.register
+    expect(plugin.list_new_files).to eq(['TWO_DAYS_AGO', 'YESTERDAY', 'TODAY'])
   end
 
   describe "when doing backup on the s3" do
     it 'should copy to another s3 bucket when keeping the original file' do
-      config = LogStash::Inputs::S3.new(settings.merge({ "backup_to_bucket" => "mybackup"}))
-      config.register
+      plugin = LogStash::Inputs::S3.new(config.merge({ "backup_to_bucket" => "mybackup"}))
+      plugin.register
 
       s3object = double()
       expect(s3object).to receive(:copy_to).with('test-file', :bucket => an_instance_of(AWS::S3::Bucket))
 
-      config.backup_to_bucket(s3object, 'test-file')
+      plugin.backup_to_bucket(s3object, 'test-file')
     end
 
     it 'should move to another s3 bucket when deleting the original file' do
-      config = LogStash::Inputs::S3.new(settings.merge({ "backup_to_bucket" => "mybackup", "delete" => true }))
-      config.register
+      plugin = LogStash::Inputs::S3.new(config.merge({ "backup_to_bucket" => "mybackup", "delete" => true }))
+      plugin.register
 
       s3object = double()
       expect(s3object).to receive(:move_to).with('test-file', :bucket => an_instance_of(AWS::S3::Bucket))
 
-      config.backup_to_bucket(s3object, 'test-file')
+      plugin.backup_to_bucket(s3object, 'test-file')
     end
 
     it 'should add the specified prefix to the backup file' do
-      config = LogStash::Inputs::S3.new(settings.merge({ "backup_to_bucket" => "mybackup",
+      plugin = LogStash::Inputs::S3.new(config.merge({ "backup_to_bucket" => "mybackup",
                                                          "backup_add_prefix" => 'backup-' }))
-      config.register
+      plugin.register
 
       s3object = double()
       expect(s3object).to receive(:copy_to).with('backup-test-file', :bucket => an_instance_of(AWS::S3::Bucket))
 
-      config.backup_to_bucket(s3object, 'test-file')
+      plugin.backup_to_bucket(s3object, 'test-file')
     end
   end
 
@@ -162,9 +214,9 @@ describe LogStash::Inputs::S3 do
     Stud::Temporary.file do |source_file|
       backup_file = File.join(backup_dir.to_s, Pathname.new(source_file.path).basename.to_s)
 
-      config = LogStash::Inputs::S3.new(settings.merge({ "backup_to_dir" => backup_dir }))
+      plugin = LogStash::Inputs::S3.new(config.merge({ "backup_to_dir" => backup_dir }))
 
-      config.backup_to_dir(source_file)
+      plugin.backup_to_dir(source_file)
 
       expect(File.exists?(backup_file)).to eq(true)
     end
@@ -173,26 +225,26 @@ describe LogStash::Inputs::S3 do
 
   it 'should accepts a list of credentials for the aws-sdk, this is deprecated' do
     Stud::Temporary.directory do |tmp_directory|
-      old_credentials_settings = {
+      old_credentials_config = {
         "credentials" => ['1234', 'secret'],
         "backup_to_dir" => tmp_directory,
         "bucket" => "logstash-test"
       }
 
-      config = LogStash::Inputs::S3.new(old_credentials_settings)
-      expect{ config.register }.not_to raise_error
+      plugin = LogStash::Inputs::S3.new(old_credentials_config)
+      expect{ plugin.register }.not_to raise_error
     end
   end
 
   shared_examples "generated events" do
     it 'should process events' do
-      events = fetch_events(settings)
+      events = fetch_events(config)
       expect(events.size).to eq(2)
     end
 
     it "deletes the temporary file" do
-      events = fetch_events(settings)
+      events = fetch_events(config)
       expect(Dir.glob(File.join(temporary_directory, "*")).size).to eq(0)
     end
   end
@@ -209,7 +261,7 @@ describe LogStash::Inputs::S3 do
 
   context "when event doesn't have a `message` field" do
     let(:log_file) { File.join(File.dirname(__FILE__), '..', 'fixtures', 'json.log') }
-    let(:settings) {
+    let(:config) {
       {
         "access_key_id" => "1234",
         "secret_access_key" => "secret",
@@ -225,7 +277,6 @@ describe LogStash::Inputs::S3 do
     let(:log) { double(:key => 'log.gz', :last_modified => Time.now - 2 * day) }
     let(:log_file) { File.join(File.dirname(__FILE__), '..', 'fixtures', 'compressed.log.gz') }
 
-
     include_examples "generated events"
   end
 
@@ -245,7 +296,7 @@ describe LogStash::Inputs::S3 do
     let(:log_file) { File.join(File.dirname(__FILE__), '..', 'fixtures', 'cloudfront.log') }
 
     it 'should extract metadata from cloudfront log' do
-      events = fetch_events(settings)
+      events = fetch_events(config)
 
       events.each do |event|
         expect(event['cloudfront_fields']).to eq('date time x-edge-location c-ip x-event sc-bytes x-cf-status x-cf-client-id cs-uri-stem cs-uri-query c-referrer x-page-url c-user-agent x-sname x-sname-query x-file-ext x-sid')
data/spec/support/helpers.rb CHANGED
@@ -3,7 +3,6 @@ def fetch_events(settings)
   s3 = LogStash::Inputs::S3.new(settings)
   s3.register
   s3.process_files(queue)
-  s3.teardown
   queue
 end
 
@@ -32,3 +31,15 @@ end
 def s3object
   AWS::S3.new
 end
+
+class TestInfiniteS3Object
+  def each
+    counter = 1
+
+    loop do
+      yield "awesome-#{counter}"
+      counter += 1
+    end
+  end
+end
+
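`TestInfiniteS3Object` is what gives the new interruption spec its teeth: `process_files` can never exhaust the listing, so the test only finishes if the plugin's new `stop?` check breaks the loop. A small usage sketch, with a plain `break` standing in for that check:

```ruby
keys = []
TestInfiniteS3Object.new.each do |key|
  keys << key
  break if keys.size == 3   # stand-in for the plugin's stop? check
end
keys # => ["awesome-1", "awesome-2", "awesome-3"]
```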
metadata CHANGED
@@ -1,133 +1,125 @@
 --- !ruby/object:Gem::Specification
 name: logstash-input-s3
 version: !ruby/object:Gem::Version
-  version: 1.0.0
+  version: 2.0.0
 platform: ruby
 authors:
 - Elastic
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2015-06-24 00:00:00.000000000 Z
+date: 2015-09-23 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
-  name: logstash-core
-  version_requirements: !ruby/object:Gem::Requirement
-    requirements:
-    - - '>='
-      - !ruby/object:Gem::Version
-        version: 1.4.0
-    - - <
-      - !ruby/object:Gem::Version
-        version: 2.0.0
   requirement: !ruby/object:Gem::Requirement
     requirements:
-    - - '>='
-      - !ruby/object:Gem::Version
-        version: 1.4.0
-    - - <
+    - - ~>
       - !ruby/object:Gem::Version
-        version: 2.0.0
+        version: 2.0.0.snapshot
+  name: logstash-core
   prerelease: false
   type: :runtime
-- !ruby/object:Gem::Dependency
-  name: logstash-mixin-aws
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
-    - - '>='
+    - - ~>
       - !ruby/object:Gem::Version
-        version: '0'
+        version: 2.0.0.snapshot
+- !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
      - !ruby/object:Gem::Version
        version: '0'
+  name: logstash-mixin-aws
   prerelease: false
   type: :runtime
-- !ruby/object:Gem::Dependency
-  name: stud
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
-    - - ~>
+    - - '>='
      - !ruby/object:Gem::Version
-        version: 0.0.18
+        version: '0'
+- !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - ~>
      - !ruby/object:Gem::Version
        version: 0.0.18
+  name: stud
   prerelease: false
   type: :runtime
-- !ruby/object:Gem::Dependency
-  name: logstash-devutils
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
-    - - '>='
+    - - ~>
      - !ruby/object:Gem::Version
-        version: '0'
+        version: 0.0.18
+- !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
      - !ruby/object:Gem::Version
        version: '0'
+  name: logstash-devutils
   prerelease: false
   type: :development
-- !ruby/object:Gem::Dependency
-  name: simplecov
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
      - !ruby/object:Gem::Version
        version: '0'
+- !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
      - !ruby/object:Gem::Version
        version: '0'
+  name: simplecov
   prerelease: false
   type: :development
-- !ruby/object:Gem::Dependency
-  name: coveralls
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
      - !ruby/object:Gem::Version
        version: '0'
+- !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
      - !ruby/object:Gem::Version
        version: '0'
+  name: coveralls
   prerelease: false
   type: :development
-- !ruby/object:Gem::Dependency
-  name: logstash-codec-json
   version_requirements: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
      - !ruby/object:Gem::Version
        version: '0'
+- !ruby/object:Gem::Dependency
   requirement: !ruby/object:Gem::Requirement
     requirements:
     - - '>='
      - !ruby/object:Gem::Version
        version: '0'
+  name: logstash-codec-json
   prerelease: false
   type: :development
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - '>='
+      - !ruby/object:Gem::Version
+        version: '0'
 description: This gem is a logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/plugin install gemname. This gem is not a stand-alone program
 email: info@elastic.co
 executables: []
 extensions: []
 extra_rdoc_files: []
 files:
-- .gitignore
 - CHANGELOG.md
 - CONTRIBUTORS
 - Gemfile
 - LICENSE
 - NOTICE.TXT
 - README.md
-- Rakefile
 - lib/logstash/inputs/s3.rb
 - logstash-input-s3.gemspec
 - spec/fixtures/cloudfront.log
@@ -160,7 +152,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
       version: '0'
 requirements: []
 rubyforge_project:
-rubygems_version: 2.2.2
+rubygems_version: 2.4.8
 signing_key:
 specification_version: 4
 summary: Stream events from files from a S3 bucket.
data/.gitignore DELETED
@@ -1,5 +0,0 @@
-*.gem
-Gemfile.lock
-.bundle
-vendor
-coverage/
data/Rakefile DELETED
@@ -1,7 +0,0 @@
-@files=[]
-
-task :default do
-  system("rake -T")
-end
-
-require "logstash/devutils/rake"