s3-publisher 0.4.4 → 0.9.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: a3a229558c18ee709daa8680c4d15d89cc832be0
- data.tar.gz: ae139bdbc993ae23251f102c727bde45db901e6e
+ metadata.gz: 228be6f6a0276a570e4abae98164b259ce62bfa0
+ data.tar.gz: 7c489b18c2183fa36648ddd36b2afdcb0e8f86a6
  SHA512:
- metadata.gz: 052acc3beae0da6d7eb63c3ffedfca7a55ea5e3db429d4f8b98b38acefcdc48d1770df7a15e6d73f28143c3426936b61e77bdbdfeea7496f1add05e029aeaec3
- data.tar.gz: 6e9abb6df1e5b8e0d82b17d5fe6d1e7b81737ab54f5c529de26c962e3f8c50d8b470d833657829e1dec8128b169a99662e4a40824e4e257825204fadce1fe184
+ metadata.gz: 423f3fa87a913e384771e99d2f0c6a23016f95bc6bff62a7ef6d2d54977ade31e165f8af11dbfa0dfdcdde427e0ea915214474775370b41134b9b1c5ed36f4fe
+ data.tar.gz: b4e9a958d93e70cf6273198f87a593b377296806f83bdaa71f2aff28456ebe590ea6649ef7a5649d801dc6b7095445ca8d3a5bcad87b6f6c51b51c5baa807acc
data/.rspec ADDED
@@ -0,0 +1,2 @@
+ --color
+ --format progress
data/README.rdoc CHANGED
@@ -1,39 +1,55 @@
- = s3-publisher
+ # s3-publisher

- S3Publisher is meant as a clean, simple, sensible-defaults way to publish
- files to Amazon S3 for the world to see.
+ Quickly publish your data files to S3.

- Basic usage:
+ Reasons you might want to use this instead of aws-sdk directly:

-   require 's3-publisher'
-   S3Publisher.publish('my-bucket') do |p|
-     p.push('test.txt', 'abc1234')
-   end
+ * parallel uploads using Ruby threads. Concurrency defaults to 3 but can be increased.
+ * gzip: by default, S3Publisher gzips so you don't have to.
+ * try-again technology: it will retry if an S3 request fails.
+ * no need to remember all the correct opts for content-type, acl, etc.
+
+ ### Basic usage:
+
+ ```
+ require 's3-publisher'
+ S3Publisher.publish('my-bucket') do |p|
+   p.push('test.txt', data: 'abc1234')
+ end
+ ```

  This will:
- * push test.txt to my-bucket.s3.amazonaws.com
- * set security to public-read
- * gzip contents ('abc1234') and set a Content-Encoding: gzip header so clients know to decompress
- * set a Cache-Control: max-age=5 header

- Slightly more advanced example:
+ * push test.txt to my-bucket.s3.amazonaws.com
+ * set security to public-read
+ * gzip contents ('abc1234') and set a Content-Encoding: gzip header so clients know to decompress
+ * set a Cache-Control: max-age=5 header
+
+ You can also pass file paths, rather than string data. Files aren't read until publish-time, saving memory.

- S3Publisher.publish('my-bucket', :base_path => 'world_cup') do |p|
-   p.push('events.xml', '<xml>...', :ttl => 15)
- end
+ ```
+ require 's3-publisher'
+ S3Publisher.publish('my-bucket') do |p|
+   p.push('test.json', file: '/tmp/test.json')
+ end
+ ```
+
+ ### Slightly more advanced example:
+
+ ```
+ S3Publisher.publish('my-bucket', :base_path => 'world_cup') do |p|
+   p.push('events.xml', data: '<xml>...', ttl: 15)
+ end
+ ```

  In this example:
- * file will be written to my-bucket.s3.amazonaws.com/world_cup/events.xml
- * Cache-Control: max-age=15 will be set
+
+ * file will be written to my-bucket.s3.amazonaws.com/world_cup/events.xml
+ * Cache-Control: max-age=15 will be set

  A few miscellaneous notes:
- * gzip compress is skipped on .jpg/gif/png/tif files
- * uploads are multi-threaded. You can control worker thread count on instantiation.
- * pass :redundancy => :reduced when instantiating the publisher to write to reduced
-   redundancy storage (this used to be the default, but now requires the option to be set.)
-
- See class docs for further options.

- == Copyright
+ * gzip compression is skipped on .jpg/gif/png/tif files

- Copyright (c) 2011 Ben Koski. See LICENSE for details.
+ See class docs for more options.
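The TTL behavior the README describes can be read as a tiny standalone helper: an explicit `:cache_control` string wins, otherwise `:ttl` (defaulting to 5 seconds) becomes a `max-age` directive. A minimal sketch, where `cache_control_for` is an illustrative name and not part of the gem's API:

```ruby
# Hypothetical standalone version of the Cache-Control selection
# described in the README; cache_control_for is an illustrative name.
def cache_control_for(opts)
  if opts.has_key?(:cache_control)
    opts[:cache_control]           # explicit header string wins
  else
    "max-age=#{opts[:ttl] || 5}"   # :ttl in seconds, defaulting to 5
  end
end

puts cache_control_for(ttl: 15)                              # max-age=15
puts cache_control_for({})                                   # max-age=5
puts cache_control_for(cache_control: 'private, max-age=0')  # private, max-age=0
```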
data/lib/s3-publisher.rb CHANGED
@@ -1,10 +1,9 @@
  require 'zlib'
  require 'thread'
+ require 'pathname'

- require 'right_aws'
- require 'aws_credentials'
-
- Thread.abort_on_exception = true
+ require 'aws-sdk'
+ require 'mime-types'

  # You can either use the block syntax, or:
  # * instantiate a class
@@ -24,54 +23,71 @@ class S3Publisher
    p.run
  end

- # Pass the publisher a bucket_name along with any of the following options:
- # * <tt>base_path</tt> - path prepended to supplied file_name on upload
- # * <tt>logger</tt> - a logger object to recieve 'uploaded' messages. Defaults to STDOUT.
- # * <tt>protocol</tt> - protocol to use for S3 requests. Defaults to 'http', but 'https' can also be used.
- # * <tt>workers</tt> - number of threads to use when pushing to S3. Defaults to 3.
+ # @param [String] bucket_name
+ # @option opts [String] :base_path Path prepended to supplied file_name on upload
+ # @option opts [Integer] :workers Number of threads to use when pushing to S3. Defaults to 3.
+ # @option opts [Object] :logger A logger object to receive 'uploaded' messages. Defaults to STDOUT.
  def initialize bucket_name, opts={}
-   @s3 = RightAws::S3.new(AWSCredentials.access_key, AWSCredentials.secret_access_key, :multi_thread => true,
-                          :protocol => opts[:protocol] || 'http',
-                          :port => 80,
-                          :logger => Logger.new(nil))
-   @bucket_name, @base_path = bucket_name, opts[:base_path]
-   raise ArgumentError, "#{bucket_name} doesn't seem to be a valid bucket on your account" if @s3.bucket(bucket_name).nil?
-   @logger = opts[:logger] || $stdout
-   @workers_to_use = opts[:workers] || 3
    @publish_queue = Queue.new
+   @workers_to_use = opts[:workers] || 3
+   @logger = opts[:logger] || $stdout
+
+   s3_opts = {}
+   s3_opts[:access_key_id] = opts[:access_key_id] if opts.key?(:access_key_id)
+   s3_opts[:secret_access_key] = opts[:secret_access_key] if opts.key?(:secret_access_key)
+
+   @s3 = AWS::S3.new(s3_opts)
+
+   @bucket_name, @base_path = bucket_name, opts[:base_path]
+   raise ArgumentError, "#{bucket_name} doesn't seem to be a valid bucket on your account" if @s3.buckets[bucket_name].nil?
  end
-
- # Pass:
- # * <tt>file_name</tt> - name of file on S3. base_path will be prepended if supplied on instantiate.
- # * <tt>data</tt> - data to be uploaded as a string
+
+ # Queues a file to be published.
+ # You can provide :data as a string, or a path to a file with :file.
+ # :file references won't be evaluated until publish-time, reducing memory overhead.
  #
- # And one or many options:
- # * <tt>:gzip (true|false)</tt> - gzip file contents? defaults to true.
- # * <tt>:ttl</tt> - TTL in seconds for cache-control header. defaults to 5.
- # * <tt>:cache_control</tt> - specify Cache-Control header directly if you don't like the default
- # * <tt>:content_type</tt> - no need to specify if default based on extension is okay. But if you need to force,
- #   you can provide :xml, :html, :text, or your own custom string.
- # * <tt>:redundancy</tt> - by default objects are stored at reduced redundancy, pass :standard to store at full
- def push file_name, data, opts={}
-   headers = {}
+ # @param [String] key_name The name of the file on S3. base_path will be prepended if supplied.
+ # @option opts [String] :data a string to be published
+ # @option opts [String] :file path to a file to publish
+ # @option opts [Boolean] :gzip gzip file contents? defaults to true.
+ # @option opts [Integer] :ttl TTL in seconds for cache-control header. defaults to 5.
+ # @option opts [String] :cache_control specify Cache-Control header directly if you don't like the default
+ # @option opts [String] :content_type no need to specify if the default based on extension is okay. But if you
+ #   need to force, you can provide a string like 'application/xml'.
+ def push key_name, opts={}
+   write_opts = { acl: 'public-read' }

-   file_name = "#{base_path}/#{file_name}" unless base_path.nil?
+   key_name = "#{base_path}/#{key_name}" unless base_path.nil?

-   unless opts[:gzip] == false || file_name.match(/\.(jpg|gif|png|tif)$/)
-     data = gzip(data)
-     headers['Content-Encoding'] = 'gzip'
+   # Setup data.
+   if opts[:data]
+     contents = opts[:data]
+   elsif opts[:file]
+     contents = Pathname.new(opts[:file])
+     raise ArgumentError, "'#{opts[:file]}' does not exist!" if !contents.exist?
+   else
+     raise ArgumentError, "A :file or :data attr must be provided to publish to S3!"
+   end
+
+   # Then Content-Type
+   if opts[:content_type]
+     write_opts[:content_type] = opts[:content_type]
+   else
+     matching_mimes = MIME::Types.type_for(key_name)
+     raise ArgumentError, "Can't infer the content-type for '#{key_name}'! Please specify with the :content_type opt." if matching_mimes.empty?
+     write_opts[:content_type] = matching_mimes.first.to_s
    end
-
-   headers['x-amz-storage-class'] = opts[:redundancy] == :reduced ? 'REDUCED_REDUNDANCY' : 'STANDARD'
-   headers['content-type'] = parse_content_type(opts[:content_type]) if opts[:content_type]

+   # And Cache-Control
    if opts.has_key?(:cache_control)
-     headers['Cache-Control'] = opts[:cache_control]
+     write_opts[:cache_control] = opts[:cache_control]
    else
-     headers['Cache-Control'] = "max-age=#{opts[:ttl] || 5}"
+     write_opts[:cache_control] = "max-age=#{opts[:ttl] || 5}"
    end

-   @publish_queue.push({:key_name => file_name, :data => data, :headers => headers})
+   opts[:gzip] = true unless opts.has_key?(:gzip)
+
+   @publish_queue.push({ key_name: key_name, contents: contents, write_opts: write_opts, gzip: opts[:gzip] })
  end

  # Process queued uploads and push to S3
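The `:data` / `:file` branch in `push` is what keeps file uploads cheap: a `:file` is wrapped in a `Pathname` and not read until publish-time. A minimal standalone sketch of that logic (`contents_for` is a hypothetical name used only for illustration):

```ruby
require 'pathname'

# Minimal sketch of push's :data / :file handling.
# contents_for is a hypothetical name, not part of the gem.
def contents_for(opts)
  if opts[:data]
    opts[:data]                        # string payload, queued as-is
  elsif opts[:file]
    path = Pathname.new(opts[:file])   # not read here; read at publish-time
    raise ArgumentError, "'#{opts[:file]}' does not exist!" unless path.exist?
    path
  else
    raise ArgumentError, 'A :file or :data attr must be provided to publish to S3!'
  end
end

puts contents_for(data: 'abc1234')        # abc1234
puts contents_for(file: __FILE__).class   # Pathname
```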
@@ -96,19 +112,6 @@ class S3Publisher

    return gzipped_data.string
  end
-
- def parse_content_type content_type
-   case content_type
-   when :xml
-     'application/xml'
-   when :text
-     'text/plain'
-   when :html
-     'text/html'
-   else
-     content_type
-   end
- end

  def publish_from_queue
    loop do
@@ -116,14 +119,25 @@ class S3Publisher

      try_count = 0
      begin
-       @s3.bucket(bucket_name).put(item[:key_name], item[:data], {}, 'public-read', item[:headers])
+       obj = @s3.buckets[bucket_name].objects[item[:key_name]]
+
+       gzip = item[:gzip] != false && !item[:key_name].match(/\.(jpg|gif|png|tif)$/)
+
+       if gzip
+         item[:write_opts][:content_encoding] = 'gzip'
+         gzip_body = item[:contents].is_a?(Pathname) ? item[:contents].read : item[:contents]
+         item[:contents] = gzip(gzip_body)
+       end
+
+       obj.write(item[:contents], item[:write_opts])
+
      rescue Exception => e # backstop against transient S3 errors
        raise e if try_count >= 1
        try_count += 1
        retry
      end

-     logger << "Wrote http://#{bucket_name}.s3.amazonaws.com/#{item[:key_name]} with #{item[:headers].inspect}\n"
+     logger << "Wrote http://#{bucket_name}.s3.amazonaws.com/#{item[:key_name]} with #{item[:write_opts].inspect}\n"
    end
  rescue ThreadError # ThreadError hit when queue is empty. Simply jump out of loop and return to join().
  end
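The begin/rescue/retry backstop in `publish_from_queue` amounts to "one retry, then re-raise". A self-contained sketch of that pattern (`with_one_retry` is an illustrative name; the gem rescues `Exception`, this sketch uses the narrower `StandardError`):

```ruby
# One-retry backstop mirroring publish_from_queue's begin/rescue/retry.
# with_one_retry is an illustrative name, not part of the gem.
def with_one_retry
  try_count = 0
  begin
    yield
  rescue StandardError => e
    raise e if try_count >= 1  # second failure propagates
    try_count += 1
    retry                      # first failure: try once more
  end
end

calls = 0
with_one_retry do
  calls += 1
  raise 'transient S3 error' if calls == 1
end
puts calls  # 2
```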
data/s3-publisher.gemspec CHANGED
@@ -2,9 +2,9 @@

  Gem::Specification.new do |s|
    s.name = "s3-publisher"
-   s.version = "0.4.4"
+   s.version = "0.9.0"
    s.authors = ["Ben Koski"]
-   s.email = "gems@benkoski.com"
+   s.email = "bkoski@nytimes.com"
    s.summary = "Publish data to S3 for the world to see"
    s.description = "Publish data to S3 for the world to see"
    s.homepage = "http://github.com/bkoski/s3-publisher"
@@ -16,7 +16,7 @@ Gem::Specification.new do |s|

    s.require_paths = ["lib"]

-   s.add_development_dependency(%q<thoughtbot-shoulda>, [">= 0"])
-   s.add_runtime_dependency(%q<aws_credentials>, [">= 0"])
-   s.add_runtime_dependency(%q<right_aws>, [">= 3.0.0"])
+   s.add_development_dependency(%q<rspec>, [">= 0"])
+   s.add_runtime_dependency(%q<aws-sdk>, [">= 1.0"])
+   s.add_runtime_dependency(%q<mime-types>, [">= 0"])
  end
data/spec/s3_publisher_spec.rb ADDED
@@ -0,0 +1,138 @@
+ require 'spec_helper'
+
+ describe S3Publisher do
+   describe "#push" do
+
+     describe "file_name" do
+       it "prepends base_path if provided" do
+         set_put_expectation(key_name: 'world_cup_2010/events.xml')
+         p = S3Publisher.new('test-bucket', :logger => Logger.new(nil), :base_path => 'world_cup_2010')
+         p.push('events.xml', data: '1234')
+         p.run
+       end
+
+       it "passes name through unaltered if base_path not specified" do
+         set_put_expectation(key_name: 'events.xml')
+         p = S3Publisher.new('test-bucket', :logger => Logger.new(nil))
+         p.push('events.xml', data: '1234')
+         p.run
+       end
+     end
+
+     describe "gzip" do
+       it "gzips data if :gzip => true" do
+         set_put_expectation(data: gzip('1234'))
+         push_test_data('myfile.txt', data: '1234', gzip: true)
+       end
+
+       it "does not gzip data if :gzip => false" do
+         set_put_expectation(data: '1234')
+         push_test_data('myfile.txt', data: '1234', gzip: false)
+       end
+
+       it "does not gzip data if file ends in .jpg" do
+         set_put_expectation(key_name: 'myfile.jpg', data: '1234')
+         push_test_data('myfile.jpg', data: '1234')
+       end
+
+       it "gzips data by default" do
+         set_put_expectation(data: gzip('1234'))
+         push_test_data('myfile.txt', data: '1234')
+       end
+     end
+
+     describe ":file opt" do
+       it "queues files as a pathname to be read if gzip is false" do
+         set_put_expectation(file: __FILE__)
+         push_test_data('myfile.txt', file: __FILE__, gzip: false)
+       end
+
+       it "queues gzipped contents of the file if gzip is true" do
+         set_put_expectation(data: gzip(File.read(__FILE__)))
+         push_test_data('myfile.txt', file: __FILE__, gzip: true)
+       end
+     end
+
+     describe "content type" do
+       it "detects content type based on extension" do
+         set_put_expectation(key_name: 'myfile.xml', content_type: 'application/xml')
+         push_test_data('myfile.xml', data: '1234')
+       end
+
+       it "forces Content-Type to user-supplied string if provided" do
+         set_put_expectation(content_type: 'audio/vorbis')
+         push_test_data('myfile.txt', data: '1234', content_type: 'audio/vorbis')
+       end
+
+       it "raises an exception if the content-type cannot be parsed" do
+         expect { push_test_data('myfile', data: '1234') }.to raise_error(ArgumentError)
+       end
+     end
+
+     describe "cache-control" do
+       it "sets Cache-Control to user-supplied string if :cache_control provided" do
+         set_put_expectation(cache_control: 'private, max-age=0')
+         push_test_data('myfile.txt', data: '1234', cache_control: 'private, max-age=0')
+       end
+
+       it "sets Cache-Control with :ttl provided" do
+         set_put_expectation(cache_control: 'max-age=55')
+         push_test_data('myfile.txt', data: '1234', ttl: 55)
+       end
+
+       it "sets Cache-Control to a 5s ttl if no :ttl or :cache_control was provided" do
+         set_put_expectation(cache_control: 'max-age=5')
+         push_test_data('myfile.txt', data: '1234')
+       end
+     end
+
+     # Based on opts, sets expectations for AWS::S3Object.write
+     # Can provide expected values for:
+     # * :key_name
+     # * :data
+     # * :content_type, :cache_control, :content_encoding
+     def set_put_expectation opts
+       s3_stub = mock()
+       bucket_stub = mock()
+       object_stub = mock()
+
+       key_name = opts[:key_name] || 'myfile.txt'
+
+       expected_entries = {}
+       [:content_type, :cache_control, :content_encoding].each do |k|
+         expected_entries[k] = opts[k] if opts.has_key?(k)
+       end
+
+       if opts[:data]
+         expected_contents = opts[:data]
+       elsif opts[:file]
+         expected_contents = Pathname.new(opts[:file])
+       else
+         expected_contents = anything
+       end
+
+       object_stub.expects(:write).with(expected_contents, has_entries(expected_entries))
+
+       s3_stub.stubs(:buckets).returns({ 'test-bucket' => bucket_stub })
+       bucket_stub.stubs(:objects).returns({ key_name => object_stub })
+
+       AWS::S3.stubs(:new).returns(s3_stub)
+     end
+
+     def gzip data
+       gzipped_data = StringIO.open('', 'w+')
+
+       gzip_writer = Zlib::GzipWriter.new(gzipped_data)
+       gzip_writer.write(data)
+       gzip_writer.close
+
+       return gzipped_data.string
+     end
+
+     def push_test_data file_name, opts
+       p = S3Publisher.new('test-bucket', :logger => Logger.new(nil))
+       p.push(file_name, opts)
+       p.run
+     end
+   end
+ end
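The `gzip` helper that appears in both the library and the spec above can be exercised on its own; decompressing the result confirms it round-trips:

```ruby
require 'zlib'
require 'stringio'

# The gzip helper from the library/specs, reproduced standalone.
def gzip(data)
  gzipped_data = StringIO.open('', 'w+')

  gzip_writer = Zlib::GzipWriter.new(gzipped_data)
  gzip_writer.write(data)
  gzip_writer.close

  gzipped_data.string
end

compressed = gzip('abc1234')
puts Zlib.gunzip(compressed)  # abc1234
```

`Zlib.gunzip` needs Ruby 2.6+; on older Rubies, `Zlib::GzipReader.new(StringIO.new(compressed)).read` does the same thing.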
data/spec/spec_helper.rb ADDED
@@ -0,0 +1,27 @@
+ # This file was generated by the `rspec --init` command. Conventionally, all
+ # specs live under a `spec` directory, which RSpec adds to the `$LOAD_PATH`.
+ # Require this file using `require "spec_helper"` to ensure that it is only
+ # loaded once.
+ #
+ # See http://rubydoc.info/gems/rspec-core/RSpec/Core/Configuration
+
+ require File.expand_path('../lib/s3-publisher.rb', File.dirname(__FILE__))
+
+ RSpec.configure do |config|
+   # Limit the spec run to only specs with the focus metadata. If no specs have
+   # the filtering metadata and `run_all_when_everything_filtered = true` then
+   # all specs will run.
+   #config.filter_run :focus
+
+   # Run all specs when none match the provided filter. This works well in
+   # conjunction with `config.filter_run :focus`, as it will run the entire
+   # suite when no specs have `:filter` metadata.
+   #config.run_all_when_everything_filtered = true
+   config.mock_framework = :mocha
+
+   # Run specs in random order to surface order dependencies. If you find an
+   # order dependency and want to debug it, you can fix the order by providing
+   # the seed, which is printed after each run.
+   #     --seed 1234
+   #config.order = 'random'
+ end
metadata CHANGED
@@ -1,17 +1,17 @@
  --- !ruby/object:Gem::Specification
  name: s3-publisher
  version: !ruby/object:Gem::Version
-   version: 0.4.4
+   version: 0.9.0
  platform: ruby
  authors:
  - Ben Koski
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2013-12-14 00:00:00.000000000 Z
+ date: 2013-12-17 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
-   name: thoughtbot-shoulda
+   name: rspec
    requirement: !ruby/object:Gem::Requirement
      requirements:
      - - '>='
@@ -25,40 +25,41 @@ dependencies:
        - !ruby/object:Gem::Version
          version: '0'
  - !ruby/object:Gem::Dependency
-   name: aws_credentials
+   name: aws-sdk
    requirement: !ruby/object:Gem::Requirement
      requirements:
      - - '>='
        - !ruby/object:Gem::Version
-         version: '0'
+         version: '1.0'
    type: :runtime
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - '>='
        - !ruby/object:Gem::Version
-         version: '0'
+         version: '1.0'
  - !ruby/object:Gem::Dependency
-   name: right_aws
+   name: mime-types
    requirement: !ruby/object:Gem::Requirement
      requirements:
      - - '>='
        - !ruby/object:Gem::Version
-         version: 3.0.0
+         version: '0'
    type: :runtime
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - '>='
        - !ruby/object:Gem::Version
-         version: 3.0.0
+         version: '0'
  description: Publish data to S3 for the world to see
- email: gems@benkoski.com
+ email: bkoski@nytimes.com
  executables: []
  extensions: []
  extra_rdoc_files: []
  files:
  - .gitignore
+ - .rspec
  - Gemfile
  - Gemfile.lock
  - LICENSE
@@ -66,8 +67,8 @@ files:
  - Rakefile
  - lib/s3-publisher.rb
  - s3-publisher.gemspec
- - test/s3-publisher_test.rb
- - test/test_helper.rb
+ - spec/s3_publisher_spec.rb
+ - spec/spec_helper.rb
  homepage: http://github.com/bkoski/s3-publisher
  licenses:
  - MIT
@@ -88,10 +89,10 @@ required_rubygems_version: !ruby/object:Gem::Requirement
        version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.0.6
+ rubygems_version: 2.0.3
  signing_key:
  specification_version: 4
  summary: Publish data to S3 for the world to see
  test_files:
- - test/s3-publisher_test.rb
- - test/test_helper.rb
+ - spec/s3_publisher_spec.rb
+ - spec/spec_helper.rb
data/test/s3-publisher_test.rb DELETED
@@ -1,127 +0,0 @@
- require 'test_helper'
-
- class S3PublisherTest < Test::Unit::TestCase
-
-   context "push" do
-
-     context "file_name" do
-       should "prepend base_path if provided on instantiate" do
-         set_put_expectation(:key_name => 'world_cup_2010/events.xml')
-         p = S3Publisher.new('test-bucket', :logger => Logger.new(nil), :base_path => 'world_cup_2010')
-         p.push('events.xml', '1234')
-         p.run
-       end
-
-       should "pass through unaltered if base_path not specified" do
-         set_put_expectation(:key_name => 'events.xml')
-         p = S3Publisher.new('test-bucket', :logger => Logger.new(nil))
-         p.push('events.xml', '1234')
-         p.run
-       end
-     end
-
-     context "gzip" do
-       should "gzip data if :gzip => true" do
-         set_put_expectation(:data => gzip('1234'))
-         push_test_data('myfile.txt', '1234', :gzip => true)
-       end
-
-       should "not gzip data if :gzip => false" do
-         set_put_expectation(:data => '1234')
-         push_test_data('myfile.txt', '1234', :gzip => false)
-       end
-
-       should "not gzip data if file ends in .jpg" do
-         set_put_expectation(:data => '1234')
-         push_test_data('myfile.jpg', '1234', {})
-       end
-
-       should "gzip data by default" do
-         set_put_expectation(:data => gzip('1234'))
-         push_test_data('myfile.txt', '1234', {})
-       end
-     end
-
-     context "redundancy" do
-       should "set STANDARD by default" do
-         set_put_expectation(:headers => { 'x-amz-storage-class' => 'STANDARD' })
-         push_test_data('myfile.txt', '1234', {})
-       end
-
-       should "set REDUCED_REDUNDANCY if :redundancy => :reduced is passed" do
-         set_put_expectation(:headers => { 'x-amz-storage-class' => 'REDUCED_REDUNDANCY' })
-         push_test_data('myfile.txt', '1234', :redundancy => :reduced)
-       end
-     end
-
-     context "content type" do
-       should "force Content-Type to user-supplied string if provided" do
-         set_put_expectation(:headers => { 'Content-Type' => 'audio/vorbis' })
-         push_test_data('myfile.txt', '1234', :content_type => 'audio/vorbis')
-       end
-
-       should "force Content-Type to application/xml if :xml provided" do
-         set_put_expectation(:headers => { 'Content-Type' => 'application/xml' })
-         push_test_data('myfile.txt', '1234', :content_type => :xml)
-       end
-
-       should "force Content-Type to text/plain if :text provided" do
-         set_put_expectation(:headers => { 'Content-Type' => 'text/plain' })
-         push_test_data('myfile.txt', '1234', :content_type => :text)
-       end
-
-       should "force Content-Type to text/html if :html provided" do
-         set_put_expectation(:headers => { 'Content-Type' => 'text/html' })
-         push_test_data('myfile.txt', '1234', :content_type => :html)
-       end
-     end
-
-     context "cache-control" do
-       should "set Cache-Control to user-supplied string if :cache_control provided" do
-         set_put_expectation(:headers => { 'Cache-Control' => 'private, max-age=0' })
-         push_test_data('myfile.txt', '1234', :cache_control => 'private, max-age=0')
-       end
-
-       should "set Cache-Control with :ttl provided" do
-         set_put_expectation(:headers => { 'Cache-Control' => 'max-age=55' })
-         push_test_data('myfile.txt', '1234', :ttl => 55)
-       end
-
-       should "set Cache-Control to a 5s ttl if no :ttl or :cache_control was provided" do
-         set_put_expectation(:headers => { 'Cache-Control' => 'max-age=5' })
-         push_test_data('myfile.txt', '1234', {})
-       end
-     end
-
-   end
-
-   def set_put_expectation opts
-     s3_stub = mock()
-     bucket_stub = mock()
-     bucket_stub.expects(:put).with(opts[:key_name] || anything, opts[:data] || anything, {}, 'public-read', opts[:headers] ? has_entries(opts[:headers]) : anything)
-
-     s3_stub.stubs(:bucket).returns(bucket_stub)
-     RightAws::S3.stubs(:new).returns(s3_stub)
-   end
-
-   def gzip data
-     gzipped_data = StringIO.open('', 'w+')
-
-     gzip_writer = Zlib::GzipWriter.new(gzipped_data)
-     gzip_writer.write(data)
-     gzip_writer.close
-
-     return gzipped_data.string
-   end
-
-   def push_test_data file_name, data, opts
-     p = S3Publisher.new('test-bucket', :logger => Logger.new(nil))
-     p.push(file_name, data, opts)
-     p.run
-   end
-
- end
data/test/test_helper.rb DELETED
@@ -1,11 +0,0 @@
- require 'rubygems'
- require 'test/unit'
- require 'shoulda'
- require 'mocha'
-
- $LOAD_PATH.unshift(File.join(File.dirname(__FILE__), '..', 'lib'))
- $LOAD_PATH.unshift(File.dirname(__FILE__))
- require 's3-publisher'
-
- class Test::Unit::TestCase
- end