aws-s3 0.2.0 → 0.2.1
data/INSTALL CHANGED
@@ -26,9 +26,15 @@ Then add the following line, save and exit:
 
   aws svn://rubyforge.org/var/svn/amazon/s3/trunk
 
+If you go the svn route, be sure that you have all the dependencies installed. The list of dependencies follow.
+
 == Dependencies
 
-=== XML parsing
+sudo gem i xml-simple -ry
+sudo gem i builder -ry
+sudo gem i mime-types -ry
+
+=== XML parsing (xml-simple)
 
 AWS::S3 depends on XmlSimple (http://xml-simple.rubyforge.org/). When installing aws/s3 with
 Rubygems, this dependency will be taken care of for you. Otherwise, installation instructions are listed on the xml-simple
@@ -36,10 +42,10 @@ site.
 
 If your system has the Ruby libxml bindings installed (http://libxml.rubyforge.org/) they will be used instead of REXML (which is what XmlSimple uses). For those concerned with speed and efficiency, it would behoove you to install libxml (instructions here: http://libxml.rubyforge.org/install.html) as it is considerably faster and less expensive than REXML.
 
-=== XML generation
+=== XML generation (builder)
 
 AWS::S3 also depends on the Builder library (http://builder.rubyforge.org/ and http://rubyforge.org/projects/builder/). This will also automatically be installed for you when using Rubygems.
 
-=== Content type inference
+=== Content type inference (mime-types)
 
 AWS::S3 depends on the MIME::Types library (http://mime-types.rubyforge.org/) to infer the content type of an object that does not explicitly specify it. This library will automatically be installed for you when using Rubygems.
data/README CHANGED
@@ -135,7 +135,7 @@ You can store an object on S3 by specifying a key, its data and the name of the
     'photos'
   )
 
-The content type of the object will be infered by its extension. If the appropriate content type can not be infered, S3 defaults
+The content type of the object will be inferred by its extension. If the appropriate content type can not be inferred, S3 defaults
 to <tt>binary/octect-stream</tt>.
 
 If you want to override this, you can explicitly indicate what content type the object should have with the <tt>:content_type</tt> option:
@@ -279,7 +279,7 @@ This time we didn't have to explicitly pass in the bucket name, as the JukeBoxSo
 always use the 'jukebox' bucket.
 
 "Astute readers", as they say, may have noticed that we used the third parameter to pass in the content type,
-rather than the fourth parameter as we had the last time we created an object. If the bucket can be infered, or
+rather than the fourth parameter as we had the last time we created an object. If the bucket can be inferred, or
 is explicitly set, as we've done in the JukeBoxSong class, then the third argument can be used to pass in
 options.
data/Rakefile CHANGED
@@ -129,10 +129,12 @@ namespace :dist do
     sh %(svn ci CHANGELOG -m "Bump changelog version marker for release")
   end
 
+  package_name = lambda {|specification| File.join('pkg', "#{specification.name}-#{specification.version}")}
+
   desc 'Push a release to rubyforge'
   task :release => [:confirm_release, :clean, :add_release_marker_to_changelog, :package, :commit_changelog, :tag] do
     require 'rubyforge'
-    package = File.join('pkg', "#{spec.name}-#{spec.version}")
+    package = package_name[spec]
 
     rubyforge = RubyForge.new
     rubyforge.login
@@ -151,6 +153,12 @@ namespace :dist do
     end
   end
 
+  desc 'Upload a beta gem'
+  task :push_beta_gem => [:clobber_package, :package] do
+    beta_gem = package_name[spec]
+    sh %(scp #{beta_gem}.gem marcel@rubyforge.org:/var/www/gforge-projects/amazon/beta)
+  end
+
   task :spec do
     puts spec.to_ruby
   end
data/bin/s3sh CHANGED
@@ -1,4 +1,6 @@
 #!/usr/bin/env ruby
-s3_lib = File.dirname(__FILE__) + '/../lib/aws/s3'
-setup = File.dirname(__FILE__) + '/setup'
-exec "irb -r #{s3_lib} -r #{setup} --simple-prompt"
+s3_lib = File.dirname(__FILE__) + '/../lib/aws/s3'
+setup = File.dirname(__FILE__) + '/setup'
+irb_name = RUBY_PLATFORM =~ /mswin32/ ? 'irb.bat' : 'irb'
+
+exec "#{irb_name} -r #{s3_lib} -r #{setup} --simple-prompt"
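The s3sh patch above selects `irb.bat` on Windows because `exec` will not find a plain `irb` executable there. A minimal sketch of just that platform check (`irb_command` is a hypothetical helper name, not part of the library):

```ruby
# Sketch of the platform check used above: mswin32 builds of Ruby ship
# irb as a batch file, so exec has to be given the .bat name explicitly.
def irb_command(platform = RUBY_PLATFORM)
  platform =~ /mswin32/ ? 'irb.bat' : 'irb'
end

puts irb_command('i386-mswin32')  # the Windows case
puts irb_command('x86_64-linux')  # everywhere else
```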
@@ -121,7 +121,7 @@ module AWS #:nodoc:
   # always use the 'jukebox' bucket.
   #
   # "Astute readers", as they say, may have noticed that we used the third parameter to pass in the content type,
-  # rather than the fourth parameter as we had the last time we created an object. If the bucket can be infered, or
+  # rather than the fourth parameter as we had the last time we created an object. If the bucket can be inferred, or
   # is explicitly set, as we've done in the JukeBoxSong class, then the third argument can be used to pass in
   # options.
   #
@@ -175,10 +175,14 @@ module AWS
 
   private
     def validate_name!(name)
-      raise InvalidBucketName.new(name) unless name =~ /^[-\w]{3,255}$/
+      raise InvalidBucketName.new(name) unless name =~ /^[-\w.]{3,255}$/
     end
 
     def path(name, options = {})
+      if name.is_a?(Hash)
+        options = name
+        name    = nil
+      end
       "/#{bucket_name(name)}#{RequestOptions.process(options).to_query_string}"
     end
 end
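The widened character class above is what lets dotted bucket names through validation. A standalone sketch of the check (`valid_bucket_name?` is a hypothetical stand-in for the private `validate_name!`, which raises instead of returning a boolean):

```ruby
# The old pattern /^[-\w]{3,255}$/ rejected names containing dots;
# adding "." to the class accepts names like "step.six" while still
# refusing spaces, too-short names, and over-long names.
def valid_bucket_name?(name)
  !!(name =~ /^[-\w.]{3,255}$/)
end

puts valid_bucket_name?('step.six')      # accepted after this change
puts valid_bucket_name?('kevin spacey')  # still rejected
puts valid_bucket_name?('jo')            # too short, still rejected
```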
@@ -279,7 +283,7 @@ module AWS
   end
   alias_method :clear, :delete_all
 
-  # Buckets observer their objects and have this method called when one of their objects
+  # Buckets observe their objects and have this method called when one of their objects
   # is either stored or deleted.
   def update(action, object) #:nodoc:
     case action
@@ -22,12 +22,14 @@ module AWS
   http.start do
     request = request_method(verb).new(path, headers)
     authenticate!(request)
-    case body
-    when String then request.body = body
-    when IO then
-      request.body_stream    = body
-      request.content_length = body.lstat.size
-    end if body
+    if body
+      if body.respond_to?(:read)
+        request.body_stream    = body
+        request.content_length = body.respond_to?(:lstat) ? body.lstat.size : body.size
+      else
+        request.body = body
+      end
+    end
     http.request(request, &block)
   end
 end
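The rewritten branch replaces the `String`/`IO` case statement with duck typing, so anything readable (for example a `StringIO`) can be streamed, not just real `IO` objects. A hedged sketch of just that decision logic (`prepare_body` is a made-up name, not the library's API):

```ruby
require 'stringio'

# Anything that responds to #read is streamed; its length comes from
# #lstat (File objects) or #size (StringIO and friends). Plain strings
# are assigned directly as the request body.
def prepare_body(body)
  if body.respond_to?(:read)
    length = body.respond_to?(:lstat) ? body.lstat.size : body.size
    [:stream, length]
  else
    [:string, body.size]
  end
end

p prepare_body('hello')                # [:string, 5]
p prepare_body(StringIO.new('hello'))  # [:stream, 5]
```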
@@ -82,7 +84,8 @@ module AWS
 
   module Management #:nodoc:
     def self.included(base)
-      base.send(:class_variable_set, :@@connections, {})
+      base.cattr_accessor :connections
+      base.connections = {}
       base.extend ClassMethods
     end
 
@@ -154,10 +157,6 @@ module AWS
   end
 
   private
-    def connections
-      class_variable_get(:@@connections)
-    end
-
     def connection_name
       name
     end
@@ -88,7 +88,7 @@ module AWS
     end
   end
 
-  # Raised if the current bucket can not be infered when not explicitly specifying the target bucket in the calling
+  # Raised if the current bucket can not be inferred when not explicitly specifying the target bucket in the calling
   # method's arguments.
   class CurrentBucketNotSpecified < S3Exception
     def initialize(address)
@@ -6,7 +6,7 @@ class Hash
   query_string = ''
   query_string << '?' if include_question_mark
   query_string << inject([]) do |parameters, (key, value)|
-    parameters << [key, value].map {|element| CGI.escape(element.to_s)}.join('=')
+    parameters << [key, value].map {|element| element.to_s}.join('=')
   end.join('&')
 end
 
@@ -123,21 +123,60 @@ class Module
   alias_method :const_missing, :const_missing_from_s3_library
 end
 
+
+class Class # :nodoc:
+  def cattr_reader(*syms)
+    syms.flatten.each do |sym|
+      class_eval(<<-EOS, __FILE__, __LINE__)
+        unless defined? @@#{sym}
+          @@#{sym} = nil
+        end
+
+        def self.#{sym}
+          @@#{sym}
+        end
+
+        def #{sym}
+          @@#{sym}
+        end
+      EOS
+    end
+  end
+
+  def cattr_writer(*syms)
+    syms.flatten.each do |sym|
+      class_eval(<<-EOS, __FILE__, __LINE__)
+        unless defined? @@#{sym}
+          @@#{sym} = nil
+        end
+
+        def self.#{sym}=(obj)
+          @@#{sym} = obj
+        end
+
+        def #{sym}=(obj)
+          @@#{sym} = obj
+        end
+      EOS
+    end
+  end
+
+  def cattr_accessor(*syms)
+    cattr_reader(*syms)
+    cattr_writer(*syms)
+  end
+end if Class.instance_methods(false).grep(/^cattr_(?:reader|writer|accessor)$/).empty?
+
 module SelectiveAttributeProxy
   def self.included(klass)
     klass.extend(ClassMethods)
     klass.class_eval(<<-EVAL, __FILE__, __LINE__)
-      # Default name for attribute storage
-      @@attribute_proxy = :attributes
-      @@attribute_proxy_options = {:exclusively => true}
-
-      def self.attribute_proxy
-        @@attribute_proxy
-      end
+      cattr_accessor :attribute_proxy
+      cattr_accessor :attribute_proxy_options
 
-      def self.attribute_proxy_options
-        @@attribute_proxy_options
-      end
+      # Default name for attribute storage
+      self.attribute_proxy = :attributes
+      self.attribute_proxy_options = {:exclusively => true}
 
       private
         # By default proxy all attributes
@@ -168,9 +207,9 @@ module SelectiveAttributeProxy
       if attribute_name.is_a?(Hash)
         options = attribute_name
       else
-        class_variable_set(:@@attribute_proxy, attribute_name)
+        self.attribute_proxy = attribute_name
       end
-      class_variable_set(:@@attribute_proxy_options, options)
+      self.attribute_proxy_options = options
     end
   end
 end
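The `cattr_*` backport above is guarded so it yields to an existing definition (for example ActiveSupport's), and it generates class-variable accessors at both the class and instance level. A small usage sketch, with a compressed version of the generator (`Connectable` is an invented example class):

```ruby
# Demonstrates what cattr_accessor produces: one class variable shared
# by the class and every instance of it. The generator here is a
# simplified single-symbol version of the backport above.
class Connectable
  def self.cattr_accessor(sym)
    class_eval(<<-EOS, __FILE__, __LINE__)
      @@#{sym} = nil unless defined? @@#{sym}
      def self.#{sym};       @@#{sym};       end
      def self.#{sym}=(obj); @@#{sym} = obj; end
      def #{sym};            @@#{sym};       end
      def #{sym}=(obj);      @@#{sym} = obj; end
    EOS
  end

  cattr_accessor :connections
  self.connections = {}
end

Connectable.connections[:default] = 'open'
puts Connectable.new.connections[:default]  # an instance reads the class-level value
```

This is the same shape `Management` now relies on: `base.cattr_accessor :connections` followed by `base.connections = {}` replaces the raw `class_variable_set` call.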
@@ -11,7 +11,7 @@ module AWS
   #   'photos'
   # )
   #
-  # The content type of the object will be infered by its extension. If the appropriate content type can not be infered, S3 defaults
+  # The content type of the object will be inferred by its extension. If the appropriate content type can not be inferred, S3 defaults
   # to <tt>binary/octect-stream</tt>.
   #
   # If you want to override this, you can explicitly indicate what content type the object should have with the <tt>:content_type</tt> option:
@@ -191,8 +191,15 @@ module AWS
 
   # Fetch information about the key with <tt>name</tt> from <tt>bucket</tt>. Information includes content type, content length,
   # last modified time, and others.
+  #
+  # If the specified key does not exist, NoSuchKey is raised.
   def about(key, bucket = nil, options = {})
-    About.new(head(path!(bucket, key, options), options).headers)
+    response = head(path!(bucket, key, options), options)
+    if response.client_error?
+      raise NoSuchKey.new("No such key `#{key}'", bucket)
+    else
+      About.new(response.headers)
+    end
   end
 
   # Delete object with <tt>key</tt> from <tt>bucket</tt>.
@@ -80,9 +80,11 @@ module AWS
 
   def initialize(body)
     @body = body
-    parse
-    set_root
-    typecast_xml_in
+    unless body.strip.empty?
+      parse
+      set_root
+      typecast_xml_in
+    end
   end
 
   private
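Guarding the parse behind `body.strip.empty?` is what makes an empty response compare equal to an empty hash, as the new XmlParser test in this release checks. A sketch of the pattern with a toy parser standing in for the real XmlSimple-backed one:

```ruby
# A toy parser with the same blank-body guard: parsing callbacks only
# run when the body has content, so an empty response behaves like {}.
# The key=value splitting is a stand-in for the real XML parsing.
class TinyParser < Hash
  attr_reader :body

  def initialize(body)
    @body = body
    parse unless body.strip.empty?
  end

  private
    def parse
      body.split('&').each do |pair|
        key, value = pair.split('=')
        self[key] = value
      end
    end
end

p TinyParser.new('')          # {}
p TinyParser.new('a=1&b=2')   # {"a"=>"1", "b"=>"2"}
```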
@@ -8,7 +8,15 @@ module AWS
   @@response = nil #:nodoc:
 
   class << self
-    # List all your buckets
+    # List all your buckets.
+    #
+    #   Service.buckets
+    #   # => []
+    #
+    # For performance reasons, the bucket list will be cached. If you want avoid all caching, pass the <tt>:reload</tt>
+    # as an argument:
+    #
+    #   Service.buckets(:reload)
     def buckets
       response = get('/')
       if response.empty?
@@ -3,7 +3,7 @@ module AWS
   module VERSION #:nodoc:
     MAJOR = '0'
     MINOR = '2'
-    TINY  = '0'
+    TINY  = '1'
     BETA  = nil # Time.now.to_i.to_s
   end
 
@@ -85,7 +85,7 @@ class MultiConnectionsTest < Test::Unit::TestCase
   end
 
   def setup
-    Base.send(:class_variable_get, :@@connections).clear
+    Base.send(:connections).clear
   end
   alias_method :teardown, :setup
 
@@ -2,7 +2,7 @@ require File.dirname(__FILE__) + '/test_helper'
 
 class BucketTest < Test::Unit::TestCase
   def test_bucket_name_validation
-    valid_names = %w(123 joe step-one step_two step3 step_4 step-5)
+    valid_names = %w(123 joe step-one step_two step3 step_4 step-5 step.six)
     invalid_names = ['12', 'jo', 'kevin spacey', 'larry@wall', '', 'a' * 256]
     validate_name = Proc.new {|name| Bucket.send(:validate_name!, name)}
     valid_names.each do |valid_name|
@@ -27,11 +27,6 @@ class HashExtensionsTest < Test::Unit::TestCase
     assert qs['one=1&two=2'] || qs['two=2&one=1']
   end
 
-  def test_keys_and_values_are_url_encoded
-    hash = {'key with spaces' => 'value with spaces'}
-    assert_equal '?key+with+spaces=value+with+spaces', hash.to_query_string
-  end
-
   def test_normalized_options
     expectations = [
       [{:foo_bar => 1}, {'foo-bar' => '1'}],
@@ -136,6 +136,15 @@ class ObjectTest < Test::Unit::TestCase
       assert_equal 'f21f7c4e8ea6e34b268887b07d6da745', file.etag
     end
   end
+
+  def test_fetching_information_about_an_object_that_does_not_exist_raises_no_such_key
+    S3Object.in_test_mode do
+      S3Object.request_returns :body => '', :code => 404
+      assert_raises(NoSuchKey) do
+        S3Object.about('asdfasdfasdfas-this-does-not-exist', 'bucket does not matter')
+      end
+    end
+  end
 end
 
 class MetadataTest < Test::Unit::TestCase
@@ -79,4 +79,8 @@ class XmlParserTest < Test::Unit::TestCase
     policy = Parsing::XmlParser.new(Fixtures::Policies.policy_with_one_grant)
     assert_kind_of Array, policy['access_control_list']['grant']
   end
+
+  def test_empty_xml_response_is_not_parsed
+    assert_equal({}, Parsing::XmlParser.new(''))
+  end
 end
@@ -123,5 +123,24 @@ class RemoteBucketTest < Test::Unit::TestCase
 
     assert bucket.empty?
   end
-
+
+  def test_bucket_name_is_switched_with_options_when_bucket_is_implicit_and_options_are_passed
+    Object.const_set(:ImplicitlyNamedBucket, Class.new(Bucket))
+    ImplicitlyNamedBucket.current_bucket = TEST_BUCKET
+    assert ImplicitlyNamedBucket.objects.empty?
+
+    %w(a b c).each {|key| S3Object.store(key, 'value does not matter', TEST_BUCKET)}
+
+    assert_equal 3, ImplicitlyNamedBucket.objects.size
+
+    objects = nil
+    assert_nothing_raised do
+      objects = ImplicitlyNamedBucket.objects(:max_keys => 1)
+    end
+
+    assert objects
+    assert_equal 1, objects.size
+  ensure
+    %w(a b c).each {|key| S3Object.delete(key, TEST_BUCKET)}
+  end
 end
@@ -13,7 +13,6 @@ class RemoteS3ObjectTest < Test::Unit::TestCase
   key = 'testing_s3objects'
   value = 'testing'
   content_type = 'text/plain'
-  fetch_object_at = Proc.new {|url| Net::HTTP.get_response(URI.parse(url))}
   unauthenticated_url = ['http:/', Base.connection.http.address, TEST_BUCKET, key].join('/')
 
   # Create an object
@@ -103,7 +102,7 @@ class RemoteS3ObjectTest < Test::Unit::TestCase
 
   # Test that it is publicly readable
 
-  response = fetch_object_at[unauthenticated_url]
+  response = fetch_object_at(unauthenticated_url)
   assert (200..299).include?(response.code.to_i)
 
   # Confirm that it has no meta data
@@ -151,12 +150,12 @@ class RemoteS3ObjectTest < Test::Unit::TestCase
 
   # Confirm object is no longer publicly readable
 
-  response = fetch_object_at[unauthenticated_url]
+  response = fetch_object_at(unauthenticated_url)
   assert (400..499).include?(response.code.to_i)
 
   # Confirm object is accessible from its authenticated url
 
-  response = fetch_object_at[object.url]
+  response = fetch_object_at(object.url)
   assert (200..299).include?(response.code.to_i)
 
   # Copy the object
@@ -234,39 +233,10 @@ class RemoteS3ObjectTest < Test::Unit::TestCase
   end
 
   assert result
-
-  # Confirm we can create an object with spaces in its key
-
-  object = S3Object.new(:value => 'just some text')
-  object.key = 'name with spaces'
-  object.bucket = Bucket.find(TEST_BUCKET)
-
-  assert_nothing_raised do
-    object.store
-  end
-
-  object = nil
-  assert_nothing_raised do
-    object = S3Object.find('name with spaces', TEST_BUCKET)
-  end
-
-  assert object
-  assert_equal 'name with spaces', object.key
-
-  # Confirm authenticated url is generated correctly despite space in file name
-
-  response = fetch_object_at[object.url]
-  assert (200..299).include?(response.code.to_i)
-
-  # Confirm we can delete objects with spaces in their key
-
-  assert_nothing_raised do
-    object.delete
-  end
 end
 
 def test_content_type_inference
-  # Confirm appropriate content type is infered when not specified
+  # Confirm appropriate content type is inferred when not specified
 
   content_type_objects = {'foo.jpg' => 'image/jpeg', 'no-extension-specified' => 'binary/octet-stream', 'foo.txt' => 'text/plain'}
   content_type_objects.each_key do |key|
@@ -291,4 +261,63 @@ class RemoteS3ObjectTest < Test::Unit::TestCase
   # Get rid of objects we just created
   content_type_objects.each_key {|key| S3Object.delete(key, TEST_BUCKET) }
 end
+
+def test_body_can_be_more_than_just_string_or_io
+  require 'stringio'
+  key = 'testing-body-as-string-io'
+  io = StringIO.new('hello there')
+  S3Object.store(key, io, TEST_BUCKET)
+  assert_equal 'hello there', S3Object.value(key, TEST_BUCKET)
+ensure
+  S3Object.delete(key, TEST_BUCKET)
+end
+
+def test_fetching_information_about_an_object_that_does_not_exist_raises_no_such_key
+  assert_raises(NoSuchKey) do
+    S3Object.about('asdfasdfasdfas-this-does-not-exist', TEST_BUCKET)
+  end
+end
+
+# Regression test for http://developer.amazonwebservices.com/connect/thread.jspa?messageID=49152&tstart=0#49152
+def test_finding_an_object_with_slashes_in_its_name_does_not_escape_the_slash
+  S3Object.store('rails/1', 'value does not matter', TEST_BUCKET)
+  S3Object.store('rails/1.html', 'value does not matter', TEST_BUCKET)
+
+  object = nil
+  assert_nothing_raised do
+    object = S3Object.find('rails/1.html', TEST_BUCKET)
+  end
+
+  assert_equal 'rails/1.html', object.key
+ensure
+  %w(rails/1 rails/1.html).each {|key| S3Object.delete(key, TEST_BUCKET)}
+end
+
+def test_finding_an_object_with_spaces_in_its_name
+  assert_nothing_raised do
+    S3Object.store('name with spaces', 'value does not matter', TEST_BUCKET)
+  end
+
+  object = nil
+  assert_nothing_raised do
+    object = S3Object.find('name with spaces', TEST_BUCKET)
+  end
+
+  assert object
+  assert_equal 'name with spaces', object.key
+
+  # Confirm authenticated url is generated correctly despite space in file name
+
+  response = fetch_object_at(object.url)
+  assert (200..299).include?(response.code.to_i)
+
+ensure
+  S3Object.delete('name with spaces', TEST_BUCKET)
+end
+
+private
+  def fetch_object_at(url)
+    Net::HTTP.get_response(URI.parse(url))
+  end
+
 end
metadata CHANGED
@@ -3,8 +3,8 @@ rubygems_version: 0.9.0
 specification_version: 1
 name: aws-s3
 version: !ruby/object:Gem::Version
-  version: 0.2.0
-date: 2006-11-30 00:00:00 -06:00
+  version: 0.2.1
+date: 2006-12-04 00:00:00 -06:00
 summary: Client library for Amazon's Simple Storage Service's REST API
 require_paths:
 - lib