s3cp 1.0.6 → 1.1.0.pre.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
data/History.txt CHANGED
@@ -1,7 +1,19 @@
+ === 1.1.0 / (Pending)
+
+ * Changed: Underlying AWS library has been changed from 'right-aws' to the official
+   AWS SDK ('aws-sdk'). The main driver for this is to stay current on the
+   latest features, including instance-specific roles, temporary credentials,
+   etc.
+
+ * Added: New 's3buckets' command displays all S3 buckets associated with the
+   account.
+
+ * Added: New '--acl PERMISSION' option to 's3cp' and 's3up' commands.
+
  === 1.0.6 / (2012-07-23)

  * Fixed: s3cp would exit with errorlevel 0 despite errors when running in
-   interactive mode.
+   interactive mode.

  === 1.0.5 / (2012-06-25)

@@ -15,15 +27,15 @@

  * Added: Exponential backoff for `s3cp` command based on the formula:

-     delay = initial_retry_delay * (factor ^ retries)
+     delay = initial_retry_delay * (factor ^ retries)

-   where:
-   - `initial_retry_delay` is 1 second by default.
-   - `factor` is 1.4142 by default
-     (retry delay doubles every two retries)
-   - max. number of retries is now 20 by default.
+   where:
+   - `initial_retry_delay` is 1 second by default.
+   - `factor` is 1.4142 by default
+     (retry delay doubles every two retries)
+   - max. number of retries is now 20 by default.

-   (which means s3cp retries for roughly 58 minutes by default)
+   (which means s3cp retries for roughly 58 minutes by default)
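The schedule above is easy to sanity-check in a few lines of Ruby (the names mirror the changelog entry; nothing here is part of the gem itself):

```ruby
# Backoff schedule using the defaults quoted above:
# 1s initial delay, factor 1.4142 (~sqrt(2)), 20 retries.
initial_retry_delay = 1.0
factor              = 1.4142
max_retries         = 20

delays = (1..max_retries).map { |retries| initial_retry_delay * (factor ** retries) }

# factor^2 ~= 2, so the delay doubles every two retries;
# the 20 delays add up to roughly 58 minutes.
total_minutes = delays.inject(:+) / 60.0
puts "total backoff: ~#{total_minutes.round} minutes"
```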
 
  * Fixed: `s3ls` and `s3du` now properly handle Errno::EPIPE
 
@@ -33,18 +45,18 @@

  * Added: `s3up` now outputs uploaded file size to STDERR, e.g.,

-     % ls | s3up s3://bucket/path/to/file
-     s3://bucket/path/to/file => 214B
+     % ls | s3up s3://bucket/path/to/file
+     s3://bucket/path/to/file => 214B

  === 1.0.1 / 2012-03-9

  * Added: New command `s3up` to upload STDIN to S3
-   e.g. some_cmd | s3up s3://bucket/path/to/destination
+   e.g. some_cmd | s3up s3://bucket/path/to/destination

  Note: `s3up` does not yet directly stream data to S3 since right_aws gem
-   requires the upload size to be known in advance. `s3up` currently
-   persists STDIN into a temp file although this may change in the
-   future.
+   requires the upload size to be known in advance. `s3up` currently
+   persists STDIN into a temp file although this may change in the
+   future.

  === 1.0.0 / 2012-03-9

@@ -58,27 +70,27 @@ bugs -- but at least all known bugs have been addressed.
  Probably worth mentioning: there are two outstanding experimental features,

  1) s3cp --sync: Does not handle missing files. It's basically
-    a faster copy; not a true "sync" at this point.
+    a faster copy; not a true "sync" at this point.

  2) Bash command-line completion is still very rough.

  === 0.2.7 / 2012-03-09

  * Fixed #3: s3cp adds extra slash when copying to the root of a bucket.
-   e.g. s3://myBucket//someFile or s3://myBucket//someFolderToCopy/someFile
+   e.g. s3://myBucket//someFile or s3://myBucket//someFolderToCopy/someFile

  === 0.2.6 / 2012-03-06

  * Fixed: Possible division-by-zero error in s3du if it encounters
-   zero-length files.
+   zero-length files.

  === 0.2.5 / 2012-03-02

  * Added: "s3du" command to calculate disk usage.
-   Supports --depth, --regex, --unit parameters and more!
+   Supports --depth, --regex, --unit parameters and more!

  * Changed: "s3ls -l" command now accepts --unit and --precision to configure
-   file size display. Uses "smart" unit by default.
+   file size display. Uses "smart" unit by default.

  * Changed: "s3ls -l" will now use S3CP_DATE_FORMAT environment if set.

@@ -120,12 +132,12 @@ Probably worth mentioning: there are two outstanding experimental features,
  * Added: Progress bars during upload/download if $stdout.isatty

  * Fixed: s3cat now handles broken pipes properly
-   e.g. "s3cat bucket:some/file | head" will now terminate early.
+   e.g. "s3cat bucket:some/file | head" will now terminate early.

  === 0.2.1 / (2012-02-20)

  * Added: Bash completion now supports exclusions through S3CP_EXCLUDES
-   and defaults to excluding keys containing "_$folder$".
+   and defaults to excluding keys containing "_$folder$".

  * Changed: s3dir and s3ls --delimiter now display both directories and files

@@ -134,15 +146,15 @@ Probably worth mentioning: there are two outstanding experimental features,
  * Added: s3stat command to display S3 object properties

  * Added: s3dir as a shortcut for "s3ls --delimiter / ..."
-   (default to character in S3CP_DELIMITER environment variable or "/" if not defined)
+   (default to character in S3CP_DELIMITER environment variable or "/" if not defined)

  * Added: s3cp defaults can now be set using environment variables
-   S3CP_OVERWRITE, S3CP_CHECKSUM, S3CP_RETRIES, S3CP_RETRY_DELAY
+   S3CP_OVERWRITE, S3CP_CHECKSUM, S3CP_RETRIES, S3CP_RETRY_DELAY

  * Added: Support for Bash command-line completion of S3 URLs (see below).

  * Fixed: Skip checksum verification for S3 objects with invalid MD5's
-   (details @ https://forums.aws.amazon.com/message.jspa?messageID=234538)
+   (details @ https://forums.aws.amazon.com/message.jspa?messageID=234538)

  To install Bash completion for S3 URLs, add the following to ~/.bashrc:

@@ -153,10 +165,10 @@ To install Bash completion for S3 URLs, add the following to ~/.bashrc:
  === 0.1.15 / (2012-02-17)

  * Added: s3cp now automatically checks MD5 checksums during download/upload
-   and retries up to 5 times by default if the checksum fails.
-   The number of attempts may be configured using --max-attempts,
-   the retry delay may be changed with --retry-delay and the check
-   may be disabled completely using --no-checksum.
+   and retries up to 5 times by default if the checksum fails.
+   The number of attempts may be configured using --max-attempts,
+   the retry delay may be changed with --retry-delay and the check
+   may be disabled completely using --no-checksum.

  === 0.1.14 / (2012-02-09)

@@ -178,7 +190,7 @@ To install Bash completion for S3 URLs, add the following to ~/.bashrc:

  * Fixed: --max-keys now works correctly with --delimiter
  * Fixed: do not display any keys if there are no common-prefix
-   delimiter-matching keys
+   delimiter-matching keys

  === 0.1.9 / (2012-01-23)

@@ -187,18 +199,18 @@ To install Bash completion for S3 URLs, add the following to ~/.bashrc:
  === 0.1.8 / (2011-12-29)

  * Fixed: Apparently, File.new(path, File::CREAT|File::WRONLY) does not
-   truncate existing files; use File.new(path, "wb") instead.
+   truncate existing files; use File.new(path, "wb") instead.

  === 0.1.7 / (2011-12-29)

  * Fixed: s3cp would not truncate existing files when overwriting, possibly
-   resulting in corrupted files.
+   resulting in corrupted files.

  === 0.1.6 / (2011-12-16)

  * Changed: s3rm now uses multi-object delete operation for faster deletes
  * Changed: dependency on 'aboisvert-aws' instead of 'right-aws' until
-   it supports multi-object delete.
+   it supports multi-object delete.

  === 0.1.5 / 2011-10-17

@@ -211,7 +223,7 @@ To install Bash completion for S3 URLs, add the following to ~/.bashrc:
  === 0.1.3 / 2011-09-29

  * Fixed: s3cp --headers names are now converted to lowercase since underlying
-   RightAWS gem expects lowercase header names.
+   RightAWS gem expects lowercase header names.

  === 0.1.2 / 2011-09-29

data/bin/s3buckets ADDED
@@ -0,0 +1,3 @@
+ #!/usr/bin/env ruby
+ require 's3cp/s3buckets'
+
@@ -0,0 +1,35 @@
+ def set_header_options(options, headers)
+   return options unless headers
+
+   # legacy options that were previously passed as headers
+   # are now passed explicitly as options/metadata.
+   mappings = {
+     "Content-Type"        => :content_type,
+     "x-amz-acl"           => :acl,
+     "Cache-Control"       => :cache_control,
+     "x-amz-storage-class" => :reduced_redundancy
+   }
+
+   lambdas = {
+     "x-amz-storage-class" => lambda { |v| (v =~ /^REDUCED_REDUNDANCY$/i) ? true : false }
+   }
+
+   remaining = headers.dup
+   headers.each do |hk, hv|
+     mappings.each do |mk, mv|
+       if hk.to_s =~ /^#{mk}$/i
+         lambda = lambdas[mk]
+         options[mv] = lambda ? lambda.call(hv) : hv
+         remaining.delete(hk)
+       end
+     end
+   end
+
+
+   options[:metadata] = remaining unless remaining.empty?
+
+   options
+ end
+
+
+ set_header_options({}, { "Content-type" => "foo ", 'x-amz-acl' => 'private', 'foo' => 'bar', 'x-amz-storage-class' => 'REDUCED_REDUNDANCY'})
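The sample invocation on the last line of that file resolves as shown below — a runnable copy of the same mapping logic (self-contained, so it duplicates the function rather than requiring the gem; only the local variable `fn` is renamed from `lambda`):

```ruby
# Same legacy-header-to-option mapping as the added file above, copied
# so it runs standalone.
def set_header_options(options, headers)
  return options unless headers
  mappings = {
    "Content-Type"        => :content_type,
    "x-amz-acl"           => :acl,
    "Cache-Control"       => :cache_control,
    "x-amz-storage-class" => :reduced_redundancy
  }
  lambdas = {
    "x-amz-storage-class" => lambda { |v| (v =~ /^REDUCED_REDUNDANCY$/i) ? true : false }
  }
  remaining = headers.dup
  headers.each do |hk, hv|
    mappings.each do |mk, mv|
      if hk.to_s =~ /^#{mk}$/i     # header names match case-insensitively
        fn = lambdas[mk]
        options[mv] = fn ? fn.call(hv) : hv
        remaining.delete(hk)
      end
    end
  end
  options[:metadata] = remaining unless remaining.empty?  # unmatched headers become metadata
  options
end

result = set_header_options({}, "Content-type" => "image/jpg",
                                "x-amz-acl" => "private",
                                "foo" => "bar",
                                "x-amz-storage-class" => "REDUCED_REDUNDANCY")
# result == { :content_type => "image/jpg", :acl => "private",
#             :reduced_redundancy => true, :metadata => { "foo" => "bar" } }
```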
@@ -0,0 +1,41 @@
+ # Copyright (C) 2010-2012 Alex Boisvert and Bizo Inc. / All rights reserved.
+ #
+ # Licensed to the Apache Software Foundation (ASF) under one or more contributor
+ # license agreements. See the NOTICE file distributed with this work for
+ # additional information regarding copyright ownership. The ASF licenses this
+ # file to you under the Apache License, Version 2.0 (the "License"); you may not
+ # use this file except in compliance with the License. You may obtain a copy of
+ # the License at
+ #
+ #   http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+ # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+ # License for the specific language governing permissions and limitations under
+ # the License.
+
+ require 's3cp/utils'
+
+ # Parse arguments
+ options = {}
+
+ op = OptionParser.new do |opts|
+   opts.banner = "s3buckets"
+
+   opts.on("--verbose", "Verbose mode") do
+     options[:verbose] = true
+   end
+
+   opts.on_tail("-h", "--help", "Show this message") do
+     puts op
+     exit
+   end
+ end
+ op.parse!(ARGV)
+
+ s3 = S3CP.connect()
+ s3.buckets.each do |bucket|
+   puts bucket.name
+ end
+
data/lib/s3cp/s3cat.rb CHANGED
@@ -15,16 +15,9 @@
  # License for the specific language governing permissions and limitations under
  # the License.

- require 'rubygems'
- require 'extensions/kernel' if RUBY_VERSION =~ /1.8/
- require 'right_aws'
- require 'optparse'
- require 'date'
- require 'highline/import'
- require 'tempfile'
- require 'progressbar'
-
  require 's3cp/utils'
+ require 'progressbar'
+ require 'tempfile'

  # Parse arguments
  options = {}
@@ -64,12 +57,11 @@ end
  @bucket, @prefix = S3CP.bucket_and_key(url)
  fail "Your URL looks funny, doesn't it?" unless @bucket

- @s3 = S3CP.connect().interface
+ @s3 = S3CP.connect().buckets[@bucket]

  if options[:tty]
    # store contents to file to display with PAGER
-   metadata = @s3.head(@bucket, @prefix)
-   size = metadata["content-length"].to_i
+   size = @s3.objects[@prefix].content_length

    progress_bar = ProgressBar.new(File.basename(@prefix), size).tap do |p|
      p.file_transfer_mode
@@ -78,7 +70,7 @@ if options[:tty]
    file = Tempfile.new('s3cat')
    out = File.new(file.path, "wb")
    begin
-     @s3.get(@bucket, @prefix) do |chunk|
+     @s3.objects[@prefix].read_as_stream do |chunk|
        out.write(chunk)
        progress_bar.inc chunk.size
      end
@@ -89,7 +81,7 @@ if options[:tty]
    exec "#{ENV['PAGER'] || 'less'} #{file.path}"
    file.delete()
  else
-   @s3.get(@bucket, @prefix) do |chunk|
+   @s3.objects[@prefix].read_as_stream do |chunk|
      begin
        STDOUT.print(chunk)
      rescue Errno::EPIPE
data/lib/s3cp/s3cp.rb CHANGED
@@ -15,17 +15,8 @@
  # License for the specific language governing permissions and limitations under
  # the License.

- require 'rubygems'
- require 'extensions/kernel' if RUBY_VERSION =~ /1.8/
- require 'right_aws'
- require 'optparse'
- require 'date'
- require 'highline/import'
- require 'fileutils'
- require 'digest'
- require 'progressbar'
-
  require 's3cp/utils'
+ require 'progressbar'

  # Parse arguments
  options = {}
@@ -103,6 +94,10 @@ op = OptionParser.new do |opts|
    options[:headers] = h
  end

+ opts.on("--acl PERMISSION", "One of 'private', 'authenticated-read', 'public-read', 'public-read-write'") do |permission|
+   options[:acl] = S3CP.validate_acl(permission)
+ end
+
  opts.separator "        e.g.,"
  opts.separator "             HTTP headers: \'Content-Type: image/jpg\'"
  opts.separator "              AMZ headers: \'x-amz-acl: public-read\'"
@@ -243,12 +238,16 @@ end

  def s3_to_s3(bucket_from, key, bucket_to, dest, options = {})
    log(with_headers("Copy s3://#{bucket_from}/#{key} to s3://#{bucket_to}/#{dest}"))
-   if @headers.empty?
-     @s3.interface.copy(bucket_from, key, bucket_to, dest)
+   s3_source = @s3.buckets[bucket_from].objects[key]
+   s3_dest = @s3.buckets[bucket_to].objects[dest]
+   s3_options = {}
+   S3CP.set_header_options(s3_options, @headers)
+   s3_options[:acl] = options[:acl]
+   unless options[:move]
+     s3_source.copy_to(s3_dest, s3_options)
    else
-     @s3.interface.copy(bucket_from, key, bucket_to, dest, :copy, @headers)
+     s3_source.move_to(s3_dest, s3_options)
    end
-   @s3.interface.delete(bucket_from, key) if options[:move]
  end

  def local_to_s3(bucket_to, key, file, options = {})
@@ -279,37 +278,33 @@ def local_to_s3(bucket_to, key, file, options = {})
  end
  if retries > 0
    delay = options[:retry_delay] * (options[:retry_backoff] ** retries)
-   STDERR.puts "Sleeping #{delay} seconds. Will retry #{options[:retries] - retries} more time(s)."
+   STDERR.puts "Sleeping #{"%0.2f" % delay} seconds. Will retry #{options[:retries] - retries} more time(s)."
    sleep delay
  end

- f = File.open(file)
  begin
-   if $stdout.isatty
-     f = Proxy.new(f)
-     progress_bar = ProgressBar.new(File.basename(file), File.size(file)).tap do |p|
-       p.file_transfer_mode
-     end
-     class << f
-       attr_accessor :progress_bar
-
-       def read(length, buffer=nil)
-         begin
-           result = @target.read(length, buffer)
-           @progress_bar.inc result.length if result
-           result
-         rescue => e
-           STDERR.puts e
-           raise e
-         end
+   s3_options = {
+     :bucket_name => bucket_to,
+     :key => key
+   }
+   S3CP.set_header_options(s3_options, @headers)
+   s3_options[:acl] = options[:acl]
+   s3_options[:content_length] = File.size(file)
+
+   progress_bar = ProgressBar.new(File.basename(file), File.size(file)).tap do |p|
+     p.file_transfer_mode
+   end
+
+   meta = @s3.client.put_object(s3_options) do |buffer|
+     File.open(file) do |io|
+       while !io.eof?
+         result = io.read(32 * 1024)
+         progress_bar.inc result.length if result
+         buffer.write(result)
        end
      end
-     f.progress_bar = progress_bar
-   else
-     progress_bar = nil
    end

-   meta = @s3.interface.put(bucket_to, key, f, @headers)
    progress_bar.finish if progress_bar

    if options[:checksum]
@@ -329,10 +324,8 @@ def local_to_s3(bucket_to, key, file, options = {})
      progress_bar.clear
      puts "Error copying #{file} to s3://#{bucket_to}/#{key}"
    end
-   raise e if !options[:checksum] || e.to_s =~ /Denied/
+   raise e if !options[:checksum] || e.is_a?(AWS::S3::Errors::AccessDenied)
    STDERR.puts e
- ensure
-   f.close()
  end
  retries += 1
end until options[:checksum] == false || actual_md5.nil? || expected_md5 == actual_md5
@@ -353,7 +346,8 @@ def s3_to_local(bucket_from, key_from, dest, options = {})
  end
  if retries > 0
    delay = options[:retry_delay] * (options[:retry_backoff] ** retries)
-   STDERR.puts "Sleeping #{delay} seconds. Will retry #{options[:retries] - retries} more time(s)."
+   delay = delay.to_i
+   STDERR.puts "Sleeping #{"%0.2f" % delay} seconds. Will retry #{options[:retries] - retries} more time(s)."
    sleep delay
  end
  begin
@@ -383,7 +377,7 @@ def s3_to_local(bucket_from, key_from, dest, options = {})
      p.file_transfer_mode
    end
  end
- @s3.interface.get(bucket_from, key_from) do |chunk|
+ @s3.buckets[bucket_from].objects[key_from].read_as_stream do |chunk|
    f.write(chunk)
    progress_bar.inc chunk.size if progress_bar
  end
@@ -413,25 +407,23 @@ def s3_to_local(bucket_from, key_from, dest, options = {})
    retries += 1
  end until options[:checksum] == false || expected_md5.nil? || md5(dest) == expected_md5

- @s3.interface.delete(bucket_from, key_from) if options[:move]
+ @s3.buckets[bucket_from].objects[key_from].delete() if options[:move]
end

def s3_exist?(bucket, key)
- metadata = @s3.interface.head(bucket, key)
- #puts "exist? #{bucket} #{key} => #{metadata != nil}"
- (metadata != nil)
+ @s3.buckets[bucket].objects[key].exist?
end

def s3_checksum(bucket, key)
  begin
-   metadata = @s3.interface.head(bucket, key)
+   metadata = @s3.buckets[bucket].objects[key].head()
    return :not_found unless metadata
  rescue => e
-   return :not_found if e.is_a?(RightAws::AwsError) && e.http_code == "404"
+   return :not_found if e.is_a?(AWS::S3::Errors::NoSuchKey)
    raise e
  end

- md5 = metadata["etag"] or fail "Unable to get etag/md5 for #{bucket_to}:#{key}"
+ md5 = metadata[:etag] or fail "Unable to get etag/md5 for #{bucket_to}:#{key}"
  return :invalid unless md5

  md5 = md5.sub(/^"/, "").sub(/"$/, "") # strip beginning and trailing quotes
@@ -449,8 +441,7 @@ def key_path(prefix, key)
end

def s3_size(bucket, key)
- metadata = @s3.interface.head(bucket, key)
- metadata["content-length"].to_i
+ @s3.buckets[bucket].objects[key].content_length
end

def copy(from, to, options)
data/lib/s3cp/s3du.rb CHANGED
@@ -15,13 +15,6 @@
  # License for the specific language governing permissions and limitations under
  # the License.

- require 'rubygems'
- require 'extensions/kernel' if RUBY_VERSION =~ /1.8/
- require 'right_aws'
- require 'optparse'
- require 'date'
- require 'highline/import'
-
  require 's3cp/utils'

  # Parse arguments
@@ -72,9 +65,6 @@ fail "Your URL looks funny, doesn't it?" unless @bucket

  @s3 = S3CP.connect()

- s3_options = Hash.new
- s3_options[:prefix] = @prefix
-
  def depth(path)
    path.count("/")
  end
@@ -103,29 +93,25 @@ def print(key, size)
  end

  begin
-   @s3.interface.incrementally_list_bucket(@bucket, s3_options) do |page|
-
-     entries = page[:contents]
-     entries.each do |entry|
-       key = entry[:key]
-       size = entry[:size]
-
-       if options[:regex].nil? || options[:regex].match(key)
-         current_key = if actual_depth
-           pos = nth_occurrence(key, "/", actual_depth)
-           (pos != -1) ? key[0..pos-1] : key
-         end
-
-         if (last_key && last_key != current_key)
-           print(last_key, subtotal_size)
-           subtotal_size = size
-         else
-           subtotal_size += size
-         end
-
-         last_key = current_key
-         total_size += size
+   @s3.buckets[@bucket].objects.with_prefix(@prefix).each do |entry|
+     key = entry.key
+     size = entry.content_length
+
+     if options[:regex].nil? || options[:regex].match(key)
+       current_key = if actual_depth
+         pos = nth_occurrence(key, "/", actual_depth)
+         (pos != -1) ? key[0..pos-1] : key
        end
+
+       if (last_key && last_key != current_key)
+         print(last_key, subtotal_size)
+         subtotal_size = size
+       else
+         subtotal_size += size
+       end
+
+       last_key = current_key
+       total_size += size
      end
    end

data/lib/s3cp/s3ls.rb CHANGED
@@ -15,19 +15,12 @@
  # License for the specific language governing permissions and limitations under
  # the License.

- require 'rubygems'
- require 'extensions/kernel' if RUBY_VERSION =~ /1.8/
- require 'right_aws'
- require 'optparse'
- require 'date'
- require 'highline/import'
-
  require 's3cp/utils'

  # Parse arguments
  options = {}
  options[:date_format] = ENV['S3CP_DATE_FORMAT'] || '%x %X'
- options[:rows_per_page] = $terminal.output_rows if $stdout.isatty
+ options[:rows_per_page] = ($terminal.output_rows - 1) if $stdout.isatty
  options[:precision] = 0

  op = OptionParser.new do |opts|
@@ -93,53 +86,49 @@ if options[:verbose]
    puts "key #{@key}"
  end

- @s3 = S3CP.connect()
+ @s3 = S3CP.connect().buckets[@bucket]

  keys = 0
  rows = 0

- s3_options = Hash.new
- s3_options[:prefix] = @key
- s3_options["max-keys"] = options[:max_keys] if options[:max_keys] && !options[:delimiter]
- s3_options[:delimiter] = options[:delimiter] if options[:delimiter]
-
  begin
-   @s3.interface.incrementally_list_bucket(@bucket, s3_options) do |page|
-     entries = []
-     if options[:delimiter]
-       entries << { :key => page[:contents][0][:key] } if page[:contents].length > 0 && entries.length > 0
-       page[:common_prefixes].each do |entry|
-         entries << { :key => entry }
-       end
-       entries << { :key => nil }
+   display = lambda do |entry|
+     key = entry.key ? "s3://#{@bucket}/#{entry.key}" : "---"
+     if options[:long_format] && entry.last_modified && entry.content_length
+       size = entry.content_length
+       size = S3CP.format_filesize(size, :unit => options[:unit], :precision => options[:precision])
+       size = ("%#{7 + options[:precision]}s " % size)
+       puts "#{entry.last_modified.strftime(options[:date_format])} #{size} #{key}"
+     else
+       puts key
      end
-     entries += page[:contents]
-     entries.each do |entry|
-       key = entry[:key] ? "s3://#{@bucket}/#{entry[:key]}" : "---"
-       if options[:long_format] && entry[:last_modified] && entry[:size]
-         last_modified = DateTime.parse(entry[:last_modified])
-         size = entry[:size]
-         size = S3CP.format_filesize(size, :unit => options[:unit], :precision => options[:precision])
-         size = ("%#{7 + options[:precision]}s " % size)
-         puts "#{last_modified.strftime(options[:date_format])} #{size} #{key}"
-       else
-         puts key
-       end
-       rows += 1
-       keys += 1
-       if options[:max_keys] && keys >= options[:max_keys]
-         exit
-       end
-       if options[:rows_per_page] && (rows % options[:rows_per_page] == 0)
-         begin
-           print "Continue? (Y/n) "
-           response = STDIN.gets.chomp.downcase
-         end until response == 'n' || response == 'y' || response == ''
-         exit if response == 'n'
+     rows += 1
+     keys += 1
+     if options[:rows_per_page] && (rows % options[:rows_per_page] == 0)
+       begin
+         print "Continue? (Y/n) "
+         response = STDIN.gets.chomp.downcase
+       end until response == 'n' || response == 'y' || response == ''
+       exit if response == 'n'
+     end
+   end
+
+   if options[:delimiter]
+     @s3.objects.with_prefix(@key).as_tree(:delimiter => options[:delimiter], :append => false).children.each do |entry|
+       if entry.leaf?
+         entry = @s3.objects[entry.key]
+         break if display.call(entry)
        end
      end
+   else
+     s3_options = Hash.new
+     s3_options[:limit] = options[:max_keys] if options[:max_keys]
+     @s3.objects.with_prefix(@key).each(s3_options) do |entry|
+       break if display.call(entry)
+     end
    end
  rescue Errno::EPIPE
    # ignore
+   break
  end
data/lib/s3cp/s3mod.rb CHANGED
@@ -15,10 +15,6 @@
  # License for the specific language governing permissions and limitations under
  # the License.

- require 'rubygems'
- require 'extensions/kernel' if RUBY_VERSION =~ /1.8/
- require 'right_aws'
- require 'optparse'
  require 's3cp/utils'

  op = OptionParser.new do |opts|
@@ -48,14 +44,11 @@ end

  def update_permissions(s3, bucket, key, permission)
    puts "Setting #{permission} on s3://#{bucket}/#{key}"
-   s3.interface.copy(bucket, key, bucket, key, :replace, {"x-amz-acl" => permission })
+   s3.buckets[bucket].objects[key].acl = permission
  end

  source = ARGV[0]
- permission = ARGV.last
-
- LEGAL_MODS = %w{private authenticated-read public-read public-read-write}
- raise "Permissions must be one of the following values: #{LEGAL_MODS}" unless LEGAL_MODS.include?(permission)
+ permission = S3CP.validate_acl(ARGV.last)

  @s3 = S3CP.connect()
  bucket,key = S3CP.bucket_and_key(source)
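s3mod now delegates the check to S3CP.validate_acl, whose definition appears in the utils.rb portion of this diff. Reduced to a standalone sketch, the validation is a plain membership test against the expanded grant list:

```ruby
# Standalone copy of the ACL validation from data/lib/s3cp/utils.rb in this release
LEGAL_MODS = %w{
  private
  public-read
  public-read-write
  authenticated-read
  bucket_owner_read
  bucket_owner_full_control
}

def validate_acl(permission)
  if !LEGAL_MODS.include?(permission)
    raise "Permissions must be one of the following values: #{LEGAL_MODS}"
  end
  permission
end

validate_acl("public-read")    # returns "public-read" unchanged
# validate_acl("world-write")  # would raise
```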
data/lib/s3cp/s3rm.rb CHANGED
@@ -15,13 +15,6 @@
  # License for the specific language governing permissions and limitations under
  # the License.

- require 'rubygems'
- require 'extensions/kernel' if RUBY_VERSION =~ /1.8/
- require 'right_aws'
- require 'optparse'
- require 'date'
- require 'highline/import'
-
  require 's3cp/utils'

  # Parse arguments
@@ -110,20 +103,18 @@ exclude_regex = options[:exclude_regex] ? Regexp.new(options[:exclude_regex]) :
  if options[:recursive]
    matching_keys = []

-   @s3.interface.incrementally_list_bucket(@bucket, :prefix => @key) do |page|
-     page[:contents].each do |entry|
-       key = "s3://#{@bucket}/#{entry[:key]}"
+   @s3.buckets[@bucket].objects.with_prefix(@key).each do |entry|
+     key = "s3://#{@bucket}/#{entry.key}"

-       matching = true
-       matching = false if include_regex && !include_regex.match(entry[:key])
-       matching = false if exclude_regex && exclude_regex.match(entry[:key])
+     matching = true
+     matching = false if include_regex && !include_regex.match(entry.key)
+     matching = false if exclude_regex && exclude_regex.match(entry.key)

-       puts "#{key} => #{matching}" if options[:verbose]
+     puts "#{key} => #{matching}" if options[:verbose]

-       if matching
-         matching_keys << entry[:key]
-         puts key unless options[:silent] || options[:verbose]
-       end
+     if matching
+       matching_keys << entry.key
+       puts key unless options[:silent] || options[:verbose]
      end
    end

@@ -132,28 +123,20 @@ if options[:recursive]
    exit(1)
  end

- errors = []
- errors = @s3.interface.delete_multiple(@bucket, matching_keys) unless options[:test]
-
- if errors.length > 0
-   puts "Errors during deletion:"
-   errors.each do |error|
-     puts "#{error[:key]} #{error[:code]} #{error[:message]}"
-   end
-   exit(1)
- end
+ # if any of the objects failed to delete, a BatchDeleteError will be raised with a summary of the errors
+ @s3.buckets[@bucket].objects.delete(matching_keys) unless options[:test]
else
  # delete a single file; check if it exists
- if options[:fail_if_not_exist] && @s3.interface.head(@bucket, @key) == nil
+ if options[:fail_if_not_exist] && !@s3.buckets[@bucket].objects[@key].exist?
    key = "s3://#{@bucket}/#{@key}"
    puts "#{key} does not exist."
    exit(1)
  end

  begin
-   @s3.interface.delete(@bucket, @key) unless options[:test]
+   @s3.buckets[@bucket].objects[@key].delete() unless options[:test]
  rescue => e
    puts e.to_s
-   raise e unless e.to_s =~ /Not Found/
+   raise e unless e.is_a? AWS::S3::Errors::NoSuchKey
  end
end
data/lib/s3cp/s3stat.rb CHANGED
@@ -15,10 +15,6 @@
  # License for the specific language governing permissions and limitations under
  # the License.

- require 'rubygems'
- require 'extensions/kernel' if RUBY_VERSION =~ /1.8/
- require 'right_aws'
- require 'optparse'
  require 's3cp/utils'

  op = OptionParser.new do |opts|
@@ -40,15 +36,11 @@ end
  source = ARGV[0]
  permission = ARGV.last

- @s3 = S3CP.connect()
+ @bucket, @key = S3CP.bucket_and_key(source)
+ @s3 = S3CP.connect().buckets[@bucket]

- def get_metadata(bucket, key)
-   metadata = @s3.interface.head(bucket, key)
-   metadata.sort.each do |k,v|
-     puts "#{"%20s" % k} #{v}"
-   end
+ metadata = @s3.objects[@key].head
+ metadata.keys.sort { |k1, k2| k1.to_s <=> k2.to_s }.each do |k|
+   puts "#{"%30s" % k} #{metadata[k]}"
  end

- bucket,key = S3CP.bucket_and_key(source)
- get_metadata(bucket, key)
-
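The rewritten s3stat body prints the aws-sdk head metadata as a sorted, right-aligned key/value table. The formatting on its own, with a made-up metadata hash standing in for the real S3 response:

```ruby
# Hypothetical metadata hash standing in for @s3.objects[@key].head
metadata = {
  :etag           => '"abc123"',
  :content_length => 214,
  :content_type   => "image/jpg"
}

lines = metadata.keys.sort { |k1, k2| k1.to_s <=> k2.to_s }.map do |k|
  "#{"%30s" % k} #{metadata[k]}"   # right-align key names in a 30-char column
end
puts lines
```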
data/lib/s3cp/s3up.rb CHANGED
@@ -15,13 +15,8 @@
  # License for the specific language governing permissions and limitations under
  # the License.

- require 'rubygems'
- require 'extensions/kernel' if RUBY_VERSION =~ /1.8/
- require 'right_aws'
- require 'optparse'
- require 'tempfile'
-
  require 's3cp/utils'
+ require 'tempfile'

  # Parse arguments
  options = {}
@@ -37,6 +32,10 @@ op = OptionParser.new do |opts|
    options[:headers] = h
  end

+ opts.on("--acl PERMISSION", "One of 'private', 'authenticated-read', 'public-read', 'public-read-write'") do |permission|
+   options[:acl] = S3CP.validate_acl(permission)
+ end
+
  opts.separator "        e.g.,"
  opts.separator "             HTTP headers: \'Content-Type: image/jpg\'"
  opts.separator "              AMZ headers: \'x-amz-acl: public-read\'"
@@ -78,7 +77,10 @@ temp.open

  # upload temp file
  begin
-   @s3.interface.put(bucket, key, temp, @headers)
+   s3_options = {}
+   S3CP.set_header_options(s3_options, @headers)
+   s3_options[:acl] = options[:acl]
+   @s3.buckets[bucket].objects[key].write(temp, s3_options)
    STDERR.puts "s3://#{bucket}/#{key} => #{S3CP.format_filesize(temp.size)} "
  ensure
    # cleanup
data/lib/s3cp/utils.rb CHANGED
@@ -15,19 +15,37 @@
  # License for the specific language governing permissions and limitations under
  # the License.

+ require 'rubygems'
+ require 'extensions/kernel' if RUBY_VERSION =~ /1.8/
+ require 'aws/s3'
+ require 'optparse'
+ require 'date'
+ require 'highline/import'
+
  module S3CP
    extend self

    # Valid units for file size formatting
    UNITS = %w{B KB MB GB TB EB ZB YB BB}

+   LEGAL_MODS = %w{
+     private
+     public-read
+     public-read-write
+     authenticated-read
+     bucket_owner_read
+     bucket_owner_full_control
+   }
+
    # Connect to AWS S3
    def connect()
      access_key = ENV["AWS_ACCESS_KEY_ID"] || raise("Missing environment variable AWS_ACCESS_KEY_ID")
      secret_key = ENV["AWS_SECRET_ACCESS_KEY"] || raise("Missing environment variable AWS_SECRET_ACCESS_KEY")

-     logger = Logger.new('/dev/null')
-     RightAws::S3.new(access_key, secret_key, :logger => logger)
+     ::AWS::S3.new(
+       :access_key_id => access_key,
+       :secret_access_key => secret_key
+     )
    end

  # Parse URL and return bucket and key.
@@ -100,5 +118,67 @@ module S3CP
    end
  end
 
+   def set_header_options(options, headers)
+     return options unless headers
+
+     # legacy options that were previously passed as headers
+     # are now passed explicitly as options/metadata.
+     mappings = {
+       "Content-Type"        => :content_type,
+       "x-amz-acl"           => :acl,
+       "Cache-Control"       => :cache_control,
+       "x-amz-storage-class" => :reduced_redundancy
+     }
+
+     lambdas = {
+       "x-amz-storage-class" => lambda { |v| (v =~ /^REDUCED_REDUNDANCY$/i) ? true : false }
+     }
+
+     remaining = headers.dup
+     headers.each do |hk, hv|
+       mappings.each do |mk, mv|
+         if hk.to_s =~ /^#{mk}$/i
+           lambda = lambdas[mk]
+           options[mv] = lambda ? lambda.call(hv) : hv
+           remaining.delete(hk)
+         end
+       end
+     end
+
+     options[:metadata] = remaining unless remaining.empty?
+
+     options
+   end
+
+   def validate_acl(permission)
+     if !LEGAL_MODS.include?(permission)
+       raise "Permission must be one of the following values: #{LEGAL_MODS.join(', ')}"
+     end
+     permission
+   end
+ end
+
+ # Monkey-patch S3 object for download streaming
+ # https://forums.aws.amazon.com/thread.jspa?messageID=295587
+ module AWS
+
+   DEFAULT_STREAMING_CHUNK_SIZE = ENV["S3CP_STREAMING_CHUNK_SIZE"] ? ENV["S3CP_STREAMING_CHUNK_SIZE"].to_i : (512 * 1024)
+
+   class S3
+     class S3Object
+       def read_as_stream(options = nil, &blk)
+         options ||= {}
+         chunk_size = options[:chunk] || DEFAULT_STREAMING_CHUNK_SIZE
+         size = content_length
+         byte_offset = 0
+         while byte_offset < size
+           range = "bytes=#{byte_offset}-#{byte_offset + chunk_size - 1}"
+           yield read(:range => range)
+           byte_offset += chunk_size
+         end
+       end
+     end
+   end
  end
 
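The `read_as_stream` monkey-patch above fetches a large object as a series of ranged GETs instead of one buffered read. A self-contained sketch of just the chunking arithmetic (the `each_chunk` helper is hypothetical; the real method issues `read(:range => ...)` against the S3 object):

```ruby
# Enumerate the HTTP Range headers needed to fetch `size` bytes
# in `chunk_size` pieces. Range offsets are inclusive, hence the -1.
def each_chunk(size, chunk_size)
  byte_offset = 0
  while byte_offset < size
    yield "bytes=#{byte_offset}-#{byte_offset + chunk_size - 1}"
    byte_offset += chunk_size
  end
end

# A 1.2 MB object with the default 512 KB chunk takes three requests.
each_chunk(1_200_000, 512 * 1024) { |range| puts range }
```

Note that, as in the patched code, the final range may extend past the end of the object; S3 simply returns the bytes that exist, so no clamping is needed.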
data/lib/s3cp/version.rb CHANGED
@@ -16,5 +16,5 @@
  # the License.
 
  module S3CP
-   VERSION = "1.0.6"
+   VERSION = "1.1.0.pre.1"
  end
metadata CHANGED
@@ -1,13 +1,15 @@
  --- !ruby/object:Gem::Specification
  name: s3cp
  version: !ruby/object:Gem::Version
- hash: 27
- prerelease:
+ hash: -3842078234
+ prerelease: 6
  segments:
  - 1
+ - 1
  - 0
- - 6
- version: 1.0.6
+ - pre
+ - 1
+ version: 1.1.0.pre.1
  platform: ruby
  authors:
  - Alex Boisvert
@@ -15,7 +17,7 @@ autorequire:
  bindir: bin
  cert_chain: []
 
- date: 2012-07-23 00:00:00 Z
+ date: 2012-07-24 00:00:00 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  prerelease: false
@@ -50,40 +52,24 @@ dependencies:
  type: :runtime
  - !ruby/object:Gem::Dependency
  prerelease: false
- name: aboisvert_aws
+ name: aws-sdk
  version_requirements: &id003 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ~>
  - !ruby/object:Gem::Version
- hash: 7
- segments:
- - 3
- - 0
- - 0
- version: 3.0.0
- requirement: *id003
- type: :runtime
- - !ruby/object:Gem::Dependency
- prerelease: false
- name: right_http_connection
- version_requirements: &id004 !ruby/object:Gem::Requirement
- none: false
- requirements:
- - - ~>
- - !ruby/object:Gem::Version
- hash: 27
+ hash: 13
  segments:
  - 1
- - 3
- - 0
- version: 1.3.0
- requirement: *id004
+ - 5
+ - 7
+ version: 1.5.7
+ requirement: *id003
  type: :runtime
  - !ruby/object:Gem::Dependency
  prerelease: false
  name: progressbar
- version_requirements: &id005 !ruby/object:Gem::Requirement
+ version_requirements: &id004 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ~>
@@ -94,12 +80,12 @@ dependencies:
  - 10
  - 0
  version: 0.10.0
- requirement: *id005
+ requirement: *id004
  type: :runtime
  - !ruby/object:Gem::Dependency
  prerelease: false
  name: rspec
- version_requirements: &id006 !ruby/object:Gem::Requirement
+ version_requirements: &id005 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ~>
@@ -110,12 +96,12 @@ dependencies:
  - 5
  - 0
  version: 2.5.0
- requirement: *id006
+ requirement: *id005
  type: :development
  - !ruby/object:Gem::Dependency
  prerelease: false
  name: rake
- version_requirements: &id007 !ruby/object:Gem::Requirement
+ version_requirements: &id006 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ~>
@@ -126,7 +112,7 @@ dependencies:
  - 8
  - 7
  version: 0.8.7
- requirement: *id007
+ requirement: *id006
  type: :development
  description:
  email:
@@ -157,7 +143,9 @@ files:
  - lib/s3cp/s3mod.rb
  - lib/s3cp/utils.rb
  - lib/s3cp/s3cp.rb
+ - lib/s3cp/#Untitled-1#
  - lib/s3cp/s3cat.rb
+ - lib/s3cp/s3buckets.rb
  - lib/s3cp/s3rm.rb
  - lib/s3cp/s3stat.rb
  - lib/s3cp/s3up.rb
@@ -167,6 +155,7 @@ files:
  - bin/s3mv
  - bin/s3rm
  - bin/s3dir
+ - bin/s3buckets
  - bin/s3mod
  - bin/s3du
  - bin/s3cat