s3sync 0.3.4 → 1.2.5

README_s3cmd
@@ -0,0 +1,172 @@
+ Welcome to s3cmd.rb
+ -------------------
+ This is a ruby program that wraps S3 operations into a simple command-line tool.
+ It is inspired by things like rsh3ll, #sh3ll, etc., but shares no code from
+ them. It's meant as a companion utility to s3sync.rb but could be used on its
+ own (provided you have read the other readme file and know how to use s3sync in
+ theory).
+
+ I made this even though lots of other "shell"s exist, because I wanted a
+ single-operation utility, instead of a shell "environment". This lends itself
+ more to scripting, etc. Also the delete operation on rsh3ll seems to be borken
+ at the moment? =(
+
+ Users not yet familiar with s3sync should read about that first, since s3cmd and
+ s3sync share a tremendous amount of conventions and syntax. Particularly you
+ have to set up environment variables prior to calling s3cmd, and s3cmd also uses
+ the "bucket:key" syntax popularized by s3sync. Many of the options are the same
+ too. Really, go read the other readme first if you haven't used s3sync yet.
+ Otherwise you will become confused. It's OK, I'll wait.
+
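+ (For reference: at minimum you need the same AWS_ACCESS_KEY_ID and
+ AWS_SECRET_ACCESS_KEY variables that s3cmd checks for, e.g. in a unix-ish shell:
+ export AWS_ACCESS_KEY_ID=yourAccessKeyID
+ export AWS_SECRET_ACCESS_KEY=yourSecretAccessKey
+ export SSL_CERT_FILE=/path/to/your/cert.pem   # only if you use --ssl
+ Newer versions can also pick these up from an s3config.yml file; the other
+ readme has the details.)
+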
+ ....
+
+ In general, s3sync and s3cmd complement each other. s3sync is useful to perform
+ serious synchronization operations, and s3cmd allows you to do simple things
+ such as bucket management, listing, transferring single files, and the like.
+
+ Here is the usage, with examples to follow.
+
+ s3cmd.rb [options] <command> [arg(s)]        version 1.2.5
+ --help -h --verbose -v --dryrun -n
+ --ssl -s --debug -d --progress
+ --expires-in=( <# of seconds> | [#d|#h|#m|#s] )
+
+ Commands:
+ s3cmd.rb listbuckets [headers]
+ s3cmd.rb createbucket <bucket> [constraint (i.e. EU)]
+ s3cmd.rb deletebucket <bucket> [headers]
+ s3cmd.rb list <bucket>[:prefix] [max/page] [delimiter] [headers]
+ s3cmd.rb location <bucket> [headers]
+ s3cmd.rb delete <bucket>:key [headers]
+ s3cmd.rb deleteall <bucket>[:prefix] [headers]
+ s3cmd.rb get|put <bucket>:key <file> [headers]
+
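+ (The --expires-in option isn't covered by the examples below. Going by the code,
+ giving it together with "get" or "listbuckets" makes s3cmd print a signed,
+ time-limited URL instead of performing the transfer, along these lines:
+ s3cmd.rb --expires-in=1d get BucketName:TheFileOnS3.txt
+ The value is either a plain number of seconds or a shorthand such as 7d, 12h,
+ 30m, or 90s.)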
41
+
42
+ A note about [headers]
43
+ ----------------------
44
+ For some S3 operations, such as "put", you might want to specify certain headers
45
+ to the request such as Cache-Control, Expires, x-amz-acl, etc. Rather than
46
+ supporting a load of separate command-line options for these, I just allow
47
+ header specification. So to upload a file with public-read access you could
48
+ say:
49
+ s3cmd.rb put MyBucket:TheFile.txt x-amz-acl:public-read
50
+
51
+ If you don't need to add any particular headers then you can just ignore this
52
+ whole [headers] thing and pretend it's not there. This is somewhat of an
53
+ advanced option.
54
+
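+ (Headers are plain space-separated Name:value pairs, and everything after the
+ <file> argument is treated as one, so you can presumably combine several, e.g.:
+ s3cmd.rb put MyBucket:TheFile.txt TheFile.txt x-amz-acl:public-read Cache-Control:max-age=86400
+ to set both the ACL and the Cache-Control header on the stored object.)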
+
+ Examples
+ --------
+ List all the buckets your account owns:
+ s3cmd.rb listbuckets
+
+ Create a new bucket:
+ s3cmd.rb createbucket BucketName
+
+ Create a new bucket in the EU:
+ s3cmd.rb createbucket BucketName EU
+
+ Find out the location constraint of a bucket:
+ s3cmd.rb location BucketName
+
+ Delete an old bucket you don't want any more:
+ s3cmd.rb deletebucket BucketName
+
+ Find out what's in a bucket, 10 lines at a time:
+ s3cmd.rb list BucketName 10
+
+ Only look in a particular prefix:
+ s3cmd.rb list BucketName:startsWithThis
+
+ Look in the virtual "directory" named foo;
+ lists sub-"directories" and keys that are at this level.
+ Note that if you specify a delimiter you must specify a max before it.
+ (until I make the options parsing smarter)
+ s3cmd.rb list BucketName:foo/ 10 /
+
+ Delete a key:
+ s3cmd.rb delete BucketName:AKey
+
+ Delete all keys that match (like a combo between list and delete):
+ s3cmd.rb deleteall BucketName:SomePrefix
+
+ Only pretend you're going to delete all keys that match, but list them:
+ s3cmd.rb --dryrun deleteall BucketName:SomePrefix
+
+ Delete all keys in a bucket (leaving the bucket):
+ s3cmd.rb deleteall BucketName
+
+ Get a file from S3 and store it in a local file:
+ s3cmd.rb get BucketName:TheFileOnS3.txt ALocalFile.txt
+
+ Put a local file up to S3:
+ Note that we don't automatically set the MIME type, etc.
+ NOTE that the order of the arguments doesn't change. S3 stays first!
+ s3cmd.rb put BucketName:TheFileOnS3.txt ALocalFile.txt
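+
+ Since the MIME type isn't set for you, you can presumably supply it yourself as
+ a Content-Type header (see "A note about [headers]" above), something like:
+ s3cmd.rb put BucketName:ThePage.html ALocalPage.html Content-Type:text/html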
+
+
+ Change Log:
+ -----------
+ 2006-10-14
+ Created.
+ -----------
+
+ 2006-10-16
+ Version 1.0.1
+ Force content length to a string value since some rubies don't convert it right.
+ -----------
+
+ 2006-10-25
+ UTF-8 fixes.
+ -----------
+
+ 2006-11-28
+ Version 1.0.3
+ Added a couple more error catches to s3try.
+ ----------
+
+ 2007-01-25
+ Version 1.0.4
+ Peter Fales' marker fix.
+ Also, markers should be decoded into native charset (because that's what s3
+ expects to see).
+ ----------
+
+ 2007-02-19
+ Updated s3try and s3_s3sync_mod to allow SSL_CERT_FILE.
+ ----------
+
+ 2007-02-25
+ Added --progress.
+ ----------
+
+ 2007-07-12
+ Version 1.0.6
+ Added Alastair Brunton's yaml config code.
+ ----------
+
+ 2007-11-17
+ Version 1.2.1
+ Compatibility for S3 API revisions.
+ When retries are exhausted, emit an error.
+ ----------
+
+ 2007-11-20
+ Version 1.2.2
+ Handle EU bucket 307 redirects (in s3try.rb).
+ ----------
+
+ 2007-11-20
+ Version 1.2.3
+ Fix SSL verification settings that broke in new S3 API.
+ ----------
+
+ 2008-01-06
+ Version 1.2.4
+ Run from any dir (search "here" for includes).
+ Search out s3config.yml in some likely places.
+ Reset connection (properly) on retry-able non-50x errors.
+ Fix calling format bug preventing it from working from yml.
+ Added http proxy support.
+ ----------
+
+
+ FNORD

Rakefile
@@ -0,0 +1,35 @@
+ require 'rubygems'
+ require 'rake'
+ require 'rake/clean'
+ require 'rake/testtask'
+ require 'rake/packagetask'
+ require 'rake/gempackagetask'
+ require 'rake/rdoctask'
+ require File.join(File.dirname(__FILE__), 'lib', 'version')
+
+ Gem::manage_gems
+
+ readmes = ["README","README_s3cmd"]
+
+ spec = Gem::Specification.new do |s|
+   s.platform = Gem::Platform::RUBY
+   s.name = "s3sync"
+   s.version = S3sync::VERSION::STRING
+   s.author = ""
+   s.email = ""
+   s.homepage = "http://s3sync.net/"
+   s.rubyforge_project = "s3sync"
+   s.summary = "rsync-like client for backing up to Amazon's S3"
+   s.files = Dir.glob("{bin,lib,docs}/**/*.rb") + ["Rakefile", "setup.rb", "CHANGELOG"] + readmes
+   s.require_path = "lib"
+   s.executables = ['s3sync','s3cmd']
+   s.has_rdoc = true
+   s.extra_rdoc_files = readmes
+ end
+ Rake::GemPackageTask.new(spec) do |pkg|
+   pkg.need_zip = true
+   pkg.need_tar = true
+ end
+ task :default => "pkg/#{spec.name}-#{spec.version}.gem" do
+   puts "generated latest version"
+ end
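+
+ # Usage sketch (assuming the rake/rubygems versions this file was written for;
+ # the version string comes from lib/version, 1.2.5 for this release):
+ #   rake           -- builds pkg/s3sync-1.2.5.gem via the default task
+ #   rake package   -- should also produce .zip and .tgz archives (need_zip/need_tar)
+ #   gem install pkg/s3sync-1.2.5.gem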

s3cmd.rb
@@ -0,0 +1,245 @@
+ #!/usr/bin/env ruby
+ # This software code is made available "AS IS" without warranties of any
+ # kind. You may copy, display, modify and redistribute the software
+ # code either by itself or as incorporated into your code; provided that
+ # you do not remove any proprietary notices. Your use of this software
+ # code is at your own risk and you waive any claim against the author
+ # with respect to your use of this software code.
+ # (c) 2007 s3sync.net
+ #
+
+ module S3sync
+
+   # always look "here" for include files (thanks aktxyz)
+   $LOAD_PATH << File.expand_path(File.dirname(__FILE__))
+
+   require 's3try'
+
+   $S3CMD_VERSION = '1.2.5'
+
+   require 'getoptlong'
+
+   # after other mods, so we don't overwrite yaml vals with defaults
+   require 's3config'
+   include S3Config
+
+   def S3sync.s3cmdMain
+     # ---------- OPTIONS PROCESSING ---------- #
+
+     $S3syncOptions = Hash.new
+     optionsParser = GetoptLong.new(
+       [ '--help',       '-h', GetoptLong::NO_ARGUMENT ],
+       [ '--ssl',        '-s', GetoptLong::NO_ARGUMENT ],
+       [ '--verbose',    '-v', GetoptLong::NO_ARGUMENT ],
+       [ '--dryrun',     '-n', GetoptLong::NO_ARGUMENT ],
+       [ '--debug',      '-d', GetoptLong::NO_ARGUMENT ],
+       [ '--progress',         GetoptLong::NO_ARGUMENT ],
+       [ '--expires-in',       GetoptLong::REQUIRED_ARGUMENT ]
+     )
+
+     def S3sync.s3cmdUsage(message = nil)
+       $stderr.puts message if message
+       name = $0.split('/').last
+       $stderr.puts <<"ENDUSAGE"
+ #{name} [options] <command> [arg(s)]\t\tversion #{$S3CMD_VERSION}
+   --help    -h        --verbose     -v     --dryrun    -n
+   --ssl     -s        --debug       -d     --progress
+   --expires-in=( <# of seconds> | [#d|#h|#m|#s] )
+
+ Commands:
+ #{name}  listbuckets  [headers]
+ #{name}  createbucket  <bucket>  [constraint (i.e. EU)]
+ #{name}  deletebucket  <bucket>  [headers]
+ #{name}  list  <bucket>[:prefix]  [max/page]  [delimiter]  [headers]
+ #{name}  location  <bucket>  [headers]
+ #{name}  delete  <bucket>:key  [headers]
+ #{name}  deleteall  <bucket>[:prefix]  [headers]
+ #{name}  get|put  <bucket>:key  <file>  [headers]
+ ENDUSAGE
+       exit
+     end #usage
+
+     begin
+       optionsParser.each {|opt, arg| $S3syncOptions[opt] = (arg || true)}
+     rescue StandardError
+       s3cmdUsage # the parser already printed an error message
+     end
+     s3cmdUsage if $S3syncOptions['--help']
+     $S3syncOptions['--verbose'] = true if $S3syncOptions['--dryrun'] or $S3syncOptions['--debug'] or $S3syncOptions['--progress']
+     $S3syncOptions['--ssl'] = true if $S3syncOptions['--ssl'] # change from "" to true to appease s3 port chooser
+
+     if $S3syncOptions['--expires-in'] =~ /d|h|m|s/
+       e = $S3syncOptions['--expires-in']
+       days    = (e =~ /(\d+)d/)? (/(\d+)d/.match(e))[1].to_i : 0
+       hours   = (e =~ /(\d+)h/)? (/(\d+)h/.match(e))[1].to_i : 0
+       minutes = (e =~ /(\d+)m/)? (/(\d+)m/.match(e))[1].to_i : 0
+       seconds = (e =~ /(\d+)s/)? (/(\d+)s/.match(e))[1].to_i : 0
+       $S3syncOptions['--expires-in'] = seconds + 60 * ( minutes + 60 * ( hours + 24 * ( days ) ) )
+     end
+
+     # ---------- CONNECT ---------- #
+     S3sync::s3trySetup
+     # ---------- COMMAND PROCESSING ---------- #
+
+     command, path, file = ARGV
+
+     s3cmdUsage("You didn't set up your environment variables; see README.txt") if not($AWS_ACCESS_KEY_ID and $AWS_SECRET_ACCESS_KEY)
+     s3cmdUsage("Need a command (etc)") if not command
+
+     path = '' unless path
+     path = path.dup # modifiable
+     path += ':' unless path.match(':')
+     bucket = (/^(.*?):/.match(path))[1]
+     path.replace((/:(.*)$/.match(path))[1])
+
+     case command
+     when "delete"
+       s3cmdUsage("Need a bucket") if bucket == ''
+       s3cmdUsage("Need a key") if path == ''
+       headers = hashPairs(ARGV[2...ARGV.length])
+       $stderr.puts "delete #{bucket}:#{path} #{headers.inspect if headers}" if $S3syncOptions['--verbose']
+       S3try(:delete, bucket, path) unless $S3syncOptions['--dryrun']
+     when "deleteall"
+       s3cmdUsage("Need a bucket") if bucket == ''
+       headers = hashPairs(ARGV[2...ARGV.length])
+       $stderr.puts "delete ALL entries in #{bucket}:#{path} #{headers.inspect if headers}" if $S3syncOptions['--verbose']
+       more = true
+       marker = nil
+       while more do
+         res = s3cmdList(bucket, path, nil, nil, marker)
+         res.entries.each do |item|
+           # the s3 commands (with my modified UTF-8 conversion) expect native char encoding input
+           key = Iconv.iconv($S3SYNC_NATIVE_CHARSET, "UTF-8", item.key).join
+           $stderr.puts "delete #{bucket}:#{key} #{headers.inspect if headers}" if $S3syncOptions['--verbose']
+           S3try(:delete, bucket, key) unless $S3syncOptions['--dryrun']
+         end
+         more = res.properties.is_truncated
+         marker = (res.properties.next_marker)? res.properties.next_marker : ((res.entries.length > 0) ? res.entries.last.key : nil)
+         # get this into local charset; when we pass it to s3 that is what's expected
+         marker = Iconv.iconv($S3SYNC_NATIVE_CHARSET, "UTF-8", marker).join if marker
+       end
+     when "list"
+       s3cmdUsage("Need a bucket") if bucket == ''
+       max, delim = ARGV[2..3]
+       headers = hashPairs(ARGV[4...ARGV.length])
+       $stderr.puts "list #{bucket}:#{path} #{max} #{delim} #{headers.inspect if headers}" if $S3syncOptions['--verbose']
+       puts "--------------------"
+
+       more = true
+       marker = nil
+       while more do
+         res = s3cmdList(bucket, path, max, delim, marker, headers)
+         if delim
+           res.common_prefix_entries.each do |item|
+             puts "dir: " + Iconv.iconv($S3SYNC_NATIVE_CHARSET, "UTF-8", item.prefix).join
+           end
+           puts "--------------------"
+         end
+         res.entries.each do |item|
+           puts Iconv.iconv($S3SYNC_NATIVE_CHARSET, "UTF-8", item.key).join
+         end
+         if res.properties.is_truncated
+           printf "More? Y/n: "
+           # strip the trailing newline before matching, so answering "n" actually stops the listing
+           more = (STDIN.gets.strip.match('^[Yy]?$'))
+           marker = (res.properties.next_marker)? res.properties.next_marker : ((res.entries.length > 0) ? res.entries.last.key : nil)
+           # get this into local charset; when we pass it to s3 that is what's expected
+           marker = Iconv.iconv($S3SYNC_NATIVE_CHARSET, "UTF-8", marker).join if marker
+         else
+           more = false
+         end
+       end # more
+     when "listbuckets"
+       headers = hashPairs(ARGV[1...ARGV.length])
+       $stderr.puts "list all buckets #{headers.inspect if headers}" if $S3syncOptions['--verbose']
+       if $S3syncOptions['--expires-in']
+         $stdout.puts S3url(:list_all_my_buckets, headers)
+       else
+         res = S3try(:list_all_my_buckets, headers)
+         res.entries.each do |item|
+           puts item.name
+         end
+       end
+     when "createbucket"
+       s3cmdUsage("Need a bucket") if bucket == ''
+       lc = ''
+       if(ARGV.length > 2)
+         lc = '<CreateBucketConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01"><LocationConstraint>' + ARGV[2] + '</LocationConstraint></CreateBucketConfiguration>'
+       end
+       $stderr.puts "create bucket #{bucket} #{lc}" if $S3syncOptions['--verbose']
+       S3try(:create_bucket, bucket, lc) unless $S3syncOptions['--dryrun']
+     when "deletebucket"
+       s3cmdUsage("Need a bucket") if bucket == ''
+       headers = hashPairs(ARGV[2...ARGV.length])
+       $stderr.puts "delete bucket #{bucket} #{headers.inspect if headers}" if $S3syncOptions['--verbose']
+       S3try(:delete_bucket, bucket, headers) unless $S3syncOptions['--dryrun']
+     when "location"
+       s3cmdUsage("Need a bucket") if bucket == ''
+       headers = hashPairs(ARGV[2...ARGV.length])
+       query = Hash.new
+       query['location'] = 'location'
+       $stderr.puts "location request bucket #{bucket} #{query.inspect} #{headers.inspect if headers}" if $S3syncOptions['--verbose']
+       S3try(:get_query_stream, bucket, '', query, headers, $stdout) unless $S3syncOptions['--dryrun']
+     when "get"
+       s3cmdUsage("Need a bucket") if bucket == ''
+       s3cmdUsage("Need a key") if path == ''
+       s3cmdUsage("Need a file") if file == ''
+       headers = hashPairs(ARGV[3...ARGV.length])
+       $stderr.puts "get from key #{bucket}:#{path} into #{file} #{headers.inspect if headers}" if $S3syncOptions['--verbose']
+       unless $S3syncOptions['--dryrun']
+         if $S3syncOptions['--expires-in']
+           $stdout.puts S3url(:get, bucket, path, headers)
+         else
+           outStream = File.open(file, 'wb')
+           outStream = ProgressStream.new(outStream) if $S3syncOptions['--progress']
+           S3try(:get_stream, bucket, path, headers, outStream)
+           outStream.close
+         end
+       end
+     when "put"
+       s3cmdUsage("Need a bucket") if bucket == ''
+       s3cmdUsage("Need a key") if path == ''
+       s3cmdUsage("Need a file") if file == ''
+       headers = hashPairs(ARGV[3...ARGV.length])
+       stream = File.open(file, 'rb')
+       stream = ProgressStream.new(stream, File.stat(file).size) if $S3syncOptions['--progress']
+       s3o = S3::S3Object.new(stream, {}) # support meta later?
+       headers['Content-Length'] = FileTest.size(file).to_s
+       $stderr.puts "put to key #{bucket}:#{path} from #{file} #{headers.inspect if headers}" if $S3syncOptions['--verbose']
+       S3try(:put, bucket, path, s3o, headers) unless $S3syncOptions['--dryrun']
+       stream.close
+     else
+       s3cmdUsage
+     end
+
+   end #main
+   def S3sync.s3cmdList(bucket, path, max=nil, delim=nil, marker=nil, headers={})
+     debug(max)
+     options = Hash.new
+     options['prefix'] = path # start at the right depth
+     options['max-keys'] = max ? max.to_s : 100
+     options['delimiter'] = delim if delim
+     options['marker'] = marker if marker
+     S3try(:list_bucket, bucket, options, headers)
+   end
+
+   # turn an array into a hash of pairs
+   def S3sync.hashPairs(ar)
+     ret = Hash.new
+     ar.each do |item|
+       name = (/^(.*?):/.match(item))[1]
+       item = (/^.*?:(.*)$/.match(item))[1]
+       ret[name] = item
+     end if ar
+     ret
+   end
+ end #module
+
+
+
+ def debug(str)
+   $stderr.puts str if $S3syncOptions['--debug']
+ end
+
+ S3sync::s3cmdMain #go!