fakes3 0.2.5 → 1.0.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: a7c7f31dbcec54846627f3c63f998eb7fa82fa56
- data.tar.gz: e0f13efeb4b223f9094360a42837ba8c4830f7b0
+ metadata.gz: 16457cde1a8c9b7d7f0dc52dd894a7c868096318
+ data.tar.gz: 18428ffb5399d332c335c27fcba9927a7d216016
  SHA512:
- metadata.gz: fb67ce99ae0fa4eb1948584e338b6ccebeb542154f2f1c606632614283b5d950d4d9056a118972c20ac623436c7ed3685b05dd41672fd7183d923e068e724910
- data.tar.gz: ce6b1e73a3ab5464dcaad3285200b51e9eb4e16ddd8f153196a0dff80381d5e370f4c5c6b97f2fa67149f3c5c4709d36260de9cc58167f10450a2847bb2548ab
+ metadata.gz: b135cbfe0ae4ded874344b00ee0977119791e43da5f6dab25ca0813a07f9cf6563ee2ca1cb1f3fdabe146ed250ba33416654f948401e5525e2827c7f1ed55cd9
+ data.tar.gz: 09ba62b7df31ecc535b1912117b813bf47217bbe5e7c321153cb684447160b8337f13e3a7d8e177fff8ebd3886b908939abb0bf691ad3e9f075e4a511e799e69
data/Dockerfile ADDED
@@ -0,0 +1,13 @@
+ FROM alpine:3.4
+
+ RUN apk add --no-cache --update ruby ruby-dev ruby-bundler python py-pip git build-base libxml2-dev libxslt-dev
+ RUN pip install boto s3cmd
+
+ COPY fakes3.gemspec Gemfile Gemfile.lock /app/
+ COPY lib/fakes3/version.rb /app/lib/fakes3/
+
+ WORKDIR /app
+
+ RUN bundle install
+
+ COPY . /app/
data/Gemfile CHANGED
@@ -1,4 +1,3 @@
  source 'https://rubygems.org'
- gem 'fakes3', :path => '.' # for dev and test, use local fakes3
  # Specify your gem's dependencies in fakes3.gemspec
  gemspec
data/Gemfile.lock CHANGED
@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- fakes3 (0.2.5)
+ fakes3 (1.0.0)
  builder
  thor

data/ISSUE_TEMPLATE.md ADDED
@@ -0,0 +1,9 @@
+ Thanks for the issue! We do have a few small rules to follow:
+
+ 1. Please add [Feature], [Bug], [Question], or [Other] to your title.
+
+ 2. If it is a bug, please add steps to reproduce it. More information is almost always helpful.
+
+ 3. If it is a feature, is there any pull request you should refer to? If so, please add a link to the pull request somewhere in the comments.
+
+ Thanks!
data/Makefile ADDED
@@ -0,0 +1,7 @@
+ .PHONY: test
+
+ build-test-container:
+ docker build -t fake-s3 .
+
+ test: build-test-container
+ docker run --rm --add-host="posttest.localhost:127.0.0.1" -e "RUBYOPT=-W0" fake-s3 sh -c "rake test_server & rake test"
data/PULL_REQUEST_TEMPLATE.md ADDED
@@ -0,0 +1,9 @@
+ Thanks for the pull request! We do have a few small rules to follow:
+
+ 1. Ensure tests pass before creating the pull request.
+
+ 2. Please use a coding style similar to the rest of the file(s) you change. This isn't a hard rule but if it's too different, it will stick out.
+
+ 3. We have a contributor license agreement (CLA) based off of Google and Apache's CLA. If you would feel comfortable contributing to, say, Angular.js, you should feel comfortable with this CLA. Unless you've previously signed, please sign at: https://docs.google.com/forms/d/e/1FAIpQLSeKKSKNNz5ji1fd5bbu5RaGFbhD45zEaCnAjzBZPpzOaXQsvQ/viewform
+
+ To read more about all three of the above, visit: https://github.com/jubos/fake-s3/blob/master/CONTRIBUTING.md
data/README.md CHANGED
@@ -1,3 +1,5 @@
+ ![Fake S3](static/logo.png "Fake S3")
+
  ## Introduction

  Fake S3 is a lightweight server that responds to the same API of Amazon S3.
@@ -19,6 +21,14 @@ To run the server, you just specify a root and a port.

  fakes3 -r /mnt/fakes3_root -p 4567

+ ## Licensing
+
+ As of the latest version, we are licensing with Supported Source. To get a license, visit:
+
+ https://supportedsource.org/projects/fake-s3
+
+ Depending on your company's size, the license may be free. It is also free for individuals.
+
  ## Connecting to Fake S3

  Take a look at the test cases to see client example usage. For now, Fake S3 is
@@ -29,6 +39,4 @@ Here is a running list of [supported clients](https://github.com/jubos/fake-s3/w

  ## Contributing

- Contributions in the form of pull requests, bug reports, documentation, or anything else are welcome! Please read the CONTRIBUTING.md file for more info:
-
- (https://github.com/jubos/fake-s3/CONTRIBUTING.md)[https://github.com/jubos/fake-s3/CONTRIBUTING.md]
+ Contributions in the form of pull requests, bug reports, documentation, or anything else are welcome! Please read the CONTRIBUTING.md file for more info: [CONTRIBUTING.md](https://github.com/jubos/fake-s3/blob/master/CONTRIBUTING.md)
data/fakes3.gemspec CHANGED
@@ -6,13 +6,12 @@ Gem::Specification.new do |s|
  s.version = FakeS3::VERSION
  s.platform = Gem::Platform::RUBY
  s.authors = ["Curtis Spencer"]
- s.email = ["thorin@gmail.com"]
+ s.email = ["fakes3@supportedsource.org"]
  s.homepage = "https://github.com/jubos/fake-s3"
- s.summary = %q{Fake S3 is a server that simulates S3 commands so you can test your S3 functionality in your projects}
- s.description = %q{Use Fake S3 to test basic S3 functionality without actually connecting to S3}
- s.license = "MIT"
-
- s.rubyforge_project = "fakes3"
+ s.summary = %q{Fake S3 is a server that simulates Amazon S3 commands so you can test your S3 functionality in your projects}
+ s.description = %q{Use Fake S3 to test basic Amazon S3 functionality without actually connecting to AWS}
+ s.license = "Supported-Source"
+ s.post_install_message = "Fake S3: if you don't already have a license for Fake S3, you can get one at https://supportedsource.org/projects/fake-s3"

  s.add_development_dependency "bundler", ">= 1.0.0"
  s.add_development_dependency "aws-s3"
data/lib/fakes3/file_store.rb CHANGED
@@ -8,7 +8,8 @@ require 'yaml'

  module FakeS3
  class FileStore
- SHUCK_METADATA_DIR = ".fakes3_metadataFFF"
+ FAKE_S3_METADATA_DIR = ".fakes3_metadataFFF"
+
  # S3 clients with overly strict date parsing fails to parse ISO 8601 dates
  # without any sub second precision (e.g. jets3t v0.7.2), and the examples
  # given in the official AWS S3 documentation specify three (3) decimals for
@@ -51,7 +52,7 @@ module FakeS3
  end

  def get_bucket_folder(bucket)
- File.join(@root,bucket.name)
+ File.join(@root, bucket.name)
  end

  def get_bucket(bucket)
@@ -59,8 +60,8 @@ module FakeS3
  end

  def create_bucket(bucket)
- FileUtils.mkdir_p(File.join(@root,bucket))
- bucket_obj = Bucket.new(bucket,Time.now,[])
+ FileUtils.mkdir_p(File.join(@root, bucket))
+ bucket_obj = Bucket.new(bucket, Time.now, [])
  if !@bucket_hash[bucket]
  @buckets << bucket_obj
  @bucket_hash[bucket] = bucket_obj
@@ -76,20 +77,20 @@ module FakeS3
  @bucket_hash.delete(bucket_name)
  end

- def get_object(bucket,object_name, request)
+ def get_object(bucket, object_name, request)
  begin
  real_obj = S3Object.new
- obj_root = File.join(@root,bucket,object_name,SHUCK_METADATA_DIR)
- metadata = YAML.load(File.open(File.join(obj_root,"metadata"),'rb'))
+ obj_root = File.join(@root,bucket,object_name,FAKE_S3_METADATA_DIR)
+ metadata = File.open(File.join(obj_root, "metadata")) { |file| YAML::load(file) }
  real_obj.name = object_name
  real_obj.md5 = metadata[:md5]
  real_obj.content_type = metadata.fetch(:content_type) { "application/octet-stream" }
- #real_obj.io = File.open(File.join(obj_root,"content"),'rb')
- real_obj.io = RateLimitableFile.open(File.join(obj_root,"content"),'rb')
+ real_obj.content_encoding = metadata.fetch(:content_encoding)
+ real_obj.io = RateLimitableFile.open(File.join(obj_root, "content"), 'rb')
  real_obj.size = metadata.fetch(:size) { 0 }
  real_obj.creation_date = File.ctime(obj_root).utc.iso8601(SUBSECOND_PRECISION)
  real_obj.modified_date = metadata.fetch(:modified_date) do
- File.mtime(File.join(obj_root,"content")).utc.iso8601(SUBSECOND_PRECISION)
+ File.mtime(File.join(obj_root, "content")).utc.iso8601(SUBSECOND_PRECISION)
  end
  real_obj.custom_metadata = metadata.fetch(:custom_metadata) { {} }
  return real_obj
@@ -100,27 +101,27 @@ module FakeS3
  end
  end

- def object_metadata(bucket,object)
+ def object_metadata(bucket, object)
  end

  def copy_object(src_bucket_name, src_name, dst_bucket_name, dst_name, request)
- src_root = File.join(@root,src_bucket_name,src_name,SHUCK_METADATA_DIR)
- src_metadata_filename = File.join(src_root,"metadata")
- src_metadata = YAML.load(File.open(src_metadata_filename,'rb').read)
- src_content_filename = File.join(src_root,"content")
+ src_root = File.join(@root,src_bucket_name,src_name,FAKE_S3_METADATA_DIR)
+ src_metadata_filename = File.join(src_root, "metadata")
+ src_metadata = YAML.load(File.open(src_metadata_filename, 'rb').read)
+ src_content_filename = File.join(src_root, "content")

  dst_filename= File.join(@root,dst_bucket_name,dst_name)
  FileUtils.mkdir_p(dst_filename)

- metadata_dir = File.join(dst_filename,SHUCK_METADATA_DIR)
+ metadata_dir = File.join(dst_filename,FAKE_S3_METADATA_DIR)
  FileUtils.mkdir_p(metadata_dir)

- content = File.join(metadata_dir,"content")
- metadata = File.join(metadata_dir,"metadata")
+ content = File.join(metadata_dir, "content")
+ metadata = File.join(metadata_dir, "metadata")

  if src_bucket_name != dst_bucket_name || src_name != dst_name
- File.open(content,'wb') do |f|
- File.open(src_content_filename,'rb') do |input|
+ File.open(content, 'wb') do |f|
+ File.open(src_content_filename, 'rb') do |input|
  f << input.read
  end
  end
@@ -147,10 +148,11 @@ module FakeS3
  obj.name = dst_name
  obj.md5 = src_metadata[:md5]
  obj.content_type = src_metadata[:content_type]
+ obj.content_encoding = src_metadata[:content_encoding]
  obj.size = src_metadata[:size]
  obj.modified_date = src_metadata[:modified_date]

- src_obj = src_bucket.find(src_name)
+ src_bucket.find(src_name)
  dst_bucket.add(obj)
  return obj
  end
@@ -164,10 +166,10 @@ module FakeS3
  match = content_type.match(/^multipart\/form-data; boundary=(.+)/)
  boundary = match[1] if match
  if boundary
- boundary = WEBrick::HTTPUtils::dequote(boundary)
+ boundary = WEBrick::HTTPUtils::dequote(boundary)
  form_data = WEBrick::HTTPUtils::parse_form_data(request.body, boundary)

- if form_data['file'] == nil or form_data['file'] == ""
+ if form_data['file'] == nil || form_data['file'] == ""
  raise WEBrick::HTTPStatus::BadRequest
  end

@@ -181,18 +183,18 @@ module FakeS3

  def do_store_object(bucket, object_name, filedata, request)
  begin
- filename = File.join(@root,bucket.name,object_name)
+ filename = File.join(@root, bucket.name, object_name)
  FileUtils.mkdir_p(filename)

- metadata_dir = File.join(filename,SHUCK_METADATA_DIR)
+ metadata_dir = File.join(filename, FAKE_S3_METADATA_DIR)
  FileUtils.mkdir_p(metadata_dir)

- content = File.join(filename,SHUCK_METADATA_DIR,"content")
- metadata = File.join(filename,SHUCK_METADATA_DIR,"metadata")
+ content = File.join(filename, FAKE_S3_METADATA_DIR, "content")
+ metadata = File.join(filename, FAKE_S3_METADATA_DIR, "metadata")

  File.open(content,'wb') { |f| f << filedata }

- metadata_struct = create_metadata(content,request)
+ metadata_struct = create_metadata(content, request)
  File.open(metadata,'w') do |f|
  f << YAML::dump(metadata_struct)
  end
@@ -201,6 +203,7 @@ module FakeS3
  obj.name = object_name
  obj.md5 = metadata_struct[:md5]
  obj.content_type = metadata_struct[:content_type]
+ obj.content_encoding = metadata_struct[:content_encoding]
  obj.size = metadata_struct[:size]
  obj.modified_date = metadata_struct[:modified_date]

@@ -223,7 +226,7 @@ module FakeS3

  parts.sort_by { |part| part[:number] }.each do |part|
  part_path = "#{base_path}_part#{part[:number]}"
- content_path = File.join(part_path, SHUCK_METADATA_DIR, 'content')
+ content_path = File.join(part_path, FAKE_S3_METADATA_DIR, 'content')

  File.open(content_path, 'rb') { |f| chunk = f.read }
  etag = Digest::MD5.hexdigest(chunk)
@@ -257,10 +260,11 @@ module FakeS3
  end

  # TODO: abstract getting meta data from request.
- def create_metadata(content,request)
+ def create_metadata(content, request)
  metadata = {}
  metadata[:md5] = Digest::MD5.file(content).hexdigest
  metadata[:content_type] = request.header["content-type"].first
+ metadata[:content_encoding] = request.header["content-encoding"].first
  metadata[:size] = File.size(content)
  metadata[:modified_date] = File.mtime(content).utc.iso8601(SUBSECOND_PRECISION)
  metadata[:amazon_metadata] = {}
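The net effect of the file_store.rb changes is that per-object metadata now records the content encoding. A simplified, self-contained sketch of the updated create_metadata, not the gem's actual class: the WEBrick request is stubbed as a plain hash of header arrays, and SUBSECOND_PRECISION is inlined as 3.

```ruby
require 'digest/md5'
require 'tempfile'
require 'time'

# Stand-in for FakeS3::FileStore#create_metadata after this change.
# "request_header" mimics WEBrick's header hash, whose values are arrays.
def create_metadata(content, request_header)
  metadata = {}
  metadata[:md5] = Digest::MD5.file(content).hexdigest
  metadata[:content_type] = request_header["content-type"].first
  metadata[:content_encoding] = request_header["content-encoding"].first # new in 1.0.0
  metadata[:size] = File.size(content)
  metadata[:modified_date] = File.mtime(content).utc.iso8601(3)
  metadata
end

file = Tempfile.new("content")
file.write("hello")
file.close

meta = create_metadata(file.path,
                       { "content-type" => ["text/plain"], "content-encoding" => ["gzip"] })
puts meta[:content_encoding] # => gzip
```

This hash is what gets YAML-dumped into the `.fakes3_metadataFFF/metadata` file and read back by get_object.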
data/lib/fakes3/s3_object.rb CHANGED
@@ -1,7 +1,7 @@
  module FakeS3
  class S3Object
  include Comparable
- attr_accessor :name,:size,:creation_date,:modified_date,:md5,:io,:content_type,:custom_metadata
+ attr_accessor :name,:size,:creation_date,:modified_date,:md5,:io,:content_type,:content_encoding,:custom_metadata

  def hash
  @name.hash
data/lib/fakes3/server.rb CHANGED
@@ -4,6 +4,7 @@ require 'webrick/https'
  require 'openssl'
  require 'securerandom'
  require 'cgi'
+ require 'fakes3/util'
  require 'fakes3/file_store'
  require 'fakes3/xml_adapter'
  require 'fakes3/bucket_query'
@@ -26,9 +27,9 @@ module FakeS3
  DELETE_OBJECT = "DELETE_OBJECT"
  DELETE_BUCKET = "DELETE_BUCKET"

- attr_accessor :bucket,:object,:type,:src_bucket,
- :src_object,:method,:webrick_request,
- :path,:is_path_style,:query,:http_verb
+ attr_accessor :bucket, :object, :type, :src_bucket,
+ :src_object, :method, :webrick_request,
+ :path, :is_path_style, :query, :http_verb

  def inspect
  puts "-----Inspect FakeS3 Request"
@@ -89,10 +90,10 @@ module FakeS3
  end
  when 'GET_ACL'
  response.status = 200
- response.body = XmlAdapter.acl()
+ response.body = XmlAdapter.acl
  response['Content-Type'] = 'application/xml'
  when 'GET'
- real_obj = @store.get_object(s_req.bucket,s_req.object,request)
+ real_obj = @store.get_object(s_req.bucket, s_req.object, request)
  if !real_obj
  response.status = 404
  response.body = XmlAdapter.error_no_such_key(s_req.object)
@@ -117,9 +118,15 @@ module FakeS3

  response.status = 200
  response['Content-Type'] = real_obj.content_type
+
+ if real_obj.content_encoding
+ response.header['X-Content-Encoding'] = real_obj.content_encoding
+ response.header['Content-Encoding'] = real_obj.content_encoding
+ end
+
  stat = File::Stat.new(real_obj.io.path)

- response['Last-Modified'] = Time.iso8601(real_obj.modified_date).httpdate()
+ response['Last-Modified'] = Time.iso8601(real_obj.modified_date).httpdate
  response.header['ETag'] = "\"#{real_obj.md5}\""
  response['Accept-Ranges'] = "bytes"
  response['Last-Ranges'] = "bytes"
@@ -132,7 +139,8 @@ module FakeS3
  content_length = stat.size

  # Added Range Query support
- if range = request.header["range"].first
+ range = request.header["range"].first
+ if range
  response.status = 206
  if range =~ /bytes=(\d*)-(\d*)/
  start = $1.to_i
@@ -155,6 +163,7 @@ module FakeS3
  response['Content-Length'] = File::Stat.new(real_obj.io.path).size
  if s_req.http_verb == 'HEAD'
  response.body = ""
+ real_obj.io.close
  else
  response.body = real_obj.io
  end
@@ -174,7 +183,7 @@ module FakeS3

  case s_req.type
  when Request::COPY
- object = @store.copy_object(s_req.src_bucket,s_req.src_object,s_req.bucket,s_req.object,request)
+ object = @store.copy_object(s_req.src_bucket, s_req.src_object, s_req.bucket, s_req.object, request)
  response.body = XmlAdapter.copy_object_result(object)
  when Request::STORE
  bucket_obj = @store.get_bucket(s_req.bucket)
@@ -183,7 +192,7 @@ module FakeS3
  bucket_obj = @store.create_bucket(s_req.bucket)
  end

- real_obj = @store.store_object(bucket_obj,s_req.object,s_req.webrick_request)
+ real_obj = @store.store_object(bucket_obj, s_req.object, s_req.webrick_request)
  response.header['ETag'] = "\"#{real_obj.md5}\""
  when Request::CREATE_BUCKET
  @store.create_bucket(s_req.bucket)
@@ -320,7 +329,7 @@ module FakeS3

  response['Access-Control-Allow-Origin'] = '*'
  response['Access-Control-Allow-Methods'] = 'PUT, POST, HEAD, GET, OPTIONS'
- response['Access-Control-Allow-Headers'] = 'Accept, Content-Type, Authorization, Content-Length, ETag'
+ response['Access-Control-Allow-Headers'] = 'Accept, Content-Type, Authorization, Content-Length, ETag, X-CSRF-Token'
  response['Access-Control-Expose-Headers'] = 'ETag'
  end

@@ -481,13 +490,13 @@ module FakeS3
  parts_xml = ""
  request.body { |chunk| parts_xml << chunk }

- # TODO: I suck at parsing xml
- parts_xml = parts_xml.scan /\<Part\>.*?<\/Part\>/m
+ # TODO: improve parsing xml
+ parts_xml = parts_xml.scan(/<Part>.*?<\/Part>/m)

  parts_xml.collect do |xml|
  {
- number: xml[/\<PartNumber\>(\d+)\<\/PartNumber\>/, 1].to_i,
- etag: xml[/\<ETag\>\"(.+)\"\<\/ETag\>/, 1]
+ number: xml[/<PartNumber>(\d+)<\/PartNumber>/, 1].to_i,
+ etag: FakeS3::Util.strip_before_and_after(xml[/\<ETag\>(.+)<\/ETag>/, 1], '"')
  }
  end
  end
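The tightened multipart-complete parsing above can be exercised in isolation. In this sketch the XML body is an invented example, and the quote stripping is inlined in place of FakeS3::Util.strip_before_and_after:

```ruby
# Invented CompleteMultipartUpload body, shaped like what S3 clients send.
body = <<-XML
<CompleteMultipartUpload>
  <Part><PartNumber>1</PartNumber><ETag>"aaa111"</ETag></Part>
  <Part><PartNumber>2</PartNumber><ETag>"bbb222"</ETag></Part>
</CompleteMultipartUpload>
XML

# Same scan/collect logic as the server: one hash per <Part> element.
parts = body.scan(/<Part>.*?<\/Part>/m).collect do |xml|
  {
    number: xml[/<PartNumber>(\d+)<\/PartNumber>/, 1].to_i,
    etag: xml[/<ETag>(.+)<\/ETag>/, 1].gsub(/\A"+|"+\z/, '') # strip surrounding quotes
  }
end

puts parts.map { |p| "#{p[:number]}:#{p[:etag]}" }.join(", ") # => 1:aaa111, 2:bbb222
```

Stripping the quotes via a helper (instead of capturing them in the regex, as 0.2.5 did) tolerates clients that send unquoted ETags.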
data/lib/fakes3/util.rb ADDED
@@ -0,0 +1,8 @@
+ module FakeS3
+ module Util
+ def Util.strip_before_and_after(string, strip_this)
+ regex_friendly_strip_this = Regexp.escape(strip_this)
+ string.gsub(/\A[#{regex_friendly_strip_this}]+|[#{regex_friendly_strip_this}]+\z/, '')
+ end
+ end
+ end
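The new helper removes every leading and trailing occurrence of a given character; the server uses it to unquote ETags. A quick demonstration of the same code outside the gem:

```ruby
module FakeS3
  module Util
    # Strip all leading/trailing occurrences of strip_this from string.
    def Util.strip_before_and_after(string, strip_this)
      regex_friendly_strip_this = Regexp.escape(strip_this)
      string.gsub(/\A[#{regex_friendly_strip_this}]+|[#{regex_friendly_strip_this}]+\z/, '')
    end
  end
end

puts FakeS3::Util.strip_before_and_after('"abc123"', '"') # => abc123
puts FakeS3::Util.strip_before_and_after('""etag""', '"') # => etag
```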
data/lib/fakes3/version.rb CHANGED
@@ -1,3 +1,3 @@
  module FakeS3
- VERSION = "0.2.5"
+ VERSION = "1.0.0"
  end
data/static/button.svg ADDED
@@ -0,0 +1,4 @@
+ <?xml version="1.0" encoding="UTF-8" standalone="no"?>
+ <!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
+ <svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" width="110px" height="20px" viewBox="0 0 110 20" enable-background="new 0 0 110 20" xml:space="preserve"> <image id="image0" width="110" height="20" x="0" y="0" xlink:href="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAG4AAAAUCAMAAABbPPhuAAAABGdBTUEAALGPC/xhBQAAACBjSFJN AAB6JQAAgIMAAPn/AACA6QAAdTAAAOpgAAA6mAAAF2+SX8VGAAABL1BMVEUAUJX///8AUJUAUJUA UJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUAUJUA UJUAUJUAUJUAUJUAUJUAUJUAUJU5e7Rmns59r9uAs917sNx6r9yDtN5lnc0vdK9EhLtzq9o/i8wp fsYhecQdd8Mke8U9iss6fLU4e7RyqtougccfeMNQjcx0otRjmNAhesQzhMliodVKib8+i8zT4PDI 2OyTtdyvx+ShvuD///8qf8dOlNCAstwlfMU7fbZ/st0gecQ5g8e7z+iErNibw+QyhMnq8Pj19/vf 6PTG3fAwg8jm8PhNlND5+/12rNs0hcluqNh3rdtNisDn8fl5rtvH3fBnns4xdbCcw+VhoNU5fLVo oM99sNsds+t8AAAAHXRSTlMAAAZux/NbG8+ZCNAobZoK+dNcn/VgLF+dB2/L9Kp1EagAAAABYktH RAH/Ai3eAAAACXBIWXMAAAsTAAALEwEAmpwYAAAAB3RJTUUH4AsECgU1BHGi1AAAAmZJREFUSMe9 lgtT1DAQxytwgoLC+X4BeiegGEUkCXqXtgkomlabqvh+P77/Z3A3SXst5Wa4cY6duUmbbPaX/HeT XhAEE5NTrcXx2ckTFQuC6Zml5Zu3Ot3x2O2V1VOnK7iJmbU7d9fvkTHZ/Qedjdm5Ae7Mw81HW0ef Thkfjbf9+EmvVQoaTC33HU2ghdGQWZGIbSuVGHF/Wzu7K6tn5z1u4em661bO4sMnFRgmo6NRKvas 39lrO0GDxecvChzDqMpFjCTjUiJH0gZOy1BIDS+JbcBXigTfQxFTHBZJ6oK+tEWx/QoEPedwHVLB pSpzkYWSVClKuDK13WGrc5RBkAgbo8HXwENOKL5nfli6oK/flIKen0dct8TlAibyEkdC+MUqaeAS lVMdc21UCj4J+GYMZOESMsEYumiOS7VB3+57QbsoaBXnc1fiUlhxrngDlykrFQwzwGTWF8VhsGWQ OleSsVxFPui7965CP3zsXbhYxYGYFEKVOIjLIFoDZz1tnq2VOCJB1IQY1y0L3KcC97mBw6XhgoEk MWDmc1DD2fCY1gzKQ5MBjpLUKI45wG4X9MuuF/PrXvtSrVQgdxmkIIWn3K4Oc08LnFsy4sAhiQ3m TEgByS5w0sSwP8qUiXHYBv3mS2X/+w9XKoODgHUmIC2h5eG2hApJE0ci0MtERAvoCXWJSzOnoR12 J+GnPwi/eq3L9WNeMc7cGYt9wskwD830gW4nBq/ddL4q65dY02CXIflvs2fuSnGJTS4Nv6L5iNfx YXbgir46u7bZH+cHqLvRnqt9Xn//+Xtsn1f483Dt+o3j+vPwD8d6fpTeP3ACAAAAJXRFWHRkYXRl OmNyZWF0ZQAyMDE2LTExLTA0VDEwOjA1OjUzLTA3OjAwJ5wXuQAAACV0RVh0ZGF0ZTptb2RpZnkA MjAxNi0xMS0wNFQxMDowNTo1My0wNzowMFbBrwUAAAAASUVORK5CYII="/>
+ </svg>
data/static/logo.png ADDED
Binary file
data/test/aws_sdk_commands_test.rb CHANGED
@@ -54,6 +54,5 @@ class AwsSdkCommandsTest < Test::Unit::TestCase
  assert metadata_file.has_key?(:amazon_metadata), 'Metadata file does not contain an :amazon_metadata key'
  assert metadata_file[:amazon_metadata].has_key?('storage-class'), ':amazon_metadata does not contain field "storage-class"'
  assert_equal 'REDUCED_REDUNDANCY', metadata_file[:amazon_metadata]['storage-class'], '"storage-class" does not equal expected value "REDUCED_REDUNDANCY"'
-
  end
  end
data/test/post_test.rb CHANGED
@@ -11,9 +11,13 @@ class PostTest < Test::Unit::TestCase
  end

  def test_options
- res= RestClient.options(@url) { |response|
+ RestClient.options(@url) do |response|
+ assert_equal(response.code, 200)
  assert_equal(response.headers[:access_control_allow_origin],"*")
- }
+ assert_equal(response.headers[:access_control_allow_methods], "PUT, POST, HEAD, GET, OPTIONS")
+ assert_equal(response.headers[:access_control_allow_headers], "Accept, Content-Type, Authorization, Content-Length, ETag, X-CSRF-Token")
+ assert_equal(response.headers[:access_control_expose_headers], "ETag")
+ end
  end

  def test_redirect
data/test/right_aws_commands_test.rb CHANGED
@@ -9,7 +9,8 @@ class RightAWSCommandsTest < Test::Unit::TestCase
  def setup
  @s3 = RightAws::S3Interface.new('1E3GDYEOGFJPIT7XXXXXX','hgTHt68JY07JKUY08ftHYtERkjgtfERn57XXXXXX',
  {:multi_thread => false, :server => 'localhost',
- :port => 10453, :protocol => 'http',:logger => Logger.new("/dev/null"),:no_subdomains => true })
+ :port => 10453, :protocol => 'http', :logger => Logger.new("/dev/null"),
+ :no_subdomains => true })
  end

  def teardown
@@ -21,16 +22,16 @@ class RightAWSCommandsTest < Test::Unit::TestCase
  end

  def test_store
- @s3.put("s3media","helloworld","Hello World Man!")
- obj = @s3.get("s3media","helloworld")
- assert_equal "Hello World Man!",obj[:object]
+ @s3.put("s3media","helloworld", "Hello World Man!")
+ obj = @s3.get("s3media", "helloworld")
+ assert_equal "Hello World Man!", obj[:object]

- obj = @s3.get("s3media","helloworld")
+ obj = @s3.get("s3media", "helloworld")
  end

  def test_store_not_found
  begin
- obj = @s3.get("s3media","helloworldnotexist")
+ obj = @s3.get("s3media", "helloworldnotexist")
  rescue RightAws::AwsError
  assert $!.message.include?('NoSuchKey')
  rescue
@@ -39,20 +40,20 @@ class RightAWSCommandsTest < Test::Unit::TestCase
  end

  def test_large_store
- @s3.put("s3media","helloworld","Hello World Man!")
+ @s3.put("s3media", "helloworld", "Hello World Man!")
  buffer = ""
  500000.times do
  buffer << "#{(rand * 100).to_i}"
  end

  buf_len = buffer.length
- @s3.put("s3media","big",buffer)
+ @s3.put("s3media", "big", buffer)

  output = ""
  @s3.get("s3media","big") do |chunk|
  output << chunk
  end
- assert_equal buf_len,output.size
+ assert_equal buf_len, output.size
  end

  # Test that GET requests with a delimiter return a list of
@@ -88,37 +89,53 @@ class RightAWSCommandsTest < Test::Unit::TestCase
  end

  def test_multi_directory
- @s3.put("s3media","dir/right/123.txt","recursive")
+ @s3.put("s3media", "dir/right/123.txt", "recursive")
  output = ""
- obj = @s3.get("s3media","dir/right/123.txt") do |chunk|
+ obj = @s3.get("s3media", "dir/right/123.txt") do |chunk|
  output << chunk
  end
  assert_equal "recursive", output
  end

  def test_intra_bucket_copy
- @s3.put("s3media","original.txt","Hello World")
- @s3.copy("s3media","original.txt","s3media","copy.txt")
- obj = @s3.get("s3media","copy.txt")
- assert_equal "Hello World",obj[:object]
+ @s3.put("s3media", "original.txt", "Hello World")
+ @s3.copy("s3media", "original.txt", "s3media", "copy.txt")
+ obj = @s3.get("s3media", "copy.txt")
+ assert_equal "Hello World", obj[:object]
  end

  def test_copy_in_place
- @s3.put("s3media","foo","Hello World")
- @s3.copy("s3media","foo","s3media","foo")
- obj = @s3.get("s3media","foo")
- assert_equal "Hello World",obj[:object]
+ @s3.put("s3media", "copy-in-place", "Hello World")
+ @s3.copy("s3media", "copy-in-place", "s3media","copy-in-place")
+ obj = @s3.get("s3media", "copy-in-place")
+ assert_equal "Hello World", obj[:object]
  end

+ def test_content_encoding
+ foo_compressed = Zlib::Deflate.deflate("foo")
+ @s3.put("s3media", "foo", foo_compressed, {"content-encoding" => "gzip"})
+ obj = @s3.get("s3media", "foo")
+ # assert_equal "gzip", obj[:headers]["content-encoding"] # TODO why doesn't checking content-encoding work?
+ assert_equal "gzip", obj[:headers]["x-content-encoding"] # TODO why doesn't checking content-encoding work?
+ end
+
+ # def test_content_encoding_data
+ # foo_compressed = Zlib::Deflate.deflate("foo-two")
+ # @s3.put("s3media", "foo-two", foo_compressed, {"content-encoding" => "gzip"})
+ # obj = @s3.get("s3media", "foo-two")
+ # puts "*** GOT HERE 1 #{ obj[:object] }"
+ # assert_equal "foo-two", Zlib::Inflate::inflate(obj[:object])
+ # end
+
  def test_copy_replace_metadata
- @s3.put("s3media","foo","Hello World",{"content-type"=>"application/octet-stream"})
- obj = @s3.get("s3media","foo")
- assert_equal "Hello World",obj[:object]
- assert_equal "application/octet-stream",obj[:headers]["content-type"]
- @s3.copy("s3media","foo","s3media","foo",:replace,{"content-type"=>"text/plain"})
- obj = @s3.get("s3media","foo")
- assert_equal "Hello World",obj[:object]
- assert_equal "text/plain",obj[:headers]["content-type"]
+ @s3.put("s3media", "copy_replace", "Hello World", {"content-type" => "application/octet-stream"})
+ obj = @s3.get("s3media", "copy_replace")
+ assert_equal "Hello World", obj[:object]
+ assert_equal "application/octet-stream", obj[:headers]["content-type"]
+ @s3.copy("s3media", "copy_replace", "s3media", "copy_replace", :replace, {"content-type"=>"text/plain"})
+ obj = @s3.get("s3media", "copy_replace")
+ assert_equal "Hello World", obj[:object]
+ assert_equal "text/plain", obj[:headers]["content-type"]
  end

  def test_larger_lists
@@ -151,27 +168,27 @@ class RightAWSCommandsTest < Test::Unit::TestCase
  end

  def test_if_none_match
- @s3.put("s3media","if_none_match_test","Hello World 1!")
- obj = @s3.get("s3media","if_none_match_test")
+ @s3.put("s3media", "if_none_match_test", "Hello World 1!")
+ obj = @s3.get("s3media", "if_none_match_test")
  tag = obj[:headers]["etag"]
  begin
- @s3.get("s3media", "if_none_match_test", {"If-None-Match"=>tag})
+ @s3.get("s3media", "if_none_match_test", {"If-None-Match" => tag})
  rescue URI::InvalidURIError
  # expected error for 304
  else
  fail 'Should have encountered an error due to the server not returning a response due to caching'
  end
- @s3.put("s3media","if_none_match_test","Hello World 2!")
- obj = @s3.get("s3media", "if_none_match_test", {"If-None-Match"=>tag})
- assert_equal "Hello World 2!",obj[:object]
+ @s3.put("s3media", "if_none_match_test", "Hello World 2!")
+ obj = @s3.get("s3media", "if_none_match_test", {"If-None-Match" => tag})
+ assert_equal "Hello World 2!", obj[:object]
  end

  def test_if_modified_since
- @s3.put("s3media","if_modified_since_test","Hello World 1!")
- obj = @s3.get("s3media","if_modified_since_test")
+ @s3.put("s3media", "if_modified_since_test", "Hello World 1!")
+ obj = @s3.get("s3media", "if_modified_since_test")
  modified = obj[:headers]["last-modified"]
  begin
- @s3.get("s3media", "if_modified_since_test", {"If-Modified-Since"=>modified})
+ @s3.get("s3media", "if_modified_since_test", {"If-Modified-Since" => modified})
  rescue URI::InvalidURIError
  # expected error for 304
  else
@@ -179,9 +196,9 @@ class RightAWSCommandsTest < Test::Unit::TestCase
  end
  # Granularity of an HTTP Date is 1 second which isn't enough for the test
  # so manually rewind the clock by a second
- timeInThePast = Time.httpdate(modified) - 1
+ time_in_the_past = Time.httpdate(modified) - 1
  begin
- obj = @s3.get("s3media", "if_modified_since_test", {"If-Modified-Since"=>timeInThePast.httpdate()})
+ obj = @s3.get("s3media", "if_modified_since_test", {"If-Modified-Since" => time_in_the_past.httpdate})
  rescue
  fail 'Should have been downloaded since the date is in the past now'
  else
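The timeInThePast → time_in_the_past rename keeps the underlying trick intact: HTTP dates have one-second granularity, so the test subtracts one second from the parsed Last-Modified value to get an If-Modified-Since that is strictly earlier. Illustrated with an arbitrary date:

```ruby
require 'time'

# Parse an RFC 1123 HTTP date, rewind one second, and format it again.
modified = Time.httpdate("Sat, 28 Jan 2017 10:00:00 GMT")
time_in_the_past = modified - 1
puts time_in_the_past.httpdate # => Sat, 28 Jan 2017 09:59:59 GMT
```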
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: fakes3
  version: !ruby/object:Gem::Version
- version: 0.2.5
+ version: 1.0.0
  platform: ruby
  authors:
  - Curtis Spencer
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2016-12-19 00:00:00.000000000 Z
+ date: 2017-01-28 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: bundler
@@ -164,10 +164,10 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
- description: Use Fake S3 to test basic S3 functionality without actually connecting
- to S3
+ description: Use Fake S3 to test basic Amazon S3 functionality without actually connecting
+ to AWS
  email:
- - thorin@gmail.com
+ - fakes3@supportedsource.org
  executables:
  - fakes3
  extensions: []
@@ -176,9 +176,12 @@ files:
  - ".gitignore"
  - CONTRIBUTING.md
  - DEPLOY_README.md
+ - Dockerfile
  - Gemfile
  - Gemfile.lock
- - MIT-LICENSE
+ - ISSUE_TEMPLATE.md
+ - Makefile
+ - PULL_REQUEST_TEMPLATE.md
  - README.md
  - Rakefile
  - bin/fakes3
@@ -194,8 +197,11 @@ files:
  - lib/fakes3/server.rb
  - lib/fakes3/sorted_object_list.rb
  - lib/fakes3/unsupported_operation.rb
+ - lib/fakes3/util.rb
  - lib/fakes3/version.rb
  - lib/fakes3/xml_adapter.rb
+ - static/button.svg
+ - static/logo.png
  - test/aws_sdk_commands_test.rb
  - test/aws_sdk_v2_commands_test.rb
  - test/boto_test.rb
@@ -210,9 +216,10 @@ files:
  - test/test_helper.rb
  homepage: https://github.com/jubos/fake-s3
  licenses:
- - MIT
+ - Supported-Source
  metadata: {}
- post_install_message:
+ post_install_message: 'Fake S3: if you don''t already have a license for Fake S3,
+ you can get one at https://supportedsource.org/projects/fake-s3'
  rdoc_options: []
  require_paths:
  - lib
@@ -227,12 +234,12 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubyforge_project: fakes3
+ rubyforge_project:
  rubygems_version: 2.6.8
  signing_key:
  specification_version: 4
- summary: Fake S3 is a server that simulates S3 commands so you can test your S3 functionality
- in your projects
+ summary: Fake S3 is a server that simulates Amazon S3 commands so you can test your
+ S3 functionality in your projects
  test_files:
  - test/aws_sdk_commands_test.rb
  - test/aws_sdk_v2_commands_test.rb
data/MIT-LICENSE DELETED
@@ -1,20 +0,0 @@
- Copyright (c) 2011,2012 Curtis W Spencer (@jubos) and Spool
-
- Permission is hereby granted, free of charge, to any person obtaining
- a copy of this software and associated documentation files (the
- "Software"), to deal in the Software without restriction, including
- without limitation the rights to use, copy, modify, merge, publish,
- distribute, sublicense, and/or sell copies of the Software, and to
- permit persons to whom the Software is furnished to do so, subject to
- the following conditions:
-
- The above copyright notice and this permission notice shall be
- included in all copies or substantial portions of the Software.
-
- THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
- EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
- MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
- NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
- LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
- OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
- WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.