capistrano-s3_archive 0.5.4 → 0.9.9

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz: d9e6383d1737e4e90cdb9bff196905fe7bcf6b66
- data.tar.gz: f74ce9246e6fa2e51c1ebdbb4cfe2d6b5982e338
+ metadata.gz: 97cf12d50fcd21483d0594609a57b4867f898e47
+ data.tar.gz: 7109636be3cb0841c520546ca318991c5a6bbf28
  SHA512:
- metadata.gz: 8d40317da88ca00be017c73fc0ec930dc9f50187af85d3df432a93da6e4604a674d0221e252084b47fba3ee464b9b2b5989822d03ea487ef0e8e64c98496fc4e
- data.tar.gz: 606f2213cf280af3ce02fa9ea031a15aed91f6707591e103b1bffb00d9f2106dc2bd4899513ad429de33ae1eae7378f4aeeb362b58837eeb0b37b8dca19c7e0c
+ metadata.gz: f0953e0106a31337751b0e0c9eb045ce845581d313ad040a96ab529308a2911c4fc127b182be47ff9d3da80d94a614974163521862e40d2fb0acefef23b24903
+ data.tar.gz: 243cc2aa9fd85e1977a08705827ae26f3a2d10be857bdd1430c6cc47406cbf39c09775c28baff8c5bc298fd37650364d72b786efe5e9b331df24f1e6fe5bfad2
data/README.md CHANGED
@@ -1,16 +1,19 @@
  # Capistrano::S3Archive

- Capistrano::S3Archive is an extention of [Capistrano](http://www.capistranorb.com/) which enables to `set :scm, :s3_archive`.
+ Capistrano::S3Archive is an extension of [Capistrano](http://www.capistranorb.com/).

  This behaves like [capistrano-rsync](https://github.com/moll/capistrano-rsync), except that it downloads sources from S3 instead of Git by default.

+ ![img_s3archive_to_local_to_remote](./img/s3_archive-rsync.png)
+
+ **CAUTION!!** The documentation for VERSION < 0.9 is in [legacy_README](legacy_README.md).

  ## Installation

  Add this line to your application's Gemfile:

  ```ruby
- gem 'capistrano-s3_archive'
+ gem 'capistrano-s3_archive', '>= 0.9'
  ```

  And then execute:
@@ -23,9 +26,16 @@ And then execute:
  ## Usage

- `set :scm, :s3_archive` in your config file.
+ ### Quick Start
+
+ In Capfile:
+
+ ```
+ require "capistrano/scm/s3_archive"
+ install_plugin Capistrano::SCM::S3Archive
+ ```

- Set a S3 path containing source archives to `repo_url`. For example, if you has following tree,
+ Set an S3 path containing the source archives as `:repo_url`, and the parameters for accessing Amazon S3 as `:s3_archive_client_options`. For example, if you have the following tree,

      s3://yourbucket/somedirectory/
      |- 201506011200.zip
@@ -34,15 +44,15 @@ Set a S3 path containing source archives to `repo_url`. For example, if you has
      |- 201506020100.zip
      `- 201506030100.zip

- then `set :repo_url, 's3://yourbucket/somedirectory'`.
+ then your `config/deploy.rb` would be:

- Set parameters to access Amazon S3:
-
- ```ruby
- set :s3_client_options, { region: 'ap-northeast-1', credentials: somecredentials }
  ```
+ set :repo_url, 's3://yourbucket/somedirectory/'
+ set :s3_archive_client_options, { region: 'ap-northeast-1', credentials: somecredentials }
+ ```
+
+ To deploy staging:

- And set regular capistrano options. To deploy staging:
  ```
  $ bundle exec cap staging deploy
  ```
@@ -54,26 +64,69 @@ $ bundle exec cap staging deploy_only


  ### Configuration
- Set parameters with `set :key, value`.
-
- #### Rsync Strategy (default)
-
- Key | Default | Description
- --------------|---------|------------
- branch | `latest` | The S3 Object basename to download. Support `:latest` or such as `'201506011500.zip'`.
- version_id | nil | Version ID of version-controlled S3 object. It should use with `:branch`. e.g. `set :branch, 'myapp.zip'; set :version_id, 'qawsedrftgyhujikolq'`
- sort_proc | `->(a,b) { b.key <=> a.key }` | Sort algorithm used to detect `:latest` object basename. It should be proc object for `a,b` as `Aws::S3::Object` comparing.
- rsync_options | `['-az']` | Options used to rsync.
- local_cache | `tmp/deploy` | Path where to extruct your archive on local for staging and rsyncing. Can be both relative or absolute.
- rsync_cache | `shared/deploy` | Path where to cache your repository on the server to avoid rsyncing from scratch each time. Can be both relative or absolute.<br> Set to `nil` if you want to disable the cache.
- s3_archive | `tmp/archives` | Path where to download source archives. Can be both relative or absolute.
- archive_release_runner_options | { in: :groups, limit: fetch(:archive_release_runner_concurrency) } | Runner options on creating releases.
- (archive_release_runner_concurrency) | 20 | Default value of runner concurrency option.
-
- ##### Experimental configration
- Key | Default | Description
- --------------|---------|------------
- hardlink | nil | Enable `--link-dest` option when remote rsyncing. It could speed deployment up in the case rsync_cache enabled.
+
+ The available configuration keys and their defaults are:
+
+     :repo_url, nil
+     :branch, :latest
+     :s3_archive_client_options, nil
+     :s3_archive_sort_proc, ->(new, old) { old.key <=> new.key }
+     :s3_archive_object_version_id, nil
+     :s3_archive_local_download_dir, "tmp/archives"
+     :s3_archive_local_cache_dir, "tmp/deploy"
+     :s3_archive_remote_rsync_options, ['-az', '--delete']
+     :s3_archive_remote_rsync_ssh_options, []
+     :s3_archive_remote_rsync_runner_options, {}
+     :s3_archive_rsync_cache_dir, "shared/deploy"
+     :s3_archive_hardlink_release, false
+
+ **`:repo_url` (required)**
+
+ The S3 bucket and prefix where the archives are stored, e.g. 's3://yourbucket/somedirectory/'.
+
+ **`:branch`**
+
+ Basename of the archive object to deploy. In the example from the Quick Start section, you can use `'201506011500.zip'`, `'201506020100.zip'`, and so on. `:latest` is a special symbol that selects the latest object automatically using `:s3_archive_sort_proc`.
+
+ **`:s3_archive_client_options` (required)**
+
+ Options passed to `Aws::S3::Client.new(options)` to fetch archives.
+
+ **`:s3_archive_sort_proc`**
+
+ Sort algorithm used to determine the basename of the `:latest` object. It should be a proc that compares two `Aws::S3::Object`s as `(new, old)`.
+
+ **`:s3_archive_object_version_id`**
+
+ Version ID of a version-controlled S3 object. Use it together with `:branch`, e.g. `set :branch, 'myapp.zip'; set :s3_archive_object_version_id, 'qawsedrftgyhujikolq'`.
+
+ **`:s3_archive_local_download_dir`**
+
+ Path to which source archives are downloaded. Can be either relative or absolute.
+
+ **`:s3_archive_local_cache_dir`**
+
+ Path where your archive is extracted locally for staging and rsyncing. Can be either relative or absolute.
+
+ **`:s3_archive_remote_rsync_options`**
+
+ Options used when rsyncing to the remote cache dir.
+
+ **`:s3_archive_remote_rsync_ssh_options`**
+
+ Options used in `rsync -e 'ssh OPTIONS'`.
+
+ **`:s3_archive_remote_rsync_runner_options`**
+
+ Runner options for the task that rsyncs to the remote cache; they are passed to `on release_roles(:all), options` in the rsyncing task. Useful for reducing the load on the machine running Capistrano, e.g. `set :s3_archive_remote_rsync_runner_options, { in: :groups, limit: 10 }`.
+
+ **`:s3_archive_rsync_cache_dir`**
+
+ Path where your sources are cached on the remote server to avoid rsyncing from scratch each time. Can be absolute, or relative to the `deploy_to` path.
+
+ **`:s3_archive_hardlink_release`**
+
+ Enables the `--link-dest` option when the release directory is created by remote rsyncing. It can speed up deployment.

  ## Development

capistrano-s3_archive.gemspec CHANGED
@@ -19,9 +19,10 @@ Gem::Specification.new do |spec|
    spec.require_paths = ["lib"]

    spec.required_ruby_version = '>= 2.0.0'
-   spec.add_dependency 'capistrano', '~> 3.6.0'
-   spec.add_dependency 'aws-sdk-core', '~> 2.0'
+   spec.add_dependency 'capistrano', '~> 3.0'
+   spec.add_dependency 'aws-sdk', '~> 2.0'

-   spec.add_development_dependency "bundler", "~> 1.9"
-   spec.add_development_dependency "rake", "~> 10.0"
+   spec.add_development_dependency "bundler"
+   spec.add_development_dependency "rake"
+   spec.add_development_dependency "rubocop"
  end
data/img/s3_archive-rsync.png ADDED
Binary file
data/legacy_README.md ADDED
@@ -0,0 +1,97 @@
+ # Capistrano::S3Archive (README for legacy version)
+
+ **CAUTION**
+ The Capistrano plugin system was renewed in version 3.7. This legacy_README documents the old versions.
+
+ https://github.com/capistrano/capistrano/blob/master/UPGRADING-3.7.md
+
+ ================
+
+ Capistrano::S3Archive is an extension of [Capistrano](http://www.capistranorb.com/) which enables `set :scm, :s3_archive`.
+
+ This behaves like [capistrano-rsync](https://github.com/moll/capistrano-rsync), except that it downloads sources from S3 instead of Git by default.
+
+
+ ## Installation
+
+ Add this line to your application's Gemfile:
+
+ ```ruby
+ gem 'capistrano-s3_archive'
+ ```
+
+ And then execute:
+
+     $ bundle
+
+ <!-- Or install it yourself as: -->
+
+ <!--     $ gem install capistrano-s3_archive -->
+
+ ## Usage
+
+ `set :scm, :s3_archive` in your config file.
+
+ Set an S3 path containing the source archives as `repo_url`. For example, if you have the following tree,
+
+     s3://yourbucket/somedirectory/
+     |- 201506011200.zip
+     |- 201506011500.zip
+     ...
+     |- 201506020100.zip
+     `- 201506030100.zip
+
+ then `set :repo_url, 's3://yourbucket/somedirectory'`.
+
+ Set parameters to access Amazon S3:
+
+ ```ruby
+ set :s3_client_options, { region: 'ap-northeast-1', credentials: somecredentials }
+ ```
+
+ And set the regular Capistrano options. To deploy staging:
+ ```
+ $ bundle exec cap staging deploy
+ ```
+
+ Or, to skip downloading & extracting the archive and deploy local files:
+ ```
+ $ bundle exec cap staging deploy_only
+ ```
+
+
+ ### Configuration
+ Set parameters with `set :key, value`.
+
+ #### Rsync Strategy (default)
+
+ Key | Default | Description
+ --------------|---------|------------
+ branch | `latest` | The S3 object basename to download. Supports `:latest` or a name such as `'201506011500.zip'`.
+ version_id | nil | Version ID of a version-controlled S3 object. Use it together with `:branch`. e.g. `set :branch, 'myapp.zip'; set :version_id, 'qawsedrftgyhujikolq'`
+ sort_proc | `->(a,b) { b.key <=> a.key }` | Sort algorithm used to detect the `:latest` object basename. It should be a proc that compares two `Aws::S3::Object`s as `a,b`.
+ rsync_options | `['-az']` | Options used to rsync.
+ local_cache | `tmp/deploy` | Path where your archive is extracted locally for staging and rsyncing. Can be either relative or absolute.
+ rsync_cache | `shared/deploy` | Path where your repository is cached on the server to avoid rsyncing from scratch each time. Can be either relative or absolute.<br> Set to `nil` if you want to disable the cache.
+ s3_archive | `tmp/archives` | Path to which source archives are downloaded. Can be either relative or absolute.
+ archive_release_runner_options | { in: :groups, limit: fetch(:archive_release_runner_concurrency) } | Runner options for creating releases.
+ (archive_release_runner_concurrency) | 20 | Default value of the runner concurrency option.
+
+ ##### Experimental configuration
+ Key | Default | Description
+ --------------|---------|------------
+ hardlink | nil | Enables the `--link-dest` option when remote rsyncing. It can speed up deployment when rsync_cache is enabled.
+
+ ## Development
+
+ After checking out the repo, run `bin/setup` to install dependencies. Then, run `bin/console` for an interactive prompt that will allow you to experiment.
+
+ To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release` to create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).
+
+ ## Contributing
+
+ 1. Fork it ( https://github.com/[my-github-username]/capistrano-s3_archive/fork )
+ 2. Create your feature branch (`git checkout -b my-new-feature`)
+ 3. Commit your changes (`git commit -am 'Add some feature'`)
+ 4. Push to the branch (`git push origin my-new-feature`)
+ 5. Create a new Pull Request
data/lib/capistrano/legacy_s3_archive.rb ADDED
@@ -0,0 +1,249 @@
+ $stderr.puts "DEPRECATION WARNING: `set :scm, :s3_archive` is deprecated. see https://github.com/capistrano/capistrano/blob/master/UPGRADING-3.7.md to update to the plugin architecture."
+
+ require 'aws-sdk-core'
+
+ load File.expand_path("../tasks/legacy_s3_archive.rake", __FILE__)
+
+ require "capistrano/s3_archive/version"
+ require 'capistrano/scm'
+
+ set_if_empty :rsync_options, ['-az --delete']
+ set_if_empty :rsync_ssh_options, []
+ set_if_empty :rsync_copy, "rsync --archive --acls --xattrs"
+ set_if_empty :rsync_cache, "shared/deploy"
+ set_if_empty :local_cache, "tmp/deploy"
+ set_if_empty :s3_archive, "tmp/archives"
+ set_if_empty :sort_proc, ->(a,b) { b.key <=> a.key }
+ set_if_empty :archive_release_runner_concurrency, 20
+ set_if_empty :archive_release_runner_options, { in: :groups, limit: fetch(:archive_release_runner_concurrency) }
+
+ module Capistrano
+   module S3Archive
+     class SCM < Capistrano::SCM
+       include FileUtils
+       attr_accessor :bucket, :object_prefix
+
+       def initialize(*args)
+         super
+         @bucket, @object_prefix = parse_s3_uri(repo_url)
+         set :local_cache_dir, "#{fetch(:local_cache)}/#{fetch(:stage)}"
+       end
+
+       def get_object(target)
+         opts = { bucket: bucket, key: archive_object_key }
+         opts[:version_id] = fetch(:version_id) if fetch(:version_id)
+         s3_client.get_object(opts, target: target)
+       end
+
+       def get_object_metadata
+         s3_client.list_object_versions(bucket: bucket, prefix: archive_object_key).versions.find do |v|
+           if fetch(:version_id) then v.version_id == fetch(:version_id)
+           else v.is_latest
+           end
+         end
+       end
+
+       def list_objects(all_page = true)
+         response = s3_client.list_objects(bucket: bucket, prefix: object_prefix)
+         if all_page
+           response.inject([]) do |objects, page|
+             objects += page.contents
+           end
+         else
+           response
+         end
+       end
+
+       def archive_object_key
+         @archive_object_key ||=
+           case fetch(:branch).to_sym
+           when :master, :latest, nil
+             latest_object_key
+           else
+             object_prefix + fetch(:branch)
+           end
+       end
+
+       private
+       def s3_client
+         @s3_client ||= Aws::S3::Client.new(fetch(:s3_client_options))
+       end
+
+       def latest_object_key
+         list_objects.sort(&fetch(:sort_proc)).first.key
+       end
+
+       def parse_s3_uri(uri)
+         pathes = uri.split('://')[1].split('/')
+         [pathes.first, pathes.drop(1).push('').join('/')]
+       end
+
+       ### Default strategy
+       private
+       module RsyncStrategy
+         class MissingSSHKyesError < StandardError; end
+         class ResourceBusyError < StandardError; end
+
+         def local_check
+           list_objects(false)
+         end
+
+         def check
+           return if context.class == SSHKit::Backend::Local
+           ssh_key = ssh_key_for(context.host)
+           if ssh_key.nil?
+             fail MissingSSHKyesError, "#{RsyncStrategy} only supports publickey authentication. Please set #{context.host.hostname}.keys or ssh_options."
+           end
+         end
+
+         def stage
+           stage_lock do
+             archive_dir = File.join(fetch(:s3_archive), fetch(:stage).to_s)
+             archive_file = File.join(archive_dir, File.basename(archive_object_key))
+             tmp_file = "#{archive_file}.part"
+             etag_file = File.join(archive_dir, ".#{File.basename(archive_object_key)}.etag")
+             fail "#{tmp_file} is found. Another process is running?" if File.exist?(tmp_file)
+
+             etag = get_object_metadata.tap { |it| fail "No such object: #{current_revision}" if it.nil? }.etag
+             if [archive_file, etag_file].all? { |f| File.exist?(f) } && File.read(etag_file) == etag
+               context.info "#{archive_file} (etag:#{etag}) is found. download skipped."
+             else
+               context.info "Download #{current_revision} to #{archive_file}"
+               mkdir_p(File.dirname(archive_file))
+               File.open(tmp_file, 'w') do |f|
+                 get_object(f)
+               end
+               move(tmp_file, archive_file)
+               File.write(etag_file, etag)
+             end
+
+             remove_entry_secure(fetch(:local_cache_dir)) if File.exist? fetch(:local_cache_dir)
+             mkdir_p(fetch(:local_cache_dir))
+             case archive_file
+             when /\.zip\Z/
+               cmd = "unzip -q -d #{fetch(:local_cache_dir)} #{archive_file}"
+             when /\.tar\.gz\Z|\.tar\.bz2\Z|\.tgz\Z/
+               cmd = "tar xf #{archive_file} -C #{fetch(:local_cache_dir)}"
+             end
+
+             release_lock(true) do
+               run_locally do
+                 execute cmd
+               end
+             end
+           end
+         end
+
+         def cleanup
+           run_locally do
+             archives_dir = File.join(fetch(:s3_archive), fetch(:stage).to_s)
+             archives = capture(:ls, '-xtr', archives_dir).split
+             if archives.count >= fetch(:keep_releases)
+               tobe_removes = (archives - archives.last(fetch(:keep_releases)))
+               if tobe_removes.any?
+                 tobe_removes_str = tobe_removes.map do |file|
+                   File.join(archives_dir, file)
+                 end.join(' ')
+                 execute :rm, tobe_removes_str
+               end
+             end
+           end
+         end
+
+         def release(server = context.host)
+           unless context.class == SSHKit::Backend::Local
+             user = user_for(server) + '@' unless user_for(server).nil?
+             key = ssh_key_for(server)
+             ssh_port_option = server.port.nil? ? '' : "-p #{server.port}"
+           end
+           rsync = ['rsync']
+           rsync.concat fetch(:rsync_options)
+           rsync << fetch(:local_cache_dir) + '/'
+           unless context.class == SSHKit::Backend::Local
+             rsync << "-e 'ssh -i #{key} #{ssh_port_option} #{fetch(:rsync_ssh_options).join(' ')}'"
+             rsync << "#{user}#{server.hostname}:#{rsync_cache || release_path}"
+           else
+             rsync << '--no-compress'
+             rsync << "#{rsync_cache || release_path}"
+           end
+           release_lock do
+             run_locally do
+               execute *rsync
+             end
+           end
+
+           unless fetch(:rsync_cache).nil?
+             cache = rsync_cache
+             link_option = if fetch(:hardlink) && test!("[ `readlink #{current_path}` != #{release_path} ]")
+                             "--link-dest `readlink #{current_path}`"
+                           end
+             copy = %(#{fetch(:rsync_copy)} #{link_option} "#{cache}/" "#{release_path}/")
+             context.execute copy
+           end
+         end
+
+         def current_revision
+           fetch(:version_id) ? "#{archive_object_key}?versionid=#{fetch(:version_id)}" : archive_object_key
+         end
+
+         def ssh_key_for(host)
+           if not host.keys.empty?
+             host.keys.first
+           elsif host.ssh_options && host.ssh_options.has_key?(:keys)
+             Array(host.ssh_options[:keys]).first
+           elsif fetch(:ssh_options, nil) && fetch(:ssh_options).has_key?(:keys)
+             fetch(:ssh_options)[:keys].first
+           end
+         end
+
+         def user_for(host)
+           if host.user
+             host.user
+           elsif host.ssh_options && host.ssh_options.has_key?(:user)
+             host.ssh_options[:user]
+           elsif fetch(:ssh_options, nil) && fetch(:ssh_options).has_key?(:user)
+             fetch(:ssh_options)[:user]
+           end
+         end
+
+         private
+         def rsync_cache
+           cache = fetch(:rsync_cache)
+           cache = deploy_to + "/" + cache if cache && cache !~ /^\//
+           cache
+         end
+
+         def stage_lock(&block)
+           mkdir_p(File.dirname(fetch(:local_cache)))
+           lockfile = "#{fetch(:local_cache)}.#{fetch(:stage)}.lock"
+           File.open(lockfile, "w") do |f|
+             if f.flock(File::LOCK_EX | File::LOCK_NB)
+               block.call
+             else
+               fail ResourceBusyError, "Could not get #{lockfile}"
+             end
+           end
+         ensure
+           rm lockfile if File.exist? lockfile
+         end
+
+         def release_lock(exclusive = false, &block)
+           mkdir_p(File.dirname(fetch(:local_cache)))
+           lockfile = "#{fetch(:local_cache)}.#{fetch(:stage)}.release.lock"
+           File.open(lockfile, File::RDONLY|File::CREAT) do |f|
+             mode = if exclusive
+                      File::LOCK_EX | File::LOCK_NB
+                    else
+                      File::LOCK_SH
+                    end
+             if f.flock(mode)
+               block.call
+             else
+               fail ResourceBusyError, "Could not get #{fetch(:lockfile)}"
+             end
+           end
+         end
+       end
+     end
+   end
+ end
data/lib/capistrano/s3_archive.rb CHANGED
@@ -1,247 +1 @@
- require 'aws-sdk-core'
-
- load File.expand_path("../tasks/s3_archive.rake", __FILE__)
-
- require "capistrano/s3_archive/version"
- require 'capistrano/scm'
-
- set_if_empty :rsync_options, ['-az --delete']
- set_if_empty :rsync_ssh_options, []
- set_if_empty :rsync_copy, "rsync --archive --acls --xattrs"
- set_if_empty :rsync_cache, "shared/deploy"
- set_if_empty :local_cache, "tmp/deploy"
- set_if_empty :s3_archive, "tmp/archives"
- set_if_empty :sort_proc, ->(a,b) { b.key <=> a.key }
- set_if_empty :archive_release_runner_concurrency, 20
- set_if_empty :archive_release_runner_options, { in: :groups, limit: fetch(:archive_release_runner_concurrency) }
-
- module Capistrano
-   module S3Archive
-     class SCM < Capistrano::SCM
-       include FileUtils
-       attr_accessor :bucket, :object_prefix
-
-       def initialize(*args)
-         super
-         @bucket, @object_prefix = parse_s3_uri(repo_url)
-         set :local_cache_dir, "#{fetch(:local_cache)}/#{fetch(:stage)}"
-       end
-
-       def get_object(target)
-         opts = { bucket: bucket, key: archive_object_key }
-         opts[:version_id] = fetch(:version_id) if fetch(:version_id)
-         s3_client.get_object(opts, target: target)
-       end
-
-       def get_object_metadata
-         s3_client.list_object_versions(bucket: bucket, prefix: archive_object_key).versions.find do |v|
-           if fetch(:version_id) then v.version_id == fetch(:version_id)
-           else v.is_latest
-           end
-         end
-       end
-
-       def list_objects(all_page = true)
-         response = s3_client.list_objects(bucket: bucket, prefix: object_prefix)
-         if all_page
-           response.inject([]) do |objects, page|
-             objects += page.contents
-           end
-         else
-           response
-         end
-       end
-
-       def archive_object_key
-         @archive_object_key ||=
-           case fetch(:branch).to_sym
-           when :master, :latest, nil
-             latest_object_key
-           else
-             object_prefix + fetch(:branch)
-           end
-       end
-
-       private
-       def s3_client
-         @s3_client ||= Aws::S3::Client.new(fetch(:s3_client_options))
-       end
-
-       def latest_object_key
-         list_objects.sort(&fetch(:sort_proc)).first.key
-       end
-
-       def parse_s3_uri(uri)
-         pathes = uri.split('://')[1].split('/')
-         [pathes.first, pathes.drop(1).push('').join('/')]
-       end
-
-       ### Default strategy
-       private
-       module RsyncStrategy
-         class MissingSSHKyesError < StandardError; end
-         class ResourceBusyError < StandardError; end
-
-         def local_check
-           list_objects(false)
-         end
-
-         def check
-           return if context.class == SSHKit::Backend::Local
-           ssh_key = ssh_key_for(context.host)
-           if ssh_key.nil?
-             fail MissingSSHKyesError, "#{RsyncStrategy} only supports publickey authentication. Please set #{context.host.hostname}.keys or ssh_options."
-           end
-         end
-
-         def stage
-           stage_lock do
-             archive_dir = File.join(fetch(:s3_archive), fetch(:stage).to_s)
-             archive_file = File.join(archive_dir, File.basename(archive_object_key))
-             tmp_file = "#{archive_file}.part"
-             etag_file = File.join(archive_dir, ".#{File.basename(archive_object_key)}.etag")
-             fail "#{tmp_file} is found. Another process is running?" if File.exist?(tmp_file)
-
-             etag = get_object_metadata.tap { |it| fail "No such object: #{current_revision}" if it.nil? }.etag
-             if [archive_file, etag_file].all? { |f| File.exist?(f) } && File.read(etag_file) == etag
-               context.info "#{archive_file} (etag:#{etag}) is found. download skipped."
-             else
-               context.info "Download #{current_revision} to #{archive_file}"
-               mkdir_p(File.dirname(archive_file))
-               File.open(tmp_file, 'w') do |f|
-                 get_object(f)
-               end
-               move(tmp_file, archive_file)
-               File.write(etag_file, etag)
-             end
-
-             remove_entry_secure(fetch(:local_cache_dir)) if File.exist? fetch(:local_cache_dir)
-             mkdir_p(fetch(:local_cache_dir))
-             case archive_file
-             when /\.zip\Z/
-               cmd = "unzip -q -d #{fetch(:local_cache_dir)} #{archive_file}"
-             when /\.tar\.gz\Z|\.tar\.bz2\Z|\.tgz\Z/
-               cmd = "tar xf #{archive_file} -C #{fetch(:local_cache_dir)}"
-             end
-
-             release_lock(true) do
-               run_locally do
-                 execute cmd
-               end
-             end
-           end
-         end
-
-         def cleanup
-           run_locally do
-             archives_dir = File.join(fetch(:s3_archive), fetch(:stage).to_s)
-             archives = capture(:ls, '-xtr', archives_dir).split
-             if archives.count >= fetch(:keep_releases)
-               tobe_removes = (archives - archives.last(fetch(:keep_releases)))
-               if tobe_removes.any?
-                 tobe_removes_str = tobe_removes.map do |file|
-                   File.join(archives_dir, file)
-                 end.join(' ')
-                 execute :rm, tobe_removes_str
-               end
-             end
-           end
-         end
-
-         def release(server = context.host)
-           unless context.class == SSHKit::Backend::Local
-             user = user_for(server) + '@' unless user_for(server).nil?
-             key = ssh_key_for(server)
-             ssh_port_option = server.port.nil? ? '' : "-p #{server.port}"
-           end
-           rsync = ['rsync']
-           rsync.concat fetch(:rsync_options)
-           rsync << fetch(:local_cache_dir) + '/'
-           unless context.class == SSHKit::Backend::Local
-             rsync << "-e 'ssh -i #{key} #{ssh_port_option} #{fetch(:rsync_ssh_options).join(' ')}'"
-             rsync << "#{user}#{server.hostname}:#{rsync_cache || release_path}"
-           else
-             rsync << '--no-compress'
-             rsync << "#{rsync_cache || release_path}"
-           end
-           release_lock do
-             run_locally do
-               execute *rsync
-             end
-           end
-
-           unless fetch(:rsync_cache).nil?
-             cache = rsync_cache
-             link_option = if fetch(:hardlink) && test!("[ `readlink #{current_path}` != #{release_path} ]")
-                             "--link-dest `readlink #{current_path}`"
-                           end
-             copy = %(#{fetch(:rsync_copy)} #{link_option} "#{cache}/" "#{release_path}/")
-             context.execute copy
-           end
-         end
-
-         def current_revision
-           fetch(:version_id) ? "#{archive_object_key}?versionid=#{fetch(:version_id)}" : archive_object_key
-         end
-
-         def ssh_key_for(host)
-           if not host.keys.empty?
-             host.keys.first
-           elsif host.ssh_options && host.ssh_options.has_key?(:keys)
-             Array(host.ssh_options[:keys]).first
-           elsif fetch(:ssh_options, nil) && fetch(:ssh_options).has_key?(:keys)
-             fetch(:ssh_options)[:keys].first
-           end
-         end
-
-         def user_for(host)
-           if host.user
-             host.user
-           elsif host.ssh_options && host.ssh_options.has_key?(:user)
-             host.ssh_options[:user]
-           elsif fetch(:ssh_options, nil) && fetch(:ssh_options).has_key?(:user)
-             fetch(:ssh_options)[:user]
-           end
-         end
-
-         private
-         def rsync_cache
-           cache = fetch(:rsync_cache)
-           cache = deploy_to + "/" + cache if cache && cache !~ /^\//
-           cache
-         end
-
-         def stage_lock(&block)
-           mkdir_p(File.dirname(fetch(:local_cache)))
-           lockfile = "#{fetch(:local_cache)}.#{fetch(:stage)}.lock"
-           File.open(lockfile, "w") do |f|
-             if f.flock(File::LOCK_EX | File::LOCK_NB)
-               block.call
-             else
-               fail ResourceBusyError, "Could not get #{lockfile}"
-             end
-           end
-         ensure
-           rm lockfile if File.exist? lockfile
-         end
-
-         def release_lock(exclusive = false, &block)
-           mkdir_p(File.dirname(fetch(:local_cache)))
-           lockfile = "#{fetch(:local_cache)}.#{fetch(:stage)}.release.lock"
-           File.open(lockfile, File::RDONLY|File::CREAT) do |f|
-             mode = if exclusive
-                      File::LOCK_EX | File::LOCK_NB
-                    else
-                      File::LOCK_SH
-                    end
-             if f.flock(mode)
-               block.call
-             else
-               fail ResourceBusyError, "Could not get #{fetch(:lockfile)}"
-             end
-           end
-         end
-       end
-     end
-   end
- end
+ load File.expand_path("../legacy_s3_archive.rb", __FILE__)
data/lib/capistrano/s3_archive/version.rb CHANGED
@@ -1,5 +1,5 @@
  module Capistrano
    module S3Archive
-     VERSION = "0.5.4"
+     VERSION = "0.9.9"
    end
  end
data/lib/capistrano/scm/s3_archive.rb ADDED
@@ -0,0 +1,308 @@
+ require "capistrano/scm/plugin"
+ require "aws-sdk"
+ require "uri"
+
+ module Capistrano
+   class SCM
+     class S3Archive < Capistrano::SCM::Plugin
+       attr_reader :extractor
+       include FileUtils
+
+       class ResourceBusyError < StandardError; end
+
+       def set_defaults
+         set_if_empty :s3_archive_client_options, {}
+         set_if_empty :s3_archive_extract_to, :local # :local or :remote
+         set_if_empty(:s3_archive_sort_proc, ->(new, old) { old.key <=> new.key })
+         set_if_empty :s3_archive_object_version_id, nil
+         set_if_empty :s3_archive_local_download_dir, "tmp/archives"
+         set_if_empty :s3_archive_local_cache_dir, "tmp/deploy"
+         set_if_empty :s3_archive_remote_rsync_options, ['-az', '--delete']
+         set_if_empty :s3_archive_remote_rsync_ssh_options, []
+         set_if_empty :s3_archive_remote_rsync_runner_options, {}
+         set_if_empty :s3_archive_rsync_cache_dir, "shared/deploy"
+         set_if_empty :s3_archive_hardlink_release, false
+         # internal use
+         set_if_empty :s3_archive_rsync_copy, "rsync --archive --acls --xattrs"
+       end
+
+       def define_tasks
+         eval_rakefile File.expand_path("../tasks/s3_archive.rake", __FILE__)
+       end
+
+       def register_hooks
+         after "deploy:new_release_path", "s3_archive:create_release"
+         before "deploy:check", "s3_archive:check"
+         before "deploy:set_current_revision", "s3_archive:set_current_revision"
+       end
+
+       def local_check
+         s3_client.list_objects(bucket: s3params.bucket, prefix: s3params.object_prefix)
+       end
+
+       def get_object(target)
+         opts = { bucket: s3params.bucket, key: archive_object_key }
+         opts[:version_id] = fetch(:s3_archive_object_version_id) if fetch(:s3_archive_object_version_id)
+         s3_client.get_object(opts, target: target)
+       end
+
+       def remote_check
+         backend.execute :echo, 'check ssh'
+       end
+
+       def stage
+         stage_lock do
+           archive_dir = File.join(fetch(:s3_archive_local_download_dir), fetch(:stage).to_s)
+           archive_file = File.join(archive_dir, File.basename(archive_object_key))
+           tmp_file = "#{archive_file}.part"
+           etag_file = File.join(archive_dir, ".#{File.basename(archive_object_key)}.etag")
+           fail "#{tmp_file} is found. Another process is running?" if File.exist?(tmp_file)
+           etag = get_object_metadata.tap { |it| fail "No such object: #{current_revision}" if it.nil? }.etag
+
+
+           if [archive_file, etag_file].all? { |f| File.exist?(f) } && File.read(etag_file) == etag
+             backend.info "#{archive_file} (etag:#{etag}) is found. download skipped."
+           else
+             backend.info "Download #{current_revision} to #{archive_file}"
+             mkdir_p(File.dirname(archive_file))
+             File.open(tmp_file, 'w') do |f|
+               get_object(f)
+             end
+             move(tmp_file, archive_file)
+             File.write(etag_file, etag)
+           end
+
+           remove_entry_secure(fetch(:s3_archive_local_cache_dir)) if File.exist? fetch(:s3_archive_local_cache_dir)
+           mkdir_p(fetch(:s3_archive_local_cache_dir))
+           case archive_file
+           when /\.zip\Z/
+             cmd = "unzip -q -d #{fetch(:s3_archive_local_cache_dir)} #{archive_file}"
+           when /\.tar\.gz\Z|\.tar\.bz2\Z|\.tgz\Z/
+             cmd = "tar xf #{archive_file} -C #{fetch(:s3_archive_local_cache_dir)}"
+           end
+
+           release_lock_on_stage do
+             run_locally do
+               execute cmd
+             end
+           end
+         end
+       end
+
+       def cleanup_stage_dir
+         run_locally do
+           archives_dir = File.join(fetch(:s3_archive_local_download_dir), fetch(:stage).to_s)
+           archives = capture(:ls, '-xtr', archives_dir).split
+           if archives.count >= fetch(:keep_releases)
+             to_be_removes = (archives - archives.last(fetch(:keep_releases)))
+             if to_be_removes.any?
+               to_be_removes_str = to_be_removes.map do |file|
+                 File.join(archives_dir, file)
+               end.join(' ')
+               execute :rm, to_be_removes_str
+             end
+           end
+         end
+       end
+
+       def transfer_sources(dest)
+         fail "#{__method__} must be called in run_locally" unless backend.is_a?(SSHKit::Backend::Local)
+
+         rsync = ['rsync']
+         rsync.concat fetch(:s3_archive_remote_rsync_options, [])
+         rsync << (fetch(:s3_archive_local_cache_dir) + '/')
+
+         if dest.local?
+           rsync << ('--no-compress')
+           rsync << rsync_cache_dir
+         else
+           rsync << "-e 'ssh #{dest.ssh_key_option} #{fetch(:s3_archive_remote_rsync_ssh_options).join(' ')}'"
+           rsync << "#{dest.login_user_at}#{dest.hostname}:#{rsync_cache_dir}"
+         end
+
+         release_lock_on_create do
+           backend.execute(*rsync)
+         end
+       end
+
+       def release
+         link_option = if fetch(:s3_archive_hardlink_release) && backend.test("[ `readlink #{current_path}` != #{release_path} ]")
+                         "--link-dest `readlink #{current_path}`"
+                       end
+         create_release = %[#{fetch(:s3_archive_rsync_copy)} #{link_option} "#{rsync_cache_dir}/" "#{release_path}/"]
+         backend.execute create_release
+       end
+
+       def current_revision
+         if fetch(:s3_archive_object_version_id)
+           "#{archive_object_key}?versionid=#{fetch(:s3_archive_object_version_id)}"
+         else
+           archive_object_key
+         end
+       end
+
+       def archive_object_key
+         @archive_object_key ||=
+           case fetch(:branch, :latest).to_sym
+           when :master, :latest
+             latest_object_key
+           else
+             s3params.object_prefix + fetch(:branch).to_s
+           end
+       end
+
+       def rsync_cache_dir
+         File.join(deploy_to, fetch(:s3_archive_rsync_cache_dir))
+       end
+
+       def s3params
+         @s3params ||= S3Params.new(fetch(:repo_url))
+       end
+
+       def get_object_metadata
+         s3_client.list_object_versions(bucket: s3params.bucket, prefix: archive_object_key).versions.find do |v|
+           if fetch(:s3_archive_object_version_id) then v.version_id == fetch(:s3_archive_object_version_id)
+           else v.is_latest
+           end
+         end
+       end
+
+       def list_all_objects
+         response = s3_client.list_objects(bucket: s3params.bucket, prefix: s3params.object_prefix)
+         response.inject([]) do |objects, page|
+           objects + page.contents
+         end
+       end
+
+       def latest_object_key
+         list_all_objects.sort(&fetch(:s3_archive_sort_proc)).first.key
+       end
+
+       private
+
+       def release_lock_on_stage(&block)
+         release_lock((File::LOCK_EX | File::LOCK_NB), &block) # exclusive lock
+       end
+
+       def release_lock_on_create(&block)
+         release_lock(File::LOCK_SH, &block)
+       end
+
+       def release_lock(lock_mode, &block)
+         mkdir_p(File.dirname(fetch(:s3_archive_local_cache_dir)))
+         lockfile = "#{fetch(:s3_archive_local_cache_dir)}.#{fetch(:stage)}.release.lock"
+         File.open(lockfile, File::RDONLY | File::CREAT) do |file|
+           if file.flock(lock_mode)
+             block.call
+           else
+             fail ResourceBusyError, "Could not get #{lockfile}"
+           end
+         end
+       end
+
+       def stage_lock(&block)
+         mkdir_p(File.dirname(fetch(:s3_archive_local_cache_dir)))
+         lockfile = "#{fetch(:s3_archive_local_cache_dir)}.#{fetch(:stage)}.lock"
+         File.open(lockfile, "w") do |file|
+           fail ResourceBusyError, "Could not get #{lockfile}" unless file.flock(File::LOCK_EX | File::LOCK_NB)
+           block.call
+         end
+       ensure
+         rm lockfile if File.exist? lockfile
+       end
+
+       def s3_client
+         @s3_client ||= Aws::S3::Client.new(fetch(:s3_archive_client_options))
+       end
+
+       class LocalExtractor
+         # class ResourceBusyError < StandardError; end
+
+         # include FileUtils
+
+         def stage
+           stage_lock do
+             archive_dir = File.join(fetch(:s3_archive_local_download_dir), fetch(:stage).to_s)
+             archive_file = File.join(archive_dir, File.basename(archive_object_key))
+             tmp_file = "#{archive_file}.part"
+             etag_file = File.join(archive_dir, ".#{File.basename(archive_object_key)}.etag")
+             fail "#{tmp_file} is found. Another process is running?" if File.exist?(tmp_file)
+             etag = get_object_metadata.tap { |it| fail "No such object: #{current_revision}" if it.nil? }.etag
+
+
+             if [archive_file, etag_file].all? { |f| File.exist?(f) } && File.read(etag_file) == etag
+               context.info "#{archive_file} (etag:#{etag}) is found. download skipped."
+             else
+               context.info "Download #{current_revision} to #{archive_file}"
+               mkdir_p(File.dirname(archive_file))
+               File.open(tmp_file, 'w') do |f|
+                 get_object(f)
+               end
+               move(tmp_file, archive_file)
+               File.write(etag_file, etag)
+             end
+
+             remove_entry_secure(fetch(:s3_archive_local_cache_dir)) if File.exist? fetch(:s3_archive_local_cache_dir)
+             mkdir_p(fetch(:s3_archive_local_cache_dir))
+             case archive_file
+             when /\.zip\Z/
+               cmd = "unzip -q -d #{fetch(:s3_archive_local_cache_dir)} #{archive_file}"
+             when /\.tar\.gz\Z|\.tar\.bz2\Z|\.tgz\Z/
+               cmd = "tar xf #{archive_file} -C #{fetch(:s3_archive_local_cache_dir)}"
+             end
+
+             release_lock_on_stage do
+               run_locally do
+                 execute cmd
+               end
+             end
+           end
+         end
+
+         def stage_lock(&block)
+           mkdir_p(File.dirname(fetch(:s3_archive_local_cache_dir)))
+           lockfile = "#{fetch(:s3_archive_local_cache_dir)}.#{fetch(:stage)}.lock"
+           begin
+             File.open(lockfile, "w") do |file|
+               fail ResourceBusyError, "Could not get #{lockfile}" unless file.flock(File::LOCK_EX | File::LOCK_NB)
+               block.call
+             end
+           ensure
+             rm lockfile if File.exist? lockfile
+           end
+         end
+       end
+
+       class RemoteExtractor
+       end
+
+       class S3Params
+         attr_reader :bucket, :object_prefix
+
+         def initialize(repo_url)
+           uri = URI.parse(repo_url)
+           @bucket = uri.host
+           @object_prefix = uri.path.sub(/\/?\Z/, '/').slice(1..-1) # normalize path
+         end
+       end
+     end
+   end
+
+   class Configuration
+     class Server
+       def login_user_at
+         user = [user, ssh_options[:user]].compact.first
+         user ? "#{user}@" : ''
+       end
+
+       def ssh_key_option
+         key = [keys, ssh_options[:keys]].flatten.compact.first
+         key ? "-i #{key}" : ''
+       end
+
+       def ssh_port_option
+         port ? "-p #{port}" : ''
+       end
+     end
+   end
+ end
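For readers tracing the plugin source above: `S3Params` derives the bucket and key prefix from `:repo_url` via `URI.parse`, normalizing the prefix to end in a slash. A minimal sketch of the same parsing, runnable on its own:

```ruby
require "uri"

# Mirrors S3Params#initialize from the hunk above (illustration only).
uri = URI.parse("s3://yourbucket/somedirectory")
bucket        = uri.host                          # => "yourbucket"
object_prefix = uri.path.sub(/\/?\Z/, '/')[1..-1] # => "somedirectory/"
```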
data/lib/capistrano/scm/tasks/s3_archive.rake ADDED
@@ -0,0 +1,57 @@
+ plugin = self
+
+ namespace :s3_archive do
+   desc 'Check that the S3 buckets are reachable'
+   task :check do
+     run_locally do
+       plugin.local_check
+     end
+
+     on release_roles :all do
+       plugin.remote_check
+     end
+   end
+
+   desc 'Extruct and stage the S3 archive in a stage directory'
+   task :stage do
+     if fetch(:skip_staging, false)
+       info "Skip extracting and staging."
+       next
+     end
+
+     run_locally do
+       plugin.stage
+     end
+   end
+
+   after :stage, :cleanup_stage_dir do
+     run_locally do
+       plugin.cleanup_stage_dir
+     end
+   end
+
+   desc 'Copy repo to releases'
+   task create_release: :stage do
+     on release_roles(:all), fetch(:s3_archive_remote_rsync_runner_options) do |server|
+       test "[ -e #{plugin.rsync_cache_dir} ]" # implicit initialize for 'server'
+       run_locally do
+         plugin.transfer_sources(server)
+       end
+     end
+
+     on release_roles(:all) do
+       execute :mkdir, '-p', release_path
+       plugin.release
+     end
+   end
+
+   desc 'Determine the revision that will be deployed'
+   task :set_current_revision do
+     set :current_revision, plugin.current_revision
+   end
+ end unless Rake::Task.task_defined?("s3_archive:check")
+
+ task :deploy_only do
+   set :skip_staging, true
+   invoke :deploy
+ end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: capistrano-s3_archive
  version: !ruby/object:Gem::Version
-   version: 0.5.4
+   version: 0.9.9
  platform: ruby
  authors:
  - Takuto Komazaki
  autorequire:
  bindir: exe
  cert_chain: []
- date: 2017-03-10 00:00:00.000000000 Z
+ date: 2017-07-13 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: capistrano
@@ -16,16 +16,16 @@ dependencies:
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: 3.6.0
+         version: '3.0'
    type: :runtime
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
      - - "~>"
        - !ruby/object:Gem::Version
-         version: 3.6.0
+         version: '3.0'
  - !ruby/object:Gem::Dependency
-   name: aws-sdk-core
+   name: aws-sdk
    requirement: !ruby/object:Gem::Requirement
      requirements:
      - - "~>"
@@ -42,30 +42,44 @@ dependencies:
    name: bundler
    requirement: !ruby/object:Gem::Requirement
      requirements:
-     - - "~>"
+     - - ">="
        - !ruby/object:Gem::Version
-         version: '1.9'
+         version: '0'
    type: :development
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
-     - - "~>"
+     - - ">="
        - !ruby/object:Gem::Version
-         version: '1.9'
+         version: '0'
  - !ruby/object:Gem::Dependency
    name: rake
    requirement: !ruby/object:Gem::Requirement
      requirements:
-     - - "~>"
+     - - ">="
        - !ruby/object:Gem::Version
-         version: '10.0'
+         version: '0'
    type: :development
    prerelease: false
    version_requirements: !ruby/object:Gem::Requirement
      requirements:
-     - - "~>"
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+ - !ruby/object:Gem::Dependency
+   name: rubocop
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
        - !ruby/object:Gem::Version
-         version: '10.0'
+         version: '0'
  description: Capistrano deployment from an archive on Amazon S3.
  email:
  - komazarari@gmail.com
@@ -82,9 +96,14 @@ files:
  - bin/console
  - bin/setup
  - capistrano-s3_archive.gemspec
+ - img/s3_archive-rsync.png
+ - legacy_README.md
+ - lib/capistrano/legacy_s3_archive.rb
  - lib/capistrano/s3_archive.rb
  - lib/capistrano/s3_archive/version.rb
- - lib/capistrano/tasks/s3_archive.rake
+ - lib/capistrano/scm/s3_archive.rb
+ - lib/capistrano/scm/tasks/s3_archive.rake
+ - lib/capistrano/tasks/legacy_s3_archive.rake
  - vagrant_example/.insecure_private_key
  - vagrant_example/Capfile
  - vagrant_example/Gemfile
@@ -109,7 +128,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
        version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.6.8
+ rubygems_version: 2.6.11
  signing_key:
  specification_version: 4
  summary: Capistrano deployment from an archive on Amazon S3.