heroku-mongo-backup-pm 0.1
- data/.gitignore +4 -0
- data/README.md +67 -0
- data/Rakefile +7 -0
- data/heroku-mongo-backup.gemspec +20 -0
- data/lib/heroku-mongo-backup.rb +239 -0
- data/lib/s3_helpers.rb +147 -0
- data/lib/tasks/heroku_mongo_backup.rake +26 -0
- metadata +73 -0
data/.gitignore
ADDED
data/README.md
ADDED
@@ -0,0 +1,67 @@
## heroku-mongo-backup *— backup mongodb on Heroku and push it to S3 or FTP storage*

**heroku-mongo-backup** does:

1. Backup mongodb collections to one file;
2. Compress the backup file with gzip;
3. Push the backup to the specified S3 bucket or FTP server.

> Why not the mongodump command?

The *mongodump* command is not available on Heroku. If you don't want to set up a third-party backup service for every project, *heroku-mongo-backup* may be helpful.

## Configuration

Add the gem to the ```Gemfile```: ```gem "heroku-mongo-backup"```. If everything is okay, the ```rake -T``` command should show the ```rake mongo:backup``` task.

For S3 support **heroku-mongo-backup** requires the ```s3```, ```aws-s3``` or ```fog``` library. One of these should be in the ```Gemfile```; if you don't care which, add ```fog```, as it seems to be the most advanced.

Configure the Heroku scheduler to run the ```mongo:backup``` rake task. Or, if cron is used, add the backup task to the ```/lib/tasks/cron.rake``` file:

```
desc "This task is called by the Heroku cron add-on"
task :cron => :environment do
  Rake::Task['mongo:backup'].invoke
end
```

Set the Heroku environment variables:

```heroku config:add S3_BACKUPS_BUCKET=_value_ S3_KEY_ID=_value_ S3_SECRET_KEY=_value_ MONGO_URL=_value_```

In place of MONGO_URL either of these is also acceptable: *MONGOHQ_URL* or *MONGOLAB_URI*.

For FTP set these variables:

```heroku config:add UPLOAD_TYPE=ftp FTP_HOST=_host_ FTP_PASSWORD=_pass_ FTP_USERNAME=_user_```

## Rake Commands

* ```heroku run rake mongo:backup```

If you want to automatically remove old backup files, pass the ```MAX_BACKUPS``` parameter to the rake command:

* ```heroku run rake mongo:backup MAX_BACKUPS=7```

Restore from a backup:

* ```heroku run rake mongo:restore FILE=backup-file-name.gz```

If you want to restore from a local file, run:

* ```rake mongo:restore LOCAL=/absolute/path/to/<backup-file.gz>```

For Rails 2, add this to your Rakefile to import the rake tasks:

```import File.expand_path(File.join(Gem.datadir('heroku-mongo-backup'), '..', '..', 'lib', 'tasks', 'heroku_mongo_backup.rake'))```

## Gem Contributors

1. [alexkravets - slatestudio.com](http://slatestudio.com "Slate Studio") - gem itself with S3 support
2. [matyi](https://github.com/matyi "Matyi - GitHub Profile") - FTP support
3. [stefl - stef.io](http://stef.io "Stef Lewandowski") - Rails is not required for production
4. [moonhouse - moonhouse.se](http://www.moonhouse.se/ "David Hall") - default config improvement
5. [wolfpakz](https://github.com/wolfpakz "Dan Porter") - Rails2 support
6. [solacreative](http://sola-la.com/creative "Taro Murao") - max backups feature for aws/s3 and s3 gems
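The backup file format itself is simple: as `lib/heroku-mongo-backup.rb` shows, `store` Marshal-dumps a hash of collection name to record array and gzips it, and `load` reverses that. Below is a stdlib-only sketch of that round trip; the sample data and file path are invented for illustration:

```ruby
require 'zlib'

# Shape of the backup: collection name => array of record hashes
# (mirrors what Backup#store builds from @db.collections).
backup = {
  'users'    => [{ '_id' => 1, 'name' => 'alice' }],
  'articles' => [{ '_id' => 2, 'title' => 'hello' }]
}

file_name = '/tmp/example-backup.gz'

# Write: Marshal.dump the hash, then gzip it (as Backup#store does).
Zlib::GzipWriter.open(file_name) { |gz| gz.write Marshal.dump(backup) }

# Read: gunzip, then Marshal.load (as Backup#load does).
restored = Zlib::GzipReader.open(file_name) { |gz| Marshal.load(gz.read) }

puts restored['users'].first['name']   # prints "alice"
```

Since the payload is Ruby Marshal data, backups are best restored with the same Ruby version that produced them.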
data/heroku-mongo-backup.gemspec
ADDED
@@ -0,0 +1,20 @@
Gem::Specification.new do |s|
  s.name        = 'heroku-mongo-backup-pm'
  s.version     = '0.1'
  s.summary     = 'Rake task backups mongo database on Heroku and push gzipped file to Amazon S3 or FTP.'
  s.description = 'Rake task for backing up mongo database on heroku and push it to S3 or FTP. Library can be used as rake task or be easily integrated into daily cron job.'

  s.authors  = ['Alex Kravets', 'matyi', 'Stef Lewandowski', 'David Hall', 'Dan Porter']
  s.email    = 'santyor@gmail.com'
  s.homepage = 'https://github.com/alexkravets/heroku-mongo-backup'

  s.require_paths = ["lib"]
  s.files         = `git ls-files`.split($\)
  s.executables   = s.files.grep(%r{^bin/}).map{ |f| File.basename(f) }
  s.test_files    = s.files.grep(%r{^(test|spec|features)/})

  # Suppress the warning about no rubyforge project
  s.rubyforge_project = 'nowarning'

  s.add_runtime_dependency 'mongo'
end
data/lib/heroku-mongo-backup.rb
ADDED
@@ -0,0 +1,239 @@
# encoding: UTF-8

require 'mongo'
require 'json'
require 'zlib'
require 'uri'
require 'yaml'
require 'rubygems'
require 'net/ftp'

module HerokuMongoBackup

  if defined?(Rails::Railtie)
    class Railtie < Rails::Railtie
      rake_tasks do
        load "tasks/heroku_mongo_backup.rake"
      end
    end
  end

  require 's3_helpers'

  class Backup
    def chdir
      Dir.chdir("/tmp")
      begin
        Dir.mkdir("dump")
      rescue
      end
      Dir.chdir("dump")
    end

    def store
      backup = {}

      @db.collections.each do |col|
        backup['system.indexes.db.name'] = col.db.name if col.name == "system.indexes"

        records = []

        col.find().each do |record|
          records << record
        end

        backup[col.name] = records
      end

      marshal_dump = Marshal.dump(backup)

      file = File.new(@file_name, 'w')
      file.binmode
      file = Zlib::GzipWriter.new(file)
      file.write marshal_dump
      file.close
    end

    def load
      file = Zlib::GzipReader.open(@file_name)
      obj = Marshal.load file.read
      file.close

      obj.each do |col_name, records|
        next if col_name =~ /^system\./

        @db.drop_collection(col_name)
        dest_col = @db.create_collection(col_name)

        records.each do |record|
          dest_col.insert record
        end
      end

      # Load indexes here
      col_name = "system.indexes"
      dest_index_col = @db.collection(col_name)
      obj[col_name].each do |index|
        if index['_id']
          index['ns'] = index['ns'].sub(obj['system.indexes.db.name'], dest_index_col.db.name)
          dest_index_col.insert index
        end
      end
    end

    def db_connect
      uri = URI.parse(@url)
      connection = ::Mongo::Connection.new(uri.host, uri.port)
      @db = connection.db(uri.path.gsub(/^\//, ''))
      @db.authenticate(uri.user, uri.password) if uri.user
    end

    def ftp_connect
      @ftp = Net::FTP.new(ENV['FTP_HOST'])
      @ftp.passive = true
      @ftp.login(ENV['FTP_USERNAME'], ENV['FTP_PASSWORD'])
    end

    def ftp_upload
      @ftp.putbinaryfile(@file_name)
    end

    def ftp_download
      open(@file_name, 'w') do |file|
        file_content = @ftp.getbinaryfile(@file_name)
        file.binmode
        file.write file_content
      end
    end

    def s3_connect
      bucket = ENV['S3_BACKUPS_BUCKET']
      if bucket.nil?
        bucket = ENV['S3_BACKUP_BUCKET']
      end
      if bucket.nil?
        bucket = ENV['S3_BACKUP']
      end
      if bucket.nil?
        bucket = ENV['S3_BUCKET']
      end

      access_key_id = ENV['S3_KEY_ID']
      if access_key_id.nil?
        access_key_id = ENV['S3_KEY']
      end
      if access_key_id.nil?
        access_key_id = ENV['AWS_ACCESS_KEY_ID']
      end

      secret_access_key = ENV['S3_SECRET_KEY']
      if secret_access_key.nil?
        secret_access_key = ENV['S3_SECRET']
      end
      if secret_access_key.nil?
        secret_access_key = ENV['AWS_SECRET_ACCESS_KEY']
      end

      @bucket = HerokuMongoBackup::s3_connect(bucket, access_key_id, secret_access_key)
    end

    def s3_upload
      HerokuMongoBackup::s3_upload(@bucket, @file_name)
    end

    def s3_download
      open(@file_name, 'w') do |file|
        file_content = HerokuMongoBackup::s3_download(@bucket, @file_name)
        file.binmode
        file.write file_content
      end
    end

    def initialize connect = true
      @file_name = Time.now.strftime("%Y-%m-%d.gz")

      if( ['production', 'staging'].include?(ENV['RAILS_ENV'] || ENV['RACK_ENV']) )

        #config_template = ERB.new(IO.read("config/mongoid.yml"))
        #uri = YAML.load(config_template.result)['production']['uri']
        uri = ENV['MONGO_URL']

        if uri.nil?
          uri = ENV['MONGOHQ_URL']
        end
        if uri.nil?
          uri = ENV['MONGOLAB_URI']
        end

      else
        mongoid_config = YAML.load_file("config/mongoid.yml")
        config     = {}
        defaults   = mongoid_config['defaults']
        dev_config = mongoid_config['development']

        config.merge!(defaults) unless defaults.nil?
        config.merge!(dev_config)

        host     = config['host']
        port     = config['port']
        database = config['database']
        uri = "mongodb://#{host}:#{port}/#{database}"

        if uri == 'mongodb://:/' # new mongoid version 3.x
          mongoid_config = YAML.load_file("config/mongoid.yml")
          dev_config = mongoid_config['development']['sessions']['default']
          host_port  = dev_config['hosts'].first
          database   = dev_config['database']
          uri = "mongodb://#{host_port}/#{database}"
        end
      end

      @url = uri

      puts "Using database: #{@url}"

      self.db_connect

      if connect
        if ENV['UPLOAD_TYPE'] == 'ftp'
          self.ftp_connect
        else
          self.s3_connect
        end
      end
    end

    def backup files_number_to_leave=0
      self.chdir
      self.store

      if ENV['UPLOAD_TYPE'] == 'ftp'
        self.ftp_upload
        @ftp.close
      else
        self.s3_upload
      end

      if files_number_to_leave > 0
        HerokuMongoBackup::remove_old_backup_files(@bucket, files_number_to_leave)
      end
    end

    def restore file_name, download_file = true
      @file_name = file_name

      self.chdir

      if download_file
        if ENV['UPLOAD_TYPE'] == 'ftp'
          self.ftp_download
          @ftp.close
        else
          self.s3_download
        end
      end

      self.load
    end
  end
end
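`db_connect` above derives everything it needs from a single Mongo URL via Ruby's stdlib `URI`. A small sketch of that parsing; the URL is a hypothetical example, not a real deployment:

```ruby
require 'uri'

# Hypothetical MONGO_URL in the shape db_connect expects.
uri = URI.parse('mongodb://user:secret@host.example.com:27017/mydb')

host     = uri.host                    # host passed to Mongo::Connection.new
port     = uri.port                    # port passed to Mongo::Connection.new
database = uri.path.gsub(/^\//, '')    # path minus the leading slash, as in db_connect

puts [host, port, database].join(' ')  # prints "host.example.com 27017 mydb"
```

A non-nil `uri.user` is what triggers the `@db.authenticate` call in `db_connect`.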
data/lib/s3_helpers.rb
ADDED
@@ -0,0 +1,147 @@
begin
  require 's3'
rescue LoadError
  #
  # There is no 's3' gem in Gemfile
  #
  #puts "There is no 's3' gem in Gemfile."
end

if defined?(S3)
  #
  # Using 's3' gem as Amazon S3 interface
  #
  #puts "Using \'s3\' gem as Amazon S3 interface."
  def HerokuMongoBackup::s3_connect(bucket, key, secret)
    service = S3::Service.new(:access_key_id     => key,
                              :secret_access_key => secret)
    bucket = service.buckets.find(bucket)
    return bucket
  end

  def HerokuMongoBackup::s3_upload(bucket, filename)
    object = bucket.objects.build("backups/#{filename}")
    object.content = open(filename)
    object.save
  end

  def HerokuMongoBackup::s3_download(bucket, filename)
    object  = bucket.objects.find("backups/#{filename}")
    content = object.content(reload=true)

    puts "Backup file:"
    puts "  name: #{filename}"
    puts "  type: #{object.content_type}"
    puts "  size: #{content.size} bytes"
    puts "\n"

    return content
  end

  def HerokuMongoBackup::remove_old_backup_files(bucket, files_number_to_leave)
    excess = ( object_keys = bucket.objects.find_all(:prefix => "backups/").map { |o| o.key }.sort ).count - files_number_to_leave
    (0..excess-1).each { |i| bucket.objects.find(object_keys[i]).destroy } if excess > 0
  end
end


begin
  require 'aws/s3'
rescue LoadError
  #
  # There is no 'aws/s3' gem in Gemfile
  #
  #puts "There is no 'aws/s3' gem in Gemfile."
end

if defined?(AWS)
  #
  # Using 'aws/s3' gem as Amazon S3 interface
  #
  #puts "Using \'aws/s3\' gem as Amazon S3 interface."
  def HerokuMongoBackup::s3_connect(bucket, key, secret)
    AWS::S3::Base.establish_connection!(:access_key_id     => key,
                                        :secret_access_key => secret)
    return bucket
  end

  def HerokuMongoBackup::s3_upload(bucket, filename)
    AWS::S3::S3Object.store("backups/#{filename}", open(filename), bucket)
  end

  def HerokuMongoBackup::s3_download(bucket, filename)
    content = AWS::S3::S3Object.value("backups/#{filename}", bucket)
    return content
  end

  def HerokuMongoBackup::remove_old_backup_files(bucket, files_number_to_leave)
    excess = ( object_keys = AWS::S3::Bucket.find(bucket).objects(:prefix => 'backups/').map { |o| o.key }.sort ).count - files_number_to_leave
    (0..excess-1).each { |i| AWS::S3::S3Object.find(object_keys[i], bucket).delete } if excess > 0
  end
end


begin
  require 'fog'
rescue LoadError
  #
  # There is no 'fog' gem in Gemfile
  #
  #puts "There is no 'fog' gem in Gemfile."
end

if defined?(Fog)
  #
  # Using 'fog' gem as Amazon S3 interface
  #
  #puts "Using \'fog\' gem as Amazon S3 interface."
  def HerokuMongoBackup::s3_connect(bucket, key, secret)
    connection = Fog::Storage.new({
      :provider              => 'AWS',
      :aws_access_key_id     => key,
      :aws_secret_access_key => secret
    })
    directory = connection.directories.new(:key => bucket)
    return directory
  end

  def HerokuMongoBackup::s3_upload(directory, filename)
    file = directory.files.create(
      :key    => "backups/#{filename}",
      :body   => open(filename),
      :public => true
    )
  end

  def HerokuMongoBackup::s3_download(directory, filename)
    file = directory.files.get("backups/#{filename}")
    return file.body
  end

  def HerokuMongoBackup::remove_old_backup_files(directory, files_number_to_leave)
    total_backups = directory.files.all.size

    if total_backups > files_number_to_leave

      files_to_destroy = (0..total_backups-files_number_to_leave-1).collect{ |i| directory.files.all[i] }

      files_to_destroy.each do |f|
        f.destroy
      end
    end
  end

else
  logging = Logger.new(STDOUT)
  logging.error "\n\nheroku-mongo-backup: Please include 's3', 'aws/s3' or 'fog' gem in application's Gemfile for uploading backup to S3 bucket. (ignore this if using FTP)\n\n"
end
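The `remove_old_backup_files` variants in `s3_helpers.rb` share one idea: backup keys embed the date (`%Y-%m-%d.gz`), so sorting them lexicographically sorts them chronologically, and everything before the newest `files_number_to_leave` keys can be deleted. A storage-free sketch of that selection, with sample keys invented for illustration:

```ruby
# Keys as uploaded by s3_upload: "backups/" + the %Y-%m-%d.gz file name.
object_keys = [
  'backups/2012-10-22.gz',
  'backups/2012-10-25.gz',
  'backups/2012-10-23.gz',
  'backups/2012-10-24.gz'
].sort                                   # lexicographic order == chronological order here

files_number_to_leave = 2
excess = object_keys.count - files_number_to_leave

# The first `excess` (oldest) keys are the ones the helpers destroy.
to_destroy = excess > 0 ? object_keys.first(excess) : []
puts to_destroy.inspect   # prints ["backups/2012-10-22.gz", "backups/2012-10-23.gz"]
```

Note that the fog variant relies on the listing order of `directory.files.all` rather than an explicit key sort, which matters if the bucket holds objects other than these backups.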
data/lib/tasks/heroku_mongo_backup.rake
ADDED
@@ -0,0 +1,26 @@
# encoding: UTF-8

namespace :mongo do
  desc "Backup production database and store it on S3.\n
        Example of usage: rake mongo:backup OR rake mongo:backup MAX_BACKUPS=7"
  task :backup => :environment do
    if ENV['MAX_BACKUPS']
      HerokuMongoBackup::Backup.new.backup(ENV['MAX_BACKUPS'].to_i)
    else
      HerokuMongoBackup::Backup.new.backup
    end
  end

  desc "Restore command gets backup file from S3 server or local file and pushes data to production db.\n
        Example of usage: rake mongo:restore FILE=<backup-file.gz>"
  task :restore => :environment do
    if ENV['FILE']
      HerokuMongoBackup::Backup.new.restore ENV['FILE']
    elsif ENV['LOCAL']
      HerokuMongoBackup::Backup.new(false).restore ENV['LOCAL'], false
    else
      latest = Time.new.strftime("%Y-%m-%d.gz")
      HerokuMongoBackup::Backup.new.restore latest
    end
  end
end
metadata
ADDED
@@ -0,0 +1,73 @@
--- !ruby/object:Gem::Specification
name: heroku-mongo-backup-pm
version: !ruby/object:Gem::Version
  version: '0.1'
prerelease:
platform: ruby
authors:
- Alex Kravets
- matyi
- Stef Lewandowski
- David Hall
- Dan Porter
autorequire:
bindir: bin
cert_chain: []
date: 2012-10-25 00:00:00.000000000 Z
dependencies:
- !ruby/object:Gem::Dependency
  name: mongo
  requirement: !ruby/object:Gem::Requirement
    none: false
    requirements:
    - - ! '>='
      - !ruby/object:Gem::Version
        version: '0'
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
    none: false
    requirements:
    - - ! '>='
      - !ruby/object:Gem::Version
        version: '0'
description: Rake task for backing up mongo database on heroku and push it to S3 or
  FTP. Library can be used as rake task or be easily integrated into daily cron job.
email: santyor@gmail.com
executables: []
extensions: []
extra_rdoc_files: []
files:
- .gitignore
- README.md
- Rakefile
- heroku-mongo-backup.gemspec
- lib/heroku-mongo-backup.rb
- lib/s3_helpers.rb
- lib/tasks/heroku_mongo_backup.rake
homepage: https://github.com/alexkravets/heroku-mongo-backup
licenses: []
post_install_message:
rdoc_options: []
require_paths:
- lib
required_ruby_version: !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ! '>='
    - !ruby/object:Gem::Version
      version: '0'
required_rubygems_version: !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ! '>='
    - !ruby/object:Gem::Version
      version: '0'
requirements: []
rubyforge_project: nowarning
rubygems_version: 1.8.24
signing_key:
specification_version: 3
summary: Rake task backups mongo database on Heroku and push gzipped file to Amazon
  S3 or FTP.
test_files: []