backup 2.4.0 → 2.4.1

data/CHANGELOG CHANGED
@@ -1,3 +1,9 @@
+ UPDATE
+ === 2.4.1 ======================================
+
+ Dropbox storage added.
+
+
  BIG UPDATE
  === 2.4.0 =======================================

data/README.md CHANGED
@@ -2,7 +2,7 @@

  ## A Backup Ruby Gem

- __Backup__ is a Ruby Gem written for __Unix__ and __Ruby on Rails (2 and 3)__ environments. It can be used both with and without the Ruby on Rails framework! This gem offers a quick and simple solution to backing up databases such as MySQL/PostgreSQL/SQLite and Files/Folders. All backups can be transferred to Amazon S3, Rackspace Cloud Files, any remote server you have access to (using either SCP, SFTP or regular FTP), or a Local server. Backup handles Compression, Archiving, Encryption, Backup Cleaning (Cycling) and supports Email Notifications.
+ __Backup__ is a Ruby Gem written for __Unix__ and __Ruby on Rails (2 and 3)__ environments. It can be used both with and without the Ruby on Rails framework! This gem offers a quick and simple solution to backing up databases such as MySQL/PostgreSQL/SQLite and Files/Folders. All backups can be transferred to Amazon S3, Rackspace Cloud Files, Dropbox Web Service, any remote server you have access to (using either SCP, SFTP or regular FTP), or a Local server. Backup handles Compression, Archiving, Encryption, Backup Cleaning (Cycling) and supports Email Notifications.

  ## Written for Environments

@@ -15,6 +15,17 @@ __Backup__ is a Ruby Gem written for __Unix__ and __Ruby on Rails (2 and 3)__ en
  * [Meskyanichi - Michael van Rooijen](http://github.com/meskyanichi)
  * [Fernandoluizao - Fernando Migliorini Luizão](http://github.com/fernandoluizao)

+ ### Contributors
+
+ * [dtrueman](http://github.com/dtrueman)
+ * [Nathan L Smith](http://github.com/smith)
+ * [Francesc Esplugas](http://github.com/fesplugas)
+ * [wakiki](http://github.com/wakiki)
+ * [Dan Hixon](http://github.com/danhixon)
+ * [Adam Greene](http://github.com/skippy)
+ * [Dmitriy Novotochinov](http://github.com/trybeee)
+
+
  ## Backup's Current Capabilities

  ### Storage Methods
@@ -23,12 +34,14 @@ __Backup__ is a Ruby Gem written for __Unix__ and __Ruby on Rails (2 and 3)__ en
  * Rackspace Cloud Files
  * Remote Server (Available Protocols: SCP, SFTP, FTP)
  * Local server (Example Locations: Another Hard Drive, Network path)
+ * Dropbox (you should get your API key and secret at developers.dropbox.com)

  ### Adapters

  * MySQL
  * PostgreSQL
  * SQLite
+ * MongoDB
  * Archive (Any files and/or folders)
  * Custom (Anything you can produce using the command line)

@@ -63,6 +76,7 @@ Simply fill in the email configuration block and set "notify" to true inside the
  storage :s3 do
  access_key_id 'access_key_id'
  secret_access_key 'secret_access_key'
+ # host 's3-ap-southeast-1.amazonaws.com' #the s3 host. Defaults to s3.amazonaws.com
  bucket '/bucket/backups/mysql/'
  use_ssl true
  end
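
For readers skimming this hunk, the only functional change is the new commented-out `host` line. A hedged, filled-in sketch of that storage block is shown below (the endpoint and credentials are placeholders; `host` is simply read into the storage configuration, as the `generate_attributes` change further down in this diff shows):

    storage :s3 do
      access_key_id     'my_access_key_id'                  # placeholder credentials
      secret_access_key 'my_secret_access_key'
      host              's3-ap-southeast-1.amazonaws.com'   # optional; defaults to s3.amazonaws.com
      bucket            '/bucket/backups/mysql/'
      use_ssl           true
    end
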
@@ -168,13 +182,4 @@ Send us a message! Fork the project!
  [Report it](http://github.com/meskyanichi/backup/issues)


- ### Contributors
-
- * [dtrueman](http://github.com/dtrueman)
- * [Nathan L Smith](http://github.com/smith)
- * [Francesc Esplugas](http://github.com/fesplugas)
- * [wakiki](http://github.com/wakiki)
- * [Dan Hixon](http://github.com/danhixon)
- * [Adam Greene](http://github.com/skippy)
-
- __Michael van Rooijen | Final Creation. ([http://michaelvanrooijen.com](http://michaelvanrooijen.com))__
+ __Michael van Rooijen | Final Creation. ([http://michaelvanrooijen.com](http://michaelvanrooijen.com))__
data/VERSION CHANGED
@@ -1 +1 @@
- 2.4.0
+ 2.4.1
@@ -21,12 +21,13 @@
  # - Custom
  #
  # STORAGE METHODS
- # - S3 (Amazon)
- # - CF (Rackspace Cloud Files)
- # - SCP (Remote Server)
- # - FTP (Remote Server)
- # - SFTP (Remote Server)
- # - LOCAL (Local Server)
+ # - S3 (Amazon)
+ # - CF (Rackspace Cloud Files)
+ # - Dropbox (Dropbox Web Service)
+ # - SCP (Remote Server)
+ # - FTP (Remote Server)
+ # - SFTP (Remote Server)
+ # - LOCAL (Local Server)
  #
  # GLOBAL OPTIONS
  # - Keep Backups (keep_backups)
@@ -227,3 +228,26 @@ backup 'sqlite-backup-local' do
  notify false

  end
+
+
+ # Initialize with
+ # rake backup:run trigger='postgresql-backup-dropbox'
+ backup 'postgresql-backup-dropbox' do
+
+ adapter :postgresql do
+ user 'devmen'
+ database 'domowoi_test'
+ end
+
+ storage :dropbox do
+ api_key 'your_api_key'
+ secret_access_key 'your_api_secret'
+ username 'user@example.org'
+ password 'super_s3cret'
+ path 'backups'
+ end
+
+ keep_backups :all
+ encrypt_with_password false
+ notify false
+ end
@@ -54,6 +54,7 @@ module Backup
  module Adapters
  autoload :Base, 'backup/adapters/base'
  autoload :MySQL, 'backup/adapters/mysql'
+ autoload :MongoDB, 'backup/adapters/mongo_db'
  autoload :SQLite, 'backup/adapters/sqlite'
  autoload :PostgreSQL, 'backup/adapters/postgresql'
  autoload :Archive, 'backup/adapters/archive'
@@ -68,6 +69,7 @@ module Backup
  autoload :FTP, 'backup/storage/ftp'
  autoload :SFTP, 'backup/storage/sftp'
  autoload :Local, 'backup/storage/local'
+ autoload :Dropbox, 'backup/storage/dropbox'
  end

  module Record
@@ -78,6 +80,7 @@ module Backup
  autoload :FTP, 'backup/record/ftp'
  autoload :SFTP, 'backup/record/sftp'
  autoload :Local, 'backup/record/local'
+ autoload :Dropbox, 'backup/record/dropbox'
  end

  class Setup
@@ -95,6 +98,7 @@ module Backup
  # Initializes one of the few adapters and start the backup process
  def initialize_adapter
  case procedure.adapter_name.to_sym
+ when :mongo then Backup::Adapters::MongoDB.new trigger, procedure
  when :mysql then Backup::Adapters::MySQL.new trigger, procedure
  when :sqlite then Backup::Adapters::SQLite.new trigger, procedure
  when :postgresql then Backup::Adapters::PostgreSQL.new trigger, procedure
@@ -48,9 +48,11 @@ module Backup
  begin
  create_tmp_folder
  load_settings # if respond_to?(:load_settings)
+ handle_before_backup
  perform
  encrypt
  store
+ handle_after_backup
  record
  notify
  ensure
@@ -60,16 +62,37 @@ module Backup

  # Creates the temporary folder for the specified adapter
  def create_tmp_folder
- run "mkdir -p #{tmp_path}"
+ # need to create with universal privileges as some backup tasks might create this path under sudo
+ run "mkdir -m 0777 -p #{tmp_path.sub(/\/[^\/]+$/, '')}" # this is the parent of the tmp_path
+ run "mkdir -m 0777 -p #{tmp_path}" # the tmp_path dir itself
  end

  # TODO make methods in derived classes public? respond_to cannot identify private methods
  def load_settings
  end
+
+ def skip_backup(msg)
+ log "Terminating backup early because: #{msg}"
+ exit 1
+ end

  # Removes the files inside the temporary folder
  def remove_tmp_files
- run "rm #{File.join(tmp_path, '*')}"
+ run "rm -r #{File.join(tmp_path)}" if File.exists?(tmp_path) # just in case there isn't one because the process was skipped
+ end
+
+ def handle_before_backup
+ return unless self.procedure.before_backup_block
+ log system_messages[:before_backup_hook]
+ # run it through this instance so the block is run as part of this adapter... which means it has access to all sorts of stuff
+ self.instance_eval &self.procedure.before_backup_block
+ end
+
+ def handle_after_backup
+ return unless self.procedure.after_backup_block
+ log system_messages[:after_backup_hook]
+ # run it through this instance so the block is run as part of this adapter... which means it has access to all sorts of stuff
+ self.instance_eval &self.procedure.after_backup_block
  end

  # Encrypts the archive file
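
To make the new hooks concrete, here is a hedged configuration sketch (trigger name, commands and paths are made up). Because the blocks are instance_eval'd on the adapter, helpers such as `skip_backup` (defined above) and `log` are available inside them:

    backup 'mysql-backup-s3' do

      adapter :mysql do
        user     'user'
        password 'password'
        database 'database'
      end

      storage :s3 do
        access_key_id     'access_key_id'
        secret_access_key 'secret_access_key'
        bucket            '/bucket/backups/mysql/'
      end

      before_backup do
        # runs inside the adapter instance, before `perform`
        skip_backup("maintenance lock present") if File.exists?('/tmp/skip_backups.lock')
      end

      after_backup do
        log "Backup stored; running the after backup hook.."
      end

      keep_backups 10
      encrypt_with_password false
      notify false
    end
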
@@ -92,7 +115,7 @@ module Backup
  tmp_file << encrypt_with_gpg_public_key
  tmp_file.close
  # that will either say the key was added OR that it wasn't needed, but either way we need to parse for the uid
- # which will be wrapped in '<' and '>' like <sweetspot-backup2007@6bar8.com>
+ # which will be wrapped in '<' and '>' like <someone_famous@me.com>
  encryptionKeyId = `gpg --import #{tmp_file.path} 2>&1`.match(/<(.+)>/)[1]
  run "gpg -e --trust-model always -o #{File.join(tmp_path, encrypted_file)} -r '#{encryptionKeyId}' #{File.join(tmp_path, compressed_file)}"
  elsif encrypt_with_password.is_a?(String)
@@ -128,6 +151,10 @@ module Backup
  :encrypting_w_pass => "Encrypting backup with password..",
  :encrypting_w_key => "Encrypting backup with gpg public key..",
  :mysqldump => "Creating MySQL dump..",
+ :mongo_dump => "Creating MongoDB dump..",
+ :mongo_copy => "Creating MongoDB disk level copy..",
+ :before_backup_hook => "Running before backup hook..",
+ :after_backup_hook => "Running after backup hook..",
  :pgdump => "Creating PostgreSQL dump..",
  :sqlite => "Copying and compressing SQLite database..",
  :commands => "Executing commands.." }
@@ -0,0 +1,139 @@
+ module Backup
+   module Adapters
+     class MongoDB < Backup::Adapters::Base
+       require 'json'
+
+       attr_accessor :user, :password, :database, :collections, :host, :port, :additional_options, :backup_method
+
+       private
+
+       BACKUP_METHOD_OPTIONS = [:mongodump, :disk_copy]
+
+       # Dumps and Compresses the Mongodump file
+       def perform
+         tmp_mongo_dir = "mongodump-#{Time.now.strftime("%Y%m%d%H%M%S")}"
+         tmp_dump_dir = File.join(tmp_path, tmp_mongo_dir)
+
+         case self.backup_method.to_sym
+         when :mongodump
+           # this is the default option
+           # PROS:
+           #  * non-locking
+           #  * much smaller archive sizes
+           #  * can specifically target different databases or collections to dump
+           #  * de-fragments the datastore
+           #  * don't need to run under sudo
+           #  * simple logic
+           # CONS:
+           #  * a bit longer to restore as you have to do an import
+           #  * does not include indexes or other meta data
+           log system_messages[:mongo_dump]
+           exit 1 unless run "#{mongodump} #{mongodump_options} #{collections_to_include} -o #{tmp_dump_dir} #{additional_options} > /dev/null 2>&1"
+         when :disk_copy
+           # this is a bit more complicated AND potentially a lot riskier:
+           # PROS:
+           #  * byte level copy, so it includes all the indexes, meta data, etc
+           #  * fast recovery; you just copy the files into place and startup mongo
+           # CONS:
+           #  * locks the database, so ONLY use against a slave instance
+           #  * copies everything; cannot specify a collection or a database
+           #  * will probably need to run under sudo as the mongodb db_path file is probably under a different owner.
+           #    If you do run under sudo, you will probably need to run rake RAILS_ENV=... if you aren't already
+           #  * the logic is a bit brittle...
+           log system_messages[:mongo_copy]
+
+           cmd = "#{mongo} #{mongo_disk_copy_options} --quiet --eval 'printjson(db.isMaster());' admin"
+           output = JSON.parse(run(cmd, :exit_on_failure => true))
+           if output['ismaster']
+             puts "You cannot run in disk_copy mode against a master instance. This mode will lock the database. Please use :mongodump instead."
+             exit 1
+           end
+
+           begin
+             cmd = "#{mongo} #{mongo_disk_copy_options} --quiet --eval 'db.runCommand({fsync : 1, lock : 1}); printjson(db.runCommand({getCmdLineOpts:1}));' admin"
+             output = JSON.parse(run(cmd, :exit_on_failure => true))
+
+             # let's go find the dbpath. it is either going to be in the argv just returned OR we are going to have to parse through the mongo config file
+             cmd = "mongo --quiet --eval 'printjson(db.runCommand({getCmdLineOpts:1}));' admin"
+             output = JSON.parse(run(cmd, :exit_on_failure => true))
+             # see if --dbpath was passed in
+             db_path = output['argv'][output['argv'].index('--dbpath') + 1] if output['argv'].index('--dbpath')
+             # see if --config is passed in, and if so, let's parse it
+             db_path ||= $1 if output['argv'].index('--config') && File.read(output['argv'][output['argv'].index('--config') + 1]) =~ /dbpath\s*=\s*([^\s]*)/
+             db_path ||= "/data/db/" # mongo's default path
+             run "cp -rp #{db_path} #{tmp_dump_dir}"
+           ensure
+             # attempting to unlock
+             cmd = "#{mongo} #{mongo_disk_copy_options} --quiet --eval 'printjson(db.currentOp());' admin"
+             output = JSON.parse(run(cmd, :exit_on_failure => true))
+             (output['fsyncLock'] || 1).to_i.times do
+               run "#{mongo} #{mongo_disk_copy_options} --quiet --eval 'db.$cmd.sys.unlock.findOne();' admin"
+             end
+           end
+         else
+           puts "You did not enter a valid backup_method option. Your choices are: #{BACKUP_METHOD_OPTIONS.join(', ')}"
+           exit 1
+         end
+
+         log system_messages[:compressing]
+         run "tar -cz -C #{tmp_path} -f #{File.join(tmp_path, compressed_file)} #{tmp_mongo_dir}"
+       end
+
+       def mongodump
+         cmd = run("which mongodump").chomp
+         cmd = 'mongodump' if cmd.empty?
+         cmd
+       end
+
+       def mongo
+         cmd = run("which mongo").chomp
+         cmd = 'mongo' if cmd.empty?
+         cmd
+       end
+
+       def performed_file_extension
+         ".tar"
+       end
+
+       # Loads the initial settings
+       def load_settings
+         %w(user password database collections additional_options backup_method).each do |attribute|
+           send(:"#{attribute}=", procedure.get_adapter_configuration.attributes[attribute])
+         end
+
+         %w(host port).each do |attribute|
+           send(:"#{attribute}=", procedure.get_adapter_configuration.get_options.attributes[attribute])
+         end
+
+         self.backup_method ||= :mongodump
+       end
+
+       # Returns a list of options in Mongodump syntax
+       def mongodump_options
+         options = String.new
+         options += " --username='#{user}' " unless user.blank?
+         options += " --password='#{password}' " unless password.blank?
+         options += " --host='#{host}' " unless host.blank?
+         options += " --port='#{port}' " unless port.blank?
+         options += " --db='#{database}' " unless database.blank?
+         options
+       end
+
+       def mongo_disk_copy_options
+         options = String.new
+         options += " --username='#{user}' " unless user.blank?
+         options += " --password='#{password}' " unless password.blank?
+         options += " --host='#{host}' " unless host.blank?
+         options += " --port='#{port}' " unless port.blank?
+         options
+       end
+
+       # Returns a list of collections to include in Mongodump syntax
+       def collections_to_include
+         return "" unless collections
+         "--collection #{[*collections].join(" ")}"
+       end
+
+     end
+   end
+ end
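
To make the two backup_method modes concrete, a hedged configuration sketch follows (trigger name, host, port and storage path are placeholders). It targets the riskier :disk_copy mode against a slave instance, which is what the comments above insist on:

    backup 'mongo-backup-local' do

      adapter :mongo do
        # :disk_copy copies the whole dbpath, so database/collection selection does not apply here
        backup_method :disk_copy
        options do
          host 'mongo-slave.example.com'   # placeholder: point this at a slave, never the master
          port 27018
        end
      end

      storage :local do
        path '/data/backups/mongo/'
      end

      keep_backups 5
      encrypt_with_password false
      notify false
    end
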
@@ -1,7 +1,10 @@
  module Backup
  module CommandHelper
- def run(command)
- Kernel.system command
+ def run(command, opts={})
+ opts[:exit_on_failure] ||= false
+ output = `#{command}`
+ exit 1 if opts[:exit_on_failure] && !$?.success?
+ output
  end

  def log(command)
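
A small hedged usage sketch of the new behaviour (the wrapper class is made up; in the gem this module is mixed into classes such as the adapters and the S3 connection shown later in this diff): `run` now returns the command's captured stdout instead of a boolean, and can abort the whole process on failure:

    class CommandRunner
      include Backup::CommandHelper
    end

    runner = CommandRunner.new
    dump_bin = runner.run("which mongodump").chomp        # captured stdout, e.g. "/usr/local/bin/mongodump"
    runner.run("tar -czf /tmp/backup.tar.gz /tmp/backup", # exits with status 1 if tar fails
               :exit_on_failure => true)
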
@@ -2,7 +2,7 @@ module Backup
  module Configuration
  class Adapter
  extend Backup::Configuration::Attributes
- generate_attributes %w(files exclude user password database tables skip_tables commands additional_options)
+ generate_attributes %w(files exclude user password database tables skip_tables commands additional_options backup_method)

  def initialize
  @options = Backup::Configuration::AdapterOptions.new
@@ -4,7 +4,7 @@ module Backup
  extend Backup::Configuration::Attributes
  generate_attributes %w(encrypt_with_password encrypt_with_gpg_public_key keep_backups notify)

- attr_accessor :trigger, :storage_name, :adapter_name
+ attr_accessor :trigger, :storage_name, :adapter_name, :before_backup_block, :after_backup_block

  def initialize(trigger)
  @trigger = trigger
@@ -22,6 +22,14 @@ module Backup
  @storage_configuration.instance_eval &block
  end

+ def before_backup(&block)
+ @before_backup_block = block
+ end
+
+ def after_backup(&block)
+ @after_backup_block = block
+ end
+
  def storage_class
  case @storage_name.to_sym
  when :cloudfiles then Backup::Storage::CloudFiles
@@ -30,6 +38,7 @@ module Backup
  when :ftp then Backup::Storage::FTP
  when :sftp then Backup::Storage::SFTP
  when :local then Backup::Storage::Local
+ when :dropbox then Backup::Storage::Dropbox
  end
  end

@@ -41,6 +50,7 @@ module Backup
  when :ftp then Backup::Record::FTP
  when :sftp then Backup::Record::SFTP
  when :local then Backup::Record::Local
+ when :dropbox then Backup::Record::Dropbox
  end
  end

@@ -2,7 +2,7 @@ module Backup
  module Configuration
  class Storage
  extend Backup::Configuration::Attributes
- generate_attributes %w(ip user password path access_key_id secret_access_key use_ssl bucket username api_key container)
+ generate_attributes %w(ip user password path access_key_id secret_access_key host use_ssl bucket username api_key container)

  end
  end
@@ -0,0 +1,62 @@
+ require 'dropbox'
+
+ module Backup
+   module Connection
+     class Dropbox
+
+       attr_accessor :adapter, :procedure, :final_file, :tmp_path, :api_key, :secret_access_key, :username, :password, :path
+
+       def initialize(adapter=false)
+         if adapter
+           self.adapter = adapter
+           self.procedure = adapter.procedure
+           self.final_file = adapter.final_file
+           self.tmp_path = adapter.tmp_path.gsub('\ ', ' ')
+
+           load_storage_configuration_attributes
+         end
+       end
+
+       def static_initialize(procedure)
+         self.procedure = procedure
+         load_storage_configuration_attributes(true)
+       end
+
+       def session
+         @session ||= ::Dropbox::Session.new(api_key, secret_access_key)
+         unless @session.authorized?
+           @session.authorizing_user = username
+           @session.authorizing_password = password
+           @session.authorize!
+         end
+
+         @session
+       end
+
+       def connect
+         session
+       end
+
+       def path
+         @path || "backups"
+       end
+
+       def store
+         path_to_file = File.join(tmp_path, final_file)
+         session.upload(path_to_file, path, :mode => :dropbox)
+       end
+
+       private
+
+       def load_storage_configuration_attributes(static=false)
+         %w(api_key secret_access_key username password path).each do |attribute|
+           if static
+             send("#{attribute}=", procedure.get_storage_configuration.attributes[attribute])
+           else
+             send("#{attribute}=", adapter.procedure.get_storage_configuration.attributes[attribute])
+           end
+         end
+       end
+     end
+   end
+ end
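
Tying this back to the DSL, the attributes above map one-to-one onto the storage :dropbox block used in the generator templates elsewhere in this diff (values are placeholders); note that `path` falls back to "backups" when it is omitted:

    storage :dropbox do
      api_key           'your_api_key'
      secret_access_key 'your_api_secret'
      username          'user@example.org'
      password          'super_s3cret'
      # path 'backups'    # optional; the connection defaults to "backups"
    end
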
@@ -1,10 +1,13 @@
- require 's3'
+ require "fog"

  module Backup
  module Connection
  class S3
+ include Backup::CommandHelper

- attr_accessor :adapter, :procedure, :access_key_id, :secret_access_key, :s3_bucket, :use_ssl, :final_file, :tmp_path
+ MAX_S3_FILE_SIZE = 5368709120 - 1
+
+ attr_accessor :adapter, :procedure, :access_key_id, :secret_access_key, :host, :s3_bucket, :use_ssl, :final_file, :tmp_path

  # Initializes the S3 connection, setting the values using the S3 adapter
  def initialize(adapter = false)
@@ -24,48 +27,48 @@ module Backup
  end

  # Establishes a connection with Amazon S3 using the credentials provided by the user
- def connect
- service
- end
-
- # Wrapper for the Service object
- def service
- ::S3::Service.new(:access_key_id => access_key_id,
- :secret_access_key => secret_access_key,
- :use_ssl => use_ssl)
- end
-
- # Wrapper for the Bucket object
- def bucket
- begin
- # Find existing bucket:
- bucket = service.buckets.find(s3_bucket)
- rescue ::S3::Error::NoSuchBucket => e
- # Apparently the bucket doesn't exist yet, so create a new one:
- bucket = service.buckets.build(s3_bucket)
- bucket.save
- end
- bucket.retrieve
+ def connection
+ @_connection ||= Fog::AWS::Storage.new(
+ :aws_access_key_id => access_key_id,
+ :aws_secret_access_key => secret_access_key
+ )
  end

  # Initializes the file transfer to Amazon S3
  # This can only run after a connection has been made using the #connect method
  def store
- object = bucket.objects.build(final_file)
- object.content = open(File.join(tmp_path, final_file))
- object.save
+ # TODO: need to add logic like this to restore: `cat /mnt/backups/part.xx >>restore.tgz`
+ tmp_file_path = File.join(tmp_path, final_file)
+ store_files = []
+ if File.stat(File.join(tmp_path, final_file)).size >= MAX_S3_FILE_SIZE
+ # we need to split!
+ `split -b #{MAX_S3_FILE_SIZE} #{tmp_file_path} #{tmp_file_path}.`
+ store_files += `ls #{tmp_file_path}.*`.split
+ log("Splitting '#{final_file}' into #{store_files.length} parts as it is too large for s3.")
+ else
+ store_files << tmp_file_path
+ end
+
+ # let's make sure the bucket exists
+ self.connection.put_bucket(s3_bucket)
+
+ store_files.each do |tmp_file|
+ file_name = File.basename(tmp_file)
+ log("Saving '#{file_name}' to s3 bucket '#{s3_bucket}'")
+ self.connection.put_object(s3_bucket, file_name, open(tmp_file))
+ end
  end

  # Destroys file from a bucket on Amazon S3
  def destroy(file, bucket_as_string)
- object = bucket.objects.find(file)
- object.destroy
+ self.connection.put_bucket(s3_bucket)
+ connection.delete_object(s3_bucket, file)
  end

  private

  def load_storage_configuration_attributes(static = false)
- %w(access_key_id secret_access_key use_ssl).each do |attribute|
+ %w(access_key_id secret_access_key use_ssl host).each do |attribute|
  if static
  send("#{attribute}=", procedure.get_storage_configuration.attributes[attribute])
  else
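
The restore side of that split is only hinted at in the TODO comment above. A hedged sketch of re-assembling the uploaded parts after downloading them from the bucket (paths and file names are illustrative) could look like this:

    # `split` names the pieces backup.tgz.aa, backup.tgz.ab, ... so sorting the
    # downloaded parts alphabetically restores the original order before extraction.
    parts = Dir.glob("/mnt/backups/backup.tgz.*").sort
    File.open("/mnt/backups/backup.tgz", "wb") do |whole|
      parts.each do |part|
        File.open(part, "rb") { |f| whole.write(f.read) }
      end
    end
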
@@ -0,0 +1,27 @@
+ require 'backup/connection/dropbox'
+
+ module Backup
+   module Record
+     class Dropbox < Backup::Record::Base
+       def load_specific_settings(adapter)
+       end
+
+       private
+
+       def self.destroy_backups(procedure, backups)
+         dropbox = Backup::Connection::Dropbox.new
+         dropbox.static_initialize(procedure)
+         session = dropbox.session
+         backups.each do |backup|
+           puts "\nDestroying backup \"#{backup.filename}\"."
+           path_to_file = File.join(dropbox.path, backup.filename)
+           begin
+             session.delete(path_to_file, :mode => :dropbox)
+           rescue ::Dropbox::FileNotFoundError => e
+             puts "\n Backup with name '#{backup.filename}' was not found in '#{dropbox.path}'"
+           end
+         end
+       end
+     end
+   end
+ end
@@ -13,7 +13,6 @@ module Backup
  def self.destroy_backups(procedure, backups)
  s3 = Backup::Connection::S3.new
  s3.static_initialize(procedure)
- s3.connect
  backups.each do |backup|
  puts "\nDestroying backup \"#{backup.filename}\" from bucket \"#{backup.bucket}\"."
  s3.destroy(backup.filename, backup.bucket)
@@ -0,0 +1,12 @@
+ require 'backup/connection/dropbox'
+
+ module Backup
+   module Storage
+     class Dropbox < Base
+       def initialize(adapter)
+         dropbox = Backup::Connection::Dropbox.new(adapter)
+         dropbox.store
+       end
+     end
+   end
+ end
@@ -7,8 +7,6 @@ module Backup
  # Stores the backup file on the remote server using S3
  def initialize(adapter)
  s3 = Backup::Connection::S3.new(adapter)
- s3.connect
- log("Saving '#{s3.final_file}' to s3 bucket '#{s3.s3_bucket}'")
  s3.store
  end

@@ -17,6 +17,7 @@
  # - MySQL
  # - PostgreSQL
  # - SQLite
+ # - MongoDB
  # - Archive
  # - Custom
  #
@@ -74,35 +75,41 @@


  # Initialize with:
- # rake backup:run trigger='mysql-backup-s3'
- backup 'mysql-backup-s3' do
-
- adapter :mysql do
- user 'user'
- password 'password'
- database 'database'
-
- # skip_tables ['table1', 'table2', 'table3']
- #
- # options do
- # host '123.45.678.90'
- # port '80'
- # socket '/tmp/socket.sock'
- # end
- # additional_options '--single-transaction --quick'
+ # rake backup:run trigger='mongo-backup-s3'
+ backup 'mongo-backup-s3' do
+
+ adapter :mongo do
+ database "your_mongo_database"
+ # There are two ways to back up mongo:
+ # * :mongodump (DEFAULT) fairly fast, non-blocking, creates smaller bson files, need to import to recover
+ # * :disk_copy locks the database (use a slave!!!), does a disk-level copy, and then unlocks. fast, blocking, large archives, but very fast to recover
+ backup_method :mongodump # default
+ # collections 'my_mongo_collection'
+ options do
+ # host mongo.mysite.com
+ # port 27018 # perhaps you have a slave instance
+ # username user
+ # password secret
+ end
  end

  storage :s3 do
  access_key_id 'access_key_id'
  secret_access_key 'secret_access_key'
+ # host 's3-ap-southeast-1.amazonaws.com' # the s3 location. Defaults to us-east-1
  bucket '/bucket/backups/mysql/'
  use_ssl true
  end

  keep_backups 25
- encrypt_with_password 'password'
- notify false
-
+ encrypt_with_gpg_public_key <<-KEY
+ -----BEGIN PGP PUBLIC KEY BLOCK-----
+ Version: GnuPG v1.4.10 (Darwin)
+
+ public key goes here
+ -----END PGP PUBLIC KEY BLOCK-----
+ KEY
+ notify false
  end

  # Initialize with:
@@ -21,12 +21,13 @@
  # - Custom
  #
  # STORAGE METHODS
- # - S3 (Amazon)
- # - CF (Rackspace Cloud Files)
- # - SCP (Remote Server)
- # - FTP (Remote Server)
- # - SFTP (Remote Server)
- # - LOCAL (Local Server)
+ # - S3 (Amazon)
+ # - CF (Rackspace Cloud Files)
+ # - Dropbox (Dropbox Web Service)
+ # - SCP (Remote Server)
+ # - FTP (Remote Server)
+ # - SFTP (Remote Server)
+ # - LOCAL (Local Server)
  #
  # GLOBAL OPTIONS
  # - Keep Backups (keep_backups)
@@ -228,4 +229,27 @@ backup 'sqlite-backup-local' do
  encrypt_with_password false
  notify false

- end
+ end
+
+
+ # Initialize with
+ # sudo backup --run postgresql-backup-dropbox
+ backup 'postgresql-backup-dropbox' do
+
+ adapter :postgresql do
+ user 'devmen'
+ database 'domowoi_test'
+ end
+
+ storage :dropbox do
+ api_key 'your_api_key'
+ secret_access_key 'your_api_secret'
+ username 'user@example.org'
+ password 'super_s3cret'
+ path 'backups'
+ end
+
+ keep_backups :all
+ encrypt_with_password false
+ notify false
+ end
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: backup
  version: !ruby/object:Gem::Version
- hash: 31
+ hash: 29
  prerelease: false
  segments:
  - 2
  - 4
- - 0
- version: 2.4.0
+ - 1
+ version: 2.4.1
  platform: ruby
  authors:
  - Michael van Rooijen
@@ -15,29 +15,45 @@ autorequire:
  bindir: bin
  cert_chain: []

- date: 2010-09-23 00:00:00 +02:00
+ date: 2010-10-17 00:00:00 +02:00
  default_executable:
  dependencies:
  - !ruby/object:Gem::Dependency
- name: s3
+ name: fog
  prerelease: false
  requirement: &id001 !ruby/object:Gem::Requirement
  none: false
  requirements:
- - - ">="
+ - - ~>
  - !ruby/object:Gem::Version
- hash: 19
+ hash: 25
  segments:
  - 0
  - 3
- - 0
- version: 0.3.0
+ - 5
+ version: 0.3.5
  type: :runtime
  version_requirements: *id001
  - !ruby/object:Gem::Dependency
- name: net-ssh
+ name: json_pure
  prerelease: false
  requirement: &id002 !ruby/object:Gem::Requirement
+ none: false
+ requirements:
+ - - ~>
+ - !ruby/object:Gem::Version
+ hash: 11
+ segments:
+ - 1
+ - 4
+ - 6
+ version: 1.4.6
+ type: :runtime
+ version_requirements: *id002
+ - !ruby/object:Gem::Dependency
+ name: net-ssh
+ prerelease: false
+ requirement: &id003 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ">="
@@ -49,11 +65,11 @@ dependencies:
  - 15
  version: 2.0.15
  type: :runtime
- version_requirements: *id002
+ version_requirements: *id003
  - !ruby/object:Gem::Dependency
  name: net-scp
  prerelease: false
- requirement: &id003 !ruby/object:Gem::Requirement
+ requirement: &id004 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ">="
@@ -65,11 +81,11 @@ dependencies:
  - 2
  version: 1.0.2
  type: :runtime
- version_requirements: *id003
+ version_requirements: *id004
  - !ruby/object:Gem::Dependency
  name: net-sftp
  prerelease: false
- requirement: &id004 !ruby/object:Gem::Requirement
+ requirement: &id005 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ">="
@@ -81,11 +97,11 @@ dependencies:
  - 4
  version: 2.0.4
  type: :runtime
- version_requirements: *id004
+ version_requirements: *id005
  - !ruby/object:Gem::Dependency
  name: activerecord
  prerelease: false
- requirement: &id005 !ruby/object:Gem::Requirement
+ requirement: &id006 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ">="
@@ -97,11 +113,11 @@ dependencies:
  - 5
  version: 2.3.5
  type: :runtime
- version_requirements: *id005
+ version_requirements: *id006
  - !ruby/object:Gem::Dependency
  name: sqlite3-ruby
  prerelease: false
- requirement: &id006 !ruby/object:Gem::Requirement
+ requirement: &id007 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ">="
@@ -113,11 +129,11 @@ dependencies:
  - 5
  version: 1.2.5
  type: :runtime
- version_requirements: *id006
+ version_requirements: *id007
  - !ruby/object:Gem::Dependency
  name: hirb
  prerelease: false
- requirement: &id007 !ruby/object:Gem::Requirement
+ requirement: &id008 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ">="
@@ -129,11 +145,11 @@ dependencies:
  - 9
  version: 0.2.9
  type: :runtime
- version_requirements: *id007
+ version_requirements: *id008
  - !ruby/object:Gem::Dependency
  name: pony
  prerelease: false
- requirement: &id008 !ruby/object:Gem::Requirement
+ requirement: &id009 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ">="
@@ -144,11 +160,11 @@ dependencies:
  - 5
  version: "0.5"
  type: :runtime
- version_requirements: *id008
+ version_requirements: *id009
  - !ruby/object:Gem::Dependency
  name: cloudfiles
  prerelease: false
- requirement: &id009 !ruby/object:Gem::Requirement
+ requirement: &id010 !ruby/object:Gem::Requirement
  none: false
  requirements:
  - - ">="
@@ -160,13 +176,29 @@ dependencies:
  - 7
  version: 1.4.7
  type: :runtime
- version_requirements: *id009
+ version_requirements: *id010
+ - !ruby/object:Gem::Dependency
+ name: dropbox
+ prerelease: false
+ requirement: &id011 !ruby/object:Gem::Requirement
+ none: false
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ hash: 23
+ segments:
+ - 1
+ - 1
+ - 2
+ version: 1.1.2
+ type: :runtime
+ version_requirements: *id011
  description: |-
  Backup is a Ruby Gem written for Unix and Ruby on Rails (2 and 3) environments. It can be used both with
  and without the Ruby on Rails framework! This gem offers a quick and simple solution to backing up databases
  such as MySQL/PostgreSQL/SQLite and Files/Folders. All backups can be transferred to Amazon S3, Rackspace Cloud Files,
- any remote server you have access to (using either SCP, SFTP or regular FTP), or a Local server. Backup handles
- Compression, Archiving, Encryption (OpenSSL or GPG), Backup Cleaning (Cycling) and supports Email Notifications.
+ Dropbox Web Service, any remote server you have access to (using either SCP, SFTP or regular FTP), or a Local server.
+ Backup handles Compression, Archiving, Encryption (OpenSSL or GPG), Backup Cleaning (Cycling) and supports Email Notifications.
  email: meskyanichi@gmail.com
  executables:
  - backup
@@ -182,6 +214,7 @@ files:
  - lib/backup/adapters/archive.rb
  - lib/backup/adapters/base.rb
  - lib/backup/adapters/custom.rb
+ - lib/backup/adapters/mongo_db.rb
  - lib/backup/adapters/mysql.rb
  - lib/backup/adapters/postgresql.rb
  - lib/backup/adapters/sqlite.rb
@@ -195,6 +228,7 @@ files:
  - lib/backup/configuration/smtp.rb
  - lib/backup/configuration/storage.rb
  - lib/backup/connection/cloudfiles.rb
+ - lib/backup/connection/dropbox.rb
  - lib/backup/connection/s3.rb
  - lib/backup/environment/base.rb
  - lib/backup/environment/rails_configuration.rb
@@ -203,6 +237,7 @@ files:
  - lib/backup/mail/mail.txt
  - lib/backup/record/base.rb
  - lib/backup/record/cloudfiles.rb
+ - lib/backup/record/dropbox.rb
  - lib/backup/record/ftp.rb
  - lib/backup/record/local.rb
  - lib/backup/record/s3.rb
@@ -210,6 +245,7 @@ files:
  - lib/backup/record/sftp.rb
  - lib/backup/storage/base.rb
  - lib/backup/storage/cloudfiles.rb
+ - lib/backup/storage/dropbox.rb
  - lib/backup/storage/ftp.rb
  - lib/backup/storage/local.rb
  - lib/backup/storage/s3.rb