backup 0.1.0 → 0.2.0

@@ -1,237 +1,69 @@
  = Backup
 
- Backup is a gem/plugin that enables you to very easily create backups and transfer these to Amazon S3 or another server using SSH.
- It currently supports MySQL, SQLite3 and basic Assets (documents, images, etc) inside a folder. The files will get tar'd / gzip'd and get a timestamp-prefix.
- After the backup file has been created, it can be transferred to either Amazon S3 or any remote server through SSH.
+ == What is "Backup"?
 
- == Installation
+ "Backup" is a RubyGem written for Rails to (easily) handle backing up your database files and assets to either Amazon S3, or any other server using SSH.
 
- === Add Repository Source(s)
 
-   gem sources -a http://gemcutter.org
-   gem sources -a http://gems.github.com
+ === Backup makes use of two storage methods:
 
- === Gem
+ - Amazon S3
+ - Any Remote Server You Can Access Through "SSH"
 
-   # Gem Cutter
-   sudo gem install backup
-
-   # GitHub
-   sudo gem install meskyanichi-backup
 
- === Plugin
+ === Currently it supports:
 
-   ./script/plugin install git://github.com/meskyanichi/backup.git
+ - SQLite3
+ - MySQL
+ - Folder (and Sub Folders) of Assets (Think of: Documents, Images, Etc.)
 
 
- === Dependencies
+ === Your database type not on the list?
 
-   # This will automatically install when installing the Backup gem.
-   # If you are using the Plugin, instead of the gem, be sure to install the aws-s3 gem.
-   sudo gem install aws-s3
+ - Custom
 
+ I built in a simple class (called: Custom) that will allow you to manually create a SQL dump using a shell command.
+ This looks basically the same as the other options, except that you will have to write (probably) only a "single
+ line'd shell command" to generate a sql dump file. Once you generate the file, Backup will take care of the rest
+ of the (optional)archiving, compressing and pushing to S3 or any remote server that supports SSH.
 
- == Getting started
+ ==== Setting up Backup takes me about 1-2 minutes and it's really easy!
 
- Well, this is ridiculously easy to set up! So let's do this.
- First install either the gem or plugin.
 
- If you are using the gem version you "must" add the following line to your environment.rb file
+ == Interested in trying out Backup?
 
- ==== config/environment.rb
+ === Check out the following Wiki pages:
 
-   # For GemCutter Version
-   config.gem "backup", :lib => "backup", :version => "0.1.0", :source => "http://gemcutter.org"
-
-   # For GitHub Version
-   config.gem "meskyanichi-backup", :lib => "backup", :version => "0.1.0", :source => "http://gems.github.com"
+ === Installation
 
- Once that's done, run the following command from the "root" of your Rails App
+ http://wiki.github.com/meskyanichi/backup/installation
 
-   ./script/generate backup_rake_tasks
-
 
- This will generate two rake tasks and a README inside your "#{RAILS_ROOT}/lib/tasks/backup" folder.
- - README.rdoc
- - s3.rake
- - ssh.rake
+ === Getting started
 
- Open and read the README.rdoc if you want. It will explain everything very quickly, though, I doubt there is much to explain!
- Then (or otherwise) open the "s3.rake" and "ssh.rake" rake task files. These include all the rake tasks/combinations that are (currently!) available.
- Above each task inside these files is a description, explaining what you must do, again, "very" straight forward. Stupidly Easy to set up, thankfully!
+ http://wiki.github.com/meskyanichi/backup/getting-started
 
- After you've set up the tasks you wish to utilize (obviously you don't need to use all of them, you can simply just choose to use one of them),
- they are all generated so you can basically just fill in the configuration values and be done with it. You obviously don't have to fill in the configuration
- for any rake tasks you are not going to use.
 
- That's it! So you have now done two things:
- - Configured the Rake Files (just filled in the empty values)
- - And at the same time, these are the executable rake tasks you will be using to create a backup
+ === Requirements
 
- Just run one of the configured rake tasks to perform a backup!
+ http://wiki.github.com/meskyanichi/backup/requirements
 
- == Example
 
- So let me give an example of what one of these rake tasks will look like:
+ === Automatic Backups
 
- ==== Rake Task for Backing up a Sqlite3 file to S3
+ http://wiki.github.com/meskyanichi/backup/automatic-backups
 
-   task :sqlite3 => :environment do
-     Backup::Sqlite3.new({
-       :file => 'production.sqlite3',
-
-       :use => :s3,
-       :s3 => {
-         :access_key_id => 'your-s3-id',
-         :secret_access_key => 'your-s3-password',
-         :bucket => 'your-bucket-to-backup-to'
-       }
-     }).run
-   end
 
- So this is one of the rake that's that the generator provides. Simple, isn't it?
- Just fill in the values and run:
+ === Resources
 
-   rake backup:s3:sqlite3
+ http://wiki.github.com/meskyanichi/backup/resources
 
- ==== Rake Task for Backing up a Sqlite3 file to another server through SSH
 
-   task :sqlite3 => :environment do
-     Backup::Sqlite3.new({
-       :file => 'production.sqlite3',
-
-       :use => :ssh,
-       :ssh => {
-         :user => "root",
-         :ip => "123.45.678.90", # OR my-domain.com
-         :path => "/var/backups/etc"
-       }
-     }).run
-   end
-
- Again, quick and easy. Now just execute this Backing/SSH transfer by running:
-
-   rake backing:ssh:sqlite3
-
-
- See below what the requirements are when using S3 or SSH.
-
- == Requirements
-
- === Using Amazon S3
-
- This obviously requires you to have access to an Amazon S3 account.
- These accounts are free and you only get charged for what you actually "use".
- So no transfers = no cost. And aside of that, S3 is EXTREMELY cheap!
-
- You can get an account here: http://aws.amazon.com/s3
-
- Once you have an account you must install the AWS S3 gem, like so:
-
-   sudo gem install aws-s3
-
- Backup makes use of the "aws-s3" gem to connect to Amazon S3. This is a dependency and will be installed when installing the Backup gem.
- If you are using the plugin, you will need to manually install it.
-
- === Using SSH
-
- If you're using SSH then there is one thing you must do. You must provide the machine that's going to "receive" your backups
- your machine's (the senders) ssh-key. This is basically what you did with GitHub so you could push data to your GitHub repository
- without getting prompted for a password.
-
- === Setting Up A Key For SSH
-
- Setting up SSH Keychains is quite simple.
-
- SSH to the "production" server and run the following command:
-   ssh-keygen -t rsa
-
- It will prompt you 3 times, first it will ask what you wish to call the filename.
-
- Just hit enter every time, do "not" fill in a password.
-
- This will generate two files in the ~/.ssh/ directory, namely:
- - id_rsa
- - id_rsa.pub
-
- So, now that the files are generated, holding the authorization keys, you can use these on any server you wish to login to without using a password.
- The procedure accomplishing this is easy.
-
- First, we will ensure there is a .ssh directory on the "backup" server by running the following command:
-   ssh root@your_ip mkdir -p .ssh
-
- Once that's in place, we will append our newly (or already existing) key to the backup servers' .ssh/authorized_keys file.
-   cat ~/.ssh/id_rsa.pub | ssh root@server.com 'cat >> .ssh/authorized_keys'
-
- Done. Now the key, generated on your "production" server, has been inserted inside the ".ssh/authorized_keys" file on the Backup server.
- You should now be set to run all SSH rake tasks that have been configured to work with that particular Backup server.
-
- ==== Note: If the "authorized_keys" file does not yet exist, it will be automatically be created.
-
-
- == Periodical Backups (using the "rake tasks" and a "cron")
-
- Assuming you will want to run these backups (rake tasks) periodically.
- What I currently use to run them is the "javan-whenever" gem. This is a very simple, easy to use gem
- that makes it EXTREMELY simple to manage cron, using Ruby syntax. To understand what I mean, see the example below!
-
-
- === Javan's Whenever Gem Example
-
- With this gem you can basically get periodic backup execution as easy as this:
-
-   every 2.hours do
-     rake "backup:s3:sqlite3"
-   end
-
- Obviously this will update the crontab to make SQLite3 backups and store them on Amazon S3 every 2 hours.
- If you want to do multiple backups, like perhaps backup your MySQL database, along with your assets:
-
-   every 2.hours do
-     rake "backup:s3:mysql"
-     rake "backup:s3:assets"
-   end
-
- So yes, I highly recommend using the javan-whenever gem for this. It's very easy to write and maintain the crontab this way.
- For more information on this gem and on how to use it: http://github.com/javan/whenever
- Also, Ryan Bates has created a screencast for this gem, see it here: http://railscasts.com/episodes/164-cron-in-ruby
- It's awesome, go check it out!
-
-
- == Resources
-
- So let me sum up the resources
-
- ==== My Backup Gem
-   sudo gem install backup
-   or
-   sudo gem install meskyanichi-backup
-
- ==== AWS-S3 Gem
-   sudo gem install aws-s3
-
- ==== Whenever Gem
-   sudo gem install javan-whenever
-   http://railscasts.com/episodes/164-cron-in-ruby
-
- ==== Amazon S3 Account Creation
-   http://aws.amazon.com/s3
-
-
- ==== Notes:
-
- If you are going to use Amazon S3 to store your backups, be sure to install "AWS-S3".
-
- If you want a nice and super easy way of managing the crontab and do periodical backups, install "Whenever".
-
- Watch the Whenever Gem Screencast by Ryan Bates: http://railscasts.com/episodes/164-cron-in-ruby
-
-
- == Requests
+ === Requests
 
  If anyone wishes to see support for PostgreSQL or any other database format, please send me a message!
 
 
- == Copyright
+ === Copyright
 
  Copyright (c) 2009 Michael van Rooijen | Final Creation. See LICENSE for details.
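
The new README above describes the Custom database option only in prose. The sketch below is a hypothetical illustration of how such a task might be wired up, modelled on the Backup::Sqlite3 rake-task example from the old README and on the option keys (:command, :path, :file, :use) visible in the db/custom hunks further down in this diff. The class name Backup::Custom, the pg_dump command and all values are assumptions, not the gem's confirmed interface.

  # Hypothetical sketch only -- the class name and exact option keys are assumptions.
  task :custom => :environment do
    Backup::Custom.new({
      # Shell command(s) that produce the dump; a String, or an Array for multiple commands.
      :command => "pg_dump my_app_production > /tmp/dumps/my_app.sql",

      # Where the generated dump lives and what it is called
      # (an Array of files gets bundled into a .tar before compression).
      :path => "/tmp/dumps",
      :file => "my_app.sql",

      # Ship the result to Amazon S3, exactly as in the Sqlite3 example above.
      :use => :s3,
      :s3  => {
        :access_key_id     => 'your-s3-id',
        :secret_access_key => 'your-s3-password',
        :bucket            => 'your-bucket-to-backup-to'
      }
    }).run
  end
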
data/VERSION CHANGED
@@ -1 +1 @@
- 0.1.0
+ 0.2.0
@@ -5,7 +5,7 @@
 
  Gem::Specification.new do |s|
    s.name = %q{backup}
-   s.version = "0.1.0"
+   s.version = "0.2.0"
 
    s.required_rubygems_version = Gem::Requirement.new(">= 0") if s.respond_to? :required_rubygems_version=
    s.authors = ["meskyanichi"]
@@ -6,6 +6,17 @@ module Backup
    setup_paths("assets/#{self.class.name.downcase.gsub('::','-')}", 'tar.gz')
  end
 
+ # Initialize the process
+ # Executing multiple processes
+ #
+ # - Archive
+ #   Archives the specified folder to a .tar
+ # - Compress
+ #   Compresses the .tar file using Gzip
+ # - Transfer
+ #   Initializes the transfer to either S3 or using SSH
+ # - Remove Temp Files
+ #   Removes temporary files after the process is complete
  def run
    archive
    compress
@@ -15,14 +26,18 @@ module Backup
 
  private
 
+ # Archives the assets into a .tar file and stores it
+ # inside the "Backup Path"
  def archive
    %x{ tar -cf #{File.join(options[:backup_path], options[:backup_file])} #{options[:path]} }
  end
 
+ # Compresses the .tar file to .tar.gz and removes the old .tar file
  def compress
    %x{ gzip --best #{File.join(options[:backup_path], options[:backup_file])} }
  end
 
+ # Set default options
  def default_options
    { :path => "#{RAILS_ROOT}/public/assets", :file => "assets" }
  end
@@ -9,7 +9,12 @@ module Backup
  end
 
  private
-
+
+ # Sets up the default paths and stores them in the options hash
+ # It also ensures the directory to where the temporary files are stored
+ # exists. If it doesn't it'll be created
+ # It will store the backup_path and the backup_file names
+ # The backup_file name is prefixed with the timestamp of the initialize time.
  def setup_paths(path, type = nil)
    %x{ mkdir -p #{RAILS_ROOT}/tmp/backups/#{path} }
    options[:backup_path] = "#{RAILS_ROOT}/tmp/backups/#{path}"
@@ -21,6 +26,11 @@ module Backup
    end
  end
 
+ # Initializes one of the transfer methods
+ # Currently there are two transfer methods available
+ #
+ # - Amazon S3
+ # - SSH
  def transfer
    case options[:use]
      when :s3 then Backup::Transfer::S3.new(options)
@@ -28,10 +38,15 @@ module Backup
    end
  end
 
+ # Removes files that were stored in the tmp/backups/* directory of the Rails application
+ # It completely cleans up the backup folders so theres no trash stored on the production server
  def remove_temp_files
    %x{ rm #{File.join(options[:backup_path], "*")} }
  end
 
+ # Removes files that were generated for the transfer
+ # This can remove either a single file or an array of files
+ # Depending on whether the options[:file] is an Array or a String
  def remove_original_file
    if options[:file].is_a?(Array)
      options[:file].each do |file|
@@ -6,6 +6,7 @@ module Backup
    super(options)
  end
 
+ # Establishes a connection with Amazon S3 using the credentials provided by the user
  def connect
    AWS::S3::Base.establish_connection!(
      :access_key_id => options[:s3][:access_key_id],
@@ -13,18 +14,30 @@ module Backup
    )
  end
 
+ # Wrapper for the Service object
  def service
    AWS::S3::Service
  end
 
+ # Wrapper for the Bucket object
  def bucket
    AWS::S3::Bucket
  end
 
+ # Wrapper for the Object object
  def object
    AWS::S3::S3Object
  end
 
+ # Initializes the file transfer to Amazon S3
+ # This can only run after a connection has been made using the #connect method
+ def transfer
+   object.store(
+     options[:backup_file],
+     open(File.join(options[:backup_path], options[:backup_file])),
+     options[:s3][:bucket] )
+ end
+
  end
  end
  end
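
For readers unfamiliar with the aws-s3 gem, the #connect and #transfer methods shown above boil down to the two calls sketched below. The credentials, key and file path are placeholder values, not defaults shipped with the gem:

  require 'aws/s3'

  # connect: open a connection with the credentials taken from options[:s3]
  AWS::S3::Base.establish_connection!(
    :access_key_id     => 'your-s3-id',
    :secret_access_key => 'your-s3-password'
  )

  # transfer: store the compressed backup under its file name in the configured bucket
  AWS::S3::S3Object.store(
    'assets.tar.gz',                             # options[:backup_file]
    open('/path/to/tmp/backups/assets.tar.gz'),  # File.join(backup_path, backup_file)
    'your-bucket-to-backup-to'                   # options[:s3][:bucket]
  )
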
@@ -6,7 +6,10 @@ module Backup
    super(options)
  end
 
- def store
+ # Initializes the transfer to the specified server using SSH.
+ # This will first ensure there is a directory, if there is not, a new one will be created
+ # After the directory has been confirmed, the transfer process will be initialized.
+ def transfer
    %x{ ssh #{options[:ssh][:user]}@#{options[:ssh][:ip]} mkdir -p #{options[:ssh][:path]} }
    %x{ scp #{File.join(options[:backup_path], options[:backup_file])} #{options[:ssh][:user]}@#{options[:ssh][:ip]}:#{options[:ssh][:path]} }
  end
@@ -6,6 +6,21 @@ module Backup
    setup_paths("db/#{self.class.name.downcase.gsub('::','-')}", options[:file].is_a?(Array) ? 'tar.gz' : 'gz')
  end
 
+ # Initialize the process
+ # Executing multiple processes
+ #
+ # - Command
+ #   Executes a command from a user to generate a SQL dump
+ # - Archive
+ #   Archives the specified folder to a .tar
+ # - Compress
+ #   Compresses the .tar file using Gzip
+ # - Transfer
+ #   Initializes the transfer to either S3 or using SSH
+ # - Remove Temp Files
+ #   Removes temporary files after the process is complete
+ # - Remove Original File
+ #   Removes the user generated sql files (unless the user specifies he wants to keep them)
  def run
    command
    archive
@@ -16,7 +31,10 @@ module Backup
  end
 
  private
-
+
+ # Allows a user to insert one or more commands to be executed
+ # before the actual archive, compress and transferring processes.
+ # The command takes either a String for a single command, and an Array for multiple commands.
  def command
    if options[:command].is_a?(Array)
      options[:command].each do |command|
@@ -27,6 +45,8 @@ module Backup
    end
  end
 
+ # Archives the assets into a .tar file and stores it
+ # inside the "Backup Path"
  def archive
    if options[:file].is_a?(Array)
      files = options[:file].map {|file| File.join(options[:path], file)}
@@ -36,6 +56,10 @@ module Backup
    end
  end
 
+ # If the user has bundled a couple of files to a .tar (by using an Array for the :file attribute)
+ # then it compresses the .tar file to .tar.gz and removes the old .tar file
+ # If the user has only a single file, it will be read out and a new file will be generated
+ # The old (single) file will remain until the process is complete, unless the user specifies otherwise.
  def compress
    if options[:file].is_a?(Array)
      %x{ gzip --best #{File.join(options[:backup_path], options[:backup_file])} }
@@ -44,6 +68,7 @@ module Backup
    end
  end
 
+ # Set default options
  def default_options
    { :path => "",
      :file => "",
@@ -6,6 +6,17 @@ module Backup
    setup_paths("db/#{self.class.name.downcase.gsub('::','-')}", :gz)
  end
 
+ # Initialize the process
+ # Executing multiple processes
+ #
+ # - Make MySQL Dump
+ #   Creates a MySQL dump based on the parameters provided by the user
+ # - Compress
+ #   Compresses the .tar file using Gzip
+ # - Transfer
+ #   Initializes the transfer to either S3 or using SSH
+ # - Remove Temp Files
+ #   Removes temporary files after the process is complete
  def run
    make_mysql_dump
    compress
@@ -14,16 +25,21 @@ module Backup
  end
 
  private
-
+
+ # Compresses the MySQL dump file and stores the compressed version inside the tmp/backups folder.
  def compress
    %x{ gzip -cv #{File.join(options[:path], options[:file])} --best > #{File.join(options[:backup_path], options[:backup_file])} }
  end
-
+
+ # This will generate a MySQL dump based on the options the user passed in.
+ # The MySQL dump will be placed (by default) in the config/db directory so it can be found
+ # by the compressor.
  def make_mysql_dump
    # => /usr/local/mysql/bin/mysqldump on Mac OS X 10.6
    %x{ mysqldump --quick -u #{options[:mysql][:user]} --password='#{options[:mysql][:password]}' #{options[:mysql][:database]} > #{File.join(options[:path], options[:file])} }
  end
-
+
+ # Set default options
  def default_options
    {:path => "#{RAILS_ROOT}/tmp/backups/db/#{self.class.name.downcase.gsub('::','-')}",
     :file => "production.sql",
@@ -6,6 +6,15 @@ module Backup
    setup_paths("db/#{self.class.name.downcase.gsub('::','-')}", :gz)
  end
 
+ # Initialize the process
+ # Executing multiple processes
+ #
+ # - Compress
+ #   Compresses the .tar file using Gzip
+ # - Transfer
+ #   Initializes the transfer to either S3 or using SSH
+ # - Remove Temp Files
+ #   Removes temporary files after the process is complete
  def run
    compress
    transfer
@@ -14,10 +23,12 @@ module Backup
 
  private
 
+ # Compresses the SQLite3file and stores the compressed version inside the tmp/backups folder.
  def compress
    %x{ gzip -cv #{File.join(options[:path], options[:file])} --best > #{File.join(options[:backup_path], options[:backup_file])} }
  end
 
+ # Set default options
  def default_options
    { :path => "#{RAILS_ROOT}/db",
      :file => "production.sqlite3" }
@@ -7,16 +7,22 @@ module Backup
  def initialize(options)
    super(default_options.merge(options))
 
+   # Creates a new instance of the Amazon S3 Wrapper Class/Object
+   # Passes in the options hash and lets the wrapper extract only the
+   # necessary information that is required to establish a link to Amazon S3.
    s3 = Backup::Connection::S3.new(options)
+
+   # Connects to Amazon S3 using the credentials provided and
+   # stored in the options has by the user
    s3.connect
-   s3.object.store(
-     options[:backup_file],
-     open(File.join(options[:backup_path], options[:backup_file])),
-     options[:s3][:bucket] )
+
+   # Initializes the file transfer to Amazon S3
+   s3.transfer
  end
 
  private
 
+ # Set default options
  def default_options
    {:s3 => {
      :access_key_id => '',
@@ -5,12 +5,18 @@ module Backup
  def initialize(options)
    super(default_options.merge(options))
 
+   # Creates a new instance of the SSH Wrapper Class/Object
+   # Passes in the options hash and lets the wrapper extract only the
+   # necessary information that is required to later transfer the specified file through SSH.
    ssh = Backup::Connection::SSH.new(options)
-   ssh.store
+
+   # Initializes the file transfer to the specified server through SSH.
+   ssh.transfer
  end
 
  private
 
+ # Set default options
  def default_options
    {:ssh => {
      :user => "",
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: backup
  version: !ruby/object:Gem::Version
-   version: 0.1.0
+   version: 0.2.0
  platform: ruby
  authors:
  - meskyanichi