sweety_backy 0.0.19 → 0.0.21

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,28 +1,30 @@
 PATH
   remote: .
   specs:
-    sweety_backy (0.0.15)
+    sweety_backy (0.0.19)
       aws-sdk
 
 GEM
   remote: http://rubygems.org/
   specs:
-    aws-sdk (1.1.1)
+    aws-sdk (1.7.1)
       httparty (~> 0.7)
       json (~> 1.4)
       nokogiri (>= 1.4.4)
       uuidtools (~> 2.1)
     chronic (0.6.2)
-    crack (0.1.8)
     delorean (1.1.0)
       chronic
-    httparty (0.7.8)
-      crack (= 0.1.8)
-    json (1.5.3)
+    httparty (0.9.0)
+      multi_json (~> 1.0)
+      multi_xml
+    json (1.7.5)
     mocha (0.9.12)
-    nokogiri (1.5.0)
+    multi_json (1.3.7)
+    multi_xml (0.5.1)
+    nokogiri (1.5.5)
     rake (0.9.2)
-    uuidtools (2.1.2)
+    uuidtools (2.1.3)
 
 PLATFORMS
   ruby
data/README.md CHANGED
@@ -15,11 +15,11 @@ Please take a look of other **Ruby backups gems**:
  ## How install
 
  gem install 'sweety_backy'
-
+
  ## How to use it
-
+
  sweety_backy <config_file>
-
+
  ### Config file
 
  It is a _yaml_ file with all this attributes
@@ -30,6 +30,7 @@ It is a _yaml_ file with all this attributes
  monthly: <quantity of monthly backups>
  weekly: <quantity of weekly backups>
  daily: <quantity of daily backups>
+ slices_size: <in MB, if present the compressed files will be sliced in pieces>
  database_user: <database user with read privileges of all datases>
  database_pass: <database user password>
  storage_system: { 's3' | 'local' }
@@ -37,16 +38,16 @@ It is a _yaml_ file with all this attributes
  path: <absoulte path to the root folder of the backups>
  s3_opts: (only if the storage_system is 's3')
  bucket: <bucket name>
- path: <bucket path where the backups will be stored><
+ path: <bucket path where the backups will be stored>
  passwd_file: <path to the S3 credentials>
-
+
  ### S3 credentials file
 
  It is a _yaml_ file with two keys with the S3 credentials:
 
  access_key_id: "XXX"
  secret_access_key: "YYY"
-
+
  ### Example
 
  #### S3 config example
@@ -56,7 +57,7 @@ It is a _yaml_ file with two keys with the S3 credentials:
  secret_access_key: "YYY"
 
 
- #### SweetyBacky config example
+ #### SweetyBacky config example
 
  # ~/.sweety_backy.conf
  paths: [ "/Users/fguillen/Develop/Brico", "/Users/fguillen/Develop/Arduino" ]
@@ -65,57 +66,86 @@ It is a _yaml_ file with two keys with the S3 credentials:
  monthly: 2
  weekly: 3
  daily: 4
+ slices_size: 100
  database_user: 'root'
  database_pass: ''
  storage_system: 's3'
- s3_opts:
+ s3_opts:
  bucket: 'sweety_backy'
  path: 'fguillen'
  passwd_file: '~/.s3.passwd'
-
+
  #### Execute
 
  sweety_backy ~/.sweety_backy.conf
-
+
  #### Result
 
  This will generate a bunch of backups in the _sweety_backy_ bucket like these ones:
 
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110626.weekly.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110703.weekly.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110704.daily.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110705.daily.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110706.daily.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110707.daily.tar.gz
-
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110626.weekly.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110703.weekly.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110704.daily.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110705.daily.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110706.daily.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110707.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110626.weekly.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110626.weekly.tar.gz.part_ab
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110626.weekly.tar.gz.part_ac
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110626.weekly.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110703.weekly.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110703.weekly.tar.gz.part_ab
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110703.weekly.tar.gz.part_ac
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110703.weekly.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110704.daily.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110704.daily.tar.gz.part_ab
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110704.daily.tar.gz.part_ac
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110704.daily.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110705.daily.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110705.daily.tar.gz.part_ab
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110705.daily.tar.gz.part_ac
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110705.daily.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110706.daily.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110706.daily.tar.gz.part_ab
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110706.daily.tar.gz.part_ac
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110706.daily.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110707.daily.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110707.daily.tar.gz.part_ab
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110707.daily.tar.gz.part_ac
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110707.daily.tar.gz.md5
+
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110626.weekly.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110626.weekly.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110703.weekly.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110703.weekly.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110704.daily.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110704.daily.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110705.daily.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110705.daily.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110706.daily.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110706.daily.tar.gz.md5
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110707.daily.tar.gz.part_aa
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110707.daily.tar.gz.md5
 
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110626.weekly.sql.tar.gz
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110703.weekly.sql.tar.gz
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110704.daily.sql.tar.gz
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110705.daily.sql.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110706.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110706.daily.sql.tar.gz
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110707.daily.sql.tar.gz
 
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110626.weekly.sql.tar.gz
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110703.weekly.sql.tar.gz
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110704.daily.sql.tar.gz
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110705.daily.sql.tar.gz
- https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110706.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110706.daily.sql.tar.gz
  https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110707.daily.sql.tar.gz
-
+
  ... and so on.
 
  ### Cron execution example
-
+
  # every day at 02:00 am
  00 02 * * * sweety_backy /home/fguillen/.sweety_backy.conf >> /var/log/sweety_backy.log 2>&1
 
+ ### State of development
+
+ I'm using it in my prouduction servers, but it can fails and damage your data.. use it at your own risk.
+
  ## License
 
  MIT License. (c) 2011 Fernando Guillen (http://fernandoguillen.info).
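The YAML config file described in the README above can be read with plain stdlib Ruby; a minimal sketch (the inline snippet and the key-symbolizing loop are illustrative, mirroring how the gem parses its S3 credentials file):

```ruby
require "yaml"

# Minimal sketch: load a SweetyBacky-style YAML config and symbolize its keys.
# The inline snippet mirrors the README example above.
yaml = <<-CONF
yearly: 1
monthly: 2
weekly: 3
daily: 4
slices_size: 100
storage_system: 's3'
CONF

opts = {}
YAML.load( yaml ).each { |key, value| opts[key.to_sym] = value }

opts[:slices_size]  # => 100
```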
@@ -4,23 +4,23 @@
  # sweety_backy /path/to/sweety_backy.conf
 
  begin
- require 'sweety_backy'
+ require "sweety_backy"
  rescue LoadError
- require 'rubygems'
- require 'sweety_backy'
+ require "rubygems"
+ require "sweety_backy"
  end
 
- require 'benchmark'
+ require "benchmark"
 
  if( ARGV[0].nil? )
  SweetyBacky::Utils.log "use: $ sweety_backy <config_file_path>"
  exit 1
  end
 
- lapsus_time =
+ lapsus_time =
  Benchmark.realtime do
  SweetyBacky::Utils.log "--------------------"
- SweetyBacky::Utils.log "Starting SweetyBacky"
+ SweetyBacky::Utils.log "Starting SweetyBacky v #{SweetyBacky::VERSION}"
  sb = SweetyBacky::Runner.new( ARGV[0] )
  sb.run
  end
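The executable above times the whole run with `Benchmark.realtime`; a minimal standalone sketch of that pattern (the `sleep` is a stand-in for the actual backup run):

```ruby
require "benchmark"

# Minimal sketch of the timing pattern used by the sweety_backy executable:
# wrap the work in Benchmark.realtime and log the elapsed seconds afterwards.
lapsus_time =
  Benchmark.realtime do
    sleep( 0.01 )  # stand-in for SweetyBacky::Runner#run
  end

puts "SweetyBacky done in #{lapsus_time} secs."
```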
@@ -1,11 +1,11 @@
  begin
- require 'aws-sdk'
+ require "aws-sdk"
  rescue LoadError
- require 'rubygems'
- require 'aws-sdk'
+ require "rubygems"
+ require "aws-sdk"
  end
 
- require 'digest/md5'
+ require "digest/md5"
 
  require "#{File.dirname(__FILE__)}/sweety_backy/version"
  require "#{File.dirname(__FILE__)}/sweety_backy/runner"
@@ -1,23 +1,23 @@
  module SweetyBacky
- module Commander
+ module Commander
  def self.do_files_backup( path, backup_path )
  SweetyBacky::Utils.log "doing files backup of #{path} to #{backup_path}"
-
+
  FileUtils.mkdir_p( File.dirname( backup_path ) )
  SweetyBacky::Utils::command( "tar -cz --directory #{path} --same-permissions --file #{backup_path} ." )
  end
-
+
  def self.do_database_backup( database_name, backup_path, opts )
  SweetyBacky::Utils.log "doing database backup #{database_name} on #{backup_path}"
-
+
  FileUtils.mkdir_p( File.dirname( backup_path ) )
  tmp_sql_file_path = File.join( Dir::tmpdir, "#{File.basename( backup_path, '.tar.gz' )}" )
-
+
  database_pass = opts[:database_pass].empty? ? '' : "-p'#{opts[:database_pass]}'"
 
  SweetyBacky::Utils::command( "mysqldump -u#{opts[:database_user]} #{database_pass} #{database_name} > #{tmp_sql_file_path}" )
  SweetyBacky::Utils::command( "tar -cz --same-permissions --file #{backup_path} --directory #{File.dirname(tmp_sql_file_path)} #{File.basename(tmp_sql_file_path)}" )
-
+
  File.delete( tmp_sql_file_path )
  end
 
@@ -25,43 +25,54 @@ module SweetyBacky
  clean_files( opts )
  clean_databases( opts )
  end
-
+
  def self.clean_files( opts )
  SweetyBacky::Utils.log "cleaning files on #{opts[:target_path]}/files/"
-
+
+ suffix = opts[:slices_size] ? ".part_*" : "" # suffix support in case of spliting activate
+ suffix_regex = opts[:slices_size] ? /\.part_.*/ : "" # suffix support in case of spliting activate
+
  opts[:paths].each do |path|
  SweetyBacky::Utils.log "cleaning file #{path}"
-
+
  [:yearly, :monthly, :weekly, :daily].each do |period|
- paths_in(
- "#{opts[:target_path]}/files/#{SweetyBacky::Utils.namerize( path )}.*.#{period.to_s}.tar.gz",
+ paths_in(
+ "#{opts[:target_path]}/files/#{SweetyBacky::Utils.namerize( path )}.*.#{period.to_s}.tar.gz#{suffix}",
  opts
- ).sort[0..(-1*(opts[period]+1))].each do |file_path|
- remove_path( file_path, opts )
- remove_path( "#{file_path}.md5", opts ) if exists?( "#{file_path}.md5", opts )
- end
+ ).map do |file_name|
+ file_name.match( "files\/#{SweetyBacky::Utils.namerize( path )}.(\\d{8}).#{period.to_s}.tar.gz#{suffix}" )[1]
+ end.uniq.sort[0..(-1*(opts[period]+1))].each do |date_to_remove|
+ paths_in(
+ "#{opts[:target_path]}/files/#{SweetyBacky::Utils.namerize( path )}.#{date_to_remove}.#{period.to_s}.tar.gz#{suffix}",
+ opts
+ ).each do |file_path|
+ Utils.log( "Removing file: #{file_path}" )
+ remove_path( file_path, opts )
+ remove_path( "#{file_path.gsub(suffix_regex, "")}.md5", opts ) if exists?( "#{file_path.gsub(suffix_regex, "")}.md5", opts )
+ end
+ end
  end
  end
  end
-
+
  def self.clean_databases( opts )
  SweetyBacky::Utils.log "cleaning databases on #{opts[:target_path]}/databases/"
-
+
  opts[:databases].each do |database_name|
  SweetyBacky::Utils.log "cleaning database #{database_name}"
-
+
  [:yearly, :monthly, :weekly, :daily].each do |period|
- paths_in(
+ paths_in(
  "#{opts[:target_path]}/databases/#{database_name}.*.#{period.to_s}.sql.tar.gz",
  opts
  ).sort[0..(-1*(opts[period]+1))].each do |file_path|
  remove_path( file_path, opts )
  remove_path( "#{file_path}.md5", opts ) if exists?( "#{file_path}.md5", opts )
- end
+ end
  end
  end
  end
-
+
  def self.exists?( path, opts )
  if( opts[:storage_system].to_sym == :s3 )
  return SweetyBacky::S3.exists?( path, opts[:s3_opts] )
@@ -69,7 +80,7 @@ module SweetyBacky
  return File.exists?( path )
  end
  end
-
+
  def self.paths_in( path, opts )
  if( opts[:storage_system].to_sym == :s3 )
  return SweetyBacky::S3.paths_in( path, opts[:s3_opts] )
@@ -77,7 +88,7 @@ module SweetyBacky
  return Dir.glob( path )
  end
  end
-
+
  def self.remove_path( path, opts )
  if( opts[:storage_system].to_sym == :s3 )
  SweetyBacky::Utils.log "cleaning: removing #{opts[:s3_opts][:bucket]}/#{path}"
@@ -87,19 +98,24 @@ module SweetyBacky
  File.delete( path )
  end
  end
-
+
  def self.do_md5( path, md5_path )
  digest = Digest::MD5.new();
-
+
  File.open( path, 'r' ) do |f|
  f.each_line { |line| digest << line }
  end
 
  result = digest.hexdigest
-
+
  File.open( md5_path, 'w' ) { |f| f.write result }
-
+
  return result
  end
+
+ def self.do_slices( file_path, size )
+ SweetyBacky::Utils::command( "split -b #{size}m #{file_path} #{file_path}.part_" )
+ File.delete( file_path )
+ end
  end
  end
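The retention logic in `clean_files` and `clean_databases` hinges on the slice `sort[0..(-1*(opts[period]+1))]`: with entries sorted oldest-first, it selects everything except the newest N, i.e. exactly the backups to delete. A standalone illustration with made-up dates:

```ruby
# The slice range used by clean_files/clean_databases: keep the newest `keep`
# entries, select the rest for removal.
dates = %w[20110626 20110703 20110704 20110705 20110706 20110707]
keep  = 4

to_remove = dates.sort[0..(-1 * (keep + 1))]
# => ["20110626", "20110703"]
```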
@@ -7,15 +7,15 @@ require File.dirname(__FILE__) + "/utils.rb"
  module SweetyBacky
  class Runner
  attr_reader :opts, :results
-
+
  def initialize( path = nil )
  if( !path.nil? )
  config( SweetyBacky::OptsReader.read_opts( path ) )
  end
-
+
  @results = []
  end
-
+
  def config( opts )
  @opts = {
  :paths => [],
@@ -26,7 +26,7 @@ module SweetyBacky
  :daily => 4,
  :storage_system => :local
  }.merge( opts )
-
+
  if( @opts[:storage_system].to_sym == :s3 )
  @opts[:working_path] = File.join( Dir::tmpdir, "sweety_backy_#{Time.now.to_i}" )
  @opts[:target_path] = @opts[:s3_opts][:path]
@@ -35,82 +35,83 @@ module SweetyBacky
  @opts[:target_path] = @opts[:local_opts][:path]
  end
  end
-
+
  def do_backup
  do_files_backup
  do_databases_backup
  end
-
+
  def do_files_backup
  @opts[:paths].each do |path|
  success = nil
  backup_path = "#{@opts[:working_path]}/files/#{SweetyBacky::Utils.namerize( path )}.#{Date.today.strftime('%Y%m%d')}.#{SweetyBacky::Utils.period}.tar.gz"
  md5_path = "#{backup_path}.md5"
-
+
  begin
  SweetyBacky::Commander.do_files_backup( path, backup_path )
  SweetyBacky::Commander.do_md5( backup_path, md5_path )
-
+ SweetyBacky::Commander.do_slices( backup_path, opts[:slices_size] ) if opts[:slices_size]
+
  if( @opts[:storage_system].to_sym == :s3 )
  upload_files_backup_to_s3( backup_path, md5_path )
  end
-
+
  success = true
-
+
  rescue Exception => e
  Utils.log( "ERROR: backing up file: '#{path}', e: #{e.message}" )
  Utils.log( e.backtrace.join("\n") )
-
+
  success = false
  end
-
+
  @results << { :name => "file: #{path}", :success => success }
  end
  end
-
+
  def do_databases_backup
  @opts[:databases].each do |database_name|
-
+
  success = nil
  backup_path = "#{@opts[:working_path]}/databases/#{database_name}.#{Date.today.strftime('%Y%m%d')}.#{SweetyBacky::Utils.period}.sql.tar.gz"
  md5_path = "#{backup_path}.md5"
-
+
  begin
  SweetyBacky::Commander.do_database_backup( database_name, backup_path, @opts)
  SweetyBacky::Commander.do_md5( backup_path, md5_path )
-
+
  if( @opts[:storage_system].to_sym == :s3 )
  upload_databases_backup_to_s3( backup_path, md5_path )
  end
-
+
  success = true
-
+
  rescue Exception => e
  Utils.log( "ERROR: backing up database: '#{database_name}', e: #{e.message}" )
  Utils.log( e.backtrace.join("\n") )
-
+
  success = false
  end
-
+
  @results << { :name => "database: #{database_name}", :success => success }
  end
  end
-
 
-
+
+
  def clean
  SweetyBacky::Commander.clean( @opts )
  end
-
+
  def print_results
  Utils.log( "RESULTS:" )
  Utils.log( "--------" )
-
+
  @results.each do |result|
  Utils.log( "#{result[:name]} -> #{result[:success] ? 'OK' : 'ERROR'}" )
  end
  end
-
+
  def run
  begin
  do_backup
@@ -122,40 +123,43 @@ module SweetyBacky
  SweetyBacky::Utils.log "I should send and email at this moment"
  end
  end
-
- private
-
+
  def upload_databases_backup_to_s3( backup_path, md5_path )
  SweetyBacky::S3.upload(
  backup_path,
  "#{@opts[:target_path]}/databases/#{File.basename( backup_path )}",
  @opts[:s3_opts]
  )
-
+
  SweetyBacky::S3.upload(
  md5_path,
  "#{@opts[:target_path]}/databases/#{File.basename( md5_path )}",
  @opts[:s3_opts]
  )
-
+
  FileUtils.rm backup_path
  FileUtils.rm md5_path
  end
-
+
  def upload_files_backup_to_s3( backup_path, md5_path )
- SweetyBacky::S3.upload(
- backup_path,
- "#{@opts[:target_path]}/files/#{File.basename( backup_path )}",
- @opts[:s3_opts]
- )
-
+ backup_paths = opts[:slices_size] ? Dir.glob( "#{backup_path}.part_*" ) : [backup_path]
+
+ backup_paths.each do |backup_path|
+ SweetyBacky::S3.upload(
+ backup_path,
+ "#{@opts[:target_path]}/files/#{File.basename( backup_path )}",
+ @opts[:s3_opts]
+ )
+
+ FileUtils.rm backup_path
+ end
+
  SweetyBacky::S3.upload(
  md5_path,
  "#{@opts[:target_path]}/files/#{File.basename( md5_path )}",
  @opts[:s3_opts]
  )
-
- FileUtils.rm backup_path
+
  FileUtils.rm md5_path
  end
 
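Reassembling a sliced backup is not something the gem itself does; the following is an illustrative helper, assuming only the naming produced by `do_slices` (split(1) suffixes `.part_aa`, `.part_ab`, ...) and the `.md5` file written by `do_md5`:

```ruby
require "digest/md5"

# Illustrative helper (not part of SweetyBacky): rebuild a sliced archive from
# its .part_* pieces and check it against the stored .md5. Sorting the glob
# restores the original order, since split(1) assigns suffixes alphabetically.
def reassemble_and_verify( base_path )
  File.open( base_path, "wb" ) do |out|
    Dir.glob( "#{base_path}.part_*" ).sort.each do |part|
      out.write( File.binread( part ) )
    end
  end

  Digest::MD5.file( base_path ).hexdigest == File.read( "#{base_path}.md5" )
end
```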
@@ -2,34 +2,34 @@ module SweetyBacky
  class S3
  def self.upload( path, s3_path, opts )
  SweetyBacky::Utils.log( "S3 uploading: #{path} to #{opts[:bucket]}/#{s3_path}" )
-
+
  s3 = AWS::S3.new( read_s3_password( opts[:passwd_file] ) )
  bucket = s3.buckets[ opts[:bucket] ]
-
+
  if !bucket.exists?
  bucket = s3.buckets.create( opts[:bucket] )
  end
 
- object = bucket.objects.create( s3_path )
+ object = bucket.objects[ s3_path ]
  object.write( :file => path )
  end
-
+
  def self.object( path, opts )
  s3 = AWS::S3.new( read_s3_password( opts[:passwd_file] ) )
  bucket = s3.buckets[ opts[:bucket] ]
  object = bucket.objects[ path ]
-
+
  object
  end
-
+
  def self.exists?( path, opts )
  return object( path, opts ).exists?
  end
-
+
  def self.paths_in( path, opts )
  s3 = AWS::S3.new( read_s3_password( opts[:passwd_file] ) )
  bucket = s3.buckets[ opts[:bucket] ]
-
+
  regex = Regexp.escape( path ).gsub('\*', '.*').gsub('\?', '.')
 
  objects = bucket.objects.select { |e| e.key =~ /^#{regex}$/ }
@@ -37,22 +37,22 @@ module SweetyBacky
 
  return paths
  end
-
+
  def self.read_s3_password( path )
  opts = YAML.load( File.read( File.expand_path path ) )
  new_opts = {}
-
+
  # symbolize keys
  opts.keys.each do |key|
  new_opts[key.to_sym] = opts[key]
  end
-
+
  return new_opts
  end
-
+
  def self.delete( path, opts )
  SweetyBacky::S3.object( path, opts ).delete
  end
-
+
  end
  end
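`S3.paths_in` above emulates filesystem globbing against S3 keys by escaping the pattern and mapping `*` and `?` to their regex equivalents; a standalone illustration with made-up keys (no S3 calls involved):

```ruby
# The glob-to-regex conversion used by S3.paths_in: escape the pattern, then
# turn the escaped wildcards back into regex operators.
pattern = "fguillen/files/*.daily.tar.gz"
regex   = Regexp.escape( pattern ).gsub('\*', '.*').gsub('\?', '.')

keys = [
  "fguillen/files/Brico.20110704.daily.tar.gz",
  "fguillen/files/Brico.20110703.weekly.tar.gz"
]

keys.select { |key| key =~ /^#{regex}$/ }
# => ["fguillen/files/Brico.20110704.daily.tar.gz"]
```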
@@ -1,3 +1,3 @@
  module SweetyBacky
- VERSION = "0.0.19"
+ VERSION = "0.0.21"
  end
@@ -12,13 +12,13 @@ Gem::Specification.new do |s|
  s.description = "Simple mechanism to configure and execute backups of folders and MySQL DBs and store them in local folder or S3 bucket"
 
  s.rubyforge_project = "SweetyBacky"
-
+
  s.add_development_dependency "bundler", ">= 1.0.0.rc.6"
  s.add_development_dependency "mocha"
  s.add_development_dependency "delorean"
  s.add_development_dependency "rake"
-
- s.add_dependency "aws-sdk"
+
+ s.add_dependency "aws-sdk", "1.7.1"
 
  s.files = `git ls-files`.split("\n")
  s.test_files = `git ls-files -- {test,spec,features}/*`.split("\n")
@@ -1,25 +1,25 @@
  require "#{File.dirname(__FILE__)}/test_helper"
 
  class CommanderTest < Test::Unit::TestCase
-
+
  def setup
  SweetyBacky::Utils.stubs(:log)
-
+
  # tmp dir
  @tmp_dir = File.join( Dir::tmpdir, "sweety_backy_#{Time.now.to_i}" )
  Dir.mkdir( @tmp_dir ) unless File.exists?(@tmp_dir)
  end
-
+
  def teardown
  FileUtils.rm_rf @tmp_dir if File.exists?(@tmp_dir)
  end
-
+
  def test_do_files_backup
- SweetyBacky::Commander.do_files_backup(
- "#{FIXTURES_PATH}/path",
+ SweetyBacky::Commander.do_files_backup(
+ "#{FIXTURES_PATH}/path",
  "#{@tmp_dir}/back.tar.gz"
  )
-
+
  result = %x(tar -tzvf #{@tmp_dir}/back.tar.gz)
 
  assert_match( "./", result )
@@ -27,23 +27,23 @@ class CommanderTest < Test::Unit::TestCase
  assert_match( "./a/", result )
  assert_match( "./b/file3.txt", result )
  end
-
+
  def test_do_databases_backup
- SweetyBacky::Commander.do_database_backup(
- "test",
+ SweetyBacky::Commander.do_database_backup(
+ "test",
  "#{@tmp_dir}/back.sql.tar.gz",
  {
- :database_user => "test",
+ :database_user => "test",
  :database_pass => ""
  }
  )
-
+
  result = %x(tar -tzvf #{@tmp_dir}/back.sql.tar.gz)
-
+
  assert_match( /\sback.sql$/, result )
  end
 
-
+
  def test_clean
  opts = {
  :paths => [ 'name1', 'name2' ],
@@ -59,10 +59,10 @@ class CommanderTest < Test::Unit::TestCase
  :working_path => @tmp_dir,
  :target_path => @tmp_dir
  }
-
+
  Dir.mkdir( "#{@tmp_dir}/files" ) unless File.exists?( "#{@tmp_dir}/files" )
  Dir.mkdir( "#{@tmp_dir}/databases" ) unless File.exists?( "#{@tmp_dir}/databases" )
-
+
  [
  'name1.20081231.yearly',
  'name1.20081232.yearly',
@@ -85,17 +85,18 @@ class CommanderTest < Test::Unit::TestCase
  'name2.20100726.daily'
  ].each do |file_part|
  File.open( "#{@tmp_dir}/files/#{file_part}.tar.gz", 'w' ) { |f| f.write 'wadus' }
+ File.open( "#{@tmp_dir}/files/#{file_part}.tar.gz.md5", 'w' ) { |f| f.write 'wadus' }
  File.open( "#{@tmp_dir}/databases/#{file_part}.sql.tar.gz", 'w' ) { |f| f.write 'wadus' }
  end
-
+
  # puts @tmp_dir
  # exit 1
-
+
  SweetyBacky::Commander.clean( opts )
-
+
  files_keeped = Dir.glob( "#{@tmp_dir}/files/*" ).join( "\n" )
  databases_keeped = Dir.glob( "#{@tmp_dir}/databases/*" ).join( "\n" )
-
+
  # files to keep
  [
  'name1.20081232.yearly',
@@ -113,9 +114,10 @@ class CommanderTest < Test::Unit::TestCase
  'name2.20100726.daily'
  ].each do |file_part|
  assert_match( "#{file_part}.tar.gz", files_keeped )
+ assert_match( "#{file_part}.tar.gz.md5", files_keeped )
  assert_match( "#{file_part}.sql.tar.gz", databases_keeped )
  end
-
+
  # files to deleted
  [
  'name1.20081231.yearly',
@@ -125,16 +127,109 @@ class CommanderTest < Test::Unit::TestCase
  'name2.20100721.daily'
  ].each do |file_part|
  assert_no_match( /#{file_part}.tar.gz/, files_keeped )
+ assert_no_match( /#{file_part}.tar.gz.md5/, files_keeped )
+ assert_no_match( /#{file_part}.sql.tar.gz/, databases_keeped )
+ end
+ end
+
+ def test_clean_with_part_extensions
+ opts = {
+ :paths => [ 'name1', 'name2' ],
+ :databases => [ 'name1', 'name2' ],
+ :yearly => 1,
+ :monthly => 2,
+ :weekly => 3,
+ :daily => 4,
+ :slices_size => 10,
+ :storage_system => :local,
+ :local_opts => {
+ :path => @tmp_dir
+ },
+ :working_path => @tmp_dir,
+ :target_path => @tmp_dir
+ }
+
+ Dir.mkdir( "#{@tmp_dir}/files" ) unless File.exists?( "#{@tmp_dir}/files" )
+ Dir.mkdir( "#{@tmp_dir}/databases" ) unless File.exists?( "#{@tmp_dir}/databases" )
+
+ [
+ 'name1.20081231.yearly',
+ 'name1.20081232.yearly',
+ 'name2.20091231.yearly',
+ 'name1.20100131.monthly',
+ 'name1.20100228.monthly',
+ 'name1.20100331.monthly',
+ 'name2.20100430.monthly',
+ 'name2.20100531.monthly',
+ 'name2.20100630.monthly',
+ 'name1.20100704.weekly',
+ 'name1.20100711.weekly',
+ 'name1.20100718.weekly',
+ 'name1.20100725.weekly',
+ 'name1.20100720.daily',
+ 'name2.20100721.daily',
+ 'name2.20100722.daily',
+ 'name2.20100723.daily',
+ 'name2.20100724.daily',
+ 'name2.20100726.daily'
+ ].each do |file_part|
+ File.open( "#{@tmp_dir}/files/#{file_part}.tar.gz.part_aa", 'w' ) { |f| f.write 'wadus' }
+ File.open( "#{@tmp_dir}/files/#{file_part}.tar.gz.part_ab", 'w' ) { |f| f.write 'wadus' }
+ File.open( "#{@tmp_dir}/files/#{file_part}.tar.gz.md5", 'w' ) { |f| f.write 'wadus' }
+ File.open( "#{@tmp_dir}/databases/#{file_part}.sql.tar.gz", 'w' ) { |f| f.write 'wadus' }
+ end
+
+ # puts @tmp_dir
+ # exit 1
+
+ SweetyBacky::Commander.clean( opts )
+
+ files_keeped = Dir.glob( "#{@tmp_dir}/files/*" ).join( "\n" )
+ databases_keeped = Dir.glob( "#{@tmp_dir}/databases/*" ).join( "\n" )
+
+ # files to keep
+ [
+ 'name1.20081232.yearly',
+ 'name2.20091231.yearly',
+ 'name1.20100228.monthly',
+ 'name1.20100331.monthly',
+ 'name2.20100531.monthly',
+ 'name2.20100630.monthly',
+ 'name1.20100718.weekly',
+ 'name1.20100725.weekly',
+ 'name1.20100720.daily',
+ 'name2.20100722.daily',
+ 'name2.20100723.daily',
+ 'name2.20100724.daily',
+ 'name2.20100726.daily'
+ ].each do |file_part|
+ assert_match( "#{file_part}.tar.gz.part_aa", files_keeped )
+ assert_match( "#{file_part}.tar.gz.part_ab", files_keeped )
+ assert_match( "#{file_part}.tar.gz.md5", files_keeped )
+ assert_match( "#{file_part}.sql.tar.gz", databases_keeped )
+ end
+
+ # files to deleted
+ [
+ 'name1.20081231.yearly',
+ 'name1.20100131.monthly',
+ 'name2.20100430.monthly',
+ 'name1.20100704.weekly',
+ 'name2.20100721.daily'
+ ].each do |file_part|
+ assert_no_match( /#{file_part}.tar.gz.part_aa/, files_keeped )
+ assert_no_match( /#{file_part}.tar.gz.part_ab/, files_keeped )
+ assert_no_match( /#{file_part}.tar.gz.md5/, files_keeped )
  assert_no_match( /#{file_part}.sql.tar.gz/, databases_keeped )
  end
  end
-
+
  def test_do_md5
  md5_path = "#{@tmp_dir}/#{Time.now.to_i}.md5"
  fixture_file_path = "#{FIXTURES_PATH}/file_for_md5.txt"
-
+
  md5 = SweetyBacky::Commander.do_md5( fixture_file_path, md5_path )
-
+
  assert_equal( '75fca8d37b0f7d75d11ccc8d255debe5', md5 )
  assert_equal( '75fca8d37b0f7d75d11ccc8d255debe5', File.read( md5_path ) )
  end
@@ -0,0 +1,18 @@
+ paths:
+ - "path1"
+ - "path2"
+ databases:
+ - "db1"
+ - "db2"
+ yearly: 1
+ monthly: 2
+ weekly: 3
+ daily: 4
+ slices_size: 10
+ database_user: 'database_user'
+ database_pass: 'database_pass'
+ storage_system: 's3'
+ s3_opts:
+ bucket: 'bucket_name'
+ path: 's3/path/path'
+ passwd_file: '/path/.s3.passwd'
@@ -0,0 +1 @@
+ wadus
@@ -1,14 +1,14 @@
  require "#{File.dirname(__FILE__)}/test_helper"

  class RunnerTest < Test::Unit::TestCase
-
+
  def setup
  SweetyBacky::Utils.stubs(:log)

  # tmp dir
  @tmp_dir = File.join( Dir::tmpdir, "sweety_backy_#{Time.now.to_i}" )
  Dir.mkdir( @tmp_dir ) unless File.exists?(@tmp_dir)
-
+
  # runner
  @opts = {
  :paths => [ "#{FIXTURES_PATH}/path" ],
@@ -24,28 +24,28 @@ class RunnerTest < Test::Unit::TestCase
  :path => @tmp_dir
  }
  }
-
+
  @runner = SweetyBacky::Runner.new
  @runner.config( @opts )
  end
-
+
  def teardown
  FileUtils.rm_rf @tmp_dir if File.exists?(@tmp_dir)
  end
-
+
  def test_do_backup_yearly
  Delorean.time_travel_to( '2009-12-31' ) do
  @runner.do_backup
  end
-
+
  assert( File.exists?( "#{@tmp_dir}/files/#{SweetyBacky::Utils.namerize( @opts[:paths][0] )}.20091231.yearly.tar.gz" ) )
  assert( File.exists?( "#{@tmp_dir}/databases/test.20091231.yearly.sql.tar.gz" ) )

  assert( File.exists?( "#{@tmp_dir}/files/#{SweetyBacky::Utils.namerize( @opts[:paths][0] )}.20091231.yearly.tar.gz.md5" ) )
  assert( File.exists?( "#{@tmp_dir}/databases/test.20091231.yearly.sql.tar.gz.md5" ) )
  end
-
- def test_do_backup_monthly
+
+ def test_do_backup_monthly
  Delorean.time_travel_to( '2010-01-31' ) do
  @runner.do_backup
  end
@@ -53,21 +53,21 @@ class RunnerTest < Test::Unit::TestCase
  assert( File.exists?( "#{@tmp_dir}/files/#{SweetyBacky::Utils.namerize( @opts[:paths][0] )}.20100131.monthly.tar.gz") )
  assert( File.exists?( "#{@tmp_dir}/databases/test.20100131.monthly.sql.tar.gz") )
  end
-
- def test_do_backup_weekly
+
+ def test_do_backup_weekly
  Delorean.time_travel_to( '2010-01-03' ) do
  @runner.do_backup
  end
-
+
  assert( File.exists?( "#{@tmp_dir}/files/#{SweetyBacky::Utils.namerize( @opts[:paths][0] )}.20100103.weekly.tar.gz") )
  assert( File.exists?( "#{@tmp_dir}/databases/test.20100103.weekly.sql.tar.gz") )
  end
-
+
  def test_do_backup_daily
  Delorean.time_travel_to( '2010-01-04' ) do
  @runner.do_backup
  end
-
+
  assert( File.exists?( "#{@tmp_dir}/files/#{SweetyBacky::Utils.namerize( @opts[:paths][0] )}.20100104.daily.tar.gz") )
  assert( File.exists?( "#{@tmp_dir}/databases/test.20100104.daily.sql.tar.gz") )
  end
@@ -76,22 +76,22 @@ class RunnerTest < Test::Unit::TestCase
  @runner.expects(:do_backup)
  @runner.expects(:print_results)
  SweetyBacky::Commander.expects(:clean)
-
+
  @runner.run
  end
-
+
  def test_initialize_with_config_file
- SweetyBacky::OptsReader.expects( :read_opts ).with( '/path/config.yml' ).returns(
- {
+ SweetyBacky::OptsReader.expects( :read_opts ).with( '/path/config.yml' ).returns(
+ {
  :paths => [ 'pepe', 'juan' ],
  :local_opts => {
  :path => '/local/path'
  }
  }
  )
-
+
  runner = SweetyBacky::Runner.new( "/path/config.yml" )
-
+
  assert_equal( [ "pepe", "juan" ], runner.opts[:paths] )
  assert_equal( [], runner.opts[:databases] )
  assert_equal( 1, runner.opts[:yearly] )
@@ -103,6 +103,46 @@ class RunnerTest < Test::Unit::TestCase
  assert_equal( '/local/path', runner.opts[:working_path] )
  assert_equal( '/local/path', runner.opts[:target_path] )
  end
-
+
+ def test_upload_files_backup_to_s3_with_slices
+ @runner.opts[:slices_size] = 10
+ @runner.opts[:s3_opts] = "s3_opts"
+
+ sequence_uploads = sequence( "uploads" )
+ sequence_removes = sequence( "removes" )
+
+ SweetyBacky::S3.expects(:upload).with(
+ "#{FIXTURES_PATH}/slices/file1.txt.part_aa",
+ "#{@tmp_dir}/files/file1.txt.part_aa",
+ @runner.opts[:s3_opts]
+ ).in_sequence( sequence_uploads )
+
+
+ SweetyBacky::S3.expects(:upload).with(
+ "#{FIXTURES_PATH}/slices/file1.txt.part_ab",
+ "#{@tmp_dir}/files/file1.txt.part_ab",
+ @runner.opts[:s3_opts]
+ ).in_sequence( sequence_uploads )
+
+ SweetyBacky::S3.expects(:upload).with(
+ "#{FIXTURES_PATH}/slices/file1.txt.md5",
+ "#{@tmp_dir}/files/file1.txt.md5",
+ @runner.opts[:s3_opts]
+ ).in_sequence( sequence_uploads )
+
+ FileUtils.expects(:rm).with("#{FIXTURES_PATH}/slices/file1.txt.part_aa").in_sequence( sequence_removes )
+ FileUtils.expects(:rm).with("#{FIXTURES_PATH}/slices/file1.txt.part_ab").in_sequence( sequence_removes )
+ FileUtils.expects(:rm).with("#{FIXTURES_PATH}/slices/file1.txt.md5").in_sequence( sequence_removes )
+
+ @runner.upload_files_backup_to_s3(
+ "#{FIXTURES_PATH}/slices/file1.txt",
+ "#{FIXTURES_PATH}/slices/file1.txt.md5"
+ )
+
+
+
+ # FileUtils.expects(:rm).with()
+ end
+
  end

@@ -10,57 +10,57 @@ class S3Test < Test::Unit::TestCase
  :path => 'test_path',
  :passwd_file => '~/.s3.passwd'
  }
-
+
  s3 = AWS::S3.new( SweetyBacky::S3.read_s3_password( @opts[:passwd_file] ) )
  @bucket = s3.buckets.create( @opts[:bucket] )
  end
-
+
  def teardown
  @bucket.delete!
  end
-
+
  def test_upload
  SweetyBacky::S3.upload(
  "#{FIXTURES_PATH}/file.txt",
  "test/path/file.txt",
  @opts
  )
-
+
  assert_equal(
  File.read( "#{FIXTURES_PATH}/file.txt" ),
  SweetyBacky::S3.object( "test/path/file.txt", @opts ).read
  )
  end
-
+
  def test_paths_in
  SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file1.txt", @opts )
  SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file2.txt", @opts )
  SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file3.txt", @opts )
  SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/other_file.txt", @opts )
-
+
  paths = SweetyBacky::S3.paths_in( "test/path/file*.txt", @opts )
-
+
  assert_equal(3, paths.size)
  assert( ( paths.include? "test/path/file1.txt" ) )
  assert( ( paths.include? "test/path/file2.txt" ) )
  assert( ( paths.include? "test/path/file3.txt" ) )
  assert( ( !paths.include? "test/path/other_file.txt" ) )
  end
-
+
  def test_delete
  SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file1.txt", @opts )
  SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file2.txt", @opts )
-
+
  SweetyBacky::S3.delete( "test/path/file2.txt", @opts )
-
+
  assert( @bucket.objects[ "test/path/file1.txt" ].exists? )
  assert( !@bucket.objects[ "test/path/file2.txt" ].exists? )
  end
-
+
  def test_exists
  SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file1.txt", @opts )
  assert_equal( true, SweetyBacky::S3.exists?( "test/path/file1.txt", @opts ) )
  assert_equal( false, SweetyBacky::S3.exists?( "test/path/file2.txt", @opts ) )
  end
-
+
  end
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: sweety_backy
  version: !ruby/object:Gem::Version
- hash: 57
- prerelease: false
+ hash: 53
+ prerelease:
  segments:
  - 0
  - 0
- - 19
- version: 0.0.19
+ - 21
+ version: 0.0.21
  platform: ruby
  authors:
  - Fernando Guillen
@@ -15,8 +15,7 @@ autorequire:
  bindir: bin
  cert_chain: []

- date: 2011-10-04 00:00:00 +02:00
- default_executable:
+ date: 2012-11-24 00:00:00 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: bundler
@@ -26,7 +25,7 @@ dependencies:
  requirements:
  - - ">="
  - !ruby/object:Gem::Version
- hash: 15424057
+ hash: -2749782956
  segments:
  - 1
  - 0
@@ -84,12 +83,14 @@ dependencies:
  requirement: &id005 !ruby/object:Gem::Requirement
  none: false
  requirements:
- - - ">="
+ - - "="
  - !ruby/object:Gem::Version
- hash: 3
+ hash: 9
  segments:
- - 0
- version: "0"
+ - 1
+ - 7
+ - 1
+ version: 1.7.1
  type: :runtime
  version_requirements: *id005
  description: Simple mechanism to configure and execute backups of folders and MySQL DBs and store them in local folder or S3 bucket
@@ -119,12 +120,16 @@ files:
  - sweety_backy.gemspec
  - test/commander_test.rb
  - test/fixtures/config_s3.yml
+ - test/fixtures/config_s3_slices.yml
  - test/fixtures/file.txt
  - test/fixtures/file_for_md5.txt
  - test/fixtures/path/a/file2.txt
  - test/fixtures/path/b/file3.txt
  - test/fixtures/path/file1.txt
  - test/fixtures/s3.passwd
+ - test/fixtures/slices/file1.txt.md5
+ - test/fixtures/slices/file1.txt.part_aa
+ - test/fixtures/slices/file1.txt.part_ab
  - test/opts_reader_test.rb
  - test/runner_test.rb
  - test/s3/commander_s3_test.rb
@@ -132,7 +137,6 @@ files:
  - test/s3/s3_test.rb
  - test/test_helper.rb
  - test/utils_test.rb
- has_rdoc: true
  homepage: https://github.com/fguillen/SweetyBacky
  licenses: []

@@ -162,19 +166,23 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  requirements: []

  rubyforge_project: SweetyBacky
- rubygems_version: 1.3.7
+ rubygems_version: 1.8.15
  signing_key:
  specification_version: 3
  summary: Ruby backup mechanism
  test_files:
  - test/commander_test.rb
  - test/fixtures/config_s3.yml
+ - test/fixtures/config_s3_slices.yml
  - test/fixtures/file.txt
  - test/fixtures/file_for_md5.txt
  - test/fixtures/path/a/file2.txt
  - test/fixtures/path/b/file3.txt
  - test/fixtures/path/file1.txt
  - test/fixtures/s3.passwd
+ - test/fixtures/slices/file1.txt.md5
+ - test/fixtures/slices/file1.txt.part_aa
+ - test/fixtures/slices/file1.txt.part_ab
  - test/opts_reader_test.rb
  - test/runner_test.rb
  - test/s3/commander_s3_test.rb