s3_rotate 1.0.1 → 1.1.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 16f04d3b048eedc89e3a4efaacdb46009bda296f842ace1d1a0c56f87bb88158
- data.tar.gz: df1fd512ebb22a0800ea822364e76f69c0a5409d89e90b95ee7731dcd81f0bed
+ metadata.gz: bb5f480435edb48a8d0d8195774fd5280d988ab2c219351a9fa18aba4ff5f89e
+ data.tar.gz: 6da33cabbbd97c8b5256fcb4f0cbd282606745676b6fbaecb746a6b2868925a5
  SHA512:
- metadata.gz: 89caf936d757c6d2afd3bc2eba65361e9d79ed1547b1fed1f51317925af7ddd458dce94cdd6a4402cf07d35889bb3be105c047dbdc6c690bf9f3c71fb715a26e
- data.tar.gz: 30e6a7117c71466ff2bc99dafe37d97edfc574d945c30c6b4886d06f2644a2dae68db59f23d1f6dc11c3e7f32928927ad11182a24e69f1e0749203aee090cc62
+ metadata.gz: c11dd83c9a0aa17194542d8e398af2f992d64a9c9b1d4a1182fbda65ba5aa67a2bcf368753d2f70e55f2e4827d7de6c54ba0db7457bb50a2b027a9c24ca04574
+ data.tar.gz: 7375433ebc482f99592e8e1b241c4d479c72f2e54ef03dfde53d4eba2c078f703dfe1437dc68cf05310e06ae99e99c82faee57a0ac99e19a2a868689a324afba
data/LICENSE CHANGED
@@ -1,6 +1,6 @@
  MIT License

- Copyright (c) 2019 Whova, Inc.
+ Copyright (c) 2019 Simon Ninon

  Permission is hereby granted, free of charge, to any person obtaining a copy
  of this software and associated documentation files (the "Software"), to deal
data/README.md CHANGED
@@ -147,10 +147,11 @@ Prototype:
  # @param backup_name String containing the name of the backup to upload
  # @param local_backups_path String containing the path to the directory containing the backups
  # @param date_regex Regex returning the date contained in the filename of each backup
+ # @param date_format Format to be used by DateTime.strptime to parse the extracted date
  #
  # @return nothing
  #
- def upload(backup_name, local_backups_path, date_regex=/\d{4}-\d{2}-\d{2}/)
+ def upload(backup_name, local_backups_path, date_regex=/\d{4}-\d{2}-\d{2}/, date_format="%Y-%m-%d")
  ```

  Example:
@@ -165,14 +166,11 @@ backup_manager.upload("defect-dojo-backup", "/var/opt/defect-dojo/backups")
  - `backup_name`: This is how you want to name your group of backup. This will be used to create a directory on the AWS S3 bucket under which your backup will be stored. It can be anything you want
  - `local_backups_path`: This is the path to the local directory containing your backups to be uploaded
  - `date_regex`: To rotate backups from daily to weekly, and from weekly to monthly, `S3 Backup` needs to determine which date is related to each backup file. This is done by extracting the date information from the filename, using a regex specified in `date_regex`.
+ - `date_format`: This complements `date_regex` and gives the format to be used to convert the string matched by `date_regex` into a `Date` object using `DateTime.strptime`.

- `date_regex` is the most important part here: without it, `S3 Rotate` does not know when your backup was generated and can not rotate your backups properly. Here are some examples of regex:
- - if your backups are like `1578804325_2020_01_11_12.6.2_gitlab_backup.tar`, you can use `date_regex=/\d{4}-\d{2}-\d{2}/` (this will match `2020_01_11_12`)
- - if your backups are like `1578804325_gitlab_backup.tar`, you can use `date_regex=/(\d+)_gitlab_backup.tar/` (this will match `1578804325`)
-
- As of now, `date_regex` can be:
- - any string that can be parsed by `Date.parse`
- - a timestamp
+ `date_regex` & `date_format` are the most important parts here: without them, `S3 Rotate` does not know when your backup was generated and cannot rotate your backups properly. Here are some examples:
+ - if your backups are like `1578804325_2020_01_11_12.6.2_gitlab_backup.tar`, you can use `date_regex=/\d{4}_\d{2}_\d{2}/` (this will match `2020_01_11`) & `date_format="%Y_%m_%d"`
+ - if your backups are like `1578804325_gitlab_backup.tar`, you can use `date_regex=/(\d+)_gitlab_backup.tar/` (this will match `1578804325`) & `date_format="%s"`

  ### S3Rotate::BackupManager.rotate
  Prototype:
@@ -228,7 +226,7 @@ $> bundle exec rspec .


  ## Author
- [Simon Ninon](https://github.com/Cylix), for [Whova](https://whova.com)
+ [Simon Ninon](https://github.com/Cylix)


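To make the new `date_regex` / `date_format` pair from the README hunks above concrete, here is a minimal, self-contained sketch of how a backup filename ends up as a `Date`. The filename is invented for the example, and the snippet inlines what `FileUtils.date_from_filename` does in the hunks further down, so treat it as illustrative rather than the gem's exact code path.

```ruby
require 'date'

# hypothetical backup filename, matching the README example
filename    = "1578804325_2020_01_11_12.6.2_gitlab_backup.tar"
date_regex  = /\d{4}_\d{2}_\d{2}/   # extracts the date portion of the filename
date_format = "%Y_%m_%d"            # tells DateTime.strptime how to read that portion

match = filename.match(date_regex)                          # => "2020_01_11"
date  = DateTime.strptime(match.to_s, date_format).to_date  # => #<Date: 2020-01-11>
puts date
```

With `date_format="%s"` the same call parses a plain Unix timestamp instead, which is what replaces the old `Date.parse`-or-timestamp guesswork.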
@@ -1,7 +1,33 @@
- #!/usr/bin/env ruby
-
  require 's3_rotate'

- backup_manager = S3Rotate::BackupManager.new('aws_access_key_id', 'aws_secret_access_key', 'bucket_name', 'region')
- backup_manager.upload("defect-dojo-backup", "/var/opt/defect-dojo/backups")
- backup_manager.rotate("defect-dojo-backup", "/var/opt/defect-dojo/backups")
+ # Query the environment
+ aws_access_key_id = ENV["AWS_ACCESS_KEY_ID"]
+ aws_access_key_secret = ENV["AWS_ACCESS_KEY_SECRET"]
+ bucket_name = ENV["BUCKET_NAME"]
+ region = ENV["REGION"]
+
+ if not aws_access_key_id or not aws_access_key_secret or not bucket_name or not region
+ puts "Misconfigured Environment: Please make sure to set the following Environment Variables"
+ puts " - AWS_ACCESS_KEY_ID"
+ puts " - AWS_ACCESS_KEY_SECRET"
+ puts " - BUCKET_NAME"
+ puts " - REGION"
+ exit -1
+ end
+
+ # Init S3 Rotate
+ backup_manager = S3Rotate::BackupManager.new(aws_access_key_id, aws_access_key_secret, bucket_name, region)
+
+ # Upload backups to S3
+ backup_manager.upload("backup-dojo", "/data/dojo", date_regex=/\d{4}_\d{2}_\d{2}/, date_format="%Y_%m_%d")
+ backup_manager.upload("backup-gitlab", "/data/gitlab/app-backup", date_regex=/\d{4}_\d{2}_\d{2}/, date_format="%Y_%m_%d")
+ backup_manager.upload("backup-splunk", "/data/splunk/backup", date_regex=/\d{4}_\d{2}_\d{2}/, date_format="%Y_%m_%d")
+ backup_manager.upload("backup-taiga", "/data/taiga", date_regex=/\d{4}_\d{2}_\d{2}/, date_format="%Y_%m_%d")
+ backup_manager.upload("backup-trac", "/data/trac/backup", date_regex=/\d{4}_\d{2}_\d{2}/, date_format="%Y_%m_%d")
+
+ # Rotate backups
+ backup_manager.rotate("backup-dojo", "/data/dojo", max_local=3, max_daily=14, max_weekly=8, max_monthly=6)
+ backup_manager.rotate("backup-gitlab", "/data/gitlab/app-backup", max_local=3, max_daily=14, max_weekly=8, max_monthly=6)
+ backup_manager.rotate("backup-splunk", "/data/splunk/backup", max_local=3, max_daily=14, max_weekly=8, max_monthly=6)
+ backup_manager.rotate("backup-taiga", "/data/taiga", max_local=3, max_daily=14, max_weekly=8, max_monthly=6)
+ backup_manager.rotate("backup-trac", "/data/trac/backup", max_local=3, max_daily=14, max_weekly=8, max_monthly=6)
@@ -1,9 +1,15 @@
  require 'fog-aws'
+ require 'logger'
+
+ require 's3_rotate/utils/logging'

  module S3Rotate

  class S3Client

+ # logger
+ include Logging
+
  # attributes
  attr_accessor :access_key
  attr_accessor :access_secret
@@ -84,10 +90,28 @@ module S3Rotate
  # @return created S3 Bucket File
  #
  def upload(backup_name, backup_date, type, extension, data)
+ logger.info("uploading /#{backup_name}/#{type}/#{backup_date.to_s}#{extension}")
+
  # 104857600 bytes => 100 megabytes
  bucket.files.create(key: "/#{backup_name}/#{type}/#{backup_date.to_s}#{extension}", body: data, multipart_chunk_size: 104857600)
  end

+ #
+ # Copy an existing file on AWS S3
+ #
+ # @param backup_name String containing the name of the backup being updated
+ # @param file S3 File, file to be copied
+ # @param type String representing the type the backup is being copied to, one of "daily", "weekly" or "monthly"
+ #
+ # @return created S3 Bucket File
+ #
+ def copy(backup_name, file, type)
+ logger.info("copying #{file.key} to /#{backup_name}/#{type}/#{file.key.split('/').last}")
+
+ # 104857600 bytes => 100 megabytes
+ file.copy(@bucket_name, "/#{backup_name}/#{type}/#{file.key.split('/').last}")
+ end
+
  end

  end
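For context, a rough usage sketch of the new `copy` helper added above. The bucket name, the object key, and the assumption that `S3Client.new` takes the same four arguments as `BackupManager.new` are illustrative rather than taken from the gem's docs; the `connection` accessor is the one the specs further down rely on.

```ruby
require 's3_rotate'

# illustrative credentials and names only
client = S3Rotate::S3Client.new('aws_access_key_id', 'aws_secret_access_key', 'bucket_name', 'region')

# fetch an existing daily backup object through the underlying fog-aws connection
daily = client.connection.directories.get('bucket_name').files.get('/backup-dojo/daily/2020-01-11.tgz')

# copy it into the weekly tier; the original daily object is left in place
client.copy('backup-dojo', daily, 'weekly') if daily
```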
@@ -34,11 +34,12 @@ module S3Rotate
  # @param backup_name String containing the name of the backup to upload
  # @param local_backups_path String containing the path to the directory containing the backups
  # @param date_regex Regex returning the date contained in the filename of each backup
+ # @param date_format Format to be used by DateTime.strptime to parse the extracted date
  #
  # @return nothing
  #
- def upload(backup_name, local_backups_path, date_regex=/\d{4}-\d{2}-\d{2}/)
- @uploader.upload(backup_name, local_backups_path, date_regex)
+ def upload(backup_name, local_backups_path, date_regex=/\d{4}-\d{2}-\d{2}/, date_format="%Y-%m-%d")
+ @uploader.upload(backup_name, local_backups_path, date_regex, date_format)
  end

  #
@@ -1,5 +1,6 @@
  # s3_rotate
  require 's3_rotate/utils/file_utils'
+ require 's3_rotate/utils/logging'

  module S3Rotate

@@ -9,6 +10,9 @@ module S3Rotate
  #
  class BackupRotator

+ # logger
+ include Logging
+
  # attributes
  attr_accessor :s3_client

@@ -69,13 +73,18 @@ module S3Rotate
  daily_backups.each do |backup|
  # promote to weekly if applicable
  if should_promote_daily_to_weekly?(backup.key, recent_weekly_file)
- recent_weekly_file = promote(backup_name, backup.key, backup.body, "weekly").key
+ recent_weekly_file = promote(backup_name, backup, "weekly").key
  end
  end

  # cleanup old files
  if daily_backups.length > max_daily
- daily_backups.each_with_index { |backup, i| backup.destroy if i < daily_backups.length - max_daily }
+ daily_backups.each_with_index do |backup, i|
+ if i < daily_backups.length - max_daily
+ logger.info("removing #{backup.key}")
+ backup.destroy
+ end
+ end
  end
  end

@@ -106,13 +115,18 @@ module S3Rotate
  weekly_backups.each do |backup|
  # promote to monthly if applicable
  if should_promote_weekly_to_monthly?(backup.key, recent_monthly_file)
- recent_monthly_file = promote(backup_name, backup.key, backup.body, "monthly").key
+ recent_monthly_file = promote(backup_name, backup, "monthly").key
  end
  end

  # cleanup old files
  if weekly_backups.length > max_weekly
- weekly_backups.each_with_index { |backup, i| backup.destroy if i < weekly_backups.length - max_weekly }
+ weekly_backups.each_with_index do |backup, i|
+ if i < weekly_backups.length - max_weekly
+ logger.info("removing #{backup.key}")
+ backup.destroy
+ end
+ end
  end
  end

@@ -132,7 +146,12 @@ module S3Rotate

  # cleanup old files
  if monthly_backups.length > max_monthly
- monthly_backups.each_with_index { |backup, i| backup.destroy if i < monthly_backups.length - max_monthly }
+ monthly_backups.each_with_index do |backup, i|
+ if i < monthly_backups.length - max_monthly
+ logger.info("removing #{backup.key}")
+ backup.destroy
+ end
+ end
  end
  end

@@ -152,7 +171,10 @@ module S3Rotate

  # cleanup old files
  if local_backups.length > max_local
- local_backups[0..(local_backups.length - max_local - 1)].each { |backup| File.delete("#{local_backups_path}/#{backup}") }
+ local_backups[0..(local_backups.length - max_local - 1)].each do |backup|
+ logger.info("removing #{local_backups_path}/#{backup}")
+ File.delete("#{local_backups_path}/#{backup}")
+ end
  end
  end

@@ -182,7 +204,7 @@ module S3Rotate
  end

  # perform date comparison
- return (date_daily_file - date_weekly_file).abs >= 7
+ return date_daily_file - date_weekly_file >= 7
  end

  #
@@ -215,23 +237,17 @@ module S3Rotate
  end

  #
- # Promote a daily backup into a weekly backup
+ # Promote a backup into a different type of backup (for example, daily into weekly)
  # This operation keeps the original daily file, and creates a new weekly backup
  #
  # @param backup_name String containing the name of the backup being updated
- # @param filename String, filename of the backup you want to promote
- # @param body String, body of the file you want to promote
- # @param type String representing the type of backup being uploaded, one of "daily", "weekly" or "monthly"
+ # @param file S3 File, file to be promoted
+ # @param type String representing the type the backup is being promoted into, one of "daily", "weekly" or "monthly"
  #
  # @return created S3 Bucket File
  #
- def promote(backup_name, filename, body, type)
- # parse the date & extension
- backup_date = FileUtils::date_from_filename(filename)
- backup_extension = FileUtils::extension_from_filename(filename)
-
- # upload
- @s3_client.upload(backup_name, backup_date, type, backup_extension, body)
+ def promote(backup_name, file, type)
+ @s3_client.copy(backup_name, file, type)
  end

  end
@@ -31,17 +31,18 @@ module S3Rotate
  # @param backup_name String containing the name of the backup to upload
  # @param local_backups_path String containing the path to the directory containing the backups
  # @param date_regex Regex returning the date contained in the filename of each backup
+ # @param date_format Format to be used by DateTime.strptime to parse the extracted date
  #
  # @return nothing
  #
- def upload(backup_name, local_backups_path, date_regex=/\d{4}-\d{2}-\d{2}/)
+ def upload(backup_name, local_backups_path, date_regex=/\d{4}-\d{2}-\d{2}/, date_format="%Y-%m-%d")
  # get backup files
  local_backups = FileUtils::files_in_directory(local_backups_path).reverse

  # upload local backups until we find one backup already uploaded
  local_backups.each do |local_backup|
  # parse the date & extension
- backup_date = FileUtils::date_from_filename(local_backup, date_regex)
+ backup_date = FileUtils::date_from_filename(local_backup, date_regex, date_format)
  backup_extension = FileUtils::extension_from_filename(local_backup)

  # skip invalid files
@@ -51,7 +52,7 @@ module S3Rotate
  break if @s3_client.exists?(backup_name, backup_date, "daily", extension=backup_extension)

  # upload file
- @s3_client.upload_local_backup_to_s3(backup_name, backup_date, "daily", backup_extension, File.open(local_backup))
+ @s3_client.upload(backup_name, backup_date, "daily", backup_extension, File.open("#{local_backups_path}/#{local_backup}"))
  end
  end

@@ -10,10 +10,11 @@ module S3Rotate
  #
  # @param filename String containing the filename to be parsed.
  # @param date_regex Regex returning the date contained in the filename
+ # @param date_format Format to be used by DateTime.strptime to parse the extracted date
  #
  # @return Date instance, representing the parsed date
  #
- def FileUtils.date_from_filename(filename, date_regex=/\d{4}-\d{2}-\d{2}/)
+ def FileUtils.date_from_filename(filename, date_regex=/\d{4}-\d{2}-\d{2}/, date_format="%Y-%m-%d")
  # match the date in the filename
  match = filename.match(date_regex)

@@ -30,14 +31,9 @@ module S3Rotate

  # regular date
  begin
- if date_str.include?("-")
- Date.parse(date_str)
- # timestamp
- else
- DateTime.strptime(date_str, "%s").to_date
- end
+ DateTime.strptime(date_str, date_format).to_date
  rescue
- raise "Date format not supported"
+ raise "Invalid date_format"
  end
  end

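Since all parsing now goes through `DateTime.strptime`, the behaviour of the rewritten branch above can be summarised with a couple of standalone calls (the values are only examples; the `1580098737` timestamp is the one used in the spec changes below):

```ruby
require 'date'

DateTime.strptime("2020-01-11", "%Y-%m-%d").to_date  # => #<Date: 2020-01-11>
DateTime.strptime("1580098737", "%s").to_date        # => #<Date: 2020-01-27>

# a mismatched format raises ArgumentError, which date_from_filename rescues
# and re-raises as "Invalid date_format"
DateTime.strptime("11/01/2020", "%Y-%m-%d") rescue puts "Invalid date_format"
```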
@@ -0,0 +1,19 @@
+ module S3Rotate
+
+ module Logging
+
+ def logger
+ Logging.logger
+ end
+
+ def self.logger
+ @logger ||= Logger.new(STDOUT)
+ end
+
+ def self.level(level)
+ logger.level = level
+ end
+
+ end
+
+ end
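A small usage sketch of the new `Logging` mixin added above (the class name is made up for illustration; the level call mirrors what the spec helper below does with `Logger::ERROR`):

```ruby
require 'logger'
require 's3_rotate'

class ExampleTask
  include S3Rotate::Logging   # gives instances a `logger` method backed by a shared STDOUT logger

  def run
    logger.info("starting example task")
  end
end

S3Rotate::Logging.level(Logger::WARN)   # raise the threshold so INFO lines are suppressed
ExampleTask.new.run
```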
@@ -7,7 +7,7 @@ Gem::Specification.new do |s|
  s.homepage = 'https://github.com/Whova/s3_rotate'
  s.license = 'MIT'

- s.version = '1.0.1'
+ s.version = '1.1.0'
  s.date = Date.today.to_s

  s.authors = ["Simon Ninon"]
@@ -125,4 +125,25 @@ describe S3Rotate::S3Client do

  end

+ describe '#copy' do
+
+ it 'copies backup' do
+ # mock data
+ file = @client.connection.directories.get('bucket').files.create(key: '/backup_name/daily/2020-01-12.tgz', body: 'some data')
+
+ # perform test
+ @client.copy('backup_name', file, 'weekly')
+
+ # verify result
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files.length).to eq 1
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files[0].key).to eq '/backup_name/weekly/2020-01-12.tgz'
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files[0].body).to eq 'some data'
+
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/daily').files.length).to eq 1
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/daily').files[0].key).to eq '/backup_name/daily/2020-01-12.tgz'
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/daily').files[0].body).to eq 'some data'
+ end
+
+ end
+
  end
@@ -33,10 +33,10 @@ describe S3Rotate::BackupManager do

  it 'calls uploader.upload' do
  # mock
- allow(@manager.uploader).to receive(:upload).with('backup_name', '/path/to/dir', /\d{4}-\d{2}-\d{2}/).and_return('upload_result')
+ allow(@manager.uploader).to receive(:upload).with('backup_name', '/path/to/dir', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d").and_return('upload_result')

  # perform test
- expect(@manager.upload('backup_name', '/path/to/dir', /\d{4}-\d{2}-\d{2}/)).to eq 'upload_result'
+ expect(@manager.upload('backup_name', '/path/to/dir', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")).to eq 'upload_result'
  end

  end
@@ -748,24 +748,21 @@ describe S3Rotate::BackupRotator do

  describe '#promote' do

- it 'promotes backup with extension' do
- # perform test
- @rotator.promote('backup_name', '/gitlab/daily/2020-01-02.tgz', 'some data', 'weekly')
-
- # verify result
- expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files.length).to eq 1
- expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files[0].key).to eq '/backup_name/weekly/2020-01-02.tgz'
- expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files[0].body).to eq 'some data'
- end
+ it 'promotes backup' do
+ # mock data
+ file = @client.connection.directories.get('bucket').files.create(key: '/backup_name/daily/2020-01-12.tgz', body: 'some data')

- it 'promotes backup without extension' do
  # perform test
- @rotator.promote('backup_name', '/gitlab/daily/2020-01-02', 'some data', 'weekly')
+ @rotator.promote('backup_name', file, 'weekly')

  # verify result
  expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files.length).to eq 1
- expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files[0].key).to eq '/backup_name/weekly/2020-01-02'
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files[0].key).to eq '/backup_name/weekly/2020-01-12.tgz'
  expect(@client.connection.directories.get('bucket', prefix: '/backup_name/weekly').files[0].body).to eq 'some data'
+
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/daily').files.length).to eq 1
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/daily').files[0].key).to eq '/backup_name/daily/2020-01-12.tgz'
+ expect(@client.connection.directories.get('bucket', prefix: '/backup_name/daily').files[0].body).to eq 'some data'
  end

  end
@@ -45,23 +45,23 @@ describe S3Rotate::BackupUploader do
  'some-backup-2020-01-06.tgz',
  'some-backup-2020-01-07.tgz'
  ])
- allow(@client).to receive(:upload_local_backup_to_s3).and_return nil
+ allow(@client).to receive(:upload).and_return nil
  allow(File).to receive(:open).and_return "raw_data"

  # perform test
  @uploader.upload('backup_name', '/path/to/dir')

  # verify result
- expect(@client).to have_received(:upload_local_backup_to_s3).exactly(3)
- expect(@client).to have_received(:upload_local_backup_to_s3).with('backup_name', Date.new(2020, 1, 7), 'daily', '.tgz', 'raw_data')
- expect(@client).to have_received(:upload_local_backup_to_s3).with('backup_name', Date.new(2020, 1, 6), 'daily', '.tgz', 'raw_data')
- expect(@client).to have_received(:upload_local_backup_to_s3).with('backup_name', Date.new(2020, 1, 5), 'daily', '.tgz', 'raw_data')
+ expect(@client).to have_received(:upload).exactly(3)
+ expect(@client).to have_received(:upload).with('backup_name', Date.new(2020, 1, 7), 'daily', '.tgz', 'raw_data')
+ expect(@client).to have_received(:upload).with('backup_name', Date.new(2020, 1, 6), 'daily', '.tgz', 'raw_data')
+ expect(@client).to have_received(:upload).with('backup_name', Date.new(2020, 1, 5), 'daily', '.tgz', 'raw_data')

  expect(S3Rotate::FileUtils).to have_received(:date_from_filename).exactly(4)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-07.tgz', /\d{4}-\d{2}-\d{2}/)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-06.tgz', /\d{4}-\d{2}-\d{2}/)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-05.tgz', /\d{4}-\d{2}-\d{2}/)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-04.tgz', /\d{4}-\d{2}-\d{2}/)
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-07.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-06.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-05.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-04.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")

  expect(S3Rotate::FileUtils).to have_received(:extension_from_filename).exactly(4)
  expect(S3Rotate::FileUtils).to have_received(:extension_from_filename).with('some-backup-2020-01-07.tgz')
@@ -79,22 +79,22 @@ describe S3Rotate::BackupUploader do
  'some-backup-2020-01-02.tgz',
  'some-backup-2020-01-03.tgz',
  ])
- allow(@client).to receive(:upload_local_backup_to_s3).and_return nil
+ allow(@client).to receive(:upload).and_return nil
  allow(File).to receive(:open).and_return "raw_data"

  # perform test
  @uploader.upload('backup_name', '/path/to/dir')

  # verify result
- expect(@client).to have_received(:upload_local_backup_to_s3).exactly(3)
- expect(@client).to have_received(:upload_local_backup_to_s3).with('backup_name', Date.new(2020, 1, 3), 'daily', nil, 'raw_data')
- expect(@client).to have_received(:upload_local_backup_to_s3).with('backup_name', Date.new(2020, 1, 2), 'daily', nil, 'raw_data')
- expect(@client).to have_received(:upload_local_backup_to_s3).with('backup_name', Date.new(2020, 1, 1), 'daily', nil, 'raw_data')
+ expect(@client).to have_received(:upload).exactly(3)
+ expect(@client).to have_received(:upload).with('backup_name', Date.new(2020, 1, 3), 'daily', nil, 'raw_data')
+ expect(@client).to have_received(:upload).with('backup_name', Date.new(2020, 1, 2), 'daily', nil, 'raw_data')
+ expect(@client).to have_received(:upload).with('backup_name', Date.new(2020, 1, 1), 'daily', nil, 'raw_data')

  expect(S3Rotate::FileUtils).to have_received(:date_from_filename).exactly(3)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-03.tgz', /\d{4}-\d{2}-\d{2}/)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-02.tgz', /\d{4}-\d{2}-\d{2}/)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-01.tgz', /\d{4}-\d{2}-\d{2}/)
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-03.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-02.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-01.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")

  expect(S3Rotate::FileUtils).to have_received(:extension_from_filename).exactly(3)
  expect(S3Rotate::FileUtils).to have_received(:extension_from_filename).with('some-backup-2020-01-03.tgz')
@@ -118,21 +118,21 @@ describe S3Rotate::BackupUploader do
  'some-backup-2020-01-02.tgz',
  'some-backup-2020-01-03.tgz',
  ])
- allow(@client).to receive(:upload_local_backup_to_s3).and_return nil
+ allow(@client).to receive(:upload).and_return nil
  allow(File).to receive(:open).and_return "raw_data"

  # perform test
  @uploader.upload('backup_name', '/path/to/dir')

  # verify result
- expect(@client).to have_received(:upload_local_backup_to_s3).exactly(2)
- expect(@client).to have_received(:upload_local_backup_to_s3).with('backup_name', Date.new(2020, 1, 3), 'daily', '.tgz', 'raw_data')
- expect(@client).to have_received(:upload_local_backup_to_s3).with('backup_name', Date.new(2020, 1, 1), 'daily', '.tgz', 'raw_data')
+ expect(@client).to have_received(:upload).exactly(2)
+ expect(@client).to have_received(:upload).with('backup_name', Date.new(2020, 1, 3), 'daily', '.tgz', 'raw_data')
+ expect(@client).to have_received(:upload).with('backup_name', Date.new(2020, 1, 1), 'daily', '.tgz', 'raw_data')

  expect(S3Rotate::FileUtils).to have_received(:date_from_filename).exactly(3)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-03.tgz', /\d{4}-\d{2}-\d{2}/)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-02.tgz', /\d{4}-\d{2}-\d{2}/)
- expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-01.tgz', /\d{4}-\d{2}-\d{2}/)
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-03.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-02.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")
+ expect(S3Rotate::FileUtils).to have_received(:date_from_filename).with('some-backup-2020-01-01.tgz', /\d{4}-\d{2}-\d{2}/, "%Y-%m-%d")

  expect(S3Rotate::FileUtils).to have_received(:extension_from_filename).exactly(3)
  expect(S3Rotate::FileUtils).to have_received(:extension_from_filename).with('some-backup-2020-01-03.tgz')
@@ -13,15 +13,15 @@ describe S3Rotate::FileUtils do
  end

  it 'parses timestamp formats' do
- expect(S3Rotate::FileUtils::date_from_filename("/path/to/file-1580098737-backup.tar.gz", /file-(\d+)-backup.tar.gz/)).to eq Date.new(2020, 1, 27)
+ expect(S3Rotate::FileUtils::date_from_filename("/path/to/file-1580098737-backup.tar.gz", /file-(\d+)-backup.tar.gz/, "%s")).to eq Date.new(2020, 1, 27)
  end

  it 'raises if the regex matched nothing' do
- expect{ S3Rotate::FileUtils::date_from_filename("/path/to/file-1580098737-backup.tar.gz") }.to raise_error(RuntimeError, "Invalid date_regex or filename format")
+ expect{ S3Rotate::FileUtils::date_from_filename("/path/to/file-1580098737-backup.tar.gz", /\d{4}-\d{2}-\d{2}/, "%s") }.to raise_error(RuntimeError, "Invalid date_regex or filename format")
  end

- it 'raises if the regex matched nothing' do
- expect{ S3Rotate::FileUtils::date_from_filename("/path/to/file-1580098737-backup.tar.gz", /file-\d+-backup.tar.gz/) }.to raise_error(RuntimeError, "Date format not supported")
+ it 'raises if the matched string can not be parsed' do
+ expect{ S3Rotate::FileUtils::date_from_filename("/path/to/file-1580098737-backup.tar.gz", /file-\d+-backup.tar.gz/, "%s") }.to raise_error(RuntimeError, "Invalid date_format")
  end

  end
@@ -1,3 +1,8 @@
+ require 'logger'
+ require 'fog-aws'
+
+ require File.expand_path("../../lib/s3_rotate/utils/logging", __FILE__)
+
  RSpec.configure do |c|

  c.before :each do
@@ -5,6 +10,8 @@ RSpec.configure do |c|

  fog = Fog::Storage.new(aws_access_key_id: 'key', aws_secret_access_key: 'secret', provider: 'AWS', region: 'region')
  fog.directories.create(key: 'bucket')
+
+ S3Rotate::Logging::level Logger::ERROR
  end

  c.after :each do
metadata CHANGED
@@ -1,27 +1,27 @@
  --- !ruby/object:Gem::Specification
  name: s3_rotate
  version: !ruby/object:Gem::Version
- version: 1.0.1
+ version: 1.1.0
  platform: ruby
  authors:
  - Simon Ninon
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2020-01-28 00:00:00.000000000 Z
+ date: 2020-02-01 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: fog-aws
  requirement: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: 3.5.2
  type: :runtime
  prerelease: false
  version_requirements: !ruby/object:Gem::Requirement
  requirements:
- - - ~>
+ - - "~>"
  - !ruby/object:Gem::Version
  version: 3.5.2
  description: AWS S3 upload with rotation mechanism
@@ -30,8 +30,8 @@ executables: []
  extensions: []
  extra_rdoc_files: []
  files:
- - .gitignore
- - .rspec
+ - ".gitignore"
+ - ".rspec"
  - Gemfile
  - LICENSE
  - README.md
@@ -42,6 +42,7 @@ files:
  - lib/s3_rotate/core/backup_rotator.rb
  - lib/s3_rotate/core/backup_uploader.rb
  - lib/s3_rotate/utils/file_utils.rb
+ - lib/s3_rotate/utils/logging.rb
  - s3_rotate.gemspec
  - spec/s3_rotate/aws/s3_client_spec.rb
  - spec/s3_rotate/core/backup_manager_spec.rb
@@ -63,17 +64,16 @@ require_paths:
  - lib
  required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
- - - '>='
+ - - ">="
  - !ruby/object:Gem::Version
  version: 2.0.0
  required_rubygems_version: !ruby/object:Gem::Requirement
  requirements:
- - - '>='
+ - - ">="
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubyforge_project:
- rubygems_version: 2.7.10
+ rubygems_version: 3.0.1
  signing_key:
  specification_version: 4
  summary: AWS S3 upload with rotation mechanism