travis-backup 0.1.2 → 0.3.0
- checksums.yaml +4 -4
- data/.gitignore +1 -0
- data/.travis.yml +1 -1
- data/README.md +14 -4
- data/lib/backup/load_from_files.rb +245 -0
- data/lib/backup/move_logs.rb +43 -0
- data/lib/backup/remove_orphans.rb +177 -0
- data/lib/backup/remove_specified/remove_heavy_data.rb +99 -0
- data/lib/backup/remove_specified/remove_with_all_dependencies.rb +51 -0
- data/lib/backup/remove_specified/shared.rb +20 -0
- data/lib/backup/remove_specified.rb +68 -0
- data/lib/backup/save_file.rb +43 -0
- data/lib/backup/save_id_hash_to_file.rb +33 -0
- data/lib/backup/save_nullified_rels_to_file.rb +29 -0
- data/lib/config.rb +37 -7
- data/lib/db_helper.rb +27 -0
- data/lib/dry_run_reporter.rb +47 -0
- data/lib/id_hash.rb +97 -0
- data/lib/ids_of_all_dependencies.rb +330 -0
- data/lib/model.rb +77 -0
- data/lib/models/abuse.rb +9 -0
- data/lib/models/annotation.rb +8 -0
- data/lib/models/branch.rb +9 -1
- data/lib/models/broadcast.rb +8 -0
- data/lib/models/build.rb +23 -3
- data/lib/models/commit.rb +8 -1
- data/lib/models/cron.rb +2 -1
- data/lib/models/email.rb +8 -0
- data/lib/models/invoice.rb +8 -0
- data/lib/models/job.rb +10 -2
- data/lib/models/log.rb +1 -1
- data/lib/models/membership.rb +9 -0
- data/lib/models/message.rb +8 -0
- data/lib/models/organization.rb +15 -1
- data/lib/models/owner_group.rb +8 -0
- data/lib/models/permission.rb +9 -0
- data/lib/models/pull_request.rb +5 -1
- data/lib/models/queueable_job.rb +8 -0
- data/lib/models/repository.rb +16 -3
- data/lib/models/request.rb +11 -1
- data/lib/models/ssl_key.rb +2 -1
- data/lib/models/stage.rb +4 -1
- data/lib/models/star.rb +9 -0
- data/lib/models/subscription.rb +9 -0
- data/lib/models/tag.rb +7 -1
- data/lib/models/token.rb +8 -0
- data/lib/models/trial.rb +9 -0
- data/lib/models/trial_allowance.rb +9 -0
- data/lib/models/user.rb +33 -1
- data/lib/models/user_beta_feature.rb +8 -0
- data/lib/nullify_dependencies.rb +42 -0
- data/lib/travis-backup.rb +40 -266
- data/travis-backup.gemspec +2 -1
- metadata +53 -9
- data/Gemfile.lock +0 -212
- data/lib/models/model.rb +0 -8
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: ce106f04b21145f1d238deb83a1aa1666f1a6861f0c5cbf09f58dd939a70b82c
+  data.tar.gz: f4b17655668028ee56c9fe3ecc88feb2baea37aa60c46435b1e9fe3ba251036c
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: c11bcb4f873d4dedc7f95eb706c9381f0f2e27f67762bd8ddbaa2d94d8689de1496a095e17e8cbcba42cfce04e4e9e61518bc63f3931495ceecae351f1674b12
+  data.tar.gz: 714f37c213a1c65d8e2d3b94653941c1c6e04919ce3c51750577fb4365ea976da0e90988deec914fed037df08cf6f640622928aaa5470ccb9bdbe69a35ca6045
data/.gitignore
CHANGED
data/.travis.yml
CHANGED
data/README.md
CHANGED
@@ -1,6 +1,6 @@
 # README
 
-*travis-backup* is an application that helps with housekeeping and backup for Travis CI database v2.2 and with migration to v3.0 database. By default it removes requests and builds with their corresponding jobs and logs, as long as they are older than given threshold says (and backups them in files, if this option is active). Although it can be also run
+*travis-backup* is an application that helps with housekeeping and backup for Travis CI database v2.2 and with migration to v3.0 database. By default it removes requests and builds with their corresponding jobs and logs, as long as they are older than given threshold says (and backups them in files, if this option is active). Although it can be also run in special modes to perform other specific tasks.
 
 ### Installation and run
 
@@ -29,6 +29,9 @@ All arguments:
 --move_logs # run in move logs mode - move all logs to database at destination_db_url URL
 --destination_db_url URL # URL for moving logs to
 --remove_orphans # run in remove orphans mode
+--orphans_table # name of the table we will remove orphans from (if not defined, all tables are considered)
+--load_from_files # loads files stored in files_location to the database
+--id_gap # concerns file loading - the gap between the biggest id in database and the lowest one that will be set to loaded data (that's for data inserted by other users during the load being performed; equals 1000 by default)
 ```
 
 Or inside your app:
@@ -60,9 +63,13 @@ backup.run(repo_id: 1)
 
 Using `--move_logs` flag you can move all logs to database at `destination_db_url` URL (which is required in this case). When you run gem in this mode no files are created and no other tables are being touched.
 
-Using `--remove_orphans` flag you can remove all orphaned data from tables.
+Using `--remove_orphans` flag you can remove all orphaned data from the tables. You can pick a specific table using `--orphans_table` flag or, by leaving it undefined, let all tables to be processed in the removing orphans procedure. It can be combined with `--backup` flag in order to save removed data in files.
 
-Using `--
+Using `--user_id`, `--org_id` or `--repo_id` flag without setting `--threshold` results in removing the specified user/organization/repository with all its dependencies. It can be combined with `--backup` flag in order to save removed data in files.
+
+Using `--load_from_files` flag you can restore dumped data from files located at path given by `--files_location`. The distance defined by `--id_gap` is going to be kept between biggest ids in the database and the lowest ones from the data loaded from files (and it equals 1000 by default).
+
+Using `--dry_run` flag you can check which data would be removed by gem, but without removing them actually. Instead of that reports will be printed on standard output. This flag can be also combined with special modes.
 
 ### Configuration options
 
@@ -80,9 +87,12 @@ backup:
 repo_id: 1 # run only for given repository
 move_logs: false # run in move logs mode - move all logs to database at destination_db_url URL
 remove_orphans: false # run in remove orphans mode
+orphans_table: 'builds' # name of the table we will remove orphans from (if not defined, all tables are considered)
+load_from_files: false # loads files stored in files_location to the database
+id_gap: 1500 # concerns file loading - the gap between the biggest id in database and the lowest one that will be set to loaded data (that's for data inserted by other users during the load being performed; equals 1000 by default)
 ```
 
-You can also set these properties using env vars corresponding to them: `IF_BACKUP`, `BACKUP_DRY_RUN`, `BACKUP_LIMIT`, `BACKUP_THRESHOLD`, `BACKUP_FILES_LOCATION`, `BACKUP_USER_ID`, `BACKUP_ORG_ID`, `BACKUP_REPO_ID`, `BACKUP_MOVE_LOGS`, `BACKUP_REMOVE_ORPHANS`.
+You can also set these properties using env vars corresponding to them: `IF_BACKUP`, `BACKUP_DRY_RUN`, `BACKUP_LIMIT`, `BACKUP_THRESHOLD`, `BACKUP_FILES_LOCATION`, `BACKUP_USER_ID`, `BACKUP_ORG_ID`, `BACKUP_REPO_ID`, `BACKUP_MOVE_LOGS`, `BACKUP_REMOVE_ORPHANS`, `BACKUP_ORPHANS_TABLE`, `BACKUP_LOAD_FROM_FILES`, `BACKUP_ID_GAP`.
 
 You should also specify your database url. You can do this the standard way in `config/database.yml` file, setting the `database_url` hash argument while creating `Backup` instance or using the `DATABASE_URL` env var. Your database should be consistent with the Travis 2.2 database schema.
 
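For orientation, a minimal, hypothetical sketch of driving the new modes from Ruby rather than the CLI. The README above only shows `database_url` as a constructor argument and `backup.run(repo_id: 1)` as a call, so the remaining option keys below are assumed to mirror the YAML config keys and are illustrative only:

```ruby
require 'travis-backup'

# Hypothetical: option keys assumed to mirror the config file documented above.
backup = Backup.new(
  database_url: ENV['DATABASE_URL'],
  if_backup: true,          # dump removed rows to JSON files first
  files_location: './dump',
  remove_orphans: true,     # new mode in 0.3.0
  orphans_table: 'builds',  # limit orphan removal to a single table
  dry_run: true             # only report what would be removed
)
backup.run
```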
data/lib/backup/load_from_files.rb
ADDED
@@ -0,0 +1,245 @@
# frozen_string_literal: true

require 'id_hash'

class Backup
  class LoadFromFiles
    class JsonContent < String
      def hash
        @hash ||= JSON.parse(self).symbolize_keys
      end
    end

    class DataFile
      attr_accessor :content

      def initialize(json_content)
        @content = json_content
      end

      def table_name
        @content.match(/"table_name":\s?"(\w+)"/)[1]
      end

      def table_name_sym
        table_name.to_sym
      end

      def full_hash
        @content.hash
      end
    end

    class EntryFile < DataFile
      def ids
        @content.scan(/"id":\s?(\d+)/).flatten.map(&:to_i)
      end

      def lowest_id
        ids.min
      end

      def entries
        full_hash[:data]
      end
    end

    class RelationshipFile < DataFile
      def relationships
        @relationships ||= full_hash[:nullified_relationships].map do |rel|
          rel.symbolize_keys
        end
      end
    end

    def initialize(config, dry_run_reporter=nil)
      @config = config
      @dry_run_reporter = dry_run_reporter
      @touched_models = []
    end

    def run
      set_id_offsets
      load_data_with_offsets
      cancel_offset_for_foreign_data
      set_id_sequences
      load_nullified_relationships
    end

    private

    def load_nullified_relationships
      relationship_files.each do |file|
        file.relationships.each do |rel|
          offset = @id_offsets[file.table_name.to_sym]

          ActiveRecord::Base.connection.execute(%{
            update #{rel[:related_table]}
            set #{rel[:foreign_key]} = #{rel[:parent_id].to_i + offset}
            where id = #{rel[:related_id].to_i};
          })
        end
      end
    end

    def set_id_sequences
      @touched_models.each do |model|
        value = model.last.id + 1
        seq = model.table_name + '_id_seq'
        set_sequence(seq, value)
      end

      set_shared_builds_tasks_seq
    end

    def set_shared_builds_tasks_seq
      value = [Build.last&.id || -1, Job.last&.id || -1].max + 1

      if value > 0
        set_sequence("shared_builds_tasks_seq", value)
      end
    end

    def set_sequence(seq, value)
      ActiveRecord::Base.connection.execute("alter sequence #{seq} restart with #{value};")
    end

    def cancel_offset_for_foreign_data
      @loaded_entries.each do |entry|
        entry.class.reflect_on_all_associations.select { |a| a.macro == :belongs_to }.each do |association|
          foreign_key = association.foreign_key.to_sym
          if entry.send(association.name).nil? && entry.send(foreign_key).present?
            entry_hash = entry.attributes.symbolize_keys
            table_name = get_table_name(entry_hash, association)
            offset = @id_offsets[table_name.to_sym]
            next if offset.nil?

            proper_id = entry.send(foreign_key) - offset
            entry.update(foreign_key => proper_id)
          end
        end
      end
    end

    def load_data_with_offsets
      @repository_files = []

      @loaded_entries = entry_files.map do |data_file|
        model = Model.get_model_by_table_name(data_file.table_name)

        if model == Repository
          @repository_files << data_file
          next
        end

        load_file(model, data_file)
      end.flatten.compact

      repository_entries = @repository_files.map do |data_file|
        load_file(Repository, data_file)
      end.flatten.compact

      @loaded_entries.concat(repository_entries)
    end

    def load_file(model, data_file)
      @touched_models << model

      data_file.entries&.map do |entry_hash|
        load_entry(model, entry_hash)
      end
    end

    def load_entry(model, entry_hash)
      entry_hash.symbolize_keys!
      entry_hash[:id] += @id_offsets[model.table_name.to_sym]
      add_offset_to_foreign_keys!(model, entry_hash)
      model.create(entry_hash)
    end

    def add_offset_to_foreign_keys!(model, entry_hash)
      model.reflect_on_all_associations.select { |a| a.macro == :belongs_to }.each do |association|
        foreign_key_sym = association.foreign_key.to_sym
        next unless entry_hash[foreign_key_sym]

        table_name = get_table_name(entry_hash, association)
        entry_hash[foreign_key_sym] += @id_offsets[table_name.to_sym] || 0
      end
    end

    def get_table_name(entry_hash, association)
      if association.polymorphic?
        type_symbol = association.foreign_key.gsub(/_id$/, '_type').to_sym
        class_name = entry_hash[type_symbol]
      else
        class_name = association.class_name
      end

      Model.get_model(class_name).table_name
    end

    def file_contents
      @file_contents ||= Dir["#{@config.files_location}/**/*.json"].map do |file_path|
        JsonContent.new(File.read(file_path))
      end
    end

    def entry_files
      @entry_files ||= file_contents.map do |content|
        next unless content.hash[:data]

        EntryFile.new(content)
      end.compact
    end

    def relationship_files
      @relationship_files ||= file_contents.map do |content|
        next if content.hash[:data]

        RelationshipFile.new(content)
      end.compact
    end

    def find_lowest_ids_from_files
      @lowest_ids_from_files = HashOfArrays.new

      entry_files.each do |data_file|
        table_name = data_file.table_name_sym
        min_id = data_file.lowest_id
        @lowest_ids_from_files.add(table_name, min_id) if min_id
      end

      @lowest_ids_from_files = @lowest_ids_from_files.map { |k, arr| [k, arr.min] }.to_h
    end

    def find_highest_ids_from_db
      @highest_ids_from_db = {}

      Model.subclasses.each do |model|
        table_name = model.table_name.to_sym
        @highest_ids_from_db[table_name] = model.order(:id).last&.id || 0
      end
    end

    def set_id_offsets
      find_lowest_ids_from_files
      find_highest_ids_from_db

      @id_offsets = @lowest_ids_from_files.map do |key, file_min|
        db_max = @highest_ids_from_db[key]
        offset = db_max - file_min + @config.id_gap
        [key, offset]
      end.to_h

      make_offset_common_for_builds_and_jobs
    end

    def make_offset_common_for_builds_and_jobs
      if @id_offsets[:builds] && @id_offsets[:jobs]
        offset = [@id_offsets[:builds], @id_offsets[:jobs]].max
        @id_offsets[:builds] = offset
        @id_offsets[:jobs] = offset
      end
    end
  end
end
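To make the offset arithmetic in `set_id_offsets` and `load_entry` above concrete, here is a small worked example with invented numbers:

```ruby
# Suppose the highest build id already in the database is 500, the lowest
# build id in the dumped files is 10, and config.id_gap is the default 1000.
db_max   = 500
file_min = 10
id_gap   = 1000

offset = db_max - file_min + id_gap   # => 1490, exactly as in set_id_offsets

# load_entry adds this offset to every dumped id (and foreign key), so the
# dump's build 10 is re-inserted as build 1500, build 11 as 1501, and so on,
# leaving an id_gap-sized buffer for rows written by other users while the
# load is running.
10 + offset                           # => 1500
```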
data/lib/backup/move_logs.rb
ADDED
@@ -0,0 +1,43 @@
# frozen_string_literal: true

class Backup
  class MoveLogs
    attr_reader :config

    def initialize(config, db_helper, dry_run_reporter=nil)
      @config = config
      @dry_run_reporter = dry_run_reporter
      @db_helper = db_helper
    end

    def run
      return run_dry if @config.dry_run

      @db_helper.connect_db(@config.database_url)
      Log.order(:id).in_batches(of: @config.limit.to_i).map do |logs_batch|
        process_logs_batch(logs_batch)
      end
    end

    def process_logs_batch(logs_batch)
      log_hashes = logs_batch.as_json
      @db_helper.connect_db(@config.destination_db_url)

      log_hashes.each do |log_hash|
        new_log = Log.new(log_hash)
        new_log.save!
      end

      @db_helper.connect_db(@config.database_url)

      logs_batch.each(&:destroy)

      GC.start
    end

    def run_dry
      ids = Log.order(:id).map(&:id)
      @dry_run_reporter.add_to_report(:logs, *ids)
    end
  end
end
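A hedged sketch of exercising `Backup::MoveLogs` on its own. The gem normally builds `config` and the database helper from its own `Config` and `DbHelper` classes (see `lib/config.rb` and `lib/db_helper.rb` in the file list above); the stand-ins below only mimic the readers and the `connect_db` call this class actually uses:

```ruby
require 'ostruct'
require 'active_record'

# Stand-in for the gem's Config: MoveLogs reads dry_run, limit, database_url
# and destination_db_url from it.
config = OpenStruct.new(
  dry_run: false,
  limit: 500,
  database_url: ENV['DATABASE_URL'],
  destination_db_url: ENV['DESTINATION_DB_URL']
)

# Stand-in for DbHelper: anything responding to #connect_db(url) works here.
db_helper = Class.new do
  def connect_db(url)
    ActiveRecord::Base.establish_connection(url)
  end
end.new

Backup::MoveLogs.new(config, db_helper).run
```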
data/lib/backup/remove_orphans.rb
ADDED
@@ -0,0 +1,177 @@
# frozen_string_literal: true

require 'byebug'
require 'backup/save_id_hash_to_file'
require 'backup/save_nullified_rels_to_file'

class Backup
  class RemoveOrphans
    include SaveIdHashToFile
    include SaveNullifiedRelsToFile

    attr_reader :config

    def initialize(config, dry_run_reporter=nil)
      @config = config
      @dry_run_reporter = dry_run_reporter
      @ids_to_remove = IdHash.new
    end

    def dry_run_report
      @dry_run_reporter.report
    end

    def run
      if @config.orphans_table
        check_specified(@config.orphans_table)
      else
        check_all
      end

      process_ids_to_remove
    end

    def check_all
      cases.each do |model_block|
        check_model_block(model_block)
      end
    end

    def check_specified(table_name)
      model_block = cases.find { |c| c[:main_model] == Model.get_model_by_table_name(table_name) }
      check_model_block(model_block)
    end

    def check_model_block(model_block)
      model_block[:relations].each do |relation|
        check_relationship(
          main_model: model_block[:main_model],
          related_model: relation[:related_model],
          fk_name: relation[:fk_name],
        )
      end
    end

    def check_relationship(args)
      main_model = args[:main_model]
      related_model = args[:related_model]
      fk_name = args[:fk_name]

      main_table = main_model.table_name
      related_table = related_model.table_name

      for_delete = main_model.find_by_sql(%{
        select a.*
        from #{main_table} a
        left join #{related_table} b
        on a.#{fk_name} = b.id
        where
          a.#{fk_name} is not null
          and b.id is null;
      })

      key = main_model.name.underscore.to_sym
      ids = for_delete.map(&:id)
      @ids_to_remove.add(key, *ids)
    end

    def process_ids_to_remove
      return @dry_run_reporter.add_to_report(@ids_to_remove.with_table_symbols) if @config.dry_run

      nullified_rels = nullify_builds_dependencies

      if @config.if_backup
        @subfolder = "remove_orphans_#{current_time_for_subfolder}"
        save_nullified_rels_to_file(build: nullified_rels)
        save_id_hash_to_file(@ids_to_remove)
      end

      @ids_to_remove.remove_entries_from_db
    end

    def nullify_builds_dependencies
      nullified = @ids_to_remove[:build]&.map do |build_id|
        build = Build.find(build_id)
        build.nullify_all_dependencies
      end

      nullified&.flatten || []
    end

    def cases
      [
        {
          main_model: Repository,
          relations: [
            {related_model: Build, fk_name: 'current_build_id'},
            {related_model: Build, fk_name: 'last_build_id'}
          ]
        }, {
          main_model: Build,
          relations: [
            {related_model: Repository, fk_name: 'repository_id'},
            {related_model: Commit, fk_name: 'commit_id'},
            {related_model: Request, fk_name: 'request_id'},
            {related_model: PullRequest, fk_name: 'pull_request_id'},
            {related_model: Branch, fk_name: 'branch_id'},
            {related_model: Tag, fk_name: 'tag_id'}
          ]
        }, {
          main_model: Job,
          relations: [
            {related_model: Repository, fk_name: 'repository_id'},
            {related_model: Commit, fk_name: 'commit_id'},
            {related_model: Stage, fk_name: 'stage_id'},
          ]
        }, {
          main_model: Branch,
          relations: [
            {related_model: Repository, fk_name: 'repository_id'},
            {related_model: Build, fk_name: 'last_build_id'}
          ]
        }, {
          main_model: Tag,
          relations: [
            {related_model: Repository, fk_name: 'repository_id'},
            {related_model: Build, fk_name: 'last_build_id'}
          ]
        }, {
          main_model: Commit,
          relations: [
            {related_model: Repository, fk_name: 'repository_id'},
            {related_model: Branch, fk_name: 'branch_id'},
            {related_model: Tag, fk_name: 'tag_id'}
          ]
        }, {
          main_model: Cron,
          relations: [
            {related_model: Branch, fk_name: 'branch_id'}
          ]
        }, {
          main_model: PullRequest,
          relations: [
            {related_model: Repository, fk_name: 'repository_id'}
          ]
        }, {
          main_model: SslKey,
          relations: [
            {related_model: Repository, fk_name: 'repository_id'}
          ]
        }, {
          main_model: Request,
          relations: [
            {related_model: Commit, fk_name: 'commit_id'},
            {related_model: PullRequest, fk_name: 'pull_request_id'},
            {related_model: Branch, fk_name: 'branch_id'},
            {related_model: Tag, fk_name: 'tag_id'}
          ]
        }, {
          main_model: Stage,
          relations: [
            {related_model: Build, fk_name: 'build_id'}
          ]
        }
      ]
    end
  end
end
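For clarity, this is what `check_relationship` executes for one entry of the `cases` table, with the string interpolation spelled out. The `Build`/`Repository` pair is used here, assuming the usual Rails table names `builds` and `repositories`:

```ruby
# Builds whose repository_id points at a repository row that no longer exists:
orphaned_builds = Build.find_by_sql(%{
  select a.*
  from builds a
  left join repositories b on a.repository_id = b.id
  where
    a.repository_id is not null
    and b.id is null;
})

# Their ids land in @ids_to_remove under the :build key and are later either
# reported (dry run) or backed up to files and deleted.
orphaned_builds.map(&:id)
```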
data/lib/backup/remove_specified/remove_heavy_data.rb
ADDED
@@ -0,0 +1,99 @@
# frozen_string_literal: true

require 'backup/save_file'
require 'backup/remove_specified/shared'

class Backup
  class RemoveSpecified
    module RemoveHeavyData
      include SaveIdHashToFile
      include SaveNullifiedRelsToFile
      include Shared

      def remove_heavy_data_for_repos_owned_by(owner_id, owner_type)
        Repository.where('owner_id = ? and owner_type = ?', owner_id, owner_type).order(:id).each do |repository|
          remove_heavy_data_for_repo(repository)
        end
      end

      def remove_heavy_data_for_repo(repository)
        remove_repo_builds(repository)
        remove_repo_requests(repository)
      end

      def remove_repo_builds(repository) # rubocop:disable Metrics/AbcSize, Metrics/MethodLength
        threshold = @config.threshold.to_i.months.ago.to_datetime
        builds_to_remove = repository.builds.where('created_at < ?', threshold)

        builds_dependencies = builds_to_remove.map do |build|
          result = build.ids_of_all_dependencies(dependencies_to_filter, :without_parents)
          result.add(:build, build.id)
          result
        end.compact

        ids_to_remove = IdHash.join(*builds_dependencies)
        @subfolder = "repository_#{repository.id}_old_builds_#{current_time_for_subfolder}"

        unless @config.dry_run
          nullified_rels = builds_to_remove&.map(&:nullify_default_dependencies)&.flatten
          save_nullified_rels_to_file(build: nullified_rels) if @config.if_backup
        end

        process_ids_to_remove(ids_to_remove)
      end

      def remove_repo_requests(repository)
        threshold = @config.threshold.to_i.months.ago.to_datetime
        requests_to_remove = repository.requests.where('created_at < ?', threshold)

        requests_dependencies = requests_to_remove.map do |request|
          hash_with_filtered = request.ids_of_all_dependencies(dependencies_to_filter, :without_parents)
          hash_with_filtered.add(:request, request.id)
        end

        @subfolder = "repository_#{repository.id}_old_requests_#{current_time_for_subfolder}"

        unless @config.dry_run
          nullified_rels = requests_to_remove.map do |request|
            nullify_filtered_dependencies(request)
          end.flatten
          save_nullified_rels_to_file(build: nullified_rels) if @config.if_backup
        end

        ids_to_remove = IdHash.join(*(requests_dependencies))
        process_ids_to_remove(ids_to_remove)
      end

      private

      def process_ids_to_remove(ids_to_remove)
        if @config.dry_run
          @dry_run_reporter.add_to_report(ids_to_remove.with_table_symbols)
        else
          save_id_hash_to_file(ids_to_remove) if @config.if_backup
          ids_to_remove.remove_entries_from_db
        end
      end

      def save_and_destroy_requests_batch(requests_batch, repository)
        requests_export = requests_batch.map(&:attributes)
        file_name = "repository_#{repository.id}_requests_#{requests_batch.first.id}-#{requests_batch.last.id}.json"
        pretty_json = JSON.pretty_generate(requests_export)
        if save_file(file_name, pretty_json)
          destroy_requests_batch(requests_batch)
        end
        requests_export
      end

      def destroy_requests_batch(requests_batch)
        return destroy_requests_batch_dry(requests_batch) if @config.dry_run

        requests_batch.each(&:destroy)
      end

      def destroy_requests_batch_dry(requests_batch)
        @dry_run_reporter.add_to_report(:requests, *requests_batch.map(&:id))
      end
    end
  end
end
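A small illustration of the cutoff used by `remove_repo_builds` and `remove_repo_requests`: the configured `threshold` is a number of months, turned into a timestamp via ActiveSupport (the value below is invented):

```ruby
require 'active_support/all'

# With threshold: 6 in the config, the cutoff is "six months before now";
# only builds and requests created before that moment are collected.
config_threshold = '6'
cutoff = config_threshold.to_i.months.ago.to_datetime
# repository.builds.where('created_at < ?', cutoff) then selects the
# removable builds, as in remove_repo_builds above.
```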
data/lib/backup/remove_specified/remove_with_all_dependencies.rb
ADDED
@@ -0,0 +1,51 @@
# frozen_string_literal: true

require 'backup/save_id_hash_to_file'
require 'backup/save_nullified_rels_to_file'
require 'models/user'
require 'models/repository'
require 'backup/remove_specified/shared'

class Backup
  class RemoveSpecified
    module RemoveWithAllDependencies
      include SaveIdHashToFile
      include SaveNullifiedRelsToFile
      include Shared

      def remove_user_with_dependencies(user_id)
        remove_entry_with_dependencies(:user, user_id)
      end

      def remove_org_with_dependencies(org_id)
        remove_entry_with_dependencies(:organization, org_id)
      end

      def remove_repo_with_dependencies(repo_id)
        remove_entry_with_dependencies(:repository, repo_id)
      end

      private

      def remove_entry_with_dependencies(model_name, id)
        @subfolder = "#{model_name}_#{id}_#{current_time_for_subfolder}"
        entry = Model.get_model(model_name).find(id)
        hash_with_filtered = entry.ids_of_all_dependencies_with_filtered(dependencies_to_filter, :without_parents)
        ids_to_remove = hash_with_filtered[:main]
        ids_to_remove.add(model_name, id)

        return @dry_run_reporter.add_to_report(ids_to_remove) if @config.dry_run

        nullified_rels = { build: nullify_filtered_dependencies(entry) || [] }

        if @config.if_backup
          save_nullified_rels_to_file(nullified_rels)
          save_id_hash_to_file(ids_to_remove)
        end

        ids_to_remove.remove_entries_from_db(as_last: [:build])
        # order important because of foreign key constraint between builds and repos
      end
    end
  end
end
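These methods back the `--user_id`/`--org_id`/`--repo_id` behaviour described in the README: with no `--threshold` given, the specified record is removed together with all of its dependencies. A hedged usage sketch, reusing the in-app style the README shows for `backup.run(repo_id: 1)` (the `if_backup`/`files_location` keys are assumed to match the YAML config keys):

```ruby
# Remove user 123 together with all dependent rows, dumping everything to
# JSON files first (equivalent to combining --user_id with --backup on the CLI).
backup = Backup.new(if_backup: true, files_location: './dump')
backup.run(user_id: 123)   # no threshold set, so the whole user is removed
```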