knife-tidy 0.7.0 → 1.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/CHANGELOG.md +12 -0
- data/README.md +9 -2
- data/lib/chef/knife/tidy_backup_clean.rb +25 -25
- data/lib/chef/knife/tidy_base.rb +15 -1
- data/lib/chef/knife/tidy_server_clean.rb +23 -8
- data/lib/chef/knife/tidy_server_report.rb +43 -18
- data/lib/chef/tidy_acls.rb +47 -26
- data/lib/chef/tidy_common.rb +7 -2
- data/lib/chef/tidy_substitutions.rb +7 -6
- data/lib/knife-tidy/version.rb +1 -1
- metadata +2 -2
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
- metadata.gz:
- data.tar.gz:
+ metadata.gz: 1e5c5c4edef9b63996bbfa1caf5890a40f443ee0
+ data.tar.gz: 36eed111cf84de44150c8d39486ae05939cf37e1
  SHA512:
- metadata.gz:
- data.tar.gz:
+ metadata.gz: edc4d0d480d39c99c0b59b54180003fa60c259c600d30a987bb4cd4e69ec67a54c6f6fc37b31a958d3d725f9f95f0bf79660cf6b554e664fcf764be00bda979b
+ data.tar.gz: 05b0a59b86a3f5a95ed136222555f984bbbe583fdd6acb2064398571b2c88c1ffa9abc7da7fd0917e984be5018c5d3a8c2ab25593ec1e8b0d831ea45012964c5
data/CHANGELOG.md
CHANGED
@@ -1,5 +1,16 @@
  # Change Log

+ ## [1.0.0](https://github.com/chef-customers/knife-tidy/tree/1.0.0) (2017-12-04)
+ [Full Changelog](https://github.com/chef-customers/knife-tidy/compare/0.7.0...1.0.0)
+
+ **Merged pull requests:**
+
+ - Enabled cookbook deletion [\#71](https://github.com/chef-customers/knife-tidy/pull/71) ([itmustbejj](https://github.com/itmustbejj))
+ - Add option for backup path to server clean [\#70](https://github.com/chef-customers/knife-tidy/pull/70) ([TheLunaticScripter](https://github.com/TheLunaticScripter))
+ - Warn the user if there are nodes created in the last hour that haven'… [\#67](https://github.com/chef-customers/knife-tidy/pull/67) ([itmustbejj](https://github.com/itmustbejj))
+ - Add guard to skip generating org reports if the search index is not u… [\#66](https://github.com/chef-customers/knife-tidy/pull/66) ([itmustbejj](https://github.com/itmustbejj))
+ - Enable server clean command and clarify confirmation dialogue [\#31](https://github.com/chef-customers/knife-tidy/pull/31) ([jonlives](https://github.com/jonlives))
+
  ## [0.7.0](https://github.com/chef-customers/knife-tidy/tree/0.7.0) (2017-11-29)
  [Full Changelog](https://github.com/chef-customers/knife-tidy/compare/0.6.1...0.7.0)

@@ -10,6 +21,7 @@

  **Merged pull requests:**

+ - release 0.7.0 [\#65](https://github.com/chef-customers/knife-tidy/pull/65) ([jeremymv2](https://github.com/jeremymv2))
  - Add admins/users groups to the read acl for clients from \< CS 12.5 [\#64](https://github.com/chef-customers/knife-tidy/pull/64) ([itmustbejj](https://github.com/itmustbejj))
  - Restore acls for ::server-admins and org read access groups if they a… [\#61](https://github.com/chef-customers/knife-tidy/pull/61) ([itmustbejj](https://github.com/itmustbejj))
  - Filter email notifications on org\_list config option. [\#60](https://github.com/chef-customers/knife-tidy/pull/60) ([itmustbejj](https://github.com/itmustbejj))
data/README.md
CHANGED
@@ -60,10 +60,17 @@ org_unused_cookbooks.json | List of cookbooks and versions that do not appear to
  ## $ knife tidy server clean --help
  Remove stale nodes that haven't checked-in to the Chef Server as defined by the `--node-threshold NUM_DAYS` option when the reports were generated.. The associated client and ACLs are also removed.

- Future: remove unused cookbooks - currently this feature is disabled.
-
  ## Options

+ * `--backup-path /path/to/an-ec-backup`
+ The location to the last backup of the target Chef Server. It is not recommended to run the clean command without first taking a current backup using [knife-ec-backup](https://github.com/chef/knife-ec-backup)
+
+ * `--only-cookbooks`
+ Only deletes the unused cookbooks from the target Chef Server. NOTE: Cannot be specified if `--only-nodes` is already specified
+
+ * `--only-nodes`
+ Only deltes the stale nodes, associated clients, and ACLs from the target Chef Server. NOTE: Cannot be specified if `--only-cookbooks` is already specified
+
  * `--dry-run`
  Do not perform any actual deletion, only report on what would have been deleted.

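For orientation, the options documented above compose into a single invocation. The following is a hypothetical example (the backup path is a placeholder); dropping --dry-run performs the actual deletions, and since --only-nodes and --only-cookbooks are mutually exclusive, a full clean simply omits both:

    knife tidy server clean --backup-path /path/to/an-ec-backup --only-nodes --dry-run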
data/lib/chef/knife/tidy_backup_clean.rb
CHANGED
@@ -36,7 +36,7 @@ class Chef
  FileUtils.rm_f(action_needed_file_path)

  if config[:gen_gsub]
- Chef::TidySubstitutions.new().boiler_plate
+ Chef::TidySubstitutions.new(nil, tidy).boiler_plate
  exit
  end

@@ -67,18 +67,18 @@ class Chef

  completion_message

- puts "\nWARNING: ** Unrepairable Items **\nPlease see #{action_needed_file_path}\n" if ::File.exist?(action_needed_file_path)
+ ui.stdout.puts "\nWARNING: ** Unrepairable Items **\nPlease see #{action_needed_file_path}\n" if ::File.exist?(action_needed_file_path)
  end

  def validate_user_emails
  emails_seen = []
  tidy.global_user_names.each do |user|
  email = ''
- puts "INFO: Validating #{user}"
+ ui.stdout.puts "INFO: Validating #{user}"
  the_user = FFI_Yajl::Parser.parse(::File.read(::File.join(tidy.users_path, "#{user}.json")), symbolize_names: false)
  if the_user.has_key?('email') && the_user['email'].match(/\A[^@\s]+@[^@\s]+\z/)
  if emails_seen.include?(the_user['email'])
- puts "REPAIRING: Already saw #{user}'s email, creating a unique one."
+ ui.stdout.puts "REPAIRING: Already saw #{user}'s email, creating a unique one."
  email = tidy.unique_email
  new_user = the_user.dup
  new_user['email'] = email
@@ -88,7 +88,7 @@ class Chef
  emails_seen.push(the_user['email'])
  end
  else
- puts "REPAIRING: User #{user} does not have a valid email, creating a unique one."
+ ui.stdout.puts "REPAIRING: User #{user} does not have a valid email, creating a unique one."
  email = tidy.unique_email
  new_user = the_user.dup
  new_user['email'] = email
@@ -102,11 +102,11 @@ class Chef
  # The existence of anything else will cause a restore to fail
  # EC11 backups will contain org objects with 6 extra fields including org_type, billing_plan, assigned_at, etc
  def fix_org_object(org)
- puts "INFO: Validating org object for #{org}"
+ ui.stdout.puts "INFO: Validating org object for #{org}"
  org_object = load_org_object(org)

  unless org_object.keys.count == 3 # cheapo, maybe expect the exact names?
- puts "REPAIRING: org object for #{org} contains extra/missing fields. Fixing that for you"
+ ui.stdout.puts "REPAIRING: org object for #{org} contains extra/missing fields. Fixing that for you"
  # quick/dirty attempt at fixing any of the required fields in case they're nil
  good_name = org_object['name'] || org
  good_full_name = org_object['full_name'] || org
@@ -120,7 +120,7 @@ class Chef
  def load_org_object(org)
  JSON.parse(File.read(File.join(tidy.org_path(org), 'org.json')))
  rescue Errno::ENOENT, JSON::ParserError
- puts "REPAIRING: org object for organization #{org} is missing or corrupt. Generating a new one"
+ ui.stdout.puts "REPAIRING: org object for organization #{org} is missing or corrupt. Generating a new one"
  return { name: org, full_name: org, guid: SecureRandom.uuid.gsub('-','') }
  end

@@ -129,7 +129,7 @@ class Chef
  end

  def add_cookbook_name_to_metadata(cookbook_name, rb_path)
- puts "REPAIRING: Correcting `name` in #{rb_path}"
+ ui.stdout.puts "REPAIRING: Correcting `name` in #{rb_path}"
  content = IO.readlines(rb_path)
  new_content = content.reject { |line| line =~ /^name\s+/ }
  name_field = "name '#{cookbook_name}'\n"
@@ -150,7 +150,7 @@ class Chef
  metadata = FFI_Yajl::Parser.parse(::File.read(json_path), symbolize_names: false)
  if metadata['name'] != cookbook_name
  metadata['name'] = cookbook_name
- puts "REPAIRING: Correcting `name` in #{json_path}`"
+ ui.stdout.puts "REPAIRING: Correcting `name` in #{json_path}`"
  ::File.open(json_path, 'w') do |f|
  f.write(Chef::JSONCompat.to_json_pretty(metadata))
  end
@@ -163,7 +163,7 @@ class Chef
  def load_cookbooks(org)
  cl = Chef::CookbookLoader.new(tidy.cookbooks_path(org))
  for_each_cookbook_basename(org) do |cookbook|
- puts "INFO: Loading #{cookbook}"
+ ui.stdout.puts "INFO: Loading #{cookbook}"
  ret = cl.load_cookbook(cookbook)
  if ret.nil?
  action_needed("ACTION NEEDED: Something's wrong with the #{cookbook} cookbook in org #{org} - cannot load it! Moving to cookbooks.broken folder.")
@@ -192,8 +192,8 @@ class Chef

  def fix_chef_sugar_metadata
  Dir[::File.join(tidy.backup_path, 'organizations/*/cookbooks/chef-sugar*/metadata.rb')].each do |file|
- puts 'INFO: Searching for known chef-sugar problems when uploading.'
- s = Chef::TidySubstitutions.new
+ ui.stdout.puts 'INFO: Searching for known chef-sugar problems when uploading.'
+ s = Chef::TidySubstitutions.new(nil, tidy)
  version = s.cookbook_version_from_path(file)
  patterns = [
  {
@@ -216,10 +216,10 @@ class Chef
  name = tidy.cookbook_name_from_path(cookbook_path)
  md_path = ::File.join(cookbook_path, 'metadata.rb')
  unless ::File.exist?(md_path)
- puts "INFO: No metadata.rb in #{cookbook_path} - skipping"
+ ui.stdout.puts "INFO: No metadata.rb in #{cookbook_path} - skipping"
  next
  end
- Chef::TidySubstitutions.new.sub_in_file(
+ Chef::TidySubstitutions.new(nil, tidy).sub_in_file(
  ::File.join(cookbook_path, 'metadata.rb'),
  Regexp.new("^depends +['\"]#{name}['\"]"),
  "# depends '#{name}' # knife-tidy was here")
@@ -232,7 +232,7 @@ class Chef
  md = metadata.dup
  metadata.each_pair do |key, value|
  if value.nil?
- puts "REPAIRING: Fixing null value for key #{key} in #{json_path}"
+ ui.stdout.puts "REPAIRING: Fixing null value for key #{key} in #{json_path}"
  md[key] = 'default value'
  end
  end
@@ -241,7 +241,7 @@ class Chef
  # platform key cannot contain comma delimited values
  md['platforms'].delete(key) if key =~ /,/
  if value.kind_of?(Array) && value.empty?
- puts "REPAIRING: Fixing empty platform key for for key #{key} in #{json_path}"
+ ui.stdout.puts "REPAIRING: Fixing empty platform key for for key #{key} in #{json_path}"
  md['platforms'][key] = '>= 0.0.0'
  end
  end
@@ -258,10 +258,10 @@ class Chef
  create_minimal_metadata(path)
  end
  unless ::File.exist?(md_path)
- puts "INFO: No metadata.rb in #{path} - skipping"
+ ui.stdout.puts "INFO: No metadata.rb in #{path} - skipping"
  return
  end
- puts "INFO: Generating new metadata.json for #{path}"
+ ui.stdout.puts "INFO: Generating new metadata.json for #{path}"
  md = Chef::Cookbook::Metadata.new
  md.name(cookbook)
  md.from_file(md_path)
@@ -290,7 +290,7 @@ class Chef
  metadata['maintainer'] = 'the maintainer'
  metadata['maintainer_email'] = 'the maintainer email'
  rb_file = ::File.join(cookbook_path, 'metadata.rb')
- puts "REPAIRING: no metadata files exist for #{cookbook_path}, creating #{rb_file}"
+ ui.stdout.puts "REPAIRING: no metadata files exist for #{cookbook_path}, creating #{rb_file}"
  ::File.open(rb_file, 'w') do |f|
  metadata.each_pair do |key, value|
  f.write("#{key} '#{value}'\n")
@@ -362,7 +362,7 @@ class Chef
  rl << item
  new_role['run_list'].push(item)
  rescue ArgumentError
- puts "REPAIRING: Invalid Recipe Item: #{item} in run_list from #{role_path}"
+ ui.stdout.puts "REPAIRING: Invalid Recipe Item: #{item} in run_list from #{role_path}"
  end
  end
  if the_role.has_key?('env_run_lists')
@@ -373,7 +373,7 @@ class Chef
  rl << item
  new_role['env_run_lists'][key].push(item)
  rescue ArgumentError
- puts "REPAIRING: Invalid Recipe Item: #{item} in env_run_lists #{key} from #{role_path}"
+ ui.stdout.puts "REPAIRING: Invalid Recipe Item: #{item} in env_run_lists #{key} from #{role_path}"
  end
  end
  end
@@ -384,7 +384,7 @@ class Chef

  def validate_roles(org)
  for_each_role(org) do |role_path|
- puts "INFO: Validating Role at #{role_path}"
+ ui.stdout.puts "INFO: Validating Role at #{role_path}"
  begin
  Chef::Role.from_hash(FFI_Yajl::Parser.parse(::File.read(role_path), symbolize_names: false))
  rescue ArgumentError
@@ -395,12 +395,12 @@ class Chef

  def validate_invitations(org)
  invite_file = tidy.invitations_path(org)
- puts "INFO: validating org #{org} invites in #{invite_file}"
+ ui.stdout.puts "INFO: validating org #{org} invites in #{invite_file}"
  invitations = FFI_Yajl::Parser.parse(::File.read(invite_file), symbolize_names: false)
  invitations_new = []
  invitations.each do |invite|
  if invite['username'].nil?
- puts "REPAIRING: Dropping corrupt invitations for #{org} in file #{invite_file}"
+ ui.stdout.puts "REPAIRING: Dropping corrupt invitations for #{org} in file #{invite_file}"
  else
  invite_hash = {}
  invite_hash['id'] = invite['id']
data/lib/chef/knife/tidy_base.rb
CHANGED
@@ -58,7 +58,21 @@ class Chef
  end

  def completion_message
- puts "#{ui.color("** Finished **", :magenta)}"
+ ui.stdout.puts "#{ui.color("** Finished **", :magenta)}"
+ end
+
+ def action_needed_file_path
+ ::File.expand_path('knife-tidy-actions-needed.txt')
+ end
+
+ def server_warnings_file_path
+ ::File.expand_path('reports/knife-tidy-server-warnings.txt')
+ end
+
+ def action_needed(msg, file_path=action_needed_file_path)
+ ::File.open(file_path, 'a') do |f|
+ f.write(msg + "\n")
+ end
  end
  end
  end
data/lib/chef/knife/tidy_server_clean.rb
CHANGED
@@ -3,7 +3,6 @@ require 'chef/knife/tidy_base'
  class Chef
  class Knife
  class TidyServerClean < Knife
-
  include Knife::TidyBase

  deps do
@@ -13,6 +12,10 @@ class Chef

  banner "knife tidy server clean (options)"

+ option :backup_path,
+ :long => '--backup-path path/to/backup',
+ :description => 'The path to the knife-ec-backup backup directory'
+
  option :concurrency,
  :long => '--concurrency THREADS',
  :default => 1,
@@ -42,6 +45,16 @@ class Chef
  exit 1
  end

+ while config[:backup_path].nil?
+ user_value = ui.ask_question("It is not recommended to run this command without specifying a current backup directory.\nPlease specify a backup directory:")
+ config[:backup_path] = user_value == '' ? nil : user_value
+ end
+
+ unless ::File.directory?(config[:backup_path])
+ ui.error 'Must specify valid --backup-path'
+ exit 1
+ end
+
  deletions = if config[:only_cookbooks]
  "cookbooks"
  elsif config[:only_nodes]
@@ -56,9 +69,13 @@ class Chef
  all_orgs
  end

- ui.warn "This operation will affect the following Orgs on #{server.root_url}
-
-
+ ui.warn "This operation will affect the following Orgs on #{server.root_url}: #{orgs}"
+ if ::File.exist?(server_warnings_file_path)
+ ::File.read(::File.expand_path('reports/knife-tidy-server-warnings.txt')).each_line do |line|
+ ui.warn(line)
+ end
+ end
+ ui.confirm("This command will delete #{deletions} identified by the knife-tidy reports in #{tidy.reports_dir} from the Chef Server specified in your knife configuration file. \n\n The Chef server to be used is currently #{server.root_url}.\n\n Please be sure this is the Chef server you wish to delete data from. \n\nWould you like to continue?") unless config[:unattended]

  orgs.each do |org|
  clean_cookbooks(org) unless config[:only_nodes]
@@ -69,12 +86,10 @@ class Chef
  end

  def clean_cookbooks(org)
- ui.warn "Cleaning cookbooks is a feature not yet enabled."
- return
  queue = Chef::Util::ThreadedJobQueue.new
  unused_cookbooks_file = ::File.join(tidy.reports_dir, "#{org}_unused_cookbooks.json")
  return unless ::File.exist?(unused_cookbooks_file)
- puts "INFO: Cleaning cookbooks for Org: #{org}, using #{unused_cookbooks_file}"
+ ui.stdout.puts "INFO: Cleaning cookbooks for Org: #{org}, using #{unused_cookbooks_file}"
  unused_cookbooks = FFI_Yajl::Parser.parse(::File.read(unused_cookbooks_file), symbolize_names: true)
  unused_cookbooks.keys.each do |cookbook|
  versions = unused_cookbooks[cookbook]
@@ -100,7 +115,7 @@ class Chef
  queue = Chef::Util::ThreadedJobQueue.new
  stale_nodes_file = ::File.join(tidy.reports_dir, "#{org}_stale_nodes.json")
  return unless ::File.exist?(stale_nodes_file)
- puts "INFO: Cleaning stale nodes for Org: #{org}, using #{stale_nodes_file}"
+ ui.stdout.puts "INFO: Cleaning stale nodes for Org: #{org}, using #{stale_nodes_file}"
  stale_nodes = FFI_Yajl::Parser.parse(::File.read(stale_nodes_file), symbolize_names: true)
  stale_nodes[:list].each do |node|
  queue << lambda { delete_node_job(org, node) }
data/lib/chef/knife/tidy_server_report.rb
CHANGED
@@ -19,8 +19,9 @@ class Chef

  def run
  ensure_reports_dir!
+ FileUtils.rm_f(server_warnings_file_path)

- ui.
+ ui.stdout.puts(ui.color("Writing to #{tidy.reports_dir} directory", :magenta))
  delete_existing_reports

  orgs = if config[:org_list]
@@ -29,20 +30,37 @@ class Chef
  all_orgs
  end

- pre_12_3_nodes = []
  stale_orgs = []
  node_threshold = config[:node_threshold].to_i

  orgs.each do |org|
+ pre_12_3_nodes = []
+ unconverged_recent_nodes = []
  ui.info " Organization: #{org}"
  cb_list = cookbook_list(org)
  version_count = cookbook_count(cb_list).sort_by(&:last).reverse.to_h
  used_cookbooks = {}
  nodes = nodes_list(org)
+ db_nodes = rest.get("/organizations/#{org}/nodes")
+ unless nodes.length == db_nodes.length
+ ood_message = "Search index is out of date! No cleanup action will be taken for #{org}."
+ ui.error(ood_message)
+ action_needed(ood_message, server_warnings_file_path)
+ next
+ end

  nodes.each do |node|
-
- if
+ # If the node hasn't checked in.
+ if !node['chef_packages']
+ # If the node is under an hour old.
+ if (Time.now.to_i - node['ohai_time'].to_i) < 3600
+ unconverged_recent_nodes << node['name']
+ end
+ next
+ end
+ chef_version = Gem::Version.new(node['chef_packages']['chef']['version'])
+ # If the node has checked in within the node_threshold with a client older than 12.3
+ if chef_version < Gem::Version.new("12.3") && (Time.now.to_i - node['ohai_time'].to_i) <= node_threshold * 86400
  pre_12_3_nodes << node['name']
  end
  end
@@ -58,10 +76,8 @@ class Chef
  end
  end

- Chef::Log.debug("Used cookbook list before checking environments: #{used_cookbooks}")
  pins = environment_constraints(org)
  used_cookbooks = check_environment_pins(used_cookbooks, pins, cb_list)
- Chef::Log.debug("Used cookbook list after checking environments: #{used_cookbooks}")

  stale_nodes = []
  nodes.each do |n|
@@ -73,12 +89,19 @@ class Chef
  stale_nodes_hash = {'threshold_days': node_threshold, 'org_total_node_count': nodes.count, 'count': stale_nodes.count, 'list': stale_nodes}
  stale_orgs.push(org) if stale_nodes.count == nodes.count

- tidy.write_new_file(unused_cookbooks(used_cookbooks, cb_list), ::File.join(tidy.reports_dir, "#{org}_unused_cookbooks.json"))
- tidy.write_new_file(version_count, ::File.join(tidy.reports_dir, "#{org}_cookbook_count.json"))
- tidy.write_new_file(stale_nodes_hash, ::File.join(tidy.reports_dir, "#{org}_stale_nodes.json"))
+ tidy.write_new_file(unused_cookbooks(used_cookbooks, cb_list), ::File.join(tidy.reports_dir, "#{org}_unused_cookbooks.json"), backup=false)
+ tidy.write_new_file(version_count, ::File.join(tidy.reports_dir, "#{org}_cookbook_count.json"), backup=false)
+ tidy.write_new_file(stale_nodes_hash, ::File.join(tidy.reports_dir, "#{org}_stale_nodes.json"), backup=false)

  if pre_12_3_nodes.length > 0
-
+ pre_12_3_message = "#{pre_12_3_nodes.length} nodes in organization #{org} have converged in the last #{node_threshold} days with a chef-client < 12.3. These nodes' cookbook versions WILL NOT be factored in the stale cookbooks versions report. Continuing with the server cleanup will delete cookbooks in-use by these nodes."
+ ui.warn(pre_12_3_message)
+ action_needed(pre_12_3_message, server_warnings_file_path)
+ end
+ if unconverged_recent_nodes.length > 0
+ unconverged_recent_message "#{unconverged_recent_nodes.length} nodes have been created in the last hour that have yet to converge in organization #{org}. These nodes WILL NOT be factored in the stale cookbook verisons report. Continuing with the server cleanup will delete cookbooks in-use by these nodes."
+ ui.warn(unconverged_recent_message)
+ action_needed(unconverged_recent_message, server_warnings_file_path)
  end
  end

@@ -141,10 +164,11 @@ class Chef
  def unused_cookbooks(used_list, cb_list)
  unused_list = {}
  cb_list.each do |name, versions|
+ versions.sort! {| a, b | Gem::Version.new(a) <=> Gem::Version.new(b) }
  if used_list[name].nil? # Not in the used list at all (Remove all versions)
  unused_list[name] = versions
  elsif used_list[name].sort != versions # Is in the used cookbook list, but version arrays do not match (Find unused versions)
- unused_list[name] = versions - used_list[name]
+ unused_list[name] = versions - used_list[name] - [versions.last] # Don't delete the most recent version as it might not be in a run_list yet.
  end
  end
  unused_list
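The `versions.sort!` added in the hunk above matters because cookbook versions arrive as strings, and the trailing `- [versions.last]` only spares the newest version if the array is in semantic order. A minimal Ruby sketch of the difference (the version numbers are made up; Gem::Version is part of RubyGems):

    require 'rubygems'  # provides Gem::Version; loaded by default on modern Ruby

    versions = ['10.0.0', '2.1.0', '9.5.1']

    versions.sort
    # => ["10.0.0", "2.1.0", "9.5.1"]   lexical order: "10.0.0" sorts before "2.1.0"

    sorted = versions.sort { |a, b| Gem::Version.new(a) <=> Gem::Version.new(b) }
    # => ["2.1.0", "9.5.1", "10.0.0"]   semantic order: the last element is the newest

    used = ['2.1.0']
    sorted - used - [sorted.last]
    # => ["9.5.1"]   only 9.5.1 is reported unused; the newest version is kept, as in the hunk above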
@@ -176,15 +200,18 @@ class Chef
  def check_cookbook_list(cb_list, cb, version)
  if cb_list[cb]
  cb_list[cb].each do |v|
+ versions_not_satisfied = []
  if Gem::Dependency.new('', version).match?('', v)
- Chef::Log.debug("Pin of #{cb} can be satisfied by #{v}, adding to used list")
  return [v]
  else
-
+ versions_not_satisfied.push(v)
+ end
+ if v == cb_list[cb].last
+ ui.warn("Pin of #{cb} #{version} not satisfied by current versions of cookbook: [#{versions_not_satisfied.join(', ')}]")
  end
  end
  else
-
+ ui.warn("Cookbook #{cb} #{version} is pinned in an environment, but does not exist on the server in this org.")
  end
  return nil
  end
@@ -192,20 +219,18 @@ class Chef
  def check_environment_pins(used_cookbooks, pins, cb_list)
  pins.each do |cb, versions|
  versions.each do |version|
+ next if version == "<= 0.0.0"
  if used_cookbooks[cb]
  # This pinned cookbook is in the used list, now check for a matching version.
  used_cookbooks[cb].each do |v|
  if Gem::Dependency.new('', version).match?('', v)
- # This version in used_cookbooks satisfies the pin
- Chef::Log.debug("Pin of #{cb}: #{version} is satisfied by #{v}")
  break
  end
  end
  result = check_cookbook_list(cb_list, cb, version)
- used_cookbooks[cb].push(result[0]) if result
+ used_cookbooks[cb].push(result[0]) if result && !used_cookbooks[cb].include?(result[0])
  else
  # No cookbook version for that pin, look through the full cookbook list for a match
- Chef::Log.debug("No used cookbook #{cb}, checking the full cookbook list")
  result = check_cookbook_list(cb_list, cb, version)
  used_cookbooks[cb] = result if result
  end
data/lib/chef/tidy_acls.rb
CHANGED
@@ -18,26 +18,26 @@ class Chef
  end

  def load_users
- puts "INFO: Loading users"
+ @tidy.ui.stdout.puts "INFO: Loading users"
  Dir[::File.join(@tidy.users_path, '*.json')].each do |user|
  @users.push(FFI_Yajl::Parser.parse(::File.read(user), symbolize_names: true))
  end
  end

  def load_members
- puts "INFO: Loading members for #{@org}"
+ @tidy.ui.stdout.puts "INFO: Loading members for #{@org}"
  @members = FFI_Yajl::Parser.parse(::File.read(@tidy.members_path(@org)), symbolize_names: true)
  end

  def load_clients
- puts "INFO: Loading clients for #{@org}"
+ @tidy.ui.stdout.puts "INFO: Loading clients for #{@org}"
  Dir[::File.join(@tidy.clients_path(@org), '*.json')].each do |client|
  @clients.push(FFI_Yajl::Parser.parse(::File.read(client), symbolize_names: true))
  end
  end

  def load_groups
- puts "INFO: Loading groups for #{@org}"
+ @tidy.ui.stdout.puts "INFO: Loading groups for #{@org}"
  Dir[::File.join(@tidy.groups_path(@org), '*.json')].each do |group|
  @groups.push(FFI_Yajl::Parser.parse(::File.read(group), symbolize_names: true))
  end
@@ -48,7 +48,7 @@ class Chef
  load_members
  load_clients
  load_groups
- puts "INFO: #{@org} Actors loaded!"
+ @tidy.ui.stdout.puts "INFO: #{@org} Actors loaded!"
  end

  def acl_ops
@@ -105,41 +105,34 @@ class Chef
  end

  def fix_ambiguous_actor(actor)
- puts "REPAIRING: Ambiguous actor! #{actor} removing from #{@tidy.members_path(@org)}"
+ @tidy.ui.stdout.puts "REPAIRING: Ambiguous actor! #{actor} removing from #{@tidy.members_path(@org)}"
  remove_user_from_org(actor)
  end

  def add_client_to_org(actor)
  # TODO
- puts "ACTION NEEDED: Client referenced in acl non-existant: #{actor}"
+ @tidy.ui.stdout.puts "ACTION NEEDED: Client referenced in acl non-existant: #{actor}"
  end

  def add_actor_to_members(actor)
- puts "REPAIRING: Invalid actor: #{actor} adding to #{@tidy.members_path(@org)}"
+ @tidy.ui.stdout.puts "REPAIRING: Invalid actor: #{actor} adding to #{@tidy.members_path(@org)}"
  user = { user: { username: actor } }
  @members.push(user)
- write_new_file(@members, @tidy.members_path(@org))
- end
-
- def write_new_file(contents, path)
- FileUtils.cp(path, "#{path}.orig") unless ::File.exist?("#{path}.orig")
- ::File.open(path, 'w+') do |f|
- f.write(FFI_Yajl::Encoder.encode(contents, pretty: true))
- end
+ @tidy.write_new_file(@members, @tidy.members_path(@org))
  end

  def remove_user_from_org(actor)
  @members.reject! { |user| user[:user][:username] == actor }
- write_new_file(@members, @tidy.members_path(@org))
+ @tidy.write_new_file(@members, @tidy.members_path(@org))
  end

  def remove_group_from_acl(group, acl_file)
- puts "REPAIRING: Removing invalid group: #{group} from #{acl_file}"
+ @tidy.ui.stdout.puts "REPAIRING: Removing invalid group: #{group} from #{acl_file}"
  acl = FFI_Yajl::Parser.parse(::File.read(acl_file), symbolize_names: false)
  acl_ops.each do |op|
  acl[op]['groups'].reject! { |the_group| the_group == group }
  end
- write_new_file(acl, acl_file)
+ @tidy.write_new_file(acl, acl_file)
  end

  # Appends the proper acls for ::server-admins and the org's read access group if they are missing.
@@ -147,26 +140,26 @@ class Chef
  acl = FFI_Yajl::Parser.parse(::File.read(acl_file), symbolize_names: false)
  acl_ops.each do |op|
  unless acl[op]['groups'].include? '::server-admins'
- puts "REPAIRING: Adding #{op} acl for ::server-admins in #{acl_file}"
+ @tidy.ui.stdout.puts "REPAIRING: Adding #{op} acl for ::server-admins in #{acl_file}"
  acl[op]['groups'].push('::server-admins')
  end
  if op == 'read' && !acl[op]['groups'].include?("::#{@org}_read_access_group")
- puts "REPAIRING: Adding #{op} acl for ::#{@org}_read_access_group in #{acl_file}"
+ @tidy.ui.stdout.puts "REPAIRING: Adding #{op} acl for ::#{@org}_read_access_group in #{acl_file}"
  acl[op]['groups'].push("::#{@org}_read_access_group")
  end
  end
- write_new_file(acl, acl_file)
+ @tidy.write_new_file(acl, acl_file)
  end

  def ensure_client_read_acls(acl_file)
  acl = FFI_Yajl::Parser.parse(::File.read(acl_file), symbolize_names: false)
  %w(users admins).each do | group |
  unless acl['read']['groups'].include? group
- puts "REPAIRING: Adding read acl for #{group} in #{acl_file}"
+ @tidy.ui.stdout.puts "REPAIRING: Adding read acl for #{group} in #{acl_file}"
  acl['read']['groups'].push(group)
  end
  end
- write_new_file(acl, acl_file)
+ @tidy.write_new_file(acl, acl_file)
  end

  def validate_acls
@@ -191,10 +184,32 @@ class Chef
  end
  end

+ def default_user_acl
+ return {:create=>{:actors=>["pivotal", client], :groups=>["::server-admins"]},
+ :read=>{:actors=>["pivotal", client], :groups=>["::server-admins", "::#{@org}_read_access_group"]},
+ :update=>{:actors=>["pivotal", client], :groups=>["::server-admins"]},
+ :delete=>{:actors=>["pivotal", client], :groups=>["::server-admins"]},
+ :grant=>{:actors=>["pivotal", client], :groups=>["::server-admins"]}}
+ end
+
+ def default_client_acl(client_name)
+ return {:create=>{:actors=>["pivotal", "#{@org}-validator", client_name], :groups=>["admins"]},
+ :read=>{:actors=>["pivotal", "#{@org}-validator", client_name], :groups=>["admins", "users"]},
+ :update=>{:actors=>["pivotal", client_name], :groups=>["admins"]},
+ :delete=>{:actors=>["pivotal", client_name], :groups=>["admins", "users"]},
+ :grant=>{:actors=>["pivotal", client_name], :groups=>["admins"]}}
+ end
+
  def validate_user_acls
  @members.each do |member|
  user_acl_path = ::File.join(@tidy.user_acls_path, "#{member[:user][:username]}.json")
-
+ begin
+ user_acl = FFI_Yajl::Parser.parse(::File.read(user_acl_path), symbolize_names: false)
+ rescue Errno::ENOENT
+ @tidy.ui.stdout.puts "REPAIRING: Replacing missing user acl for #{member[:user][:username]}."
+ @tidy.write_new_file(default_user_acl, client_acl_path, backup=false)
+ user_acl = FFI_Yajl::Parser.parse(::File.read(user_acl_path), symbolize_names: false)
+ end
  ensure_global_group_acls(user_acl_path)
  actors_groups = acl_actors_groups(user_acl)
  actors_groups[:groups].each do |group|
@@ -208,7 +223,13 @@ class Chef
  def validate_client_acls
  @clients.each do |client|
  client_acl_path = ::File.join(@tidy.org_acls_path(@org), 'clients', "#{client[:name]}.json")
-
+ begin
+ client_acl = FFI_Yajl::Parser.parse(::File.read(client_acl_path), symbolize_names: false)
+ rescue Errno::ENOENT
+ @tidy.ui.stdout.puts "REPAIRING: Replacing missing client acl for #{client[:name]} in #{client_acl_path}."
+ @tidy.write_new_file(default_client_acl(client[:name]), client_acl_path, backup=false)
+ client_acl = FFI_Yajl::Parser.parse(::File.read(client_acl_path), symbolize_names: false)
+ end
  ensure_client_read_acls(client_acl_path)
  end
  end
data/lib/chef/tidy_common.rb
CHANGED
@@ -1,5 +1,6 @@
  require 'ffi_yajl'
  require 'fileutils'
+ require "chef/knife/core/ui"

  class Chef
  class TidyCommon
@@ -11,6 +12,10 @@ class Chef
  @backup_path = ::File.expand_path(backup_path)
  end

+ def ui
+ @ui ||= Chef::Knife::UI.new(STDOUT, STDERR, STDIN, {})
+ end
+
  def users_path
  @users_path ||= ::File.expand_path(::File.join(@backup_path, 'users'))
  end
@@ -62,8 +67,8 @@ class Chef
  end
  end

- def write_new_file(contents, path)
- if ::File.exist?(path)
+ def write_new_file(contents, path, backup=true)
+ if ::File.exist?(path) && backup
  FileUtils.cp(path, "#{path}.orig") unless ::File.exist?("#{path}.orig")
  end
  ::File.open(path, 'w+') do |f|
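A short usage sketch of the new backup flag (variable names are illustrative; `tidy` stands for a Chef::TidyCommon instance): with the default, the file being rewritten is copied once to "<path>.orig", while the report writers shown earlier pass backup=false to skip that copy:

    # Repairing an existing backup file: keep a one-time .orig copy before overwriting.
    tidy.write_new_file(acl, acl_file)

    # Writing a freshly generated report: no .orig copy is needed.
    tidy.write_new_file(stale_nodes_hash, report_path, backup=false)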
data/lib/chef/tidy_substitutions.rb
CHANGED
@@ -9,13 +9,14 @@ class Chef

  attr_accessor :file_path, :backup_path, :data

- def initialize(file_path = nil, tidy_common
+ def initialize(file_path = nil, tidy_common)
  @file_path = file_path
- @
+ @tidy = tidy_common
+ @backup_path = tidy_common.backup_path
  end

  def load_data
- puts "INFO: Loading substitutions from #{file_path}"
+ @tidy.ui.stdout.puts "INFO: Loading substitutions from #{file_path}"
  @data = FFI_Yajl::Parser.parse(::File.read(@file_path), symbolize_names: false)
  rescue Errno::ENOENT
  raise NoSubstitutionFile, file_path
@@ -23,7 +24,7 @@ class Chef

  def boiler_plate
  bp = ::File.join(File.dirname(__FILE__), '../../conf/substitutions.json.example')
- puts "INFO: Creating boiler plate gsub file: 'substitutions.json'"
+ @tidy.ui.stdout.puts "INFO: Creating boiler plate gsub file: 'substitutions.json'"
  FileUtils.cp(bp, ::File.join(Dir.pwd, 'substitutions.json'))
  end

@@ -44,7 +45,7 @@ class Chef
  file.each_line do |line|
  if line.match(search)
  temp_file.puts replace
- puts "INFO: ++ #{path}"
+ @tidy.ui.stdout.puts "INFO: ++ #{path}"
  else
  temp_file.puts line
  end
@@ -63,7 +64,7 @@ class Chef
  load_data
  @data.keys.each do |entry|
  @data[entry].keys.each do |glob|
- puts "INFO: Running substitutions for #{entry} -> #{glob}"
+ @tidy.ui.stdout.puts "INFO: Running substitutions for #{entry} -> #{glob}"
  Dir[::File.join(@backup_path, glob)].each do |file|
  @data[entry][glob].each do |substitution|
  search = Regexp.new(substitution['pattern'])
data/lib/knife-tidy/version.rb
CHANGED
metadata
CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: knife-tidy
  version: !ruby/object:Gem::Version
- version: 0.
+ version: 1.0.0
  platform: ruby
  authors:
  - Jeremy Miller
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2017-
+ date: 2017-12-04 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: rake