kafo 3.0.0 → 5.0.1

data/README.md CHANGED
@@ -79,8 +79,8 @@ All configuration related files are to be found in the config directory.

  You can supply custom location for your scenario configuration and answer files
  and change configuration and answer files names using options:
- ```
- kafofy --help
+ ```console
+ $ kafofy --help
  Usage: kafofy [options]
  -c, --config_dir DIR location of the scenarios configuration directory [./config/installer-scenarios.d/]
  -s, --scenario SCENARIO scenario file name (without extension) [default]
@@ -1004,6 +1004,21 @@ The cache will be skipped if the file modification time of the manifest is
  greater than the mtime recorded in the cache. Using `--parser-cache` will force
  the use of an outdated cache, but this should be used with caution.

+ ## Facts
+
+ Kafo provides a structured fact describing the state. This fact is only present
+ during the Puppet run. Currently it's the scenario id and name where the id is
+ the same as passed via --scenario by the user and matches the scenario filename
+ with an extension. The name is a human readable version.
+
+ ```yaml
+ ---
+ kafo:
+   scenario:
+     id: foreman_proxy
+     name: Foreman Proxy
+ ```
+
  ## Configuring Hiera

  Kafo uses Hiera to include classes and pass parameters to classes using data
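
Editor's note, not part of the diff: a minimal sketch of reading the new structured fact from Ruby with Facter's API. It only resolves while Kafo's temporary fact directory is on Facter's load path, i.e. during the Kafo-driven Puppet run; the keys follow the YAML above.

```ruby
require 'facter'

# Resolves only while the kafo fact is on the factpath (during the run).
scenario = Facter.value(:kafo)['scenario']
puts "Installing scenario #{scenario['name']} (id: #{scenario['id']})"
```
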
@@ -1018,22 +1033,20 @@ The contents of this file are as per the
  [hiera.yaml docs](https://docs.puppet.com/hiera/latest/configuring.html).
  Only Hiera version 5 is supported.

- When running Puppet, Kafo will copy the Hiera config to a temporary location.
- Relative data directories will be changed to absolute paths. A
- `kafo_answers.yaml` file will be generated containing _all_ default and
- overriden values for parameters managed by Kafo. This may change in the future
- to allow a more complex hierarchy.
+ An answers file will be generated containing _all_ default and overriden values
+ for parameters managed by Kafo. During the run this is available as a
+ `kafo.scenario.answer_file` fact. This may change in the future to allow a more
+ complex hierarchy.

- The hierarchy can contain a special value with the name `Kafo Answers`. The
- exact values will be rewritten by Kafo, but it can be used to determine when
- the Kafo answers are loaded. Note the name is case sensitive. When it's
- missing, it will be added.
+ The hierarchy must contain the path `%{facts.kafo.scenario.answer_file}`. This
+ contains all answers in a temporary location.

  As an example, a hierarchy could be set up with:

  ```yaml
  hierarchy:
    - name: "Kafo Answers"
+     path: "%{facts.kafo.scenario.answer_file}"
    - name: "Other YAML hierarchy levels"
      paths:
        - "family/%{facts.os.family}.yaml"
@@ -1048,6 +1061,7 @@ hierarchy:
      datadir: "custom"
      path: "override.yaml"
    - name: "Kafo Answers"
+     path: "%{facts.kafo.scenario.answer_file}"
    - name: "Other YAML hierarchy levels"
      datadir: "data"
      paths:
@@ -1058,6 +1072,17 @@ hierarchy:
  This would give precedence to all Kafo-managed parameter values, but for any
  others, would check for values per OS family, followed by a `common.yaml` file.

+ The scenario id is also available as a fact which can be used to provide
+ scenario specific overrides for unmanaged modules.
+
+ ```yaml
+ hierarchy:
+   - name: "Kafo Answers"
+     path: "%{facts.kafo.scenario.answer_file}"
+   - name: "Scenario defaults"
+     path: "scenario/%{facts.kafo.scenario.id}.yaml"
+ ```
+
  [Migration from Hiera version 3](https://puppet.com/docs/puppet/4.9/hiera_migrate_v3_yaml.html)
  is documented by Puppet.

data/Rakefile CHANGED
@@ -2,6 +2,12 @@ require 'rake/testtask'
  require "bundler/gem_tasks"
  load 'tasks/jenkins.rake'

+ Rake::TestTask.new('test:ruby') do |t|
+   t.libs << 'lib' << 'test'
+   t.test_files = FileList['test/**/*_test.rb']
+   t.verbose = true
+ end
+
  Rake::TestTask.new('test:unit') do |t|
    t.libs << 'lib' << 'test'
    t.test_files = FileList['test/kafo/**/*_test.rb']
@@ -27,4 +33,4 @@ end

  CLEAN.include 'test/tmp'

- task :test => ['test:unit', 'test:acceptance', 'test:puppet_modules']
+ task :test => ['test:ruby', 'test:puppet_modules']
Binary file
@@ -68,7 +68,7 @@
  #Greenyellow:**pre_values**|
  :load values from answer file;
  :load values from previous answer file;
- note right: if previous scenraio != current scenario
+ note right: if previous scenario != current scenario

  :update help with default values;
  |execute|
@@ -94,7 +94,7 @@

  #Greenyellow:**pre**|

- :setup HieraConfigurer and save configs;
+ :setup ExecutionEnvironment and save configs;

  #Gold:run puppet;
  repeat
@@ -1,3 +1,5 @@
+ require 'open3'
+
  module Kafo
    class BaseContext
      def facts
@@ -24,12 +26,18 @@ module Kafo

      def self.facts
        @facts ||= begin
-         symbolize(JSON.load(`#{facter_path} --json`) || {})
+         result = run_command("#{facter_path} --json")
+         symbolize(JSON.load(result) || {})
        end
      end

      def self.facter_path
        @facter_path ||= PuppetCommand.search_puppet_path('facter')
      end
+
+     def self.run_command(command)
+       stdout, _stderr, _status = Open3.capture3(*PuppetCommand.format_command(command))
+       stdout
+     end
    end
  end
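
Editor's note, not part of the diff: the change above swaps backtick execution for `Open3.capture3`, so stdout, stderr and the exit status are captured separately. A minimal sketch of that pattern outside Kafo (the command is illustrative; `PuppetCommand.format_command` is Kafo's own helper and is not reproduced here):

```ruby
require 'json'
require 'open3'

# Capture stdout, stderr and status separately instead of relying on
# backticks and $?.
stdout, stderr, status = Open3.capture3('facter', '--json')

if status.success?
  facts = JSON.parse(stdout)
else
  warn "facter failed: #{stderr}"
  facts = {}
end
```
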
@@ -4,11 +4,11 @@ require 'tmpdir'
  require 'kafo/puppet_module'
  require 'kafo/color_scheme'
  require 'kafo/data_type_parser'
- require 'kafo/puppet_configurer'
+ require 'kafo/execution_environment'

  module Kafo
    class Configuration
-     attr_reader :config_file, :answer_file
+     attr_reader :config_file, :answer_file, :scenario_id

      DEFAULT = {
        :name => '',
@@ -27,11 +27,16 @@ module Kafo
        :color_of_background => :dark,
        :hook_dirs => [],
        :custom => {},
+       :facts => {},
        :low_priority_modules => [],
        :verbose_log_level => 'info',
        :skip_puppet_version_check => false
      }

+     def self.get_scenario_id(filename)
+       File.basename(filename, '.yaml')
+     end
+
      def initialize(file, persist = true)
        @config_file = file
        @persist = persist
@@ -47,6 +52,7 @@ module Kafo
        end

        @config_dir = File.dirname(@config_file)
+       @scenario_id = Configuration.get_scenario_id(@config_file)
      end

      def save_configuration(configuration)
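
Editor's note, not part of the diff: the new scenario id is simply the scenario file name without its `.yaml` extension, so it matches what users pass via `--scenario`. For illustration (the path below is made up):

```ruby
Kafo::Configuration.get_scenario_id('/etc/foreman-installer/scenarios.d/foreman_proxy.yaml')
# => "foreman_proxy"
```
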
@@ -90,6 +96,14 @@ module Kafo
        custom_storage[key.to_sym] = value
      end

+     def get_custom_fact(key)
+       custom_fact_storage[key.to_s]
+     end
+
+     def set_custom_fact(key, value)
+       custom_fact_storage[key.to_s] = value
+     end
+
      def modules
        @modules ||= begin
          register_data_types
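
Editor's note, not part of the diff: a sketch of how the new accessors might be used (the key and value are made up). Per the `ExecutionEnvironment` code later in this diff, anything stored here is exposed under the scenario's `custom` fact during the run.

```ruby
# 'config' stands in for a loaded Kafo::Configuration.
config.set_custom_fact('tuning_profile', 'development')
config.get_custom_fact('tuning_profile') # => "development"
# During the Puppet run this surfaces as
# $facts['kafo']['scenario']['custom']['tuning_profile']
```
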
@@ -148,8 +162,10 @@

      def params_default_values
        @params_default_values ||= begin
-         puppetconf = PuppetConfigurer.new('noop' => true)
-         KafoConfigure.exit_handler.register_cleanup_path puppetconf.config_path
+         execution_env = ExecutionEnvironment.new(self)
+         KafoConfigure.exit_handler.register_cleanup_path(execution_env.directory)
+
+         puppetconf = execution_env.configure_puppet('noop' => true)

          dump_manifest = <<EOS
  #{includes}
@@ -160,12 +176,16 @@ EOS
  EOS

          @logger.info 'Loading default values from puppet modules...'
-         command = PuppetCommand.new(dump_manifest, [], puppetconf, self).append('2>&1').command
-         result = `#{command}`
-         @logger.debug result
-         unless $?.exitstatus == 0
+         command = PuppetCommand.new(dump_manifest, [], puppetconf, self).command
+         stdout, stderr, status = Open3.capture3(*PuppetCommand.format_command(command))
+
+         @logger.debug stdout
+         @logger.debug stderr
+
+         unless status.success?
            log = app[:log_dir] + '/' + app[:log_name]
-           if (version_mismatch = /kafo_configure::puppet_version_failure: (.+?\))/.match(result))
+
+           if (version_mismatch = /kafo_configure::puppet_version_failure: (.+?\))/.match(stderr))
              puts version_mismatch[1]
              puts "Cannot continue due to incompatible version of Puppet. Use --skip-puppet-version-check to disable this check."
              @logger.error version_mismatch[1]
@@ -174,14 +194,15 @@ EOS
            else
              puts "Could not get default values, check log file at #{log} for more information"
              @logger.error command
-             @logger.error result
+             @logger.error stderr
              @logger.error 'Could not get default values, cannot continue'
              KafoConfigure.exit(:defaults_error)
            end
          end
+
          @logger.info "... finished"

-         load_yaml_from_output(result.split($/))
+         load_yaml_from_output(stdout.split($/))
        end
      end

@@ -305,6 +326,10 @@ EOS
        app[:custom]
      end

+     def custom_fact_storage
+       app[:facts]
+     end
+
      def includes
        modules.map do |mod|
          module_dir = module_dirs.find do |dir|
@@ -0,0 +1,84 @@
+ require 'tmpdir'
+
+ require 'kafo/fact_writer'
+ require 'kafo/hiera_configurer'
+ require 'kafo/puppet_configurer'
+
+ module Kafo
+   class ExecutionEnvironment
+     def initialize(config, logger = KafoConfigure.logger)
+       @config = config
+       @logger = logger
+     end
+
+     def directory
+       @directory ||= begin
+         directory = Dir.mktmpdir('kafo_installation')
+         @logger.debug("Creating execution environment in #{directory}")
+         directory
+       end
+     end
+
+     def store_answers
+       answer_data = HieraConfigurer.generate_data(@config.modules, @config.app[:order])
+       @logger.debug("Writing temporary answers to #{answer_file}")
+       File.open(answer_file, 'w') { |f| f.write(YAML.dump(answer_data)) }
+     end
+
+     def configure_puppet(settings = {})
+       @logger.debug("Configuring Puppet in #{directory}")
+
+       @logger.debug("Writing facts to #{factpath}")
+       FactWriter.write_facts(facts, factpath)
+
+       hiera_config = configure_hiera
+
+       settings = {
+         'environmentpath' => environmentpath,
+         'factpath' => factpath,
+         'hiera_config' => hiera_config,
+       }.merge(settings)
+
+       PuppetConfigurer.new(puppet_conf, settings)
+     end
+
+     private
+
+     def environmentpath
+       File.join(directory, 'environments')
+     end
+
+     def factpath
+       File.join(directory, 'facts')
+     end
+
+     def answer_file
+       File.join(directory, 'answers.yaml')
+     end
+
+     def puppet_conf
+       File.join(directory, 'puppet.conf')
+     end
+
+     def configure_hiera
+       if @config.app[:hiera_config]
+         File.realpath(@config.app[:hiera_config])
+       else
+         config_path = File.join(directory, 'hiera.yaml')
+         @logger.debug("Writing default hiera config to #{config_path}")
+         HieraConfigurer.write_default_config(config_path)
+       end
+     end
+
+     def facts
+       {
+         'scenario' => {
+           'id' => @config.scenario_id,
+           'name' => @config.app[:name],
+           'answer_file' => answer_file,
+           'custom' => @config.app[:facts],
+         },
+       }
+     end
+   end
+ end
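
Editor's note, not part of the diff: a sketch of how the new class is driven, mirroring the `configuration.rb` hunk earlier in this diff. `config` stands in for a loaded `Kafo::Configuration`; the `noop` setting is the one used for the default-value dump, and calling `store_answers` here is illustrative.

```ruby
execution_env = Kafo::ExecutionEnvironment.new(config)
Kafo::KafoConfigure.exit_handler.register_cleanup_path(execution_env.directory)

# Writes the kafo fact and a hiera.yaml into the temporary directory and
# returns a PuppetConfigurer wrapping the generated puppet.conf.
puppetconf = execution_env.configure_puppet('noop' => true)

# Dumps all answers to <directory>/answers.yaml, the file referenced by the
# %{facts.kafo.scenario.answer_file} hierarchy level.
execution_env.store_answers
```
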
@@ -1,11 +1,10 @@
  module Kafo
    class ExitHandler
-     attr_accessor :cleanup_paths, :exit_code, :logger
+     attr_accessor :cleanup_paths, :exit_code

      def initialize
        @cleanup_paths = []
        @exit_code = 0
-       @logger = KafoConfigure.logger
      end

      def error_codes
@@ -27,8 +26,8 @@ module Kafo
      def exit(code, &block)
        @exit_code = translate_exit_code(code)
        block.call if block
-       KafoConfigure.logger.debug "Exit with status code: #{@exit_code} (signal was #{code})"
-       KafoConfigure.logger.dump_errors unless KafoConfigure.verbose
+       logger.debug "Exit with status code: #{@exit_code} (signal was #{code})"
+       logger.dump_errors unless KafoConfigure.verbose
        cleanup
        Kernel.exit(@exit_code)
      end
@@ -54,5 +53,11 @@ module Kafo
        self.cleanup_paths<< path
      end

+     private
+
+     def logger
+       @logger ||= KafoConfigure.logger
+     end
+
    end
  end
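
Editor's note, not part of the diff: the change above replaces an eager logger lookup in the constructor with a memoized private reader, so an `ExitHandler` can be built before `KafoConfigure.logger` is configured. A generic sketch of that pattern (the class and `GlobalLogger` are illustrative, not Kafo's):

```ruby
class Handler
  def run
    logger.debug('running')
  end

  private

  # Looked up on first use instead of in #initialize, so the global logger may
  # be set up after this object is created.
  def logger
    @logger ||= GlobalLogger.instance
  end
end
```
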
@@ -0,0 +1,24 @@
+ module Kafo
+   class FactWriter
+     DATA_FILENAME = 'kafo.yaml'
+     WRAPPER_FILENAME = 'kafo.rb'
+
+     def self.write_facts(facts, directory)
+       Dir.mkdir(directory)
+
+       # Write a data file containing all the facts encoded as YAML
+       File.open(File.join(directory, DATA_FILENAME), 'w') { |f| f.write(YAML.dump(facts)) }
+
+       # Write a Ruby wrapper since only those are executed within puppet
+       File.open(File.join(directory, 'kafo.rb'), 'w') { |f| f.write(wrapper) }
+     end
+
+     def self.wrapper
+       # Ruby 2.0 doesn't have <<~ heredocs
+       <<-WRAPPER
+         require 'yaml'
+         Facter.add(:kafo) { setcode { YAML.load_file(File.join(__dir__, '#{DATA_FILENAME}')) } }
+       WRAPPER
+     end
+   end
+ end
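
Editor's note, not part of the diff: calling the writer produces a YAML data file plus an executable Ruby wrapper, since Facter only loads `.rb` files from the factpath. For illustration (the directory path is made up):

```ruby
Kafo::FactWriter.write_facts({ 'scenario' => { 'id' => 'foreman_proxy' } },
                             '/tmp/kafo_installation1234/facts')
# Creates:
#   /tmp/kafo_installation1234/facts/kafo.yaml  -- YAML dump of the hash above
#   /tmp/kafo_installation1234/facts/kafo.rb    -- Facter.add(:kafo) wrapper that loads kafo.yaml
```
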
@@ -1,111 +1,37 @@
- require 'fileutils'
- require 'tmpdir'
-
  module Kafo
    class HieraConfigurer
-     HIERARCHY_NAME = 'Kafo Answers'
-     HIERARCHY_FILENAME = 'kafo_answers.yaml'
-
-     attr_reader :temp_dir, :config_path, :data_dir, :logger
-
-     def initialize(user_config_path, modules, modules_order)
-       @user_config_path = user_config_path
-       @modules = modules
-       @modules_order = modules_order
-       @logger = KafoConfigure.logger
+     def self.default_config
+       {
+         'version' => 5,
+         'hierarchy' => [
+           {
+             'name' => 'Kafo Answers',
+             'path' => '%{facts.kafo.scenario.answer_file}',
+             'data_hash' => 'yaml_data',
+           },
+         ],
+       }
      end

-     def write_configs
-       build_temp_dir
-
-       if @user_config_path
-         logger.debug("Merging existing Hiera config file from #{@user_config_path}")
-         user_config = YAML.load(File.read(@user_config_path))
-       else
-         user_config = {}
-       end
-       logger.debug("Writing Hiera config file to #{config_path}")
-       File.open(config_path, 'w') do |f|
-         # merge required config changes into the user's Hiera config
-         f.write(format_yaml_symbols(generate_config(user_config).to_yaml))
-       end
-
-       logger.debug("Creating Hiera data files in #{data_dir}")
-       FileUtils.mkdir(data_dir)
-
-       File.open(File.join(data_dir, HIERARCHY_FILENAME), 'w') do |f|
-         f.write(format_yaml_symbols(generate_data(@modules).to_yaml))
-       end
+     def self.write_default_config(path)
+       File.open(path, 'w') { |f| f.write(YAML.dump(default_config)) }
+       path
      end

-     def generate_config(config = {})
-       config ||= {}
-
-       config['version'] = 5
-
-       # ensure there are defaults
-       config['defaults'] ||= {}
-       config['defaults']['datadir'] = determine_data_dir_path(config['defaults']['datadir'])
-       config['defaults']['data_hash'] ||= 'yaml_data'
-
-       # ensure our answers file is present and has the right settings
-       config['hierarchy'] ||= []
-
-       config['hierarchy'].each do |level|
-         if level['datadir']
-           level['datadir'] = determine_data_dir_path(level['datadir'])
-         end
-       end
-
-       kafo_answers = config['hierarchy'].find { |level| level['name'] == HIERARCHY_NAME }
-       if kafo_answers
-         kafo_answers.clear
-       else
-         kafo_answers = {}
-         config['hierarchy'].unshift(kafo_answers)
-       end
-       kafo_answers['name'] = HIERARCHY_NAME
-       kafo_answers['path'] = HIERARCHY_FILENAME
-       kafo_answers['datadir'] = data_dir
-       kafo_answers['data_hash'] = 'yaml_data'
-
-       config
-     end
-
-     def generate_data(modules)
+     def self.generate_data(modules, order = nil)
        classes = []
        data = modules.select(&:enabled?).inject({}) do |config, mod|
          classes << mod.class_name
          config.update(Hash[mod.params_hash.map { |k, v| ["#{mod.class_name}::#{k}", v] }])
        end
-       data['classes'] = @modules_order ? sort_modules(classes, @modules_order) : classes
+       data['classes'] = sort_modules(classes, order)
        data
      end

-     def sort_modules(modules, order)
-       (order & modules) + (modules - order)
-     end
-
-     def build_temp_dir
-       @temp_dir ||= Dir.mktmpdir('kafo_hiera')
-       @config_path = File.join(temp_dir, 'hiera.conf')
-       @data_dir = File.join(temp_dir, 'data')
-     end
-
-     private
-
-     def format_yaml_symbols(data)
-       data.gsub('!ruby/sym ', ':')
-     end
+     def self.sort_modules(modules, order)
+       return modules unless order

-     def determine_data_dir_path(path)
-       # Relies on data_dir being absolute or having a user config
-       path ||= data_dir
-       Pathname.new(path).relative? ? File.join(original_hiera_directory, path) : path
-     end
-
-     def original_hiera_directory
-       @user_config_path ? File.dirname(@user_config_path) : nil
+       (order & modules) + (modules - order)
      end
    end
  end
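
Editor's note, not part of the diff: the behaviour of the now class-level sort helper, per the implementation above (module names are made up). Ordered modules come first, anything not mentioned in the order is appended, and a missing order leaves the input untouched.

```ruby
Kafo::HieraConfigurer.sort_modules(%w[foreman foreman_proxy puppet], %w[puppet foreman])
# => ["puppet", "foreman", "foreman_proxy"]
Kafo::HieraConfigurer.sort_modules(%w[foreman foreman_proxy], nil)
# => ["foreman", "foreman_proxy"]
```
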