jenkins_pipeline_builder 0.10.12 → 0.10.13

checksums.yaml CHANGED
@@ -1,15 +1,15 @@
 ---
 !binary "U0hBMQ==":
   metadata.gz: !binary |-
-    ZTM5YTI4ZDgwN2M1YjZjNzQ3MWY5NzliM2U4MTA4NmQ3MjQ5N2Y4NA==
+    MzcxMzVkZDI1ODAzOWFkZGQxMWIwOTg1MzE1YTdjN2M2OGUwNjUwZA==
   data.tar.gz: !binary |-
-    MWFhY2QyYzc5MWQyOGM5MmE3YTM2MzhhNGY5ZWUxODdmMTQ0YjFiMg==
+    YzRkZjVjNWU0OTg2OGNkMWE3NDMxOTUxYjg2YTBlZjFhMDMxNTdiZQ==
 SHA512:
   metadata.gz: !binary |-
-    NThjODRiYzA3ZDY1NDJmY2NjZWJjOTY3MTY0YmJmMTkyZTllODc4YTc0YmI5
-    MDhhODgyYWM3ZjdlZTQwZmFhY2U4YWJkZTQ4MjYwNzg4YmYyZWQ4ZGZmNjJh
-    NTMzMzJlOTBlOGY4MGViZjFiMWU0Yzk3NjYyNjI2MWQ2YzM0MjI=
+    NDk0MDhhMmYwMGUxYThmZGEwMmU0Y2U1Y2ZhNmIxNDE1ZWI1NTM5MGI5MmRi
+    MDM3NTY0NTcwNzdkZGFkOTMwZDA4ZDkwYzBlMjhmNzY4OTg1ZmY3MDVmY2Nl
+    M2EzMThkMDBiZTRjZWViYzBmNzgyOTA4YTY3MjNlMjU4MTljNzk=
   data.tar.gz: !binary |-
-    NzhmYjZkNGRlMTVmYTBlNDBkNDRkMjE5YzMzZGIwYzZlMjRkMjI2YjkwYzlm
-    ODY2NGUwMDlkODVlNjcyZmNjNzIzMDk3NWJkMTJkMDdlMGFmZmE1NDE1NGNl
-    NzhkMjRmNmM4MDBkOTIzNTFhMDFkYmViNDc0MzQyNzcxYzNhNGY=
+    OTA4NThkMzdmYTVjZTY0YzcyZTk5ZjQ5Y2ZmNzM5MWViM2ExOWYwNzFlZmRk
+    MjBiZTM1YTRjMWUwNGUwODY0ZjQwYjUxNDdjZjJiNGUyMTdhN2Q1MWM1Mjg4
+    ODEwMjg5OWQ3YTAwOTlhMTc1Mzc1YWExMjU3MDdhMDc3ZTlmYzA=
@@ -5,9 +5,9 @@ Metrics/CyclomaticComplexity:
 Metrics/PerceivedComplexity:
   Max: 11
 Metrics/MethodLength:
-  Max: 27
+  Max: 30
 Metrics/ClassLength:
-  Max: 350
+  Max: 175

 #######
 # Finalized settings
data/README.md CHANGED
@@ -3,19 +3,13 @@ jenkins-pipeline-builder

 [![Build Status](https://travis-ci.org/constantcontact/jenkins_pipeline_builder.svg)](https://travis-ci.org/constantcontact/jenkins_pipeline_builder) [![Gem Version](https://badge.fury.io/rb/jenkins_pipeline_builder.svg)](http://badge.fury.io/rb/jenkins_pipeline_builder)

-YAML driven CI Jenkins Pipeline Builder enabling to version your artifact pipelines alongside with the artifact source
-itself.
-
-This Gem uses this methodolody by itself. Notice the 'pipeline' folder where we have a declaration of the Gem's build
-pipeline.
+YAML/JSON driven jenkins job generator that lets you version your artifact pipelines alongside with the artifact source itself.

 # Background

-This project is inspired by a great work done by Arangamani with [jenkins_api_client](https://github.com/arangamani/jenkins_api_client) and
-amazing progress done by the Open Stack community with their [jenkins-job-builder](http://ci.openstack.org/jenkins-job-builder/)
+This project is inspired by the great work done by Arangamani with [jenkins_api_client](https://github.com/arangamani/jenkins_api_client) and amazing progress done by the Open Stack community with their [jenkins-job-builder](http://ci.openstack.org/jenkins-job-builder/)

-The YAML structure very closely resembles the OpenStack Job Builder, but, in comparison to Python version, is 100%
-pure Ruby and uses Jenkins API Client and has additional functionlity of building different types of Jenkins views.
+The YAML structure began very closely resembling the OpenStack Job Builder, but, has evolved since then. Under the covers it uses the Jenkins API Client and is very extensible. Any plugin that is not currently supported can be added locally.

 # JenkinsPipelineBuilder

@@ -47,8 +41,8 @@ Or install it yourself as:
 For more info see [jenkins_api_client](https://github.com/arangamani/jenkins_api_client).
 Supplying credentials to the client is optional, as not all Jenkins instances
 require authentication. This project supports two types of password-based
-authentication. You can just you the plain password by using <tt>password</tt>
-parameter. If you don't prefer leaving plain passwords in the credentials file,
+authentication. You can just use the plain password by using <tt>password</tt>
+parameter. If you don't want to leave plain passwords in the credentials file,
 you can encode your password in base64 format and use <tt>password_base64</tt>
 parameter to specify the password either in the arguments or in the credentials
 file. To use the client without credentials, just leave out the
@@ -66,7 +60,7 @@ initializing the client.

 ### Basic usage

-Create all your Job description files in a folder (Ex.: ./pipeline). Follow the Job/View/Project DSL.
+Create all your Job description files in a folder (e.g.: ./pipeline). Follow the Job/View/Project DSL.
 Try to extract the reusable values out of jobs into the project.

 Put the right information about the location of your Jenkins server and the appropriate credentials
@@ -22,42 +22,21 @@

 module JenkinsPipelineBuilder
   class Compiler
-    def self.resolve_value(value, settings, job_collection)
-      # pull@ designates that this is a reference to a job that will be generated
-      # for a pull request, so we want to save the resolution for the second pass
-      pull_job = value.to_s.match(/{{pull@(.*)}}/)
-      return pull_job[1] if pull_job
+    attr_reader :generator, :job_collection

-      settings = settings.with_indifferent_access
-      value_s = value.to_s.clone
-      # First we try to do job name correction
-      vars = value_s.scan(/{{job@(.*)}}/).flatten
-      if vars.count > 0
-        vars.select! do |var|
-          var_val = job_collection[var.to_s]
-          value_s.gsub!("{{job@#{var}}}", var_val[:value][:name]) unless var_val.nil?
-          var_val.nil?
-        end
-      end
-      # Then we look for normal values to replace
-      vars = value_s.scan(/{{([^{}@]+)}}/).flatten
-      vars.select! do |var|
-        var_val = settings[var]
-        value_s.gsub!("{{#{var}}}", var_val.to_s) unless var_val.nil?
-        var_val.nil?
-      end
-      return nil if vars.count != 0
-      value_s
+    def initialize(generator)
+      @generator = generator
+      @job_collection = generator.job_collection.collection
     end

-    def self.get_settings_bag(item_bag, settings_bag = {})
+    def get_settings_bag(item_bag, settings_bag = {})
       item = item_bag[:value]
       bag = {}
       return unless item.is_a?(Hash)
       item.keys.each do |k|
         val = item[k]
         next unless val.is_a? String
-        new_value = resolve_value(val, settings_bag, {})
+        new_value = resolve_value(val, settings_bag)
         return nil if new_value.nil?
         bag[k] = new_value
       end
@@ -65,30 +44,51 @@ module JenkinsPipelineBuilder
       my_settings_bag.merge(bag)
     end

-    def self.compile(item, settings = {}, job_collection = {})
-      success, item = handle_enable(item, settings, job_collection)
+    def compile(item, settings = {})
+      success, item = handle_enable(item, settings)
       return false, item unless success

       case item
       when String
-        return compile_string item, settings, job_collection
+        return compile_string item, settings
       when Hash
-        return compile_hash item, settings, job_collection
+        return compile_hash item, settings
       when Array
-        return compile_array item, settings, job_collection
+        return compile_array item, settings
       end
       [true, item]
     end

-    def self.compile_string(item, settings, job_collection)
+    def handle_enable(item, settings)
+      return true, item unless item.is_a? Hash
+      if item.key?(:enabled) && item.key?(:parameters) && item.length == 2
+        enabled_switch = resolve_value(item[:enabled], settings)
+        return [true, {}] if enabled_switch == 'false'
+        if enabled_switch != 'true'
+          return [false, { 'value error' => "Invalid value for #{item[:enabled]}: #{enabled_switch}" }]
+        end
+        if item[:parameters].is_a? Hash
+          item = item.merge item[:parameters]
+          item.delete :parameters
+          item.delete :enabled
+        else
+          item = item[:parameters]
+        end
+      end
+      [true, item]
+    end
+
+    private
+
+    def compile_string(item, settings)
       errors = {}
-      new_value = resolve_value(item, settings, job_collection)
+      new_value = resolve_value(item, settings)
       errors[item] = "Failed to resolve #{item}" if new_value.nil?
       return false, errors unless errors.empty?
       [true, new_value]
     end

-    def self.compile_array(item, settings, job_collection)
+    def compile_array(item, settings)
       errors = {}
       result = []
       item.each do |value|
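The new `handle_enable` collapses an `{enabled:, parameters:}` wrapper into its bare parameters, dropping the whole node when the switch is `'false'`. A standalone sketch of that collapsing, with `resolve_value` simplified to `to_s`; `handle_enable_sketch` is an invented name, not part of the gem:

```ruby
# Sketch of the :enabled/:parameters collapsing in Compiler#handle_enable,
# with resolve_value reduced to to_s for brevity. Hypothetical helper.
def handle_enable_sketch(item)
  return [true, item] unless item.is_a?(Hash)
  if item.key?(:enabled) && item.key?(:parameters) && item.length == 2
    switch = item[:enabled].to_s
    return [true, {}] if switch == 'false'
    return [false, { 'value error' => "Invalid value: #{switch}" }] if switch != 'true'
    if item[:parameters].is_a?(Hash)
      # merging, then deleting the wrapper keys, leaves only the parameters
      item = item.merge(item[:parameters])
      item.delete :parameters
      item.delete :enabled
    else
      item = item[:parameters]
    end
  end
  [true, item]
end

p handle_enable_sketch(enabled: 'true', parameters: { timeout: 30 })  # parameters survive
p handle_enable_sketch(enabled: 'false', parameters: { timeout: 30 }) # disabled => empty hash
```

Returning `{}` (rather than an error) for a disabled node is what lets `compile_hash` later skip it via `result[key] = payload unless payload == {}`.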
@@ -96,7 +96,7 @@ module JenkinsPipelineBuilder
           errors[item] = "found a nil value when processing following array:\n #{item.inspect}"
           break
         end
-        success, payload = compile(value, settings, job_collection)
+        success, payload = compile(value, settings)
         unless success
           errors.merge!(payload)
           next
@@ -111,50 +111,65 @@ module JenkinsPipelineBuilder
       [true, result]
     end

-    def self.handle_enable(item, settings, job_collection)
-      return true, item unless item.is_a? Hash
-      if item.key?(:enabled) && item.key?(:parameters) && item.length == 2
-        enabled_switch = resolve_value(item[:enabled], settings, job_collection)
-        return [true, {}] if enabled_switch == 'false'
-        if enabled_switch != 'true'
-          return [false, { 'value error' => "Invalid value for #{item[:enabled]}: #{enabled_switch}" }]
-        end
-        if item[:parameters].is_a? Hash
-          item = item.merge item[:parameters]
-          item.delete :parameters
-          item.delete :enabled
-        else
-          item = item[:parameters]
-        end
+    def compile_item(key, value, errors, settings)
+      if value.nil?
+        errors[key] = "key: #{key} has a nil value, this is often a yaml syntax error. Skipping children and siblings"
+        return false, errors[key]
       end
-      [true, item]
+      success, payload = compile(value, settings)
+      unless success
+        errors.merge!(payload)
+        return false, payload
+      end
+      if payload.nil?
+        errors[key] = "Failed to resolve:\n===>key: #{key}\n\n===>value: #{value}\n\n===>of: #{item}"
+        return false, errors[key]
+      end
+      [true, payload]
     end

-    def self.compile_hash(item, settings, job_collection)
-      success, item = handle_enable(item, settings, job_collection)
+    def compile_hash(item, settings)
+      success, item = handle_enable(item, settings)
       return false, item unless success

       errors = {}
       result = {}

       item.each do |key, value|
-        if value.nil?
-          errors[key] = "key: #{key} has a nil value, this is often a yaml syntax error. Skipping children and siblings"
-          break
-        end
-        success, payload = compile(value, settings, job_collection)
-        unless success
-          errors.merge!(payload)
-          next
-        end
-        if payload.nil?
-          errors[key] = "Failed to resolve:\n===>key: #{key}\n\n===>value: #{value}\n\n===>of: #{item}"
-          next
-        end
+        success, payload = compile_item(key, value, errors, settings)
+        next unless success
         result[key] = payload unless payload == {}
       end
       return false, errors unless errors.empty?
       [true, result]
     end
+
+    def resolve_value(value, settings)
+      # pull@ designates that this is a reference to a job that will be generated
+      # for a pull request, so we want to save the resolution for the second pass
+      pull_job = value.to_s.match(/{{pull@(.*)}}/)
+      return pull_job[1] if pull_job
+
+      settings = settings.with_indifferent_access
+      value_s = value.to_s.clone
+      # First we try to do job name correction
+      vars = value_s.scan(/{{job@(.*)}}/).flatten
+      if vars.count > 0
+        vars.select! do |var|
+          var_val = job_collection[var.to_s]
+          value_s.gsub!("{{job@#{var}}}", var_val[:value][:name]) unless var_val.nil?
+          var_val.nil?
+        end
+      end
+      # Then we look for normal values to replace
+      vars = value_s.scan(/{{([^{}@]+)}}/).flatten
+      vars.select! do |var|
+        var_val = settings[var]
+        value_s.gsub!("{{#{var}}}", var_val.to_s) unless var_val.nil?
+        var_val.nil?
+      end
+      return nil if vars.count != 0
+      value_s
+    end
   end
 end
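The relocated `resolve_value` is the heart of the `{{...}}` templating. A minimal standalone sketch of its plain-variable pass, omitting the `pull@`/`job@` handling and `with_indifferent_access`; `resolve_value_sketch` is an invented name, not the gem's API:

```ruby
# Minimal sketch of the plain "{{var}}" substitution pass in
# Compiler#resolve_value (pull@/job@ handling omitted). Hypothetical helper.
def resolve_value_sketch(value, settings)
  value_s = value.to_s.clone
  unresolved = value_s.scan(/{{([^{}@]+)}}/).flatten.select do |var|
    var_val = settings[var]
    value_s.gsub!("{{#{var}}}", var_val.to_s) unless var_val.nil?
    var_val.nil? # keep only variables that could not be resolved
  end
  unresolved.empty? ? value_s : nil # nil signals a failed resolution
end

puts resolve_value_sketch('deploy-{{env}}-{{region}}', 'env' => 'prod', 'region' => 'us1')
# deploy-prod-us1
puts resolve_value_sketch('deploy-{{env}}', {}).inspect # nil
```

The `select!`-on-`nil?` idiom from the original does double duty: it performs the substitution as a side effect and keeps the list of variables that failed, so one emptiness check decides success.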
@@ -29,6 +29,10 @@ module JenkinsPipelineBuilder
     end
   end

+  def clear_installed_version
+    @version = nil
+  end
+
   def installed_version=(version)
     version = version.match(/\d+\.\d+/)
     @version = Gem::Version.new version
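`installed_version=` keeps only the major.minor portion of whatever string Jenkins reports before wrapping it in `Gem::Version`, so patch levels and suffixes are dropped from the comparison. A quick illustration; the `1.24.3-SNAPSHOT` input is an invented example:

```ruby
require 'rubygems'

# The setter matches only /\d+\.\d+/, so anything past major.minor is
# discarded before the Gem::Version comparison. Input value is invented.
raw = '1.24.3-SNAPSHOT'
trimmed = raw.match(/\d+\.\d+/)[0] # "1.24"
version = Gem::Version.new(trimmed)

puts version                             # 1.24
puts version >= Gem::Version.new('1.20') # true
```

Truncating to major.minor keeps version gating coarse on purpose: a plugin's supported-extension check should not flip on patch releases.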
@@ -77,6 +77,32 @@ module JenkinsPipelineBuilder
       errors.empty?
     end

+    def execute(value, n_xml)
+      errors = []
+      check_parameters value
+      errors.each do |error|
+        logger.error error
+      end
+      return false if errors.any?
+
+      n_builders = n_xml.xpath(path).first
+      n_builders.instance_exec(value, &before) if before
+      Nokogiri::XML::Builder.with(n_builders) do |builder|
+        builder.instance_exec value, &xml
+      end
+      n_builders.instance_exec(value, &after) if after
+      true
+    end
+
+    def check_parameters(value)
+      return if parameters && parameters.empty?
+      return unless value.is_a? Hash
+      value.each_key do |key|
+        next if parameters && parameters.include?(key)
+        errors << "Extension #{extension.name} does not support parameter #{key}"
+      end
+    end
+
     def errors
       errors = {}
       EXT_METHODS.keys.each do |name|
@@ -43,6 +43,12 @@ builder do
         properties job[:config][:predefined_build_parameters]
       end
     end
+    if job[:config].key? :properties_file
+      send('hudson.plugins.parameterizedtrigger.FileBuildParameters') do
+        propertiesFile job[:config][:properties_file][:file]
+        failTriggerOnMissing job[:config][:properties_file][:skip_if_missing] || 'false'
+      end
+    end
   end
 end
 killPhaseOnJobResultCondition job[:kill_phase_on] || 'FAILURE'
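The new branch reads two keys from the job config and falls back to `'false'` when `:skip_if_missing` is absent. A sketch of the hash shape it consumes; the keys follow the diff, the file name is an invented example:

```ruby
# Hash shape consumed by the :properties_file branch above.
# 'build.properties' is an invented example value.
job = {
  config: {
    properties_file: { file: 'build.properties' } # :skip_if_missing omitted
  }
}

props = job[:config][:properties_file]
puts props[:file]                       # build.properties
puts props[:skip_if_missing] || 'false' # false
```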
@@ -25,19 +25,8 @@ require 'json'

 module JenkinsPipelineBuilder
   class Generator
-    attr_reader :debug
     attr_accessor :no_files, :job_templates, :logger, :module_registry, :job_collection

-    # Initialize a Client object with Jenkins Api Client
-    #
-    # @param args [Hash] Arguments to connect to Jenkins server
-    #
-    # @option args [String] :something some option description
-    #
-    # @return [JenkinsPipelineBuilder::Generator] a client generator
-    #
-    # @raise [ArgumentError] when required options are not provided.
-    #
     def initialize
       @job_templates = {}
       @extensions = {}
@@ -53,10 +42,6 @@ module JenkinsPipelineBuilder
       JenkinsPipelineBuilder.client
     end

-    # Creates an instance to the View class by passing a reference to self
-    #
-    # @return [JenkinsApi::Client::System] An object to System subclass
-    #
     def view
       JenkinsPipelineBuilder::View.new(self)
     end
@@ -68,7 +53,7 @@ module JenkinsPipelineBuilder
       if job_collection.projects.any?
         errors = publish_project(project_name)
       else
-        errors = publish_jobs(standalone job_collection.jobs)
+        errors = publish_jobs(job_collection.standalone_jobs)
       end
       errors.each do |k, v|
         logger.error "Encountered errors compiling: #{k}:"
@@ -85,18 +70,13 @@ module JenkinsPipelineBuilder
       job_collection.projects.each do |project|
         next unless project[:name] == project_name || project_name.nil?
         logger.info "Using Project #{project}"
-        pull_job = find_pull_request_generator(project)
-        p_success, p_payload = compile_pull_request_generator(pull_job[:name], project)
-        unless p_success
-          errors[pull_job[:name]] = p_payload
-          next
+
+        pull_request_generator = JenkinsPipelineBuilder::PullRequestGenerator.new project, self
+
+        unless pull_request_generator.valid?
+          errors[pull_request_generator.pull_generator[:name]] = pull_request_generator.errors
         end
-        jobs = filter_pull_request_jobs(pull_job)
-        pull = JenkinsPipelineBuilder::PullRequestGenerator.new(project, jobs, p_payload)
-        @job_collection.collection.merge! pull.jobs
-        pull_errors = create_pull_request_jobs(pull)
-        errors.merge! pull_errors
-        purge_pull_request_jobs(pull)
+
       end
       errors.each do |k, v|
         logger.error "Encountered errors compiling: #{k}:"
@@ -112,84 +92,51 @@ module JenkinsPipelineBuilder
     end

     def dump(job_name)
-      @logger.info "Debug #{@debug}"
-      @logger.info "Dumping #{job_name} into #{job_name}.xml"
+      logger.info "Debug #{JenkinsPipelineBuilder.debug}"
+      logger.info "Dumping #{job_name} into #{job_name}.xml"
       xml = client.job.get_config(job_name)
       File.open(job_name + '.xml', 'w') { |f| f.write xml }
     end

-    #
-    # BEGIN PRIVATE METHODS
-    #
-
-    private
-
-    # Converts standalone jobs to the format that they have when loaded as part of a project.
-    # This addresses an issue where #pubish_jobs assumes that each job will be wrapped
-    # with in a hash a referenced under a key called :result, which is what happens when
-    # it is loaded as part of a project.
-    #
-    # @return An array of jobs
-    #
-    def standalone(jobs)
-      jobs.map! { |job| { result: job } }
+    def resolve_job_by_name(name, settings = {})
+      job = job_collection.get_item(name)
+      fail "Failed to locate job by name '#{name}'" if job.nil?
+      job_value = job[:value]
+      logger.debug "Compiling job #{name}"
+      compiler = Compiler.new self
+      success, payload = compiler.compile(job_value, settings)
+      [success, payload]
     end

-    def purge_pull_request_jobs(pull)
-      pull.purge.each do |purge_job|
-        jobs = client.job.list "#{purge_job}.*"
-        jobs.each do |job|
-          client.job.delete job
-        end
-      end
-    end
+    def resolve_project(project)
+      defaults = job_collection.defaults
+      settings = defaults.nil? ? {} : defaults[:value] || {}
+      compiler = Compiler.new self
+      project[:settings] = compiler.get_settings_bag(project, settings) unless project[:settings]
+      project_body = project[:value]

-    def create_pull_request_jobs(pull)
-      errors = {}
-      pull.create.each do |pull_project|
-        success, compiled_project = resolve_project(pull_project)
-        compiled_project[:value][:jobs].each do |i|
-          job = i[:result]
-          job = Job.new job
-          success, payload = job.create_or_update
-          errors[job.name] = payload unless success
+      jobs = prepare_jobs(project_body[:jobs]) if project_body[:jobs]
+      logger.info project
+      process_job_changes(jobs)
+      errors = process_jobs(jobs, project)
+      errors = process_views(project_body[:views], project, errors) if project_body[:views]
+      errors.each do |k, v|
+        puts "Encountered errors processing: #{k}:"
+        v.each do |key, error|
+          puts "  key: #{key} had the following error:"
+          puts "  #{error.inspect}"
         end
       end
-      errors
-    end
+      return false, 'Encountered errors exiting' unless errors.empty?

-    def find_pull_request_generator(project)
-      project_jobs = project[:value][:jobs] || []
-      pull_job = nil
-      project_jobs.each do |job|
-        job = job.keys.first if job.is_a? Hash
-        job = @job_collection.collection[job.to_s]
-        pull_job = job if job[:value][:job_type] == 'pull_request_generator'
-      end
-      fail 'No jobs of type pull_request_generator found' unless pull_job
-      pull_job
+      [true, project]
     end

-    def filter_pull_request_jobs(pull_job)
-      jobs = {}
-      pull_jobs = pull_job[:value][:jobs] || []
-      pull_jobs.each do |job|
-        if job.is_a? String
-          jobs[job.to_s] = @job_collection.collection[job.to_s]
-        else
-          jobs[job.keys.first.to_s] = @job_collection.collection[job.keys.first.to_s]
-        end
-      end
-      fail 'No jobs found for pull request' if jobs.empty?
-      jobs
-    end
+    #
+    # BEGIN PRIVATE METHODS
+    #

-    def compile_pull_request_generator(pull_job, project)
-      defaults = job_collection.defaults
-      settings = defaults.nil? ? {} : defaults[:value] || {}
-      settings = Compiler.get_settings_bag(project, settings)
-      resolve_job_by_name(pull_job, settings)
-    end
+    private

     def prepare_jobs(jobs)
       jobs.map! do |job|
@@ -227,29 +174,6 @@ module JenkinsPipelineBuilder
       errors
     end

-    def resolve_project(project)
-      defaults = job_collection.defaults
-      settings = defaults.nil? ? {} : defaults[:value] || {}
-      project[:settings] = Compiler.get_settings_bag(project, settings) unless project[:settings]
-      project_body = project[:value]
-
-      jobs = prepare_jobs(project_body[:jobs]) if project_body[:jobs]
-      logger.info project
-      process_job_changes(jobs)
-      errors = process_jobs(jobs, project)
-      errors = process_views(project_body[:views], project, errors) if project_body[:views]
-      errors.each do |k, v|
-        puts "Encountered errors processing: #{k}:"
-        v.each do |key, error|
-          puts "  key: #{key} had the following error:"
-          puts "  #{error.inspect}"
-        end
-      end
-      return false, 'Encountered errors exiting' unless errors.empty?
-
-      [true, project]
-    end
-
     def process_jobs(jobs, project, errors = {})
       jobs.each do |job|
         job_id = job.keys.first
@@ -264,15 +188,6 @@ module JenkinsPipelineBuilder
       errors
     end

-    def resolve_job_by_name(name, settings = {})
-      job = job_collection.get_item(name)
-      fail "Failed to locate job by name '#{name}'" if job.nil?
-      job_value = job[:value]
-      logger.debug "Compiling job #{name}"
-      success, payload = Compiler.compile(job_value, settings, @job_collection.collection)
-      [success, payload]
-    end
-
     def publish_project(project_name, errors = {})
       job_collection.projects.each do |project|
         next unless project_name.nil? || project[:name] == project_name