sparkle_formation 1.2.0 → 2.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA1:
-   metadata.gz: 885773b23288e446fbafd4bf227fefd8fab90689
-   data.tar.gz: 1fff2fde513ea68af8368e08c4bbc6aeb8af861e
+   metadata.gz: a3733e20ff2e0df836b9e519da638cd1ae5dd570
+   data.tar.gz: 42e65f5894748ff4dcd38641ee39499ffb39a6e1
  SHA512:
-   metadata.gz: 4005c93381322a6f94e9e0c09ce33d8f77360d5a416f70a592bb75ecd72e4b024643944fffb1b9b52cc3e6b98cd0cc7bc98de78d1ed4c4fbc18bb0752036a25d
-   data.tar.gz: 1c6586660a632bf5c9d721e41ba88bab78ee8d696d697fccf8779a2a785dac3d7fc0efe86f0e3c57883b42d07011fd1e63dd0603f25c9c016d31d6f98eb2bb29
+   metadata.gz: c85ee17f6bdd54cfcf73dde7c31c33baf1fc46f7dcaa6f2c229a60be15bfbfb5314b19e34f0580fc37cb1c25db88115b32851d70868eafff8d2edf93b691dfac
+   data.tar.gz: 17aade1f53fe09ab6eec1ca52dc780ce74a2ccdd5d530bc6371f78c0c48246b76fc99f9baa3b9e0d74e67538c9212dd2ffa59d20a30e4a333bbdfe8f9b42a410
@@ -1,3 +1,12 @@
+ # v2.0.0
+ * Fix sparkle pack usage in nested stacks (#140)
+ * Update value processing in attribute helpers for consistent behavior
+ * Extraction and isolation of provider specific functionalities (#138)
+ * Added provider support for Azure (#138)
+ * Added provider support for Heat (#138)
+ * Added provider support for Rackspace (#139)
+ * Enforce minimum supported Ruby version
+
  # v1.2.0
  * Fix pack registration helper method
  * Provide easy access to Kernel.method (#122)
@@ -1,18 +1,33 @@
  #!/usr/bin/env ruby
  require 'fileutils'

- unless(system("yardoc"))
-   $stderr.puts 'ACK: Failed to create docs!'
-   exit -1
+ ['yard', 'redcarpet', 'github-markup'].each do |item|
+   begin
+     require item
+   rescue LoadError
+     $stderr.puts "Failed to load required library `#{item}`. To fix: `gem install #{item}` or include within bundle"
+     exit -1
+   end
+ end
+
+ doc_path = File.join(Dir.pwd, 'doc')
+ FileUtils.mkdir_p(doc_path)
+
+ Dir.chdir(File.dirname(File.dirname(__FILE__))) do
+   unless(system("yardoc -o #{doc_path}"))
+     $stderr.puts 'Failed to successfully run `yardoc`. Unable to generate documentation!'
+     exit -1
+   end
  end

- FileUtils.mkdir_p('doc/UserDocs')
+ user_docs_path = File.join(Dir.pwd, 'doc', 'UserDocs')
+ FileUtils.mkdir_p(user_docs_path)

- Dir.glob('docs/**/*').each do |path|
+ Dir.glob(File.join(File.dirname(File.dirname(__FILE__)), 'docs', '**', '*')) do |path|
    next unless File.file?(path)
    content = File.read(path)
    rel_path = path.sub(/.*?docs\//, '')
-   new_path = File.join('doc/UserDocs', rel_path)
+   new_path = File.join(user_docs_path, rel_path)
    user_doc_root = (['..'] * rel_path.scan('/').size).join('/')
    unless(user_doc_root.to_s.empty?)
      user_doc_root << '/'
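
The rewritten script checks for its documentation dependencies up front instead of letting `yardoc` fail opaquely. A minimal sketch of declaring those three gems in a project Gemfile (the `:docs` group name is illustrative, not part of the package):

~~~ruby
# Gemfile sketch: the libraries the doc script requires at startup.
# The :docs group is hypothetical; any bundler group (or none) works.
source 'https://rubygems.org'

group :docs do
  gem 'yard'           # runs `yardoc` to generate API documentation
  gem 'redcarpet'      # markdown rendering backend used by yard
  gem 'github-markup'  # renders README/markup files for yard
end
~~~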
@@ -15,8 +15,8 @@ anchors:
      url: "#output-to-stdout"
    - title: "Raise Exceptions"
      url: "#raise-exceptions"
-   - title: "AWS Helpers"
-     url: "#aws-helpers"
+   - title: "Provider specific helpers"
+     url: "#provider-specific-helpers"
  ---

  ## Helper methods
@@ -123,60 +123,37 @@ end

  ### Generation Helpers

- #### AWS Helpers
+ #### Provider specific helpers

- Data generation helpers are available for all the AWS
- intrinsic functions and pseudo parameters:
-
-
- ##### Base intrinsic functions
-
- * base64!(VAL)
- * find_in_map!(A, B, C)
- * attr!(RESOURCE, ATTRIBUTE)
- * azs!(REGION)
- * join!(VAL1, VAL2, ...)
- * select!(INDEX, ITEM)
- * ref!(NAME)
-
- ##### Pseudo Parameters
-
- * account_id!
- * notification_arns!
- * no_value!
- * region!
- * stack_id!
- * stack_name!
-
- ##### Conditional functions
-
- AWS CFN supports runtime conditions. Helpers for building conditions:
-
- * and!(VAL1, VAL2, ...)
- * equals!(VAL1, VAL2)
- * not!(VAL)
- * or!(VAL1, VAL2, ...)
- * condition!(CONDITION_NAME)
-
- Helpers for using conditions:
-
- * if!(CONDITION_NAME)
+ SparkleFormation includes provider specific helpers based on the
+ provider defined when instantiating the SparkleFormation template
+ instance. For example:

  ~~~ruby
- SparkleFormation.new(:test) do
-   ...
-   some_value if!(:my_condition, TRUE_VALUE, FALSE_VALUE)
-   ...
+ SparkleFormation.new(:my_stack, :provider => :aws) do
+   ...
+   output.instance_id.value ref!(:my_instance)
  end
  ~~~

- * on_condition!(CONDITION_NAME)
+ will make the AWS specific helper functions available within this
+ template instance. If the specified provider is Azure, then the Azure
+ specific helper methods would be available:

  ~~~ruby
- SparkleFormation.new(:test) do
-   ...
-   resources.my_cool_resource do
-     on_condition!(:stack_is_cool)
-     ...
+ SparkleFormation.new(:my_stack, :provider => :azure) do
+   ...
+   outputs.instance_id do
+     type 'string'
+     value reference_id(:my_instance)
+   end
  end
  ~~~
+
+ To see all the available helpers for specific providers, refer
+ to the library documentation:
+
+ * [AWS helpers](http://sparkleformation.github.io/sparkle_formation/SparkleFormation/SparkleAttribute/Aws.html)
+ * [Azure helpers](http://sparkleformation.github.io/sparkle_formation/SparkleFormation/SparkleAttribute/Azure.html)
+ * [HEAT helpers](http://sparkleformation.github.io/sparkle_formation/SparkleFormation/SparkleAttribute/Heat.html)
+ * [Rackspace helpers](http://sparkleformation.github.io/sparkle_formation/SparkleFormation/SparkleAttribute/Rackspace.html)
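
As a rough illustration of the provider difference (not taken from the package docs): with the AWS provider the `ref!` helper expands to a CloudFormation intrinsic hash, while the Azure helpers build ARM-style expression strings through the `FunctionStruct` class added later in this diff. A minimal sketch, assuming SparkleFormation's default key camel-casing:

~~~ruby
# Sketch only; the template name and exact dump shape are illustrative.
SparkleFormation.new(:refs, :provider => :aws) do
  outputs.instance_id.value ref!(:my_instance)
end.dump
# => {"Outputs" => {"InstanceId" => {"Value" => {"Ref" => "MyInstance"}}}}
~~~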
@@ -24,8 +24,11 @@ require 'attribute_struct'
  class SparkleFormation
    autoload :Aws, 'sparkle_formation/aws'
    autoload :Error, 'sparkle_formation/error'
+   autoload :FunctionStruct, 'sparkle_formation/function_struct'
+   autoload :Provider, 'sparkle_formation/provider'
    autoload :Resources, 'sparkle_formation/resources'
    autoload :Sparkle, 'sparkle_formation/sparkle'
+   autoload :SparklePack, 'sparkle_formation/sparkle'
    autoload :SparkleCollection, 'sparkle_formation/sparkle_collection'
    autoload :SparkleAttribute, 'sparkle_formation/sparkle_attribute'
    autoload :SparkleStruct, 'sparkle_formation/sparkle_struct'
@@ -16,12 +16,20 @@ class SparkleFormation
          @name = opts[:name] if opts
        end

+       # Pack related items
        class Dynamic < NotFound; end
        class Component < NotFound; end
        class Registry < NotFound; end
        class Template < NotFound; end

+       # Template internals
+       class Resource < NotFound; end
+
      end
+
+     # Deprecation error
+     class Deprecated < Error; end
+
    end

  end
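
Given the `Error`/`NotFound` nesting shown in the surrounding context, the new classes would be addressed by their fully qualified constants. A hedged sketch (the rescue sites are hypothetical, not API defined in this diff):

~~~ruby
begin
  # ... template compilation that references an unknown resource ...
rescue SparkleFormation::Error::NotFound::Resource => e
  warn "Unknown resource referenced: #{e.message}"
rescue SparkleFormation::Error::Deprecated => e
  warn "Deprecated usage: #{e.message}"
end
~~~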
@@ -0,0 +1,123 @@
+ require 'sparkle_formation'
+
+ class SparkleFormation
+
+   # SparkleFormation customized AttributeStruct targeted at defining
+   # strings of code for remote evaluation
+   class FunctionStruct < AttributeStruct
+
+     # @return [String] name of function
+     attr_reader :_fn_name
+     # @return [Array<Object>] function argument list
+     attr_reader :_fn_args
+
+     # Create a new FunctionStruct instance
+     #
+     # @param f_name [String] name of function
+     # @param args [Array<Object>] argument list
+     # @return [self]
+     def initialize(f_name=nil, *args)
+       super()
+       @_fn_name = f_name.to_s
+       @_fn_args = args
+       @_fn_args.map! do |l_arg|
+         if(l_arg.is_a?(_klass))
+           l_arg = l_arg._root
+           l_arg._parent(self)
+         end
+         l_arg
+       end
+     end
+
+     # @return [FalseClass] functions are never nil
+     def nil?
+       false
+     end
+
+     # @return [TrueClass, FalseClass] is root struct
+     def root?
+       _parent.nil?
+     end
+
+     # Override to provide expected behavior when arguments are passed
+     # to a function call
+     #
+     # @param name [String, Symbol] method name
+     # @param args [Array<Object>] argument list
+     # @return [Object]
+     def method_missing(name, *args)
+       if(args.empty?)
+         super
+       else
+         @table['_function_'] = _klass_new(name, *args)
+       end
+     end
+
+     # Set accessor directly into table data
+     #
+     # @param val [Integer, String]
+     # @return [FunctionStruct]
+     def [](val)
+       _set("[#{val}]")
+     end
+
+     # Override of the dump to properly format eval string
+     #
+     # @return [String]
+     def _dump
+       unless(@table.empty?)
+         key, value = @table.first
+         suffix = _eval_join(
+           *[
+             key == '_function_' ? nil : key,
+             !value.nil? ? value._dump : nil
+           ].compact
+         )
+       end
+       if(_fn_name)
+         args = _fn_args.map do |arg|
+           if(arg.respond_to?(:_dump))
+             arg._dump
+           elsif(arg.is_a?(::Symbol))
+             "'#{::Bogo::Utility.camel(arg.to_s, false)}'"
+           elsif(arg.is_a?(::String))
+             "'#{arg}'"
+           else
+             arg.inspect
+           end
+         end.join(', ')
+         internal = _eval_join(
+           *[
+             args.empty? ? _fn_name : "#{_fn_name}(#{args})",
+             suffix
+           ].compact
+         )
+         root? ? "[#{internal}]" : internal
+       else
+         suffix
+       end
+     end
+
+     # Join arguments into a string for remote evaluation
+     #
+     # @param args [Array<String>]
+     # @return [String]
+     def _eval_join(*args)
+       args = args.compact
+       args.delete_if(&:empty?)
+       args.slice(1, args.size).to_a.inject(args.first) do |memo, item|
+         if(item.start_with?('['))
+           memo += item
+         else
+           memo += ".#{item}"
+         end
+       end
+     end
+
+     # @return [Class]
+     def _klass
+       ::SparkleFormation::FunctionStruct
+     end
+
+   end
+ end
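
Derived from the `_dump` implementation above, a sketch of the eval string a root `FunctionStruct` produces (the function name and argument are illustrative; camel-casing of the symbol argument is assumed to follow `Bogo::Utility.camel` with a lowercase first letter):

~~~ruby
# Illustrative only: names are made up; output follows the _dump logic above.
fn = SparkleFormation::FunctionStruct.new('resourceId', :my_instance)
fn._dump
# => "[resourceId('myInstance')]"
# Symbol arguments are camel-cased and quoted, string arguments are quoted
# as-is, and the root struct is wrapped in brackets, yielding an ARM-style
# template expression string.
~~~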
@@ -0,0 +1,12 @@
+ require 'sparkle_formation'
+
+ class SparkleFormation
+   # Provider specific implementation namespace
+   module Provider
+
+     autoload :Aws, 'sparkle_formation/provider/aws'
+     autoload :Azure, 'sparkle_formation/provider/azure'
+     autoload :Heat, 'sparkle_formation/provider/heat'
+
+   end
+ end
@@ -0,0 +1,201 @@
+ require 'sparkle_formation'
+
+ class SparkleFormation
+   module Provider
+     # AWS specific implementation
+     module Aws
+
+       # @return [String] Type string for AWS CFN stack resource
+       def stack_resource_type
+         'AWS::CloudFormation::Stack'
+       end
+
+       # Generate policy for stack
+       #
+       # @return [Hash]
+       def generate_policy
+         statements = []
+         compile.resources.keys!.each do |r_name|
+           r_object = compile.resources[r_name]
+           if(r_object['Policy'])
+             r_object['Policy'].keys!.each do |effect|
+               statements.push(
+                 'Effect' => effect.to_s.capitalize,
+                 'Action' => [r_object['Policy'][effect]].flatten.compact.map{|i| "Update:#{i}"},
+                 'Resource' => "LogicalResourceId/#{r_name}",
+                 'Principal' => '*'
+               )
+             end
+             r_object.delete!('Policy')
+           end
+         end
+         statements.push(
+           'Effect' => 'Allow',
+           'Action' => 'Update:*',
+           'Resource' => '*',
+           'Principal' => '*'
+         )
+         Smash.new('Statement' => statements)
+       end
+
+       # Apply deeply nested stacks. This is the new nesting approach and
+       # does not bubble parameters up to the root stack. Parameters are
+       # isolated to the stack resource itself and output mapping is
+       # automatically applied.
+       #
+       # @yieldparam stack [SparkleFormation] stack instance
+       # @yieldparam resource [AttributeStruct] the stack resource
+       # @yieldparam s_name [String] stack resource name
+       # @yieldreturn [Hash] key/values to be merged into resource properties
+       # @return [Hash] dumped stack
+       def apply_deep_nesting(*args, &block)
+         outputs = collect_outputs
+         nested_stacks(:with_resource).each do |stack, resource|
+           unless(stack.nested_stacks.empty?)
+             stack.apply_deep_nesting(*args)
+           end
+           stack.compile.parameters.keys!.each do |parameter_name|
+             if(output_name = output_matched?(parameter_name, outputs.keys))
+               next if outputs[output_name] == stack
+               stack_output = stack.make_output_available(output_name, outputs)
+               resource.properties.parameters.set!(parameter_name, stack_output)
+             end
+           end
+         end
+         if(block_given?)
+           extract_templates(&block)
+         end
+         compile.dump!
+       end
+
+       # Apply shallow nesting. This style of nesting will bubble
+       # parameters up to the root stack. This type of nesting is the
+       # original and now deprecated, but remains for compat issues so any
+       # existing usage won't be automatically busted.
+       #
+       # @yieldparam resource_name [String] name of stack resource
+       # @yieldparam stack [SparkleFormation] nested stack
+       # @yieldreturn [String] Remote URL storage for template
+       # @return [Hash]
+       def apply_shallow_nesting(*args, &block)
+         parameters = compile[:parameters] ? compile[:parameters]._dump : {}
+         output_map = {}
+         nested_stacks(:with_resource, :with_name).each do |_stack, stack_resource, stack_name|
+           remap_nested_parameters(compile, parameters, stack_name, stack_resource, output_map)
+         end
+         extract_templates(&block)
+         compile.parameters parameters
+         if(args.include?(:bubble_outputs))
+           outputs_hash = Hash[
+             output_map.map do |name, value|
+               [name, {'Value' => {'Fn::GetAtt' => value}}]
+             end
+           ]
+           if(compile.outputs)
+             compile._merge(compile._klass_new(outputs_hash))
+           else
+             compile.outputs outputs_hash
+           end
+         end
+         compile.dump!
+       end
+
+       # Extract output to make available for stack parameter usage at the
+       # current depth
+       #
+       # @param output_name [String] name of output
+       # @param outputs [Hash] listing of outputs
+       # @return [Hash] reference to output value (used for setting parameter)
+       def make_output_available(output_name, outputs)
+         bubble_path = outputs[output_name].root_path - root_path
+         drip_path = root_path - outputs[output_name].root_path
+         bubble_path.each_slice(2) do |base_sparkle, ref_sparkle|
+           next unless ref_sparkle
+           base_sparkle.compile.outputs.set!(output_name).set!(
+             :value, base_sparkle.compile.attr!(
+               ref_sparkle.name, "Outputs.#{output_name}"
+             )
+           )
+         end
+         if(bubble_path.empty?)
+           if(drip_path.size == 1)
+             parent = drip_path.first.parent
+             if(parent && parent.compile.parameters.data![output_name])
+               return compile.ref!(output_name)
+             end
+           end
+           raise ArgumentError.new "Failed to detect available bubbling path for output `#{output_name}`. " <<
+             'This may be due to a circular dependency! ' <<
+             "(Output Path: #{outputs[output_name].root_path.map(&:name).join(' > ')} " <<
+             "Requester Path: #{root_path.map(&:name).join(' > ')})"
+         end
+         result = compile.attr!(bubble_path.first.name, "Outputs.#{output_name}")
+         if(drip_path.size > 1)
+           parent = drip_path.first.parent
+           drip_path.unshift(parent) if parent
+           drip_path.each_slice(2) do |base_sparkle, ref_sparkle|
+             next unless ref_sparkle
+             base_sparkle.compile.resources[ref_sparkle.name].properties.parameters.set!(output_name, result)
+             ref_sparkle.compile.parameters.set!(output_name){ type 'String' } # TODO: <<<<------ type check and prop
+             result = compile.ref!(output_name)
+           end
+         end
+         result
+       end
+
+       # Extract parameters from nested stacks. Check for previous nested
+       # stack outputs that match parameter. If match, set parameter to use
+       # output. If no match, check container stack parameters for match.
+       # If match, set to use ref. If no match, add parameter to container
+       # stack parameters and set to use ref.
+       #
+       # @param template [Hash] template being processed
+       # @param parameters [Hash] top level parameter set being built
+       # @param stack_name [String] name of stack resource
+       # @param stack_resource [Hash] duplicate of stack resource contents
+       # @param output_map [Hash] mapping of output names to required stack output access
+       # @return [TrueClass]
+       # @note if parameter includes `StackUnique` a new parameter will
+       #   be added to container stack and it will not use outputs
+       def remap_nested_parameters(template, parameters, stack_name, stack_resource, output_map)
+         stack_parameters = stack_resource.properties.stack.compile.parameters
+         unless(stack_parameters.nil?)
+           stack_parameters._dump.each do |pname, pval|
+             if(pval['StackUnique'])
+               check_name = [stack_name, pname].join
+             else
+               check_name = pname
+             end
+             if(parameters.keys.include?(check_name))
+               if(parameters[check_name]['Type'] == 'CommaDelimitedList')
+                 new_val = {'Fn::Join' => [',', {'Ref' => check_name}]}
+               else
+                 new_val = {'Ref' => check_name}
+               end
+               template.resources.set!(stack_name).properties.parameters.set!(pname, new_val)
+             elsif(output_map[check_name])
+               template.resources.set!(stack_name).properties.parameters.set!(
+                 pname, 'Fn::GetAtt' => output_map[check_name]
+               )
+             else
+               if(pval['Type'] == 'CommaDelimitedList')
+                 new_val = {'Fn::Join' => [',', {'Ref' => check_name}]}
+               else
+                 new_val = {'Ref' => check_name}
+               end
+               template.resources.set!(stack_name).properties.parameters.set!(pname, new_val)
+               parameters[check_name] = pval
+             end
+           end
+         end
+         unless(stack_resource.properties.stack.compile.outputs.nil?)
+           stack_resource.properties.stack.compile.outputs.keys!.each do |oname|
+             output_map[oname] = [stack_name, "Outputs.#{oname}"]
+           end
+         end
+         true
+       end
+
+     end
+   end
+ end
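
As a reading aid for `generate_policy` above, a sketch of the stack policy document it assembles (resource and action names are illustrative; the trailing allow-all statement is always appended):

~~~ruby
# Hypothetical compiled template containing one resource with a 'Policy' block.
# generate_policy would return a Smash shaped roughly like:
{
  'Statement' => [
    {
      'Effect' => 'Allow',                          # capitalized policy effect key
      'Action' => ['Update:Replace'],               # each policy entry prefixed with "Update:"
      'Resource' => 'LogicalResourceId/MyResource', # the resource's logical id in the template
      'Principal' => '*'
    },
    { # blanket statement appended after all resource policies
      'Effect' => 'Allow',
      'Action' => 'Update:*',
      'Resource' => '*',
      'Principal' => '*'
    }
  ]
}
~~~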