openstudio-workflow 0.0.1

checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+   metadata.gz: cc2c789cb6f241ce8a9693f4fd52c2cea857a6ab
+   data.tar.gz: 8ead2192e95b45850c00a0b0e048d326eef05ead
+ SHA512:
+   metadata.gz: 31c526b09291097df08aef677a51e1dd46bfb64710c0a3a0fef1b6e2a8f9fc2ab0d23973afcdd651dedfe7f17f77266fcb6ef7d5df67a80d39addb590ef9b358
+   data.tar.gz: acaee1bf5ae39b1f4be3f67b4d292dfe790f0415e1f1aa04552bdae09ad1003ad0b44c45939a4b5f0f4ed76ff76ec1522871a61dd2fd5c508c8848b88685cbec
data/CHANGELOG.md ADDED
@@ -0,0 +1,10 @@
+ OpenStudio::Workflow Change Log
+ ==================================
+
+ Unreleased
+ --------------
+
+ Version 0.0.1
+ --------------
+
+ * Initial release with basic workflow implemented to support running OpenStudio measure-based workflows
data/README.md ADDED
@@ -0,0 +1,97 @@
+ # OpenStudio::Workflow
+
+ Run an EnergyPlus simulation using a file-based workflow that is read from a Local or MongoDB adapter.
+
+ ## Installation
+
+ This application has the following dependencies:
+
+ * Ruby 2.0
+ * OpenStudio with Ruby 2.0 bindings
+ * EnergyPlus 8.1 (assuming OpenStudio ~> 1.3.1)
+ * MongoDB if using the MongoDB adapter (or when running rspec)
+
+ [OpenStudio](http://developer.nrel.gov/downloads/buildings/openstudio/builds/) needs to be installed
+ and on your path. On Mac/Linux the easiest approach is to add the following to your .bash_profile or to a
+ file under /etc/profile.d so that OpenStudio can be loaded.
+
+     export OPENSTUDIO_ROOT=/usr/local
+     export RUBYLIB=$OPENSTUDIO_ROOT/lib/ruby/site_ruby/2.0.0
+
+ Add this line to your application's Gemfile:
+
+     gem 'OpenStudio-workflow'
+
+ Use this line if you want the bleeding edge:
+
+     gem 'OpenStudio-workflow', :git => 'git@github.com:NREL/OpenStudio-workflow-gem.git'
+
+ And then execute:
+
+     $ bundle
+
+ Or install it yourself as:
+
+     $ gem install OpenStudio-workflow
+
+ ## Usage
+
+ There are currently two adapters for running an OpenStudio workflow. The first is a simple Local adapter
+ that lets the user pass in the directory containing the simulation. The directory must contain an
+ [analysis/problem JSON file](spec/files/local_ex1/analysis_1.json) and a [datapoint JSON file](spec/files/local_ex1/datapoint_1.json).
+ The workflow manager uses these data (along with the measures, seed model, and weather data) to assemble and
+ execute the standard workflow (preflight -> OpenStudio measures -> EnergyPlus -> postprocess).
+
+     r = OpenStudio::Workflow.load 'Local', '/home/user/a_directory', options
+     r.run
+
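+ The `options` hash above is not defined in this README. As a rough sketch only, it could be left empty
+ or point the Local adapter at different file names (the adapter defaults are `problem.json` and
+ `datapoint.json`); whether these keys are passed through depends on the run class, which is not shown here:
+
+     # illustrative only; the key names mirror the Local adapter defaults
+     options = {
+       problem_filename: 'analysis_1.json',
+       datapoint_filename: 'datapoint_1.json'
+     }
+     r = OpenStudio::Workflow.load 'Local', '/home/user/a_directory', options
+     r.run
+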
+ The workflow manager can also use MongoDB to receive instructions on the workflow to run and the data point values.
+
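+ A minimal sketch of loading the Mongo adapter follows. The `datapoint_id` key is taken from the adapter
+ source later in this gem, but the adapter name string and exact invocation are assumptions and require a
+ running MongoDB that the bundled mongoid.yml can reach:
+
+     options = { datapoint_id: 'some-datapoint-uuid' }   # hypothetical data point id
+     r = OpenStudio::Workflow.load 'Mongo', '/home/user/a_directory', options
+     r.run
+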
50
+ ## Caveats and Todos
51
+
52
+ ### Caveats
53
+
54
+ * There are currently several hard coded workflow options
55
+ * Must use OpenStudio with Ruby 2.0 support
56
+ * Using MongoDB as the Adapter requires a command line zip (gnuzip) utility
57
+
58
+ ### Todos
59
+
60
+ * Read the analysis.json file to determine the states that are going to run instead of (or inaddition to) passing them into the constructor
61
+ * Implement better error handling with custom exception classes
62
+ * Implement a different measure directory, seed model directory, and weather file directory option
63
+ * Dynamically add other "states" to the workflow
64
+ * Create and change into a unique directory when running measures
65
+ * ~~Implement Error State~~
66
+ * ~~Implement MongoDB Adapter~~
67
+ * ~~Implement remaining Adapter states (i.e. communicate success, communicate failure etc~~
68
+ * Add a results adapter to return a string as the last call based on the source of the call. (e.g. R, command line, C++, etc).
69
+ * Implement a logger in the Adapters, right now they are unable to log
70
+ * Hook up the measure groups based workflows
71
+ * More tests!
72
+ * ~~Add xml workflow item~~
73
+
74
+ ## Testing and Development
75
+
76
+ Depending on what adapter is being tested it may be preferable to skip installing various gems. This can be done by calling
77
+
78
+ bundle install --without mongo
79
+
80
+ On Windows it is recommended to bundle without mongo nor ci as they may require native extensions.
81
+
82
+ bundle install --without mongo ci
83
+
84
+ ### Testing
85
+
86
+ Run `rspec` or `rake` to execute the tests.
87
+
88
+ ## Contributing
89
+
90
+ 1. Fork it ( https://github.com/NREL/OpenStudio-workflow/fork )
91
+ 2. Create your feature branch (`git checkout -b my-new-feature`)
92
+ 3. Commit your changes (`git commit -am 'Add some feature'`)
93
+ 4. Push to the branch (`git push origin my-new-feature`)
94
+ 5. Create a new Pull Request
95
+
96
+ ## Development
97
+
data/Rakefile ADDED
@@ -0,0 +1,20 @@
+ require 'bundler'
+ require 'bundler/gem_tasks'
+ begin
+   Bundler.setup
+ rescue Bundler::BundlerError => e
+   $stderr.puts e.message
+   $stderr.puts 'Run `bundle install` to install missing gems'
+   exit e.status_code
+ end
+
+ require 'rake'
+ require 'rspec/core'
+ require 'rspec/core/rake_task'
+
+ #require 'rubocop/rake_task'
+ RSpec::Core::RakeTask.new(:spec) do |spec|
+   spec.pattern = FileList['spec/**/*_spec.rb']
+ end
+
+ task default: [:spec]
@@ -0,0 +1,68 @@
+ ######################################################################
+ # Copyright (c) 2008-2014, Alliance for Sustainable Energy.
+ # All rights reserved.
+ #
+ # This library is free software; you can redistribute it and/or
+ # modify it under the terms of the GNU Lesser General Public
+ # License as published by the Free Software Foundation; either
+ # version 2.1 of the License, or (at your option) any later version.
+ #
+ # This library is distributed in the hope that it will be useful,
+ # but WITHOUT ANY WARRANTY; without even the implied warranty of
+ # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ # Lesser General Public License for more details.
+ #
+ # You should have received a copy of the GNU Lesser General Public
+ # License along with this library; if not, write to the Free Software
+ # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ ######################################################################
+
+ # Adapter class to decide where to obtain instructions to run the simulation workflow
+ module OpenStudio
+   module Workflow
+     class Adapter
+
+       attr_accessor :options
+
+       def initialize(options={})
+         @options = options
+         @log = nil
+       end
+
+       #class << self
+       #attr_reader :problem
+
+       def load(filename, options={})
+         instance.load(filename, options)
+       end
+
+       def communicate_started(id, options = {})
+         instance.communicate_started id
+       end
+
+       def get_datapoint(id, options={})
+         instance.get_datapoint id, options
+       end
+
+       def get_problem(id, options = {})
+         instance.get_problem id, options
+       end
+
+       def communicate_results(id, results)
+         instance.communicate_results id, results
+       end
+
+       def communicate_complete(id)
+         instance.communicate_complete id
+       end
+
+       def communicate_failure(id)
+         instance.communicate_failure id
+       end
+
+       def get_logger(file, options={})
+         instance.get_logger file, options
+       end
+     end
+   end
+ end
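The Adapter base class above defines the calls the run manager makes (load, communicate_started, get_datapoint, get_problem, communicate_results, communicate_complete, communicate_failure, get_logger). A minimal sketch of what a third adapter implementing that interface could look like; the class name, file location, and stdout behavior are hypothetical and simply follow the pattern of the bundled Local adapter:

    require_relative '../adapter'  # assumes the same directory layout as the bundled adapters

    module OpenStudio
      module Workflow
        module Adapters
          # Hypothetical adapter that reports job-state transitions on stdout
          class Console < Adapter
            def communicate_started(directory, options = {})
              puts "Started workflow in #{directory} at #{::Time.now}"
            end

            def get_datapoint(directory, options = {})
              ::MultiJson.load(File.read("#{directory}/datapoint.json"), symbolize_names: true)
            end

            def get_problem(directory, options = {})
              ::MultiJson.load(File.read("#{directory}/problem.json"), symbolize_names: true)
            end

            def communicate_results(directory, results)
              puts "Results for #{directory}: #{results.inspect}"
            end

            def communicate_complete(directory)
              puts "Finished workflow in #{directory} at #{::Time.now}"
            end

            def communicate_failure(directory)
              puts "Failed workflow in #{directory} at #{::Time.now}"
            end

            def get_logger(directory, options = {})
              @log ||= $stdout
            end
          end
        end
      end
    end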
@@ -0,0 +1,110 @@
+ ######################################################################
+ # Copyright (c) 2008-2014, Alliance for Sustainable Energy.
+ # All rights reserved.
+ #
+ # This library is free software; you can redistribute it and/or
+ # modify it under the terms of the GNU Lesser General Public
+ # License as published by the Free Software Foundation; either
+ # version 2.1 of the License, or (at your option) any later version.
+ #
+ # This library is distributed in the hope that it will be useful,
+ # but WITHOUT ANY WARRANTY; without even the implied warranty of
+ # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ # Lesser General Public License for more details.
+ #
+ # You should have received a copy of the GNU Lesser General Public
+ # License along with this library; if not, write to the Free Software
+ # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ ######################################################################
+
+ require_relative '../adapter'
+
+ # Local file based workflow
+ module OpenStudio
+   module Workflow
+     module Adapters
+       class Local < Adapter
+         def initialize(options={})
+           super
+         end
+
+         # Tell the system that the process has started
+         def communicate_started(directory, options = {})
+           # Watch out for namespace conflicts (::Time is okay but Time is OpenStudio::Time)
+           File.open("#{directory}/started.job", 'w') { |f| f << "Started Workflow #{::Time.now}" }
+         end
+
+         # Get the data point from the path
+         def get_datapoint(directory, options={})
+           defaults = {datapoint_filename: 'datapoint.json', format: 'json'}
+           options = defaults.merge(options)
+
+           # how do we log within this file?
+           if File.exist? "#{directory}/#{options[:datapoint_filename]}"
+             ::MultiJson.load(File.read("#{directory}/#{options[:datapoint_filename]}"), symbolize_names: true)
+           else
+             fail "Data point file does not exist for #{directory}/#{options[:datapoint_filename]}"
+           end
+         end
+
+         # Get the Problem/Analysis definition from the local file
+         # TODO: rename this to get_analysis_definition (or something like that)
+         def get_problem(directory, options = {})
+           defaults = {problem_filename: 'problem.json', format: 'json'}
+           options = defaults.merge(options)
+
+           if File.exist? "#{directory}/#{options[:problem_filename]}"
+             ::MultiJson.load(File.read("#{directory}/#{options[:problem_filename]}"), symbolize_names: true)
+           else
+             fail "Problem file does not exist for #{directory}/#{options[:problem_filename]}"
+           end
+         end
+
+         def communicate_intermediate_result(directory)
+           # noop
+         end
+
+         def communicate_complete(directory)
+           File.open("#{directory}/finished.job", 'w') { |f| f << "Finished Workflow #{::Time.now}" }
+         end
+
+         # Final state of the simulation. The os_directory is the run directory and may be needed to
+         # zip up the results of the simulation.
+         def communicate_failure(directory)
+           File.open("#{directory}/failed.job", 'w') { |f| f << "Failed Workflow #{::Time.now}" }
+           #@communicate_module.communicate_failure(@communicate_object, os_directory)
+         end
+
+         def communicate_results(directory, results)
+           if results.is_a? Hash
+             File.open("#{directory}/datapoint_out.json", 'w') { |f| f << JSON.pretty_generate(results) }
+           else
+             pp "Unknown datapoint result type. Please handle #{results.class}"
+             #data_point_json_path = OpenStudio::Path.new(run_dir) / OpenStudio::Path.new('data_point_out.json')
+             #os_data_point.saveJSON(data_point_json_path, true)
+           end
+         end
+
+         # TODO: can this be deprecated in favor of checking the class?
+         def communicate_results_json(eplus_json, analysis_dir)
+           # noop
+         end
+
+         def reload
+           # noop
+         end
+
+         # For the local adapter send back a handle to a file to append the data. For this adapter
+         # the log messages are likely to be the same as the run.log messages.
+         # TODO: do we really want two local logs from the Local adapter? One is in the run dir and the other is in the root
+         def get_logger(directory, options={})
+           @log ||= File.open("#{directory}/local_adapter.log", "w")
+           @log
+         end
+       end
+     end
+   end
+ end
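As a point of reference, the Local adapter above can be driven directly. A short sketch, assuming the multi_json gem is available and the run directory already holds the problem.json and datapoint.json files the defaults expect; the require path and directory are illustrative:

    require 'openstudio/workflow/adapters/local'  # assumed load path for the file above

    adapter   = OpenStudio::Workflow::Adapters::Local.new
    directory = '/home/user/a_directory'          # hypothetical run directory

    adapter.communicate_started directory          # writes started.job
    problem   = adapter.get_problem directory      # parses problem.json by default
    datapoint = adapter.get_datapoint directory    # parses datapoint.json by default

    # ... run the measures and EnergyPlus ...

    adapter.communicate_results directory, { status: 'ok' }  # writes datapoint_out.json
    adapter.communicate_complete directory                   # writes finished.job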
@@ -0,0 +1,259 @@
+ ######################################################################
+ # Copyright (c) 2008-2014, Alliance for Sustainable Energy.
+ # All rights reserved.
+ #
+ # This library is free software; you can redistribute it and/or
+ # modify it under the terms of the GNU Lesser General Public
+ # License as published by the Free Software Foundation; either
+ # version 2.1 of the License, or (at your option) any later version.
+ #
+ # This library is distributed in the hope that it will be useful,
+ # but WITHOUT ANY WARRANTY; without even the implied warranty of
+ # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ # Lesser General Public License for more details.
+ #
+ # You should have received a copy of the GNU Lesser General Public
+ # License along with this library; if not, write to the Free Software
+ # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ ######################################################################
+
+ require_relative '../adapter'
+
+ module OpenStudio
+   module Workflow
+     module Adapters
+       class MongoLog
+         def initialize(datapoint_model)
+           @dp = datapoint_model
+           @dp.sdp_log_file ||= []
+         end
+
+         def write(msg)
+           @dp.sdp_log_file << msg.gsub('\n', '')
+           @dp.save!
+         end
+       end
+
+       class Mongo < Adapter
+         attr_reader :datapoint
+
+         def initialize(options={})
+           super
+
+           require 'mongoid'
+           require 'mongoid_paperclip'
+           require 'delayed_job_mongoid'
+           base_path = @options[:mongoid_path] ? @options[:mongoid_path] : "#{File.dirname(__FILE__)}/mongo"
+
+           Dir["#{base_path}/models/*.rb"].each { |f| require f }
+           Mongoid.load!("#{base_path}/mongoid.yml", :development)
+
+           @datapoint = nil
+         end
+
+         # Tell the system that the process has started
+         def communicate_started(directory, options={})
+           # Watch out for namespace conflicts (::Time is okay but Time is OpenStudio::Time)
+           File.open("#{directory}/started.job", 'w') { |f| f << "Started Workflow #{::Time.now}" }
+
+           @datapoint ||= get_datapoint_model(options[:datapoint_id])
+           @datapoint.status = 'started'
+           @datapoint.status_message = ''
+           @datapoint.run_start_time = ::Time.now
+
+           # TODO: use a different method to determine if this is an amazon account
+           # TODO: use the ComputeNode model to pull out the information so that we can reuse the methods
+           # Determine what the IP address is of the worker node and save in the data point
+           require 'socket'
+           if Socket.gethostname =~ /os-.*/
+             # Maybe use this in the future: /sbin/ifconfig eth1|grep inet|head -1|sed 's/\:/ /'|awk '{print $3}'
+             # Must be on vagrant and just use the hostname to do a lookup
+             map = {
+               'os-server' => '192.168.33.10',
+               'os-worker-1' => '192.168.33.11',
+               'os-worker-2' => '192.168.33.12'
+             }
+             @datapoint.ip_address = map[Socket.gethostname]
+             @datapoint.internal_ip_address = @datapoint.ip_address
+
+             # TODO: add back in the instance id
+           elsif Socket.gethostname =~ /instance-data.ec2.internal.*/i
+             # On amazon, you have to hit an API to determine the IP address because
+             # of the internal/external ip addresses
+
+             # NL: add the suppress
+             public_ip_address = `curl -sL http://169.254.169.254/latest/meta-data/public-ipv4`
+             internal_ip_address = `curl -sL http://169.254.169.254/latest/meta-data/local-ipv4`
+             # instance_information = `curl -sL http://169.254.169.254/latest/meta-data/instance-id`
+             # instance_information = `curl -sL http://169.254.169.254/latest/meta-data/ami-id`
+             @datapoint.ip_address = public_ip_address
+             @datapoint.internal_ip_address = internal_ip_address
+             # @datapoint.server_information = instance_information
+           else
+             if Gem.loaded_specs["facter"]
+               # TODO: add hostname via the facter gem (and anything else?)
+               @datapoint.ip_address = Facter.fact(:ipaddress).value
+               @datapoint.internal_ip_address = Facter.fact(:hostname).value
+             end
+           end
+
+           @datapoint.save!
+         end
+
+         # Get the data point from the path
+         def get_datapoint(directory, options={})
+           # TODO: make this a conditional on when to create one vs when to error out.
+           # keep @datapoint as the model instance
+           @datapoint = DataPoint.find_or_create_by(uuid: options[:datapoint_id])
+
+           # convert to JSON for the workflow - and rearrange the version (fix THIS)
+           datapoint_hash = {}
+           unless @datapoint.nil?
+             datapoint_hash[:data_point] = @datapoint.as_document.to_hash
+             # TODO: Can i remove this openstudio_version stuff?
+             #datapoint_hash[:openstudio_version] = datapoint_hash[:openstudio_version]
+
+             # TODO: need to figure out how to get symbols from mongo.
+             datapoint_hash = MultiJson.load(MultiJson.dump(datapoint_hash, pretty: true), symbolize_keys: true)
+           else
+             fail "Could not find datapoint"
+           end
+
+           datapoint_hash
+         end
+
+         # TODO: cleanup these options. Make them part of the class. They are just unwieldy here.
+         def get_problem(directory, options = {})
+           defaults = {format: 'json'}
+           options = defaults.merge(options)
+
+           get_datapoint(directory, options) unless @datapoint
+
+           if @datapoint
+             analysis = @datapoint.analysis.as_document.to_hash
+           else
+             fail "Cannot retrieve problem because datapoint was nil"
+           end
+
+           analysis_hash = {}
+           if analysis
+             analysis_hash[:analysis] = analysis
+             analysis_hash[:openstudio_version] = analysis[:openstudio_version]
+
+             # TODO: need to figure out how to get symbols from mongo.
+             analysis_hash = MultiJson.load(MultiJson.dump(analysis_hash, pretty: true), symbolize_keys: true)
+           end
+           analysis_hash
+         end
+
+         def communicate_intermediate_result(directory)
+           # noop
+         end
+
+         def communicate_complete(directory)
+           @datapoint.run_end_time = ::Time.now
+           @datapoint.status = 'completed'
+           @datapoint.status_message = 'completed normal'
+           @datapoint.save!
+         end
+
+         # Final state of the simulation. The os_directory is the run directory and may be needed to
+         # zip up the results of the simulation.
+         def communicate_failure(directory)
+           # zip up the folder even on datapoint failures
+           if directory && File.exist?(directory)
+             zip_results(directory)
+           end
+
+           @datapoint.run_end_time = ::Time.now
+           @datapoint.status = 'completed'
+           @datapoint.status_message = 'datapoint failure'
+           @datapoint.save!
+         end
+
+         def communicate_results(directory, results)
+           zip_results(directory, 'workflow')
+
+           #@logger.info 'Saving EnergyPlus JSON file'
+           if results
+             @datapoint.results ? @datapoint.results.merge!(results) : @datapoint.results = results
+           end
+           result = @datapoint.save! # redundant because next method calls save too.
+           if result
+             #@logger.info 'Successfully saved result to database'
+           else
+             #@logger.error 'ERROR saving result to database'
+           end
+         end
+
+         # TODO: can this be deprecated in favor of checking the class?
+         def communicate_results_json(eplus_json, analysis_dir)
+           # noop
+         end
+
+         # TODO: not needed anymore i think...
+         def reload
+           # noop
+         end
+
+         # TODO: Implement the writing to the mongo_db for logging
+         def get_logger(directory, options={})
+           # get the datapoint object
+           get_datapoint(directory, options) unless @datapoint
+           @log = OpenStudio::Workflow::Adapters::MongoLog.new(@datapoint)
+           @log
+         end
+
+         private
+
+         def get_datapoint_model(uuid)
+           # TODO: make this a conditional on when to create one vs when to error out.
+           # keep @datapoint as the model instance
+           DataPoint.find_or_create_by(uuid: uuid)
+         end
+
+         # TODO: this uses a system call to zip results at the moment
+         def zip_results(analysis_dir, analysis_type = 'workflow')
+           eplus_search_path = nil
+           current_dir = Dir.pwd
+           FileUtils.mkdir_p "#{analysis_dir}/reports"
+           case analysis_type
+           when 'workflow'
+             eplus_search_path = "#{analysis_dir}/*run*/eplustbl.htm"
+           when 'runmanager'
+             eplus_search_path = "#{analysis_dir}/*EnergyPlus*/eplustbl.htm"
+           end
+
+           # copy some files into a report folder
+           eplus_html = Dir.glob(eplus_search_path).last || nil
+           if eplus_html
+             #@logger.info "Checking for HTML Report: #{eplus_html}"
+             if File.exist? eplus_html
+               # do some encoding on the html if possible
+               html = File.read(eplus_html)
+               html = html.force_encoding('ISO-8859-1').encode('utf-8', replace: nil)
+               File.open("#{analysis_dir}/reports/eplustbl.html", 'w') { |f| f << html }
+             end
+           end
+
+           # create zip file using a system call
+           #@logger.info "Zipping up Analysis Directory #{analysis_dir}"
+           if File.directory? analysis_dir
+             Dir.chdir(analysis_dir)
+             `zip -9 -r --exclude=*.rb* data_point_#{@datapoint.uuid}.zip .`
+           end
+
+           # zip up only the reports folder
+           report_dir = "#{analysis_dir}"
+           #@logger.info "Zipping up Analysis Reports Directory #{report_dir}/reports"
+           if File.directory? report_dir
+             Dir.chdir(report_dir)
+             `zip -r data_point_#{@datapoint.uuid}_reports.zip reports`
+           end
+           Dir.chdir(current_dir)
+         end
+       end
+     end
+   end
+ end
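The Mongo adapter above is exercised through the same sequence of calls as the Local adapter, but it resolves the data point by id and pushes status, results, and zipped artifacts into MongoDB. A rough sketch of that lifecycle, assuming a reachable database described by the bundled mongoid.yml; the require path and data point id are illustrative:

    require 'openstudio/workflow/adapters/mongo'  # assumed load path for the file above

    adapter   = OpenStudio::Workflow::Adapters::Mongo.new
    directory = '/home/user/a_directory'          # hypothetical run directory
    options   = { datapoint_id: 'some-datapoint-uuid' }

    adapter.communicate_started directory, options  # marks the data point 'started', records IPs
    problem   = adapter.get_problem directory, options
    datapoint = adapter.get_datapoint directory, options

    begin
      # ... run the workflow and collect results into a Hash ...
      adapter.communicate_results directory, { energyplus: 'ok' }  # zips the run dir, saves results
      adapter.communicate_complete directory                       # marks the data point 'completed'
    rescue StandardError
      adapter.communicate_failure directory                        # zips what exists, flags the failure
    end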