openstudio-workflow 0.0.1 → 0.0.2

Files changed (24)
  1. checksums.yaml +4 -4
  2. data/CHANGELOG.md +9 -0
  3. data/README.md +39 -2
  4. data/Rakefile +12 -1
  5. data/lib/openstudio-workflow.rb +31 -4
  6. data/lib/openstudio/workflow/adapter.rb +8 -9
  7. data/lib/openstudio/workflow/adapters/local.rb +35 -22
  8. data/lib/openstudio/workflow/adapters/mongo.rb +82 -92
  9. data/lib/openstudio/workflow/jobs/lib/apply_measures.rb +229 -0
  10. data/lib/openstudio/workflow/jobs/run_energyplus/run_energyplus.rb +7 -10
  11. data/lib/openstudio/workflow/jobs/run_openstudio/run_openstudio.rb +37 -159
  12. data/lib/openstudio/workflow/jobs/run_postprocess/run_postprocess.rb +53 -492
  13. data/lib/openstudio/workflow/jobs/run_preflight/run_preflight.rb +1 -5
  14. data/lib/openstudio/workflow/jobs/{run_postprocess → run_reporting_measures}/packaged_measures/README.md +0 -0
  15. data/lib/openstudio/workflow/jobs/{run_postprocess → run_reporting_measures}/packaged_measures/StandardReports/measure.rb +81 -87
  16. data/lib/openstudio/workflow/jobs/{run_postprocess → run_reporting_measures}/packaged_measures/StandardReports/measure.xml +1 -1
  17. data/lib/openstudio/workflow/jobs/{run_postprocess/packaged_measures/StandardReports/resources/report.html.in → run_reporting_measures/packaged_measures/StandardReports/resources/report.html.erb} +0 -0
  18. data/lib/openstudio/workflow/jobs/run_reporting_measures/run_reporting_measures.rb +548 -0
  19. data/lib/openstudio/workflow/jobs/run_runmanager/run_runmanager.rb +226 -0
  20. data/lib/openstudio/workflow/jobs/run_xml/run_xml.rb +39 -41
  21. data/lib/openstudio/workflow/multi_delegator.rb +6 -6
  22. data/lib/openstudio/workflow/run.rb +95 -39
  23. data/lib/openstudio/workflow/version.rb +1 -1
  24. metadata +9 -6
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA1:
-  metadata.gz: cc2c789cb6f241ce8a9693f4fd52c2cea857a6ab
-  data.tar.gz: 8ead2192e95b45850c00a0b0e048d326eef05ead
+  metadata.gz: 8246584121ca33fc5d5b10e34ff852d0884d6de6
+  data.tar.gz: 979bcb2d5e8445a6cf69cf1e948d091ae1a0cf61
 SHA512:
-  metadata.gz: 31c526b09291097df08aef677a51e1dd46bfb64710c0a3a0fef1b6e2a8f9fc2ab0d23973afcdd651dedfe7f17f77266fcb6ef7d5df67a80d39addb590ef9b358
-  data.tar.gz: acaee1bf5ae39b1f4be3f67b4d292dfe790f0415e1f1aa04552bdae09ad1003ad0b44c45939a4b5f0f4ed76ff76ec1522871a61dd2fd5c508c8848b88685cbec
+  metadata.gz: 843a1e9af4d17e54b28d30d83fe9020502c29ac71e350f3e1625694feea3120a97826dd8ae8c383e3f4bf75ab91ada7b2c67de3d737ed339bdbd099f4ba1517f
+  data.tar.gz: a9463ff16b20e54881097c2fae31ad82f9ed9a1d3398a7fc0987640abaa824fba58923bfd4783c52572d91ca673c967081c0d735e281e7657d30bd193b373773
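The before/after values above are plain SHA1 and SHA512 digests of the gem's `metadata.gz` and `data.tar.gz`. To verify a downloaded gem against `checksums.yaml`, the digests can be recomputed with the Ruby stdlib (a minimal sketch; the `gem_checksums` helper name and the file path are illustrative, not part of the gem):

```ruby
require 'digest'

# Recompute the digests that checksums.yaml records for a file inside a gem.
# `path` is a hypothetical local path to e.g. an extracted metadata.gz.
def gem_checksums(path)
  data = File.binread(path)
  {
    sha1: Digest::SHA1.hexdigest(data),    # compare against the SHA1: section
    sha512: Digest::SHA512.hexdigest(data) # compare against the SHA512: section
  }
end
```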
data/CHANGELOG.md CHANGED
@@ -4,6 +4,15 @@ OpenStudio::Workflow Change Log
 Unreleased
 --------------
 
+Version 0.0.2
+--------------
+
+* Support reporting measures
+* Reduce logging messages
+* Keep IDF files
+* Remove mtr and eso files after simulation completes
+* If a measure changes the weather file, then use the new weather file with the analysis.json weather path
+
 Version 0.0.1
 --------------
 
data/README.md CHANGED
@@ -60,7 +60,7 @@ The workflow manager can also use MongoDB to receive instructions on the workflo
 * Read the analysis.json file to determine the states that are going to run instead of (or in addition to) passing them into the constructor
 * Implement better error handling with custom exception classes
 * Implement a different measure directory, seed model directory, and weather file directory option
-* Dynamically add other "states" to the workflow
+* ~~Dynamically add other "states" to the workflow~~
 * Create and change into a unique directory when running measures
 * ~~Implement Error State~~
 * ~~Implement MongoDB Adapter~~
@@ -68,7 +68,6 @@ The workflow manager can also use MongoDB to receive instructions on the workflo
 * Add a results adapter to return a string as the last call based on the source of the call (e.g. R, command line, C++, etc.)
 * Implement a logger in the Adapters; right now they are unable to log
 * Hook up the measure-groups-based workflows
-* More tests!
 * ~~Add xml workflow item~~
 
 ## Testing and Development
@@ -95,3 +94,41 @@ Run `rspec` or `rake` to execute the tests.
 
 ## Development
 
+If you are testing changes to OpenStudio source code and want to test them on the Vagrant machine, you can use the source configuration. This creates a new virtual machine that can be accessed by appending the name 'source' to standard Vagrant commands. To set up this machine, build a custom version of OpenStudio, and install that version for testing, follow these steps:
+
+* vagrant up source
+* vagrant ssh source
+* sudo apt-get install dpkg-dev git cmake-curses-gui qt5-default libqt5webkit5-dev libboost1.55-all-dev swig ruby2.0 libssl-dev libxt-dev doxygen graphviz
+* sudo ln -s /usr/lib/x86_64-linux-gnu/libruby-2.0.so.2.0.0 /usr/lib/x86_64-linux-gnu/libruby.so.2.0
+* Install clang (OPTIONAL):
+  * wget -O - http://llvm.org/apt/llvm-snapshot.gpg.key | sudo apt-key add -
+  * sudo apt-add-repository 'deb http://llvm.org/apt/trusty/ llvm-toolchain-trusty-3.5 main'
+  * sudo apt-get update
+  * sudo apt-get install clang-3.5
+  * echo 'export CC=/usr/bin/clang-3.5' >> ~/.bashrc
+  * echo 'export CXX=/usr/bin/clang++-3.5' >> ~/.bashrc
+  * source ~/.bashrc
+* cd /home/vagrant
+* git clone https://github.com/NREL/OpenStudio.git openstudio
+* cd openstudio
+* git checkout your_branch_name
+* mkdir build
+* cd build
+* cmake .. -DBUILD_PACKAGE=TRUE -DCMAKE_INSTALL_PREFIX=$OPENSTUDIO_ROOT -DRUBY_EXECUTABLE=/usr/local/rbenv/versions/2.0.0-p481/bin/ruby
+* make -j4
+
+To install, do either:
+* cd OpenStudioCore-prefix/src/OpenStudioCore-build/
+* sudo make install
+
+or:
+* make package
+* sudo ./OpenStudio-1.5.1.02b7131b4c-Linux.sh --prefix=/usr/local --exclude-subdir --skip-license
+
+Then set the environment so the custom build is found:
+* export RUBYLIB=/usr/local/Ruby
+* export LD_LIBRARY_PATH=/usr/local/lib
+
+You can verify that you are using your build by comparing the output of these two commands:
+* ruby -e "require 'openstudio'" -e "puts OpenStudio::openStudioLongVersion"
+* git rev-parse --short HEAD
data/Rakefile CHANGED
@@ -12,9 +12,20 @@ require 'rake'
 require 'rspec/core'
 require 'rspec/core/rake_task'
 
-#require 'rubocop/rake_task'
+# require 'rubocop/rake_task'
 RSpec::Core::RakeTask.new(:spec) do |spec|
+  spec.rspec_opts = %w(--format progress --format CI::Reporter::RSpec)
   spec.pattern = FileList['spec/**/*_spec.rb']
 end
 
+require 'rubocop/rake_task'
+desc 'Run RuboCop on the lib directory'
+RuboCop::RakeTask.new(:rubocop) do |task|
+  task.options = ['--no-color', '--out=rubocop-results.xml']
+  task.formatters = ['RuboCop::Formatter::CheckstyleFormatter']
+  task.requires = ['rubocop/formatter/checkstyle_formatter']
+  # don't abort rake on failure
+  task.fail_on_error = false
+end
+
 task default: [:spec]
data/lib/openstudio-workflow.rb CHANGED
@@ -23,6 +23,7 @@ require 'multi_json'
 require 'colored'
 require 'fileutils'
 require 'json' # needed for a single pretty generate call
+require 'pathname'
 
 begin
   require 'facter'
@@ -33,6 +34,7 @@ end
 require 'openstudio/workflow/version'
 require 'openstudio/workflow/multi_delegator'
 require 'openstudio/workflow/run'
+require 'openstudio/workflow/jobs/lib/apply_measures'
 
 begin
   require 'openstudio'
@@ -42,14 +44,39 @@ rescue LoadError => e
   puts 'OpenStudio did not load, but most functionality is still available. Will try to continue...'.red
 end
 
+# some core extensions
+class String
+  def snake_case
+    gsub(/::/, '/')
+      .gsub(/([A-Z]+)([A-Z][a-z])/, '\1_\2')
+      .gsub(/([a-z\d])([A-Z])/, '\1_\2')
+      .tr(' -', '__')
+      .downcase
+  end
+end
+
 module OpenStudio
   module Workflow
     extend self
 
     # Create a new workflow instance using the defined adapter and UUID
-    def load(adapter_name, run_directory, options={})
-      defaults = {adapter_options: {}}
+    def load(adapter_name, run_directory, options = {})
+      defaults = { adapter_options: {} }
       options = defaults.merge(options)
+
+      # Convert various paths to absolute paths
+      if options[:adapter_options] && options[:adapter_options][:mongoid_path] &&
+         (Pathname.new options[:adapter_options][:mongoid_path]).absolute? == false
+        options[:adapter_options][:mongoid_path] = File.expand_path options[:adapter_options][:mongoid_path]
+      end
+      if options[:analysis_root_path] &&
+         (Pathname.new options[:analysis_root_path]).absolute? == false
+        options[:analysis_root_path] = File.expand_path options[:analysis_root_path]
+      end
+      unless (Pathname.new run_directory).absolute?
+        # relative to wherever you are running the script
+        run_directory = File.expand_path run_directory
+      end
       adapter = load_adapter adapter_name, options[:adapter_options]
       run_klass = OpenStudio::Workflow::Run.new(adapter, run_directory, options)
       # return the run class
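The `String#snake_case` core extension introduced in this file converts class-style names into require-style paths. A standalone copy of the same regex pipeline, runnable outside the gem for illustration:

```ruby
# Standalone copy of the String#snake_case extension added in
# lib/openstudio-workflow.rb, reproduced here for illustration.
class String
  def snake_case
    gsub(/::/, '/')                          # namespace separators become path separators
      .gsub(/([A-Z]+)([A-Z][a-z])/, '\1_\2') # split acronym runs: "XMLFile" -> "XML_File"
      .gsub(/([a-z\d])([A-Z])/, '\1_\2')     # split camelCase boundaries
      .tr(' -', '__')                        # spaces and dashes become underscores
      .downcase
  end
end

puts 'OpenStudio::Workflow'.snake_case # => open_studio/workflow
puts 'RunEnergyplus'.snake_case        # => run_energyplus
```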
@@ -58,10 +85,10 @@ module OpenStudio
 
     private
 
-    def load_adapter(name, adapter_options={})
+    def load_adapter(name, adapter_options = {})
       require "openstudio/workflow/adapters/#{name.downcase}"
       klass_name = name.to_s.split('_').map(&:capitalize) * ''
-      #pp "#{klass_name} is the adapter class name"
+      # pp "#{klass_name} is the adapter class name"
       klass = OpenStudio::Workflow::Adapters.const_get(klass_name).new(adapter_options)
       klass
     end
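The changes to `load` above expand any relative `mongoid_path`, `analysis_root_path`, or `run_directory` to an absolute path before the run class is constructed. The core idiom, sketched standalone (the `ensure_absolute` helper name is illustrative, not part of the gem):

```ruby
require 'pathname'

# Expand a path to absolute form only when it is relative, mirroring the
# Pathname#absolute? checks added to OpenStudio::Workflow.load.
def ensure_absolute(path)
  return path if Pathname.new(path).absolute?
  File.expand_path(path) # resolved relative to wherever the script is run
end

puts ensure_absolute('run/dir')  # expands against Dir.pwd
puts ensure_absolute('/tmp/run') # already absolute, returned unchanged
```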
data/lib/openstudio/workflow/adapter.rb CHANGED
@@ -21,26 +21,25 @@
 module OpenStudio
   module Workflow
     class Adapter
-
       attr_accessor :options
 
-      def initialize(options={})
+      def initialize(options = {})
         @options = options
         @log = nil
       end
 
-      #class << self
-      #attr_reader :problem
+      # class << self
+      # attr_reader :problem
 
-      def load(filename, options={})
+      def load(filename, options = {})
         instance.load(filename, options)
       end
 
-      def communicate_started(id, options = {})
+      def communicate_started(id, _options = {})
        instance.communicate_started id
       end
 
-      def get_datapoint(id, options={})
+      def get_datapoint(id, options = {})
         instance.get_datapoint id, options
       end
 
@@ -60,9 +59,9 @@ module OpenStudio
         instance.communicate_failure id
       end
 
-      def get_logger(file, options={})
+      def get_logger(file, options = {})
         instance.get_logger file, options
       end
     end
   end
-end
+end
data/lib/openstudio/workflow/adapters/local.rb CHANGED
@@ -24,20 +24,19 @@ module OpenStudio
 module Workflow
   module Adapters
     class Local < Adapter
-      def initialize(options={})
-
+      def initialize(options = {})
         super
       end
 
       # Tell the system that the process has started
-      def communicate_started(directory, options = {})
+      def communicate_started(directory, _options = {})
         # Watch out for namespace conflicts (::Time is okay but Time is OpenStudio::Time)
         File.open("#{directory}/started.job", 'w') { |f| f << "Started Workflow #{::Time.now}" }
       end
 
       # Get the data point from the path
-      def get_datapoint(directory, options={})
-        defaults = {datapoint_filename: 'datapoint.json', format: 'json'}
+      def get_datapoint(directory, options = {})
+        defaults = { datapoint_filename: 'datapoint.json', format: 'json' }
         options = defaults.merge(options)
 
         # how do we log within this file?
@@ -51,7 +50,7 @@ module OpenStudio
       # Get the Problem/Analysis definition from the local file
       # TODO: rename this to get_analysis_definition (or something like that)
       def get_problem(directory, options = {})
-        defaults = {problem_filename: 'problem.json', format: 'json'}
+        defaults = { problem_filename: 'problem.json', format: 'json' }
         options = defaults.merge(options)
 
         if File.exist? "#{directory}/#{options[:problem_filename]}"
@@ -61,7 +60,7 @@ module OpenStudio
         end
       end
 
-      def communicate_intermediate_result(directory)
+      def communicate_intermediate_result(_directory)
         # noop
       end
 
@@ -73,37 +72,51 @@ module OpenStudio
       # zip up the results of the simulation.
       def communicate_failure(directory)
         File.open("#{directory}/failed.job", 'w') { |f| f << "Failed Workflow #{::Time.now}" }
-        #@communicate_module.communicate_failure(@communicate_object, os_directory)
+        # @communicate_module.communicate_failure(@communicate_object, os_directory)
       end
 
       def communicate_results(directory, results)
+        zip_results(directory, 'workflow')
+
         if results.is_a? Hash
           File.open("#{directory}/datapoint_out.json", 'w') { |f| f << JSON.pretty_generate(results) }
         else
           pp "Unknown datapoint result type. Please handle #{results.class}"
-          #data_point_json_path = OpenStudio::Path.new(run_dir) / OpenStudio::Path.new('data_point_out.json')
-          #os_data_point.saveJSON(data_point_json_path, true)
+          # data_point_json_path = OpenStudio::Path.new(run_dir) / OpenStudio::Path.new('data_point_out.json')
+          # os_data_point.saveJSON(data_point_json_path, true)
         end
-        #end
-      end
-
-      # TODO: can this be deprecated in favor a checking the class?
-      def communicate_results_json(eplus_json, analysis_dir)
-        # noop
-      end
-
-      def reload
-        # noop
+        # end
       end
 
       # For the local adapter send back a handle to a file to append the data. For this adapter
       # the log messages are likely to be the same as the run.log messages.
       # TODO: do we really want two local logs from the Local adapter? One is in the run dir and the other is in the root
-      def get_logger(directory, options={})
-        @log ||= File.open("#{directory}/local_adapter.log", "w")
+      def get_logger(directory, _options = {})
+        @log ||= File.open("#{directory}/local_adapter.log", 'w')
         @log
       end
 
+      # TODO: this uses a system call to zip results at the moment
+      def zip_results(directory, _analysis_type = 'workflow')
+        current_dir = Dir.pwd
+        begin
+          # create zip file using a system call
+          # @logger.info "Zipping up data point #{analysis_dir}"
+          if File.directory? directory
+            Dir.chdir(directory)
+            `zip -9 -r --exclude=*.rb* data_point.zip .`
+          end
+
+          # zip up only the reports folder
+          report_dir = 'reports'
+          # @logger.info "Zipping up Analysis Reports Directory #{report_dir}/reports"
+          if File.directory? report_dir
+            `zip -9 -r data_point_reports.zip reports`
+          end
+        ensure
+          Dir.chdir(current_dir)
+        end
+      end
     end
   end
 end
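The new `zip_results` changes into the data point directory and shells out to `zip`; the `begin`/`ensure` guard restores the original working directory even when the zipping step raises. The pattern in isolation (the zip call is replaced by a block here, since the guard is the point; `in_directory` is an illustrative helper, not part of the gem):

```ruby
require 'tmpdir'

# Run a block inside `directory`, always restoring the previous working
# directory afterwards -- the same guard zip_results relies on.
def in_directory(directory)
  current_dir = Dir.pwd
  begin
    Dir.chdir(directory)
    yield
  ensure
    Dir.chdir(current_dir)
  end
end

Dir.mktmpdir do |dir|
  before = Dir.pwd
  begin
    in_directory(dir) { raise 'zip failed' } # simulate a failing system call
  rescue RuntimeError
    # even after the failure, we are back where we started
  end
  puts Dir.pwd == before # => true
end
```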
data/lib/openstudio/workflow/adapters/mongo.rb CHANGED
@@ -37,7 +37,7 @@ module OpenStudio
     class Mongo < Adapter
       attr_reader :datapoint
 
-      def initialize(options={})
+      def initialize(options = {})
         super
 
         require 'mongoid'
@@ -52,7 +52,7 @@ module OpenStudio
       end
 
       # Tell the system that the process has started
-      def communicate_started(directory, options={})
+      def communicate_started(directory, options = {})
         # Watch out for namespace conflicts (::Time is okay but Time is OpenStudio::Time)
         File.open("#{directory}/started.job", 'w') { |f| f << "Started Workflow #{::Time.now}" }
 
@@ -61,40 +61,52 @@ module OpenStudio
       @datapoint.status_message = ''
       @datapoint.run_start_time = ::Time.now
 
-
-      # TODO: use a different method to determine if this is an amazon account
+      # TODO: Get Facter to play well on windows and replace 'socket'
       # TODO: use the ComputeNode model to pull out the information so that we can reuse the methods
       # Determine what the IP address is of the worker node and save in the data point
-      require 'socket'
-      if Socket.gethostname =~ /os-.*/
-        # Maybe use this in the future: /sbin/ifconfig eth1|grep inet|head -1|sed 's/\:/ /'|awk '{print $3}'
-        # Must be on vagrant and just use the hostname to do a lookup
-        map = {
+
+      retries = 0
+      begin
+        require 'socket'
+        if Socket.gethostname =~ /os-.*/
+          # Maybe use this in the future: /sbin/ifconfig eth1|grep inet|head -1|sed 's/\:/ /'|awk '{print $3}'
+          # Must be on vagrant and just use the hostname to do a lookup
+          map = {
             'os-server' => '192.168.33.10',
            'os-worker-1' => '192.168.33.11',
            'os-worker-2' => '192.168.33.12'
-        }
-        @datapoint.ip_address = map[Socket.gethostname]
-        @datapoint.internal_ip_address = @datapoint.ip_address
-
-      # TODO: add back in the instance id
-      elsif Socket.gethostname =~ /instance-data.ec2.internal.*/i
-        # On amazon, you have to hit an API to determine the IP address because
-        # of the internal/external ip addresses
-
-        # NL: add the suppress
-        public_ip_address = `curl -sL http://169.254.169.254/latest/meta-data/public-ipv4`
-        internal_ip_address = `curl -sL http://169.254.169.254/latest/meta-data/local-ipv4`
-        # instance_information = `curl -sL http://169.254.169.254/latest/meta-data/instance-id`
-        # instance_information = `curl -sL http://169.254.169.254/latest/meta-data/ami-id`
-        @datapoint.ip_address = public_ip_address
-        @datapoint.internal_ip_address = internal_ip_address
-        # @datapoint.server_information = instance_information
-      else
-        if Gem.loaded_specs["facter"]
-          # TODO: add hostname via the facter gem (and anything else?)
-          @datapoint.ip_address = Facter.fact(:ipaddress).value
-          @datapoint.internal_ip_address = Facter.fact(:hostname).value
+          }
+          @datapoint.ip_address = map[Socket.gethostname]
+          @datapoint.internal_ip_address = @datapoint.ip_address
+        else
+          if Gem.loaded_specs['facter']
+            # Check if we are on amazon
+            if Facter.fact(:ec2_metadata)
+              # must be on amazon
+              m = Facter.fact(:ec2_metadata).value
+
+              @datapoint.ip_address = m['public-ipv4'] ? m['public-ipv4'] : 'unknown'
+              @datapoint.internal_ip_address = m['local-ipv4'] ? m['local-ipv4'] : 'unknown'
+            else
+              @datapoint.ip_address = Facter.fact(:ipaddress).value
+              @datapoint.internal_ip_address = Facter.fact(:ipaddress).value
+            end
+          end
+        end
+      rescue => e
+        # catch any exceptions. It appears that if a new instance of amazon starts, then it is likely that
+        # the Facter for AWS may not be initialized yet. Retry after waiting for 15 seconds if this happens.
+        # If this fails out, then the only issue with this is that the data point won't be downloaded because
+        # the worker node is not known
+
+        # retry just in case
+        if retries < 30 # try for up to 5 minutes
+          retries += 1
+          sleep 10
+          retry
+        else
+          raise "could not find Facter based data for worker node after #{retries} retries with message #{e.message}"
+          # just do nothing for now
        end
       end
 
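The `rescue`/`retry` loop added above is a generic bounded-retry idiom: count attempts, `retry` while under the limit, then re-raise with context. A standalone sketch (the `with_retries` helper, its `limit:`/`delay:` parameters, and the demo values are illustrative; the adapter hard-codes 10-second sleeps and 30 attempts):

```ruby
# Bounded retry: re-run the block until it succeeds or the attempt
# budget is exhausted, then re-raise with context.
def with_retries(limit:, delay: 0)
  retries = 0
  begin
    yield
  rescue => e
    if retries < limit
      retries += 1
      sleep delay
      retry
    else
      raise "giving up after #{retries} retries: #{e.message}"
    end
  end
end

calls = 0
value = with_retries(limit: 5) do
  calls += 1
  raise 'not ready' if calls < 3 # fails twice, succeeds on the third call
  :ready
end
puts value # => ready
puts calls # => 3
```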
@@ -102,22 +114,27 @@ module OpenStudio
       end
 
       # Get the data point from the path
-      def get_datapoint(directory, options={})
+      def get_datapoint(directory, options = {})
         # TODO: make this a conditional on when to create one vs when to error out.
         # keep @datapoint as the model instance
         @datapoint = DataPoint.find_or_create_by(uuid: options[:datapoint_id])
 
         # convert to JSON for the workflow - and rearrange the version (fix THIS)
         datapoint_hash = {}
-        unless @datapoint.nil?
+        if @datapoint.nil?
+          fail 'Could not find datapoint'
+        else
           datapoint_hash[:data_point] = @datapoint.as_document.to_hash
           # TODO: Can i remove this openstudio_version stuff?
-          #datapoint_hash[:openstudio_version] = datapoint_hash[:openstudio_version]
+          # datapoint_hash[:openstudio_version] = datapoint_hash[:openstudio_version]
 
           # TODO: need to figure out how to get symbols from mongo.
-          datapoint_hash = MultiJson.load(MultiJson.dump(datapoint_hash, pretty: true), symbolize_keys: true)
-        else
-          fail "Could not find datapoint"
+          datapoint_hash = MultiJson.load(MultiJson.dump(datapoint_hash), symbolize_keys: true)
+
+          # save to disk for inspection
+          save_dp = File.join(directory, 'data_point.json')
+          FileUtils.rm_f save_dp if File.exist? save_dp
+          File.open(save_dp, 'w') { |f| f << MultiJson.dump(datapoint_hash, pretty: true) }
         end
 
         datapoint_hash
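The dump-then-load round trip above is a common trick for converting a nested string-keyed hash (as returned by Mongo documents) into symbol keys. The same idea with the stdlib `json` for illustration (the adapter itself uses MultiJson, whose `symbolize_keys:` option is equivalent; the `symbolize` helper and sample data below are illustrative):

```ruby
require 'json'

# Convert all string keys in a nested hash to symbols via a JSON round trip.
def symbolize(hash)
  JSON.parse(JSON.generate(hash), symbolize_names: true)
end

doc = { 'data_point' => { 'uuid' => 'abc-123', 'results' => { 'eui' => 42.0 } } }
p symbolize(doc) # => {:data_point=>{:uuid=>"abc-123", :results=>{:eui=>42.0}}}
```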
@@ -125,7 +142,7 @@ module OpenStudio
 
       # TODO: cleanup these options. Make them part of the class. They are just unwieldy here.
       def get_problem(directory, options = {})
-        defaults = {format: 'json'}
+        defaults = { format: 'json' }
         options = defaults.merge(options)
 
         get_datapoint(directory, options) unless @datapoint
@@ -133,7 +150,7 @@ module OpenStudio
         if @datapoint
           analysis = @datapoint.analysis.as_document.to_hash
         else
-          fail "Cannot retrieve problem because datapoint was nil"
+          fail 'Cannot retrieve problem because datapoint was nil'
         end
 
         analysis_hash = {}
@@ -147,11 +164,11 @@ module OpenStudio
         analysis_hash
       end
 
-      def communicate_intermediate_result(directory)
+      def communicate_intermediate_result(_directory)
         # noop
       end
 
-      def communicate_complete(directory)
+      def communicate_complete(_directory)
         @datapoint.run_end_time = ::Time.now
         @datapoint.status = 'completed'
         @datapoint.status_message = 'completed normal'
@@ -175,33 +192,25 @@ module OpenStudio
       def communicate_results(directory, results)
         zip_results(directory, 'workflow')
 
-        #@logger.info 'Saving EnergyPlus JSON file'
+        # @logger.info 'Saving EnergyPlus JSON file'
         if results
-          @datapoint.results ? @datapoint.results.merge!(eplus_json) : @datapoint.results = results
+          @datapoint.results ? @datapoint.results.merge!(results) : @datapoint.results = results
         end
         result = @datapoint.save! # redundant because next method calls save too.
+
         if result
-          #@logger.info 'Successfully saved result to database'
+          # @logger.info 'Successfully saved result to database'
         else
-          #@logger.error 'ERROR saving result to database'
+          # @logger.error 'ERROR saving result to database'
         end
       end
 
-      # TODO: can this be deprecated in favor a checking the class?
-      def communicate_results_json(eplus_json, analysis_dir)
-        # noop
-      end
-
-      # TODO: not needed anymore i think...
-      def reload
-        # noop
-      end
-
       # TODO: Implement the writing to the mongo_db for logging
-      def get_logger(directory, options={})
+      def get_logger(directory, options = {})
         # get the datapoint object
         get_datapoint(directory, options) unless @datapoint
         @log = OpenStudio::Workflow::Adapters::MongoLog.new(@datapoint)
+
         @log
       end
 
@@ -213,47 +222,28 @@ module OpenStudio
         DataPoint.find_or_create_by(uuid: uuid)
       end
 
-      # TODO: this uses a system call to zip results at the moment
-      def zip_results(analysis_dir, analysis_type = 'workflow')
-        eplus_search_path = nil
+      # TODO: this uses a system call to zip results at the moment, replace with rubylib
+      def zip_results(directory, _analysis_type = 'workflow')
         current_dir = Dir.pwd
-        FileUtils.mkdir_p "#{analysis_dir}/reports"
-        case analysis_type
-        when 'workflow'
-          eplus_search_path = "#{analysis_dir}/*run*/eplustbl.htm"
-        when 'runmanager'
-          eplus_search_path = "#{analysis_dir}/*EnergyPlus*/eplustbl.htm"
-        end
-
-        # copy some files into a report folder
-        eplus_html = Dir.glob(eplus_search_path).last || nil
-        if eplus_html
-          #@logger.info "Checking for HTML Report: #{eplus_html}"
-          if File.exist? eplus_html
-            # do some encoding on the html if possible
-            html = File.read(eplus_html)
-            html = html.force_encoding('ISO-8859-1').encode('utf-8', replace: nil)
-            File.open("#{analysis_dir}/reports/eplustbl.html", 'w') { |f| f << html }
+        begin
+          # create zip file using a system call
+          # @logger.info "Zipping up data point #{analysis_dir}"
+          if File.directory? directory
+            Dir.chdir(directory)
+            `zip -9 -r --exclude=*.rb* data_point_#{@datapoint.uuid}.zip .`
           end
-        end
 
-        # create zip file using a system call
-        #@logger.info "Zipping up Analysis Directory #{analysis_dir}"
-        if File.directory? analysis_dir
-          Dir.chdir(analysis_dir)
-          `zip -9 -r --exclude=*.rb* data_point_#{@datapoint.uuid}.zip .`
-        end
-
-        # zip up only the reports folder
-        report_dir = "#{analysis_dir}"
-        #@logger.info "Zipping up Analysis Reports Directory #{report_dir}/reports"
-        if File.directory? report_dir
-          Dir.chdir(report_dir)
-          `zip -r data_point_#{@datapoint.uuid}_reports.zip reports`
+          # zip up only the reports folder
+          report_dir = 'reports'
+          # @logger.info "Zipping up Analysis Reports Directory #{report_dir}/reports"
+          if File.directory? report_dir
+            `zip -9 -r data_point_#{@datapoint.uuid}_reports.zip reports`
+          end
+        ensure
+          Dir.chdir(current_dir)
         end
-        Dir.chdir(current_dir)
       end
     end
   end
 end
-end
+end