eco-helpers 2.0.41 → 2.0.45

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 3f9f409fadb3cd263964b4254d35ea3b32cd97ff21eaaef4cd96bb0bafb61b83
-  data.tar.gz: 411187eb287a25f73e75d08fd7ccc6984e3ece935786fd161f3998cfb53a0670
+  metadata.gz: f85bd156250d1f44ca911d98007005cbbb2281ff4a40a2fc07a80680fd42a408
+  data.tar.gz: 4ccda90fbf7d1bbfc814d18cd033110b7af2ae366687af1fa2db10370788c52d
 SHA512:
-  metadata.gz: 7ef199c58c422283f767f90319e6768265a2d38b33e4b5be73ad6b3c77923949056e5263e2130b650307d08deb78d4e0e1da078fcfed70936cb294aa8bb1326f
-  data.tar.gz: aee328dc5617c180ca6efc8e7399a205755b4abb6ff0b8546f19462e2d3a2f095bcc7be6d3b43eb2303b99710fd3bcc881c4421560a37d91b4b0a743bc6e3db6
+  metadata.gz: f4b7add6e04fb8b4dfa0c56371bc3f85e791659754e937f60586b88dec06186723f9286c87482144bd9b012298dbc03251388c49f1245008026a186cd6d16a88
+  data.tar.gz: 7a0bfda7154a8c0e7bb7e18100a96cd2a45f26a6d45309725e7d796dd9d0ad48cefb35be551c031706757d1241fab7505d74f401cf6a04764547bbe8c582e076
data/CHANGELOG.md CHANGED
@@ -1,10 +1,74 @@
 # Change Log
 All notable changes to this project will be documented in this file.
 
-## [2.0.41] - 2021-09-xx
+## [2.0.45] - 2021-11-xx
 
 ### Added
+- `Eco::API::UseCases::OozeSamples::RegisterExportCase`
+
 ### Changed
+- Upgrade `ecoportal-api` dependency
+- Upgrade `ecoportal-api-v2` dependency
+
+### Fixed
+
+## [2.0.44] - 2021-11-25
+
+### Changed
+- Upgrade `ecoportal-api` dependency
+
+## [2.0.43] - 2021-11-25
+
+### Added
+- `Eco::API::Session::Batch::Job` added better **logging**
+
+### Changed
+- `Eco::API::Session::Batch` added **resilience** and **recovery** to connection errors
+- `Eco::API::Policies::DefaultPolicies::UserAccess` changed logging from `warn` to `info`.
+
+## [2.0.42] - 2021-10-30
+
+### Added
+- `eco/api/organization/presets_values.json` added abilities:
+  - `visitor_management`
+  - `broadcast_notifications`
+  - `cross_register_reporting`
+- Due to dev pipeline, the `ecoportal-api` version `0.8.4` hasn't been released
+  - This part of the current `eco-helpers` version depends on that gem version (to be upgraded later)
+- Changed the `workflow` in
+  - `before(:post_launch, :usecases)` => post launch cases are not prevented for **partial** updates
+    - rectified message
+  - `run(:post_launch, :usecases)` => missing **people** error will log in `debug` level
+    - it won't break the script anymore (just prevent a post use case to be run)
+- New use case `Eco::API::UseCases::DefaultCases::EntriesToCsv`: to export input entries as `csv`
+  - invokable via `-entries-to-csv`
+- More errors:
+  - `Eco::API::Error::ExternalIdTaken`
+  - `Eco::API::Error::InternalServerError`
+- **Added** option `processed-people-to-csv file.csv` to export a `csv` with the results of processing people
+  - This option allows to preview the final data when launching a `dry-run` (`-simulate`)
+- **Added** ooze case sample `Eco::API::UseCases::OozeSamples::TargetOozesUpdateCase`
+  - Allows to retrieve the target entries based on a `csv`
+
+### Changed
+- Moved methods of `#person_ref`, `#get_attr` and `#get_row` as **class methods** of `Eco::API::Session::Batch::Feedback`
+  - `Eco::API::Session::Errors` methods above to use them via `Feedback` class
+- `Eco::API::Session::Errors::ErrorCache`: **added** new property `response`
+  - `#by_type` added parameter `only_entries` to specify the output type
+  - `#errors` capture the `response` in the generated `ErrorCache` object
+
+### Fixed
+- `Eco::API::Session::Config::Workflow#run` to validate **callback** output class
+  - It should be a `Eco::API::UseCases::BaseIO`
+- Prevent **uncreated people** to be present in queues or people refresh (if the server side worked perfectly, this contingency shouldn't be necessary):
+  - `Eco::API::Session::Batch::Job#processed_queue`
+    - Sanity-check to exclude people that are **new** and do not have `id` when the current `Job` is **not** of type `:create` (meaning that it was supposed to be created but failed and it probably doesn't exist on server-side)
+    - This prevents errors when trying to update/delete a person that most probably does not exist on the server.
+  - `Eco::API::MicroCases#people_refresh`
+    - Remove from people run-time object those that are **new** that are `dirty` (with pending changes)
+
+## [2.0.41] - 2021-10-06
+
 ### Fixed
 - `Eco::API::Session::Batch::Job` `backup_update`
   - Saved `requests` filename was overlapping due to only batch job type being used
data/eco-helpers.gemspec CHANGED
@@ -30,8 +30,8 @@ Gem::Specification.new do |spec|
   spec.add_development_dependency "yard", ">= 0.9.26", "< 0.10"
   spec.add_development_dependency "redcarpet", ">= 3.5.1", "< 3.6"
 
-  spec.add_dependency 'ecoportal-api', '>= 0.8.3', '< 0.9'
-  spec.add_dependency 'ecoportal-api-v2', '>= 0.8.19', '< 0.9'
+  spec.add_dependency 'ecoportal-api', '>= 0.8.4', '< 0.9'
+  spec.add_dependency 'ecoportal-api-v2', '>= 0.8.21', '< 0.9'
   spec.add_dependency 'aws-sdk-s3', '>= 1.83.0', '< 2'
   spec.add_dependency 'aws-sdk-ses', '>= 1.36.0', '< 2'
   spec.add_dependency 'dotenv', '>= 2.7.6', '< 2.8'
@@ -166,7 +166,7 @@ module Eco
       run = true
       if Eco::API::Common::Session::FileManager.file_exists?(file)
         prompt_user("Do you want to overwrite it? (Y/n):", explanation: "The file '#{file}' already exists.", default: "Y") do |response|
-          run = (response == "") || reponse.upcase.start_with?("Y")
+          run = (response == "") || response.upcase.start_with?("Y")
         end
       end
 
@@ -1,5 +1,4 @@
 require "net/sftp"
-
 module Eco
   module API
     module Common
data/lib/eco/api/error.rb CHANGED
@@ -19,10 +19,18 @@ module Eco
        @match = /.*/
      end
 
+     class InternalServerError < Eco::API::Error
+       @str_err = "Internal Server Error"
+       @match = /#{@str_err}/
+     end
      class UnknownPersonId < Eco::API::Error
        @str_err = "Unknown person id"
        @match = /Cannot find person with id (.*)/
      end
+     class ExternalIdTaken < Eco::API::Error
+       @str_err = "external ID already taken"
+       @match = /#{@str_err}/
+     end
      class EmailMissing < Eco::API::Error
        @str_err = "missing email for account creation"
        @match = /#{@str_err}/
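The new error classes above follow the file's existing convention: each subclass of `Eco::API::Error` pins a reference message in `@str_err` and a matching regexp in `@match`, which `Eco::API::Error.get_type` (used further down in this diff) presumably compares server messages against. A hedged sketch of adding one more subclass the same way; the class name and message below are invented, not part of the gem:

```ruby
# Hypothetical example only; "Supervisor cycle detected" is an invented message,
# not something the ecoPortal API is known to return.
# Assumes the eco-helpers gem is loaded.
module Eco
  module API
    class Error
      class SupervisorCycle < Eco::API::Error
        @str_err = "Supervisor cycle detected"
        @match   = /#{@str_err}/
      end
    end
  end
end
```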
@@ -2,15 +2,22 @@ module Eco
   module API
     class MicroCases
       # Helper to obtain all the elements of `people` anew from the _People Manager_.
-      # @note this helper is normally used to run consecutive usecases, where data needs refresh.
+      # @note
+      #   1. This helper is normally used to run consecutive usecases, where data needs refresh.
+      #   2. It only includes new people if they are not dirty (they do not have pending updates)
+      #     - This contingency wouldn't be necessary if the server worked perfectly.
       # @param people [Eco::API::Organization::People] the people that needs refresh.
       # @param include_created [Boolean] include people created during this session? (will check `:create` batch jobs).
       # @return [Eco::API::Organization::People] the `People` object with the data.
       def people_refresh(people:, include_created: true)
+        people = people.newFrom people.select do |person|
+          !person.new? || !person.dirty?
+        end
        ini = people.length
        if include_created
          session.job_groups.find_jobs(type: :create).map do |job|
-           people = people.merge(job.people)
+           to_add = job.people.select {|person| !person.dirty?}
+           people = people.merge(to_add)
          end
        end
 
@@ -12,5 +12,8 @@
   "person_core_edit": [null, "edit"],
   "person_details": [null, "view", "edit_public", "view_private", "edit_private"],
   "person_account": [null, "view", "create", "edit"],
-  "person_abilities": [null, "view", "edit"]
+  "person_abilities": [null, "view", "edit"],
+  "visitor_management": [null, "view", "edit", "administrate"],
+  "broadcast_notifications": [null, "view", "administrate"],
+  "cross_register_reporting": [null, "view"]
 }
@@ -22,7 +22,7 @@ class Eco::API::Policies::DefaultPolicies::UserAccess < Eco::API::Common::Loader
   def warn_account_removal!
     if account_removed_count > 0
       msg = "(DefaultPolicy on job '#{job.name}') Removed account to #{account_removed_count} people"
-      session.logger.warn(msg)
+      session.logger.info(msg)
     end
   end
 
@@ -8,7 +8,7 @@ module Eco
        # @attr_reader status [Eco::API::Session::Batch::Status] `batch status` this `Errors` object is associated to.
        attr_reader :status
 
-       ErrorCache = Struct.new(:type, :err, :entry)
+       ErrorCache = Struct.new(:type, :err, :entry, :response)
 
        # @param status [Eco::API::Session::Batch::Status] `batch status` this `Errors` object is associated to.
        def initialize(status:)
@@ -46,31 +46,52 @@ module Eco
 
        # @!group Pure errors helper methods
 
+       # @return [Integer] the number of `entries` that got error.
+       def count
+         entries.length
+       end
+
        # Was there any _Sever_ (reply) **error** as a result of this batch?
        # @return [Boolean] `true` if any of the queried _entries_ got an unsuccessful `Ecoportal::API::Common::BatchResponse`
        def any?
          queue.any? {|query| !status[query].success?}
        end
 
-       # @return [Integer] the number of `entries` that got error.
-       def count
-         entries.length
+       # Groups `entries` with error `type`
+       # @return [Hash] where each `key` is a `type` **error** and each value is an `Array` of:
+       #   1. `entries` that got that error, if `only_entries` is `true`
+       #   2. `ErrorCache` objects, if `only_entries` is `false`
+       def by_type(only_entries: true)
+         errors.group_by do |e|
+           e.type
+         end.transform_values do |arr|
+           if only_entries
+             arr.map {|e| e.entry}
+           else
+             arr
+           end
+         end
        end
+       # @!endgroup
 
-       # For all the `entries` with errors generates a `Hash` object
-       # @return [Array<Hash>] where each `Hash` has
-       #   1. `:type` -> the error type
-       #   2. `:err` -> the error `class` of that `:type`
-       #   3. `:entry` -> the entry that generated the error
+       # For all the `entries` with errors generates an `Array` of `ErrorCache` objects
+       # @return [Array<Eco::API::Session::Batch::Errors::ErrorCache>] where each `object` has
+       #   1. `type` -> the error type `Class`
+       #   2. `err` -> an instance object of that error `class` type
+       #   3. `entry` -> the entry that generated the error
+       #   4. `response` -> the original response from the server that carries the error
        def errors
          entries.each_with_object([]) do |entry, arr|
-           if body = status[entry].body
-             if errs = body["errors"]
+           response = status[entry]
+           if body = response.body
+             if errs = body["errors"] || body["error"]
+               errs = [errs].flatten(1).compact
                errs.each do |msg|
                  arr.push(ErrorCache.new(
                    klass = Eco::API::Error.get_type(msg),
                    klass.new(err_msg: msg, entry: entry, session: session),
-                   entry
+                   entry,
+                   response
                  ))
                end
             end
@@ -78,18 +99,6 @@ module Eco
          end
        end
 
-       # Groups `entries` with error `type`
-       # @return [Hash] where each `key` is a `type` **error** and each value is
-       #   an `Array` of `entries` that got that error
-       def by_type
-         errors.group_by do |e|
-           e.type
-         end.transform_values do |arr|
-           arr.map {|e| e.entry}
-         end
-       end
-       # @!endgroup
-
        # @!group Messaging methods
 
        def message
@@ -112,8 +121,7 @@ module Eco
        # @!endgroup
 
        def person_ref(entry)
-         row_str = (row = get_row(entry)) ? "(row: #{row}) " : nil
-         "#{row_str}(id: '#{get_attr(entry, :id)}') '#{get_attr(entry, :name)}' ('#{get_attr(entry, :external_id)}': '#{get_attr(entry, :email)}')"
+         Eco::API::Session::Batch::Feedback.person_ref(entry)
        end
 
        private
@@ -142,20 +150,11 @@ module Eco
        end
 
        def get_attr(entry, attr)
-         if entry.respond_to?(attr.to_sym)
-           entry.public_send(attr.to_sym)
-         elsif entry.is_a?(Hash)
-           entry["#{attr}"]
-         end
+         Eco::API::Session::Batch::Feedback.get_attr(entry, attr)
        end
 
        def get_row(value)
-         case value
-         when Eco::API::Common::People::PersonEntry
-           value.idx
-         when Ecoportal::API::V1::Person
-           get_row(value.entry)
-         end
+         Eco::API::Session::Batch::Feedback.get_row(value)
        end
 
        # Sorts the entries that got server error by error `type` and generates the error messages.
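For orientation, a hedged sketch of how the reworked error helpers above might be consumed after a batch run. Here `status` stands for an `Eco::API::Session::Batch::Status` obtained from an earlier launch; everything not shown in the diff (how you obtained it, the printed format) is an assumption:

```ruby
# Minimal sketch, assuming the eco-helpers gem is loaded and `status` is an
# Eco::API::Session::Batch::Status from a previous batch launch.
# The Errors object is built the way the constructor in this diff declares it.
errors = Eco::API::Session::Batch::Errors.new(status: status)

if errors.any?
  puts "#{errors.count} entries got server errors"
  # With only_entries: false each value is an Array of ErrorCache structs,
  # so the raw server `response` captured in this version is reachable too.
  errors.by_type(only_entries: false).each do |type, caches|
    caches.each do |cache|
      puts "#{type}: #{cache.err.class} (response body: #{cache.response&.body.inspect})"
    end
  end
end
```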
@@ -5,6 +5,30 @@ module Eco
      # @attr_reader job [Eco::API::Session::Batch::Job] `batch job` the feedback is associated with
      class Feedback
 
+       class << self
+         def person_ref(entry)
+           row_str = (row = get_row(entry)) ? "(row: #{row}) " : nil
+           "#{row_str}(id: '#{get_attr(entry, :id)}') '#{get_attr(entry, :name)}' ('#{get_attr(entry, :external_id)}': '#{get_attr(entry, :email)}')"
+         end
+
+         def get_attr(entry, attr)
+           if entry.respond_to?(attr.to_sym)
+             entry.public_send(attr.to_sym)
+           elsif entry.is_a?(Hash)
+             entry["#{attr}"]
+           end
+         end
+
+         def get_row(value)
+           case value
+           when Eco::API::Common::People::PersonEntry
+             value.idx
+           when Ecoportal::API::V1::Person
+             get_row(value.entry)
+           end
+         end
+       end
+
        attr_reader :job
 
        # @param job [Eco::API::Session::Batch::Job] `batch job` the feedback is associated with
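As a quick illustration of the promoted class methods: `Feedback.get_attr` falls back to string keys when the entry is a plain `Hash`, so `person_ref` can be called on ad-hoc data. A minimal sketch with made-up sample values (it assumes the gem is loaded, since `get_row` references the gem's entry classes):

```ruby
# Made-up sample data; real entries would be PersonEntry / Person objects.
entry = {
  "id"          => "abc123",
  "name"        => "Jane Doe",
  "external_id" => "E-001",
  "email"       => "jane.doe@example.com"
}

puts Eco::API::Session::Batch::Feedback.person_ref(entry)
# => (id: 'abc123') 'Jane Doe' ('E-001': 'jane.doe@example.com')
```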
@@ -176,6 +176,7 @@ module Eco
        if pqueue.length > 0
          req_backup = as_update(pqueue, add_feedback: false)
          backup_update(req_backup)
+         logger.debug("Job ('#{name}':#{type}): going to launch batch against #{pqueue.count} entries")
          session.batch.launch(pqueue, method: type).tap do |job_status|
            @status = job_status
            status.root = self
@@ -235,9 +236,23 @@ module Eco
          end
        end
 
+       # Method to generate the base of people that will be present in the queue
+       # @note
+       #   - If the entry is a new person, we are not in a creation job and the person doesn't have `id`
+       #     it means that it failed to be created (it doesn't exist on server-side).
+       #     The entry won't be included.
+       #   - The contingency above wouldn't be necessary if the server worked perfectly.
        def processed_queue
-         @queue.each {|e| @callbacks[e].call(e) if @callbacks.key?(e) }
-         apply_policies(api_included(@queue)).select do |e|
+         pre_filtered = @queue.select do |entry|
+           if unexisting = entry.new? && !entry.id && type != :create
+             ref = Eco::API::Session::Batch::Feedback.person_ref(entry)
+             msg = "Job ('#{name}':#{type}): excluded unexisting entry (failed creation): #{ref}"
+             session.logger.warn(msg)
+           end
+           !unexisting
+         end
+         pre_filtered.each {|e| @callbacks[e].call(e) if @callbacks.key?(e) }
+         apply_policies(api_included(pre_filtered)).select do |e|
            !as_update(e).empty?
          end.select do |e|
            next true unless e.is_a?(Ecoportal::API::V1::Person)
@@ -320,7 +335,7 @@ module Eco
          handlers.each do |handler|
            if entries = err_types[handler.name]
              handler_job = subjobs_add("#{self.name} => #{handler.name}", usecase: handler)
-             logger.debug("Running error handler #{handler.name}")
+             logger.debug("Running error handler #{handler.name} (against #{entries.count} entries)")
              handler.launch(people: people(entries), session: session, options: options, job: handler_job)
              logger.debug("Launching job of error handler: #{handler_job.name}")
              handler_job.launch(simulate: simulate)
@@ -92,6 +92,7 @@ module Eco
        return people_api.get_all(params: params, silent: silent)
      end
 
+
      def batch_from(data, method:, params: {}, silent: false)
        fatal "Invalid batch method: #{method}." if !self.class.valid_method?(method)
        return nil if !data || !data.is_a?(Enumerable)
@@ -101,12 +102,23 @@ module Eco
        params = {per_page: DEFAULT_BATCH_BLOCK}.merge(params)
        per_page = params[:per_page] || DEFAULT_BATCH_BLOCK
 
+       launch_batch(data,
+         method: method,
+         per_page: per_page,
+         people_api: people_api,
+         silent: silent
+       )
+     end
+
+     def launch_batch(data, method:, status: nil, job_mode: true, per_page: DEFAULT_BATCH_BLOCK, people_api: api&.people, silent: false)
        iteration = 1; done = 0
        iterations = (data.length.to_f / per_page).ceil
 
-       Eco::API::Session::Batch::Status.new(enviro, queue: data, method: method).tap do |status|
+       status ||= Eco::API::Session::Batch::Status.new(enviro, queue: data, method: method)
+       status.tap do |status|
          start_time = Time.now
          start_slice = Time.now; slice = []
+         pending_for_server_error = data.to_a[0..-1]
          data.each_slice(per_page) do |slice|
            msg = "starting batch '#{method}' iteration #{iteration}/#{iterations},"
            msg += " with #{slice.length} entries of #{data.length} -- #{done} done"
@@ -115,11 +127,14 @@ module Eco
 
            start_slice = Time.now
            offer_retry_on(Ecoportal::API::Errors::TimeOut) do
-             people_api.batch do |batch|
+             people_api.batch(job_mode: false) do |batch|
                slice.each do |person|
                  batch.public_send(method, person) do |response|
                    faltal("Request with no response") unless !!response
-                   status[person] = response
+                   unless server_error?(response)
+                     pending_for_server_error.delete(person)
+                     status[person] = response
+                   end
                  end
                end
              end # end batch
@@ -128,9 +143,31 @@ module Eco
            iteration += 1
            done += slice.length
          end # next slice
+
+         # temporary working around (due to back-end problems with batch/jobs)
+         unless pending_for_server_error.empty?
+           msg = "Going to re-try #{pending_for_server_error.count} due to server errors"
+           logger.info(msg) unless silent
+           launch_batch(pending_for_server_error,
+             status: status,
+             method: method,
+             job_mode: false,
+             per_page: per_page,
+             people_api: people_api,
+             silent: silent
+           )
+         end
        end
      end
 
+     def server_error?(response)
+       res_status = response.status
+       server_error = !res_status || res_status.server_error?
+       other_error = !server_error && (!res_status.code || res_status.code < 100)
+       no_body = !server_error && !other_error && !response.body
+       server_error || other_error || no_body
+     end
+
      def offer_retry_on(error_type, retries_left = 3, &block)
        begin
          block.call
@@ -185,6 +185,7 @@ module Eco
      #   - it will **not** run the `callback` for `on` defined during the configuration time
      #   - it will rather `yield` the target stage after all the `before` _callbacks_ have been run
      #   - aside of this, the rest will be the same as when the _block_ is provided (see previous note)
+     # @raise [ArgumentError] if the object returned by `before` and `after` callbacks is not an `Eco::API::UseCases::BaseIO`.
      # @param key [Symbol, nil] cases:
      #   - if `key` is not provided, it targets the _current stage_
      #   - if `key` is provided, it targets the specific _sub-stage_
@@ -200,7 +201,16 @@ module Eco
        if key
          io = stage(key).run(io: io, &block)
        elsif pending?
-         @before.each {|c| io = c.call(self, io)}
+         @before.each do |c|
+           io = c.call(self, io).tap do |i_o|
+             unless i_o.is_a?(Eco::API::UseCases::BaseIO)
+               msg = "Workflow callaback before('#{name}') should return Eco::API::UseCases::BaseIO object."
+               msg += " Given #{i_o.class}"
+               msg += " • Callback source location: '#{c.source_location}'"
+               raise ArgumentError.new(msg)
+             end
+           end
+         end
 
          unless skip?
            io.session.logger.debug("(Workflow: #{path}) running now")
@@ -218,7 +228,16 @@ module Eco
            @pending = false
          end
 
-         @after.each {|c| io = c.call(self, io)}
+         @after.each do |c|
+           io = c.call(self, io).tap do |i_o|
+             unless i_o.is_a?(Eco::API::UseCases::BaseIO)
+               msg = "Workflow callaback after('#{name}') should return Eco::API::UseCases::BaseIO object."
+               msg += " Given #{i_o.class}"
+               msg += " • Callback source location: '#{c.source_location}'"
+               raise ArgumentError.new(msg)
+             end
+           end
+         end
        end
      rescue SystemExit
        exit
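With the validation added above, any `before`/`after` workflow callback must hand back the `io` object (an `Eco::API::UseCases::BaseIO`), otherwise `#run` now raises `ArgumentError`. A hedged sketch of a conforming callback; how the `workflow` configuration object is obtained depends on the project setup and is assumed here:

```ruby
# Illustrative only: `workflow` stands for the Eco::API::Session::Config::Workflow
# instance being configured. Callbacks receive the stage and the current io.
workflow.before(:post_launch, :usecases) do |wf_stage, io|
  io.session.logger.info("About to run the post-launch use cases")
  io  # return the BaseIO object; returning anything else now raises ArgumentError
end
```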
@@ -0,0 +1,18 @@
+class Eco::API::UseCases::DefaultCases::EntriesToCsv < Eco::API::Common::Loaders::UseCase
+  name "entries-to-csv"
+  type :import
+
+  attr_reader :session, :options
+
+  def main(entries, session, options, usecase)
+    @options = options
+    @session = session
+    entries.export(filename)
+  end
+
+  private
+
+  def filename
+    options.dig(:export, :file) || "entries.csv"
+  end
+end
@@ -24,6 +24,7 @@ require_relative 'default_cases/email_as_id_case'
 require_relative 'default_cases/hris_case'
 require_relative 'default_cases/new_id_case'
 require_relative 'default_cases/new_email_case'
+require_relative 'default_cases/entries_to_csv_case'
 require_relative 'default_cases/org_data_convert_case'
 require_relative 'default_cases/refresh_case'
 require_relative 'default_cases/reinvite_trans_case'
@@ -0,0 +1,184 @@
+# Use case to update a register
+# @note
+#   - You can define methods `filters` and `search` to change the target entries of the register
+#   - You need to define the `process_ooze` method
+# This case expects `options[:source][:register_id]`
+class Eco::API::UseCases::OozeSamples::RegisterExportCase < Eco::API::Common::Loaders::UseCase
+
+  class << self
+    # @return [Integer] the number of pages to be processed in each batch
+    def batch_size(size = nil)
+      @batch_size ||= 25
+      return @batch_size unless size
+      @batch_size = size
+    end
+  end
+
+  include Eco::API::UseCases::OozeSamples::Helpers
+
+  name "register-export-case"
+  type :other
+
+  attr_reader :session, :options, :usecase
+  attr_reader :target
+
+  def main(session, options, usecase)
+    options[:end_get] = false
+    @session = session; @options = options; @usecase = usecase
+    @target = nil
+    raise "You need to inherit from this class ('#{self.class}') and call super with a block" unless block_given?
+    with_each_entry do |ooze|
+      process_ooze(ooze)
+    end
+    yield
+  end
+
+  # Write here your script
+  def process_ooze(ooze = target)
+    raise "You need to inherit from this class ('#{self.class}') and call super with a block" unless block_given?
+    yield(ooze)
+  end
+
+  private
+
+  def new_target(object)
+    @target = object
+  end
+
+  def with_each_entry
+    batched_search_results do |page_results|
+      page_results.each do |page_result|
+        if ooz = build_full_ooze(page_result.id)
+          yield(ooz)
+        end
+      end
+    end
+  end
+
+  # It builds a full page model (not updatable)
+  # @note If it's a page instance with stages, there's where it will be handy
+  # @param ooze_id [String]
+  # @return [Ecoportal::API::V2::Page]
+  def build_full_ooze(ooze_id)
+    if page = ooze(ooze_id)
+      return page unless page.is_a?(Ecoportal::API::V2::Pages::PageStage)
+      secs_doc = page.sections.doc
+      flds_doc = page.components.doc
+      pending_stage_ids = page.stages.map(&:id) - [page.current_stage_id]
+      pending_stage_ids.each do |id|
+        if page = stage(id, ooze: page)
+          page.sections.doc.each do |sec_doc|
+            unless secs_doc.find {|sec| sec["id"] == sec_doc["id"]}
+              secs_doc << sec_doc
+            end
+          end
+          page.components.doc.each do |comp_doc|
+            unless flds_doc.find {|fld| fld["id"] == comp_doc["id"]}
+              flds_doc << comp_doc
+            end
+          end
+        end
+      end
+      Ecoportal::API::V2::Page.new(page.doc)
+    end
+  end
+
+  #def update_oozes(batched_oozes = batch_queue)
+  #  batched_oozes.each do |ooze|
+  #    update_ooze(ooze)
+  #  end
+  #  batched_oozes.clear
+  #end
+
+  def batched_search_results
+    raise "Missing block. It yields in slices of #{self.class.batch_size} results" unless block_given?
+    results_preview
+    results = []
+    apiv2.registers.search(register_id, search_options) do |page_result|
+      results << page_result
+      if results.length >= self.class.batch_size
+        yield(results)
+        results = []
+      end
+    end
+    yield(results) unless results.empty?
+  end
+
+  def ooze(ooze_id = nil, stage_id: nil)
+    return target unless ooze_id
+    apiv2.pages.get(ooze_id, stage_id: stage_id).tap do |ooze|
+      if ooze
+        new_target(ooze)
+        logger.info("Got #{object_reference(ooze)}")
+      else
+        exit_error "Could not get ooze '#{ooze_id}'"
+      end
+    end
+  end
+
+  def stage(id_name = nil, ooze: target)
+    if ooze_id = ooze && ooze.id
+      exit_error "#{object_reference(ooze)} does not have stages!" unless ooze.stages?
+    else
+      exit_error "There's no target ooze to get retrieve stages from"
+    end
+
+    if stg = ooze.stages[id_name] || ooze.stages.get_by_name(id_name)
+      return ooze if ooze.respond_to?(:current_stage_id) && (ooze.current_stage_id == stg.id)
+      return apiv2.pages.get(ooze_id, stage_id: stg.id).tap do |stage|
+        if stage
+          new_target(stage)
+          logger.info("Got #{object_reference(stage)} from #{object_reference(ooze)}")
+        else
+          exit_error "Could not get stage '#{id_name}' in ooze '#{ooze_id}'"
+        end
+      end
+    end
+    exit_error "Stage '#{id_name}' doesn't exist in ooze '#{ooze_id}'"
+  end
+
+  def results_preview
+    apiv2.registers.search(register_id, search_options.merge(only_first: true)).tap do |search_results|
+      str_results = "Total target entries: #{search_results.total} (out of #{search_results.total_before_filtering})"
+      session.prompt_user("Do you want to proceed (y/N):", explanation: str_results, default: "N", timeout: 10) do |res|
+        unless res.upcase.start_with?("Y")
+          puts "..."
+          logger.info "Aborting script..."
+          exit(0)
+        end
+      end
+    end
+  end
+
+  def search_options
+    @search_options ||= {}.tap do |opts|
+      opts.merge!(sort: "created_at")
+      opts.merge!(dir: "asc")
+      opts.merge!(query: conf_search) if conf_search
+      opts.merge!(filters: conf_filters)
+    end
+  end
+
+  def conf_filters
+    return filters if self.respond_to?(:filters)
+    []
+  end
+
+  def conf_search
+    return search if self.respond_to?(:search)
+  end
+
+  def register_id
+    options.dig(:source, :register_id)
+  end
+
+  def apiv2
+    @apiv2 ||= session.api(version: :oozes)
+  end
+
+  def exit_error(msg)
+    logger.error(msg)
+    exit(1)
+  end
+
+end
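Per the comments in the new `RegisterExportCase`, the class is meant to be inherited from: the subclass calls `super` with a block from `main` (run once all target oozes have been visited) and defines `process_ooze` for each retrieved ooze, with `options[:source][:register_id]` selecting the register. A hedged sketch of such a subclass; the case name, the CSV output, and the accessors read off the ooze are invented for illustration:

```ruby
require "csv"

# Hypothetical subclass of the new sample case (assumes the eco-helpers gem is loaded).
class MyRegisterExport < Eco::API::UseCases::OozeSamples::RegisterExportCase
  name "my-register-export"
  type :other

  def main(session, options, usecase)
    @rows = []
    super do
      # Runs after every target ooze has been processed
      CSV.open("register_export.csv", "w") do |csv|
        @rows.each {|row| csv << row}
      end
    end
  end

  # Called once per full ooze model built by the parent class
  def process_ooze(ooze = target)
    @rows << [ooze.id, ooze.name]  # assumed accessors on Ecoportal::API::V2::Page
  end
end
```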
@@ -0,0 +1,64 @@
+# Use case to update a target oozes
+# @note
+#   - `target_ids` => Expects options[:source][:file] where at least the 1st column should be the target entry `ids`
+class Eco::API::UseCases::OozeSamples::TargetOozesUpdateCase < Eco::API::UseCases::OozeSamples::RegisterUpdateCase
+  name "target-oozes-update-case"
+  type :other
+
+  private
+
+  def with_each_entry
+    batched_target_ids do |ids|
+      ids.each do |id|
+        if pending = queue_shift(id)
+          if dirty?(pending)
+            msg = "Same entry 'id' appears more than once. "
+            msg << "Launching update on '#{object_reference(pending)}' to be able to queue it back"
+            console.warn msg
+            update_ooze(pending)
+          end
+        end
+        if ooz = ooze(id)
+          yield(ooz)
+        end
+      end
+      update_oozes
+    end
+  end
+
+  def batched_target_ids
+    raise "Missing block. It yields in slices of #{self.class.batch_size} ids" unless block_given?
+    target_ids_preview
+    pool = []
+    target_ids.each do |id|
+      pool << id
+      if pool.length >= self.class.batch_size
+        yield(pool)
+        pool = []
+      end
+    end
+    yield(pool) unless pool.empty?
+  end
+
+  def target_ids_preview
+    dups = target_ids.select {|id| target_ids.count(id) > 1}
+    dups_str = dups.count > 0 ? "There are #{dups.count} duplicated ids" : "No duplicates detected"
+    msg = "Total target entries: #{target_ids.count} (#{dups_str})"
+    session.prompt_user("Do you want to proceed (y/N):", explanation: msg, default: "N", timeout: 10) do |res|
+      unless res.upcase.start_with?("Y")
+        puts "..."
+        logger.info "Aborting script..."
+        exit(0)
+      end
+    end
+  end
+
+  def target_ids
+    @target_ids ||= input_csv.columns.first[1..-1]
+  end
+
+  def input_csv
+    @input_csv ||= Eco::CSV.read(options.dig(:source, :file))
+  end
+
+end
@@ -10,7 +10,9 @@ end
 
 require_relative 'ooze_samples/helpers'
 require_relative 'ooze_samples/ooze_base_case'
+require_relative 'ooze_samples/register_export_case'
 require_relative 'ooze_samples/ooze_run_base_case'
 require_relative 'ooze_samples/ooze_update_case'
 require_relative 'ooze_samples/ooze_from_doc_case'
 require_relative 'ooze_samples/register_update_case'
+require_relative 'ooze_samples/target_oozes_update_case'
@@ -96,6 +96,13 @@ ASSETS.cli.config do |cnf|
    })
  end
 
+  desc = "Used to export to a csv the final people (after processing). "
+  desc += "It is useful analyse the data after a -dry-run (-simulate)."
+  options_set.add("-processed-people-to-csv", desc) do |options, session|
+    file = SCR.get_file("-processed-people-to-csv", required: true, should_exist: false)
+    options.deep_merge!(report: {people: {csv: file}})
+  end
+
  desc = "Runs in dry-run (no requests sent to server)"
  options_set.add(["-dry-run", "-simulate"], desc) do |options, session|
    options[:dry_run] = true
@@ -137,6 +137,13 @@ ASSETS.cli.config do |cnf|
    end
  end
 
+  desc = "Input file dump into a CSV as is."
+  cases.add("-entries-to-csv", :import, desc, case_name: "entries-to-csv")
+    .add_option("-out") do |options|
+      file = SCR.get_file("-out")
+      options.deep_merge(export: {file: file})
+    end
+
  desc = "Usage '-org-data-convert backup.json -restore-db-from'."
  desc += " Transforms an input .json file to the values of the destination environment "
  desc += " (names missmatch won't solve: i.e. usergroups)"
@@ -196,7 +203,7 @@ ASSETS.cli.config do |cnf|
    .add_option("-append-starters", as1) do |options|
      options.deep_merge!(people: {append_created: true})
    end
-
+
  desc = "Creates people with only details"
  cases.add("-create-details-from", :sync, desc, case_name: "create-details")
    .add_option("-append-starters", as1) do |options|
@@ -119,9 +119,7 @@ ASSETS.cli.config do |config|
    else
      wf_post.skip!
      msg = "Although there are post_launch cases, they will NOT be RUN"
-     if !partial_update
-       msg+= ", because it is not a partial update (-get-partial)"
-     elsif io.options[:dry_run]
+     if io.options[:dry_run]
        msg+= ", because we are in dry-run (simulate)."
      end
      io.session.logger.info(msg)
@@ -134,7 +132,15 @@ ASSETS.cli.config do |config|
 
  wf_post.on(:usecases) do |wf_postcases, io|
    io.session.post_launch.each do |use|
-     io = use.launch(io: io).base
+     begin
+       io = use.launch(io: io).base
+     rescue Eco::API::UseCases::BaseIO::MissingParameter => e
+       if e.required == :people
+         io.session.logger.debug("Skipping use case '#{use.name}' -- no base people detected for the current run")
+       else
+         raise
+       end
+     end
    end
    io
  end
@@ -146,8 +152,13 @@ ASSETS.cli.config do |config|
  end
 
  wf.on(:report) do |wf_report, io|
-   #config.reports.active(io: io)
-   #io.session.reports
+   if file = io.options.dig(:report, :people, :csv)
+     io.options.deep_merge!(export: {
+       options: {internal_names: true, nice_header: true, split_schemas: true},
+       file: {name: file, format: :csv}
+     })
+     io = io.session.process_case("to-csv", io: io, type: :export)
+   end
    io
  end
 
data/lib/eco/csv/table.rb CHANGED
@@ -15,6 +15,7 @@ module Eco
        end
      end
 
+     # It allows to rename the header names
      # @return [Eco::CSV::Table]
      def transform_headers
        header = self.headers
@@ -25,6 +26,8 @@
        columns_to_table(cols)
      end
 
+     # When there are headers with the same name, it merges those columns
+     # @note it also offers a way to resolve merge conflicts
      # @return [Eco::CSV::Table]
      def merge_same_header_names
        dups = self.duplicated_header_names
data/lib/eco/version.rb CHANGED
@@ -1,3 +1,3 @@
 module Eco
-  VERSION = "2.0.41"
+  VERSION = "2.0.45"
 end
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: eco-helpers
 version: !ruby/object:Gem::Version
-  version: 2.0.41
+  version: 2.0.45
 platform: ruby
 authors:
 - Oscar Segura
@@ -116,7 +116,7 @@ dependencies:
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
-      version: 0.8.3
+      version: 0.8.4
   - - "<"
     - !ruby/object:Gem::Version
       version: '0.9'
@@ -126,7 +126,7 @@ dependencies:
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
-      version: 0.8.3
+      version: 0.8.4
   - - "<"
     - !ruby/object:Gem::Version
       version: '0.9'
@@ -136,7 +136,7 @@ dependencies:
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
-      version: 0.8.19
+      version: 0.8.21
   - - "<"
     - !ruby/object:Gem::Version
       version: '0.9'
@@ -146,7 +146,7 @@ dependencies:
   requirements:
   - - ">="
     - !ruby/object:Gem::Version
-      version: 0.8.19
+      version: 0.8.21
   - - "<"
    - !ruby/object:Gem::Version
      version: '0.9'
@@ -533,6 +533,7 @@ files:
 - lib/eco/api/usecases/default_cases/delete_sync_case.rb
 - lib/eco/api/usecases/default_cases/delete_trans_case.rb
 - lib/eco/api/usecases/default_cases/email_as_id_case.rb
+- lib/eco/api/usecases/default_cases/entries_to_csv_case.rb
 - lib/eco/api/usecases/default_cases/hris_case.rb
 - lib/eco/api/usecases/default_cases/new_email_case.rb
 - lib/eco/api/usecases/default_cases/new_id_case.rb
@@ -563,7 +564,9 @@ files:
 - lib/eco/api/usecases/ooze_samples/ooze_from_doc_case.rb
 - lib/eco/api/usecases/ooze_samples/ooze_run_base_case.rb
 - lib/eco/api/usecases/ooze_samples/ooze_update_case.rb
+- lib/eco/api/usecases/ooze_samples/register_export_case.rb
 - lib/eco/api/usecases/ooze_samples/register_update_case.rb
+- lib/eco/api/usecases/ooze_samples/target_oozes_update_case.rb
 - lib/eco/api/usecases/use_case.rb
 - lib/eco/api/usecases/use_case_chain.rb
 - lib/eco/api/usecases/use_case_io.rb