dwf 0.1.9 → 0.1.13

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: d37bac690df59157001462d6017a8a63d3956a280689de789d9a99163a476832
-  data.tar.gz: 935d76e99ff5ca109f00be80c01950c20272969f556fc70d0e1ef5c026c024e4
+  metadata.gz: 131525481d493d4765a33fa9b5d34524f74f221b05399b1a39b2cc47d439aa65
+  data.tar.gz: 4483fe6f6e98e45579d65362c2ba3f0298303522d103647c0a3d04085580af16
 SHA512:
-  metadata.gz: c105d178cd44f12577ee9eeed5065768a23adc508da7d686e50c46d26e4eed9c160dc71b10942ab8041e38f6c732384742f40fa9d3076e0a924fbdfb1cd62845
-  data.tar.gz: ad9c6f1041f092847364e91408887be311435be62e0502ab54f2b408c92fc598a879792eabb00c81b05d5c97fb37ea4a19136765d54b0d4efe9dbdb9ad2f64b5
+  metadata.gz: 9a5e357a59d77faa86eaa54552f19b2ad06b82af8a72bc80a2c66fd8ba56584391c08878c0e0b16cf32df6e6dc564d01bf60b0c16b2bcafe7649b6522643db48
+  data.tar.gz: 056e815c2beed08ca7e8fccc0c22afb291c849f9195b1bd5d7469d22096b62a37a8a457a21eeaa949f436e034586217e3aacf4e7d6cf44e52f325c88126c8b6c
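These digests cover the `metadata.gz` and `data.tar.gz` members inside the packaged `.gem` archive (a `.gem` is a tar file). A minimal sketch of recomputing such digests with Ruby's standard `digest` library; the `member_digests` helper and its path argument are illustrative, not part of RubyGems:

```ruby
require 'digest'

# Compute the digests recorded in a gem's checksums.yaml for one member
# file (e.g. metadata.gz extracted from the .gem tar archive).
def member_digests(path)
  data = File.binread(path)
  {
    'SHA256' => Digest::SHA256.hexdigest(data),
    'SHA512' => Digest::SHA512.hexdigest(data)
  }
end

# Digest of an in-memory string, for illustration:
Digest::SHA256.hexdigest('abc')
# => "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
```

Comparing the computed hex digests against the values above detects a corrupted or tampered download.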
@@ -1,13 +1,9 @@
 name: Ruby Gem
 
 on:
-  # Manually publish
   workflow_dispatch:
-  # Alternatively, publish whenever changes are merged to the `main` branch.
   push:
     branches: [ master ]
-    paths:
-      - 'dwf.gemspec'
 
 jobs:
   build:
@@ -13,12 +13,16 @@ jobs:
 
     runs-on: ubuntu-latest
 
+    strategy:
+      matrix:
+        ruby-version: ["2.5", "2.6", "2.7", "3.0"]
+
     steps:
     - uses: actions/checkout@v2
     - name: Set up Ruby
       uses: ruby/setup-ruby@477b21f02be01bcb8030d50f37cfec92bfa615b6
       with:
-        ruby-version: 3.0.0
+        ruby-version: ${{ matrix.ruby-version }}
     - name: Install dependencies
       run: bundle install
     - name: Run tests
data/.gitignore CHANGED
@@ -4,3 +4,4 @@ Gemfile.lock
 dwf-*.gem
 :w
 :W
+coverage/
data/CHANGELOG.md CHANGED
@@ -1,5 +1,132 @@
 # Changelog
 All notable changes to this project will be documented in this file.
+
+## 0.1.13
+### Added
+#### Dynamic workflows
+There might be a case when you have to construct the workflow dynamically depending on the input.
+As an example, let's write a workflow which prints the numbers from 1 to 100 to the terminal in parallel. After all of those jobs finish, it prints the word `finished`.
+```ruby
+class FirstMainItem < Dwf::Item
+  def perform
+    puts "#{self.class.name}: running #{params}"
+  end
+end
+
+SecondMainItem = Class.new(FirstMainItem)
+
+class TestWf < Dwf::Workflow
+  def configure
+    items = (1..100).to_a.map do |number|
+      run FirstMainItem, params: number
+    end
+    run SecondMainItem, after: items, params: "finished"
+  end
+end
+
+```
+We can achieve that because the `run` method returns the id of the created job, which we can use for chaining dependencies.
+Now we can create and start the workflow like this:
+```ruby
+wf = TestWf.create
+# wf.callback_type = Dwf::Workflow::SK_BATCH
+wf.start!
+```
+
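The "`run` returns an id" pattern described in this entry can be sketched without the gem. The `TinyFlow` class below is purely illustrative (not dwf's implementation): `run` registers a job and returns an id that later calls can pass to `after:` to chain dependencies.

```ruby
# Minimal stand-in for the "run returns an id" DSL pattern.
class TinyFlow
  Job = Struct.new(:id, :params, :after)

  def initialize
    @jobs = []
  end

  # Registers a job and returns its id so callers can chain dependencies.
  def run(params: nil, after: [])
    id = "job-#{@jobs.size + 1}"
    @jobs << Job.new(id, params, Array(after))
    id
  end

  attr_reader :jobs
end

flow = TinyFlow.new
ids = (1..3).map { |n| flow.run(params: n) }
flow.run(params: 'finished', after: ids)

flow.jobs.last.after # => ["job-1", "job-2", "job-3"]
```

Because `run` returns a value, the configure block can collect ids from a loop and hand the whole collection to a final job, exactly as the changelog example does with `items`.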
+## 0.1.12
+### Added
+#### Subworkflow for all callback types
+Same as `0.1.11`, but supported for all callback types.
+## 0.1.11
+### Added
+#### Subworkflow - only supports Sidekiq Pro
+There might be a case when you want to reuse a workflow in another workflow.
+
+As an example, let's write a workflow which contains another workflow. The `SubWorkflow` workflow is expected to execute after `SecondItem`, and `ThirtItem` executes after `SubWorkflow`.
+
+```ruby
+gem 'dwf', '~> 0.1.11'
+```
+
+### Setup
+```ruby
+class FirstItem < Dwf::Item
+  def perform
+    puts "Main flow: #{self.class.name} running"
+    puts "Main flow: #{self.class.name} finish"
+  end
+end
+
+SecondItem = Class.new(FirstItem)
+ThirtItem = Class.new(FirstItem)
+
+class FirstSubItem < Dwf::Item
+  def perform
+    puts "Sub flow: #{self.class.name} running"
+    puts "Sub flow: #{self.class.name} finish"
+  end
+end
+
+SecondSubItem = Class.new(FirstSubItem)
+
+class SubWorkflow < Dwf::Workflow
+  def configure
+    run FirstSubItem
+    run SecondSubItem, after: FirstSubItem
+  end
+end
+
+
+class TestWf < Dwf::Workflow
+  def configure
+    run FirstItem
+    run SecondItem, after: FirstItem
+    run SubWorkflow, after: SecondItem
+    run ThirtItem, after: SubWorkflow
+  end
+end
+
+wf = TestWf.create
+wf.start!
+```
+
+### Result
+```
+Main flow: FirstItem running
+Main flow: FirstItem finish
+Main flow: SecondItem running
+Main flow: SecondItem finish
+Sub flow: FirstSubItem running
+Sub flow: FirstSubItem finish
+Sub flow: SecondSubItem running
+Sub flow: SecondSubItem finish
+Main flow: ThirtItem running
+Main flow: ThirtItem finish
+```
+
+## 0.1.10
+### Added
+- Allow passing arguments to a workflow and update the way callbacks are defined
+```ruby
+class TestWf < Dwf::Workflow
+  def configure(arguments)
+    run A
+    run B, after: A, params: arguments
+    run C, after: A, params: arguments
+  end
+end
+
+wf = TestWf.create(arguments)
+wf.callback_type = Dwf::Workflow::SK_BATCH
+
+```
+- Support `find` and `reload` for workflows
+```ruby
+wf = TestWf.create
+Dwf::Workflow.find(wf.id)
+wf.reload
+```
+
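`find` and `reload` above rebuild a workflow from persisted state by id. A rough sketch of that id-based lookup pattern against an in-memory store; `MiniWorkflow` and its hash-backed `STORE` are stand-ins for the gem's Redis-backed client, not the real code:

```ruby
# Illustrative id-based find/reload, mimicking the shape of
# Dwf::Workflow.find / #reload against an in-memory store.
class MiniWorkflow
  STORE = {} # stand-in for Redis

  attr_reader :id
  attr_accessor :status

  def initialize(id)
    @id = id
    @status = 'pending'
  end

  def save!
    STORE[id] = { 'status' => status }
    self
  end

  def self.find(id)
    data = STORE[id] or raise "Workflow with given id doesn't exist"
    wf = new(id)
    wf.status = data['status']
    wf
  end

  # Re-read persisted state into the current object.
  def reload
    self.status = STORE.fetch(id).fetch('status')
    self
  end
end
```

`find` builds a fresh object from storage, while `reload` refreshes an object you already hold; both depend only on the workflow id.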
 ## 0.1.9
 ### Added
 ### Fixed
data/README.md CHANGED
@@ -1,25 +1,48 @@
-# DSL playground
-[Gush](https://github.com/chaps-io/gush) cloned without [ActiveJob](https://guides.rubyonrails.org/active_job_basics.html) but requried [Sidekiq](https://github.com/mperham/sidekiq). This project is for researching DSL purpose
+# DWF
+Distributed workflow runner following the [Gush](https://github.com/chaps-io/gush) interface, built on [Sidekiq](https://github.com/mperham/sidekiq) and [Redis](https://redis.io/). This project was created for DSL research purposes.
 
 # Installation
 ## 1. Add `dwf` to Gemfile
 ```ruby
-gem 'dwf', '~> 0.1.9'
+gem 'dwf', '~> 0.1.12'
 ```
-## 2. Execute flow
+## 2. Execute flow example
 ### Declare jobs
 
 ```ruby
 require 'dwf'
 
-class A < Dwf::Item
+class FirstItem < Dwf::Item
   def perform
-    puts "#{self.class.name} Working"
-    sleep 2
-    puts params
-    puts "#{self.class.name} Finished"
+    puts "#{self.class.name}: running"
+    puts "#{self.class.name}: finish"
   end
 end
+
+class SecondItem < Dwf::Item
+  def perform
+    puts "#{self.class.name}: running"
+    output('Send to ThirdItem')
+    puts "#{self.class.name}: finish"
+  end
+end
+
+class ThirdItem < Dwf::Item
+  def perform
+    puts "#{self.class.name}: running"
+    puts "#{self.class.name}: finish"
+  end
+end
+
+class FourthItem < Dwf::Item
+  def perform
+    puts "#{self.class.name}: running"
+    puts "payloads from incoming: #{payloads.inspect}"
+    puts "#{self.class.name}: finish"
+  end
+end
+
+FifthItem = Class.new(FirstItem)
 ```
 
 ### Declare flow
@@ -28,20 +51,23 @@ require 'dwf'
 
 class TestWf < Dwf::Workflow
   def configure
-    run A
-    run B, after: A
-    run C, after: A
-    run E, after: [B, C], params: 'E say hello'
-    run D, after: [E], params: 'D say hello'
-    run F, params: 'F say hello'
+    run FirstItem
+    run SecondItem, after: FirstItem
+    run ThirdItem, after: FirstItem
+    run FourthItem, after: [ThirdItem, SecondItem]
+    run FifthItem, after: FourthItem
   end
 end
 ```
-
+### Start background worker process
+```
+bundle exec sidekiq -q dwf
+```
 
 ### Execute flow
 ```ruby
-wf = TestWf.create(callback_type: Dwf::Workflow::SK_BATCH)
+wf = TestWf.create
+wf.callback_type = Dwf::Workflow::SK_BATCH
 wf.start!
 ```
 
@@ -54,21 +80,16 @@ By default `dwf` will use `Dwf::Workflow::BUILD_IN` callback.
 
 ### Output
 ```
-A Working
-F Working
-A Finished
-F say hello
-F Finished
-C Working
-B Working
-C Finished
-B Finished
-E Working
-E say hello
-E Finished
-D Working
-D say hello
-D Finished
+FirstItem: running
+FirstItem: finish
+SecondItem: running
+SecondItem: finish
+ThirdItem: running
+ThirdItem: finish
+FourthItem: running
+FourthItem: finish
+FifthItem: running
+FifthItem: finish
 ```
 
 # Config redis and default queue
@@ -83,8 +104,8 @@ Dwf.config do |config|
 config.namespace = 'dwf'
 end
 ```
-
-# Pinelining
+# Advanced features
+## Pipelining
 You can pass jobs result to next nodes
 
 ```ruby
@@ -117,6 +138,96 @@ end
 }
 ]
 ```
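The output-to-payloads pipeline above can be sketched in plain Ruby. `ToyJob` is a toy stand-in for `Dwf::Item`, not the gem's implementation: one job records values with `output`, and a downstream job receives them as `payloads`.

```ruby
# Toy illustration of the output -> payloads flow (not dwf's code).
class ToyJob
  attr_reader :name, :payloads, :outputs

  def initialize(name, payloads = [])
    @name = name
    @payloads = payloads
    @outputs = []
  end

  # Store a value for downstream jobs, like Dwf::Item#output.
  def output(value)
    @outputs << value
  end
end

sender = ToyJob.new('SecondItem')
sender.output('Send to ThirdItem')

# Downstream jobs see each incoming job's name and outputs as payloads.
incoming = [{ class: sender.name, payloads: sender.outputs }]
receiver = ToyJob.new('FourthItem', incoming)
```

Here `receiver.payloads` is `[{ class: 'SecondItem', payloads: ['Send to ThirdItem'] }]`, roughly the shape the pipelining example above prints from `payloads.inspect`.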
+## Sub workflow
+There might be a case when you want to reuse a workflow in another workflow.
+
+As an example, let's write a workflow which contains another workflow. The `SubWorkflow` workflow is expected to execute after `SecondItem`, and `ThirtItem` executes after `SubWorkflow`.
+
+### Setup
+```ruby
+class FirstItem < Dwf::Item
+  def perform
+    puts "Main flow: #{self.class.name} running"
+    puts "Main flow: #{self.class.name} finish"
+  end
+end
+
+SecondItem = Class.new(FirstItem)
+ThirtItem = Class.new(FirstItem)
+
+class FirstSubItem < Dwf::Item
+  def perform
+    puts "Sub flow: #{self.class.name} running"
+    puts "Sub flow: #{self.class.name} finish"
+  end
+end
+
+SecondSubItem = Class.new(FirstSubItem)
+
+class SubWorkflow < Dwf::Workflow
+  def configure
+    run FirstSubItem
+    run SecondSubItem, after: FirstSubItem
+  end
+end
+
+
+class TestWf < Dwf::Workflow
+  def configure
+    run FirstItem
+    run SecondItem, after: FirstItem
+    run SubWorkflow, after: SecondItem
+    run ThirtItem, after: SubWorkflow
+  end
+end
+
+wf = TestWf.create
+wf.start!
+```
+
+### Result
+```
+Main flow: FirstItem running
+Main flow: FirstItem finish
+Main flow: SecondItem running
+Main flow: SecondItem finish
+Sub flow: FirstSubItem running
+Sub flow: FirstSubItem finish
+Sub flow: SecondSubItem running
+Sub flow: SecondSubItem finish
+Main flow: ThirtItem running
+Main flow: ThirtItem finish
+```
+
+## Dynamic workflows
+There might be a case when you have to construct the workflow dynamically depending on the input.
+As an example, let's write a workflow which prints the numbers from 1 to 100 to the terminal in parallel. After all of those jobs finish, it prints the word `finished`.
+```ruby
+class FirstMainItem < Dwf::Item
+  def perform
+    puts "#{self.class.name}: running #{params}"
+  end
+end
+
+SecondMainItem = Class.new(FirstMainItem)
+
+class TestWf < Dwf::Workflow
+  def configure
+    items = (1..100).to_a.map do |number|
+      run FirstMainItem, params: number
+    end
+    run SecondMainItem, after: items, params: "finished"
+  end
+end
+
+```
+We can achieve that because the `run` method returns the id of the created job, which we can use for chaining dependencies.
+Now we can create and start the workflow like this:
+```ruby
+wf = TestWf.create
+# wf.callback_type = Dwf::Workflow::SK_BATCH
+wf.start!
+```
 
 # Todo
 - [x] Make it work
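The sub-workflow feature boils down to treating a whole workflow as one node in the parent's dependency graph. A self-contained sketch of that idea; `SimpleNode` and `SimpleFlow` are illustrative names, not dwf classes:

```ruby
# Illustrative only: a nested flow acts as a single node in its parent,
# which is the essence of dwf's sub-workflow support.
class SimpleNode
  def initialize(&action)
    @action = action
  end

  def run(log)
    @action.call(log)
  end
end

class SimpleFlow
  def initialize
    @nodes = []
  end

  def add(node)
    @nodes << node
  end

  # A flow responds to #run just like a node does, so flows can nest.
  def run(log)
    @nodes.each { |node| node.run(log) }
  end
end

sub = SimpleFlow.new
sub.add(SimpleNode.new { |log| log << 'FirstSubItem' })
sub.add(SimpleNode.new { |log| log << 'SecondSubItem' })

main = SimpleFlow.new
main.add(SimpleNode.new { |log| log << 'SecondItem' })
main.add(sub)
main.add(SimpleNode.new { |log| log << 'ThirtItem' })

log = []
main.run(log)
log # => ["SecondItem", "FirstSubItem", "SecondSubItem", "ThirtItem"]
```

Because the nested flow exposes the same `run` interface as a node, the parent never needs to know whether a dependency is a single job or an entire workflow, which matches the ordering shown in the Result section above.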
@@ -124,9 +235,12 @@ end
 - [x] Support with build-in callback
 - [x] Add github workflow
 - [x] Redis configurable
-- [x] Pinelining
-- [ ] [WIP] Test
+- [x] Pipelining
+- [x] Test
+- [x] Sub workflow
 - [ ] Support [Resque](https://github.com/resque/resque)
+- [ ] Pluggable key-value store
+- [ ] Research https://github.com/moneta-rb/moneta
 
 # References
 - https://github.com/chaps-io/gush
data/dwf.gemspec CHANGED
@@ -3,14 +3,15 @@
 
 lib = File.expand_path('../lib', __FILE__)
 $LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
+require_relative 'lib/dwf/version'
 
 Gem::Specification.new do |spec|
-  spec.name = "dwf"
-  spec.version = '0.1.9'
-  spec.authors = ["dthtien"]
-  spec.email = ["tiendt2311@gmail.com"]
+  spec.name = 'dwf'
+  spec.version = Dwf::VERSION
+  spec.authors = ['dthtien']
+  spec.email = ['tiendt2311@gmail.com']
 
-  spec.summary = 'Gush cloned without ActiveJob but requried Sidekiq. This project is for researching DSL purpose'
+  spec.summary = 'Distributed workflow runner following Gush interface using Sidekiq and Redis'
   spec.description = 'Workflow'
   spec.homepage = 'https://github.com/dthtien/wf'
   spec.license = "MIT"
@@ -25,8 +26,10 @@ Gem::Specification.new do |spec|
   # guide at: https://bundler.io/guides/creating_gem.html
 
   spec.add_development_dependency 'byebug', '~> 11.1.3'
+  spec.add_development_dependency 'mock_redis', '~> 0.27.2'
   spec.add_dependency 'redis', '~> 4.2.0'
+  spec.add_dependency 'redis-mutex', '~> 4.0.2'
   spec.add_development_dependency 'rspec', '~> 3.2'
-  spec.add_development_dependency 'mock_redis', '~> 0.27.2'
   spec.add_dependency 'sidekiq', '~> 6.2.0'
+  spec.add_development_dependency 'simplecov'
 end
data/lib/dwf/callback.rb CHANGED
@@ -9,8 +9,8 @@ module Dwf
       previous_job_names = options['names']
       workflow_id = options['workflow_id']
       processing_job_names = previous_job_names.map do |job_name|
-        job = client.find_job(workflow_id, job_name)
-        job.outgoing
+        node = client.find_node(job_name, workflow_id)
+        node.outgoing
       end.flatten.uniq
       return if processing_job_names.empty?
 
@@ -19,7 +19,7 @@ module Dwf
     end
 
     def start(job)
-      job.outgoing.any? ? start_with_batch(job) : job.perform_async
+      job.outgoing.any? ? start_with_batch(job) : job.persist_and_perform_async!
     end
 
     private
@@ -40,11 +40,14 @@ module Dwf
       batch.on(
         :success,
         'Dwf::Callback#process_next_step',
-        names: jobs.map(&:klass),
+        names: jobs.map(&:name),
         workflow_id: workflow_id
       )
       batch.jobs do
-        jobs.each { |job| job.persist_and_perform_async! if job.ready_to_start? }
+        jobs.each do |job|
+          job.reload
+          job.persist_and_perform_async! if job.ready_to_start?
+        end
       end
     end
 
@@ -61,25 +64,24 @@ module Dwf
 
     def fetch_jobs(processing_job_names, workflow_id)
       processing_job_names.map do |job_name|
-        client.find_job(workflow_id, job_name)
+        client.find_node(job_name, workflow_id)
       end.compact
     end
 
-    def with_lock(workflow_id, job_name)
-      client.check_or_lock(workflow_id, job_name)
-      yield
-      client.release_lock(workflow_id, job_name)
+    def with_lock(workflow_id, job_name, &block)
+      client.check_or_lock(workflow_id, job_name, &block)
     end
 
-    def start_with_batch(job)
+    def start_with_batch(node)
       batch = Sidekiq::Batch.new
+      workflow_id = node.is_a?(Dwf::Workflow) ? node.parent_id : node.workflow_id
       batch.on(
         :success,
         'Dwf::Callback#process_next_step',
-        names: [job.name],
-        workflow_id: job.workflow_id
+        names: [node.name],
+        workflow_id: workflow_id
       )
-      batch.jobs { job.perform_async }
+      batch.jobs { node.persist_and_perform_async! }
     end
 
     def client
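The `with_lock` change above moves from manual acquire/release to a block-based lock (delegating to `RedisMutex.with_lock` via the client), which guarantees release even when the block raises. The same shape can be shown with Ruby's standard `Mutex` as a stand-in for the Redis-backed mutex:

```ruby
# Block-based locking: release is guaranteed by ensure, even on errors.
# (Plain Mutex used here as a stand-in for RedisMutex.)
LOCK = Mutex.new

def with_lock
  LOCK.lock
  yield
ensure
  LOCK.unlock
end

with_lock { 1 + 1 } # => 2

begin
  with_lock { raise 'boom' }
rescue RuntimeError
  # the lock was still released
end

LOCK.locked? # => false
```

With the earlier acquire/yield/release sequence, an exception inside `yield` skipped the release and left the lock held; the block form removes that failure mode.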
data/lib/dwf/client.rb CHANGED
@@ -1,3 +1,6 @@
+require_relative 'errors'
+require 'redis-mutex'
+
 module Dwf
   class Client
     attr_reader :config
@@ -20,18 +23,50 @@ module Dwf
       Dwf::Item.from_hash(Dwf::Utils.symbolize_keys(data))
     end
 
+    def find_node(name, workflow_id)
+      if Utils.workflow_name?(name)
+        if name.include?('|')
+          _, id = name.split('|')
+        else
+          id = workflow_id(name, workflow_id)
+        end
+        find_workflow(id)
+      else
+        find_job(workflow_id, name)
+      end
+    end
+
+    def find_workflow(id)
+      key = redis.keys("dwf.workflows.#{id}*").first
+      data = redis.get(key)
+      raise WorkflowNotFound, "Workflow with given id doesn't exist" if data.nil?
+
+      hash = JSON.parse(data)
+      hash = Dwf::Utils.symbolize_keys(hash)
+      nodes = parse_nodes(id)
+      workflow_from_hash(hash, nodes)
+    end
+
+    def find_sub_workflow(name, parent_id)
+      find_workflow(workflow_id(name, parent_id))
+    end
+
+    def sub_workflows(id)
+      keys = redis.keys("dwf.workflows.*.*.#{id}")
+      keys.map do |key|
+        id = key.split('.')[2]
+
+        find_workflow(id)
+      end
+    end
+
     def persist_job(job)
       redis.hset("dwf.jobs.#{job.workflow_id}.#{job.klass}", job.id, job.as_json)
     end
 
-    def check_or_lock(workflow_id, job_name)
+    def check_or_lock(workflow_id, job_name, &block)
       key = "wf_enqueue_outgoing_jobs_#{workflow_id}-#{job_name}"
-
-      if key_exists?(key)
-        sleep 2
-      else
-        set(key, 'running')
-      end
+      RedisMutex.with_lock(key, sleep: 0.3, block: 2, &block)
     end
 
     def release_lock(workflow_id, job_name)
@@ -39,7 +74,10 @@ module Dwf
     end
 
     def persist_workflow(workflow)
-      redis.set("dwf.workflows.#{workflow.id}", workflow.as_json)
+      key = [
+        'dwf', 'workflows', workflow.id, workflow.class.name, workflow.parent_id
+      ].compact.join('.')
+      redis.set(key, workflow.as_json)
     end
 
     def build_job_id(workflow_id, job_klass)
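The new persistence key embeds the workflow's class name and, when present, its parent id; `compact` drops the `nil` parent for top-level workflows. A quick illustration of the key shapes this scheme produces (the `workflow_key` helper is extracted here for illustration only):

```ruby
# Key layout used by persist_workflow above: parent_id is nil for
# top-level workflows, so compact removes it from the key.
def workflow_key(id, klass, parent_id = nil)
  ['dwf', 'workflows', id, klass, parent_id].compact.join('.')
end

workflow_key('abc123', 'TestWf')
# => "dwf.workflows.abc123.TestWf"
workflow_key('def456', 'SubWorkflow', 'abc123')
# => "dwf.workflows.def456.SubWorkflow.abc123"
```

Encoding the parent id in the key is what lets `sub_workflows` discover children with a pattern like `dwf.workflows.*.*.#{id}`.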
@@ -84,6 +122,13 @@ module Dwf
 
     private
 
+    def workflow_id(name, parent_id)
+      key = redis.keys("dwf.workflows.*.#{name}.#{parent_id}").first
+      return if key.nil?
+
+      key.split('.')[2]
+    end
+
     def find_job_by_klass_and_id(workflow_id, job_name)
       job_klass, job_id = job_name.split('|')
 
@@ -99,8 +144,36 @@ module Dwf
       job
     end
 
+    def parse_nodes(id)
+      keys = redis.scan_each(match: "dwf.jobs.#{id}.*")
+
+      items = keys.map do |key|
+        redis.hvals(key).map do |json|
+          node = Dwf::Utils.symbolize_keys JSON.parse(json)
+          Dwf::Item.from_hash(node)
+        end
+      end.flatten
+      workflows = sub_workflows(id)
+      items + workflows
+    end
+
+    def workflow_from_hash(hash, jobs = [])
+      flow = Module.const_get(hash[:klass]).new(*hash[:arguments])
+      flow.jobs = []
+      flow.outgoing = hash.fetch(:outgoing, [])
+      flow.parent_id = hash[:parent_id]
+      flow.incoming = hash.fetch(:incoming, [])
+      flow.stopped = hash.fetch(:stopped, false)
+      flow.callback_type = hash.fetch(:callback_type, Workflow::BUILD_IN)
+      flow.id = hash[:id]
+      flow.jobs = jobs
+      flow
+    end
+
     def redis
-      @redis ||= Redis.new(config.redis_opts)
+      @redis ||= Redis.new(config.redis_opts).tap do |instance|
+        RedisClassy.redis = instance
+      end
     end
   end
 end
@@ -0,0 +1,29 @@
+module Dwf
+  module Concerns
+    module Checkable
+      def no_dependencies?
+        incoming.empty?
+      end
+
+      def leaf?
+        outgoing.empty?
+      end
+
+      def ready_to_start?
+        !running? && !enqueued? && !finished? && !failed? && parents_succeeded?
+      end
+
+      def succeeded?
+        finished? && !failed?
+      end
+
+      def running?
+        started? && !finished?
+      end
+
+      def started?
+        !!started_at
+      end
+    end
+  end
+end
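Concern modules like this one only assume their host responds to a few readers (`incoming`, `outgoing`, `started_at`, and the other predicates). A self-contained sketch of that mixin pattern; the `Trackable` module and its fields are illustrative, not part of dwf:

```ruby
# Illustrative mixin in the style of Dwf::Concerns::Checkable:
# predicates derived from host-provided readers.
module Trackable
  def started?
    !!started_at
  end

  def running?
    started? && !finished_at
  end

  def finished?
    !!finished_at
  end
end

# Any object exposing started_at/finished_at can include the module.
Job = Struct.new(:started_at, :finished_at) do
  include Trackable
end

job = Job.new(Time.now, nil)
job.running?  # => true
job.finished? # => false
```

Keeping the predicates in a module means both `Dwf::Item` and `Dwf::Workflow` can share one definition of readiness, as long as each supplies the underlying readers.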
data/lib/dwf/errors.rb ADDED
@@ -0,0 +1,3 @@
+module Dwf
+  class WorkflowNotFound < StandardError; end
+end