fractor 0.1.4 → 0.1.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (189)
  1. checksums.yaml +4 -4
  2. data/.rubocop-https---raw-githubusercontent-com-riboseinc-oss-guides-main-ci-rubocop-yml +552 -0
  3. data/.rubocop.yml +14 -8
  4. data/.rubocop_todo.yml +284 -43
  5. data/README.adoc +111 -950
  6. data/docs/.lycheeignore +16 -0
  7. data/docs/Gemfile +24 -0
  8. data/docs/README.md +157 -0
  9. data/docs/_config.yml +151 -0
  10. data/docs/_features/error-handling.adoc +1192 -0
  11. data/docs/_features/index.adoc +80 -0
  12. data/docs/_features/monitoring.adoc +589 -0
  13. data/docs/_features/signal-handling.adoc +202 -0
  14. data/docs/_features/workflows.adoc +1235 -0
  15. data/docs/_guides/continuous-mode.adoc +736 -0
  16. data/docs/_guides/cookbook.adoc +1133 -0
  17. data/docs/_guides/index.adoc +55 -0
  18. data/docs/_guides/pipeline-mode.adoc +730 -0
  19. data/docs/_guides/troubleshooting.adoc +358 -0
  20. data/docs/_pages/architecture.adoc +1390 -0
  21. data/docs/_pages/core-concepts.adoc +1392 -0
  22. data/docs/_pages/design-principles.adoc +862 -0
  23. data/docs/_pages/getting-started.adoc +290 -0
  24. data/docs/_pages/installation.adoc +143 -0
  25. data/docs/_reference/api.adoc +1080 -0
  26. data/docs/_reference/error-reporting.adoc +670 -0
  27. data/docs/_reference/examples.adoc +181 -0
  28. data/docs/_reference/index.adoc +96 -0
  29. data/docs/_reference/troubleshooting.adoc +862 -0
  30. data/docs/_tutorials/complex-workflows.adoc +1022 -0
  31. data/docs/_tutorials/data-processing-pipeline.adoc +740 -0
  32. data/docs/_tutorials/first-application.adoc +384 -0
  33. data/docs/_tutorials/index.adoc +48 -0
  34. data/docs/_tutorials/long-running-services.adoc +931 -0
  35. data/docs/assets/images/favicon-16.png +0 -0
  36. data/docs/assets/images/favicon-32.png +0 -0
  37. data/docs/assets/images/favicon-48.png +0 -0
  38. data/docs/assets/images/favicon.ico +0 -0
  39. data/docs/assets/images/favicon.png +0 -0
  40. data/docs/assets/images/favicon.svg +45 -0
  41. data/docs/assets/images/fractor-icon.svg +49 -0
  42. data/docs/assets/images/fractor-logo.svg +61 -0
  43. data/docs/index.adoc +131 -0
  44. data/docs/lychee.toml +39 -0
  45. data/examples/api_aggregator/README.adoc +627 -0
  46. data/examples/api_aggregator/api_aggregator.rb +376 -0
  47. data/examples/auto_detection/README.adoc +407 -29
  48. data/examples/auto_detection/auto_detection.rb +9 -9
  49. data/examples/continuous_chat_common/message_protocol.rb +53 -0
  50. data/examples/continuous_chat_fractor/README.adoc +217 -0
  51. data/examples/continuous_chat_fractor/chat_client.rb +303 -0
  52. data/examples/continuous_chat_fractor/chat_common.rb +83 -0
  53. data/examples/continuous_chat_fractor/chat_server.rb +167 -0
  54. data/examples/continuous_chat_fractor/simulate.rb +345 -0
  55. data/examples/continuous_chat_server/README.adoc +135 -0
  56. data/examples/continuous_chat_server/chat_client.rb +303 -0
  57. data/examples/continuous_chat_server/chat_server.rb +359 -0
  58. data/examples/continuous_chat_server/simulate.rb +343 -0
  59. data/examples/error_reporting.rb +207 -0
  60. data/examples/file_processor/README.adoc +170 -0
  61. data/examples/file_processor/file_processor.rb +615 -0
  62. data/examples/file_processor/sample_files/invalid.csv +1 -0
  63. data/examples/file_processor/sample_files/orders.xml +24 -0
  64. data/examples/file_processor/sample_files/products.json +23 -0
  65. data/examples/file_processor/sample_files/users.csv +6 -0
  66. data/examples/hierarchical_hasher/README.adoc +629 -41
  67. data/examples/hierarchical_hasher/hierarchical_hasher.rb +12 -8
  68. data/examples/image_processor/README.adoc +610 -0
  69. data/examples/image_processor/image_processor.rb +349 -0
  70. data/examples/image_processor/processed_images/sample_10_processed.jpg.json +12 -0
  71. data/examples/image_processor/processed_images/sample_1_processed.jpg.json +12 -0
  72. data/examples/image_processor/processed_images/sample_2_processed.jpg.json +12 -0
  73. data/examples/image_processor/processed_images/sample_3_processed.jpg.json +12 -0
  74. data/examples/image_processor/processed_images/sample_4_processed.jpg.json +12 -0
  75. data/examples/image_processor/processed_images/sample_5_processed.jpg.json +12 -0
  76. data/examples/image_processor/processed_images/sample_6_processed.jpg.json +12 -0
  77. data/examples/image_processor/processed_images/sample_7_processed.jpg.json +12 -0
  78. data/examples/image_processor/processed_images/sample_8_processed.jpg.json +12 -0
  79. data/examples/image_processor/processed_images/sample_9_processed.jpg.json +12 -0
  80. data/examples/image_processor/test_images/sample_1.png +1 -0
  81. data/examples/image_processor/test_images/sample_10.png +1 -0
  82. data/examples/image_processor/test_images/sample_2.png +1 -0
  83. data/examples/image_processor/test_images/sample_3.png +1 -0
  84. data/examples/image_processor/test_images/sample_4.png +1 -0
  85. data/examples/image_processor/test_images/sample_5.png +1 -0
  86. data/examples/image_processor/test_images/sample_6.png +1 -0
  87. data/examples/image_processor/test_images/sample_7.png +1 -0
  88. data/examples/image_processor/test_images/sample_8.png +1 -0
  89. data/examples/image_processor/test_images/sample_9.png +1 -0
  90. data/examples/log_analyzer/README.adoc +662 -0
  91. data/examples/log_analyzer/log_analyzer.rb +579 -0
  92. data/examples/log_analyzer/sample_logs/apache.log +20 -0
  93. data/examples/log_analyzer/sample_logs/json.log +15 -0
  94. data/examples/log_analyzer/sample_logs/nginx.log +15 -0
  95. data/examples/log_analyzer/sample_logs/rails.log +29 -0
  96. data/examples/multi_work_type/README.adoc +576 -26
  97. data/examples/multi_work_type/multi_work_type.rb +30 -29
  98. data/examples/performance_monitoring.rb +120 -0
  99. data/examples/pipeline_processing/README.adoc +740 -26
  100. data/examples/pipeline_processing/pipeline_processing.rb +16 -16
  101. data/examples/priority_work_example.rb +155 -0
  102. data/examples/producer_subscriber/README.adoc +889 -46
  103. data/examples/producer_subscriber/producer_subscriber.rb +20 -16
  104. data/examples/scatter_gather/README.adoc +829 -27
  105. data/examples/scatter_gather/scatter_gather.rb +29 -28
  106. data/examples/simple/README.adoc +347 -0
  107. data/examples/simple/sample.rb +5 -5
  108. data/examples/specialized_workers/README.adoc +622 -26
  109. data/examples/specialized_workers/specialized_workers.rb +88 -45
  110. data/examples/stream_processor/README.adoc +206 -0
  111. data/examples/stream_processor/stream_processor.rb +284 -0
  112. data/examples/web_scraper/README.adoc +625 -0
  113. data/examples/web_scraper/web_scraper.rb +285 -0
  114. data/examples/workflow/README.adoc +406 -0
  115. data/examples/workflow/circuit_breaker/README.adoc +360 -0
  116. data/examples/workflow/circuit_breaker/circuit_breaker_workflow.rb +225 -0
  117. data/examples/workflow/conditional/README.adoc +483 -0
  118. data/examples/workflow/conditional/conditional_workflow.rb +215 -0
  119. data/examples/workflow/dead_letter_queue/README.adoc +374 -0
  120. data/examples/workflow/dead_letter_queue/dead_letter_queue_workflow.rb +217 -0
  121. data/examples/workflow/fan_out/README.adoc +381 -0
  122. data/examples/workflow/fan_out/fan_out_workflow.rb +202 -0
  123. data/examples/workflow/retry/README.adoc +248 -0
  124. data/examples/workflow/retry/retry_workflow.rb +195 -0
  125. data/examples/workflow/simple_linear/README.adoc +267 -0
  126. data/examples/workflow/simple_linear/simple_linear_workflow.rb +175 -0
  127. data/examples/workflow/simplified/README.adoc +329 -0
  128. data/examples/workflow/simplified/simplified_workflow.rb +222 -0
  129. data/exe/fractor +10 -0
  130. data/lib/fractor/cli.rb +288 -0
  131. data/lib/fractor/configuration.rb +307 -0
  132. data/lib/fractor/continuous_server.rb +183 -0
  133. data/lib/fractor/error_formatter.rb +72 -0
  134. data/lib/fractor/error_report_generator.rb +152 -0
  135. data/lib/fractor/error_reporter.rb +244 -0
  136. data/lib/fractor/error_statistics.rb +147 -0
  137. data/lib/fractor/execution_tracer.rb +162 -0
  138. data/lib/fractor/logger.rb +230 -0
  139. data/lib/fractor/main_loop_handler.rb +406 -0
  140. data/lib/fractor/main_loop_handler3.rb +135 -0
  141. data/lib/fractor/main_loop_handler4.rb +299 -0
  142. data/lib/fractor/performance_metrics_collector.rb +181 -0
  143. data/lib/fractor/performance_monitor.rb +215 -0
  144. data/lib/fractor/performance_report_generator.rb +202 -0
  145. data/lib/fractor/priority_work.rb +93 -0
  146. data/lib/fractor/priority_work_queue.rb +189 -0
  147. data/lib/fractor/result_aggregator.rb +33 -1
  148. data/lib/fractor/shutdown_handler.rb +168 -0
  149. data/lib/fractor/signal_handler.rb +80 -0
  150. data/lib/fractor/supervisor.rb +430 -144
  151. data/lib/fractor/supervisor_logger.rb +88 -0
  152. data/lib/fractor/version.rb +1 -1
  153. data/lib/fractor/work.rb +12 -0
  154. data/lib/fractor/work_distribution_manager.rb +151 -0
  155. data/lib/fractor/work_queue.rb +88 -0
  156. data/lib/fractor/work_result.rb +181 -9
  157. data/lib/fractor/worker.rb +75 -1
  158. data/lib/fractor/workflow/builder.rb +210 -0
  159. data/lib/fractor/workflow/chain_builder.rb +169 -0
  160. data/lib/fractor/workflow/circuit_breaker.rb +183 -0
  161. data/lib/fractor/workflow/circuit_breaker_orchestrator.rb +208 -0
  162. data/lib/fractor/workflow/circuit_breaker_registry.rb +112 -0
  163. data/lib/fractor/workflow/dead_letter_queue.rb +334 -0
  164. data/lib/fractor/workflow/execution_hooks.rb +39 -0
  165. data/lib/fractor/workflow/execution_strategy.rb +225 -0
  166. data/lib/fractor/workflow/execution_trace.rb +134 -0
  167. data/lib/fractor/workflow/helpers.rb +191 -0
  168. data/lib/fractor/workflow/job.rb +290 -0
  169. data/lib/fractor/workflow/job_dependency_validator.rb +120 -0
  170. data/lib/fractor/workflow/logger.rb +110 -0
  171. data/lib/fractor/workflow/pre_execution_context.rb +193 -0
  172. data/lib/fractor/workflow/retry_config.rb +156 -0
  173. data/lib/fractor/workflow/retry_orchestrator.rb +184 -0
  174. data/lib/fractor/workflow/retry_strategy.rb +93 -0
  175. data/lib/fractor/workflow/structured_logger.rb +30 -0
  176. data/lib/fractor/workflow/type_compatibility_validator.rb +222 -0
  177. data/lib/fractor/workflow/visualizer.rb +211 -0
  178. data/lib/fractor/workflow/workflow_context.rb +132 -0
  179. data/lib/fractor/workflow/workflow_executor.rb +669 -0
  180. data/lib/fractor/workflow/workflow_result.rb +55 -0
  181. data/lib/fractor/workflow/workflow_validator.rb +295 -0
  182. data/lib/fractor/workflow.rb +333 -0
  183. data/lib/fractor/wrapped_ractor.rb +66 -91
  184. data/lib/fractor/wrapped_ractor3.rb +161 -0
  185. data/lib/fractor/wrapped_ractor4.rb +242 -0
  186. data/lib/fractor.rb +93 -3
  187. metadata +192 -6
  188. data/tests/sample.rb.bak +0 -309
  189. data/tests/sample_working.rb.bak +0 -209
@@ -0,0 +1,248 @@
+ = Retry Workflow Example
+
+ This example demonstrates Fractor's retry and error handling capabilities for workflows.
+
+ == Overview
+
+ The retry feature allows workflows to automatically retry failed jobs with configurable backoff strategies. This is essential for handling transient errors in production systems, such as:
+
+ * Temporary network issues
+ * Rate-limited API calls
+ * Database connection timeouts
+ * Resource contention
+
+ == Features Demonstrated
+
+ === Retry Strategies
+
+ ==== Exponential Backoff
+
+ Delays increase exponentially between retries:
+
+ [source,ruby]
+ ----
+ retry_on_error max_attempts: 3,
+                backoff: :exponential,
+                initial_delay: 0.5,
+                max_delay: 5
+ ----
+
+ Delay sequence: 0.5s → 1s → 2s → 4s (capped at 5s)
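+
+ For reference, this progression follows the usual exponential backoff formula `delay(n) = min(initial_delay * multiplier^(n-1), max_delay)`, with `multiplier` left at its documented default of 2. A minimal sketch reproducing the sequence:
+
+ [source,ruby]
+ ----
+ # Sketch only: expected delay progression for the configuration above,
+ # assuming delay(n) = min(initial_delay * multiplier**(n - 1), max_delay).
+ initial_delay, multiplier, max_delay = 0.5, 2, 5
+ delays = (1..4).map { |n| [initial_delay * multiplier**(n - 1), max_delay].min }
+ # => [0.5, 1.0, 2.0, 4.0]
+ ----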
+
+ ==== Linear Backoff
+
+ Delays increase linearly between retries:
+
+ [source,ruby]
+ ----
+ retry_on_error max_attempts: 5,
+                backoff: :linear,
+                initial_delay: 1,
+                increment: 0.5
+ ----
+
+ Delay sequence: 1s → 1.5s → 2s → 2.5s → 3s
+
+ ==== Constant Delay
+
+ Fixed delay between retries:
+
+ [source,ruby]
+ ----
+ retry_on_error max_attempts: 4,
+                backoff: :constant,
+                delay: 1
+ ----
+
+ Delay sequence: 1s → 1s → 1s → 1s
+
+ === Error Handlers
+
+ Add custom error handling logic:
+
+ [source,ruby]
+ ----
+ on_error do |error, context|
+   puts "Error in job: #{error.message}"
+   # Log to monitoring service
+   # Send alerts
+   # Update metrics
+ end
+ ----
+
+ === Fallback Jobs
+
+ Provide alternative execution paths when retries are exhausted:
+
+ [source,ruby]
+ ----
+ job "fetch_api_data" do
+   runs_with UnreliableApiWorker
+   retry_on_error max_attempts: 3, backoff: :exponential
+   fallback_to "fetch_cached_data"
+ end
+
+ job "fetch_cached_data" do
+   runs_with CachedDataWorker
+   inputs_from_workflow
+ end
+ ----
+
+ == Running the Example
+
+ [source,shell]
+ ----
+ ruby examples/workflow/retry/retry_workflow.rb
+ ----
+
+ == Example Output
+
+ ----
+ ============================================================
+ Retry Workflow Examples
+ ============================================================
+
+ 1. Exponential Backoff Retry (with fallback)
+ ------------------------------------------------------------
+ Error in fetch_api_data: API timeout: https://api.example.com/data
+ Executing fallback job
+ Result: Using cached data from https://api.example.com/data
+ Status: SUCCESS
+
+ 2. Linear Backoff Retry
+ ------------------------------------------------------------
+ Job retry attempt: attempt 2/5, delay: 1.0s
+ Job retry attempt: attempt 3/5, delay: 1.5s
+ Job retry succeeded on attempt 3
+ Result: Fresh data from https://api.example.com/data
+ Status: SUCCESS
+
+ 3. Constant Delay Retry
+ ------------------------------------------------------------
+ Job retry attempt: attempt 2/4, delay: 1.0s
+ Job retry attempt: attempt 3/4, delay: 1.0s
+ Job retry attempt: attempt 4/4, delay: 1.0s
+ Workflow failed after retries: Job 'fetch_api_data' failed: API timeout
+ ----
+
+ == Use Cases
+
+ === External API Calls
+
+ [example]
+ ====
+ [source,ruby]
+ ----
+ job "call_payment_api" do
+   runs_with PaymentApiWorker
+   retry_on_error max_attempts: 5,
+                  backoff: :exponential,
+                  retryable_errors: [Net::HTTPRetriableError, Timeout::Error]
+ end
+ ----
+ ====
+
+ === Database Operations
+
+ [example]
+ ====
+ [source,ruby]
+ ----
+ job "save_to_database" do
+   runs_with DatabaseWorker
+   retry_on_error max_attempts: 3,
+                  backoff: :linear,
+                  retryable_errors: [ActiveRecord::ConnectionNotEstablished]
+ end
+ ----
+ ====
+
+ === File I/O
+
+ [example]
+ ====
+ [source,ruby]
+ ----
+ job "write_to_storage" do
+   runs_with StorageWorker
+   retry_on_error max_attempts: 4,
+                  backoff: :constant,
+                  delay: 2
+   fallback_to "write_to_local_cache"
+ end
+ ----
+ ====
+
+ == Best Practices
+
+ . *Choose appropriate backoff strategy*
+ * Exponential: Best for most transient errors
+ * Linear: Good for predictable retry windows
+ * Constant: Use when timing is critical
+
+ . *Set reasonable max_attempts*
+ * Too few: Might not recover from transient issues
+ * Too many: Wastes resources and delays failure detection
+
+ . *Use retryable_errors selectively*
+ * Only retry errors that are likely to be transient
+ * Don't retry validation errors or permanent failures
+
+ . *Implement fallback strategies*
+ * Provide cached data
+ * Use default values
+ * Degrade gracefully
+
+ . *Monitor retry metrics*
+ * Track retry rates
+ * Alert on high retry counts
+ * Identify patterns in failures
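+
+ Taken together, these practices might combine into a single job definition along the following lines. This is a minimal sketch, not code from the example: the worker class, job names, and error classes are placeholders.
+
+ [source,ruby]
+ ----
+ # Sketch only -- InventorySyncWorker and the error classes are placeholders.
+ job "sync_inventory" do
+   runs_with InventorySyncWorker
+   # Retry only errors that are likely to be transient, and fail fast otherwise.
+   retry_on_error max_attempts: 3,
+                  backoff: :exponential,
+                  initial_delay: 0.5,
+                  max_delay: 10,
+                  retryable_errors: [Timeout::Error, IOError]
+   # Feed retry metrics and alerting from the error handler.
+   on_error do |error, context|
+     puts "sync_inventory error: #{error.class}: #{error.message}"
+   end
+   # Degrade gracefully when retries are exhausted.
+   fallback_to "use_last_known_inventory"
+ end
+ ----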
+
+ == Configuration Options
+
+ [cols="1,1,3"]
+ |===
+ |Option |Type |Description
+
+ |`max_attempts`
+ |Integer
+ |Maximum number of attempts (including initial attempt). Default: 3
+
+ |`backoff`
+ |Symbol
+ |Retry strategy: `:exponential`, `:linear`, `:constant`, `:none`. Default: `:exponential`
+
+ |`initial_delay`
+ |Numeric
+ |Initial delay in seconds. Default: 1
+
+ |`max_delay`
+ |Numeric
+ |Maximum delay between retries. Default: nil (no cap)
+
+ |`increment`
+ |Numeric
+ |(Linear only) Delay increment per attempt. Default: 1
+
+ |`multiplier`
+ |Numeric
+ |(Exponential only) Delay multiplier per attempt. Default: 2
+
+ |`delay`
+ |Numeric
+ |(Constant only) Fixed delay in seconds. Default: 1
+
+ |`timeout`
+ |Numeric
+ |Job execution timeout in seconds. Default: nil
+
+ |`retryable_errors`
+ |Array<Class>
+ |List of retryable error classes. Default: `[StandardError]`
+ |===
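+
+ The `multiplier` and `timeout` options are not exercised by the examples above. A minimal sketch of how they might be combined (values are illustrative only):
+
+ [source,ruby]
+ ----
+ # Sketch only: aggressive exponential backoff with a per-attempt timeout.
+ retry_on_error max_attempts: 4,
+                backoff: :exponential,
+                initial_delay: 0.2,
+                multiplier: 3,          # 0.2s -> 0.6s -> 1.8s between attempts
+                max_delay: 10,
+                timeout: 30,            # abort any single attempt after 30s
+                retryable_errors: [Timeout::Error]
+ ----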
+
+ == See Also
+
+ * link:../../../docs/workflows.adoc[Workflow Documentation]
+ * link:../simple_linear/README.adoc[Simple Linear Workflow]
+ * link:../conditional/README.adoc[Conditional Workflow]
@@ -0,0 +1,195 @@
+ # frozen_string_literal: true
+
+ require_relative "../../../lib/fractor"
+
+ # Simulates an unreliable external API
+ class UnreliableApiWorker < Fractor::Worker
+   input_type String
+   output_type Hash
+
+   def process(work)
+     api_url = work.input
+
+     # Simulate random failures (70% failure rate for demonstration)
+     if rand < 0.7
+       raise StandardError, "API timeout: #{api_url}"
+     end
+
+     # Simulate successful response
+     Fractor::WorkResult.new(
+       result: {
+         status: "success",
+         data: { url: api_url, timestamp: Time.now },
+       },
+       work: work,
+     )
+   end
+ end
+
+ # Fallback worker that uses cached data
+ class CachedDataWorker < Fractor::Worker
+   input_type String
+   output_type Hash
+
+   def process(work)
+     api_url = work.input
+
+     # Return cached/default data
+     Fractor::WorkResult.new(
+       result: {
+         status: "cached",
+         data: { url: api_url, cached: true, timestamp: Time.now - 3600 },
+       },
+       work: work,
+     )
+   end
+ end
+
+ # Process the API response
+ class ProcessResponseWorker < Fractor::Worker
+   input_type Hash
+   output_type String
+
+   def process(work)
+     response = work.input
+     status = response[:status]
+     data = response[:data]
+
+     message = if status == "success"
+                 "Fresh data from #{data[:url]}"
+               else
+                 "Using cached data from #{data[:url]}"
+               end
+
+     Fractor::WorkResult.new(result: message, work: work)
+   end
+ end
+
+ # Workflow demonstrating retry with exponential backoff
+ class ExponentialRetryWorkflow < Fractor::Workflow
+   workflow "exponential-retry-demo" do
+     start_with "fetch_api_data"
+
+     job "fetch_api_data" do
+       runs_with UnreliableApiWorker
+       inputs_from_workflow
+
+       # Retry up to 3 times with exponential backoff
+       retry_on_error max_attempts: 3,
+                      backoff: :exponential,
+                      initial_delay: 0.5,
+                      max_delay: 5
+
+       # Add error handler for logging
+       on_error do |error, context|
+         puts "Error in fetch_api_data: #{error.message}"
+       end
+
+       # Fallback to cached data if all retries fail
+       fallback_to "fetch_cached_data"
+     end
+
+     job "fetch_cached_data" do
+       runs_with CachedDataWorker
+       inputs_from_workflow
+     end
+
+     job "process_response" do
+       runs_with ProcessResponseWorker
+       needs "fetch_api_data"
+       outputs_to_workflow
+     end
+   end
+ end
+
+ # Workflow demonstrating retry with linear backoff
+ class LinearRetryWorkflow < Fractor::Workflow
+   workflow "linear-retry-demo" do
+     start_with "fetch_api_data"
+
+     job "fetch_api_data" do
+       runs_with UnreliableApiWorker
+       inputs_from_workflow
+
+       # Retry up to 5 times with linear backoff
+       retry_on_error max_attempts: 5,
+                      backoff: :linear,
+                      initial_delay: 1,
+                      increment: 0.5
+     end
+
+     job "process_response" do
+       runs_with ProcessResponseWorker
+       needs "fetch_api_data"
+       outputs_to_workflow
+     end
+   end
+ end
+
+ # Workflow demonstrating retry with constant delay
+ class ConstantRetryWorkflow < Fractor::Workflow
+   workflow "constant-retry-demo" do
+     start_with "fetch_api_data"
+
+     job "fetch_api_data" do
+       runs_with UnreliableApiWorker
+       inputs_from_workflow
+
+       # Retry up to 4 times with constant 1 second delay
+       retry_on_error max_attempts: 4,
+                      backoff: :constant,
+                      delay: 1
+     end
+
+     job "process_response" do
+       runs_with ProcessResponseWorker
+       needs "fetch_api_data"
+       outputs_to_workflow
+     end
+   end
+ end
+
+ if __FILE__ == $PROGRAM_NAME
+   puts "=" * 60
+   puts "Retry Workflow Examples"
+   puts "=" * 60
+   puts
+
+   api_url = "https://api.example.com/data"
+
+   puts "1. Exponential Backoff Retry (with fallback)"
+   puts "-" * 60
+   workflow1 = ExponentialRetryWorkflow.new
+   result1 = workflow1.execute(api_url)
+   puts "Result: #{result1.output}"
+   puts "Status: #{result1.success? ? 'SUCCESS' : 'FAILED'}"
+   puts
+
+   puts "2. Linear Backoff Retry"
+   puts "-" * 60
+   workflow2 = LinearRetryWorkflow.new
+   begin
+     result2 = workflow2.execute(api_url)
+     puts "Result: #{result2.output}"
+     puts "Status: #{result2.success? ? 'SUCCESS' : 'FAILED'}"
+   rescue Fractor::WorkflowExecutionError => e
+     puts "Workflow failed after retries: #{e.message}"
+   end
+   puts
+
+   puts "3. Constant Delay Retry"
+   puts "-" * 60
+   workflow3 = ConstantRetryWorkflow.new
+   begin
+     result3 = workflow3.execute(api_url)
+     puts "Result: #{result3.output}"
+     puts "Status: #{result3.success? ? 'SUCCESS' : 'FAILED'}"
+   rescue Fractor::WorkflowExecutionError => e
+     puts "Workflow failed after retries: #{e.message}"
+   end
+   puts
+
+   puts "=" * 60
+   puts "Examples complete"
+   puts "=" * 60
+ end
@@ -0,0 +1,267 @@
+ = Simple Linear Workflow
+
+ == Purpose
+
+ Demonstrates sequential job processing with data transformation at each stage using Fractor's workflow system.
+
+ == Focus
+
+ This example focuses on demonstrating:
+
+ * Sequential job dependencies using `needs`
+ * Type-safe data flow with `input_type` and `output_type` declarations
+ * Workflow entry point definition with `start_with`
+ * Workflow exit point definition with `end_with`
+ * Job output mapping using `inputs_from_job`
+ * Data transformation through a pipeline of workers
+
+ == Architecture
+
+ .Data Flow Through Sequential Jobs
+ [source]
+ ----
+ [Workflow Input]
+
+         │ TextData { text: "hello world from fractor" }
+
+ ┌─────────────────┐
+ │ Uppercase Job   │
+ │ UppercaseWorker │
+ └─────────────────┘
+
+         │ UppercaseOutput { uppercased_text: "HELLO...", char_count: 24 }
+
+ ┌─────────────────┐
+ │ Reverse Job     │
+ │ ReverseWorker   │
+ └─────────────────┘
+
+         │ ReversedOutput { reversed_text: "ROTCARF...", word_count: 4 }
+
+ ┌─────────────────┐
+ │ Finalize Job    │
+ │ FinalizeWorker  │
+ └─────────────────┘
+
+         │ FinalOutput { result: "ROTCARF MORF DLROW OLLEH", total_operations: 3 }
+
+ [Workflow Output]
+ ----
+
+ .Workflow Execution Flow
+ [source]
+ ----
+ Main Thread
+
+ ├─→ Create Workflow Input (TextData)
+
+ ├─→ Execute Workflow
+ │   │
+ │   ├─→ Validate Workflow Structure
+ │   │   • Check dependencies (topological sort)
+ │   │   • Verify entry/exit points
+ │   │   • Validate type declarations
+ │   │
+ │   ├─→ Execute Job: "uppercase"
+ │   │   │
+ │   │   ├─→ Create Work from workflow input
+ │   │   ├─→ Spawn Worker Ractor (UppercaseWorker)
+ │   │   ├─→ Process work
+ │   │   └─→ Store result in context
+ │   │
+ │   ├─→ Execute Job: "reverse"
+ │   │   │
+ │   │   ├─→ Build input from "uppercase" output
+ │   │   ├─→ Spawn Worker Ractor (ReverseWorker)
+ │   │   ├─→ Process work
+ │   │   └─→ Store result in context
+ │   │
+ │   └─→ Execute Job: "finalize"
+ │       │
+ │       ├─→ Build input from "reverse" output
+ │       ├─→ Spawn Worker Ractor (FinalizeWorker)
+ │       ├─→ Process work
+ │       └─→ Store result in context
+
+ └─→ Return Workflow Result
+     • Status: SUCCESS/FAILED
+     • Output: FinalOutput object
+     • Execution time
+     • Completed jobs list
+ ----
+
+ == Key Components
+
+ === Data Models
+
+ Each stage requires explicit input and output type declarations:
+
+ [source,ruby]
+ ----
+ class TextData
+   attr_accessor :text
+
+   def initialize(text:)
+     @text = text
+   end
+ end
+
+ class UppercaseOutput
+   attr_accessor :uppercased_text, :char_count
+
+   def initialize(uppercased_text:, char_count:)
+     @uppercased_text = uppercased_text
+     @char_count = char_count
+   end
+ end
+ ----
+
+ === Workers
+
+ Workers declare their expected input and output types using class methods:
+
+ [source,ruby]
+ ----
+ class UppercaseWorker < Fractor::Worker
+   input_type TextData         # <1>
+   output_type UppercaseOutput # <2>
+
+   def process(work)
+     input = work.input
+     uppercased = input.text.upcase
+
+     output = UppercaseOutput.new(
+       uppercased_text: uppercased,
+       char_count: uppercased.length,
+     )
+
+     Fractor::WorkResult.new(result: output, work: work) # <3>
+   end
+ end
+ ----
+ <1> Declares the expected input type for type validation
+ <2> Declares the output type returned by this worker
+ <3> Returns a WorkResult wrapping the output
+
+ === Workflow Definition
+
+ The workflow DSL defines the job dependencies and data flow:
+
+ [source,ruby]
+ ----
+ class SimpleLinearWorkflow < Fractor::Workflow
+   workflow "simple-linear" do
+     # Declare workflow input/output types
+     input_type TextData     # <1>
+     output_type FinalOutput # <2>
+
+     # Define workflow boundaries
+     start_with "uppercase" # <3>
+     end_with "finalize"    # <4>
+
+     # Job 1: Uppercase the text
+     job "uppercase" do
+       runs_with UppercaseWorker # <5>
+       inputs_from_workflow      # <6>
+     end
+
+     # Job 2: Reverse the uppercased text
+     job "reverse" do
+       needs "uppercase" # <7>
+       runs_with ReverseWorker
+       inputs_from_job "uppercase" # <8>
+     end
+
+     # Job 3: Finalize the result
+     job "finalize" do
+       needs "reverse"
+       runs_with FinalizeWorker
+       inputs_from_job "reverse"
+       outputs_to_workflow # <9>
+       terminates_workflow # <10>
+     end
+   end
+ end
+ ----
+ <1> Workflow accepts TextData as input
+ <2> Workflow returns FinalOutput as output
+ <3> First job to execute
+ <4> Last job that completes the workflow
+ <5> Associates job with worker class
+ <6> Job takes input directly from workflow input
+ <7> Job dependency - must run after "uppercase"
+ <8> Job takes input from "uppercase" job's output
+ <9> Job's output becomes workflow output
+ <10> Job terminates the workflow when complete
+
+ == Usage
+
+ Run the example from the project root:
+
+ [source,shell]
+ ----
+ ruby examples/workflow/simple_linear/simple_linear_workflow.rb
+ ----
+
+ == Expected Output
+
+ [example]
+ ====
+ [source]
+ ----
+ Simple Linear Workflow Example
+ ==================================================
+
+ Input: hello world from fractor
+
+ Workflow Results:
+ --------------------------------------------------
+ Status: SUCCESS
+ Execution Time: 0.002s
+ Completed Jobs: uppercase, reverse, finalize
+
+ Final Output:
+   Result: ROTCARF MORF DLROW OLLEH
+   Total Operations: 3
+ ----
+ ====
+
+ == Learning Points
+
+ === Sequential Dependencies
+
+ * Jobs execute in topological order based on `needs` declarations
+ * Each job waits for its dependencies to complete before starting
+ * The workflow engine automatically computes the execution order
+
+ === Type Safety
+
+ * Workers declare expected input/output types using `input_type` and `output_type`
+ * Workflow validates type compatibility at definition time
+ * Type declarations serve as documentation and enable validation
+
+ === Data Flow
+
+ * `inputs_from_workflow`: Job receives workflow input directly
+ * `inputs_from_job "source"`: Job receives output from specified job
+ * `outputs_to_workflow`: Job's output becomes the workflow result
+
+ === Workflow Boundaries
+
+ * `start_with`: Defines entry point(s) for the workflow
+ * `end_with`: Defines exit point(s) for the workflow
+ * `terminates_workflow`: Marks jobs that complete the workflow
+
+ === Error Handling
+
+ * If any job fails, the workflow stops immediately
+ * Workflow result includes success status and error information
+ * Completed jobs are tracked even if workflow fails partway through
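+
+ A minimal sketch of calling code that reflects these points, assuming the `execute`, `success?`, and `output` API shown in the sibling retry example (the accessor for error details is not documented here, so it is only hinted at in a comment):
+
+ [source,ruby]
+ ----
+ # Sketch only: consuming a workflow result, assuming the API used by the
+ # sibling examples (execute, success?, output, WorkflowExecutionError).
+ workflow = SimpleLinearWorkflow.new
+ begin
+   result = workflow.execute(TextData.new(text: "hello world from fractor"))
+   if result.success?
+     puts "Result: #{result.output.result}"
+     puts "Total Operations: #{result.output.total_operations}"
+   else
+     puts "Workflow failed"  # error details are carried on the result object
+   end
+ rescue Fractor::WorkflowExecutionError => e
+   puts "Execution aborted: #{e.message}"
+ end
+ ----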
260
+
261
+ == Next Steps
262
+
263
+ After understanding simple linear workflows, explore:
264
+
265
+ * link:../fan_out/README.adoc[Fan-Out Workflow] - Parallel job execution patterns
266
+ * link:../conditional/README.adoc[Conditional Workflow] - Runtime conditional execution
267
+ * link:../README.adoc[Workflow Overview] - Complete workflow system documentation