flowengine 0.2.1 → 0.3.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: c6118d723586350e431080d822f01dbda45c9cc58257dd812e4d3dd7c30f8de6
4
- data.tar.gz: 606d8ac60526863acb699c166b396d00baa29fdedb009df4c4a4b6bfb4a783a0
3
+ metadata.gz: 2cfa285d8ba7e6975cffead57c7a737c42ddb119be74e37edb617198c56d391c
4
+ data.tar.gz: 842c4b843cbdd83fceb49cb066bb49553d4d95a6d0cf2d1df891c8d553eb005b
5
5
  SHA512:
6
- metadata.gz: 00c92c2094931b5c4d427aec5e2de4bce98d733be459aadd367ce58f3d64564c7332c53d8810d37d1f5d95352439572fe2252eb6068a4fa9278c0e1cef223c78
7
- data.tar.gz: d9bc34f2cfeb87ee7e6b6bc75e429903db85d4d2482a517240da2ab995bdd38202a03f3585f54a00e5e1d263751cf9f82225e34303889bbb67cdcbc0a7e47e92
6
+ metadata.gz: 742b19a6eb3c667612ce74d83dab6f0402b02bf5cc0f808a4cf9c2070dfb649617e126a6674c7059465cf6654f0898d9d367bd483c6da4bffe8ec785ec5ae6c1
7
+ data.tar.gz: 225aae48aa9c12121c7a466a263ff66a5735064472720776cf17f0404b9d3a9f581107df57a8efa2d4071f3a635568b3b7ed37ff8d58f4b876f6b29af498e2fd
data/.rubocop_todo.yml CHANGED
@@ -1,6 +1,6 @@
1
1
  # This configuration was generated by
2
2
  # `rubocop --auto-gen-config`
3
- # on 2026-03-10 19:57:08 UTC using RuboCop version 1.85.1.
3
+ # on 2026-03-11 14:01:54 UTC using RuboCop version 1.85.1.
4
4
  # The point is for the user to remove these configuration records
5
5
  # one by one as the offenses are removed from the code base.
6
6
  # Note that changes in the inspected code, or installation of new
@@ -13,11 +13,6 @@ Gemspec/DevelopmentDependencies:
13
13
  Exclude:
14
14
  - 'flowengine.gemspec'
15
15
 
16
- # Offense count: 1
17
- # Configuration parameters: AllowedMethods, AllowedPatterns, CountRepeatedAttributes.
18
- Metrics/AbcSize:
19
- Max: 18
20
-
21
16
  # Offense count: 2
22
17
  # Configuration parameters: AllowedMethods, AllowedPatterns.
23
18
  Metrics/CyclomaticComplexity:
data/README.md CHANGED
@@ -98,6 +98,69 @@ engine.history
98
98
 
99
99
  ### Using the `flowengine-cli` gem to Generate the JSON Answers File
100
100
 
101
+ ---
102
+
103
+ ## LLM-Based DSL Capabilities & Environment Variables
104
+
105
+ Several environment variables define which vendor and which model you talk to, should you choose to engage an LLM in your decision logic.
106
+
107
+ A YAML file shipped with this gem locks in the list of supported vendors and three tiers of models per vendor:
108
+
109
+ - best bang for the buck models
110
+ - deep thinking and hard mode models
111
+ - fastest models
112
+ - *at some point we might also add the "cheapest".*
113
+
114
+ The file [resources/models.yml](resources/models.yml) defines which models are available to the adapter. The gem reads this file at startup to load and initialize an LLM adapter for each vendor whose API key is defined in the environment. For each of those vendors, at least three model names are loaded:
115
+
116
+ - `top:` — best results, likely the most expensive.
117
+ - `default:` — the model used when the adapter's caller does not specify one.
118
+ - `fastest:` — the fastest model from this vendor.
119
+
120
+ Here are the contents of `resources/models.yml` verbatim:
121
+
122
+ ```yaml
123
+ models:
124
+ version: "1.0"
125
+ date: "Wed Mar 11 02:35:39 PDT 2026"
126
+ vendors:
127
+ anthropic:
128
+ adapter: "FlowEngine::LLM::Adapters::AnthropicAdapter"
129
+ var: "ANTHROPIC_API_KEY"
130
+ top: "claude-opus-4-6"
131
+ default: "claude-sonnet-4-6"
132
+ fastest: "claude-haiku-4-5-20251001"
133
+ openai:
134
+ adapter: "FlowEngine::LLM::Adapters::OpenAIAdapter"
135
+ var: "OPENAI_API_KEY"
136
+ top: "gpt-5.4"
137
+ default: "gpt-5-mini"
138
+ fastest: "gpt-5-nano"
139
+ gemini:
140
+ adapter: "FlowEngine::LLM::Adapters::GeminiAdapter"
141
+ var: "GEMINI_API_KEY"
142
+ top: "gemini-3.1-pro-preview"
143
+ default: "gemini-2.5-flash"
144
+ fastest: "gemini-2.5-flash-lite"
145
+ ```
146
+
147
+ Notice how this file acts as glue for the gem: it explicitly names the environment variables that hold your API keys, the class names of the corresponding adapters, and the three models for each vendor:
148
+
149
+ 1. `:top`
150
+ 2. `:default`
151
+ 3. `:fastest`
152
+
153
+ > [!IMPORTANT]
154
+ >
155
+ > The reason these models are extracted into a separate YAML file should be obvious: the contents of this list seem to change every week, while the gem can remain at the same version for years. For this reason, the gem honors the environment variable `FLOWENGINE_LLM_MODELS_PATH` and will read the vendors and models from the file at that path. This is your door to newer models, and to other LLM vendors that RubyLLM supports.
156
+
157
+ When the gem loads, one of the first things it does is read this YAML file and build the hash of pre-initialized adapters.
158
+
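A minimal sketch of what that startup step might look like (hypothetical internals; the helper names here are illustrative, not the gem's actual API):

```ruby
require "yaml"

# Hypothetical sketch of the startup step: parse models.yml and keep
# only the vendors whose API-key environment variable is set.
MODELS_YAML = <<~YAML
  models:
    version: "1.0"
    vendors:
      anthropic:
        adapter: "FlowEngine::LLM::Adapters::AnthropicAdapter"
        var: "ANTHROPIC_API_KEY"
        top: "claude-opus-4-6"
        default: "claude-sonnet-4-6"
        fastest: "claude-haiku-4-5-20251001"
      openai:
        adapter: "FlowEngine::LLM::Adapters::OpenAIAdapter"
        var: "OPENAI_API_KEY"
        top: "gpt-5.4"
        default: "gpt-5-mini"
        fastest: "gpt-5-nano"
YAML

# Vendors whose key is present in the given environment hash.
def available_vendors(vendors, env = ENV)
  vendors.select { |_name, cfg| env.key?(cfg["var"]) }
end

# Resolve a model name by tier ("top" / "default" / "fastest").
def model_for(vendor_cfg, tier = "default")
  vendor_cfg.fetch(tier)
end

vendors = YAML.safe_load(MODELS_YAML).dig("models", "vendors")
```

Passing the environment hash in explicitly keeps the selection logic testable without touching the real `ENV`.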
159
+ Need an adapter for your API call?
160
+
161
+ ```ruby
162
+ FlowEngine::LLM[vendor: :anthropic, model: 'claude-opus-4-6']
163
+ ```
+
101
164
  ## LLM-parsed Introduction
102
165
 
103
166
  FlowEngine supports an optional **introduction step** that collects free-form text from the user before the structured flow begins. An LLM parses this text to pre-fill answers, automatically skipping steps the user already answered in their introduction.
@@ -144,9 +207,13 @@ end
144
207
  ### Using the Introduction at Runtime
145
208
 
146
209
  ```ruby
147
- # 1. Configure an LLM adapter and client
148
- adapter = FlowEngine::LLM::OpenAIAdapter.new(api_key: ENV["OPENAI_API_KEY"])
149
- client = FlowEngine::LLM::Client.new(adapter: adapter, model: "gpt-4o-mini")
210
+ # 1. Auto-detect adapter from environment (checks ANTHROPIC_API_KEY, OPENAI_API_KEY, GEMINI_API_KEY)
211
+ client = FlowEngine::LLM.auto_client
212
+
213
+ # Or explicitly choose a provider:
214
+ # client = FlowEngine::LLM.auto_client(anthropic_api_key: "sk-ant-...")
215
+ # client = FlowEngine::LLM.auto_client(openai_api_key: "sk-...", model: "gpt-4o")
216
+ # client = FlowEngine::LLM.auto_client(gemini_api_key: "AIza...")
150
217
 
151
218
  # 2. Create the engine and submit the introduction
152
219
  engine = FlowEngine::Engine.new(definition)
@@ -176,19 +243,27 @@ Before any text reaches the LLM, `submit_introduction` scans for sensitive data
176
243
  - **EIN**: `12-3456789`
177
244
  - **Nine consecutive digits**: `123456789`
178
245
 
179
- If detected, a `FlowEngine::SensitiveDataError` is raised immediately. The introduction text is discarded and no LLM call is made.
246
+ If detected, a `FlowEngine::Errors::SensitiveDataError` is raised immediately. The introduction text is discarded and no LLM call is made.
180
247
 
181
248
  ```ruby
182
249
  engine.submit_introduction("My SSN is 123-45-6789", llm_client: client)
183
- # => raises FlowEngine::SensitiveDataError
250
+ # => raises FlowEngine::Errors::SensitiveDataError
184
251
  ```
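The scan described above can be approximated with a few regular expressions. A sketch, not the gem's actual implementation; `FlowEngine::LLM::SensitiveDataFilter` may use different patterns:

```ruby
# Sketch of a sensitive-data scanner covering the patterns listed
# above: SSN/ITIN, EIN, and bare nine-digit runs.
SENSITIVE_PATTERNS = [
  /\b\d{3}-\d{2}-\d{4}\b/,   # SSN / ITIN formatted 123-45-6789
  /\b\d{2}-\d{7}\b/,         # EIN formatted 12-3456789
  /\b\d{9}\b/                # nine consecutive digits
].freeze

class SensitiveDataError < StandardError; end

# Raises before any text could reach an LLM; returns the text untouched
# when it is clean.
def check_sensitive!(text)
  return text unless SENSITIVE_PATTERNS.any? { |re| text.match?(re) }

  raise SensitiveDataError, "input contains sensitive data; text discarded"
end
```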
185
252
 
186
253
  ### Custom LLM Adapters
187
254
 
188
- The LLM integration uses an adapter pattern. The gem ships with an OpenAI adapter (via the [`ruby_llm`](https://github.com/crmne/ruby_llm) gem), but you can create adapters for any provider:
255
+ The LLM integration uses an adapter pattern. The gem ships with three adapters (all via the [`ruby_llm`](https://github.com/crmne/ruby_llm) gem):
256
+
257
+ | Adapter | Env Variable | Default Model |
258
+ |---------|-------------|---------------|
259
+ | `AnthropicAdapter` | `ANTHROPIC_API_KEY` | `claude-sonnet-4-20250514` |
260
+ | `OpenAIAdapter` | `OPENAI_API_KEY` | `gpt-4o-mini` |
261
+ | `GeminiAdapter` | `GEMINI_API_KEY` | `gemini-2.0-flash` |
262
+
263
+ You can also create adapters for any other provider:
189
264
 
190
265
  ```ruby
191
- class MyAnthropicAdapter < FlowEngine::LLM::Adapter
266
+ class MyCustomAdapter < FlowEngine::LLM::Adapter
192
267
  def initialize(api_key:)
193
268
  super()
194
269
  @api_key = api_key
@@ -200,8 +275,8 @@ class MyAnthropicAdapter < FlowEngine::LLM::Adapter
200
275
  end
201
276
  end
202
277
 
203
- adapter = MyAnthropicAdapter.new(api_key: ENV["ANTHROPIC_API_KEY"])
204
- client = FlowEngine::LLM::Client.new(adapter: adapter, model: "claude-sonnet-4-20250514")
278
+ adapter = MyCustomAdapter.new(api_key: ENV["MY_API_KEY"])
279
+ client = FlowEngine::LLM::Client.new(adapter: adapter, model: "my-model")
205
280
  ```
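The same pattern works for a canned-response adapter in unit tests. A sketch: the base class is stubbed here so the example runs standalone, and the `#complete` method name is an assumption about the abstract interface:

```ruby
# Stub of the abstract base class, for illustration only; the real
# interface ships with the gem and may differ.
module FlowEngine
  module LLM
    class Adapter
      def complete(prompt:)
        raise NotImplementedError
      end
    end
  end
end

# A canned-response adapter: handy in unit tests, since no network
# call or API key is needed.
class FakeAdapter < FlowEngine::LLM::Adapter
  def initialize(canned_response)
    super()
    @canned_response = canned_response
  end

  def complete(prompt:)
    @canned_response
  end
end
```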
206
281
 
207
282
  ### State Persistence
@@ -236,8 +311,11 @@ The core has **zero UI logic**, **zero DB logic**, and **zero framework dependen
236
311
  | `Engine` | Stateful runtime: tracks current step, answers, history, and introduction |
237
312
  | `Validation::Adapter` | Interface for pluggable validation (dry-validation, JSON Schema, etc.) |
238
313
  | `LLM::Adapter` | Abstract interface for LLM API calls |
314
+ | `LLM::AnthropicAdapter` | Anthropic/Claude implementation via `ruby_llm` gem |
239
315
  | `LLM::OpenAIAdapter` | OpenAI implementation via `ruby_llm` gem |
316
+ | `LLM::GeminiAdapter` | Google Gemini implementation via `ruby_llm` gem |
240
317
  | `LLM::Client` | High-level: builds prompt, calls adapter, parses JSON response |
318
+ | `LLM.auto_client` | Factory: auto-detects provider from environment API keys |
241
319
  | `LLM::SensitiveDataFilter` | Rejects text containing SSN, ITIN, EIN patterns |
242
320
  | `Graph::MermaidExporter` | Exports the flow definition as a Mermaid diagram |
243
321
 
@@ -452,11 +530,11 @@ engine = FlowEngine::Engine.new(definition)
452
530
  ```ruby
453
531
  # Answering after the flow is complete
454
532
  engine.answer("extra")
455
- # => raises FlowEngine::AlreadyFinishedError
533
+ # => raises FlowEngine::Errors::AlreadyFinishedError
456
534
 
457
535
  # Referencing an unknown step in a definition
458
536
  definition.step(:nonexistent)
459
- # => raises FlowEngine::UnknownStepError
537
+ # => raises FlowEngine::Errors::UnknownStepError
460
538
 
461
539
  # Invalid definition (start step doesn't exist)
462
540
  FlowEngine.define do
@@ -466,19 +544,19 @@ FlowEngine.define do
466
544
  question "Hello"
467
545
  end
468
546
  end
469
- # => raises FlowEngine::DefinitionError
547
+ # => raises FlowEngine::Errors::DefinitionError
470
548
 
471
549
  # Sensitive data in introduction
472
550
  engine.submit_introduction("My SSN is 123-45-6789", llm_client: client)
473
- # => raises FlowEngine::SensitiveDataError
551
+ # => raises FlowEngine::Errors::SensitiveDataError
474
552
 
475
553
  # Introduction exceeds maxlength
476
554
  engine.submit_introduction("A" * 3000, llm_client: client)
477
- # => raises FlowEngine::ValidationError
555
+ # => raises FlowEngine::Errors::ValidationError
478
556
 
479
557
  # Missing API key or LLM response parsing failure
480
- FlowEngine::LLM::OpenAIAdapter.new # without OPENAI_API_KEY
481
- # => raises FlowEngine::LLMError
558
+ FlowEngine::LLM::Adapters::OpenAIAdapter.new # without OPENAI_API_KEY
559
+ # => raises FlowEngine::Errors::LLMError
482
560
  ```
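All of these now live under a single `FlowEngine::Errors` namespace, which makes it easy to rescue everything the gem raises in one place. A sketch of the pattern; the shared `Error` base class is an assumption, not confirmed by the README:

```ruby
# Illustrative hierarchy: a shared base class (assumed) with the
# specific subclasses named in the examples above.
module FlowEngine
  module Errors
    class Error < StandardError; end
    class AlreadyFinishedError < Error; end
    class UnknownStepError < Error; end
    class DefinitionError < Error; end
    class ValidationError < Error; end
  end
end

# Rescuing the base class catches any of the gem's errors at once.
def handle
  yield
rescue FlowEngine::Errors::Error => e
  "handled: #{e.class.name.split('::').last}"
end
```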
483
561
 
484
562
  ## Validation
@@ -495,8 +573,8 @@ class FlowEngine::Validation::Adapter
495
573
  end
496
574
 
497
575
  # Result object
498
- FlowEngine::Validation::Result.new(valid: true, errors: [])
499
- FlowEngine::Validation::Result.new(valid: false, errors: ["must be a number"])
576
+ FlowEngine::Validation::Result.new(valid: true, errors: [])
577
+ FlowEngine::Validation::Result.new(valid: false, errors: ["must be a number"])
500
578
  ```
501
579
 
502
580
  ### Custom Validator Example
@@ -517,12 +595,12 @@ class MyValidator < FlowEngine::Validation::Adapter
517
595
  end
518
596
  end
519
597
 
520
- FlowEngine::Validation::Result.new(valid: errors.empty?, errors: errors)
598
+ FlowEngine::Validation::Result.new(valid: errors.empty?, errors: errors)
521
599
  end
522
600
  end
523
601
 
524
602
  engine = FlowEngine::Engine.new(definition, validator: MyValidator.new)
525
- engine.answer("not_a_number") # => raises FlowEngine::ValidationError
603
+ engine.answer("not_a_number") # => raises FlowEngine::Errors::ValidationError
526
604
  ```
527
605
 
528
606
  ## Mermaid Diagram Export
@@ -0,0 +1,21 @@
1
+ # frozen_string_literal: true
2
+
3
+ module FlowEngine
4
+ # Immutable result from an AI intake or clarification round.
5
+ #
6
+ # @attr_reader answered [Hash<Symbol, Object>] step_id => value pairs filled this round
7
+ # @attr_reader pending_steps [Array<Symbol>] step ids still unanswered after this round
8
+ # @attr_reader follow_up [String, nil] clarifying question from the LLM, or nil if done
9
+ # @attr_reader round [Integer] current clarification round (1-based)
10
+ ClarificationResult = Data.define(:answered, :pending_steps, :follow_up, :round) do
11
+ def initialize(answered: {}, pending_steps: [], follow_up: nil, round: 1)
12
+ super
13
+ freeze
14
+ end
15
+
16
+ # @return [Boolean] true when the LLM has no more questions or max rounds reached
17
+ def done?
18
+ follow_up.nil?
19
+ end
20
+ end
21
+ end
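Since `ClarificationResult` is a plain `Data` value (Ruby 3.2+), callers get keyword construction, defaults, and immutability for free. The class is reproduced here (namespace dropped) with a short demonstration:

```ruby
# Reproduced from the file above, minus the module namespace, so the
# example is self-contained.
ClarificationResult = Data.define(:answered, :pending_steps, :follow_up, :round) do
  def initialize(answered: {}, pending_steps: [], follow_up: nil, round: 1)
    super
    freeze
  end

  # true when the LLM has no more questions
  def done?
    follow_up.nil?
  end
end

# A round where the LLM still has a question is not done.
in_progress = ClarificationResult.new(
  answered: { name: "Ada" },
  pending_steps: [:email],
  follow_up: "What is your email address?"
)

# Defaults apply: no follow-up means the conversation is finished.
finished = ClarificationResult.new(answered: { email: "ada@example.com" })
```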
@@ -30,7 +30,7 @@ module FlowEngine
30
30
  # @return [Node] the node for that step
31
31
  # @raise [UnknownStepError] if id is not in steps
32
32
  def step(id)
33
- steps.fetch(id) { raise UnknownStepError, "Unknown step: #{id.inspect}" }
33
+ steps.fetch(id) { raise Errors::UnknownStepError, "Unknown step: #{id.inspect}" }
34
34
  end
35
35
 
36
36
  # @return [Array<Symbol>] all step ids in the definition
@@ -41,7 +41,9 @@ module FlowEngine
41
41
  private
42
42
 
43
43
  def validate!
44
- raise DefinitionError, "Start step #{start_step_id.inspect} not found in nodes" unless steps.key?(start_step_id)
44
+ return if steps.key?(start_step_id)
45
+
46
+ raise Errors::DefinitionError, "Start step #{start_step_id.inspect} not found in nodes"
45
47
  end
46
48
  end
47
49
  end
@@ -45,8 +45,8 @@ module FlowEngine
45
45
  # @return [Definition]
46
46
  # @raise [DefinitionError] if start was not set or no steps were defined
47
47
  def build
48
- raise DefinitionError, "No start step defined" if @start_step_id.nil?
49
- raise DefinitionError, "No steps defined" if @nodes.empty?
48
+ raise ::FlowEngine::Errors::DefinitionError, "No start step defined" if @start_step_id.nil?
49
+ raise ::FlowEngine::Errors::DefinitionError, "No steps defined" if @nodes.empty?
50
50
 
51
51
  Definition.new(start_step_id: @start_step_id, nodes: @nodes, introduction: @introduction)
52
52
  end
@@ -15,6 +15,7 @@ module FlowEngine
15
15
  @transitions = []
16
16
  @visibility_rule = nil
17
17
  @decorations = nil
18
+ @max_clarifications = 0
18
19
  end
19
20
 
20
21
  # Sets the step/input type (e.g. :multi_select, :number_matrix).
@@ -62,6 +63,13 @@ module FlowEngine
62
63
  @visibility_rule = rule
63
64
  end
64
65
 
66
+ # Sets the maximum number of clarification rounds for an :ai_intake step.
67
+ #
68
+ # @param count [Integer] max follow-up rounds (0 = one-shot, no clarifications)
69
+ def max_clarifications(count)
70
+ @max_clarifications = count
71
+ end
72
+
65
73
  # Builds the {Node} for the given step id from accumulated attributes.
66
74
  #
67
75
  # @param id [Symbol] step id
@@ -75,7 +83,8 @@ module FlowEngine
75
83
  fields: @fields,
76
84
  transitions: @transitions,
77
85
  visibility_rule: @visibility_rule,
78
- decorations: @decorations
86
+ decorations: @decorations,
87
+ max_clarifications: @max_clarifications
79
88
  )
80
89
  end
81
90
  end
@@ -1,9 +1,5 @@
1
1
  # frozen_string_literal: true
2
2
 
3
- require_relative "dsl/rule_helpers"
4
- require_relative "dsl/step_builder"
5
- require_relative "dsl/flow_builder"
6
-
7
3
  module FlowEngine
8
4
  # Namespace for the declarative flow DSL: {FlowBuilder} builds a {Definition} from blocks,
9
5
  # {StepBuilder} builds individual {Node}s, and {RuleHelpers} provide rule factory methods.
@@ -0,0 +1,50 @@
1
+ # frozen_string_literal: true
2
+
3
+ module FlowEngine
4
+ class Engine
5
+ # Handles state serialization and deserialization for Engine persistence.
6
+ # Normalizes string-keyed hashes (from JSON) to symbol-keyed hashes.
7
+ module StateSerializer
8
+ SYMBOLIZERS = {
9
+ current_step_id: ->(v) { v&.to_sym },
10
+ active_intake_step_id: ->(v) { v&.to_sym },
11
+ history: ->(v) { Array(v).map { |e| e&.to_sym } },
12
+ answers: ->(v) { symbolize_answers(v) },
13
+ conversation_history: ->(v) { symbolize_conversation_history(v) }
14
+ }.freeze
15
+
16
+ # Normalizes a state hash so step ids and history entries are symbols.
17
+ def self.symbolize_state(hash)
18
+ return hash unless hash.is_a?(Hash)
19
+
20
+ hash.each_with_object({}) do |(key, value), result|
21
+ sym_key = key.to_sym
22
+ result[sym_key] = SYMBOLIZERS.fetch(sym_key, ->(v) { v }).call(value)
23
+ end
24
+ end
25
+
26
+ # @param answers [Hash] answers map (keys may be strings)
27
+ # @return [Hash] same map with symbol keys
28
+ def self.symbolize_answers(answers)
29
+ return {} unless answers.is_a?(Hash)
30
+
31
+ answers.each_with_object({}) { |(k, v), h| h[k.to_sym] = v }
32
+ end
33
+
34
+ # @param history [Array<Hash>] conversation history entries
35
+ # @return [Array<Hash>] same entries with symbolized keys and role
36
+ def self.symbolize_conversation_history(history)
37
+ return [] unless history.is_a?(Array)
38
+
39
+ history.map do |entry|
40
+ next entry unless entry.is_a?(Hash)
41
+
42
+ entry.each_with_object({}) do |(k, v), h|
43
+ sym_key = k.to_sym
44
+ h[sym_key] = sym_key == :role ? v.to_sym : v
45
+ end
46
+ end
47
+ end
48
+ end
49
+ end
50
+ end
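The new `StateSerializer` is what makes `Engine.from_state` tolerant of JSON round-trips, where every key and symbol comes back as a string. Reproduced here with the namespace dropped, plus a small demonstration:

```ruby
# Reproduced from the module above so the example runs standalone.
module StateSerializer
  SYMBOLIZERS = {
    current_step_id: ->(v) { v&.to_sym },
    active_intake_step_id: ->(v) { v&.to_sym },
    history: ->(v) { Array(v).map { |e| e&.to_sym } },
    answers: ->(v) { symbolize_answers(v) },
    conversation_history: ->(v) { symbolize_conversation_history(v) }
  }.freeze

  # Normalizes a state hash so step ids and history entries are symbols;
  # keys with no registered symbolizer pass through unchanged.
  def self.symbolize_state(hash)
    return hash unless hash.is_a?(Hash)

    hash.each_with_object({}) do |(key, value), result|
      sym_key = key.to_sym
      result[sym_key] = SYMBOLIZERS.fetch(sym_key, ->(v) { v }).call(value)
    end
  end

  def self.symbolize_answers(answers)
    return {} unless answers.is_a?(Hash)

    answers.each_with_object({}) { |(k, v), h| h[k.to_sym] = v }
  end

  def self.symbolize_conversation_history(history)
    return [] unless history.is_a?(Array)

    history.map do |entry|
      next entry unless entry.is_a?(Hash)

      entry.each_with_object({}) do |(k, v), h|
        sym_key = k.to_sym
        h[sym_key] = sym_key == :role ? v.to_sym : v
      end
    end
  end
end

# State as it might come back from JSON: all string keys.
restored = StateSerializer.symbolize_state(
  "current_step_id" => "confirm",
  "history" => %w[intro confirm],
  "answers" => { "name" => "Ada" },
  "conversation_history" => [{ "role" => "user", "text" => "hi" }],
  "clarification_round" => 2
)
```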
@@ -3,14 +3,9 @@
3
3
  module FlowEngine
4
4
  # Runtime session that drives flow navigation: holds definition, answers, and current step.
5
5
  # Validates each answer via an optional {Validation::Adapter}, then advances using node transitions.
6
- #
7
- # @attr_reader definition [Definition] immutable flow definition
8
- # @attr_reader answers [Hash] step_id => value (mutable as user answers)
9
- # @attr_reader history [Array<Symbol>] ordered list of step ids visited (including current)
10
- # @attr_reader current_step_id [Symbol, nil] current step id, or nil when flow is finished
11
- # @attr_reader introduction_text [String, nil] free-form text submitted before the flow began
12
- class Engine
13
- attr_reader :definition, :answers, :history, :current_step_id, :introduction_text
6
+ class Engine # rubocop:disable Metrics/ClassLength
7
+ attr_reader :definition, :answers, :history, :current_step_id, :introduction_text,
8
+ :clarification_round, :conversation_history
14
9
 
15
10
  # @param definition [Definition] the flow to run
16
11
  # @param validator [Validation::Adapter] validator for step answers (default: {Validation::NullAdapter})
@@ -21,6 +16,9 @@ module FlowEngine
21
16
  @current_step_id = definition.start_step_id
22
17
  @validator = validator
23
18
  @introduction_text = nil
19
+ @clarification_round = 0
20
+ @conversation_history = []
21
+ @active_intake_step_id = nil
24
22
  @history << @current_step_id
25
23
  end
26
24
 
@@ -42,10 +40,10 @@ module FlowEngine
42
40
  # @raise [AlreadyFinishedError] if the flow has already finished
43
41
  # @raise [ValidationError] if the validator rejects the value
44
42
  def answer(value)
45
- raise AlreadyFinishedError, "Flow is already finished" if finished?
43
+ raise Errors::AlreadyFinishedError, "Flow is already finished" if finished?
46
44
 
47
45
  result = @validator.validate(current_step, value)
48
- raise ValidationError, "Validation failed: #{result.errors.join(", ")}" unless result.valid?
46
+ raise Errors::ValidationError, "Validation failed: #{result.errors.join(", ")}" unless result.valid?
49
47
 
50
48
  answers[@current_step_id] = value
51
49
  advance_step
@@ -56,9 +54,6 @@ module FlowEngine
56
54
  #
57
55
  # @param text [String] user's free-form introduction
58
56
  # @param llm_client [LLM::Client] configured LLM client for parsing
59
- # @raise [SensitiveDataError] if text contains SSN, ITIN, EIN, etc.
60
- # @raise [ValidationError] if text exceeds the introduction maxlength
61
- # @raise [LLMError] on LLM communication or parsing failures
62
57
  def submit_introduction(text, llm_client:)
63
58
  validate_introduction_length!(text)
64
59
  LLM::SensitiveDataFilter.check!(text)
@@ -68,65 +63,67 @@ module FlowEngine
68
63
  auto_advance_prefilled
69
64
  end
70
65
 
71
- # Serializable state for persistence or resumption.
66
+ # Submits free-form text for the current AI intake step. Returns a ClarificationResult.
72
67
  #
73
- # @return [Hash] current_step_id, answers, history, and introduction_text
68
+ # @param text [String] user's free-form text
69
+ # @param llm_client [LLM::Client] configured LLM client
70
+ # @return [ClarificationResult]
71
+ def submit_ai_intake(text, llm_client:)
72
+ node = current_step
73
+ raise Errors::EngineError, "Current step is not an AI intake step" unless node&.ai_intake?
74
+
75
+ LLM::SensitiveDataFilter.check!(text)
76
+
77
+ @active_intake_step_id = @current_step_id
78
+ @clarification_round = 1
79
+ @conversation_history = [{ role: :user, text: text }]
80
+
81
+ perform_intake_round(text, llm_client, node)
82
+ end
83
+
84
+ # Submits a clarification response for an ongoing AI intake conversation.
85
+ #
86
+ # @param text [String] user's response to the follow-up question
87
+ # @param llm_client [LLM::Client] configured LLM client
88
+ # @return [ClarificationResult]
89
+ def submit_clarification(text, llm_client:)
90
+ raise Errors::EngineError, "No active AI intake conversation to clarify" unless @active_intake_step_id
91
+
92
+ LLM::SensitiveDataFilter.check!(text)
93
+
94
+ node = @definition.step(@active_intake_step_id)
95
+ @clarification_round += 1
96
+ @conversation_history << { role: :user, text: text }
97
+
98
+ perform_intake_round(text, llm_client, node)
99
+ end
100
+
101
+ # Serializable state for persistence or resumption.
74
102
  def to_state
75
103
  {
76
104
  current_step_id: @current_step_id,
77
105
  answers: @answers,
78
106
  history: @history,
79
- introduction_text: @introduction_text
107
+ introduction_text: @introduction_text,
108
+ clarification_round: @clarification_round,
109
+ conversation_history: @conversation_history,
110
+ active_intake_step_id: @active_intake_step_id
80
111
  }
81
112
  end
82
113
 
83
- # Rebuilds an engine from a previously saved state (e.g. from DB or session).
114
+ # Rebuilds an engine from a previously saved state.
84
115
  #
85
116
  # @param definition [Definition] same definition used when state was captured
86
- # @param state_hash [Hash] hash with :current_step_id, :answers, :history (keys may be strings)
117
+ # @param state_hash [Hash] hash with state keys (may be strings from JSON)
87
118
  # @param validator [Validation::Adapter] validator to use (default: NullAdapter)
88
119
  # @return [Engine] restored engine instance
89
120
  def self.from_state(definition, state_hash, validator: Validation::NullAdapter.new)
90
- state = symbolize_state(state_hash)
121
+ state = StateSerializer.symbolize_state(state_hash)
91
122
  engine = allocate
92
123
  engine.send(:restore_state, definition, state, validator)
93
124
  engine
94
125
  end
95
126
 
96
- # Normalizes a state hash so step ids and history entries are symbols; answers keys are symbols.
97
- #
98
- # @param hash [Hash] raw state (e.g. from JSON)
99
- # @return [Hash] symbolized state
100
- def self.symbolize_state(hash)
101
- return hash unless hash.is_a?(Hash)
102
-
103
- hash.each_with_object({}) do |(key, value), result|
104
- sym_key = key.to_sym
105
- result[sym_key] = case sym_key
106
- when :current_step_id
107
- value&.to_sym
108
- when :history
109
- Array(value).map { |v| v&.to_sym }
110
- when :answers
111
- symbolize_answers(value)
112
- else
113
- value
114
- end
115
- end
116
- end
117
-
118
- # @param answers [Hash] answers map (keys may be strings)
119
- # @return [Hash] same map with symbol keys
120
- def self.symbolize_answers(answers)
121
- return {} unless answers.is_a?(Hash)
122
-
123
- answers.each_with_object({}) do |(key, value), result|
124
- result[key.to_sym] = value
125
- end
126
- end
127
-
128
- private_class_method :symbolize_state, :symbolize_answers
129
-
130
127
  private
131
128
 
132
129
  def restore_state(definition, state, validator)
@@ -136,12 +133,14 @@ module FlowEngine
136
133
  @answers = state[:answers] || {}
137
134
  @history = state[:history] || []
138
135
  @introduction_text = state[:introduction_text]
136
+ @clarification_round = state[:clarification_round] || 0
137
+ @conversation_history = state[:conversation_history] || []
138
+ @active_intake_step_id = state[:active_intake_step_id]
139
139
  end
140
140
 
141
141
  def advance_step
142
142
  node = definition.step(@current_step_id)
143
143
  next_id = node.next_step_id(answers)
144
-
145
144
  @current_step_id = next_id
146
145
  @history << next_id if next_id
147
146
  end
@@ -151,13 +150,58 @@ module FlowEngine
151
150
  return unless maxlength
152
151
  return if text.length <= maxlength
153
152
 
154
- raise ValidationError, "Introduction text exceeds maxlength (#{text.length}/#{maxlength})"
153
+ raise Errors::ValidationError, "Introduction text exceeds maxlength (#{text.length}/#{maxlength})"
155
154
  end
156
155
 
157
- # Advances through consecutive steps that already have pre-filled answers.
158
- # Stops at the first step without a pre-filled answer or when the flow ends.
159
156
  def auto_advance_prefilled
160
157
  advance_step while @current_step_id && @answers.key?(@current_step_id)
161
158
  end
159
+
160
+ def perform_intake_round(user_text, llm_client, node)
161
+ result = llm_client.parse_ai_intake(
162
+ definition: @definition, user_text: user_text,
163
+ answered: @answers, conversation_history: @conversation_history
164
+ )
165
+ @answers.merge!(result[:answers])
166
+ follow_up = resolve_follow_up(result[:follow_up], node)
167
+
168
+ build_clarification_result(result[:answers], follow_up)
169
+ end
170
+
171
+ def resolve_follow_up(follow_up, node)
172
+ if follow_up && @clarification_round <= node.max_clarifications
173
+ @conversation_history << { role: :assistant, text: follow_up }
174
+ follow_up
175
+ else
176
+ finalize_intake
177
+ nil
178
+ end
179
+ end
180
+
181
+ def build_clarification_result(round_answers, follow_up)
182
+ ClarificationResult.new(
183
+ answered: round_answers,
184
+ pending_steps: pending_non_intake_steps,
185
+ follow_up: follow_up,
186
+ round: @clarification_round
187
+ )
188
+ end
189
+
190
+ def finalize_intake
191
+ @answers[@active_intake_step_id] = conversation_summary
192
+ @active_intake_step_id = nil
193
+ advance_step
194
+ auto_advance_prefilled
195
+ end
196
+
197
+ def conversation_summary
198
+ @conversation_history.map { |e| "#{e[:role]}: #{e[:text]}" }.join("\n")
199
+ end
200
+
201
+ def pending_non_intake_steps
202
+ @definition.steps.each_with_object([]) do |(id, node), pending|
203
+ pending << id unless node.ai_intake? || @answers.key?(id)
204
+ end
205
+ end
162
206
  end
163
207
  end