flowengine 0.2.1 → 0.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: c6118d723586350e431080d822f01dbda45c9cc58257dd812e4d3dd7c30f8de6
-  data.tar.gz: 606d8ac60526863acb699c166b396d00baa29fdedb009df4c4a4b6bfb4a783a0
+  metadata.gz: af012989841606174c73ceabb8895212385fe8896546229bf4f03f9beec72ccb
+  data.tar.gz: 0d1928d509626fba026b404c411513a10769af29890fc8ceccc090024c364a01
 SHA512:
-  metadata.gz: 00c92c2094931b5c4d427aec5e2de4bce98d733be459aadd367ce58f3d64564c7332c53d8810d37d1f5d95352439572fe2252eb6068a4fa9278c0e1cef223c78
-  data.tar.gz: d9bc34f2cfeb87ee7e6b6bc75e429903db85d4d2482a517240da2ab995bdd38202a03f3585f54a00e5e1d263751cf9f82225e34303889bbb67cdcbc0a7e47e92
+  metadata.gz: 7e49a5c46ed5eb3bdf1025134f94c3b8bb921d4234f7bdc9b081448d410e6bbe933f529fdcdb226099334c760bfeadda947c038c7038f6234e66e35c74cdeb96
+  data.tar.gz: 05d0dced1e4452a20b09b55eae0673e4ebb6e9fc2e76adfcee813148474a71cdb08a69bef9a8f2b9bf388c5aa314a7aaf803259186511c75387c43ce7b1a69a3
data/.rubocop_todo.yml CHANGED
@@ -1,6 +1,6 @@
 # This configuration was generated by
 # `rubocop --auto-gen-config`
-# on 2026-03-10 19:57:08 UTC using RuboCop version 1.85.1.
+# on 2026-03-10 21:46:24 UTC using RuboCop version 1.85.1.
 # The point is for the user to remove these configuration records
 # one by one as the offenses are removed from the code base.
 # Note that changes in the inspected code, or installation of new
data/README.md CHANGED
@@ -144,9 +144,13 @@ end
 ### Using the Introduction at Runtime
 
 ```ruby
-# 1. Configure an LLM adapter and client
-adapter = FlowEngine::LLM::OpenAIAdapter.new(api_key: ENV["OPENAI_API_KEY"])
-client = FlowEngine::LLM::Client.new(adapter: adapter, model: "gpt-4o-mini")
+# 1. Auto-detect adapter from environment (checks ANTHROPIC_API_KEY, OPENAI_API_KEY, GEMINI_API_KEY)
+client = FlowEngine::LLM.auto_client
+
+# Or explicitly choose a provider:
+# client = FlowEngine::LLM.auto_client(anthropic_api_key: "sk-ant-...")
+# client = FlowEngine::LLM.auto_client(openai_api_key: "sk-...", model: "gpt-4o")
+# client = FlowEngine::LLM.auto_client(gemini_api_key: "AIza...")
 
 # 2. Create the engine and submit the introduction
 engine = FlowEngine::Engine.new(definition)
@@ -185,10 +189,18 @@ engine.submit_introduction("My SSN is 123-45-6789", llm_client: client)
 
 ### Custom LLM Adapters
 
-The LLM integration uses an adapter pattern. The gem ships with an OpenAI adapter (via the [`ruby_llm`](https://github.com/crmne/ruby_llm) gem), but you can create adapters for any provider:
+The LLM integration uses an adapter pattern. The gem ships with three adapters (all via the [`ruby_llm`](https://github.com/crmne/ruby_llm) gem):
+
+| Adapter | Env Variable | Default Model |
+|---------|-------------|---------------|
+| `AnthropicAdapter` | `ANTHROPIC_API_KEY` | `claude-sonnet-4-20250514` |
+| `OpenAIAdapter` | `OPENAI_API_KEY` | `gpt-4o-mini` |
+| `GeminiAdapter` | `GEMINI_API_KEY` | `gemini-2.0-flash` |
+
+You can also create adapters for any other provider:
 
 ```ruby
-class MyAnthropicAdapter < FlowEngine::LLM::Adapter
+class MyCustomAdapter < FlowEngine::LLM::Adapter
   def initialize(api_key:)
     super()
     @api_key = api_key
@@ -200,8 +212,8 @@ class MyAnthropicAdapter < FlowEngine::LLM::Adapter
   end
 end
 
-adapter = MyAnthropicAdapter.new(api_key: ENV["ANTHROPIC_API_KEY"])
-client = FlowEngine::LLM::Client.new(adapter: adapter, model: "claude-sonnet-4-20250514")
+adapter = MyCustomAdapter.new(api_key: ENV["MY_API_KEY"])
+client = FlowEngine::LLM::Client.new(adapter: adapter, model: "my-model")
 ```
 
 ### State Persistence
@@ -236,8 +248,11 @@ The core has **zero UI logic**, **zero DB logic**, and **zero framework dependen
 | `Engine` | Stateful runtime: tracks current step, answers, history, and introduction |
 | `Validation::Adapter` | Interface for pluggable validation (dry-validation, JSON Schema, etc.) |
 | `LLM::Adapter` | Abstract interface for LLM API calls |
+| `LLM::AnthropicAdapter` | Anthropic/Claude implementation via `ruby_llm` gem |
 | `LLM::OpenAIAdapter` | OpenAI implementation via `ruby_llm` gem |
+| `LLM::GeminiAdapter` | Google Gemini implementation via `ruby_llm` gem |
 | `LLM::Client` | High-level: builds prompt, calls adapter, parses JSON response |
+| `LLM.auto_client` | Factory: auto-detects provider from environment API keys |
 | `LLM::SensitiveDataFilter` | Rejects text containing SSN, ITIN, EIN patterns |
 | `Graph::MermaidExporter` | Exports the flow definition as a Mermaid diagram |
 
data/lib/flowengine/llm/anthropic_adapter.rb ADDED
@@ -0,0 +1,40 @@
+# frozen_string_literal: true
+
+require "ruby_llm"
+
+module FlowEngine
+  module LLM
+    # Anthropic adapter using the ruby_llm gem. Configures the API key
+    # and delegates chat calls to RubyLLM's conversation interface.
+    class AnthropicAdapter < Adapter
+      DEFAULT_MODEL = "claude-sonnet-4-20250514"
+
+      # @param api_key [String, nil] Anthropic API key; falls back to ANTHROPIC_API_KEY env var
+      # @raise [LLMError] if no API key is available
+      def initialize(api_key: nil)
+        super()
+        @api_key = api_key || ENV.fetch("ANTHROPIC_API_KEY", nil)
+        raise LLMError, "Anthropic API key not provided and ANTHROPIC_API_KEY not set" unless @api_key
+      end
+
+      # @param system_prompt [String] system instructions
+      # @param user_prompt [String] user's text
+      # @param model [String] Anthropic model identifier
+      # @return [String] response content from the LLM
+      def chat(system_prompt:, user_prompt:, model: DEFAULT_MODEL)
+        configure_ruby_llm!
+        conversation = RubyLLM.chat(model: model)
+        response = conversation.with_instructions(system_prompt).ask(user_prompt)
+        response.content
+      end
+
+      private
+
+      def configure_ruby_llm!
+        RubyLLM.configure do |config|
+          config.anthropic_api_key = @api_key
+        end
+      end
+    end
+  end
+end
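Both new adapters follow the same template: resolve the key from a kwarg with an env-var fallback, fail fast in the constructor, and expose a single `chat` call. A minimal standalone sketch of that shape (hypothetical `Sketch*` class names, no real API calls; the gem's actual adapters delegate to `RubyLLM` instead of echoing their inputs):

```ruby
# Sketch of the adapter template used by the new files (hypothetical names,
# not the gem's real classes).
class SketchAdapter
  # Abstract interface: subclasses must implement #chat.
  def chat(system_prompt:, user_prompt:, model: nil)
    raise NotImplementedError, "subclasses implement #chat"
  end
end

class SketchAnthropicAdapter < SketchAdapter
  DEFAULT_MODEL = "claude-sonnet-4-20250514"

  # Explicit key wins; otherwise fall back to the env var; fail fast if neither.
  # `env:` is injectable here only so the sketch is testable without touching ENV.
  def initialize(api_key: nil, env: ENV)
    @api_key = api_key || env.fetch("ANTHROPIC_API_KEY", nil)
    raise ArgumentError, "no Anthropic API key" unless @api_key
  end

  # The real adapter configures RubyLLM and calls the API here;
  # the sketch just returns its resolved inputs.
  def chat(system_prompt:, user_prompt:, model: DEFAULT_MODEL)
    { model: model, system: system_prompt, user: user_prompt }
  end
end

adapter = SketchAnthropicAdapter.new(api_key: "test-key")
p adapter.chat(system_prompt: "be brief", user_prompt: "hi")
```

Validating the key at construction rather than at call time means a misconfigured adapter fails immediately, not on the first chat request.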
data/lib/flowengine/llm/gemini_adapter.rb ADDED
@@ -0,0 +1,40 @@
+# frozen_string_literal: true
+
+require "ruby_llm"
+
+module FlowEngine
+  module LLM
+    # Google Gemini adapter using the ruby_llm gem. Configures the API key
+    # and delegates chat calls to RubyLLM's conversation interface.
+    class GeminiAdapter < Adapter
+      DEFAULT_MODEL = "gemini-2.0-flash"
+
+      # @param api_key [String, nil] Gemini API key; falls back to GEMINI_API_KEY env var
+      # @raise [LLMError] if no API key is available
+      def initialize(api_key: nil)
+        super()
+        @api_key = api_key || ENV.fetch("GEMINI_API_KEY", nil)
+        raise LLMError, "Gemini API key not provided and GEMINI_API_KEY not set" unless @api_key
+      end
+
+      # @param system_prompt [String] system instructions
+      # @param user_prompt [String] user's text
+      # @param model [String] Gemini model identifier
+      # @return [String] response content from the LLM
+      def chat(system_prompt:, user_prompt:, model: DEFAULT_MODEL)
+        configure_ruby_llm!
+        conversation = RubyLLM.chat(model: model)
+        response = conversation.with_instructions(system_prompt).ask(user_prompt)
+        response.content
+      end
+
+      private
+
+      def configure_ruby_llm!
+        RubyLLM.configure do |config|
+          config.gemini_api_key = @api_key
+        end
+      end
+    end
+  end
+end
data/lib/flowengine/llm.rb CHANGED
@@ -2,6 +2,8 @@
 
 require_relative "llm/adapter"
 require_relative "llm/openai_adapter"
+require_relative "llm/anthropic_adapter"
+require_relative "llm/gemini_adapter"
 require_relative "llm/sensitive_data_filter"
 require_relative "llm/system_prompt_builder"
 require_relative "llm/client"
@@ -10,5 +12,37 @@ module FlowEngine
   # Namespace for LLM integration: adapters, system prompt building,
   # sensitive data filtering, and the high-level Client.
   module LLM
+    # Provider registry: ordered by priority (first match wins).
+    # Each entry maps an explicit kwarg name to [env_var, adapter_class, default_model].
+    PROVIDERS = [
+      [:anthropic_api_key, "ANTHROPIC_API_KEY", AnthropicAdapter, AnthropicAdapter::DEFAULT_MODEL],
+      [:openai_api_key, "OPENAI_API_KEY", OpenAIAdapter, "gpt-4o-mini"],
+      [:gemini_api_key, "GEMINI_API_KEY", GeminiAdapter, GeminiAdapter::DEFAULT_MODEL]
+    ].freeze
+
+    # Builds an adapter and Client by detecting which API key is available
+    # in the environment. Priority: Anthropic > OpenAI > Gemini.
+    #
+    # @param anthropic_api_key [String, nil] explicit Anthropic key
+    # @param openai_api_key [String, nil] explicit OpenAI key
+    # @param gemini_api_key [String, nil] explicit Gemini key
+    # @param model [String, nil] override model; auto-selected if nil
+    # @return [Client] configured client with the detected adapter
+    # @raise [LLMError] if no API key is found for any provider
+    def self.auto_client(anthropic_api_key: nil, openai_api_key: nil, gemini_api_key: nil, model: nil)
+      explicit_keys = { anthropic_api_key: anthropic_api_key, openai_api_key: openai_api_key,
+                        gemini_api_key: gemini_api_key }
+
+      PROVIDERS.each do |kwarg, env_var, adapter_class, default_model|
+        key = explicit_keys[kwarg] || ENV.fetch(env_var, nil)
+        next unless key
+
+        adapter = adapter_class.new(api_key: key)
+        return Client.new(adapter: adapter, model: model || default_model)
+      end
+
+      env_vars = PROVIDERS.map { |_, env_var, _, _| env_var }.join(", ")
+      raise FlowEngine::LLMError, "No LLM API key found. Set #{env_vars}"
+    end
   end
 end
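The detection logic above can be illustrated standalone. This sketch (hypothetical names, an injected env hash instead of `ENV`, and plain hashes instead of adapter instances; not the gem's code) reproduces just the first-match-wins behavior: provider priority order is fixed, and within a provider an explicit kwarg beats its env var.

```ruby
# Standalone sketch of auto_client's provider detection (hypothetical names).
PROVIDERS = [
  [:anthropic_api_key, "ANTHROPIC_API_KEY", "claude-sonnet-4-20250514"],
  [:openai_api_key,    "OPENAI_API_KEY",    "gpt-4o-mini"],
  [:gemini_api_key,    "GEMINI_API_KEY",    "gemini-2.0-flash"]
].freeze

def detect_provider(explicit_keys, env)
  PROVIDERS.each do |kwarg, env_var, default_model|
    # Explicit kwarg wins over the env var for the same provider.
    key = explicit_keys[kwarg] || env.fetch(env_var, nil)
    next unless key

    # First provider in priority order with any key wins.
    return { provider: kwarg, key: key, model: default_model }
  end
  raise "No LLM API key found. Set #{PROVIDERS.map { |_, v, _| v }.join(', ')}"
end

# Anthropic outranks OpenAI when both env vars are present:
p detect_provider({}, { "ANTHROPIC_API_KEY" => "a", "OPENAI_API_KEY" => "o" })
# An explicit key overrides the env var for that same provider:
p detect_provider({ openai_api_key: "explicit" }, { "OPENAI_API_KEY" => "from-env" })
```

Note one consequence of iterating in priority order: passing only `gemini_api_key:` does not force Gemini if a higher-priority provider's env var is also set, since that provider is checked first.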
data/lib/flowengine/version.rb CHANGED
@@ -2,5 +2,5 @@
 
 module FlowEngine
   # Semantic version of the flowengine gem (major.minor.patch).
-  VERSION = "0.2.1"
+  VERSION = "0.3.0"
 end
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: flowengine
 version: !ruby/object:Gem::Version
-  version: 0.2.1
+  version: 0.3.0
 platform: ruby
 authors:
 - Konstantin Gredeskoul
@@ -127,7 +127,9 @@ files:
 - lib/flowengine/introduction.rb
 - lib/flowengine/llm.rb
 - lib/flowengine/llm/adapter.rb
+- lib/flowengine/llm/anthropic_adapter.rb
 - lib/flowengine/llm/client.rb
+- lib/flowengine/llm/gemini_adapter.rb
 - lib/flowengine/llm/openai_adapter.rb
 - lib/flowengine/llm/sensitive_data_filter.rb
 - lib/flowengine/llm/system_prompt_builder.rb