riffer 0.5.1 → 0.6.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 8b78c9eff3bd51ccab98e50b8adbee7c5aabd8b5da8fea825559c7a7b7783340
- data.tar.gz: 605b9ab5ab96bf8f7b6af9083a955293931a0603875c6a17011a1249d4533948
+ metadata.gz: 414f5f9afaa98b3d19f1e649ea15aaa632fa60073d11ac940d6fdbb893b4356e
+ data.tar.gz: dd693e59f9acabd91f9bb4f7621aced10ca871d4f76d030ad6ab2ab743b9f2b9
  SHA512:
- metadata.gz: e80cf949f74dbb12d9e161525c4403cfa8f560333e322580f28f187a946c0c061acd47e92a68d328767819daeba4397639891d7f6585d3c095109824d6702638
- data.tar.gz: 1f81bb4bd577b13d0e0c45c37d594012c340f13bf8e1bf59a34f88481cb6cecddd54078a51d1dae262982e73d8bcbbbe6fae030e61a3d7cc7a3c7ae3014ed39d
+ metadata.gz: b4ee5e296e43d8727a2e9a214d8fe713bae89f0ba90c7e46ec6b7c1c3fae4cd2c8b0dc8efcbf771955a1c36b4a1ed5a4b591d034f9bf88b17ee99464590ed551
+ data.tar.gz: 1eb8e7c0ee9e381a78d5ddcbeb707f6bed15935bf8d45367da26a9f723372de7503dc14c7379d2e180c581b4aad6257571888eb46bc73e2dce190ef1bd8b823c
@@ -1,3 +1,3 @@
  {
- ".": "0.5.1"
+ ".": "0.6.0"
  }
data/CHANGELOG.md CHANGED
@@ -5,6 +5,14 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

+ ## [0.6.0](https://github.com/janeapp/riffer/compare/riffer/v0.5.1...riffer/v0.6.0) (2026-01-14)
+
+
+ ### Features
+
+ * aws bedrock provider ([#73](https://github.com/janeapp/riffer/issues/73)) ([428ae90](https://github.com/janeapp/riffer/commit/428ae902db90c2d3765186ea06d76ee379b3eae7))
+ * reasoning support ([#75](https://github.com/janeapp/riffer/issues/75)) ([fcee502](https://github.com/janeapp/riffer/commit/fcee502054882f41d15ea312222a5538c8f04220))
+
  ## [0.5.1](https://github.com/janeapp/riffer/compare/riffer/v0.5.0...riffer/v0.5.1) (2026-01-10)


data/CLAUDE.md ADDED
@@ -0,0 +1,73 @@
+ # CLAUDE.md
+
+ This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
+
+ ## Project Overview
+
+ Riffer is a Ruby gem framework for building AI-powered applications and agents. It uses Zeitwerk for autoloading.
+
+ ## Commands
+
+ ```bash
+ # Install dependencies
+ bin/setup
+
+ # Run tests
+ bundle exec rake test
+
+ # Run a single test file
+ bundle exec ruby -Ilib:test test/riffer/agent_test.rb
+
+ # Run a specific test by name
+ bundle exec ruby -Ilib:test test/riffer/agent_test.rb --name "test_something"
+
+ # Check code style
+ bundle exec rake standard
+
+ # Auto-fix style issues
+ bundle exec rake standard:fix
+
+ # Run both tests and linting (default task)
+ bundle exec rake
+
+ # Interactive console
+ bin/console
+
+ # Generate documentation
+ bundle exec rake docs
+ ```
+
+ ## Architecture
+
+ ### Core Components
+
+ - **Agent** (`lib/riffer/agent.rb`): Base class for AI agents. Subclass and use DSL methods `model` and `instructions` to configure. Orchestrates message flow, LLM calls, and tool execution via a generate/stream loop.
+
+ - **Providers** (`lib/riffer/providers/`): Adapters for LLM APIs. Each provider extends `Riffer::Providers::Base` and implements `perform_generate_text` and `perform_stream_text`. Registered in `Repository` with identifier (e.g., `openai`).
+
+ - **Messages** (`lib/riffer/messages/`): Typed message objects (`System`, `User`, `Assistant`, `Tool`). All extend `Base`. The `Converter` module handles hash-to-object conversion.
+
+ - **StreamEvents** (`lib/riffer/stream_events/`): Structured events for streaming (`TextDelta`, `TextDone`).
+
+ ### Key Patterns
+
+ - Model strings use `provider/model` format (e.g., `openai/gpt-4`)
+ - Configuration via `Riffer.configure { |c| c.openai.api_key = "..." }`
+ - Providers use `depends_on` helper for runtime dependency checking
+ - Tests use VCR cassettes in `test/fixtures/vcr_cassettes/`
+
+ ### Adding a New Provider
+
+ 1. Create `lib/riffer/providers/your_provider.rb` extending `Riffer::Providers::Base`
+ 2. Implement `perform_generate_text(messages, model:)` returning `Riffer::Messages::Assistant`
+ 3. Implement `perform_stream_text(messages, model:)` returning an `Enumerator` yielding stream events
+ 4. Register in `Riffer::Providers::Repository::REPO`
+ 5. Add provider config to `Riffer::Config` if needed
+
+ ## Code Style
+
+ - Ruby 3.2+ required
+ - All files must have `# frozen_string_literal: true`
+ - StandardRB for formatting (double quotes, 2-space indent)
+ - Minitest with spec DSL for tests
+ - YARD-style documentation for public APIs
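The `provider/model` convention from the Key Patterns section above can be illustrated standalone. The helper name below is hypothetical; only the `split("/", 2)` behavior mirrors what the gem's `Agent#initialize` does in the `agent.rb` diff further down.

```ruby
# Hypothetical helper illustrating Riffer's "provider/model" model strings.
# split("/", 2) keeps any further slashes inside the model portion.
def parse_model_string(model_string)
  provider, model = model_string.split("/", 2)
  raise ArgumentError, "expected provider/model, got #{model_string.inspect}" if model.nil?
  [provider, model]
end
```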
data/README.md CHANGED
@@ -58,7 +58,6 @@ Riffer.configure do |config|
  end

  class EchoAgent < Riffer::Agent
- identifier 'echo'
  model 'openai/gpt-5-mini' # provider/model
  instructions 'You are an assistant that repeats what the user says.'
  end
data/lib/riffer/agent.rb CHANGED
@@ -11,13 +11,14 @@ class Riffer::Agent
  include Riffer::Messages::Converter

  class << self
+ include Riffer::Helpers::ClassNameConverter
  include Riffer::Helpers::Validations

  # Gets or sets the agent identifier
  # @param value [String, nil] the identifier to set, or nil to get
  # @return [String] the agent identifier
  def identifier(value = nil)
- return @identifier if value.nil?
+ return @identifier || class_name_to_path(name) if value.nil?
  @identifier = value.to_s
  end

@@ -39,6 +40,12 @@ class Riffer::Agent
  @instructions = instructions_text
  end

+ def reasoning(level = nil)
+ return @reasoning if level.nil?
+ validate_is_string!(level, "reasoning")
+ @reasoning = level
+ end
+
  # Finds an agent class by identifier
  # @param identifier [String] the identifier to search for
  # @return [Class, nil] the agent class, or nil if not found
@@ -64,6 +71,7 @@ class Riffer::Agent
  @messages = []
  @model_string = self.class.model
  @instructions_text = self.class.instructions
+ @reasoning = self.class.reasoning

  provider_name, model_name = @model_string.split("/", 2)

@@ -132,11 +140,11 @@ class Riffer::Agent
  end

  def call_llm
- provider_instance.generate_text(messages: @messages, model: @model_name)
+ provider_instance.generate_text(messages: @messages, model: @model_name, reasoning: @reasoning)
  end

  def call_llm_stream
- provider_instance.stream_text(messages: @messages, model: @model_name)
+ provider_instance.stream_text(messages: @messages, model: @model_name, reasoning: @reasoning)
  end

  def provider_instance
data/lib/riffer/config.rb CHANGED
@@ -6,14 +6,23 @@
  #
  # @example Setting the OpenAI API key
  # Riffer.config.openai.api_key = "sk-..."
+ #
+ # @example Setting Amazon Bedrock configuration
+ # Riffer.config.amazon_bedrock.region = "us-east-1"
+ # Riffer.config.amazon_bedrock.api_token = "..."
  class Riffer::Config
  # OpenAI configuration
  # @return [Struct]
  attr_reader :openai

+ # Amazon Bedrock configuration
+ # @return [Struct]
+ attr_reader :amazon_bedrock
+
  # Initializes the configuration
  # @return [void]
  def initialize
  @openai = Struct.new(:api_key).new
+ @amazon_bedrock = Struct.new(:api_token, :region).new
  end
  end
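The config change above follows a simple pattern: each provider's settings live in an anonymous `Struct` instance, giving reader/writer methods per field with no extra classes. A standalone sketch of that pattern (the `MiniConfig` class is a stand-in, not the gem's actual class):

```ruby
# Stand-in reproducing the Struct-backed config pattern from Riffer::Config.
class MiniConfig
  attr_reader :openai, :amazon_bedrock

  def initialize
    # Anonymous Structs provide api_key= / region= etc. accessors for free.
    @openai = Struct.new(:api_key).new
    @amazon_bedrock = Struct.new(:api_token, :region).new
  end
end

config = MiniConfig.new
config.amazon_bedrock.region = "us-east-1"
```

Unset members default to `nil`, which is why the Bedrock provider below can fall back with `region ||= Riffer.config.amazon_bedrock.region`.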
@@ -0,0 +1,15 @@
+ # frozen_string_literal: true
+
+ module Riffer::Helpers::ClassNameConverter
+ # Converts a class name to snake_case path format
+ # @param class_name [String] the class name (e.g., "Riffer::Agent")
+ # @return [String] the snake_case path (e.g., "riffer/agent")
+ def class_name_to_path(class_name)
+ class_name
+ .to_s
+ .gsub("::", "/")
+ .gsub(/([A-Z]+)([A-Z][a-z])/, '\1_\2')
+ .gsub(/([a-z\d])([A-Z])/, '\1_\2')
+ .downcase
+ end
+ end
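The conversion chain above is pure string manipulation, so it can be exercised without the gem. This reproduces it verbatim outside the module; note the first regex handles acronym runs (e.g. `HTTPAgent` → `http_agent`) before the general camelCase split:

```ruby
# Verbatim reproduction of ClassNameConverter#class_name_to_path,
# runnable without loading riffer.
def class_name_to_path(class_name)
  class_name
    .to_s
    .gsub("::", "/")                         # namespaces become path segments
    .gsub(/([A-Z]+)([A-Z][a-z])/, '\1_\2')   # split acronym runs: HTTPAgent -> HTTP_Agent
    .gsub(/([a-z\d])([A-Z])/, '\1_\2')       # split camelCase: EchoAgent -> Echo_Agent
    .downcase
end
```

This is what now backs the default agent `identifier` when none is declared, which is why the README diff below drops the explicit `identifier 'echo'` line.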
@@ -0,0 +1,106 @@
+ # frozen_string_literal: true
+
+ class Riffer::Providers::AmazonBedrock < Riffer::Providers::Base
+ # Initializes the Amazon Bedrock provider.
+ #
+ # @param options [Hash] options passed to Aws::BedrockRuntime::Client
+ # @option options [String] :api_token Bearer token for API authentication (requires :region)
+ # @option options [String] :region AWS region
+ # @see https://docs.aws.amazon.com/sdk-for-ruby/v3/api/Aws/BedrockRuntime/Client.html
+ def initialize(api_token: nil, region: nil, **options)
+ depends_on "aws-sdk-bedrockruntime"
+
+ api_token ||= Riffer.config.amazon_bedrock.api_token
+ region ||= Riffer.config.amazon_bedrock.region
+
+ @client = if api_token && !api_token.empty?
+ Aws::BedrockRuntime::Client.new(
+ region: region,
+ token_provider: Aws::StaticTokenProvider.new(api_token),
+ auth_scheme_preference: ["httpBearerAuth"],
+ **options
+ )
+ else
+ Aws::BedrockRuntime::Client.new(region: region, **options)
+ end
+ end
+
+ private
+
+ def perform_generate_text(messages, model:, reasoning: nil)
+ partitioned_messages = partition_messages(messages)
+
+ params = {
+ model_id: model,
+ system: partitioned_messages[:system],
+ messages: partitioned_messages[:conversation]
+ }
+
+ response = @client.converse(**params)
+ extract_assistant_message(response)
+ end
+
+ def perform_stream_text(messages, model:, reasoning: nil)
+ Enumerator.new do |yielder|
+ partitioned_messages = partition_messages(messages)
+
+ params = {
+ model_id: model,
+ system: partitioned_messages[:system],
+ messages: partitioned_messages[:conversation]
+ }
+
+ accumulated_text = ""
+
+ @client.converse_stream(**params) do |stream|
+ stream.on_content_block_delta_event do |event|
+ if event.delta&.text
+ delta_text = event.delta.text
+ accumulated_text += delta_text
+ yielder << Riffer::StreamEvents::TextDelta.new(delta_text)
+ end
+ end
+
+ stream.on_message_stop_event do |_event|
+ yielder << Riffer::StreamEvents::TextDone.new(accumulated_text)
+ end
+ end
+ end
+ end
+
+ def partition_messages(messages)
+ system_prompts = []
+ conversation_messages = []
+
+ messages.each do |message|
+ case message
+ when Riffer::Messages::System
+ system_prompts << {text: message.content}
+ when Riffer::Messages::User
+ conversation_messages << {role: "user", content: [{text: message.content}]}
+ when Riffer::Messages::Assistant
+ conversation_messages << {role: "assistant", content: [{text: message.content}]}
+ when Riffer::Messages::Tool
+ raise NotImplementedError, "Tool messages are not supported by Amazon Bedrock provider yet"
+ end
+ end
+
+ {
+ system: system_prompts,
+ conversation: conversation_messages
+ }
+ end
+
+ def extract_assistant_message(response)
+ output = response.output
+ raise Riffer::Error, "No output returned from Bedrock API" if output.nil? || output.message.nil?
+
+ content_blocks = output.message.content
+ raise Riffer::Error, "No content returned from Bedrock API" if content_blocks.nil? || content_blocks.empty?
+
+ text_block = content_blocks.find { |block| block.respond_to?(:text) && block.text }
+ raise Riffer::Error, "No text content returned from Bedrock API" if text_block.nil?
+
+ Riffer::Messages::Assistant.new(text_block.text)
+ end
+ end
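The key translation step in the new provider is `partition_messages`: Bedrock's Converse API takes system prompts as a separate `system` array, while user/assistant turns become role-tagged content blocks. A self-contained sketch of that shape, using `Struct`s as stand-ins for the Riffer message classes (the stand-in names are ours):

```ruby
# Stand-ins for Riffer::Messages::System / User / Assistant.
SystemMsg = Struct.new(:content)
UserMsg = Struct.new(:content)
AssistantMsg = Struct.new(:content)

# Mirrors the provider's partitioning: system prompts are hoisted out of the
# conversation; each turn becomes a Converse-style role + content-block hash.
def partition_messages(messages)
  system_prompts = []
  conversation = []
  messages.each do |m|
    case m
    when SystemMsg then system_prompts << {text: m.content}
    when UserMsg then conversation << {role: "user", content: [{text: m.content}]}
    when AssistantMsg then conversation << {role: "assistant", content: [{text: m.content}]}
    end
  end
  {system: system_prompts, conversation: conversation}
end
```

Note the `reasoning:` keyword is accepted but unused in this provider for now; only the OpenAI provider wires it through to the request.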
@@ -10,12 +10,13 @@ class Riffer::Providers::Base
  # @param system [String, nil] an optional system message
  # @param messages [Array<Hash, Riffer::Messages::Base>, nil] optional messages array
  # @param model [String, nil] optional model string to override the configured model
+ # @param reasoning [String, nil] optional reasoning level or instructions
  # @return [Riffer::Messages::Assistant] the generated assistant message
- def generate_text(prompt: nil, system: nil, messages: nil, model: nil)
+ def generate_text(prompt: nil, system: nil, messages: nil, model: nil, reasoning: nil)
  validate_input!(prompt: prompt, system: system, messages: messages)
  normalized_messages = normalize_messages(prompt: prompt, system: system, messages: messages)
  validate_normalized_messages!(normalized_messages)
- perform_generate_text(normalized_messages, model: model)
+ perform_generate_text(normalized_messages, model: model, reasoning: reasoning)
  end

  # Streams text from the provider.
@@ -24,21 +25,22 @@ class Riffer::Providers::Base
  # @param system [String, nil] an optional system message
  # @param messages [Array<Hash, Riffer::Messages::Base>, nil] optional messages array
  # @param model [String, nil] optional model string to override the configured model
+ # @param reasoning [String, nil] optional reasoning level or instructions
  # @return [Enumerator] an enumerator yielding stream events or chunks (provider-specific)
- def stream_text(prompt: nil, system: nil, messages: nil, model: nil)
+ def stream_text(prompt: nil, system: nil, messages: nil, model: nil, reasoning: nil)
  validate_input!(prompt: prompt, system: system, messages: messages)
  normalized_messages = normalize_messages(prompt: prompt, system: system, messages: messages)
  validate_normalized_messages!(normalized_messages)
- perform_stream_text(normalized_messages, model: model)
+ perform_stream_text(normalized_messages, model: model, reasoning: reasoning)
  end

  private

- def perform_generate_text(messages, model: nil)
+ def perform_generate_text(messages, model: nil, reasoning: nil)
  raise NotImplementedError, "Subclasses must implement #perform_generate_text"
  end

- def perform_stream_text(messages, model: nil)
+ def perform_stream_text(messages, model: nil, reasoning: nil)
  raise NotImplementedError, "Subclasses must implement #perform_stream_text"
  end

@@ -17,8 +17,8 @@ class Riffer::Providers::OpenAI < Riffer::Providers::Base

  private

- def perform_generate_text(messages, model:)
- params = build_request_params(messages, model)
+ def perform_generate_text(messages, model:, reasoning: nil)
+ params = build_request_params(messages, model, reasoning)
  response = @client.responses.create(params)

  output = response.output.find { |o| o.type == :message }
@@ -44,19 +44,23 @@ class Riffer::Providers::OpenAI < Riffer::Providers::Base
  Riffer::Messages::Assistant.new(content.text)
  end

- def perform_stream_text(messages, model:)
+ def perform_stream_text(messages, model:, reasoning: nil)
  Enumerator.new do |yielder|
- params = build_request_params(messages, model)
+ params = build_request_params(messages, model, reasoning)
  stream = @client.responses.stream(params)

  process_stream_events(stream, yielder)
  end
  end

- def build_request_params(messages, model)
+ def build_request_params(messages, model, reasoning)
  {
+ input: convert_message_to_openai_format(messages),
  model: model,
- input: convert_message_to_openai_format(messages)
+ reasoning: reasoning && {
+ effort: reasoning,
+ summary: "auto"
+ }
  }
  end

@@ -91,6 +95,10 @@ class Riffer::Providers::OpenAI < Riffer::Providers::Base
  Riffer::StreamEvents::TextDelta.new(event.delta)
  when :"response.output_text.done"
  Riffer::StreamEvents::TextDone.new(event.text)
+ when :"response.reasoning_summary_text.delta"
+ Riffer::StreamEvents::ReasoningDelta.new(event.delta)
+ when :"response.reasoning_summary_text.done"
+ Riffer::StreamEvents::ReasoningDone.new(event.text)
  end
  end
  end
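The `build_request_params` change uses a small Ruby idiom worth noting: `reasoning && {...}` yields `nil` when no reasoning level is set, so the `reasoning` key carries `nil` rather than an empty hash. A minimal sketch of just that shape (the method name here is hypothetical):

```ruby
# Hypothetical reduction of OpenAI#build_request_params to the reasoning logic:
# nil reasoning leaves the key nil; a level string becomes an effort/summary hash.
def build_reasoning_params(model, reasoning)
  {
    model: model,
    reasoning: reasoning && {effort: reasoning, summary: "auto"}
  }
end
```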
@@ -1,6 +1,7 @@
  class Riffer::Providers::Repository
  REPO = {
  openai: -> { Riffer::Providers::OpenAI },
+ amazon_bedrock: -> { Riffer::Providers::AmazonBedrock },
  test: -> { Riffer::Providers::Test }
  }.freeze

@@ -8,16 +8,18 @@ class Riffer::Providers::Test < Riffer::Providers::Base
  @current_index = 0
  @calls = []
  @stubbed_response = nil
+ @stubbed_reasoning = nil
  end

- def stub_response(content, tool_calls: [])
+ def stub_response(content, tool_calls: [], reasoning: nil)
  @stubbed_response = {role: "assistant", content: content, tool_calls: tool_calls}
+ @stubbed_reasoning = reasoning
  end

  private

- def perform_generate_text(messages, model: nil)
- @calls << {messages: messages.map(&:to_h)}
+ def perform_generate_text(messages, model: nil, reasoning: nil)
+ @calls << {messages: messages.map(&:to_h), model: model, reasoning: reasoning}
  response = @stubbed_response || @responses[@current_index] || {role: "assistant", content: "Test response"}
  @current_index += 1

@@ -28,11 +30,21 @@ class Riffer::Providers::Test < Riffer::Providers::Base
  end
  end

- def perform_stream_text(messages, model: nil)
- @calls << {messages: messages.map(&:to_h)}
+ def perform_stream_text(messages, model: nil, reasoning: nil)
+ @calls << {messages: messages.map(&:to_h), model: model, reasoning: reasoning}
  response = @stubbed_response || @responses[@current_index] || {role: "assistant", content: "Test response"}
  @current_index += 1
  Enumerator.new do |yielder|
+ if @stubbed_reasoning
+ reasoning_parts = @stubbed_reasoning.split(". ").map { |part| part + (part.end_with?(".") ? "" : ".") }
+
+ reasoning_parts.each do |part|
+ yielder << Riffer::StreamEvents::ReasoningDelta.new(part + " ")
+ end
+
+ yielder << Riffer::StreamEvents::ReasoningDone.new(@stubbed_reasoning)
+ end
+
  full_content = response[:content]
  content_parts = full_content.split(". ").map { |part| part + (part.end_with?(".") ? "" : ".") }

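The Test provider fakes streaming for both reasoning and content the same way: stubbed text is split on `". "` and the period re-appended, so each sentence arrives as one delta. That chunking is self-contained and can be reproduced directly (the method name is ours):

```ruby
# Reproduction of the Test provider's sentence-chunking idiom: split on ". ",
# then restore the trailing period that split consumed.
def chunk_sentences(text)
  text.split(". ").map { |part| part + (part.end_with?(".") ? "" : ".") }
end
```

One caveat visible in the reproduction: text without trailing punctuation gains a period (`"Hello. World"` yields `["Hello.", "World."]`), which is acceptable for a test stub but worth knowing when asserting on deltas.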
@@ -0,0 +1,14 @@
+ # frozen_string_literal: true
+
+ class Riffer::StreamEvents::ReasoningDelta < Riffer::StreamEvents::Base
+ attr_reader :content
+
+ def initialize(content, role: "assistant")
+ super(role: role)
+ @content = content
+ end
+
+ def to_h
+ {role: @role, content: @content}
+ end
+ end
@@ -0,0 +1,14 @@
+ # frozen_string_literal: true
+
+ class Riffer::StreamEvents::ReasoningDone < Riffer::StreamEvents::Base
+ attr_reader :content
+
+ def initialize(content, role: "assistant")
+ super(role: role)
+ @content = content
+ end
+
+ def to_h
+ {role: @role, content: @content}
+ end
+ end
@@ -1,5 +1,5 @@
  # frozen_string_literal: true

  module Riffer
- VERSION = "0.5.1"
+ VERSION = "0.6.0"
  end
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: riffer
  version: !ruby/object:Gem::Version
- version: 0.5.1
+ version: 0.6.0
  platform: ruby
  authors:
  - Jake Bottrall
@@ -29,6 +29,20 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: 2.6.0
+ - !ruby/object:Gem::Dependency
+ name: aws-sdk-bedrockruntime
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.0'
+ type: :development
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.0'
  - !ruby/object:Gem::Dependency
  name: openai
  requirement: !ruby/object:Gem::Requirement
@@ -130,6 +144,7 @@ files:
  - ".ruby-version"
  - ".standard.yml"
  - CHANGELOG.md
+ - CLAUDE.md
  - CODE_OF_CONDUCT.md
  - LICENSE.txt
  - README.md
@@ -138,6 +153,7 @@ files:
  - lib/riffer/agent.rb
  - lib/riffer/config.rb
  - lib/riffer/core.rb
+ - lib/riffer/helpers/class_name_converter.rb
  - lib/riffer/helpers/dependencies.rb
  - lib/riffer/helpers/validations.rb
  - lib/riffer/messages.rb
@@ -148,12 +164,15 @@ files:
  - lib/riffer/messages/tool.rb
  - lib/riffer/messages/user.rb
  - lib/riffer/providers.rb
+ - lib/riffer/providers/amazon_bedrock.rb
  - lib/riffer/providers/base.rb
  - lib/riffer/providers/open_ai.rb
  - lib/riffer/providers/repository.rb
  - lib/riffer/providers/test.rb
  - lib/riffer/stream_events.rb
  - lib/riffer/stream_events/base.rb
+ - lib/riffer/stream_events/reasoning_delta.rb
+ - lib/riffer/stream_events/reasoning_done.rb
  - lib/riffer/stream_events/text_delta.rb
  - lib/riffer/stream_events/text_done.rb
  - lib/riffer/version.rb