riffer 0.6.1 → 0.8.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (60)
  1. checksums.yaml +4 -4
  2. data/.agents/architecture.md +113 -0
  3. data/.agents/code-style.md +42 -0
  4. data/.agents/providers.md +46 -0
  5. data/.agents/rdoc.md +51 -0
  6. data/.agents/testing.md +56 -0
  7. data/.release-please-manifest.json +1 -1
  8. data/AGENTS.md +28 -0
  9. data/CHANGELOG.md +17 -0
  10. data/README.md +26 -36
  11. data/Rakefile +1 -1
  12. data/docs/01_OVERVIEW.md +106 -0
  13. data/docs/02_GETTING_STARTED.md +128 -0
  14. data/docs/03_AGENTS.md +226 -0
  15. data/docs/04_TOOLS.md +251 -0
  16. data/docs/05_MESSAGES.md +173 -0
  17. data/docs/06_STREAM_EVENTS.md +191 -0
  18. data/docs/07_CONFIGURATION.md +195 -0
  19. data/docs_providers/01_PROVIDERS.md +168 -0
  20. data/docs_providers/02_AMAZON_BEDROCK.md +196 -0
  21. data/docs_providers/03_ANTHROPIC.md +211 -0
  22. data/docs_providers/04_OPENAI.md +157 -0
  23. data/docs_providers/05_TEST_PROVIDER.md +163 -0
  24. data/docs_providers/06_CUSTOM_PROVIDERS.md +304 -0
  25. data/lib/riffer/agent.rb +220 -57
  26. data/lib/riffer/config.rb +20 -12
  27. data/lib/riffer/core.rb +7 -7
  28. data/lib/riffer/helpers/class_name_converter.rb +6 -3
  29. data/lib/riffer/helpers/dependencies.rb +18 -0
  30. data/lib/riffer/helpers/validations.rb +9 -0
  31. data/lib/riffer/messages/assistant.rb +23 -1
  32. data/lib/riffer/messages/base.rb +15 -0
  33. data/lib/riffer/messages/converter.rb +15 -5
  34. data/lib/riffer/messages/system.rb +8 -1
  35. data/lib/riffer/messages/tool.rb +58 -4
  36. data/lib/riffer/messages/user.rb +8 -1
  37. data/lib/riffer/messages.rb +7 -0
  38. data/lib/riffer/providers/amazon_bedrock.rb +128 -13
  39. data/lib/riffer/providers/anthropic.rb +209 -0
  40. data/lib/riffer/providers/base.rb +23 -18
  41. data/lib/riffer/providers/open_ai.rb +119 -39
  42. data/lib/riffer/providers/repository.rb +9 -4
  43. data/lib/riffer/providers/test.rb +78 -24
  44. data/lib/riffer/providers.rb +6 -0
  45. data/lib/riffer/stream_events/base.rb +13 -1
  46. data/lib/riffer/stream_events/reasoning_delta.rb +15 -1
  47. data/lib/riffer/stream_events/reasoning_done.rb +15 -1
  48. data/lib/riffer/stream_events/text_delta.rb +14 -1
  49. data/lib/riffer/stream_events/text_done.rb +14 -1
  50. data/lib/riffer/stream_events/tool_call_delta.rb +35 -0
  51. data/lib/riffer/stream_events/tool_call_done.rb +40 -0
  52. data/lib/riffer/stream_events.rb +9 -0
  53. data/lib/riffer/tool.rb +120 -0
  54. data/lib/riffer/tools/param.rb +68 -0
  55. data/lib/riffer/tools/params.rb +118 -0
  56. data/lib/riffer/tools.rb +9 -0
  57. data/lib/riffer/version.rb +1 -1
  58. data/lib/riffer.rb +23 -19
  59. metadata +41 -2
  60. data/CLAUDE.md +0 -73
data/docs_providers/01_PROVIDERS.md
@@ -0,0 +1,168 @@
# Providers Overview

Providers are adapters that connect Riffer to LLM services. They implement a common interface for text generation and streaming.

## Available Providers

| Provider | Identifier | Gem Required |
| -------------- | ---------------- | ------------------------ |
| OpenAI | `openai` | `openai` |
| Amazon Bedrock | `amazon_bedrock` | `aws-sdk-bedrockruntime` |
| Anthropic | `anthropic` | `anthropic` |
| Test | `test` | None |

## Model String Format

Agents specify providers using the `provider/model` format:

```ruby
class MyAgent < Riffer::Agent
  model 'openai/gpt-4o' # OpenAI
  model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0' # Bedrock
  model 'anthropic/claude-3-5-sonnet-20241022' # Anthropic
  model 'test/any' # Test provider
end
```

## Provider Interface

All providers inherit from `Riffer::Providers::Base` and implement:

### generate_text

Generates a response synchronously:

```ruby
provider = Riffer::Providers::OpenAI.new(api_key: "...")

response = provider.generate_text(
  prompt: "Hello!",
  model: "gpt-4o"
)
# => Riffer::Messages::Assistant

# Or with messages
response = provider.generate_text(
  messages: [Riffer::Messages::User.new("Hello!")],
  model: "gpt-4o"
)
```

### stream_text

Streams a response as an Enumerator:

```ruby
provider.stream_text(prompt: "Tell me a story", model: "gpt-4o").each do |event|
  case event
  when Riffer::StreamEvents::TextDelta
    print event.content
  end
end
```
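
Because the returned value is a plain Enumerator, you can also fold the stream into a single string. A minimal sketch using the same `TextDelta` event as above:

```ruby
# Keep only the text deltas and join them into the full response text
full_text = provider
  .stream_text(prompt: "Tell me a story", model: "gpt-4o")
  .select { |event| event.is_a?(Riffer::StreamEvents::TextDelta) }
  .map(&:content)
  .join

puts full_text
```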

## Method Parameters

| Parameter | Description |
| ----------- | --------------------------------------------------------- |
| `prompt` | String prompt (required if `messages` not provided) |
| `system` | Optional system message string |
| `messages` | Array of message objects/hashes (alternative to `prompt`) |
| `model` | Model name string |
| `tools` | Array of Tool classes |
| `**options` | Provider-specific options |

You must provide either `prompt` or `messages`, but not both.
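
For example, a single call that combines several of these parameters (a sketch; it assumes the provider accepts `temperature` as one of its pass-through `**options`):

```ruby
response = provider.generate_text(
  prompt: "Summarize Ruby's garbage collector in two sentences",
  system: "Answer concisely.",
  model: "gpt-4o",
  temperature: 0.2 # forwarded to the provider via **options
)
```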

## Using Providers Directly

While agents abstract provider usage, you can use providers directly:

```ruby
require 'riffer'

Riffer.configure do |config|
  config.openai.api_key = ENV['OPENAI_API_KEY']
end

provider = Riffer::Providers::OpenAI.new

# Simple prompt
response = provider.generate_text(
  prompt: "What is Ruby?",
  model: "gpt-4o"
)
puts response.content

# With system message
response = provider.generate_text(
  prompt: "Explain recursion",
  system: "You are a programming tutor. Use simple language.",
  model: "gpt-4o"
)

# With message history
messages = [
  Riffer::Messages::System.new("You are helpful."),
  Riffer::Messages::User.new("Hi!"),
  Riffer::Messages::Assistant.new("Hello!"),
  Riffer::Messages::User.new("How are you?")
]

response = provider.generate_text(
  messages: messages,
  model: "gpt-4o"
)
```

## Tool Support

Providers convert tools to their native format:

```ruby
class WeatherTool < Riffer::Tool
  description "Gets weather"

  params do
    required :city, String
  end

  def call(context:, city:)
    "Sunny in #{city}"
  end
end

response = provider.generate_text(
  prompt: "What's the weather in Tokyo?",
  model: "gpt-4o",
  tools: [WeatherTool]
)

if response.tool_calls.any?
  # Handle tool calls
end
```
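
To make "native format" concrete: for the OpenAI provider, the `WeatherTool` above would correspond roughly to a function-tool definition like the one below. This is only an illustration of the shape the provider builds internally, and the derived tool name is an assumption:

```ruby
# Illustrative OpenAI function-tool shape (built by the provider, not by you)
{
  type: "function",
  function: {
    name: "weather_tool", # assumed: derived from the tool class name
    description: "Gets weather",
    parameters: {
      type: "object",
      properties: {city: {type: "string"}},
      required: ["city"]
    }
  }
}
```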

## Provider Registry

Riffer uses a registry to find providers by identifier:

```ruby
Riffer::Providers::Repository.find(:openai)
# => Riffer::Providers::OpenAI

Riffer::Providers::Repository.find(:amazon_bedrock)
# => Riffer::Providers::AmazonBedrock

Riffer::Providers::Repository.find(:anthropic)
# => Riffer::Providers::Anthropic

Riffer::Providers::Repository.find(:test)
# => Riffer::Providers::Test
```
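
This is roughly how an agent's `provider/model` string maps onto the registry (a simplified sketch, not the gem's exact internal code):

```ruby
model_string = "openai/gpt-4o"
provider_id, model = model_string.split("/", 2)

provider_class = Riffer::Providers::Repository.find(provider_id.to_sym)
provider = provider_class.new

response = provider.generate_text(prompt: "Hello!", model: model)
```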

## Provider-Specific Guides

- [Amazon Bedrock](02_AMAZON_BEDROCK.md) - Claude and other models via AWS
- [Anthropic](03_ANTHROPIC.md) - Claude models via Anthropic API
- [OpenAI](04_OPENAI.md) - GPT models
- [Test](05_TEST_PROVIDER.md) - Mock provider for testing
- [Custom Providers](06_CUSTOM_PROVIDERS.md) - Creating your own provider

data/docs_providers/02_AMAZON_BEDROCK.md
@@ -0,0 +1,196 @@
# Amazon Bedrock Provider

The Amazon Bedrock provider connects to AWS Bedrock for Claude and other foundation models.

## Installation

Add the AWS SDK gem to your Gemfile:

```ruby
gem 'aws-sdk-bedrockruntime'
```

## Configuration

### IAM Authentication (Recommended)

Configure your AWS credentials using standard AWS methods (environment variables, IAM roles, etc.):

```ruby
Riffer.configure do |config|
  config.amazon_bedrock.region = 'us-east-1'
end
```

### Bearer Token Authentication

For API token authentication:

```ruby
Riffer.configure do |config|
  config.amazon_bedrock.region = 'us-east-1'
  config.amazon_bedrock.api_token = ENV['BEDROCK_API_TOKEN']
end
```

Or per-agent:

```ruby
class MyAgent < Riffer::Agent
  model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'
  provider_options region: 'us-west-2', api_token: ENV['BEDROCK_API_TOKEN']
end
```

## Supported Models

Use Bedrock model IDs in the `amazon_bedrock/model` format:

```ruby
# Claude models
model 'amazon_bedrock/anthropic.claude-3-opus-20240229-v1:0'
model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'
model 'amazon_bedrock/anthropic.claude-3-haiku-20240307-v1:0'

# Other foundation models available in Bedrock
model 'amazon_bedrock/amazon.titan-text-express-v1'
```

## Model Options

### temperature

Controls randomness:

```ruby
model_options temperature: 0.7
```

### max_tokens

Maximum tokens in response:

```ruby
model_options max_tokens: 4096
```

### top_p

Nucleus sampling parameter:

```ruby
model_options top_p: 0.95
```

### top_k

Top-k sampling parameter:

```ruby
model_options top_k: 250
```

## Example

```ruby
Riffer.configure do |config|
  config.amazon_bedrock.region = 'us-east-1'
end

class AssistantAgent < Riffer::Agent
  model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'
  instructions 'You are a helpful assistant.'
  model_options temperature: 0.7, max_tokens: 4096
end

agent = AssistantAgent.new
puts agent.generate("Explain cloud computing")
```

## Streaming

```ruby
agent.stream("Tell me about AWS services").each do |event|
  case event
  when Riffer::StreamEvents::TextDelta
    print event.content
  when Riffer::StreamEvents::TextDone
    puts "\n[Complete]"
  when Riffer::StreamEvents::ToolCallDone
    puts "[Tool: #{event.name}]"
  end
end
```

## Tool Calling

The Bedrock provider converts tools to Bedrock's `tool_config` format:

```ruby
class S3ListTool < Riffer::Tool
  description "Lists objects in an S3 bucket"

  params do
    required :bucket, String, description: "The S3 bucket name"
    optional :prefix, String, description: "Object prefix filter"
  end

  def call(context:, bucket:, prefix: nil)
    # Implementation
    "Found 10 objects in #{bucket}"
  end
end

class AWSAgent < Riffer::Agent
  model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'
  uses_tools [S3ListTool]
end
```

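For example, running the agent end to end (the model decides when to invoke the tool; the bucket name is just a placeholder):

```ruby
agent = AWSAgent.new
puts agent.generate("List the objects in the my-data bucket")
```
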
## Message Format

The provider converts Riffer messages to Bedrock format:

| Riffer Message | Bedrock Format |
| -------------- | ----------------------------------------------- |
| `System` | Added to `system` array as `{text: ...}` |
| `User` | `{role: "user", content: [{text: ...}]}` |
| `Assistant` | `{role: "assistant", content: [...]}` |
| `Tool` | `{role: "user", content: [{tool_result: ...}]}` |

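Putting the table together, a request built from a short conversation would look roughly like this. An illustrative sketch only; the provider assembles the real payload internally through the AWS SDK:

```ruby
# Approximate shape of the converted request body
{
  model_id: "anthropic.claude-3-sonnet-20240229-v1:0",
  system: [{text: "You are helpful."}],
  messages: [
    {role: "user", content: [{text: "Hi!"}]},
    {role: "assistant", content: [{text: "Hello!"}]}
  ]
}
```
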
## Direct Provider Usage

```ruby
provider = Riffer::Providers::AmazonBedrock.new(
  region: 'us-east-1',
  api_token: ENV['BEDROCK_API_TOKEN'] # Optional
)

response = provider.generate_text(
  prompt: "Hello!",
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  temperature: 0.7
)

puts response.content
```

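Streaming works the same way when calling the provider directly, using the `stream_text` interface shared by all providers:

```ruby
provider.stream_text(
  prompt: "Summarize the AWS shared responsibility model",
  model: "anthropic.claude-3-sonnet-20240229-v1:0"
).each do |event|
  print event.content if event.is_a?(Riffer::StreamEvents::TextDelta)
end
```
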
## AWS IAM Permissions

Ensure your IAM role/user has the following permissions:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:*::foundation-model/*"
    }
  ]
}
```

data/docs_providers/03_ANTHROPIC.md
@@ -0,0 +1,211 @@
# Anthropic Provider

The Anthropic provider connects to Claude models via the Anthropic API.

## Installation

Add the Anthropic gem to your Gemfile:

```ruby
gem 'anthropic'
```

## Configuration

Configure your Anthropic API key:

```ruby
Riffer.configure do |config|
  config.anthropic.api_key = ENV['ANTHROPIC_API_KEY']
end
```

Or per-agent:

```ruby
class MyAgent < Riffer::Agent
  model 'anthropic/claude-haiku-4-5-20251001'
  provider_options api_key: ENV['ANTHROPIC_API_KEY']
end
```

## Supported Models

Use Anthropic model IDs in the `anthropic/model` format:

```ruby
model 'anthropic/claude-haiku-4-5-20251001'
model 'anthropic/claude-sonnet-4-5-20250929'
model 'anthropic/claude-opus-4-5-20251101'
```

## Model Options

### temperature

Controls randomness:

```ruby
model_options temperature: 0.7
```

### max_tokens

Maximum tokens in response:

```ruby
model_options max_tokens: 4096
```

### top_p

Nucleus sampling parameter:

```ruby
model_options top_p: 0.95
```

### top_k

Top-k sampling parameter:

```ruby
model_options top_k: 250
```

### thinking

Enables extended thinking (reasoning) for supported models. The thinking configuration hash is passed through unchanged, in the format the Anthropic API expects:

```ruby
# Enable with budget tokens
model_options thinking: {type: "enabled", budget_tokens: 10000}
```

## Example

```ruby
Riffer.configure do |config|
  config.anthropic.api_key = ENV['ANTHROPIC_API_KEY']
end

class AssistantAgent < Riffer::Agent
  model 'anthropic/claude-haiku-4-5-20251001'
  instructions 'You are a helpful assistant.'
  model_options temperature: 0.7, max_tokens: 4096
end

agent = AssistantAgent.new
puts agent.generate("Explain quantum computing")
```

## Streaming

```ruby
agent.stream("Tell me about Claude models").each do |event|
  case event
  when Riffer::StreamEvents::TextDelta
    print event.content
  when Riffer::StreamEvents::TextDone
    puts "\n[Complete]"
  when Riffer::StreamEvents::ReasoningDelta
    print "[Thinking] #{event.content}"
  when Riffer::StreamEvents::ReasoningDone
    puts "\n[Thinking Complete]"
  when Riffer::StreamEvents::ToolCallDone
    puts "[Tool: #{event.name}]"
  end
end
```

## Tool Calling

The Anthropic provider converts tools to Anthropic's tool format:

```ruby
class WeatherTool < Riffer::Tool
  description "Gets the current weather for a location"

  params do
    required :city, String, description: "The city name"
    optional :unit, String, description: "Temperature unit (celsius or fahrenheit)"
  end

  def call(context:, city:, unit: "celsius")
    # Implementation
    "It's 22 degrees #{unit} in #{city}"
  end
end

class WeatherAgent < Riffer::Agent
  model 'anthropic/claude-haiku-4-5-20251001'
  uses_tools [WeatherTool]
end
```
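
For example, running the agent end to end (the model decides when to invoke the tool):

```ruby
agent = WeatherAgent.new
puts agent.generate("What's the weather in Paris, in fahrenheit?")
```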

## Extended Thinking

Extended thinking enables Claude to reason through complex problems before responding. It is available on Claude models that support thinking (Claude 3.7 Sonnet and later).

```ruby
class ReasoningAgent < Riffer::Agent
  model 'anthropic/claude-haiku-4-5-20251001'
  model_options thinking: {type: "enabled", budget_tokens: 10000}
end
```

When streaming with extended thinking enabled, you'll receive `ReasoningDelta` events containing the model's thought process, followed by a `ReasoningDone` event when thinking completes:

```ruby
agent.stream("Solve this complex math problem").each do |event|
  case event
  when Riffer::StreamEvents::ReasoningDelta
    # Model's internal reasoning
    print "[Thinking] #{event.content}"
  when Riffer::StreamEvents::ReasoningDone
    puts "\n[Thinking complete]"
  when Riffer::StreamEvents::TextDelta
    # Final response
    print event.content
  end
end
```

## Message Format

The provider converts Riffer messages to Anthropic format:

| Riffer Message | Anthropic Format |
| -------------- | --------------------------------------------------------------- |
| `System` | Added to `system` array as `{type: "text", text: ...}` |
| `User` | `{role: "user", content: "..."}` |
| `Assistant` | `{role: "assistant", content: [...]}` with text/tool_use blocks |
| `Tool` | `{role: "user", content: [{type: "tool_result", ...}]}` |

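Putting the table together, a converted request for a short conversation looks roughly like this. An illustrative sketch only; the provider builds the real payload internally through the `anthropic` gem:

```ruby
# Approximate shape of the converted request
{
  model: "claude-haiku-4-5-20251001",
  system: [{type: "text", text: "You are helpful."}],
  messages: [
    {role: "user", content: "Hi!"},
    {role: "assistant", content: [{type: "text", text: "Hello!"}]}
  ]
}
```
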
## Direct Provider Usage

```ruby
provider = Riffer::Providers::Anthropic.new(
  api_key: ENV['ANTHROPIC_API_KEY']
)

response = provider.generate_text(
  prompt: "Hello!",
  model: "claude-haiku-4-5-20251001",
  temperature: 0.7
)

puts response.content
```

### With Extended Thinking

```ruby
response = provider.generate_text(
  prompt: "Explain step by step how to solve a Rubik's cube",
  model: "claude-haiku-4-5-20251001",
  thinking: { type: "enabled", budget_tokens: 10000 }
)

puts response.content
```