riffer 0.7.0 → 0.9.0

Files changed (60)
  1. checksums.yaml +4 -4
  2. data/.agents/architecture.md +113 -0
  3. data/.agents/code-style.md +42 -0
  4. data/.agents/providers.md +46 -0
  5. data/.agents/rdoc.md +51 -0
  6. data/.agents/testing.md +56 -0
  7. data/.release-please-manifest.json +1 -1
  8. data/AGENTS.md +21 -308
  9. data/CHANGELOG.md +17 -0
  10. data/README.md +21 -112
  11. data/Rakefile +1 -1
  12. data/docs/01_OVERVIEW.md +106 -0
  13. data/docs/02_GETTING_STARTED.md +128 -0
  14. data/docs/03_AGENTS.md +226 -0
  15. data/docs/04_TOOLS.md +342 -0
  16. data/docs/05_MESSAGES.md +173 -0
  17. data/docs/06_STREAM_EVENTS.md +191 -0
  18. data/docs/07_CONFIGURATION.md +195 -0
  19. data/docs_providers/01_PROVIDERS.md +168 -0
  20. data/docs_providers/02_AMAZON_BEDROCK.md +196 -0
  21. data/docs_providers/03_ANTHROPIC.md +211 -0
  22. data/docs_providers/04_OPENAI.md +157 -0
  23. data/docs_providers/05_TEST_PROVIDER.md +163 -0
  24. data/docs_providers/06_CUSTOM_PROVIDERS.md +304 -0
  25. data/lib/riffer/agent.rb +103 -63
  26. data/lib/riffer/config.rb +20 -12
  27. data/lib/riffer/core.rb +7 -7
  28. data/lib/riffer/helpers/class_name_converter.rb +6 -3
  29. data/lib/riffer/helpers/dependencies.rb +18 -0
  30. data/lib/riffer/helpers/validations.rb +9 -0
  31. data/lib/riffer/messages/assistant.rb +23 -1
  32. data/lib/riffer/messages/base.rb +15 -0
  33. data/lib/riffer/messages/converter.rb +15 -5
  34. data/lib/riffer/messages/system.rb +8 -1
  35. data/lib/riffer/messages/tool.rb +45 -2
  36. data/lib/riffer/messages/user.rb +8 -1
  37. data/lib/riffer/messages.rb +7 -0
  38. data/lib/riffer/providers/amazon_bedrock.rb +8 -4
  39. data/lib/riffer/providers/anthropic.rb +209 -0
  40. data/lib/riffer/providers/base.rb +17 -12
  41. data/lib/riffer/providers/open_ai.rb +7 -1
  42. data/lib/riffer/providers/repository.rb +9 -4
  43. data/lib/riffer/providers/test.rb +25 -7
  44. data/lib/riffer/providers.rb +6 -0
  45. data/lib/riffer/stream_events/base.rb +13 -1
  46. data/lib/riffer/stream_events/reasoning_delta.rb +15 -1
  47. data/lib/riffer/stream_events/reasoning_done.rb +15 -1
  48. data/lib/riffer/stream_events/text_delta.rb +14 -1
  49. data/lib/riffer/stream_events/text_done.rb +14 -1
  50. data/lib/riffer/stream_events/tool_call_delta.rb +18 -11
  51. data/lib/riffer/stream_events/tool_call_done.rb +22 -12
  52. data/lib/riffer/stream_events.rb +9 -0
  53. data/lib/riffer/tool.rb +92 -25
  54. data/lib/riffer/tools/param.rb +19 -16
  55. data/lib/riffer/tools/params.rb +28 -22
  56. data/lib/riffer/tools/response.rb +90 -0
  57. data/lib/riffer/tools.rb +6 -0
  58. data/lib/riffer/version.rb +1 -1
  59. data/lib/riffer.rb +21 -21
  60. metadata +35 -1
@@ -0,0 +1,195 @@ data/docs/07_CONFIGURATION.md
# Configuration

Riffer uses a centralized configuration system for provider credentials and settings.

## Global Configuration

Use `Riffer.configure` to set up provider credentials:

```ruby
Riffer.configure do |config|
  config.openai.api_key = ENV['OPENAI_API_KEY']
  config.amazon_bedrock.region = 'us-east-1'
  config.amazon_bedrock.api_token = ENV['BEDROCK_API_TOKEN']
  config.anthropic.api_key = ENV['ANTHROPIC_API_KEY']
end
```

## Accessing Configuration

Access the current configuration via `Riffer.config`:

```ruby
Riffer.config.openai.api_key
# => "sk-..."

Riffer.config.amazon_bedrock.region
# => "us-east-1"

Riffer.config.anthropic.api_key
# => "sk-ant-..."
```

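The `configure`/`config` pair is the familiar Ruby configuration-block pattern. As a rough, generic sketch of how such an interface behaves (illustrative only, not riffer's actual implementation; `ExampleLib` is a made-up name):

```ruby
require 'ostruct'

# A generic configure-block pattern (illustrative; not riffer's internals).
module ExampleLib
  def self.config
    # Memoize one nested config object per process.
    @config ||= OpenStruct.new(openai: OpenStruct.new)
  end

  def self.configure
    yield config
  end
end

ExampleLib.configure { |c| c.openai.api_key = 'sk-test' }
ExampleLib.config.openai.api_key # => "sk-test"
```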
## Provider-Specific Configuration

### OpenAI

| Option    | Description         |
| --------- | ------------------- |
| `api_key` | Your OpenAI API key |

```ruby
Riffer.configure do |config|
  config.openai.api_key = ENV['OPENAI_API_KEY']
end
```

### Amazon Bedrock

| Option      | Description                                  |
| ----------- | -------------------------------------------- |
| `region`    | AWS region (e.g., `us-east-1`)               |
| `api_token` | Optional bearer token for API authentication |

```ruby
Riffer.configure do |config|
  config.amazon_bedrock.region = 'us-east-1'
  config.amazon_bedrock.api_token = ENV['BEDROCK_API_TOKEN'] # Optional
end
```

When `api_token` is not set, the provider uses standard AWS IAM authentication.

### Anthropic

| Option    | Description            |
| --------- | ---------------------- |
| `api_key` | Your Anthropic API key |

```ruby
Riffer.configure do |config|
  config.anthropic.api_key = ENV['ANTHROPIC_API_KEY']
end
```

## Agent-Level Configuration

Override global configuration at the agent level:

### provider_options

Pass options directly to the provider client:

```ruby
class MyAgent < Riffer::Agent
  model 'openai/gpt-4o'

  # Override API key for this agent only
  provider_options api_key: ENV['CUSTOM_OPENAI_KEY']
end
```

### model_options

Pass options to each LLM request:

```ruby
class MyAgent < Riffer::Agent
  model 'openai/gpt-4o'

  # These options are sent with every generate/stream call
  model_options temperature: 0.7, reasoning: 'medium'
end
```

## Common Model Options

### OpenAI

| Option        | Description                                      |
| ------------- | ------------------------------------------------ |
| `temperature` | Sampling temperature (0.0-2.0)                   |
| `max_tokens`  | Maximum tokens in response                       |
| `top_p`       | Nucleus sampling parameter                       |
| `reasoning`   | Reasoning effort level (`low`, `medium`, `high`) |

```ruby
class MyAgent < Riffer::Agent
  model 'openai/gpt-4o'
  model_options temperature: 0.7, reasoning: 'medium'
end
```

### Amazon Bedrock

| Option        | Description                |
| ------------- | -------------------------- |
| `temperature` | Sampling temperature       |
| `max_tokens`  | Maximum tokens in response |
| `top_p`       | Nucleus sampling parameter |
| `top_k`       | Top-k sampling parameter   |

```ruby
class MyAgent < Riffer::Agent
  model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'
  model_options temperature: 0.7, max_tokens: 4096
end
```

### Anthropic

| Option        | Description                                 |
| ------------- | ------------------------------------------- |
| `temperature` | Sampling temperature                        |
| `max_tokens`  | Maximum tokens in response                  |
| `top_p`       | Nucleus sampling parameter                  |
| `top_k`       | Top-k sampling parameter                    |
| `thinking`    | Extended thinking config hash (Claude 3.7+) |

```ruby
class MyAgent < Riffer::Agent
  model 'anthropic/claude-3-5-sonnet-20241022'
  model_options temperature: 0.7, max_tokens: 4096
end

# With extended thinking (Claude 3.7+)
class ReasoningAgent < Riffer::Agent
  model 'anthropic/claude-3-7-sonnet-20250219'
  model_options thinking: {type: "enabled", budget_tokens: 10000}
end
```

## Environment Variables

Recommended pattern for managing credentials:

```ruby
# config/initializers/riffer.rb (Rails)
# or at application startup

Riffer.configure do |config|
  config.openai.api_key = ENV.fetch('OPENAI_API_KEY') { raise 'OPENAI_API_KEY not set' }

  if ENV['BEDROCK_REGION']
    config.amazon_bedrock.region = ENV['BEDROCK_REGION']
    config.amazon_bedrock.api_token = ENV['BEDROCK_API_TOKEN']
  end
end
```

## Multiple Configurations

For different environments or use cases, use agent-level overrides:

```ruby
class ProductionAgent < Riffer::Agent
  model 'openai/gpt-4o'
  provider_options api_key: ENV['PRODUCTION_OPENAI_KEY']
end

class DevelopmentAgent < Riffer::Agent
  model 'openai/gpt-4o-mini'
  provider_options api_key: ENV['DEV_OPENAI_KEY']
  model_options temperature: 0.0 # Deterministic for testing
end
```
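One way to pick between such agents at runtime is an ordinary environment check. A sketch with stand-in classes (`APP_ENV` is an illustrative variable name, not a riffer feature):

```ruby
# Stand-ins for the agent classes defined above.
class ProductionAgent; end
class DevelopmentAgent; end

# Select the agent class from an environment variable (illustrative).
env = ENV.fetch('APP_ENV', 'development')
agent_class = env == 'production' ? ProductionAgent : DevelopmentAgent
```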
@@ -0,0 +1,168 @@ data/docs_providers/01_PROVIDERS.md
# Providers Overview

Providers are adapters that connect Riffer to LLM services. They implement a common interface for text generation and streaming.

## Available Providers

| Provider       | Identifier       | Gem Required             |
| -------------- | ---------------- | ------------------------ |
| OpenAI         | `openai`         | `openai`                 |
| Amazon Bedrock | `amazon_bedrock` | `aws-sdk-bedrockruntime` |
| Anthropic      | `anthropic`      | `anthropic`              |
| Test           | `test`           | None                     |

## Model String Format

Agents specify providers using the `provider/model` format:

```ruby
class MyAgent < Riffer::Agent
  model 'openai/gpt-4o'                                          # OpenAI
  model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0' # Bedrock
  model 'anthropic/claude-3-5-sonnet-20241022'                   # Anthropic
  model 'test/any'                                               # Test provider
end
```

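The split falls on the first `/` only, so Bedrock model IDs (which contain dots and colons but no further slashes) pass through intact. A conceptual sketch, not the gem's actual parsing code:

```ruby
# Split a model string into provider and model on the first "/" only.
provider, model = 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'.split('/', 2)
provider # => "amazon_bedrock"
model    # => "anthropic.claude-3-sonnet-20240229-v1:0"
```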
## Provider Interface

All providers inherit from `Riffer::Providers::Base` and implement:

### generate_text

Generates a response synchronously:

```ruby
provider = Riffer::Providers::OpenAI.new(api_key: "...")

response = provider.generate_text(
  prompt: "Hello!",
  model: "gpt-4o"
)
# => Riffer::Messages::Assistant

# Or with messages
response = provider.generate_text(
  messages: [Riffer::Messages::User.new("Hello!")],
  model: "gpt-4o"
)
```

### stream_text

Streams a response as an Enumerator:

```ruby
provider.stream_text(prompt: "Tell me a story", model: "gpt-4o").each do |event|
  case event
  when Riffer::StreamEvents::TextDelta
    print event.content
  end
end
```

## Method Parameters

| Parameter   | Description                                               |
| ----------- | --------------------------------------------------------- |
| `prompt`    | String prompt (required if `messages` not provided)       |
| `system`    | Optional system message string                            |
| `messages`  | Array of message objects/hashes (alternative to `prompt`) |
| `model`     | Model name string                                         |
| `tools`     | Array of Tool classes                                     |
| `**options` | Provider-specific options                                 |

You must provide either `prompt` or `messages`, but not both.

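A guard of the kind this rule implies can be sketched generically (illustrative; not the gem's actual validation code or error classes):

```ruby
# Raise unless exactly one of prompt/messages is supplied.
def check_prompt_or_messages!(prompt: nil, messages: nil)
  if prompt && messages
    raise ArgumentError, 'pass either prompt: or messages:, not both'
  elsif prompt.nil? && messages.nil?
    raise ArgumentError, 'pass prompt: or messages:'
  end
end

check_prompt_or_messages!(prompt: 'Hello!') # passes
```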
## Using Providers Directly

While agents abstract provider usage, you can use providers directly:

```ruby
require 'riffer'

Riffer.configure do |config|
  config.openai.api_key = ENV['OPENAI_API_KEY']
end

provider = Riffer::Providers::OpenAI.new

# Simple prompt
response = provider.generate_text(
  prompt: "What is Ruby?",
  model: "gpt-4o"
)
puts response.content

# With system message
response = provider.generate_text(
  prompt: "Explain recursion",
  system: "You are a programming tutor. Use simple language.",
  model: "gpt-4o"
)

# With message history
messages = [
  Riffer::Messages::System.new("You are helpful."),
  Riffer::Messages::User.new("Hi!"),
  Riffer::Messages::Assistant.new("Hello!"),
  Riffer::Messages::User.new("How are you?")
]

response = provider.generate_text(
  messages: messages,
  model: "gpt-4o"
)
```

## Tool Support

Providers convert tools to their native format:

```ruby
class WeatherTool < Riffer::Tool
  description "Gets weather"

  params do
    required :city, String
  end

  def call(context:, city:)
    "Sunny in #{city}"
  end
end

response = provider.generate_text(
  prompt: "What's the weather in Tokyo?",
  model: "gpt-4o",
  tools: [WeatherTool]
)

if response.tool_calls.any?
  # Handle tool calls
end
```

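"Handle tool calls" typically means: look up each requested tool, invoke it with the parsed arguments, and feed the results back as tool messages in a follow-up request. Sketched here with plain Ruby stand-ins, since the shape of the real `tool_calls` objects is provider-specific (the accessor and hash key names below are illustrative):

```ruby
# Stand-in for a provider tool call (illustrative shape).
ToolCall = Struct.new(:id, :name, :arguments)

# Map tool names to callables; a real loop would dispatch to Tool classes.
TOOLS = { 'weather_tool' => ->(city:) { "Sunny in #{city}" } }

def run_tool_calls(tool_calls)
  tool_calls.map do |call|
    result = TOOLS.fetch(call.name).call(**call.arguments)
    { tool_call_id: call.id, content: result }
  end
end

run_tool_calls([ToolCall.new('1', 'weather_tool', { city: 'Tokyo' })])
# => [{tool_call_id: "1", content: "Sunny in Tokyo"}]
```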
## Provider Registry

Riffer uses a registry to find providers by identifier:

```ruby
Riffer::Providers::Repository.find(:openai)
# => Riffer::Providers::OpenAI

Riffer::Providers::Repository.find(:amazon_bedrock)
# => Riffer::Providers::AmazonBedrock

Riffer::Providers::Repository.find(:anthropic)
# => Riffer::Providers::Anthropic

Riffer::Providers::Repository.find(:test)
# => Riffer::Providers::Test
```

## Provider-Specific Guides

- [Amazon Bedrock](02_AMAZON_BEDROCK.md) - Claude and other models via AWS
- [Anthropic](03_ANTHROPIC.md) - Claude models via Anthropic API
- [OpenAI](04_OPENAI.md) - GPT models
- [Test](05_TEST_PROVIDER.md) - Mock provider for testing
- [Custom Providers](06_CUSTOM_PROVIDERS.md) - Creating your own provider
@@ -0,0 +1,196 @@ data/docs_providers/02_AMAZON_BEDROCK.md
# Amazon Bedrock Provider

The Amazon Bedrock provider connects to AWS Bedrock for Claude and other foundation models.

## Installation

Add the AWS SDK gem to your Gemfile:

```ruby
gem 'aws-sdk-bedrockruntime'
```

## Configuration

### IAM Authentication (Recommended)

Configure your AWS credentials using standard AWS methods (environment variables, IAM roles, etc.):

```ruby
Riffer.configure do |config|
  config.amazon_bedrock.region = 'us-east-1'
end
```

### Bearer Token Authentication

For API token authentication:

```ruby
Riffer.configure do |config|
  config.amazon_bedrock.region = 'us-east-1'
  config.amazon_bedrock.api_token = ENV['BEDROCK_API_TOKEN']
end
```

Or per-agent:

```ruby
class MyAgent < Riffer::Agent
  model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'
  provider_options region: 'us-west-2', api_token: ENV['BEDROCK_API_TOKEN']
end
```

## Supported Models

Use Bedrock model IDs in the `amazon_bedrock/model` format:

```ruby
# Claude models
model 'amazon_bedrock/anthropic.claude-3-opus-20240229-v1:0'
model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'
model 'amazon_bedrock/anthropic.claude-3-haiku-20240307-v1:0'

# Other foundation models available in Bedrock
model 'amazon_bedrock/amazon.titan-text-express-v1'
```

## Model Options

### temperature

Controls randomness:

```ruby
model_options temperature: 0.7
```

### max_tokens

Maximum tokens in response:

```ruby
model_options max_tokens: 4096
```

### top_p

Nucleus sampling parameter:

```ruby
model_options top_p: 0.95
```

### top_k

Top-k sampling parameter:

```ruby
model_options top_k: 250
```

## Example

```ruby
Riffer.configure do |config|
  config.amazon_bedrock.region = 'us-east-1'
end

class AssistantAgent < Riffer::Agent
  model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'
  instructions 'You are a helpful assistant.'
  model_options temperature: 0.7, max_tokens: 4096
end

agent = AssistantAgent.new
puts agent.generate("Explain cloud computing")
```

## Streaming

```ruby
agent.stream("Tell me about AWS services").each do |event|
  case event
  when Riffer::StreamEvents::TextDelta
    print event.content
  when Riffer::StreamEvents::TextDone
    puts "\n[Complete]"
  when Riffer::StreamEvents::ToolCallDone
    puts "[Tool: #{event.name}]"
  end
end
```

## Tool Calling

The Bedrock provider converts tools to Bedrock's `tool_config` format:

```ruby
class S3ListTool < Riffer::Tool
  description "Lists objects in an S3 bucket"

  params do
    required :bucket, String, description: "The S3 bucket name"
    optional :prefix, String, description: "Object prefix filter"
  end

  def call(context:, bucket:, prefix: nil)
    # Implementation
    "Found 10 objects in #{bucket}"
  end
end

class AWSAgent < Riffer::Agent
  model 'amazon_bedrock/anthropic.claude-3-sonnet-20240229-v1:0'
  uses_tools [S3ListTool]
end
```

## Message Format

The provider converts Riffer messages to Bedrock format:

| Riffer Message | Bedrock Format                                  |
| -------------- | ----------------------------------------------- |
| `System`       | Added to `system` array as `{text: ...}`        |
| `User`         | `{role: "user", content: [{text: ...}]}`        |
| `Assistant`    | `{role: "assistant", content: [...]}`           |
| `Tool`         | `{role: "user", content: [{tool_result: ...}]}` |

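The table above, expressed as a rough sketch (illustrative only; the provider's real converter operates on Riffer message objects rather than plain hashes):

```ruby
# Convert simple role/content hashes into Bedrock Converse-style input:
# system messages collect into a separate array, tool results ride along
# as user-role content blocks.
def to_bedrock(messages)
  system = []
  converse = []
  messages.each do |m|
    case m[:role]
    when 'system'    then system << { text: m[:content] }
    when 'user'      then converse << { role: 'user', content: [{ text: m[:content] }] }
    when 'assistant' then converse << { role: 'assistant', content: [{ text: m[:content] }] }
    when 'tool'      then converse << { role: 'user', content: [{ tool_result: m[:content] }] }
    end
  end
  { system: system, messages: converse }
end
```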
## Direct Provider Usage

```ruby
provider = Riffer::Providers::AmazonBedrock.new(
  region: 'us-east-1',
  api_token: ENV['BEDROCK_API_TOKEN'] # Optional
)

response = provider.generate_text(
  prompt: "Hello!",
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  temperature: 0.7
)

puts response.content
```

## AWS IAM Permissions

Ensure your IAM role/user has the following permissions:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:*::foundation-model/*"
    }
  ]
}
```