rails_ai 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +7 -0
- data/.rspec_status +96 -0
- data/AGENT_GUIDE.md +513 -0
- data/Appraisals +49 -0
- data/COMMERCIAL_LICENSE_TEMPLATE.md +92 -0
- data/FEATURES.md +204 -0
- data/LEGAL_PROTECTION_GUIDE.md +222 -0
- data/LICENSE +62 -0
- data/LICENSE_SUMMARY.md +74 -0
- data/MIT-LICENSE +62 -0
- data/PERFORMANCE.md +300 -0
- data/PROVIDERS.md +495 -0
- data/README.md +454 -0
- data/Rakefile +11 -0
- data/SPEED_OPTIMIZATIONS.md +217 -0
- data/STRUCTURE.md +139 -0
- data/USAGE_GUIDE.md +288 -0
- data/app/channels/ai_stream_channel.rb +33 -0
- data/app/components/ai/prompt_component.rb +25 -0
- data/app/controllers/concerns/ai/context_aware.rb +77 -0
- data/app/controllers/concerns/ai/streaming.rb +41 -0
- data/app/helpers/ai_helper.rb +164 -0
- data/app/jobs/ai/generate_embedding_job.rb +25 -0
- data/app/jobs/ai/generate_summary_job.rb +25 -0
- data/app/models/concerns/ai/embeddable.rb +38 -0
- data/app/views/rails_ai/dashboard/index.html.erb +51 -0
- data/config/routes.rb +19 -0
- data/lib/generators/rails_ai/install/install_generator.rb +38 -0
- data/lib/rails_ai/agents/agent_manager.rb +258 -0
- data/lib/rails_ai/agents/agent_team.rb +243 -0
- data/lib/rails_ai/agents/base_agent.rb +331 -0
- data/lib/rails_ai/agents/collaboration.rb +238 -0
- data/lib/rails_ai/agents/memory.rb +116 -0
- data/lib/rails_ai/agents/message_bus.rb +95 -0
- data/lib/rails_ai/agents/specialized_agents.rb +391 -0
- data/lib/rails_ai/agents/task_queue.rb +111 -0
- data/lib/rails_ai/cache.rb +14 -0
- data/lib/rails_ai/config.rb +40 -0
- data/lib/rails_ai/context.rb +7 -0
- data/lib/rails_ai/context_analyzer.rb +86 -0
- data/lib/rails_ai/engine.rb +48 -0
- data/lib/rails_ai/events.rb +9 -0
- data/lib/rails_ai/image_context.rb +110 -0
- data/lib/rails_ai/performance.rb +231 -0
- data/lib/rails_ai/provider.rb +8 -0
- data/lib/rails_ai/providers/anthropic_adapter.rb +256 -0
- data/lib/rails_ai/providers/base.rb +60 -0
- data/lib/rails_ai/providers/dummy_adapter.rb +29 -0
- data/lib/rails_ai/providers/gemini_adapter.rb +509 -0
- data/lib/rails_ai/providers/openai_adapter.rb +535 -0
- data/lib/rails_ai/providers/secure_anthropic_adapter.rb +206 -0
- data/lib/rails_ai/providers/secure_openai_adapter.rb +284 -0
- data/lib/rails_ai/railtie.rb +48 -0
- data/lib/rails_ai/redactor.rb +12 -0
- data/lib/rails_ai/security/api_key_manager.rb +82 -0
- data/lib/rails_ai/security/audit_logger.rb +46 -0
- data/lib/rails_ai/security/error_handler.rb +62 -0
- data/lib/rails_ai/security/input_validator.rb +176 -0
- data/lib/rails_ai/security/secure_file_handler.rb +45 -0
- data/lib/rails_ai/security/secure_http_client.rb +177 -0
- data/lib/rails_ai/security.rb +0 -0
- data/lib/rails_ai/version.rb +5 -0
- data/lib/rails_ai/window_context.rb +103 -0
- data/lib/rails_ai.rb +502 -0
- data/monitoring/ci_setup_guide.md +214 -0
- data/monitoring/enhanced_monitoring_script.rb +237 -0
- data/monitoring/google_alerts_setup.md +42 -0
- data/monitoring_log_20250921.txt +0 -0
- data/monitoring_script.rb +161 -0
- data/rails_ai.gemspec +54 -0
- data/scripts/security_scanner.rb +353 -0
- data/setup_monitoring.sh +163 -0
- data/wiki/API-Documentation.md +734 -0
- data/wiki/Architecture-Overview.md +672 -0
- data/wiki/Contributing-Guide.md +407 -0
- data/wiki/Development-Setup.md +532 -0
- data/wiki/Home.md +278 -0
- data/wiki/Installation-Guide.md +527 -0
- data/wiki/Quick-Start.md +186 -0
- data/wiki/README.md +135 -0
- data/wiki/Release-Process.md +467 -0
- metadata +385 -0
data/PROVIDERS.md
ADDED
@@ -0,0 +1,495 @@
# AI Providers

Rails AI supports multiple AI providers through a flexible adapter system. This allows you to easily switch between different AI services or use multiple providers in the same application.

## 🚀 Supported Providers

### OpenAI
- **Provider Key**: `:openai`
- **Models**: GPT-5, GPT-4o, GPT-4, GPT-3.5, DALL-E 3, DALL-E 2, Sora, Whisper, TTS
- **Features**: Text generation, image generation, video generation, audio processing, embeddings
- **API Key**: `OPENAI_API_KEY`
- **Dependency**: None! (Uses direct API calls)
- **Best for**: Latest AI models, comprehensive multimodal capabilities

### Anthropic (Claude)
- **Provider Key**: `:anthropic`
- **Models**: Claude 3.5 Sonnet, Claude 3 Sonnet, Claude 3 Haiku, Claude 3 Opus
- **Features**: Text generation, image analysis (Claude 3 Vision)
- **API Key**: `ANTHROPIC_API_KEY`
- **Dependency**: None! (Uses direct API calls)
- **Best for**: Code analysis, text generation, reasoning tasks

### Google Gemini
- **Provider Key**: `:gemini`
- **Models**: Gemini 2.0 Flash, Gemini 1.5 Pro, Gemini 1.5 Flash, Gemini 1.0 Pro
- **Features**: Text generation, image generation, image analysis, video generation, audio processing, embeddings
- **API Key**: `GEMINI_API_KEY`
- **Dependency**: None! (Uses direct API calls)
- **Best for**: Multimodal analysis, Google ecosystem integration, latest AI capabilities

### Dummy (Testing)
- **Provider Key**: `:dummy`
- **Models**: Mock models for testing
- **Features**: All operations with mock responses
- **API Key**: Not required
- **Best for**: Testing and development

## ⚙️ Configuration

### Basic Configuration

```ruby
# config/initializers/rails_ai.rb
RailsAi.configure do |config|
  config.provider = :openai # or :gemini, :anthropic, :dummy
  config.default_model = "gpt-5" # or "gemini-2.0-flash-exp", "claude-3-5-sonnet-20241022"
end
```
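
If you want a different provider per environment, the same configuration hooks can branch on `Rails.env`. A minimal sketch, assuming the `provider`, `api_key`, and `default_model` settings shown in this guide:

```ruby
# config/initializers/rails_ai.rb
# Sketch: pick the dummy provider in tests and a real provider elsewhere.
RailsAi.configure do |config|
  if Rails.env.test?
    config.provider = :dummy               # no API key required
  else
    config.provider      = :openai
    config.api_key       = ENV["OPENAI_API_KEY"]
    config.default_model = "gpt-4o"
  end
end
```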

### Provider-Specific Configuration

```ruby
# OpenAI configuration (no gem required!)
RailsAi.configure do |config|
  config.provider = :openai
  config.default_model = "gpt-5" # Latest GPT-5 model!
  config.api_key = ENV['OPENAI_API_KEY']
end

# Anthropic configuration (no gem required!)
RailsAi.configure do |config|
  config.provider = :anthropic
  config.default_model = "claude-3-5-sonnet-20241022" # Latest Claude 3.5 Sonnet!
  config.api_key = ENV['ANTHROPIC_API_KEY']
end

# Gemini configuration (no gem required!)
RailsAi.configure do |config|
  config.provider = :gemini
  config.default_model = "gemini-2.0-flash-exp"
  config.api_key = ENV['GEMINI_API_KEY']
end
```

### Environment Variables

```bash
# .env
OPENAI_API_KEY=your_openai_api_key_here
ANTHROPIC_API_KEY=your_anthropic_api_key_here
GEMINI_API_KEY=your_gemini_api_key_here
```
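
The adapters read these keys from `ENV` by default. If you prefer Rails encrypted credentials over a `.env` file, you can assign `config.api_key` yourself — a sketch, assuming a hypothetical `openai.api_key` credentials entry and the `config.api_key` setting shown above:

```ruby
# config/initializers/rails_ai.rb
# Sketch: read the key from encrypted credentials, falling back to ENV.
RailsAi.configure do |config|
  config.provider = :openai
  config.api_key  = Rails.application.credentials.dig(:openai, :api_key) ||
                    ENV["OPENAI_API_KEY"]
end
```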

## 🔄 Switching Providers

### Runtime Switching

```ruby
# Switch to OpenAI (latest models!)
RailsAi.configure { |c| c.provider = :openai }
result = RailsAi.chat("Hello", model: "gpt-5")

# Switch to Anthropic (latest Claude!)
RailsAi.configure { |c| c.provider = :anthropic }
result = RailsAi.chat("Hello", model: "claude-3-5-sonnet-20241022")

# Switch to Gemini
RailsAi.configure { |c| c.provider = :gemini }
result = RailsAi.chat("Hello", model: "gemini-2.0-flash-exp")
```

### Per-Request Switching

```ruby
# Use a specific provider for each request
RailsAi.configure { |c| c.provider = :openai }
openai_result = RailsAi.chat("Generate code", model: "gpt-5")

RailsAi.configure { |c| c.provider = :anthropic }
anthropic_result = RailsAi.chat("Analyze this code", model: "claude-3-5-sonnet-20241022")

RailsAi.configure { |c| c.provider = :gemini }
gemini_result = RailsAi.chat("Explain this code", model: "gemini-2.0-flash-exp")
```
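
Note that each call above rewrites the global configuration, so concurrent requests could see another request's provider. A hypothetical helper (not part of the gem) that restores the previous provider after a block might look like this:

```ruby
# Hypothetical helper, not provided by rails_ai: switch providers for one block only.
# Uses RailsAi.config.provider, shown in the Debug Mode section below.
def with_provider(name)
  previous = RailsAi.config.provider
  RailsAi.configure { |c| c.provider = name }
  yield
ensure
  RailsAi.configure { |c| c.provider = previous }
end

with_provider(:anthropic) do
  RailsAi.chat("Analyze this code", model: "claude-3-5-sonnet-20241022")
end
```

Because the configuration is still process-global, this only narrows the window; it does not make switching thread-safe.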

## 📊 Provider Capabilities

| Feature | OpenAI | Anthropic | Gemini | Dummy |
|---------|--------|-----------|--------|-------|
| **Text Generation** | ✅ | ✅ | ✅ | ✅ |
| **Text Streaming** | ✅ | ✅ | ✅ | ✅ |
| **Image Generation** | ✅ | ❌ | ✅ | ✅ |
| **Image Analysis** | ✅ | ✅ | ✅ | ✅ |
| **Video Generation** | ✅ | ❌ | ✅ | ✅ |
| **Audio Generation** | ✅ | ❌ | ✅ | ✅ |
| **Audio Transcription** | ✅ | ❌ | ✅ | ✅ |
| **Embeddings** | ✅ | ⚠️ | ✅ | ✅ |

**Legend:**
- ✅ Fully supported
- ❌ Not supported (raises helpful error)
- ⚠️ Limited support (workaround available)
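
Since unsupported operations raise an error (it surfaces as `NotImplementedError`, as shown in Troubleshooting below), callers can guard mixed-provider code with a rescue. A minimal sketch:

```ruby
# Sketch: degrade gracefully when the active provider lacks a capability.
def image_or_nil(prompt)
  RailsAi.generate_image(prompt, model: "dall-e-3")
rescue NotImplementedError => e
  Rails.logger.info("#{RailsAi.config.provider} cannot generate images: #{e.message}")
  nil
end
```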

## 🎯 Provider-Specific Features

### OpenAI Features (Latest Models!)

```ruby
# Text generation with latest models
RailsAi.chat("Write a blog post", model: "gpt-5")  # GPT-5!
RailsAi.chat("Write a blog post", model: "gpt-4o") # GPT-4o
RailsAi.chat("Write a blog post", model: "gpt-4")  # GPT-4

# Image generation with DALL-E 3
RailsAi.generate_image("A sunset over mountains", model: "dall-e-3")

# Video generation with Sora
RailsAi.generate_video("A cat playing with a ball", model: "sora")

# Audio processing
RailsAi.generate_speech("Hello world", voice: "alloy")
RailsAi.transcribe_audio(audio_file)

# Embeddings
RailsAi.embed(["Ruby on Rails", "Django"])

# Advanced configuration
RailsAi.chat("Write code",
  model: "gpt-5",
  temperature: 0.7,
  top_p: 0.9,
  frequency_penalty: 0.1,
  presence_penalty: 0.1
)
```

### Anthropic (Claude) Features (Latest Models!)

```ruby
# Text generation with latest Claude models
RailsAi.chat("Write a blog post", model: "claude-3-5-sonnet-20241022") # Latest Claude 3.5 Sonnet!
RailsAi.chat("Write a blog post", model: "claude-3-sonnet-20240229")   # Claude 3 Sonnet
RailsAi.chat("Write a blog post", model: "claude-3-haiku-20240307")    # Claude 3 Haiku

# Image analysis with Claude 3 Vision
RailsAi.analyze_image(image_file, "What do you see?", model: "claude-3-5-sonnet-20241022")

# Streaming with Claude
RailsAi.stream("Write a long story", model: "claude-3-5-sonnet-20241022") do |token|
  puts token
end

# Advanced configuration
RailsAi.chat("Write code",
  model: "claude-3-5-sonnet-20241022",
  temperature: 0.7,
  top_p: 0.9,
  top_k: 40
)
```

### Google Gemini Features

```ruby
# Text generation with Gemini models
RailsAi.chat("Write a blog post", model: "gemini-2.0-flash-exp")

# Image generation with Gemini 2.0 Flash
RailsAi.generate_image("A sunset over mountains", model: "gemini-2.0-flash-exp")

# Image analysis with Gemini Vision
RailsAi.analyze_image(image_file, "What do you see?", model: "gemini-2.0-flash-exp")

# Video generation with Gemini 2.0 Flash
RailsAi.generate_video("A cat playing with a ball", model: "gemini-2.0-flash-exp")

# Audio processing with Gemini 2.0 Flash
RailsAi.generate_speech("Hello world", model: "gemini-2.0-flash-exp")
RailsAi.transcribe_audio(audio_file, model: "gemini-2.0-flash-exp")

# Streaming with Gemini
RailsAi.stream("Write a long story") do |token|
  puts token
end

# Advanced configuration with Gemini
RailsAi.chat("Write code",
  model: "gemini-2.0-flash-exp",
  temperature: 0.7,
  top_p: 0.8,
  top_k: 40
)
```

## 🔧 Advanced Usage

### Custom Provider Selection

```ruby
# Select provider based on operation type
def smart_ai_operation(prompt, operation_type, image: nil)
  case operation_type
  when :text_generation
    RailsAi.configure { |c| c.provider = :openai }
    RailsAi.chat(prompt, model: "gpt-5") # Use latest GPT-5!
  when :code_analysis
    RailsAi.configure { |c| c.provider = :anthropic }
    RailsAi.chat(prompt, model: "claude-3-5-sonnet-20241022") # Use latest Claude!
  when :image_generation
    RailsAi.configure { |c| c.provider = :openai }
    RailsAi.generate_image(prompt, model: "dall-e-3")
  when :image_analysis
    RailsAi.configure { |c| c.provider = :gemini }
    RailsAi.analyze_image(image, prompt)
  end
end
```

### Fallback Providers

```ruby
def robust_ai_operation(prompt)
  providers = [:openai, :anthropic, :gemini] # All with latest models!

  providers.each do |provider|
    begin
      RailsAi.configure { |c| c.provider = provider }
      return RailsAi.chat(prompt)
    rescue => e
      Rails.logger.warn("#{provider} failed: #{e.message}")
      next
    end
  end

  raise "All providers failed"
end
```

### Provider-Specific Models

```ruby
# OpenAI models (latest and most capable)
openai_models = {
  text: "gpt-5",                      # Latest GPT-5!
  text_advanced: "gpt-4o",            # GPT-4o
  text_standard: "gpt-4",             # GPT-4
  image: "dall-e-3",                  # DALL-E 3
  video: "sora",                      # Sora video generation
  audio: "tts-1",                     # Text-to-speech
  embedding: "text-embedding-3-small" # Embeddings
}

# Anthropic models (latest and most capable)
anthropic_models = {
  text: "claude-3-5-sonnet-20241022",        # Latest Claude 3.5 Sonnet!
  text_standard: "claude-3-sonnet-20240229", # Claude 3 Sonnet
  text_fast: "claude-3-haiku-20240307",      # Claude 3 Haiku
  vision: "claude-3-5-sonnet-20241022"       # Vision capabilities
}

# Gemini models (latest and most capable)
gemini_models = {
  text: "gemini-2.0-flash-exp",   # Latest Gemini 2.0 Flash!
  vision: "gemini-2.0-flash-exp", # Vision capabilities
  fast: "gemini-1.5-flash",       # Fast responses
  pro: "gemini-1.5-pro"           # Pro version
}
```
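
One way to put these tables to work is a small lookup that resolves a model for a (provider, task) pair. A minimal sketch using the same model identifiers; the `MODELS` constant and `model_for` helper are illustrative, not part of the gem:

```ruby
# Sketch: resolve a model name from a (provider, task) pair.
MODELS = {
  openai:    { text: "gpt-5", image: "dall-e-3", embedding: "text-embedding-3-small" },
  anthropic: { text: "claude-3-5-sonnet-20241022", vision: "claude-3-5-sonnet-20241022" },
  gemini:    { text: "gemini-2.0-flash-exp", fast: "gemini-1.5-flash" }
}.freeze

def model_for(provider, task)
  MODELS.fetch(provider).fetch(task) # raises KeyError on unknown combinations
end

RailsAi.configure { |c| c.provider = :anthropic }
RailsAi.chat("Review this migration", model: model_for(:anthropic, :text))
```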

## 🧪 Testing with Providers

### Using Dummy Provider

```ruby
# In your tests
RSpec.describe "AI Features" do
  before do
    RailsAi.configure do |config|
      config.provider = :dummy
      config.stub_responses = true
    end
  end

  it "generates content" do
    result = RailsAi.chat("Hello")
    expect(result).to include("Hello")
  end
end
```
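
To apply the same setup to an entire suite instead of one example group, the configuration can live in `spec/rails_helper.rb`. A sketch, assuming `stub_responses` behaves as in the example above:

```ruby
# spec/rails_helper.rb (sketch)
RSpec.configure do |rspec|
  rspec.before(:suite) do
    RailsAi.configure do |config|
      config.provider       = :dummy
      config.stub_responses = true
    end
  end
end
```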

### Testing Multiple Providers

```ruby
RSpec.describe "Provider Compatibility" do
  %i[openai anthropic gemini dummy].each do |provider|
    context "with #{provider} provider" do
      before do
        RailsAi.configure { |c| c.provider = provider }
      end

      it "generates text" do
        result = RailsAi.chat("Test")
        expect(result).to be_a(String)
      end
    end
  end
end
```

## 🚀 Adding New Providers

### 1. Create Provider Class

```ruby
# lib/rails_ai/providers/my_provider.rb
module RailsAi
  module Providers
    class MyProvider < Base
      def initialize
        @api_key = ENV.fetch("MY_API_KEY")
        super
      end

      def chat!(messages:, model:, **opts)
        # Implementation using direct API calls
      end

      def generate_image!(prompt:, model:, **opts)
        # Implementation
      end
    end
  end
end
```
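
For orientation only, here is a hedged sketch of what a `chat!` body could look like against a hypothetical JSON API. The endpoint, payload, and response shape are invented for illustration; a real adapter must follow the target provider's documented API and the conventions of the existing adapters in `lib/rails_ai/providers/`:

```ruby
# Sketch only — hypothetical endpoint and response format.
require "net/http"
require "json"

def chat!(messages:, model:, **opts)
  uri = URI("https://api.example-provider.com/v1/chat") # placeholder URL
  request = Net::HTTP::Post.new(uri)
  request["Authorization"] = "Bearer #{@api_key}"
  request["Content-Type"]  = "application/json"
  request.body = { model: model, messages: messages }.merge(opts).to_json

  response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
  JSON.parse(response.body).dig("output", "text") # shape depends on the real API
end
```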

### 2. Add to Provider Selection

```ruby
# lib/rails_ai.rb
def provider
  @provider ||= Performance::LazyProvider.new do
    case config.provider.to_sym
    when :openai then Providers::OpenAIAdapter.new
    when :anthropic then Providers::AnthropicAdapter.new
    when :gemini then Providers::GeminiAdapter.new
    when :my_provider then Providers::MyProvider.new
    when :dummy then Providers::DummyAdapter.new
    else Providers::DummyAdapter.new
    end
  end
end
```
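
Once registered, the new provider is selected like any built-in one:

```ruby
RailsAi.configure { |c| c.provider = :my_provider }
RailsAi.chat("Hello from MyProvider", model: "my-default-model") # model name is illustrative
```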

### 3. Add Tests

```ruby
# spec/providers/my_provider_spec.rb
RSpec.describe RailsAi::Providers::MyProvider do
  # Test implementation
end
```

## 📚 Provider Documentation

### OpenAI
- [OpenAI API Documentation](https://platform.openai.com/docs)
- [OpenAI Models](https://platform.openai.com/docs/models)

### Anthropic
- [Anthropic API Documentation](https://docs.anthropic.com/)
- [Anthropic Models](https://docs.anthropic.com/en/docs/models)

### Google Gemini
- [Gemini API Documentation](https://ai.google.dev/docs)
- [Gemini API Reference](https://ai.google.dev/api/rest)

## 🔍 Troubleshooting

### Common Issues

#### Provider Not Found
```ruby
# Error: Unknown provider
# Solution: Check provider name
RailsAi.configure { |c| c.provider = :openai } # Correct
RailsAi.configure { |c| c.provider = :OpenAI } # Wrong
```

#### API Key Missing
```bash
# Error: API key not found
# Solution: Set environment variable
export OPENAI_API_KEY=your_key_here
export ANTHROPIC_API_KEY=your_key_here
export GEMINI_API_KEY=your_key_here
```

#### Unsupported Operation
```ruby
# Error: Operation not supported
# Solution: Check provider capabilities
RailsAi.configure { |c| c.provider = :anthropic }
RailsAi.generate_image("test") # Will raise NotImplementedError
```

### Debug Mode

```ruby
# Enable debug logging
Rails.logger.level = :debug

# Check current provider
RailsAi.provider.class
# => RailsAi::Providers::OpenAIAdapter

# Check configuration
RailsAi.config.provider
# => :openai
```

## 🎯 Best Practices

### Provider Selection Strategy

1. **OpenAI** - Best for latest models (GPT-5!), comprehensive capabilities
2. **Anthropic** - Best for code analysis, reasoning tasks (Claude 3.5 Sonnet!)
3. **Gemini** - Best for multimodal tasks, Google ecosystem integration
4. **Dummy** - Best for testing and development

### Performance Optimization

```ruby
# Use appropriate models for tasks
RailsAi.configure { |c| c.provider = :openai }
RailsAi.chat("Quick response", model: "gpt-4o")  # Fast
RailsAi.chat("Complex analysis", model: "gpt-5") # Latest and most advanced

# Use Claude for code analysis
RailsAi.configure { |c| c.provider = :anthropic }
RailsAi.chat("Analyze this code", model: "claude-3-5-sonnet-20241022")

# Cache expensive operations
RailsAi.generate_image("Complex image", model: "dall-e-3")
```

### Error Handling

```ruby
def safe_ai_operation(prompt)
  RailsAi.chat(prompt, model: "gpt-5")
rescue => e
  Rails.logger.error("AI operation failed: #{e.message}")
  "Sorry, I'm having trouble right now."
end
```

---

**Rails AI supports multiple providers for maximum flexibility!**

**All providers now offer zero-dependency access to the latest AI models!** ✨

**GPT-5, Claude 3.5 Sonnet, Gemini 2.0 Flash, and Sora are all available with direct API calls!** 🎉