rails_ai_promptable 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +7 -0
- data/.rspec +3 -0
- data/.rubocop.yml +8 -0
- data/CHANGELOG.md +5 -0
- data/CODE_OF_CONDUCT.md +132 -0
- data/LICENSE.txt +21 -0
- data/README.md +596 -0
- data/Rakefile +12 -0
- data/lib/generators/rails_ai_promptable/install_generator.rb +124 -0
- data/lib/generators/rails_ai_promptable/templates/POST_INSTALL +46 -0
- data/lib/generators/rails_ai_promptable/templates/rails_ai_promptable.rb.tt +42 -0
- data/lib/rails_ai_promptable/background_job.rb +38 -0
- data/lib/rails_ai_promptable/configuration.rb +82 -0
- data/lib/rails_ai_promptable/logger.rb +18 -0
- data/lib/rails_ai_promptable/promptable.rb +65 -0
- data/lib/rails_ai_promptable/providers/anthropic_provider.rb +55 -0
- data/lib/rails_ai_promptable/providers/azure_openai_provider.rb +65 -0
- data/lib/rails_ai_promptable/providers/base_provider.rb +15 -0
- data/lib/rails_ai_promptable/providers/cohere_provider.rb +52 -0
- data/lib/rails_ai_promptable/providers/gemini_provider.rb +55 -0
- data/lib/rails_ai_promptable/providers/mistral_provider.rb +52 -0
- data/lib/rails_ai_promptable/providers/openai_provider.rb +44 -0
- data/lib/rails_ai_promptable/providers/openrouter_provider.rb +60 -0
- data/lib/rails_ai_promptable/providers.rb +40 -0
- data/lib/rails_ai_promptable/template_registry.rb +73 -0
- data/lib/rails_ai_promptable/version.rb +5 -0
- data/lib/rails_ai_promptable.rb +28 -0
- data/sig/rails_ai_promptable.rbs +4 -0
- metadata +104 -0
data/README.md
ADDED

@@ -0,0 +1,596 @@
# Rails AI Promptable

A powerful, flexible gem for integrating AI capabilities into your Rails applications, with support for multiple AI providers including OpenAI, Anthropic (Claude), Google Gemini, Cohere, Azure OpenAI, Mistral AI, and OpenRouter.

[Gem Version](https://badge.fury.io/rb/rails_ai_promptable)
[Build Status](https://github.com/shoaibmalik786/rails_ai_promptable/actions)
## Features

- 🚀 **Multiple AI Providers**: OpenAI, Anthropic, Gemini, Cohere, Azure OpenAI, Mistral, OpenRouter
- 🔌 **Easy Integration**: Include in any Rails model or service
- 📝 **Template System**: Simple prompt templating with variable interpolation
- ⚡ **Background Processing**: Built-in ActiveJob support for async AI generation
- 🛠️ **Configurable**: Flexible configuration per provider
- 🔒 **Well Tested**: Full test coverage with RSpec
- 🎯 **Rails-First**: Designed specifically for Rails applications
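The `%{name}` placeholders shown throughout this README match Ruby's standard named-reference interpolation (`String#%` / `Kernel#format`), which appears to be the mechanism behind the template system. A minimal plain-Ruby sketch of how such a template is filled from a context hash:

```ruby
# Plain-Ruby illustration of %{} named-placeholder interpolation,
# independent of the gem itself.
template = "Summarize this article in 3 sentences: %{content}"

# String#% fills each %{key} placeholder from the matching hash key
prompt = template % { content: "Ruby 3.3 adds a new JIT." }

puts prompt
# => "Summarize this article in 3 sentences: Ruby 3.3 adds a new JIT."
```

Missing keys raise a `KeyError`, which makes typos in template variables easy to catch early.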
## Installation

Add this line to your application's Gemfile:

```ruby
gem 'rails_ai_promptable'
```

And then execute:

```bash
bundle install
```
## Quick Start

### 1. Generate Configuration (Recommended)

Run the installation generator to create the configuration file:

```bash
# Default: OpenAI
rails generate rails_ai_promptable:install

# Or specify a different provider
rails generate rails_ai_promptable:install --provider=anthropic
rails generate rails_ai_promptable:install --provider=gemini
rails generate rails_ai_promptable:install --provider=cohere
rails generate rails_ai_promptable:install --provider=azure_openai
rails generate rails_ai_promptable:install --provider=mistral
rails generate rails_ai_promptable:install --provider=openrouter
```

This will create `config/initializers/rails_ai_promptable.rb` with provider-specific configuration.

### 1. Configure Manually (Alternative)

Alternatively, create an initializer `config/initializers/rails_ai_promptable.rb` manually:

```ruby
RailsAIPromptable.configure do |config|
  config.provider = :openai
  config.api_key = ENV['OPENAI_API_KEY']
  config.default_model = 'gpt-4o-mini'
  config.timeout = 30
end
```
### 2. Include in Your Models

```ruby
class Article < ApplicationRecord
  include RailsAIPromptable::Promptable

  # Define a prompt template
  prompt_template "Summarize this article in 3 sentences: %{content}"

  def generate_summary
    ai_generate(context: { content: body })
  end
end
```

### 3. Use in Your Application

```ruby
article = Article.find(1)
summary = article.generate_summary
# => "This article discusses..."
```
## Supported Providers

### OpenAI

```ruby
RailsAIPromptable.configure do |config|
  config.provider = :openai
  config.api_key = ENV['OPENAI_API_KEY']
  config.default_model = 'gpt-4o-mini'
  config.openai_base_url = 'https://api.openai.com/v1' # optional, for custom endpoints
end
```

**Available Models**: `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`, `gpt-3.5-turbo`

### Anthropic (Claude)

```ruby
RailsAIPromptable.configure do |config|
  config.provider = :anthropic # or :claude
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
  config.default_model = 'claude-3-5-sonnet-20241022'
end
```

**Available Models**: `claude-3-5-sonnet-20241022`, `claude-3-opus-20240229`, `claude-3-sonnet-20240229`, `claude-3-haiku-20240307`

### Google Gemini

```ruby
RailsAIPromptable.configure do |config|
  config.provider = :gemini # or :google
  config.gemini_api_key = ENV['GEMINI_API_KEY']
  config.default_model = 'gemini-pro'
end
```

**Available Models**: `gemini-pro`, `gemini-pro-vision`, `gemini-1.5-pro`, `gemini-1.5-flash`

### Cohere

```ruby
RailsAIPromptable.configure do |config|
  config.provider = :cohere
  config.cohere_api_key = ENV['COHERE_API_KEY']
  config.default_model = 'command'
end
```

**Available Models**: `command`, `command-light`, `command-nightly`

### Azure OpenAI

```ruby
RailsAIPromptable.configure do |config|
  config.provider = :azure_openai # or :azure
  config.azure_api_key = ENV['AZURE_OPENAI_API_KEY']
  config.azure_base_url = ENV['AZURE_OPENAI_BASE_URL'] # e.g., https://your-resource.openai.azure.com
  config.azure_deployment_name = ENV['AZURE_OPENAI_DEPLOYMENT_NAME']
  config.azure_api_version = '2024-02-15-preview' # optional
end
```

### Mistral AI

```ruby
RailsAIPromptable.configure do |config|
  config.provider = :mistral
  config.mistral_api_key = ENV['MISTRAL_API_KEY']
  config.default_model = 'mistral-small-latest'
end
```

**Available Models**: `mistral-large-latest`, `mistral-small-latest`, `mistral-medium-latest`

### OpenRouter

```ruby
RailsAIPromptable.configure do |config|
  config.provider = :openrouter
  config.openrouter_api_key = ENV['OPENROUTER_API_KEY']
  config.openrouter_app_name = 'Your App Name' # optional, for tracking
  config.openrouter_site_url = 'https://yourapp.com' # optional, for attribution
  config.default_model = 'openai/gpt-3.5-turbo'
end
```

**Note**: OpenRouter provides access to multiple models from different providers. See [OpenRouter documentation](https://openrouter.ai/docs) for available models.
## Usage Examples

### Basic Usage

```ruby
class Product < ApplicationRecord
  include RailsAIPromptable::Promptable

  prompt_template "Generate a compelling product description for: %{name}. Features: %{features}"

  def generate_description
    ai_generate(
      context: {
        name: title,
        features: features.join(', ')
      }
    )
  end
end
```

### Custom Model and Temperature

```ruby
class BlogPost < ApplicationRecord
  include RailsAIPromptable::Promptable

  prompt_template "Write a creative blog post about: %{topic}"

  def generate_content
    ai_generate(
      context: { topic: title },
      model: 'gpt-4o',  # Override default model
      temperature: 0.9, # Higher temperature for more creativity
      format: :text     # Output format
    )
  end
end
```
### Background Processing

For long-running AI tasks, use background processing:

```ruby
class Report < ApplicationRecord
  include RailsAIPromptable::Promptable

  prompt_template "Analyze this data and provide insights: %{data}"

  def generate_analysis
    # Enqueues the AI generation job
    ai_generate_later(
      context: { data: raw_data },
      model: 'gpt-4o'
    )
  end
end
```
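Because `ai_generate_later` is built on ActiveJob, jobs run on whatever queue adapter your application configures (the default async adapter in development). If you use a dedicated backend, configure it the usual Rails way; Sidekiq below is only an example, any ActiveJob adapter works:

```ruby
# config/application.rb — standard Rails ActiveJob setting,
# not specific to this gem. Sidekiq is one common production choice.
config.active_job.queue_adapter = :sidekiq
```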
### Dynamic Templates

You can set templates dynamically:

```ruby
class ContentGenerator
  include RailsAIPromptable::Promptable

  def generate_with_custom_template(template, context)
    self.class.prompt_template(template)
    ai_generate(context: context)
  end
end

generator = ContentGenerator.new
result = generator.generate_with_custom_template(
  "Translate this to Spanish: %{text}",
  { text: "Hello, world!" }
)
```

Note that `prompt_template` here redefines the template at the class level, so the change affects every instance of `ContentGenerator`; use this pattern with care in concurrent code.
### Multiple Providers in One Application

```ruby
# In your initializer
RailsAIPromptable.configure do |config|
  config.provider = :openai
  config.api_key = ENV['OPENAI_API_KEY']

  # Also configure other providers
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
  config.gemini_api_key = ENV['GEMINI_API_KEY']
end

# Switch providers at runtime
RailsAIPromptable.configure do |config|
  config.provider = :anthropic
end
RailsAIPromptable.reset_client! # Reset to use new provider
```
### Service Objects

```ruby
class AIContentService
  include RailsAIPromptable::Promptable

  prompt_template "%{task}"

  def initialize(task)
    @task = task
  end

  def execute
    ai_generate(
      context: { task: @task },
      model: determine_best_model,
      temperature: 0.7
    )
  end

  private

  def determine_best_model
    case RailsAIPromptable.configuration.provider
    when :openai then 'gpt-4o-mini'
    when :anthropic then 'claude-3-5-sonnet-20241022'
    when :gemini then 'gemini-pro'
    else RailsAIPromptable.configuration.default_model
    end
  end
end

# Usage
service = AIContentService.new("Explain quantum computing in simple terms")
result = service.execute
```
## Generator Options

The `rails_ai_promptable:install` generator accepts the following options:

### Available Options

```bash
rails generate rails_ai_promptable:install [options]
```

**--provider**: Specify which AI provider to configure (default: openai)
- `openai` - OpenAI (GPT models)
- `anthropic` or `claude` - Anthropic (Claude models)
- `gemini` or `google` - Google Gemini
- `cohere` - Cohere
- `azure_openai` or `azure` - Azure OpenAI
- `mistral` - Mistral AI
- `openrouter` - OpenRouter (multi-provider gateway)

### What the Generator Creates

The generator creates a configuration file at `config/initializers/rails_ai_promptable.rb` with:

1. **Provider-specific configuration** - Pre-configured settings for your chosen provider
2. **Default model** - Appropriate default model for the provider
3. **Environment variable setup** - Ready-to-use ENV variable references
4. **Comments and examples** - Helpful documentation and usage examples
5. **Multi-provider setup** - Optional configuration for using multiple providers
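For reference, the generated initializer follows the same shape as the manual configuration shown earlier; roughly like this sketch for the default OpenAI provider (exact contents depend on the provider chosen and the gem version):

```ruby
# config/initializers/rails_ai_promptable.rb (illustrative sketch)
RailsAIPromptable.configure do |config|
  config.provider = :openai
  config.api_key = ENV['OPENAI_API_KEY']
  config.default_model = 'gpt-4o-mini'
  config.timeout = 30
end
```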
### Examples

```bash
# Generate config for OpenAI (default)
rails generate rails_ai_promptable:install

# Generate config for Claude/Anthropic
rails generate rails_ai_promptable:install --provider=anthropic

# Generate config for Azure OpenAI
rails generate rails_ai_promptable:install --provider=azure_openai
```

After running the generator, you'll see helpful post-installation instructions guiding you through the next steps.
## Configuration Options

### Global Configuration

```ruby
RailsAIPromptable.configure do |config|
  # Provider Selection
  config.provider = :openai            # Required: AI provider to use

  # Authentication
  config.api_key = ENV['API_KEY']      # Generic API key (fallback)

  # Model Settings
  config.default_model = 'gpt-4o-mini' # Default model for generation
  config.timeout = 30                  # HTTP timeout in seconds

  # Logging
  config.logger = Rails.logger         # Custom logger

  # Provider-specific settings
  config.openai_base_url = 'https://api.openai.com/v1'
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
  config.gemini_api_key = ENV['GEMINI_API_KEY']
  # ... and more
end
```
### Environment Variables

The gem supports the following environment variables:

```bash
# OpenAI
OPENAI_API_KEY=your-key

# Anthropic
ANTHROPIC_API_KEY=your-key

# Google Gemini
GEMINI_API_KEY=your-key

# Cohere
COHERE_API_KEY=your-key

# Azure OpenAI
AZURE_OPENAI_API_KEY=your-key
AZURE_OPENAI_BASE_URL=https://your-resource.openai.azure.com
AZURE_OPENAI_DEPLOYMENT_NAME=your-deployment

# Mistral AI
MISTRAL_API_KEY=your-key

# OpenRouter
OPENROUTER_API_KEY=your-key
OPENROUTER_APP_NAME=your-app-name
OPENROUTER_SITE_URL=https://yourapp.com
```
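In development and test you can keep these variables in a `.env` file loaded by the third-party dotenv gem (an optional tool, not a dependency of this gem):

```ruby
# Gemfile — optional; loads .env automatically in development/test
gem 'dotenv-rails', groups: [:development, :test]
```

Keep `.env` out of version control so API keys never land in your repository.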
## API Reference

### Configuration Methods

#### `RailsAIPromptable.configure`

Configure the gem with a block.

```ruby
RailsAIPromptable.configure do |config|
  config.provider = :openai
  config.api_key = 'your-key'
end
```

#### `RailsAIPromptable.client`

Get the current provider client instance.

```ruby
client = RailsAIPromptable.client
# => #<RailsAIPromptable::Providers::OpenAIProvider>
```

#### `RailsAIPromptable.reset_client!`

Reset the memoized client (useful when changing providers).

```ruby
RailsAIPromptable.reset_client!
```

#### `RailsAIPromptable::Providers.available_providers`

Get a list of all supported providers.

```ruby
RailsAIPromptable::Providers.available_providers
# => [:openai, :anthropic, :gemini, :cohere, :azure_openai, :mistral, :openrouter]
```
### Promptable Module Methods

#### `.prompt_template(template)`

Define or retrieve the prompt template for a class.

```ruby
class Article
  include RailsAIPromptable::Promptable
  prompt_template "Summarize: %{content}"
end
```

#### `#ai_generate(context:, model:, temperature:, format:)`

Generate AI content synchronously.

**Parameters:**
- `context` (Hash): Variables to interpolate in the template
- `model` (String, optional): Override the default model
- `temperature` (Float, optional): Control randomness (0.0 - 1.0), defaults to 0.7
- `format` (Symbol, optional): Output format (`:text`, `:json`), defaults to `:text`

**Returns:** String - The generated content

```ruby
result = article.ai_generate(
  context: { content: article.body },
  model: 'gpt-4o',
  temperature: 0.5
)
```
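Since `ai_generate` returns a String even with `format: :json`, parse the result before using it as structured data. A hedged plain-Ruby sketch — the response shown here is invented for illustration; the actual shape depends on your prompt and provider:

```ruby
require 'json'

# Suppose ai_generate(..., format: :json) returned this String:
raw = '{"summary": "A short summary", "keywords": ["ai", "rails"]}'

# Parse it into a Hash before accessing fields
data = JSON.parse(raw)

puts data["summary"]   # => "A short summary"
puts data["keywords"].inspect
```

Wrapping the parse in a `rescue JSON::ParserError` is worthwhile, since model output is not guaranteed to be valid JSON.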
#### `#ai_generate_later(context:, **kwargs)`

Enqueue AI generation as a background job.

```ruby
article.ai_generate_later(
  context: { content: article.body },
  model: 'gpt-4o'
)
```
## Testing

The gem includes comprehensive test coverage. Run tests with:

```bash
bundle exec rspec
```

### Mocking in Tests

```ruby
# In your spec_helper.rb or test setup
RSpec.configure do |config|
  config.before(:each) do
    allow(RailsAIPromptable).to receive(:client).and_return(mock_client)
  end
end

# In your tests
let(:mock_client) { instance_double('Client') }

it 'generates content' do
  allow(mock_client).to receive(:generate).and_return('Generated text')

  result = article.generate_summary
  expect(result).to eq('Generated text')
end
```
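Outside RSpec, the same idea works with a plain duck-typed fake: anything that responds to `generate` can stand in for the real client. A minimal framework-free sketch — the `generate` signature here is an assumption modeled on the mocking example above:

```ruby
# A fake provider that records prompts and returns canned text.
# Duck-typed stand-in for the real provider client in tests.
class FakeProvider
  attr_reader :prompts

  def initialize(response)
    @response = response
    @prompts = []   # every prompt passed to #generate is recorded here
  end

  # Assumed signature, mirroring the `generate` stub in the RSpec example
  def generate(prompt, **_options)
    @prompts << prompt
    @response
  end
end

fake = FakeProvider.new("Generated text")
result = fake.generate("Summarize: hello")

puts result            # => "Generated text"
puts fake.prompts.inspect
```

Recording prompts also lets tests assert that templates were interpolated as expected, not just that a response came back.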
## Error Handling

All providers include error handling with logging:

```ruby
class Article < ApplicationRecord
  include RailsAIPromptable::Promptable

  def safe_generate
    result = ai_generate(context: { content: body })

    if result.nil?
      # Check logs for error details
      Rails.logger.error("AI generation failed for Article #{id}")
      "Unable to generate content at this time"
    else
      result
    end
  end
end
```
## Performance Tips

1. **Use Background Jobs**: For non-critical content, use `ai_generate_later`
2. **Choose Appropriate Models**: Smaller models like `gpt-4o-mini` or `claude-3-haiku` are faster
3. **Set Timeouts**: Adjust `timeout` based on your needs
4. **Cache Results**: Consider caching generated content to avoid redundant API calls

```ruby
class Article < ApplicationRecord
  include RailsAIPromptable::Promptable

  def cached_summary
    Rails.cache.fetch("article_summary_#{id}", expires_in: 1.day) do
      ai_generate(context: { content: body })
    end
  end
end
```
## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/shoaibmalik786/rails_ai_promptable.

1. Fork the repository
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request

## License

The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).

## Code of Conduct

Everyone interacting in the RailsAIPromptable project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/shoaibmalik786/rails_ai_promptable/blob/main/CODE_OF_CONDUCT.md).

## Changelog

See [CHANGELOG.md](CHANGELOG.md) for details on updates and changes.

## Support

- 📚 [Documentation](https://github.com/shoaibmalik786/rails_ai_promptable)
- 🐛 [Issue Tracker](https://github.com/shoaibmalik786/rails_ai_promptable/issues)
- 💬 [Discussions](https://github.com/shoaibmalik786/rails_ai_promptable/discussions)

## Acknowledgments

Special thanks to all contributors and the Ruby on Rails community for their support.