ruby_llm 0.1.0.pre41 → 0.1.0.pre42
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/.rspec_status +5 -4
- data/README.md +18 -2
- data/lib/ruby_llm/content.rb +1 -9
- data/lib/ruby_llm/models.json +345 -452
- data/lib/ruby_llm/provider.rb +1 -2
- data/lib/ruby_llm/providers/gemini/chat.rb +140 -0
- data/lib/ruby_llm/providers/gemini/embeddings.rb +53 -0
- data/lib/ruby_llm/providers/gemini/images.rb +51 -0
- data/lib/ruby_llm/providers/gemini/media.rb +136 -0
- data/lib/ruby_llm/providers/gemini/models.rb +41 -6
- data/lib/ruby_llm/providers/gemini/streaming.rb +99 -0
- data/lib/ruby_llm/providers/gemini/tools.rb +88 -0
- data/lib/ruby_llm/providers/gemini.rb +10 -4
- data/lib/ruby_llm/providers/openai/images.rb +0 -2
- data/lib/ruby_llm/stream_accumulator.rb +1 -1
- data/lib/ruby_llm/version.rb +1 -1
- metadata +7 -1
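
The headline change in this release is a fleshed-out Gemini provider (chat, embeddings, images, media, streaming, and tools under `data/lib/ruby_llm/providers/gemini/`). A minimal sketch of how those new modules would be exercised through RubyLLM's public API follows; the `gemini_api_key` setting and all model names are assumptions for illustration and are not taken from this diff:

```ruby
require 'ruby_llm'

RubyLLM.configure do |config|
  # Assumed configuration key for the new Gemini provider.
  config.gemini_api_key = ENV['GEMINI_API_KEY']
end

# Chat and streaming via the new providers/gemini/chat.rb and streaming.rb
# (model name is illustrative).
chat = RubyLLM.chat(model: 'gemini-2.0-flash')
chat.ask 'Give me one fun fact about Ruby.' do |chunk|
  print chunk.content
end

# Embeddings via the new providers/gemini/embeddings.rb.
RubyLLM.embed 'Ruby is a delightful language', model: 'text-embedding-004'

# Image generation via the new providers/gemini/images.rb
# (Imagen, per the README provider table added below; model name illustrative).
RubyLLM.paint 'a sunset over mountains', model: 'imagen-3.0-generate-002'
```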
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 2dac5b1b15a5d840d74ac63af791044a26669e0d41b0512d37236530cdd89693
+  data.tar.gz: b737c0061790066680424051afd3b9ea75d9d63f014d1684261dde81c1bedf37
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: a90362b3d2dbe5b9f5c54851d94646273b343a32eef452528903026ea2e1add335a105273b0efc17a5ebde1da15421d894029ebe401e94c511531a1e8d2f706b
+  data.tar.gz: 504e5dd4e3692db83b6b9589e26e33f553865234b59264d05aec1f495e403e8e28eb2b7dfc2b2258047df07dd04f3740005b44dee8a874927b12a55f045fae6e
data/.rspec_status
CHANGED
@@ -34,7 +34,8 @@ example_id | status | run_time |
 ./spec/ruby_llm/embeddings_spec.rb[1:1:1:2] | passed | 0.43632 seconds |
 ./spec/ruby_llm/embeddings_spec.rb[1:1:2:1] | passed | 0.65614 seconds |
 ./spec/ruby_llm/embeddings_spec.rb[1:1:2:2] | passed | 2.16 seconds |
-./spec/ruby_llm/error_handling_spec.rb[1:1] | passed | 0.
-./spec/ruby_llm/image_generation_spec.rb[1:1:1] | passed |
-./spec/ruby_llm/image_generation_spec.rb[1:1:2] | passed |
-./spec/ruby_llm/image_generation_spec.rb[1:1:3] | passed |
+./spec/ruby_llm/error_handling_spec.rb[1:1] | passed | 0.29366 seconds |
+./spec/ruby_llm/image_generation_spec.rb[1:1:1] | passed | 11.61 seconds |
+./spec/ruby_llm/image_generation_spec.rb[1:1:2] | passed | 17.63 seconds |
+./spec/ruby_llm/image_generation_spec.rb[1:1:3] | passed | 8.77 seconds |
+./spec/ruby_llm/image_generation_spec.rb[1:1:4] | passed | 0.00319 seconds |
data/README.md
CHANGED
@@ -24,10 +24,10 @@ A delightful Ruby way to work with AI. Chat in text, analyze and generate images
 ## Features
 
 - 💬 **Beautiful Chat Interface** - Converse with AI models as easily as `RubyLLM.chat.ask "teach me Ruby"`
-- 📄 **PDF Analysis** - Analyze PDF documents directly with Claude models using `chat.ask "What's in this?", with: { pdf: "document.pdf" }`
 - 🎵 **Audio Analysis** - Get audio transcription and understanding with `chat.ask "what's said here?", with: { audio: "clip.wav" }`
 - 👁️ **Vision Understanding** - Let AIs analyze images with a simple `chat.ask "what's in this?", with: { image: "photo.jpg" }`
 - 🌊 **Streaming** - Real-time responses with proper Ruby streaming with `chat.ask "hello" do |chunk| puts chunk.content end`
+- 📄 **PDF Analysis** - Analyze PDF documents directly with `chat.ask "What's in this?", with: { pdf: "document.pdf" }`
 - 🚂 **Rails Integration** - Persist chats and messages with ActiveRecord with `acts_as_{chat|message|tool_call}`
 - 🛠️ **Tool Support** - Give AIs access to your Ruby code with `chat.with_tool(Calculator).ask "what's 2+2?"`
 - 🎨 **Paint with AI** - Create images as easily as `RubyLLM.paint "a sunset over mountains"`
@@ -116,7 +116,8 @@ chat.ask "What's being said in this recording?", with: { audio: "meeting.wav" }
 # Combine multiple pieces of content
 chat.ask "Compare these diagrams", with: { image: ["diagram1.png", "diagram2.png"] }
 
-# Ask about PDFs
+# Ask about PDFs
+
 chat = RubyLLM.chat(model: 'claude-3-7-sonnet-20250219')
 chat.ask "Summarize this research paper", with: { pdf: "research.pdf" }
 
@@ -447,6 +448,21 @@ pp chat.messages.map(&:role)
 #=> [:user, :assistant, :tool, :assistant]
 ```
 
+## Provider Comparison
+
+| Feature | OpenAI | Anthropic | Google | DeepSeek |
+|---------|--------|-----------|--------|----------|
+| Chat | ✅ GPT-4o, GPT-3.5 | ✅ Claude 3.7, 3.5, 3 | ✅ Gemini 2.0, 1.5 | ✅ DeepSeek Chat, Reasoner |
+| Vision | ✅ GPT-4o, GPT-4 | ✅ All Claude 3 models | ✅ Gemini 2.0, 1.5 | ❌ |
+| Audio | ✅ GPT-4o-audio, Whisper | ❌ | ✅ Gemini models | ❌ |
+| PDF Analysis | ❌ | ✅ All Claude 3 models | ✅ Gemini models | ❌ |
+| Function Calling | ✅ Most models | ✅ Claude 3 models | ✅ Gemini models (except Lite) | ✅ |
+| JSON Mode | ✅ Most recent models | ✅ Claude 3 models | ✅ Gemini models | ❌ |
+| Image Generation | ✅ DALL-E 3 | ❌ | ✅ Imagen | ❌ |
+| Embeddings | ✅ text-embedding-3 | ❌ | ✅ text-embedding-004 | ❌ |
+| Context Size | ⭐ Up to 200K (o1) | ⭐ 200K tokens | ⭐ Up to 2M tokens | 64K tokens |
+| Streaming | ✅ | ✅ | ✅ | ✅ |
+
 ## Development
 
 After checking out the repo, run `bin/setup` to install dependencies. Then, run `bin/console` for an interactive prompt.
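
The PDF bullet in the Features hunk above drops the "with Claude models" qualifier, and the new Provider Comparison table lists PDF analysis for both Claude and Gemini. A short sketch of what that implies in practice; the Claude identifier comes from this README, while the Gemini model name is an assumption for illustration:

```ruby
# PDF analysis as documented in the README: the Claude path.
chat = RubyLLM.chat(model: 'claude-3-7-sonnet-20250219')
chat.ask "Summarize this research paper", with: { pdf: "research.pdf" }

# Per the new provider table, the same call should also work through Gemini
# (model name here is illustrative, not taken from this diff).
chat = RubyLLM.chat(model: 'gemini-1.5-pro')
chat.ask "Summarize this research paper", with: { pdf: "research.pdf" }
```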
data/lib/ruby_llm/content.rb
CHANGED
@@ -68,7 +68,7 @@ module RubyLLM
       }
     end
 
-    def attach_pdf(source)
+    def attach_pdf(source)
       source = File.expand_path(source) unless source.start_with?('http')
 
       pdf_data = {
@@ -80,19 +80,11 @@ module RubyLLM
       unless source.start_with?('http')
         raise Error, "PDF file not found: #{source}" unless File.exist?(source)
 
-        # Simple check for PDF file type (could be more robust)
-        unless source.downcase.end_with?('.pdf') || File.read(source, 5) == '%PDF-'
-          RubyLLM.logger.warn "File may not be a valid PDF: #{source}"
-        end
-
         # Preload file content for providers that need it
         pdf_data[:content] = File.read(source)
       end
 
       pdf_data
-    rescue StandardError => e
-      RubyLLM.logger.error "Error attaching PDF #{source}: #{e.message}"
-      raise Error, "Failed to attach PDF: #{e.message}"
     end
 
     def encode_file(source)