ruby_llm 1.3.0 → 1.3.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 8a1f6385e98e9396c7f9d8021bd2e574f5f8c8aa1cc05c7f10311e2059101017
-  data.tar.gz: e7617280adc17488f9dbc810b424b22ee21f36c0684f0a5f37762dd290d6c0de
+  metadata.gz: a2287ea41c7591e593a315a35fd34ef8e32968c1b7337042c0b08338caeda784
+  data.tar.gz: 55a2690e1faa46d9ecdda3e8115ae64549be8d52c80fb5ffdb2aadd6fa8887dd
 SHA512:
-  metadata.gz: 80bcef1cb440519b7eb146bd4064f8eb5a40cf1fd5e309ac582c6be8a3aee0a33b244e8c4961bc69ac763b2665adab918238a0d8787b41c2fb6e5e7fd6343ae3
-  data.tar.gz: 3d95b550d6d879ad29e20bb328cb09d7191b4055180cba95e237ddab2d1173845463d62e407c777d6b76425958531fb6e7abc664939478c9e89b41565971ad2b
+  metadata.gz: 8d5c7d2b3b73f289bdad5c5faac487a03221101302567a1d36b073de84e97e3650cc4d103ce6f9e641659c0ecc69484593535ed964b933f7d14b19b5dce34e81
+  data.tar.gz: 1515124ad74198461fdbc8105f7173392df7bee2038c701ad85d6da258107bb3ddd366878fa140bd5de354b41dfe8abc5887a78adf2c7cb3beb0cf6caa64f764
data/README.md CHANGED
@@ -37,7 +37,7 @@

 Every AI provider comes with its own client library, its own response format, its own conventions for streaming, and its own way of handling errors. Want to use multiple providers? Prepare to juggle incompatible APIs and bloated dependencies.

-RubyLLM fixes all that. One beautiful API for everything. One consistent format. Minimal dependencies — just Faraday and Zeitwerk. Because working with AI should be a joy, not a chore.
+RubyLLM fixes all that. One beautiful API for everything. One consistent format. Minimal dependencies — just Faraday, Zeitwerk, and Marcel. Because working with AI should be a joy, not a chore.

 ## What makes it great

@@ -46,14 +46,14 @@ RubyLLM fixes all that. One beautiful API for everything. One consistent format.
 chat = RubyLLM.chat
 chat.ask "What's the best way to learn Ruby?"

-# Analyze images
-chat.ask "What's in this image?", with: { image: "ruby_conf.jpg" }
+# Analyze images, audio, documents, and text files
+chat.ask "What's in this image?", with: "ruby_conf.jpg"
+chat.ask "Describe this meeting", with: "meeting.wav"
+chat.ask "Summarize this document", with: "contract.pdf"
+chat.ask "Explain this code", with: "app.rb"

-# Analyze audio recordings
-chat.ask "Describe this meeting", with: { audio: "meeting.wav" }
-
-# Analyze documents
-chat.ask "Summarize this document", with: { pdf: "contract.pdf" }
+# Multiple files at once - types automatically detected
+chat.ask "Analyze these files", with: ["diagram.png", "report.pdf", "notes.txt"]

 # Stream responses in real-time
 chat.ask "Tell me a story about a Ruby programmer" do |chunk|
@@ -90,7 +90,7 @@ chat.with_tool(Weather).ask "What's the weather in Berlin? (52.5200, 13.4050)"
 * 💬 **Unified Chat:** Converse with models from OpenAI, Anthropic, Gemini, Bedrock, OpenRouter, DeepSeek, Ollama, or any OpenAI-compatible API using `RubyLLM.chat`.
 * 👁️ **Vision:** Analyze images within chats.
 * 🔊 **Audio:** Transcribe and understand audio content.
-* 📄 **PDF Analysis:** Extract information and summarize PDF documents.
+* 📄 **Document Analysis:** Extract information from PDFs, text files, and other documents.
 * 🖼️ **Image Generation:** Create images with `RubyLLM.paint`.
 * 📊 **Embeddings:** Generate text embeddings for vector search with `RubyLLM.embed`.
 * 🔧 **Tools (Function Calling):** Let AI models call your Ruby code using `RubyLLM::Tool`.
@@ -143,6 +143,10 @@ end
 # Now interacting with a Chat record persists the conversation:
 chat_record = Chat.create!(model_id: "gpt-4.1-nano")
 chat_record.ask("Explain Active Record callbacks.") # User & Assistant messages saved
+
+# Works seamlessly with file attachments - types automatically detected
+chat_record.ask("What's in this file?", with: "report.pdf")
+chat_record.ask("Analyze these", with: ["image.jpg", "data.csv", "notes.txt"])
 ```
 Check the [Rails Integration Guide](https://rubyllm.com/guides/rails) for more.

@@ -18,6 +18,7 @@ module RubyLLM
 has_many :messages,
          -> { order(created_at: :asc) },
          class_name: @message_class,
+         inverse_of: :chat,
          dependent: :destroy

 delegate :add_message, to: :to_llm
@@ -73,6 +73,10 @@
   "gemini": "gemini-2.5-pro-exp-03-25",
   "openrouter": "google/gemini-2.5-pro-exp-03-25"
 },
+"gemini-2.5-pro-preview-05-06": {
+  "gemini": "gemini-2.5-pro-preview-05-06",
+  "openrouter": "google/gemini-2.5-pro-preview-05-06"
+},
 "gemma-3-12b-it": {
   "gemini": "gemma-3-12b-it",
   "openrouter": "google/gemma-3-12b-it"
@@ -201,6 +205,10 @@
   "openai": "o3-mini",
   "openrouter": "openai/o3-mini"
 },
+"o3-pro": {
+  "openai": "o3-pro",
+  "openrouter": "openai/o3-pro"
+},
 "o4-mini": {
   "openai": "o4-mini",
   "openrouter": "openai/o4-mini"
@@ -36,6 +36,7 @@ module RubyLLM
 :retry_interval_randomness,
 :http_proxy,
 # Logging configuration
+:logger,
 :log_file,
 :log_level,
 :log_assume_model_exists
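The new `:logger` attribute sits alongside the existing `log_file`/`log_level` settings, letting callers inject a pre-built `Logger` (for example `Rails.logger`) rather than having the gem construct one from a file path. A hedged config sketch, assuming the standard `RubyLLM.configure` block:

```ruby
require "logger"
require "ruby_llm"

RubyLLM.configure do |config|
  # Supply your own logger instead of log_file/log_level.
  config.logger = Logger.new($stdout, level: Logger::DEBUG)
end
```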