ruby_llm_community 0.0.5 → 0.0.6
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/README.md +73 -91
- data/lib/ruby_llm/active_record/acts_as.rb +5 -0
- data/lib/ruby_llm/aliases.json +4 -0
- data/lib/ruby_llm/models.json +476 -344
- data/lib/ruby_llm/version.rb +1 -1
- metadata +1 -1
checksums.yaml
CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: d3fa93c4ea3fddd52d6cf639bb620c99d10a0fdc01082ecab1c1e9ddb022763d
+  data.tar.gz: 4be92604b71b94c096162a25165178344f56e22ea332815f542a87af095350f5
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 376559dd9e4c9ab5dbcb4c0e4c556d548c13fdebedaecef9edaebffb608f1af5380213bd388562d48866375b4bb63729c877b1548bfab17d07e2d3a9c550d67e
+  data.tar.gz: c2f8a24697f9abb88509bdf9b793020360bff7175544bd62b31fb3644f276fc103ae5809dc8a8dd8098962f2f59b1decc0baa1f08482d82bb4ab40e29578c705
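The new digests above can be re-derived locally from the archives inside the `.gem` file. A minimal sketch using only the Ruby standard library — the paths are hypothetical, assuming `metadata.gz` and `data.tar.gz` have been extracted with `tar -xf ruby_llm_community-0.0.6.gem`:

```ruby
require "digest"

# Compute the two digests that checksums.yaml records for one of a
# gem's inner archives (metadata.gz or data.tar.gz).
def gem_archive_digests(path)
  {
    "SHA256" => Digest::SHA256.file(path).hexdigest,
    "SHA512" => Digest::SHA512.file(path).hexdigest
  }
end
```

Comparing the returned hex strings against the `+` lines above confirms the 0.0.6 artifacts.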
data/README.md
CHANGED

@@ -1,99 +1,114 @@
+<div align="center">
+
 <picture>
   <source media="(prefers-color-scheme: dark)" srcset="/docs/assets/images/logotype_dark.svg">
   <img src="/docs/assets/images/logotype.svg" alt="RubyLLM" height="120" width="250">
 </picture>

-
+<strong>One *beautiful* Ruby API for GPT, Claude, Gemini, and more.</strong>
+
+Battle tested at [<picture><source media="(prefers-color-scheme: dark)" srcset="https://chatwithwork.com/logotype-dark.svg"><img src="https://chatwithwork.com/logotype.svg" alt="Chat with Work" height="30" align="absmiddle"></picture>](https://chatwithwork.com) — *Claude Code for your documents*
+
+[![Gem Version](https://badge.fury.io/rb/ruby_llm.svg)](https://badge.fury.io/rb/ruby_llm)
+[![Ruby Style Guide](https://img.shields.io/badge/code_style-standard-brightgreen.svg)](https://github.com/testdouble/standard)
+[![Gem Downloads](https://img.shields.io/gem/dt/ruby_llm)](https://rubygems.org/gems/ruby_llm)
+[![codecov](https://codecov.io/gh/crmne/ruby_llm/branch/main/graph/badge.svg)](https://codecov.io/gh/crmne/ruby_llm)

-<
-<a href="https://badge.fury.io/rb/ruby_llm"><img src="https://badge.fury.io/rb/ruby_llm.svg?a=5" alt="Gem Version" /></a>
-<a href="https://github.com/testdouble/standard"><img src="https://img.shields.io/badge/code_style-standard-brightgreen.svg" alt="Ruby Style Guide" /></a>
-<a href="https://rubygems.org/gems/ruby_llm"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm"></a>
-<a href="https://codecov.io/gh/crmne/ruby_llm"><img src="https://codecov.io/gh/crmne/ruby_llm/branch/main/graph/badge.svg" alt="codecov" /></a>
+<a href="https://trendshift.io/repositories/13640" target="_blank"><img src="https://trendshift.io/api/badge/repositories/13640" alt="crmne%2Fruby_llm | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
 </div>

-
+---
+
+Build chatbots, AI agents, RAG applications. Works with OpenAI, Anthropic, Google, AWS, local models, and any OpenAI-compatible API.

-##
+## Why RubyLLM?

-Every AI provider
+Every AI provider ships their own bloated client. Different APIs. Different response formats. Different conventions. It's exhausting.

-RubyLLM
+RubyLLM gives you one beautiful API for all of them. Same interface whether you're using GPT, Claude, or your local Ollama. Just three dependencies: [Faraday](https://github.com/lostisland/faraday), [Zeitwerk](https://github.com/fxn/zeitwerk), and [Marcel](https://github.com/rails/marcel). That's it.

-##
+## Show me the code

 ```ruby
 # Just ask questions
 chat = RubyLLM.chat
 chat.ask "What's the best way to learn Ruby?"
+```

-
+```ruby
+# Analyze any file type
 chat.ask "What's in this image?", with: "ruby_conf.jpg"
 chat.ask "Describe this meeting", with: "meeting.wav"
 chat.ask "Summarize this document", with: "contract.pdf"
 chat.ask "Explain this code", with: "app.rb"
+```

-
+```ruby
+# Multiple files at once
 chat.ask "Analyze these files", with: ["diagram.png", "report.pdf", "notes.txt"]
+```

-
-
+```ruby
+# Stream responses
+chat.ask "Tell me a story about Ruby" do |chunk|
   print chunk.content
 end
+```

+```ruby
 # Generate images
 RubyLLM.paint "a sunset over mountains in watercolor style"
+```

-
+```ruby
+# Create embeddings
 RubyLLM.embed "Ruby is elegant and expressive"
+```

+```ruby
 # Let AI use your code
 class Weather < RubyLLM::Tool
-  description "
-  param :latitude
-  param :longitude
+  description "Get current weather"
+  param :latitude
+  param :longitude

   def execute(latitude:, longitude:)
     url = "https://api.open-meteo.com/v1/forecast?latitude=#{latitude}&longitude=#{longitude}&current=temperature_2m,wind_speed_10m"
-
-    response = Faraday.get(url)
-    data = JSON.parse(response.body)
-  rescue => e
-    { error: e.message }
+    JSON.parse(Faraday.get(url).body)
   end
 end

-chat.with_tool(Weather).ask "What's the weather in Berlin?
+chat.with_tool(Weather).ask "What's the weather in Berlin?"
+```

-
+```ruby
+# Get structured output
 class ProductSchema < RubyLLM::Schema
-  string :name
-  number :price
-  array :features
-    string
+  string :name
+  number :price
+  array :features do
+    string
   end
 end

-response = chat.with_schema(ProductSchema)
-  .ask "Analyze this product description", with: "product.txt"
-# response.content => { "name" => "...", "price" => 99.99, "features" => [...] }
+response = chat.with_schema(ProductSchema).ask "Analyze this product", with: "product.txt"
 ```

-##
-
-*
-*
-*
-*
-*
-*
-*
-*
-*
-*
-*
-*
-*
+## Features
+
+* **Chat:** Conversational AI with `RubyLLM.chat`
+* **Vision:** Analyze images and screenshots
+* **Audio:** Transcribe and understand speech
+* **Documents:** Extract from PDFs, CSVs, JSON, any file type
+* **Image generation:** Create images with `RubyLLM.paint`
+* **Embeddings:** Vector search with `RubyLLM.embed`
+* **Tools:** Let AI call your Ruby methods
+* **Structured output:** JSON schemas that just work
+* **Streaming:** Real-time responses with blocks
+* **Rails:** ActiveRecord integration with `acts_as_chat`
+* **Async:** Fiber-based concurrency
+* **Model registry:** 500+ models with capability detection and pricing
+* **Providers:** OpenAI, Anthropic, Gemini, Bedrock, DeepSeek, Mistral, Ollama, OpenRouter, Perplexity, GPUStack, and any OpenAI-compatible API

 ## Installation

@@ -103,69 +118,36 @@ gem 'ruby_llm_community'
 ```
 Then `bundle install`.

-Configure your API keys
+Configure your API keys:
 ```ruby
-# config/initializers/ruby_llm.rb
+# config/initializers/ruby_llm.rb
 RubyLLM.configure do |config|
-  config.openai_api_key = ENV
-  # Add keys ONLY for providers you intend to use
-  # config.anthropic_api_key = ENV.fetch('ANTHROPIC_API_KEY', nil)
-  # ... see Configuration guide for all options ...
+  config.openai_api_key = ENV['OPENAI_API_KEY']
 end
 ```
-See the [Installation Guide](https://rubyllm.com/installation) for full details.

-## Rails
-
-Add persistence to your chat models effortlessly:
+## Rails

 ```bash
-# Generate models and migrations
 rails generate ruby_llm:install
 ```

 ```ruby
-# Or add to existing models
 class Chat < ApplicationRecord
-  acts_as_chat
-end
-
-class Message < ApplicationRecord
-  acts_as_message
+  acts_as_chat
 end

-
-
-end
-
-# Now chats persist automatically
-chat = Chat.create!(model_id: "gpt-4.1-nano")
-chat.ask("What's in this file?", with: "report.pdf")
+chat = Chat.create! model_id: "claude-sonnet-4"
+chat.ask "What's in this file?", with: "report.pdf"
 ```

-
-
-## Learn More
-
-Dive deeper with the official documentation:
+## Documentation

-
-- [Configuration](https://rubyllm.com/configuration)
-- **Guides:**
-  - [Getting Started](https://rubyllm.com/guides/getting-started)
-  - [Chatting with AI Models](https://rubyllm.com/guides/chat)
-  - [Using Tools](https://rubyllm.com/guides/tools)
-  - [Streaming Responses](https://rubyllm.com/guides/streaming)
-  - [Rails Integration](https://rubyllm.com/guides/rails)
-  - [Image Generation](https://rubyllm.com/guides/image-generation)
-  - [Embeddings](https://rubyllm.com/guides/embeddings)
-  - [Working with Models](https://rubyllm.com/guides/models)
-  - [Error Handling](https://rubyllm.com/guides/error-handling)
-  - [Available Models](https://rubyllm.com/guides/available-models)
+[rubyllm.com](https://rubyllm.com)

 ## Contributing

-
+See [CONTRIBUTING.md](CONTRIBUTING.md).

 ## License

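The reworked `Weather` tool in the README diff compresses the HTTP call into one line. For readers inspecting the diff without the gem installed, the same request flow can be sketched with only the Ruby standard library — `Net::HTTP` stands in for Faraday here, and the coordinates are illustrative:

```ruby
require "json"
require "net/http"
require "uri"

# Build the same Open-Meteo forecast URL the README's Weather tool interpolates.
def forecast_url(latitude:, longitude:)
  "https://api.open-meteo.com/v1/forecast?latitude=#{latitude}&longitude=#{longitude}&current=temperature_2m,wind_speed_10m"
end

# Fetch and parse, mirroring the tool's `JSON.parse(Faraday.get(url).body)`.
def fetch_forecast(latitude:, longitude:)
  JSON.parse(Net::HTTP.get(URI(forecast_url(latitude: latitude, longitude: longitude))))
end
```

`fetch_forecast(latitude: 52.52, longitude: 13.41)` returns the parsed JSON hash for Berlin, which is what the tool hands back to the model.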
data/lib/ruby_llm/aliases.json
CHANGED