ruby_llm 0.1.0.pre37 → 0.1.0.pre39

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
data/docs/guides/tools.md DELETED
@@ -1,247 +0,0 @@
- ---
- layout: default
- title: Tools
- parent: Guides
- nav_order: 3
- permalink: /guides/tools
- ---
-
- # Using Tools with RubyLLM
-
- Tools allow AI models to call your Ruby code to perform actions or retrieve information. This guide explains how to create and use tools with RubyLLM.
-
- ## What Are Tools?
-
- Tools (also known as "functions" or "plugins") let AI models:
-
- 1. Recognize when external functionality is needed
- 2. Call your Ruby code with appropriate parameters
- 3. Use the results to enhance their responses
-
- Common use cases include:
- - Retrieving real-time data
- - Performing calculations
- - Accessing databases
- - Controlling external systems
-
- ## Creating a Tool
-
- Tools are defined as Ruby classes that inherit from `RubyLLM::Tool`:
-
- ```ruby
- class Calculator < RubyLLM::Tool
-   description "Performs arithmetic calculations"
-
-   param :expression,
-     type: :string,
-     desc: "A mathematical expression to evaluate (e.g. '2 + 2')"
-
-   def execute(expression:)
-     eval(expression).to_s
-   rescue StandardError => e
-     "Error: #{e.message}"
-   end
- end
- ```
-
- ### Tool Components
-
- Each tool has these key elements:
-
- 1. **Description** - Explains what the tool does, helping the AI decide when to use it
- 2. **Parameters** - Define the inputs the tool expects
- 3. **Execute Method** - The code that runs when the tool is called
-
- ### Parameter Definition
-
- Parameters accept several options:
-
- ```ruby
- param :parameter_name,
-   type: :string,       # Data type (:string, :integer, :boolean, :array, :object)
-   desc: "Description", # Description of what the parameter does
-   required: true       # Whether the parameter is required (default: true)
- ```
-
- ## Using Tools in Chat
-
- To use a tool, attach it to a chat:
-
- ```ruby
- # Create the chat
- chat = RubyLLM.chat
-
- # Add a tool
- chat.with_tool(Calculator)
-
- # Now you can ask questions that might require calculation
- response = chat.ask "What's 123 * 456?"
- # => "Let me calculate that for you. 123 * 456 = 56088."
- ```
-
- ### Multiple Tools
-
- You can provide multiple tools to a single chat:
-
- ```ruby
- class Weather < RubyLLM::Tool
-   description "Gets current weather for a location"
-
-   param :location,
-     desc: "City name or zip code"
-
-   def execute(location:)
-     # Simulate weather lookup
-     "72°F and sunny in #{location}"
-   end
- end
-
- # Add multiple tools
- chat = RubyLLM.chat
-   .with_tools(Calculator, Weather)
-
- # Ask questions that might use either tool
- chat.ask "What's the temperature in New York City?"
- chat.ask "If it's 72°F in NYC and 54°F in Boston, what's the average?"
- ```
-
- ## Custom Initialization
-
- Tools can have custom initialization:
-
- ```ruby
- class DocumentSearch < RubyLLM::Tool
-   description "Searches documents by relevance"
-
-   param :query,
-     desc: "The search query"
-
-   param :limit,
-     type: :integer,
-     desc: "Maximum number of results",
-     required: false
-
-   def initialize(database)
-     @database = database
-   end
-
-   def execute(query:, limit: 5)
-     # Search in @database
-     @database.search(query, limit: limit)
-   end
- end
-
- # Initialize with dependencies
- search_tool = DocumentSearch.new(MyDatabase)
- chat.with_tool(search_tool)
- ```
-
- ## The Tool Execution Flow
-
- Here's what happens when a tool is used:
-
- 1. You ask a question
- 2. The model decides a tool is needed
- 3. The model selects the tool and provides arguments
- 4. RubyLLM calls your tool's `execute` method
- 5. The result is sent back to the model
- 6. The model incorporates the result into its response
-
- For example:
-
- ```ruby
- response = chat.ask "What's 123 squared plus 456?"
-
- # Behind the scenes:
- # 1. Model decides it needs to calculate
- # 2. Model calls Calculator with expression: "123 * 123 + 456"
- # 3. Tool returns "15,585"
- # 4. Model incorporates this in its response
- ```
-
- ## Debugging Tools
-
- Enable debugging to see tool calls in action:
-
- ```ruby
- # Enable debug logging
- ENV['RUBY_LLM_DEBUG'] = 'true'
-
- # Make a request
- chat.ask "What's 15329 divided by 437?"
-
- # Console output:
- # D, -- RubyLLM: Tool calculator called with: {"expression"=>"15329 / 437"}
- # D, -- RubyLLM: Tool calculator returned: "35.078719"
- ```
-
- ## Error Handling
-
- Tools can handle errors gracefully:
-
- ```ruby
- class Calculator < RubyLLM::Tool
-   description "Performs arithmetic calculations"
-
-   param :expression,
-     type: :string,
-     desc: "Math expression to evaluate"
-
-   def execute(expression:)
-     eval(expression).to_s
-   rescue StandardError => e
-     # Return error as a result
-     { error: "Error calculating #{expression}: #{e.message}" }
-   end
- end
-
- # When there's an error, the model will receive and explain it
- chat.ask "What's 1/0?"
- # => "I tried to calculate 1/0, but there was an error: divided by 0"
- ```
-
- ## Advanced Tool Parameters
-
- Tools can have complex parameter types:
-
- ```ruby
- class DataAnalysis < RubyLLM::Tool
-   description "Analyzes numerical data"
-
-   param :data,
-     type: :array,
-     desc: "Array of numbers to analyze"
-
-   param :operations,
-     type: :object,
-     desc: "Analysis operations to perform",
-     required: false
-
-   def execute(data:, operations: {mean: true, median: false})
-     result = {}
-
-     result[:mean] = data.sum.to_f / data.size if operations[:mean]
-     result[:median] = calculate_median(data) if operations[:median]
-
-     result
-   end
-
-   private
-
-   def calculate_median(data)
-     sorted = data.sort
-     mid = sorted.size / 2
-     sorted.size.odd? ? sorted[mid] : (sorted[mid-1] + sorted[mid]) / 2.0
-   end
- end
- ```
-
- ## When to Use Tools
-
- Tools are best for:
-
- 1. **External data retrieval** - Getting real-time information like weather, prices, or database records
- 2. **Computation** - When calculations are complex or involve large numbers
- 3. **System integration** - Connecting to external APIs or services
- 4. **Data processing** - Working with files, formatting data, or analyzing information
- 5. **Stateful operations** - When you need to maintain state between calls
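Point 5 is the one case the deleted guide does not illustrate. A minimal sketch of a stateful tool, using only the `RubyLLM::Tool` API shown earlier in the guide (the `RunningTotal` class itself is hypothetical, not part of the gem), could look like this:

```ruby
# Hypothetical stateful tool: the same instance keeps its total between calls.
class RunningTotal < RubyLLM::Tool
  description "Keeps a running total across the conversation"

  param :amount,
    type: :integer,
    desc: "Amount to add to the running total"

  def initialize
    @total = 0
  end

  def execute(amount:)
    @total += amount
    "The running total is now #{@total}"
  end
end

# Attach a single instance so state survives across tool calls
chat = RubyLLM.chat
chat.with_tool(RunningTotal.new)
chat.ask "Add 5 to my total"
chat.ask "Add 7 more. What's the total now?"
```

Because the instance (not the class) is attached, the `@total` instance variable persists for the lifetime of the chat, which is the pattern the guide's `DocumentSearch` example also relies on for its injected dependency.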
data/docs/index.md DELETED
@@ -1,53 +0,0 @@
- ---
- layout: default
- title: Home
- nav_order: 1
- description: "RubyLLM is a delightful Ruby way to work with AI."
- permalink: /
- ---
-
- # RubyLLM
- {: .fs-9 }
-
- A delightful Ruby way to work with AI through a unified interface to OpenAI, Anthropic, Google, and DeepSeek.
- {: .fs-6 .fw-300 }
-
- [Get started now]({% link installation.md %}){: .btn .btn-primary .fs-5 .mb-4 .mb-md-0 .mr-2 }
- [View on GitHub](https://github.com/crmne/ruby_llm){: .btn .fs-5 .mb-4 .mb-md-0 }
-
- ---
-
- ## Overview
-
- RubyLLM provides a beautiful, unified interface to modern AI services, including:
-
- - 💬 **Chat** with OpenAI GPT, Anthropic Claude, Google Gemini, and DeepSeek models
- - 🖼️ **Image generation** with DALL-E and other providers
- - 🔍 **Embeddings** for vector search and semantic analysis
- - 🔧 **Tools** that let AI use your Ruby code
- - 🚊 **Rails integration** to persist chats and messages with ActiveRecord
- - 🌊 **Streaming** responses with proper Ruby patterns
-
- ## Quick start
-
- ```ruby
- require 'ruby_llm'
-
- # Configure your API keys
- RubyLLM.configure do |config|
-   config.openai_api_key = ENV['OPENAI_API_KEY']
- end
-
- # Start chatting
- chat = RubyLLM.chat
- response = chat.ask "What's the best way to learn Ruby?"
-
- # Generate images
- image = RubyLLM.paint "a sunset over mountains"
- puts image.url
- ```
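The quick start above covers only chat and image generation. A rough sketch of the embedding and streaming features listed in the overview, assuming the `RubyLLM.embed` helper and the block form of `chat.ask` described in the gem's other documentation, might look like:

```ruby
require 'ruby_llm'

# Assumed API: RubyLLM.embed returns an embedding whose vector is exposed
# via #vectors, and chat.ask yields chunks when given a block.
embedding = RubyLLM.embed "Ruby is a programmer's best friend"
puts embedding.vectors.length

chat = RubyLLM.chat
chat.ask "Tell me a short story about a Ruby gem" do |chunk|
  print chunk.content # each chunk arrives as the model generates it
end
```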
-
- ## Learn more
-
- - [Installation]({% link installation.md %})
- - [Guides]({% link guides/index.md %})
data/docs/installation.md DELETED
@@ -1,98 +0,0 @@
- ---
- layout: default
- title: Installation
- nav_order: 2
- permalink: /installation
- ---
-
- # Installation
-
- RubyLLM is packaged as a Ruby gem, making it easy to install in your projects.
-
- ## Requirements
-
- * Ruby 3.1 or later
- * An API key from at least one of the supported providers:
-   * OpenAI
-   * Anthropic
-   * Google (Gemini)
-   * DeepSeek
-
- ## Installation Methods
-
- ### Using Bundler (recommended)
-
- Add RubyLLM to your project's Gemfile:
-
- ```ruby
- gem 'ruby_llm'
- ```
-
- Then install the dependencies:
-
- ```bash
- bundle install
- ```
-
- ### Manual Installation
-
- If you're not using Bundler, you can install RubyLLM directly:
-
- ```bash
- gem install ruby_llm
- ```
-
- ## Configuration
-
- After installing RubyLLM, you'll need to configure it with your API keys:
-
- ```ruby
- require 'ruby_llm'
-
- RubyLLM.configure do |config|
-   # Required: At least one API key
-   config.openai_api_key = ENV['OPENAI_API_KEY']
-   config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
-   config.gemini_api_key = ENV['GEMINI_API_KEY']
-   config.deepseek_api_key = ENV['DEEPSEEK_API_KEY']
-
-   # Optional: Set default models
-   config.default_model = 'gpt-4o-mini' # Default chat model
-   config.default_embedding_model = 'text-embedding-3-small' # Default embedding model
-   config.default_image_model = 'dall-e-3' # Default image generation model
-
-   # Optional: Configure request settings
-   config.request_timeout = 120 # Request timeout in seconds
-   config.max_retries = 3 # Number of retries on failures
- end
- ```
-
- We recommend storing your API keys as environment variables rather than hardcoding them in your application.
-
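One common way to follow that recommendation is a git-ignored `.env` file loaded with the third-party dotenv gem (a separate gem, not a RubyLLM dependency); a minimal sketch under that assumption:

```ruby
# Assumes `gem 'dotenv'` has been added to the Gemfile and a git-ignored
# .env file exists with lines such as: OPENAI_API_KEY=sk-...
require 'dotenv/load' # copies the .env entries into ENV at boot
require 'ruby_llm'

RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
end
```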
- ## Verifying Installation
-
- You can verify that RubyLLM is correctly installed and configured by running a simple test:
-
- ```ruby
- require 'ruby_llm'
-
- # Configure with at least one API key
- RubyLLM.configure do |config|
-   config.openai_api_key = ENV['OPENAI_API_KEY']
- end
-
- # Try a simple query
- chat = RubyLLM.chat
- response = chat.ask "Hello, world!"
- puts response.content
-
- # Check available models
- puts "Available models:"
- RubyLLM.models.chat_models.each do |model|
-   puts "- #{model.id} (#{model.provider})"
- end
- ```
-
- ## Next Steps
-
- Once you've successfully installed RubyLLM, check out the [Getting Started guide]({% link guides/getting-started.md %}) to learn how to use it in your applications.