ruby_llm-agents 2.1.0 → 3.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: b341c701b87662c5947fec756ba5a925759f8179700683dda02c1b30951d3213
- data.tar.gz: bec52ce05c7958b6954b8f6df23a8a9058ac2a9f900decdb858ed108623ff5af
+ metadata.gz: a72ac8e976da462e8a77ff8b50641acfab95fa061daacf1f38a70d5ca25dc80a
+ data.tar.gz: 61b3595c0ef91fa70dfa776ef6deddce257c96098146a94deb902d0705a483bd
  SHA512:
- metadata.gz: fefed6d947fb4e9dc8e85a683c72754062d22c016c1de61b9922d49ae9a54cef9e84e638cadb1000e221fdd7616bfd1764fbf47bf5ba11404126acedcebe7c03
- data.tar.gz: f93588c36b8057878b951c8411e00c2cd94ecc6189be8d33f70cb49e7d7e81098755606fbc3dcb6b2b8dfe85efc5c043a3d1884b7ecf48fc185a4863b4d0f16d
+ metadata.gz: 0c3827e5607ba8376976a84048ae2a9486fe5797720c8842e371ad3793190aaa2a8ec6bea0015b94cbb1c34c8bd516502a6a052dca823e68b5291665d5ae63b8
+ data.tar.gz: 6ac84e773905ee65d8ca0f482633247fb7254c64a13b3b25d83698503a9c70f70070a3ceb1127fc8d96271611e4270616030b1f64837c431b4553991b3038209
data/README.md CHANGED
@@ -9,6 +9,7 @@
  > **Production-ready Rails engine for building, managing, and monitoring LLM-powered AI agents**
 
  [![Gem Version](https://badge.fury.io/rb/ruby_llm-agents.svg)](https://rubygems.org/gems/ruby_llm-agents)
+ [![CI](https://github.com/adham90/ruby_llm-agents/actions/workflows/ci.yml/badge.svg)](https://github.com/adham90/ruby_llm-agents/actions/workflows/ci.yml)
  [![Ruby](https://img.shields.io/badge/ruby-%3E%3D%203.1-ruby.svg)](https://www.ruby-lang.org)
  [![Rails](https://img.shields.io/badge/rails-%3E%3D%207.0-red.svg)](https://rubyonrails.org)
  [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
@@ -26,16 +27,15 @@ Build intelligent AI agents in Ruby with a clean DSL, automatic execution tracki
  ## Show Me the Code
 
  ```ruby
- # app/agents/search_intent_agent.rb
+ # Template agent — structured input via .call
  class SearchIntentAgent < ApplicationAgent
    model "gpt-4o"
    temperature 0.0
 
-   # Prompts with {placeholder} syntax - params auto-registered
    system "You are a search intent analyzer. Extract structured data from queries."
-   prompt "Extract search intent from: {query}"
+   user "Extract search intent from: {query}"
+   assistant '{"refined_query":' # Force JSON output
 
-   # Structured output with returns DSL
    returns do
      string :refined_query, description: "Cleaned search query"
      array :filters, of: :string, description: "Extracted filters"
@@ -51,15 +51,17 @@ result.duration_ms # => 850
  ```
 
  ```ruby
- # Multi-turn conversations
- result = ChatAgent.call(
-   query: "What's my name?",
-   messages: [
-     { role: :user, content: "My name is Alice" },
-     { role: :assistant, content: "Nice to meet you, Alice!" }
-   ]
- )
- # => "Your name is Alice!"
+ # Conversational agent — freeform input via .ask
+ class RubyExpert < ApplicationAgent
+   model "claude-sonnet-4-5-20250929"
+   system "You are a senior Ruby developer with 20 years of experience."
+ end
+
+ result = RubyExpert.ask("What's the difference between proc and lambda?")
+ puts result.content
+
+ # Stream the response
+ RubyExpert.ask("Explain metaprogramming") { |chunk| print chunk.content }
  ```
 
  ```ruby
@@ -67,7 +69,7 @@ result = ChatAgent.call(
  class ReliableAgent < ApplicationAgent
    model "gpt-4o"
 
-   prompt "{query}"
+   user "{query}"
 
    on_failure do
      retries times: 3, backoff: :exponential
@@ -152,6 +154,19 @@ rails db:migrate
 
  ### Configure API Keys
 
+ Configure all provider API keys in one place (v2.1+):
+
+ ```ruby
+ # config/initializers/ruby_llm_agents.rb
+ RubyLLM::Agents.configure do |config|
+   config.openai_api_key = ENV["OPENAI_API_KEY"]
+   config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
+   config.gemini_api_key = ENV["GOOGLE_API_KEY"]
+ end
+ ```
+
+ Or use environment variables directly (auto-detected by RubyLLM):
+
  ```bash
  # .env
  OPENAI_API_KEY=sk-...
@@ -212,7 +227,7 @@ mount RubyLLM::Agents::Engine => "/agents"
 
  - **Ruby** >= 3.1.0
  - **Rails** >= 7.0
- - **RubyLLM** >= 1.11.0
+ - **RubyLLM** >= 1.12.0
 
  ## Contributing
 
@@ -76,6 +76,39 @@ module RubyLLM
    instance.call(&block)
  end
 
+ # Executes the agent with a freeform message as the user prompt
+ #
+ # Designed for conversational agents that define a persona (system +
+ # optional assistant prefill) but accept freeform input at runtime.
+ # Also works on template agents as an escape hatch to bypass the
+ # user template.
+ #
+ # @param message [String] The user message to send
+ # @param with [String, Array<String>, nil] Attachments (files, URLs)
+ # @param kwargs [Hash] Additional options (model:, temperature:, etc.)
+ # @yield [chunk] Yields chunks when streaming
+ # @return [Result] The processed response
+ #
+ # @example Basic usage
+ #   RubyExpert.ask("What is metaprogramming?")
+ #
+ # @example With streaming
+ #   RubyExpert.ask("Explain closures") { |chunk| print chunk.content }
+ #
+ # @example With attachments
+ #   RubyExpert.ask("What's in this image?", with: "photo.jpg")
+ #
+ def ask(message, with: nil, **kwargs, &block)
+   opts = kwargs.merge(_ask_message: message)
+   opts[:with] = with if with
+
+   if block
+     stream(**opts, &block)
+   else
+     call(**opts)
+   end
+ end
+
  # Returns the agent type for this class
  #
  # Used by middleware to determine which tracking/budget config to use.
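The `.ask` dispatch added above is easy to reason about in isolation: merge the freeform message into the options hash, then route to streaming or a plain call depending on whether a block was given. A minimal standalone sketch of that logic (`SketchAgent` and its return values are illustrative, not the gem's API):

```ruby
# Sketch of the .ask dispatch: merge the freeform message into the
# options, then pick a dispatch mode based on the presence of a block.
# Returns [mode, opts] instead of calling an LLM, purely for illustration.
class SketchAgent
  def self.ask(message, with: nil, **kwargs, &block)
    opts = kwargs.merge(_ask_message: message)
    opts[:with] = with if with
    block ? [:stream, opts] : [:call, opts]
  end
end

mode, opts = SketchAgent.ask("hi", with: "photo.jpg", temperature: 0.5)
```

Note that `with:` is only added to the options when given, so agents called without attachments never see the key.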
@@ -221,12 +254,13 @@ module RubyLLM
  # @param temperature [Float] Override the class-level temperature
  # @param options [Hash] Agent parameters defined via the param DSL
  def initialize(model: self.class.model, temperature: self.class.temperature, **options)
+   @ask_message = options.delete(:_ask_message)
    @model = model
    @temperature = temperature
    @options = options
    @tracked_tool_calls = []
    @pending_tool_call = nil
-   validate_required_params!
+   validate_required_params! unless @ask_message
  end
 
  # Executes the agent through the middleware pipeline
@@ -245,15 +279,21 @@ module RubyLLM
 
  # User prompt to send to the LLM
  #
- # If a class-level `prompt` DSL is defined (string template or block),
- # it will be used. Otherwise, subclasses must implement this method.
+ # Resolution order:
+ # 1. Subclass method override (standard Ruby dispatch, so this default is never called)
+ # 2. .ask(message) runtime message — bypasses template
+ # 3. Class-level `user` / `prompt` template — interpolated with {placeholders}
+ # 4. Inherited from superclass
+ # 5. NotImplementedError
  #
  # @return [String] The user prompt
  def user_prompt
-   prompt_config = self.class.prompt_config
-   return resolve_prompt_from_config(prompt_config) if prompt_config
+   return @ask_message if @ask_message
 
-   raise NotImplementedError, "#{self.class} must implement #user_prompt or use the prompt DSL"
+   config = self.class.user_config
+   return resolve_prompt_from_config(config) if config
+
+   raise NotImplementedError, "#{self.class} must implement #user_prompt, use the `user` DSL, or call with .ask(message)"
  end
 
  # System prompt for LLM instructions
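The resolution order documented in the hunk above reduces to a fallback chain: a runtime `.ask` message wins, then a class-level template, otherwise `NotImplementedError`. A standalone sketch under hypothetical names (not the gem's internals):

```ruby
# Sketch of the user_prompt fallback chain. A subclass method override
# would short-circuit all of this via ordinary Ruby dispatch.
class PromptSketch
  class << self
    attr_accessor :user_template # stands in for the `user` DSL
  end

  attr_writer :ask_message # stands in for the .ask runtime message

  def user_prompt
    return @ask_message if @ask_message

    template = self.class.user_template
    return template if template

    raise NotImplementedError, "no user prompt configured"
  end
end
```

The ordering matters: an `.ask` message must beat the template, otherwise conversational use of a template agent would be impossible.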
@@ -269,6 +309,19 @@ module RubyLLM
    nil
  end
 
+ # Assistant prefill to prime the model's response
+ #
+ # If a class-level `assistant` DSL is defined, it will be used.
+ # Otherwise returns nil (no prefill).
+ #
+ # @return [String, nil] The assistant prefill, or nil for none
+ def assistant_prompt
+   config = self.class.assistant_config
+   return resolve_prompt_from_config(config) if config
+
+   nil
+ end
+
  # Response schema for structured output
  #
  # Delegates to the class-level schema DSL by default.
@@ -381,6 +434,7 @@ module RubyLLM
  {
    temperature: temperature,
    system_prompt: system_prompt,
+   assistant_prefill: assistant_prompt,
    schema: schema,
    messages: resolved_messages,
    tools: resolved_tools,
@@ -419,11 +473,26 @@ module RubyLLM
 
  # Resolves messages for this execution
  #
+ # Conversation history passed via `messages:` takes precedence over
+ # the class-level messages. The assistant prefill is resolved
+ # separately (see #resolved_assistant_prefill) so it can be appended
+ # after the user prompt in the conversation.
+ #
  # @return [Array<Hash>] Messages to apply
  def resolved_messages
-   return @options[:messages] if @options[:messages]&.any?
+   msgs = @options[:messages]&.any? ? @options[:messages] : messages
+   msgs.dup
+ end
 
-   messages
+ # Returns the assistant prefill message if defined
+ #
+ # Called after the user prompt is sent to inject the prefill.
+ #
+ # @return [Hash, nil] The assistant prefill message hash, or nil
+ def resolved_assistant_prefill
+   prefill = assistant_prompt
+   return nil if prefill.nil? || (prefill.is_a?(String) && prefill.empty?)
+
+   { role: :assistant, content: prefill }
+ end
 
  # Returns whether streaming is enabled
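The prefill handling above boils down to a small, testable rule: skip nil or empty prefills, otherwise wrap the text in an assistant-role message appended after the user turn, so the model continues from it. A standalone sketch (the helper name is hypothetical):

```ruby
# Builds the message list for a prefill-style request: user message
# first, then the assistant prefill, so the model continues from the
# prefill text. Nil or empty prefills are skipped entirely.
def build_prefill_messages(user_text, prefill)
  msgs = [{ role: :user, content: user_text }]
  unless prefill.nil? || prefill.empty?
    msgs << { role: :assistant, content: prefill }
  end
  msgs
end
```

Skipping empty strings (not just nil) matters: an empty assistant message would still consume a turn without steering the model.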
@@ -446,6 +515,7 @@
    timeout: self.class.timeout,
    system_prompt: system_prompt,
    user_prompt: user_prompt,
+   assistant_prompt: assistant_prompt,
    attachments: @options[:with],
    schema: schema&.class&.name,
    streaming: self.class.streaming,
@@ -478,22 +548,14 @@ module RubyLLM
    end
  end
 
- # Resolves a prompt from DSL configuration (template string or block)
+ # Resolves a prompt from DSL configuration (template string)
  #
- # For string templates, interpolates {placeholder} with parameter values.
- # For blocks, evaluates in the instance context.
+ # Interpolates {placeholder} patterns with parameter values.
  #
- # @param config [String, Proc] The prompt configuration
+ # @param config [String] The prompt template
  # @return [String] The resolved prompt
  def resolve_prompt_from_config(config)
-   case config
-   when String
-     interpolate_template(config)
-   when Proc
-     instance_eval(&config)
-   else
-     config.to_s
-   end
+   interpolate_template(config)
  end
 
  # Interpolates {placeholder} patterns in a template string
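The `{placeholder}` interpolation that `interpolate_template` performs can be sketched with a single `gsub`. This is a guess at the behavior from the templates shown in the diff; in particular, raising on a missing parameter is an assumption here, and the gem may handle that case differently:

```ruby
# Sketch of {placeholder} interpolation: replace each {name} with the
# matching value from a params hash. Raising KeyError on a missing
# param is an assumption, not necessarily the gem's behavior.
def interpolate_template_sketch(template, params)
  template.gsub(/\{(\w+)\}/) do
    key = Regexp.last_match(1).to_sym
    params.fetch(key) { raise KeyError, "missing param: #{key}" }.to_s
  end
end
```

The same `\{(\w+)\}` pattern would also serve the auto-registration side: scanning a template for placeholder names is how `auto_register_params_from_template` could discover required params.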
@@ -546,32 +608,75 @@ module RubyLLM
 
  # Executes the LLM call
  #
+ # When an assistant prefill is defined, messages are added manually
+ # (user, then assistant) before calling complete, so the model
+ # continues from the prefill. Otherwise, uses the standard .ask flow.
+ #
  # @param client [RubyLLM::Chat] The configured client
  # @param context [Pipeline::Context] The execution context
  # @return [RubyLLM::Message] The response
  def execute_llm_call(client, context)
    timeout = self.class.timeout
-   ask_opts = {}
-   ask_opts[:with] = @options[:with] if @options[:with]
+   prefill = resolved_assistant_prefill
 
    Timeout.timeout(timeout) do
-     if streaming_enabled? && context.stream_block
-       execute_with_streaming(client, context, ask_opts)
+     if prefill
+       execute_with_prefill(client, context, prefill)
+     elsif streaming_enabled? && context.stream_block
+       execute_with_streaming(client, context)
      else
+       ask_opts = {}
+       ask_opts[:with] = @options[:with] if @options[:with]
        client.ask(user_prompt, **ask_opts)
      end
    end
  end
 
+ # Executes with assistant prefill
+ #
+ # Manually adds the user message and assistant prefill, then calls
+ # complete so the model continues from the prefill text.
+ #
+ # @param client [RubyLLM::Chat] The client
+ # @param context [Pipeline::Context] The context
+ # @param prefill [Hash] The assistant prefill message ({role:, content:})
+ # @return [RubyLLM::Message] The response
+ def execute_with_prefill(client, context, prefill)
+   # Use add_message + complete instead of .ask so the assistant
+   # prefill can be inserted between the user message and the
+   # completion. Note: attachments (:with) are not applied on this path.
+   client.add_message(role: :user, content: user_prompt)
+   client.add_message(**prefill)
+
+   if streaming_enabled? && context.stream_block
+     first_chunk_at = nil
+     started_at = context.started_at || Time.current
+
+     response = client.complete do |chunk|
+       first_chunk_at ||= Time.current
+       context.stream_block.call(chunk)
+     end
+
+     if first_chunk_at
+       context.time_to_first_token_ms = ((first_chunk_at - started_at) * 1000).to_i
+     end
+
+     response
+   else
+     client.complete
+   end
+ end
+
  # Executes with streaming enabled
  #
  # @param client [RubyLLM::Chat] The client
  # @param context [Pipeline::Context] The context
- # @param ask_opts [Hash] Options for the ask call
  # @return [RubyLLM::Message] The response
- def execute_with_streaming(client, context, ask_opts)
+ def execute_with_streaming(client, context)
    first_chunk_at = nil
    started_at = context.started_at || Time.current
+   ask_opts = {}
+   ask_opts[:with] = @options[:with] if @options[:with]
 
    response = client.ask(user_prompt, **ask_opts) do |chunk|
      first_chunk_at ||= Time.current
@@ -4,6 +4,6 @@ module RubyLLM
  module Agents
    # Current version of the RubyLLM::Agents gem
    # @return [String] Semantic version string
-   VERSION = "2.1.0"
+   VERSION = "3.0.0"
  end
end
@@ -7,17 +7,23 @@ module RubyLLM
  #
  # Provides common configuration methods that every agent type needs:
  # - model: The LLM model to use
- # - prompt: The user prompt (string with {placeholders} or block)
  # - system: System instructions
+ # - user: The user prompt (string with {placeholders})
+ # - assistant: Assistant prefill (string with optional {placeholders})
  # - description: Human-readable description
  # - timeout: Request timeout
  # - returns: Structured output schema
  #
- # @example Simplified DSL
+ # Two levels for defining prompts:
+ # - Class-level string/heredoc for static content
+ # - Instance method override for dynamic content
+ #
+ # @example Template agent (structured input via .call)
  #   class SearchAgent < RubyLLM::Agents::BaseAgent
  #     model "gpt-4o"
  #     system "You are a helpful search assistant."
- #     prompt "Search for: {query} (limit: {limit})"
+ #     user "Search for: {query} (limit: {limit})"
+ #     assistant '{"results":['
  #
  #     param :limit, default: 10 # Override auto-detected param
  #
@@ -29,10 +35,22 @@ module RubyLLM
  #     end
  #   end
  #
- # @example Dynamic prompt with block
- #   class SummaryAgent < RubyLLM::Agents::BaseAgent
- #     prompt do
- #       "Summarize in #{word_count} words: #{text}"
+ # @example Conversational agent (freeform input via .ask)
+ #   class RubyExpert < RubyLLM::Agents::BaseAgent
+ #     model "gpt-4o"
+ #     system "You are a senior Ruby developer."
+ #   end
+ #
+ #   RubyExpert.ask("What is metaprogramming?")
+ #
+ # @example Dynamic prompts with method overrides
+ #   class SmartAgent < RubyLLM::Agents::BaseAgent
+ #     def system_prompt
+ #       "You are helping #{company.name}. Today is #{Date.today}."
+ #     end
+ #
+ #     def user_prompt
+ #       "Question: #{params[:question]}"
  #     end
  #   end
  #
@@ -53,75 +71,131 @@ module RubyLLM
    @model || inherited_or_default(:model, default_model)
  end
 
- # Sets the user prompt template or block
+ # Sets the user prompt template
  #
  # When a string is provided, {placeholder} syntax is used to interpolate
  # parameters. Parameters are automatically registered (as required) unless
  # already defined with `param`.
  #
- # When a block is provided, it's evaluated in the instance context at
- # execution time, allowing access to all instance methods and parameters.
- #
  # @param template [String, nil] Prompt template with {placeholder} syntax
- # @yield Block that returns the prompt string (evaluated at execution time)
- # @return [String, Proc, nil] The current prompt configuration
+ # @return [String, nil] The current user prompt configuration
  #
  # @example With template string (parameters auto-detected)
-  #   prompt "Search for: {query} in {category}"
+ #   user "Search for: {query} in {category}"
  #   # Automatically registers :query and :category as required params
  #
- # @example With block for dynamic prompts
- #   prompt do
- #     base = "Analyze the following"
- #     base += " in #{language}" if language != "en"
- #     "#{base}: #{text}"
- #   end
+ # @example Multi-line with heredoc
+ #   user <<~S
+ #     Search for: {query}
+ #     Category: {category}
+ #     Limit: {limit}
+ #   S
  #
- def prompt(template = nil, &block)
+ def user(template = nil)
    if template
-     @prompt_template = template
+     @user_template = template
+     auto_register_params_from_template(template)
+   end
+   @user_template || inherited_or_default(:user_config, nil)
+ end
+
+ # Returns the user prompt configuration
+ #
+ # @return [String, nil] The user template, or nil
+ def user_config
+   @user_template || inherited_or_default(:user_config, nil)
+ end
+
+ # Backward-compatible alias for `user`
+ #
+ # @deprecated Use `user` instead.
+ # @param template [String, nil] Prompt template with {placeholder} syntax
+ # @return [String, nil] The current prompt configuration
+ def prompt(template = nil)
+   if template
+     Deprecations.warn(
+       "`prompt` is deprecated. Use `user` instead (e.g., `user \"#{template.truncate(40)}\"`).",
+       caller
+     )
+     @user_template = template
      auto_register_params_from_template(template)
-   elsif block
-     @prompt_block = block
    end
-   @prompt_template || @prompt_block || inherited_or_default(:prompt_config, nil)
+   @user_template || inherited_or_default(:user_config, nil)
  end
 
- # Returns the prompt configuration (template or block)
+ # Returns the prompt configuration (alias for user_config)
  #
- # @return [String, Proc, nil] The prompt template, block, or nil
+ # @deprecated Use `user_config` instead
+ # @return [String, nil] The prompt template, or nil
  def prompt_config
-   @prompt_template || @prompt_block || inherited_or_default(:prompt_config, nil)
+   user_config
  end
 
  # Sets the system prompt/instructions
  #
+ # When a string is provided, {placeholder} syntax is supported for
+ # parameter interpolation, same as the `user` DSL.
+ #
  # @param text [String, nil] System instructions for the LLM
- # @yield Block that returns the system prompt (evaluated at execution time)
- # @return [String, Proc, nil] The current system prompt
+ # @return [String, nil] The current system prompt
  #
  # @example Static system prompt
  #   system "You are a helpful assistant. Be concise and accurate."
  #
- # @example Dynamic system prompt
- #   system do
- #     "You are helping #{user_name}. Their preferences: #{preferences}"
+ # @example With placeholders
+ #   system "You are helping {user_name} with their {task}."
+ #
+ # @example Dynamic system prompt (method override)
+ #   def system_prompt
+ #     "You are helping #{user_name}. Today is #{Date.today}."
  #   end
  #
- def system(text = nil, &block)
+ def system(text = nil)
    if text
      @system_template = text
-   elsif block
-     @system_block = block
+     auto_register_params_from_template(text)
    end
-   @system_template || @system_block || inherited_or_default(:system_config, nil)
+   @system_template || inherited_or_default(:system_config, nil)
  end
 
  # Returns the system prompt configuration
  #
- # @return [String, Proc, nil] The system template, block, or nil
+ # @return [String, nil] The system template, or nil
  def system_config
-   @system_template || @system_block || inherited_or_default(:system_config, nil)
+   @system_template || inherited_or_default(:system_config, nil)
+ end
+
+ # Sets the assistant prefill string
+ #
+ # The assistant prefill is sent as the last message with the "assistant"
+ # role, priming the model to continue from that point. Useful for:
+ # - Forcing output format (e.g., starting with "{" for JSON)
+ # - Steering the response style
+ #
+ # Supports {placeholder} syntax for parameter interpolation.
+ #
+ # @param text [String, nil] The assistant prefill text
+ # @return [String, nil] The current assistant configuration
+ #
+ # @example Force JSON output
+ #   assistant '{"category":'
+ #
+ # @example With placeholders
+ #   assistant "Results for {query}:"
+ #
+ def assistant(text = nil)
+   if text
+     @assistant_template = text
+     auto_register_params_from_template(text)
+   end
+   @assistant_template || inherited_or_default(:assistant_config, nil)
+ end
+
+ # Returns the assistant prefill configuration
+ #
+ # @return [String, nil] The assistant template, or nil
+ def assistant_config
+   @assistant_template || inherited_or_default(:assistant_config, nil)
  end
 
  # Sets or returns the description for this agent class
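The "Force JSON output" use of `assistant` documented above relies on one property of prefills: the model generates only the continuation, so the full output is prefill + continuation. A standalone sketch of that reassembly (the continuation string below is a made-up stand-in for model output, and the helper name is hypothetical):

```ruby
require "json"

# Sketch: with an assistant prefill like '{"category":', the model
# continues from the prefill, so valid JSON is obtained by parsing
# prefill + continuation, never the continuation alone.
def assemble_prefilled_json(prefill, continuation)
  JSON.parse(prefill + continuation)
end

doc = assemble_prefilled_json('{"category":', ' "billing", "confidence": 0.92}')
```

This is also why callers of a prefilled agent may need to prepend the prefill before parsing: the raw response body starts mid-document.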
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: ruby_llm-agents
  version: !ruby/object:Gem::Version
-   version: 2.1.0
+   version: 3.0.0
  platform: ruby
  authors:
  - adham90