llm.rb 4.17.0 → 4.18.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: d834b30dd18d6bf83ccd2334bbedba0ee5a9f75d0d6d0616aefb7baa6be68ad0
- data.tar.gz: d8c8a1e43cc89d888ab1ea008baddce7e1427fc3598daf68ca04fd0eff1351b0
+ metadata.gz: 42357f605b8995c24722c0a2d285d98de0e99fe6f9687e2f4451339dcfbc2e87
+ data.tar.gz: 2953c742deeec6dc7c7acc9ed77f4520846a92299a4df94e415416d5c623033c
  SHA512:
- metadata.gz: f9fbe6f7080294c0500153dbfeb3c721407553289a2f34cd37e63552a84171bdc1bc130fb325aaf164f099c5fd6bfb417f550eea60dc126c20f940a7737e393a
- data.tar.gz: 67cef120d84c98ea695caab059d2bb008ca1cbdeb15a1b2f13c819166b1edd7a523c7213acabed5b35c44e7123ae85481c0eaff815d6254088c858e5a0516ae3
+ metadata.gz: 534fad616562768b0a851142f411117634d824a92187577d6baad4427c1e0eddf7a765dbfe3b662f76ba52011d760803793473a1881b4f69b8f4c7f9fc16bdc3
+ data.tar.gz: f1d90ad7f7b71f4817fcfa494f48a4e7f189560674d7b86d618f1f4e217e2a34189e15156e71ecdcfffe81e9588315f4759a82f4ec3eaed3846a29c77a906a66
data/CHANGELOG.md CHANGED
@@ -2,24 +2,75 @@
  ## Unreleased

+ Changes since `v4.18.0`.
+
+ ### Change
+
+ * **Make provider tracers default to the provider instance** <br>
+ Change `llm.tracer = ...` so it sets a provider default tracer instead of
+ relying on scoped fiber-local state alone. This makes tracer configuration
+ behave more predictably across normal tasks, threads, and fibers that share
+ the same provider instance.
+
+ * **Add `LLM::Provider#with_tracer` for scoped overrides** <br>
+ Add `with_tracer` as the opt-in escape hatch for request- or turn-scoped
+ tracer overrides. Use it when you want temporary tracing on the current
+ fiber without replacing the provider's default tracer.
+
+ * **Trace concurrent tool calls outside ractors** <br>
+ Make tool tracing fire correctly when functions run through `:thread`,
+ `:task`, or `:fiber` concurrency. Experimental `:ractor` execution still
+ does not emit tool tracer events.
+
+ * **Trace streamed tool calls, including MCP tools** <br>
+ Bind stream metadata through `LLM::Stream#extra` so streamed tool calls
+ inherit tracer and model context before they are handed to `on_tool_call`.
+ This restores tool tracing for streamed MCP and local tool execution.
+
+ * **Support symbol-based ORM option hooks** <br>
+ Let `provider:`, `context:`, and `tracer:` on the Sequel plugin and
+ the ActiveRecord `acts_as_llm` / `acts_as_agent` wrappers resolve through
+ model method names as well as procs.
+
+ * **Add experimental ractor tool concurrency** <br>
+ Add `:ractor` support to `LLM::Function#spawn`, `LLM::Function::Array#wait`,
+ `LLM::Stream#wait`, and `LLM::Agent.concurrency` so class-based tools with
+ ractor-safe arguments and return values can run in Ruby ractors and report
+ their results back into the normal LLM tool-return path. MCP tools are not
+ supported by the current `:ractor` mode, but mixed workloads can still
+ branch on `tool.mcp?` and choose a supported strategy per tool. `:ractor`
+ is especially useful for CPU-bound tools, while `:task`, `:fiber`, or
+ `:thread` may be a better fit for I/O-bound work.
+
+ ## v4.18.0
+
  Changes since `v4.17.0`.

+ This release improves tracing and tool execution behavior across llm.rb.
+ It makes provider tracers default to the provider instance, adds
+ `LLM::Provider#with_tracer` for scoped overrides, restores tool tracing for
+ concurrent and streamed tool execution, extends streamed tracing to MCP tools,
+ and adds symbol-based ORM option hooks alongside experimental ractor tool
+ concurrency.
+
  ## v4.17.0

  Changes since `v4.16.1`.

  This release expands agent support across llm.rb. It brings `LLM::Agent`
- closer to `LLM::Context`, adds configurable automatic tool concurrency,
+ closer to `LLM::Context`, adds configurable automatic tool concurrency
+ including experimental ractor support for class-based tools,
  extends persisted ORM wrappers with more of the context runtime surface and
- fiber-local tracer hooks, and introduces built-in ActiveRecord agent
- persistence through `acts_as_agent`.
+ tracer hooks, and introduces built-in ActiveRecord agent persistence through
+ `acts_as_agent`.

  ### Change

  * **Add configurable tool concurrency to `LLM::Agent`** <br>
  Add the class-level `concurrency` DSL to `LLM::Agent` so automatic
- tool loops can run with `:call`, `:thread`, `:task`, or `:fiber`
- instead of always executing sequentially.
+ tool loops can run with `:call`, `:thread`, `:task`, `:fiber`, or
+ experimental `:ractor` support for class-based tools instead of
+ always executing sequentially.

  * **Bring `LLM::Agent` closer to `LLM::Context`** <br>
  Expand `LLM::Agent` so it exposes more of the same runtime surface as
@@ -35,8 +86,8 @@ persistence through `acts_as_agent`.

  * **Add ORM tracer hooks for persisted contexts** <br>
  Add `tracer:` to both the Sequel plugin and `acts_as_llm` so models
- can resolve and assign fiber-local tracers onto the provider used by
- their persisted `LLM::Context`.
+ can resolve and assign tracers onto the provider used by their persisted
+ `LLM::Context`.

  * **Bring persisted ORM wrappers closer to `LLM::Context`** <br>
  Expand both the Sequel plugin and `acts_as_llm` so record-backed
data/README.md CHANGED
@@ -4,7 +4,7 @@
  <p align="center">
  <a href="https://0x1eef.github.io/x/llm.rb?rebuild=1"><img src="https://img.shields.io/badge/docs-0x1eef.github.io-blue.svg" alt="RubyDoc"></a>
  <a href="https://opensource.org/license/0bsd"><img src="https://img.shields.io/badge/License-0BSD-orange.svg?" alt="License"></a>
- <a href="https://github.com/llmrb/llm.rb/tags"><img src="https://img.shields.io/badge/version-4.17.0-green.svg?" alt="Version"></a>
+ <a href="https://github.com/llmrb/llm.rb/tags"><img src="https://img.shields.io/badge/version-4.18.0-green.svg?" alt="Version"></a>
  </p>

  ## About
@@ -23,6 +23,8 @@ pieces only when needed, includes built-in ActiveRecord support through
  long-lived, tool-capable, stateful AI workflows instead of just
  request/response helpers.

+ Want to see some code? Jump to [the examples](#examples) section.
+
  ## Architecture

  <p align="center">
@@ -66,7 +68,10 @@ same context object.
  - **Agents auto-manage tool execution** <br>
  Use `LLM::Agent` when you want the same stateful runtime surface as
  `LLM::Context`, but with tool loops executed automatically according to a
- configured concurrency mode such as `:call`, `:thread`, `:task`, or `:fiber`.
+ configured concurrency mode such as `:call`, `:thread`, `:task`, `:fiber`,
+ or experimental `:ractor` support for class-based tools. MCP tools are not
+ supported by the current `:ractor` mode, but mixed tool sets can still
+ route MCP tools and local tools through different strategies at runtime.
  - **Tool calls have an explicit lifecycle** <br>
  A tool call can be executed, cancelled through
  [`LLM::Function#cancel`](https://0x1eef.github.io/x/llm.rb/LLM/Function.html#cancel-instance_method),
@@ -78,7 +83,12 @@ same context object.
  [`LLM::Context#cancel!`](https://0x1eef.github.io/x/llm.rb/LLM/Context.html#cancel-21-instance_method)
  is inspired by Go's context cancellation model.
  - **Concurrency is a first-class feature** <br>
- Use threads, fibers, or async tasks without rewriting your tool layer.
+ Use threads, fibers, async tasks, or experimental ractors without
+ rewriting your tool layer. The current `:ractor` mode is for class-based
+ tools and does not support MCP tools, but mixed workloads can branch on
+ `tool.mcp?` and choose a supported strategy per tool. `:ractor` is
+ especially useful for CPU-bound tools, while `:task`, `:fiber`, or
+ `:thread` may be a better fit for I/O-bound work.
  - **Advanced workloads are built in, not bolted on** <br>
  Streaming, concurrent tool execution, persistence, tracing, and MCP support
  all fit the same runtime model.
@@ -199,6 +209,7 @@ The `plugin :llm` integration wraps [`LLM::Context`](https://0x1eef.github.io/x/

  ```ruby
  require "llm"
+ require "net/http/persistent"
  require "sequel"
  require "sequel/plugins/llm"

@@ -218,6 +229,7 @@ provides full control over tool execution. <br> See the [deepdive](https://0x1ee

  ```ruby
  require "llm"
+ require "net/http/persistent"
  require "active_record"
  require "llm/active_record"

@@ -237,6 +249,7 @@ manages tool execution for you. <br> See the [deepdive](https://0x1eef.github.io

  ```ruby
  require "llm"
+ require "net/http/persistent"
  require "active_record"
  require "llm/active_record"

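The per-tool routing the README describes (branching on `tool.mcp?`) can be sketched with a stand-in object; `Tool` below is a hypothetical stub, not the llm.rb class:

```ruby
# Hypothetical stub standing in for a tool object; only the mcp? predicate
# and a spawn method that records its chosen strategy are modeled here.
Tool = Struct.new(:name, :mcp) do
  def mcp? = mcp
  def spawn(strategy) = [name, strategy]
end

tools = [Tool.new("search", true), Tool.new("resize", false)]

# Route MCP tools through :task and local class-based tools through :ractor.
routed = tools.map { |t| t.mcp? ? t.spawn(:task) : t.spawn(:ractor) }
```

The same branch works anywhere a strategy is chosen per tool, since the decision happens before any concurrency primitive is created.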
@@ -68,9 +68,9 @@ module LLM::ActiveRecord
  # Storage format for the serialized agent state. Use `:string` for text
  # columns, or `:json` / `:jsonb` for structured JSON columns with
  # ActiveRecord JSON typecasting enabled.
- # @option options [Proc, LLM::Tracer, nil] :tracer
- # Optional tracer or proc that resolves to one and is assigned through
- # `llm.tracer = ...` on the resolved provider.
+ # @option options [Proc, Symbol, LLM::Tracer, nil] :tracer
+ # Optional tracer, method name, or proc that resolves to one and is
+ # assigned through `llm.tracer = ...` on the resolved provider.
  # @return [void]
  def acts_as_agent(options = EMPTY_HASH)
  options = DEFAULTS.merge(options)
@@ -90,7 +90,8 @@ module LLM::ActiveRecord
  options = self.class.llm_agent_options
  provider = self[columns[:provider_column]]
  kwargs = resolve_options(options[:provider])
- @llm ||= LLM.method(provider).call(**kwargs)
+ return @llm if @llm
+ @llm = LLM.method(provider).call(**kwargs)
  @llm.tracer = resolve_option(options[:tracer]) if options[:tracer]
  @llm
  end
@@ -134,6 +135,7 @@ module LLM::ActiveRecord
  def resolve_option(option)
  case option
  when Proc then instance_exec(&option)
+ when Symbol then send(option)
  when Hash then option.dup
  else option
  end
@@ -13,8 +13,8 @@ module LLM::ActiveRecord
  # default) or as a structured object (`format: :json` / `:jsonb`) for
  # databases such as PostgreSQL that can persist JSON natively.
  # `:json` and `:jsonb` expect a real JSON column type with ActiveRecord
- # handling JSON typecasting for the model. A `tracer:` proc can also be
- # configured to assign a fiber-local tracer onto the resolved provider.
+ # handling JSON typecasting for the model. `provider:`, `context:`, and
+ # `tracer:` can also be configured as symbols that are called on the model.
  module ActsAsLLM
  EMPTY_HASH = {}.freeze
  DEFAULT_USAGE_COLUMNS = {
@@ -54,9 +54,9 @@ module LLM::ActiveRecord
  # Storage format for the serialized context. Use `:string` for text
  # columns, or `:json` / `:jsonb` for structured JSON columns with
  # ActiveRecord JSON typecasting enabled.
- # @option options [Proc, LLM::Tracer, nil] :tracer
- # Optional tracer or proc that resolves to one and is assigned through
- # `llm.tracer = ...` on the resolved provider.
+ # @option options [Proc, Symbol, LLM::Tracer, nil] :tracer
+ # Optional tracer, method name, or proc that resolves to one and is
+ # assigned through `llm.tracer = ...` on the resolved provider.
  # @return [void]
  def acts_as_llm(options = EMPTY_HASH)
  options = DEFAULTS.merge(options)
@@ -215,7 +215,8 @@ module LLM::ActiveRecord
  options = self.class.llm_plugin_options
  provider = self[columns[:provider_column]]
  kwargs = resolve_options(options[:provider])
- @llm ||= LLM.method(provider).call(**kwargs)
+ return @llm if @llm
+ @llm = LLM.method(provider).call(**kwargs)
  @llm.tracer = resolve_option(options[:tracer]) if options[:tracer]
  @llm
  end
@@ -259,6 +260,7 @@ module LLM::ActiveRecord
  def resolve_option(option)
  case option
  when Proc then instance_exec(&option)
+ when Symbol then send(option)
  when Hash then option.dup
  else option
  end
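The symbol-based option hooks work because `resolve_option` gained a `Symbol` branch that calls the named method on the model. A minimal self-contained sketch of that dispatch (the `Model` class and `build_tracer` method are illustrative, not part of the library):

```ruby
class Model
  # Illustrative model method that a tracer: :build_tracer option would name.
  def build_tracer = :model_tracer

  # Mirrors the dispatch added in this release: Proc, Symbol, Hash, or value.
  def resolve_option(option)
    case option
    when Proc then instance_exec(&option)   # evaluated in the model's context
    when Symbol then send(option)           # new: call the named model method
    when Hash then option.dup
    else option
    end
  end
end
```

Because the `Proc` branch uses `instance_exec` and the `Symbol` branch uses `send`, both forms resolve against the same receiver, so switching between them is transparent to callers.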
data/lib/llm/agent.rb CHANGED
@@ -17,7 +17,7 @@ module LLM
  # * Instructions are injected only on the first request.
  # * An agent automatically executes tool loops (unlike {LLM::Context LLM::Context}).
  # * Tool loop execution can be configured with `concurrency :call`,
- # `:thread`, `:task`, or `:fiber`.
+ # `:thread`, `:task`, `:fiber`, or `:ractor`.
  #
  # @example
  # class SystemAdmin < LLM::Agent
@@ -89,6 +89,8 @@ module LLM
  # - `:thread`: concurrent threads
  # - `:task`: concurrent async tasks
  # - `:fiber`: concurrent raw fibers
+ # - `:ractor`: concurrent Ruby ractors for class-based tools; MCP tools are not supported,
+ # and this mode is especially useful for CPU-bound tool work
  # @return [Symbol, nil]
  def self.concurrency(concurrency = nil)
  return @concurrency if concurrency.nil?
@@ -346,8 +348,8 @@ module LLM
  def call_functions
  case concurrency || :call
  when :call then call(:functions)
- when :thread, :task, :fiber then wait(concurrency)
- else raise ArgumentError, "Unknown concurrency: #{concurrency.inspect}. Expected :call, :thread, :task, or :fiber"
+ when :thread, :task, :fiber, :ractor then wait(concurrency)
+ else raise ArgumentError, "Unknown concurrency: #{concurrency.inspect}. Expected :call, :thread, :task, :fiber, or :ractor"
  end
  end
  end
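A rough sketch of what `:ractor` concurrency amounts to for class-based tools: each runner class is instantiated and called inside its own Ractor, and results are collected in order. `Adder` and `run_group` are illustrative names; real execution goes through `LLM::Function::Ractor::Task`, not this helper.

```ruby
# Illustrative class-based tool: instantiable inside a Ractor, called with
# keyword arguments, and returning a ractor-safe value.
class Adder
  def call(a:, b:) = a + b
end

# Run one Ractor per call and collect the results in order, roughly what a
# group-style #wait does. Arguments are copied into each Ractor.
def run_group(runner_class, calls)
  ractors = calls.map do |kwargs|
    Ractor.new(runner_class, kwargs) do |klass, args|
      klass.new.call(**args)
    end
  end
  # Ractor#take was the classic API; newer Rubies use Ractor.select instead.
  ractors.map { |r| r.respond_to?(:take) ? r.take : Ractor.select(r).last }
end
```

Note the constraint the changelog calls out: arguments and return values must be ractor-shareable or copyable, which is why this mode is limited to class-based tools.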
data/lib/llm/context.rb CHANGED
@@ -86,6 +86,7 @@ module LLM
  return respond(prompt, params) if mode == :responses
  params = params.merge(messages: @messages.to_a)
  params = @params.merge(params)
+ bind!(params[:stream], params[:model])
  res = @llm.complete(prompt, params)
  role = params[:role] || @llm.user_role
  role = @llm.tool_role if params[:role].nil? && [*prompt].grep(LLM::Function::Return).any?
@@ -110,6 +111,7 @@ module LLM
  # puts res.output_text
  def respond(prompt, params = {})
  params = @params.merge(params)
+ bind!(params[:stream], params[:model])
  res_id = params[:store] == false ? nil : @messages.find(&:assistant?)&.response&.response_id
  params = params.merge(previous_response_id: res_id, input: @messages.to_a).compact
  res = @llm.responses.create(prompt, params)
@@ -356,6 +358,14 @@ module LLM
  rescue LLM::NoSuchModelError, LLM::NoSuchRegistryError
  0
  end
+
+ private
+
+ def bind!(stream, model)
+ return unless LLM::Stream === stream
+ stream.extra[:tracer] = tracer
+ stream.extra[:model] = model
+ end
  end

  # Backward-compatible alias
@@ -27,8 +27,9 @@ class LLM::Function
  # - `:thread`: Use threads
  # - `:task`: Use async tasks (requires async gem)
  # - `:fiber`: Use raw fibers
+ # - `:ractor`: Use Ruby ractors (class-based tools only; MCP tools are not supported)
  #
- # @return [LLM::Function::ThreadGroup, LLM::Function::TaskGroup, LLM::Function::FiberGroup]
+ # @return [LLM::Function::ThreadGroup, LLM::Function::TaskGroup, LLM::Function::FiberGroup, LLM::Function::Ractor::Group]
  def spawn(strategy)
  case strategy
  when :task
@@ -37,8 +38,10 @@ class LLM::Function
  ThreadGroup.new(map { |fn| fn.spawn(:thread) })
  when :fiber
  FiberGroup.new(map { |fn| fn.spawn(:fiber) })
+ when :ractor
+ Ractor::Group.new(map { |fn| fn.spawn(:ractor) })
  else
- raise ArgumentError, "Unknown strategy: #{strategy.inspect}. Expected :thread, :task, or :fiber"
+ raise ArgumentError, "Unknown strategy: #{strategy.inspect}. Expected :thread, :task, :fiber, or :ractor"
  end
  end

@@ -51,6 +54,7 @@ class LLM::Function
  # - `:thread`: Use threads
  # - `:task`: Use async tasks (requires async gem)
  # - `:fiber`: Use raw fibers
+ # - `:ractor`: Use Ruby ractors (class-based tools only; MCP tools are not supported)
  #
  # @return [Array<LLM::Function::Return>]
  # Returns values to be reported back to the LLM.
@@ -0,0 +1,59 @@
+ # frozen_string_literal: true
+
+ class LLM::Function
+ ##
+ # The {LLM::Function::Ractor::Job} class manages execution and mailbox
+ # coordination for a single ractor-backed function call.
+ class Ractor::Job
+ ##
+ # @param [::Ractor] mailbox
+ # @param [Class] runner_class
+ # @param [String, nil] id
+ # @param [String] name
+ # @param [Hash, Array, nil] arguments
+ # @return [LLM::Function::Ractor::Job]
+ def initialize(mailbox, runner_class, id, name, arguments)
+ @mailbox = mailbox
+ @runner_class = runner_class
+ @id = id
+ @name = name
+ @arguments = arguments
+ end
+
+ ##
+ # @return [void]
+ def call
+ spawn
+ wait
+ end
+
+ private
+
+ def wait
+ done = false
+ result = nil
+ waiters = []
+ loop do
+ case ::Ractor.receive
+ in [:done, *result]
+ done = true
+ waiters.each { _1.send(result) }
+ waiters.clear
+ in [:alive?, reply]
+ reply.send(!done)
+ in [:wait, reply]
+ done ? reply.send(result) : waiters << reply
+ end
+ end
+ end
+
+ def spawn
+ ::Ractor.new(@mailbox, @runner_class, @id, @name, @arguments) do |mailbox, runner_class, id, name, arguments|
+ kwargs = Hash === arguments ? arguments.transform_keys(&:to_sym) : arguments
+ mailbox.send([:done, id, name, runner_class.new.call(**kwargs)])
+ rescue => ex
+ mailbox.send([:done, id, name, {error: true, type: ex.class.name, message: ex.message}])
+ end
+ end
+ end
+ end
@@ -0,0 +1,39 @@
+ # frozen_string_literal: true
+
+ class LLM::Function
+ ##
+ # The {LLM::Function::Ractor::Mailbox} class manages the mailbox protocol
+ # for a single ractor-backed function call.
+ class Ractor::Mailbox
+ ##
+ # @return [::Ractor]
+ attr_reader :task
+
+ ##
+ # @param [::Ractor] task
+ # @return [LLM::Function::Ractor::Mailbox]
+ def initialize(task)
+ @task = task
+ end
+
+ ##
+ # @return [Boolean]
+ def alive?
+ request(:alive?)
+ end
+
+ ##
+ # @return [Array]
+ def wait
+ request(:wait)
+ end
+
+ private
+
+ def request(type)
+ reply = ::Ractor.new { ::Ractor.receive }
+ task.send([type, reply])
+ reply.respond_to?(:take) ? reply.take : ::Ractor.select(reply).last
+ end
+ end
+ end
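The mailbox protocol above can be exercised standalone: a worker ractor answers `:alive?` and `:wait` requests through throwaway reply ractors, the same request/reply shape as `Mailbox#request`. This is an illustrative sketch, not the library classes.

```ruby
# Worker ractor implementing the done/alive?/wait mailbox protocol: it marks
# itself done on [:done, value] and services queued waiters.
worker = Ractor.new do
  done = false
  result = nil
  waiters = []
  loop do
    case Ractor.receive
    in [:done, value]
      done = true
      result = value
      waiters.each { |w| w.send(result) }
      waiters.clear
    in [:alive?, reply]
      reply.send(!done)
    in [:wait, reply]
      done ? reply.send(result) : waiters << reply
    end
  end
end

# One-shot request helper mirroring Mailbox#request: a fresh reply ractor
# receives a single message and terminates with it as its value.
request = lambda do |type|
  reply = Ractor.new { Ractor.receive }
  worker.send([type, reply])
  reply.respond_to?(:take) ? reply.take : Ractor.select(reply).last
end
```

The throwaway reply ractor is the design trick: Ractor mailboxes are single-consumer, so each request gets its own addressable inbox rather than sharing one.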
@@ -0,0 +1,45 @@
+ # frozen_string_literal: true
+
+ class LLM::Function
+ ##
+ # The {LLM::Function::Ractor::Task} class wraps a ractor-backed function
+ # call and delegates mailbox coordination to
+ # {LLM::Function::Ractor::Mailbox}.
+ class Ractor::Task
+ ##
+ # @return [LLM::Function::Ractor::Mailbox]
+ attr_reader :mailbox
+
+ ##
+ # @param [Class] runner_class
+ # @param [String, nil] id
+ # @param [String] name
+ # @param [Hash, Array, nil] arguments
+ # @return [LLM::Function::Ractor::Task]
+ def initialize(runner_class, id, name, arguments)
+ @mailbox = Ractor::Mailbox.new(build_task(runner_class, id, name, arguments))
+ end
+
+ ##
+ # @return [Boolean]
+ def alive?
+ mailbox.alive?
+ end
+
+ ##
+ # @return [LLM::Function::Return]
+ def wait
+ id, name, value = mailbox.wait
+ Return.new(id, name, value)
+ end
+ alias_method :value, :wait
+
+ private
+
+ def build_task(runner_class, id, name, arguments)
+ ::Ractor.new(runner_class, id, name, arguments) do |runner_class, id, name, arguments|
+ LLM::Function::Ractor::Job.new(::Ractor.current, runner_class, id, name, arguments).call
+ end
+ end
+ end
+ end
@@ -0,0 +1,9 @@
+ # frozen_string_literal: true
+
+ class LLM::Function
+ module Ractor
+ require_relative "ractor/mailbox"
+ require_relative "ractor/job"
+ require_relative "ractor/task"
+ end
+ end
@@ -0,0 +1,29 @@
+ # frozen_string_literal: true
+
+ class LLM::Function
+ ##
+ # The {LLM::Function::Ractor::Group} class wraps an array of
+ # {LLM::Function::Ractor::Task} objects that are running
+ # {LLM::Function} calls concurrently.
+ class Ractor::Group
+ ##
+ # @param [Array<LLM::Function::Task>] tasks
+ # @return [LLM::Function::Ractor::Group]
+ def initialize(tasks)
+ @tasks = tasks
+ end
+
+ ##
+ # @return [Boolean]
+ def alive?
+ @tasks.any?(&:alive?)
+ end
+
+ ##
+ # @return [Array<LLM::Function::Return>]
+ def wait
+ @tasks.map(&:wait)
+ end
+ alias_method :value, :wait
+ end
+ end
@@ -14,7 +14,7 @@ class LLM::Function
  attr_reader :function

  ##
- # @param [Thread, Fiber, Async::Task] task
+ # @param [Thread, Fiber, Async::Task, Ractor, LLM::Function::Ractor::Task] task
  # @param [LLM::Function, nil] function
  # @return [LLM::Function::Task]
  def initialize(task, function = nil)
@@ -25,7 +25,8 @@ class LLM::Function
  ##
  # @return [Boolean]
  def alive?
- task.alive?
+ return task.alive? if task.respond_to?(:alive?)
+ false
  end

  ##
data/lib/llm/function.rb CHANGED
@@ -36,6 +36,8 @@ class LLM::Function
  require_relative "function/thread_group"
  require_relative "function/fiber_group"
  require_relative "function/task_group"
+ require_relative "function/ractor"
+ require_relative "function/ractor_group"

  extend LLM::Function::Registry
  prepend LLM::Function::Tracing
@@ -174,6 +176,7 @@ class LLM::Function
  # - `:thread`: Use threads
  # - `:task`: Use async tasks (requires async gem)
  # - `:fiber`: Use raw fibers
+ # - `:ractor`: Use Ruby ractors (class-based tools only; MCP tools are not supported)
  #
  # @return [LLM::Function::Task]
  # Returns a task whose `#value` is an {LLM::Function::Return}.
@@ -181,17 +184,20 @@ class LLM::Function
  task = case strategy
  when :task
  require "async" unless defined?(::Async)
- Async { call_function }
+ Async { call }
  when :thread
- Thread.new { call_function }
+ Thread.new { call }
  when :fiber
  Fiber.new do
- call_function
+ call
  ensure
  Fiber.yield
  end.tap(&:resume)
+ when :ractor
+ raise ArgumentError, "Ractor concurrency only supports class-based tools" unless Class === @runner
+ Ractor::Task.new(@runner, id, name, arguments)
  else
- raise ArgumentError, "Unknown strategy: #{strategy.inspect}. Expected :thread, :task, or :fiber"
+ raise ArgumentError, "Unknown strategy: #{strategy.inspect}. Expected :thread, :task, :fiber, or :ractor"
  end
  Task.new(task, self)
  ensure
data/lib/llm/provider.rb CHANGED
@@ -271,34 +271,48 @@ class LLM::Provider

  ##
  # @return [LLM::Tracer]
- # Returns a fiber-local tracer
+ # Returns the current scoped tracer override or provider default tracer
  def tracer
- weakmap[self] || LLM::Tracer::Null.new(self)
+ weakmap[self] || @tracer || LLM::Tracer::Null.new(self)
  end

  ##
- # Set a fiber-local tracer
+ # Set the provider's default tracer
+ # This tracer is shared by the provider instance and becomes the fallback
+ # whenever no scoped override is active.
  # @example
  # llm = LLM.openai(key: ENV["KEY"])
- # Thread.new do
- # llm.tracer = LLM::Tracer::Logger.new(llm, path: "/path/to/log/1.txt")
- # end
- # Thread.new do
- # llm.tracer = LLM::Tracer::Logger.new(llm, path: "/path/to/log/2.txt")
- # end
- # # ...
+ # llm.tracer = LLM::Tracer::Logger.new(llm, path: "/path/to/log.txt")
  # @param [LLM::Tracer] tracer
  # A tracer
  # @return [void]
  def tracer=(tracer)
- if tracer.nil?
- if weakmap.respond_to?(:delete)
- weakmap.delete(self)
- else
- weakmap[self] = nil
- end
+ @tracer = tracer
+ end
+
+ ##
+ # Override the tracer for the current fiber while the block runs.
+ # This is useful when you want per-request or per-turn tracing without
+ # replacing the provider's default tracer.
+ # @example
+ # llm.with_tracer(LLM::Tracer::Logger.new(llm, io: $stdout)) do
+ # llm.complete("hello", model: "gpt-5.4-mini")
+ # end
+ # @param [LLM::Tracer] tracer
+ # @yield
+ # @return [Object]
+ def with_tracer(tracer)
+ had_override = weakmap.key?(self)
+ previous = weakmap[self]
+ weakmap[self] = tracer
+ yield
+ ensure
+ if had_override
+ weakmap[self] = previous
+ elsif weakmap.respond_to?(:delete)
+ weakmap.delete(self)
  else
- weakmap[self] = tracer
+ weakmap[self] = nil
  end
  end

@@ -109,6 +109,8 @@ class LLM::Anthropic
  fn = (registered || LLM::Function.new(tool["name"])).dup.tap do |fn|
  fn.id = tool["id"]
  fn.arguments = LLM::Anthropic.parse_tool_input(tool["input"])
+ fn.tracer = @stream.extra[:tracer]
+ fn.model = @stream.extra[:model]
  end
  [fn, (registered ? nil : @stream.tool_not_found(fn))]
  end
@@ -157,6 +157,8 @@ class LLM::Google
  fn = (registered || LLM::Function.new(call["name"])).dup.tap do |fn|
  fn.id = LLM::Google.tool_id(part:, cindex:, pindex:)
  fn.arguments = call["args"]
+ fn.tracer = @stream.extra[:tracer]
+ fn.model = @stream.extra[:model]
  end
  [fn, (registered ? nil : @stream.tool_not_found(fn))]
  end
@@ -273,6 +273,8 @@ class LLM::OpenAI
  fn = (registered || LLM::Function.new(tool["name"])).dup.tap do |fn|
  fn.id = tool["call_id"]
  fn.arguments = arguments
+ fn.tracer = @stream.extra[:tracer]
+ fn.model = @stream.extra[:model]
  end
  [fn, (registered ? nil : @stream.tool_not_found(fn))]
  end
@@ -189,6 +189,8 @@ class LLM::OpenAI
  fn = (registered || LLM::Function.new(function["name"])).dup.tap do |fn|
  fn.id = tool["id"]
  fn.arguments = arguments
+ fn.tracer = @stream.extra[:tracer]
+ fn.model = @stream.extra[:model]
  end
  [fn, (registered ? nil : @stream.tool_not_found(fn))]
  end
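The provider tracer changes earlier in this diff (provider default via `tracer=`, scoped override via `with_tracer`) can be modeled without the library. `FakeProvider` below is a hypothetical stand-in that uses a fiber-local Hash where the real code uses a weakmap:

```ruby
class FakeProvider
  attr_writer :tracer          # provider-wide default tracer

  # Resolution order: fiber-scoped override, then default, then a null tracer.
  def tracer
    overrides[self] || @tracer || :null
  end

  # Temporarily override the tracer on the current fiber, then restore the
  # previous override (or remove the entry if there was none).
  def with_tracer(tracer)
    had_override = overrides.key?(self)
    previous = overrides[self]
    overrides[self] = tracer
    yield
  ensure
    if had_override
      overrides[self] = previous
    else
      overrides.delete(self)
    end
  end

  private

  # Thread.current[] storage is fiber-local in Ruby, approximating the
  # per-fiber weakmap used by LLM::Provider.
  def overrides
    Thread.current[:tracer_overrides] ||= {}
  end
end
```

Tracking whether an override already existed (rather than restoring `nil` blindly) is what makes nested `with_tracer` calls unwind correctly.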
@@ -13,8 +13,8 @@ module LLM::Sequel
  # default) or as a structured object (`format: :json` / `:jsonb`) for
  # databases such as PostgreSQL that can persist JSON natively.
  # `:json` and `:jsonb` expect a real JSON column type with Sequel handling
- # JSON typecasting for the model. A `tracer:` proc can also be configured
- # to assign a fiber-local tracer onto the resolved provider.
+ # JSON typecasting for the model. `provider:`, `context:`, and `tracer:`
+ # can also be configured as symbols that are called on the model.
  module Plugin
  EMPTY_HASH = {}.freeze
  DEFAULT_USAGE_COLUMNS = {
@@ -61,9 +61,9 @@ module LLM::Sequel
  # Storage format for the serialized context. Use `:string` for text
  # columns, or `:json` / `:jsonb` for structured JSON columns with Sequel
  # JSON typecasting enabled.
- # @option options [Proc, LLM::Tracer, nil] :tracer
- # Optional tracer or proc that resolves to one and is assigned through
- # `llm.tracer = ...` on the resolved provider.
+ # @option options [Proc, Symbol, LLM::Tracer, nil] :tracer
+ # Optional tracer, method name, or proc that resolves to one and is
+ # assigned through `llm.tracer = ...` on the resolved provider.
  # @return [void]
  def self.configure(model, options = EMPTY_HASH)
  options = DEFAULTS.merge(options)
@@ -233,7 +233,8 @@ module LLM::Sequel
  options = self.class.llm_plugin_options
  provider = self[columns[:provider_column]]
  kwargs = resolve_options(options[:provider])
- @llm ||= LLM.method(provider).call(**kwargs)
+ return @llm if @llm
+ @llm = LLM.method(provider).call(**kwargs)
  @llm.tracer = resolve_option(options[:tracer]) if options[:tracer]
  @llm
  end
@@ -276,6 +277,7 @@ module LLM::Sequel
  def resolve_option(option)
  case option
  when Proc then instance_exec(&option)
+ when Symbol then send(option)
  when Hash then option.dup
  else option
  end
@@ -38,6 +38,7 @@ class LLM::Stream
  # - `:thread`: Use threads
  # - `:task`: Use async tasks (requires async gem)
  # - `:fiber`: Use raw fibers
+ # - `:ractor`: Use Ruby ractors (class-based tools only; MCP tools are not supported)
  # @return [Array<LLM::Function::Return>]
  def wait(strategy)
  returns, tasks = @items.shift(@items.length).partition { LLM::Function::Return === _1 }
@@ -45,7 +46,8 @@ class LLM::Stream
  when :thread then LLM::Function::ThreadGroup.new(tasks).wait
  when :task then LLM::Function::TaskGroup.new(tasks).wait
  when :fiber then LLM::Function::FiberGroup.new(tasks).wait
- else raise ArgumentError, "Unknown strategy: #{strategy.inspect}. Expected :thread, :task, or :fiber"
+ when :ractor then LLM::Function::Ractor::Group.new(tasks).wait
+ else raise ArgumentError, "Unknown strategy: #{strategy.inspect}. Expected :thread, :task, :fiber, or :ractor"
  end
  returns.concat fire_hooks(tasks, results)
  end
data/lib/llm/stream.rb CHANGED
@@ -22,6 +22,13 @@ module LLM
  class Stream
  require_relative "stream/queue"

+ ##
+ # Returns extra context associated with the current streamed request.
+ # @return [Hash]
+ def extra
+ @extra ||= LLM::Object.from({})
+ end
+
  ##
  # Returns a lazily-initialized queue for tool results or spawned work.
  # @return [LLM::Stream::Queue]
@@ -63,13 +70,16 @@ module LLM
  # Called when a streamed tool call has been fully constructed.
  # @note A stream implementation may start tool execution here, for
  # example by pushing `tool.spawn(:thread)`, `tool.spawn(:fiber)`, or
- # `tool.spawn(:task)` onto {#queue}. When a streamed tool cannot be
- # resolved, `error` is passed as an {LLM::Function::Return}. It can be
- # sent back to the model, allowing the tool-call path to recover and the
- # session to continue. Tool resolution depends on
+ # `tool.spawn(:task)` onto {#queue}. Mixed strategies can also be
+ # selected per tool, such as `tool.mcp? ? tool.spawn(:task) :
+ # tool.spawn(:ractor)`. When a streamed tool cannot be resolved, `error`
+ # is passed as an {LLM::Function::Return}. It can be sent back to the
+ # model, allowing the tool-call path to recover and the session to
+ # continue. Tool resolution depends on
  # {LLM::Function.registry}, which includes {LLM::Tool LLM::Tool}
  # subclasses, including MCP tools, but not functions defined with
- # {LLM.function}.
+ # {LLM.function}. The current `:ractor` mode is for class-based tools
+ # and does not support MCP tools.
  # @param [LLM::Function] tool
  # The parsed tool call.
  # @param [LLM::Function::Return, nil] error
data/lib/llm/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true

  module LLM
- VERSION = "4.17.0"
+ VERSION = "4.18.0"
  end
data/llm.gemspec CHANGED
@@ -17,10 +17,11 @@ Gem::Specification.new do |spec|
  persisted state, so real systems can be built out of one coherent
  execution model instead of a pile of adapters. It stays close to Ruby, runs
  on the standard library by default, loads optional pieces only when needed,
- works naturally in Rails or ActiveRecord through acts_as_llm, includes
- built-in Sequel support through plugin :llm, and is designed for
- engineers who want control over long-lived, tool-capable, stateful AI
- workflows instead of just request/response helpers.
+ includes built-in ActiveRecord support through acts_as_llm and
+ acts_as_agent, includes built-in Sequel support through plugin :llm,
+ and is designed for engineers who want control over long-lived,
+ tool-capable, stateful AI workflows instead of just request/response
+ helpers.
  DESCRIPTION

  spec.license = "0BSD"
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: llm.rb
  version: !ruby/object:Gem::Version
- version: 4.17.0
+ version: 4.18.0
  platform: ruby
  authors:
  - Antar Azri
@@ -201,10 +201,11 @@ description: |
  persisted state, so real systems can be built out of one coherent
  execution model instead of a pile of adapters. It stays close to Ruby, runs
  on the standard library by default, loads optional pieces only when needed,
- works naturally in Rails or ActiveRecord through acts_as_llm, includes
- built-in Sequel support through plugin :llm, and is designed for
- engineers who want control over long-lived, tool-capable, stateful AI
- workflows instead of just request/response helpers.
+ includes built-in ActiveRecord support through acts_as_llm and
+ acts_as_agent, includes built-in Sequel support through plugin :llm,
+ and is designed for engineers who want control over long-lived,
+ tool-capable, stateful AI workflows instead of just request/response
+ helpers.
  email:
  - azantar@proton.me
  - 0x1eef@hardenedbsd.org
@@ -242,6 +243,11 @@ files:
  - lib/llm/function.rb
  - lib/llm/function/array.rb
  - lib/llm/function/fiber_group.rb
+ - lib/llm/function/ractor.rb
+ - lib/llm/function/ractor/job.rb
+ - lib/llm/function/ractor/mailbox.rb
+ - lib/llm/function/ractor/task.rb
+ - lib/llm/function/ractor_group.rb
  - lib/llm/function/registry.rb
  - lib/llm/function/task.rb
  - lib/llm/function/task_group.rb
@@ -403,7 +409,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.6.9
+ rubygems_version: 4.0.3
  specification_version: 4
  summary: Lightweight runtime for building capable AI systems in Ruby.
  test_files: []