llm.rb 4.16.1 → 4.18.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 6119e0673d055f50139fd33a4523bd99ffabb0fc2688a361f23297ef9ac428fd
- data.tar.gz: bfa5943241a2e9944245f0976458cfdd4895e7b72dbb83fd186f9d7ad2b987ee
+ metadata.gz: 42357f605b8995c24722c0a2d285d98de0e99fe6f9687e2f4451339dcfbc2e87
+ data.tar.gz: 2953c742deeec6dc7c7acc9ed77f4520846a92299a4df94e415416d5c623033c
  SHA512:
- metadata.gz: e139622d5cbaa8f42a1a6739b9792119f9086ae0b11ae896794953d9bb0fa9ed46159da7f10f7a8b619ef468404ae425b2158b359c940a11220dd48b0b230e2e
- data.tar.gz: 563f48928b6836d2bbcbc8d72bd594aa265fa285ad99cb8ff0c9574f9e67b7da60f9d42c2eeb1bddce0ba72e1425dab260bc65670fa7d812f7a77dc8016262d7
+ metadata.gz: 534fad616562768b0a851142f411117634d824a92187577d6baad4427c1e0eddf7a765dbfe3b662f76ba52011d760803793473a1881b4f69b8f4c7f9fc16bdc3
+ data.tar.gz: f1d90ad7f7b71f4817fcfa494f48a4e7f189560674d7b86d618f1f4e217e2a34189e15156e71ecdcfffe81e9588315f4759a82f4ec3eaed3846a29c77a906a66
data/CHANGELOG.md CHANGED
@@ -2,8 +2,104 @@
 
  ## Unreleased
 
+ Changes since `v4.18.0`.
+
+ ### Change
+
+ * **Make provider tracers default to the provider instance** <br>
+   Change `llm.tracer = ...` so it sets a provider default tracer instead of
+   relying on scoped fiber-local state alone. This makes tracer configuration
+   behave more predictably across normal tasks, threads, and fibers that share
+   the same provider instance.
+
+ * **Add `LLM::Provider#with_tracer` for scoped overrides** <br>
+   Add `with_tracer` as the opt-in escape hatch for request- or turn-scoped
+   tracer overrides. Use it when you want temporary tracing on the current
+   fiber without replacing the provider's default tracer.
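The intended split between the default tracer and the scoped override can be sketched with a small stand-in class. This is an illustrative model of the described semantics, not llm.rb's implementation; the `Provider` class and its methods here are hypothetical:

```ruby
# Minimal stand-in for the described semantics: a provider-level default
# tracer shared by threads/fibers, plus a block-scoped override that is
# stored fiber-locally (Thread.current[] is fiber-local in Ruby) and
# restored when the block exits.
class Provider
  attr_accessor :tracer # the provider default

  def with_tracer(tracer)
    previous = Thread.current[:llm_tracer]
    Thread.current[:llm_tracer] = tracer # scoped override for this fiber
    yield self
  ensure
    Thread.current[:llm_tracer] = previous
  end

  def current_tracer
    Thread.current[:llm_tracer] || tracer
  end
end

provider = Provider.new
provider.tracer = :default_tracer
provider.with_tracer(:scoped_tracer) do
  p provider.current_tracer # => :scoped_tracer
end
p provider.current_tracer # => :default_tracer
```

The point of the design is that `llm.tracer = ...` now configures every thread and fiber that shares the provider, while `with_tracer` stays strictly scoped to the current fiber and block.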
+
+ * **Trace concurrent tool calls outside ractors** <br>
+   Make tool tracing fire correctly when functions run through `:thread`,
+   `:task`, or `:fiber` concurrency. Experimental `:ractor` execution still
+   does not emit tool tracer events.
+
+ * **Trace streamed tool calls, including MCP tools** <br>
+   Bind stream metadata through `LLM::Stream#extra` so streamed tool calls
+   inherit tracer and model context before they are handed to `on_tool_call`.
+   This restores tool tracing for streamed MCP and local tool execution.
+
+ * **Support symbol-based ORM option hooks** <br>
+   Let `provider:`, `context:`, and `tracer:` on the Sequel plugin and
+   the ActiveRecord `acts_as_llm` / `acts_as_agent` wrappers resolve through
+   model method names as well as procs.
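The resolution rules can be illustrated with a small stand-alone sketch. The `Record` class and `provider_settings` method below are hypothetical; the `case` logic mirrors the `resolve_option` helper that appears later in this diff:

```ruby
# Sketch of symbol-based option resolution: procs are instance_exec'd on
# the record, symbols are sent as model method names, hashes are
# duplicated, and anything else passes through unchanged.
class Record
  def provider_settings
    {key: "secret", persistent: true}
  end

  def resolve_option(option)
    case option
    when Proc then instance_exec(&option)
    when Symbol then send(option)
    when Hash then option.dup
    else option
    end
  end
end

record = Record.new
p record.resolve_option(:provider_settings)       # => {key: "secret", persistent: true}
p record.resolve_option(-> { provider_settings }) # same result via a proc
```

The symbol form keeps per-record configuration in ordinary model methods instead of inline procs.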
+
+ * **Add experimental ractor tool concurrency** <br>
+   Add `:ractor` support to `LLM::Function#spawn`, `LLM::Function::Array#wait`,
+   `LLM::Stream#wait`, and `LLM::Agent.concurrency` so class-based tools with
+   ractor-safe arguments and return values can run in Ruby ractors and report
+   their results back into the normal LLM tool-return path. MCP tools are not
+   supported by the current `:ractor` mode, but mixed workloads can still
+   branch on `tool.mcp?` and choose a supported strategy per tool. `:ractor`
+   is especially useful for CPU-bound tools, while `:task`, `:fiber`, or
+   `:thread` may be a better fit for I/O-bound work.
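One way to act on that advice is to partition a mixed tool set at runtime. This is a rough sketch with hypothetical stand-in tool objects; only the `mcp?` predicate is taken from the entry above:

```ruby
# Hypothetical stand-ins for tools: MCP tools cannot use :ractor under the
# current mode, so they fall back to :thread, while CPU-bound local tools
# opt into :ractor.
Tool = Struct.new(:name, :mcp) do
  def mcp?
    mcp
  end
end

tools = [Tool.new(:search_docs, true), Tool.new(:tokenize, false)]
strategies = tools.to_h { |tool| [tool.name, tool.mcp? ? :thread : :ractor] }
# strategies now maps each tool name to a mode it supports:
# search_docs (MCP) -> :thread, tokenize (local, CPU-bound) -> :ractor
```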
+
+ ## v4.18.0
+
+ Changes since `v4.17.0`.
+
+ This release improves tracing and tool execution behavior across llm.rb.
+ It makes provider tracers default to the provider instance, adds
+ `LLM::Provider#with_tracer` for scoped overrides, restores tool tracing for
+ concurrent and streamed tool execution, extends streamed tracing to MCP tools,
+ and adds symbol-based ORM option hooks alongside experimental ractor tool
+ concurrency.
+
+ ## v4.17.0
+
  Changes since `v4.16.1`.
 
+ This release expands agent support across llm.rb. It brings `LLM::Agent`
+ closer to `LLM::Context`, adds configurable automatic tool concurrency
+ including experimental ractor support for class-based tools,
+ extends persisted ORM wrappers with more of the context runtime surface and
+ tracer hooks, and introduces built-in ActiveRecord agent persistence through
+ `acts_as_agent`.
+
+ ### Change
+
+ * **Add configurable tool concurrency to `LLM::Agent`** <br>
+   Add the class-level `concurrency` DSL to `LLM::Agent` so automatic
+   tool loops can run with `:call`, `:thread`, `:task`, `:fiber`, or
+   experimental `:ractor` support for class-based tools instead of
+   always executing sequentially.
+
+ * **Bring `LLM::Agent` closer to `LLM::Context`** <br>
+   Expand `LLM::Agent` so it exposes more of the same runtime surface as
+   `LLM::Context`, including returns, interruption, mode, cost, context
+   window, structured serialization, and other context-backed helpers,
+   while still auto-managing tool loops.
+
+ * **Refresh agent docs and coverage** <br>
+   Update the README and deep dive to explain the current role of
+   `LLM::Agent`, add examples that show automatic tool execution and
+   concurrency, and add focused specs for the expanded agent surface and
+   tool-loop behavior.
+
+ * **Add ORM tracer hooks for persisted contexts** <br>
+   Add `tracer:` to both the Sequel plugin and `acts_as_llm` so models
+   can resolve and assign tracers onto the provider used by their persisted
+   `LLM::Context`.
+
+ * **Bring persisted ORM wrappers closer to `LLM::Context`** <br>
+   Expand both the Sequel plugin and `acts_as_llm` so record-backed
+   contexts expose more of the same runtime surface as `LLM::Context`,
+   including mode, returns, interruption, prompt helpers, file helpers,
+   and tracer access.
+
+ * **Add ActiveRecord agent persistence with `acts_as_agent`** <br>
+   Add `acts_as_agent` for ActiveRecord models that should wrap
+   `LLM::Agent`, reusing the same record-backed runtime shape as
+   `acts_as_llm` while letting tool execution be managed by the agent.
+
  ## v4.16.1
 
  Changes since `v4.16.0`.
data/README.md CHANGED
@@ -4,27 +4,26 @@
  <p align="center">
  <a href="https://0x1eef.github.io/x/llm.rb?rebuild=1"><img src="https://img.shields.io/badge/docs-0x1eef.github.io-blue.svg" alt="RubyDoc"></a>
  <a href="https://opensource.org/license/0bsd"><img src="https://img.shields.io/badge/License-0BSD-orange.svg?" alt="License"></a>
- <a href="https://github.com/llmrb/llm.rb/tags"><img src="https://img.shields.io/badge/version-4.16.1-green.svg?" alt="Version"></a>
+ <a href="https://github.com/llmrb/llm.rb/tags"><img src="https://img.shields.io/badge/version-4.18.0-green.svg?" alt="Version"></a>
  </p>
 
  ## About
 
- llm.rb is a runtime for building AI systems that integrate directly with your
- application. It is not just an API wrapper. It provides a unified execution
- model for providers, tools, MCP servers, streaming, schemas, files, and
- state.
+ llm.rb is a lightweight runtime for building capable AI systems in Ruby.
 
- It is built for engineers who want control over how these systems run. llm.rb
- stays close to Ruby, runs on the standard library by default, loads optional
- pieces only when needed, and remains easy to extend. It also works well in
- Rails or ActiveRecord applications, with built-in `acts_as_llm`, and includes
- built-in Sequel support through `plugin :llm`, so long-lived context state can
- be saved and restored across requests, jobs, or retries.
+ It is not just an API wrapper. llm.rb gives you one runtime for providers,
+ contexts, agents, tools, MCP servers, streaming, schemas, files, and persisted
+ state, so real systems can be built out of one coherent execution model instead
+ of a pile of adapters.
 
- Most LLM libraries stop at request/response APIs. Building real systems means
- stitching together streaming, tools, state, persistence, and external
- services by hand. llm.rb provides a single execution model for all of these,
- so they compose naturally instead of becoming separate subsystems.
+ It stays close to Ruby, runs on the standard library by default, loads optional
+ pieces only when needed, and includes built-in ActiveRecord support through
+ `acts_as_llm` and `acts_as_agent` as well as built-in Sequel support through
+ `plugin :llm`. It is designed for engineers who want control over long-lived,
+ tool-capable, stateful AI workflows instead of just request/response helpers.
+
+ Want to see some code? Jump to [the examples](#examples) section.
 
  ## Architecture
 
@@ -66,6 +65,13 @@ same context object.
  - **Streaming and tool execution work together** <br>
    Start tool work while output is still streaming so you can hide latency
    instead of waiting for turns to finish.
+ - **Agents auto-manage tool execution** <br>
+   Use `LLM::Agent` when you want the same stateful runtime surface as
+   `LLM::Context`, but with tool loops executed automatically according to a
+   configured concurrency mode such as `:call`, `:thread`, `:task`, `:fiber`,
+   or experimental `:ractor` support for class-based tools. MCP tools are not
+   supported by the current `:ractor` mode, but mixed tool sets can still
+   route MCP tools and local tools through different strategies at runtime.
  - **Tool calls have an explicit lifecycle** <br>
    A tool call can be executed, cancelled through
    [`LLM::Function#cancel`](https://0x1eef.github.io/x/llm.rb/LLM/Function.html#cancel-instance_method),
@@ -77,7 +83,12 @@ same context object.
    [`LLM::Context#cancel!`](https://0x1eef.github.io/x/llm.rb/LLM/Context.html#cancel-21-instance_method)
    is inspired by Go's context cancellation model.
  - **Concurrency is a first-class feature** <br>
-   Use threads, fibers, or async tasks without rewriting your tool layer.
+   Use threads, fibers, async tasks, or experimental ractors without
+   rewriting your tool layer. The current `:ractor` mode is for class-based
+   tools and does not support MCP tools, but mixed workloads can branch on
+   `tool.mcp?` and choose a supported strategy per tool. `:ractor` is
+   especially useful for CPU-bound tools, while `:task`, `:fiber`, or
+   `:thread` may be a better fit for I/O-bound work.
  - **Advanced workloads are built in, not bolted on** <br>
    Streaming, concurrent tool execution, persistence, tracing, and MCP support
    all fit the same runtime model.
@@ -88,11 +99,14 @@ same context object.
    Connect to MCP servers over stdio or HTTP without bolting on a separate
    integration stack.
  - **ActiveRecord and Sequel persistence are built in** <br>
-   Use `acts_as_llm` on ActiveRecord models or `plugin :llm` on Sequel models
-   to persist `LLM::Context` state with sensible default columns. Both support
-   `provider:` and `context:` hooks, plus `format: :string` for text columns
-   or `format: :jsonb` for native PostgreSQL JSON storage when ORM JSON
-   typecasting support is enabled.
+   llm.rb includes built-in ActiveRecord support through `acts_as_llm` and
+   `acts_as_agent`, plus built-in Sequel support through `plugin :llm`.
+   Use `acts_as_llm` when you want to wrap `LLM::Context`, `acts_as_agent`
+   when you want to wrap `LLM::Agent`, or `plugin :llm` on Sequel models to
+   persist `LLM::Context` state with sensible default columns. These
+   integrations support `provider:` and `context:` hooks, plus
+   `format: :string` for text columns or `format: :jsonb` for native
+   PostgreSQL JSON storage when ORM JSON typecasting support is enabled.
  - **Persistent HTTP pooling is shared process-wide** <br>
    When enabled, separate
    [`LLM::Provider`](https://0x1eef.github.io/x/llm.rb/LLM/Provider.html)
@@ -174,7 +188,7 @@ gem install llm.rb
 
  **REPL**
 
- See the [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) for more examples.
+ This example uses [`LLM::Context`](https://0x1eef.github.io/x/llm.rb/LLM/Context.html) directly for an interactive REPL. <br> See the [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) for more examples.
 
  ```ruby
  require "llm"
@@ -191,10 +205,11 @@ end
 
  **Sequel (ORM)**
 
- See the [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) for more examples.
+ The `plugin :llm` integration wraps [`LLM::Context`](https://0x1eef.github.io/x/llm.rb/LLM/Context.html) on a `Sequel::Model` and keeps tool execution explicit. <br> See the [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) for more examples.
 
  ```ruby
  require "llm"
+ require "net/http/persistent"
  require "sequel"
  require "sequel/plugins/llm"
 
@@ -207,12 +222,14 @@ ctx.talk("Remember that my favorite language is Ruby")
  puts ctx.talk("What is my favorite language?").content
  ```
 
- **ActiveRecord (ORM)**
+ **ActiveRecord (ORM): acts_as_llm**
 
- See the [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) for more examples.
+ The `acts_as_llm` method wraps [`LLM::Context`](https://0x1eef.github.io/x/llm.rb/LLM/Context.html) and
+ provides full control over tool execution. <br> See the [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) for more examples.
 
  ```ruby
  require "llm"
+ require "net/http/persistent"
  require "active_record"
  require "llm/active_record"
 
@@ -225,6 +242,48 @@ ctx.talk("Remember that my favorite language is Ruby")
  puts ctx.talk("What is my favorite language?").content
  ```
 
+ **ActiveRecord (ORM): acts_as_agent**
+
+ The `acts_as_agent` method wraps [`LLM::Agent`](https://0x1eef.github.io/x/llm.rb/LLM/Agent.html) and
+ manages tool execution for you. <br> See the [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) for more examples.
+
+ ```ruby
+ require "llm"
+ require "net/http/persistent"
+ require "active_record"
+ require "llm/active_record"
+
+ class Ticket < ApplicationRecord
+   acts_as_agent provider: -> { { key: ENV["#{provider.upcase}_SECRET"], persistent: true } }
+   model "gpt-5.4-mini"
+   instructions "You are a concise support assistant."
+   tools SearchDocs, Escalate
+   concurrency :thread
+ end
+
+ ticket = Ticket.create!(provider: "openai", model: "gpt-5.4-mini")
+ puts ticket.talk("How do I rotate my API key?").content
+ ```
+
+ **Agent**
+
+ This example uses [`LLM::Agent`](https://0x1eef.github.io/x/llm.rb/LLM/Agent.html) directly and lets the agent manage tool execution. <br> See the [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) for more examples.
+
+ ```ruby
+ require "llm"
+
+ class ShellAgent < LLM::Agent
+   model "gpt-5.4-mini"
+   instructions "You are a Linux system assistant."
+   tools Shell
+   concurrency :thread
+ end
+
+ llm = LLM.openai(key: ENV["KEY"])
+ agent = ShellAgent.new(llm)
+ puts agent.talk("What time is it on this system?").content
+ ```
+
  ## Resources
 
  - [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) is the
@@ -0,0 +1,179 @@
+ # frozen_string_literal: true
+
+ module LLM::ActiveRecord
+   ##
+   # ActiveRecord integration for persisting {LLM::Agent LLM::Agent} state.
+   #
+   # This wrapper reuses the same record-backed runtime surface as
+   # {LLM::ActiveRecord::ActsAsLLM}, but builds an {LLM::Agent LLM::Agent}
+   # instead of an {LLM::Context LLM::Context}. Agent defaults such as model,
+   # tools, schema, instructions, and concurrency are configured on the model
+   # class and forwarded to an internal agent subclass.
+   module ActsAsAgent
+     EMPTY_HASH = LLM::ActiveRecord::ActsAsLLM::EMPTY_HASH
+     DEFAULT_USAGE_COLUMNS = LLM::ActiveRecord::ActsAsLLM::DEFAULT_USAGE_COLUMNS
+     DEFAULTS = LLM::ActiveRecord::ActsAsLLM::DEFAULTS
+
+     module ClassMethods
+       def model(model = nil)
+         return agent.model if model.nil?
+         agent.model(model)
+       end
+
+       def tools(*tools)
+         return agent.tools if tools.empty?
+         agent.tools(*tools)
+       end
+
+       def schema(schema = nil)
+         return agent.schema if schema.nil?
+         agent.schema(schema)
+       end
+
+       def instructions(instructions = nil)
+         return agent.instructions if instructions.nil?
+         agent.instructions(instructions)
+       end
+
+       def concurrency(concurrency = nil)
+         return agent.concurrency if concurrency.nil?
+         agent.concurrency(concurrency)
+       end
+
+       def agent
+         @agent ||= Class.new(LLM::Agent)
+       end
+     end
+
+     module Hooks
+       ##
+       # Called when hooks are extended onto an ActiveRecord model.
+       #
+       # @param [Class] model
+       # @return [void]
+       def self.extended(model)
+         options = model.llm_agent_options
+         model.validates options[:provider_column], options[:model_column], presence: true
+         model.include LLM::ActiveRecord::ActsAsLLM::InstanceMethods unless model.ancestors.include?(LLM::ActiveRecord::ActsAsLLM::InstanceMethods)
+         model.include InstanceMethods unless model.ancestors.include?(InstanceMethods)
+         model.extend ClassMethods unless model.singleton_class.ancestors.include?(ClassMethods)
+       end
+     end
+
+     ##
+     # Installs the `acts_as_agent` wrapper on an ActiveRecord model.
+     #
+     # @param [Hash] options
+     # @option options [Symbol] :format
+     #   Storage format for the serialized agent state. Use `:string` for text
+     #   columns, or `:json` / `:jsonb` for structured JSON columns with
+     #   ActiveRecord JSON typecasting enabled.
+     # @option options [Proc, Symbol, LLM::Tracer, nil] :tracer
+     #   Optional tracer, method name, or proc that resolves to one and is
+     #   assigned through `llm.tracer = ...` on the resolved provider.
+     # @return [void]
+     def acts_as_agent(options = EMPTY_HASH)
+       options = DEFAULTS.merge(options)
+       usage_columns = DEFAULT_USAGE_COLUMNS.merge(options[:usage_columns] || EMPTY_HASH)
+       class_attribute :llm_agent_options, instance_accessor: false, default: DEFAULTS unless respond_to?(:llm_agent_options)
+       self.llm_agent_options = options.merge(usage_columns: usage_columns.freeze).freeze
+       extend Hooks
+     end
+
+     module InstanceMethods
+       private
+
+       ##
+       # Returns the resolved provider instance for this record.
+       # @return [LLM::Provider]
+       def llm
+         options = self.class.llm_agent_options
+         provider = self[columns[:provider_column]]
+         kwargs = resolve_options(options[:provider])
+         return @llm if @llm
+         @llm = LLM.method(provider).call(**kwargs)
+         @llm.tracer = resolve_option(options[:tracer]) if options[:tracer]
+         @llm
+       end
+
+       ##
+       # @return [LLM::Agent]
+       def ctx
+         @ctx ||= begin
+           options = self.class.llm_agent_options
+           params = resolve_options(options[:context]).dup
+           params[:model] ||= self[columns[:model_column]]
+           ctx = self.class.agent.new(llm, params.compact)
+           data = self[columns[:data_column]]
+           if data.nil? || data == ""
+             ctx
+           else
+             case options[:format]
+             when :string then ctx.restore(string: data)
+             when :json, :jsonb then ctx.restore(data:)
+             else raise ArgumentError, "Unknown format: #{options[:format].inspect}"
+             end
+           end
+         end
+       end
+
+       ##
+       # @return [void]
+       def flush
+         attrs = {
+           columns[:data_column] => serialize_context(self.class.llm_agent_options[:format]),
+           columns[:input_tokens] => ctx.usage.input_tokens,
+           columns[:output_tokens] => ctx.usage.output_tokens,
+           columns[:total_tokens] => ctx.usage.total_tokens
+         }
+         assign_attributes(attrs)
+         save!
+       end
+
+       ##
+       # @return [Object]
+       def resolve_option(option)
+         case option
+         when Proc then instance_exec(&option)
+         when Symbol then send(option)
+         when Hash then option.dup
+         else option
+         end
+       end
+
+       ##
+       # @return [Hash]
+       def resolve_options(option)
+         case option
+         when Proc, Hash then resolve_option(option)
+         else EMPTY_HASH.dup
+         end
+       end
+
+       def serialize_context(format)
+         case format
+         when :string then ctx.to_json
+         when :json, :jsonb then ctx.to_h
+         else raise ArgumentError, "Unknown format: #{format.inspect}"
+         end
+       end
+
+       def columns
+         @columns ||= begin
+           options = self.class.llm_agent_options
+           usage_columns = options[:usage_columns]
+           {
+             provider_column: options[:provider_column],
+             model_column: options[:model_column],
+             data_column: options[:data_column],
+             input_tokens: usage_columns[:input_tokens],
+             output_tokens: usage_columns[:output_tokens],
+             total_tokens: usage_columns[:total_tokens]
+           }.freeze
+         end
+       end
+     end
+   end
+ end
+
+ ::ActiveRecord::Base.extend(LLM::ActiveRecord::ActsAsAgent)
@@ -13,7 +13,8 @@ module LLM::ActiveRecord
    # default) or as a structured object (`format: :json` / `:jsonb`) for
    # databases such as PostgreSQL that can persist JSON natively.
    # `:json` and `:jsonb` expect a real JSON column type with ActiveRecord
-   # handling JSON typecasting for the model.
+   # handling JSON typecasting for the model. `provider:`, `context:`, and
+   # `tracer:` can also be configured as symbols that are called on the model.
    module ActsAsLLM
      EMPTY_HASH = {}.freeze
      DEFAULT_USAGE_COLUMNS = {
@@ -27,6 +28,7 @@ module LLM::ActiveRecord
        data_column: :data,
        format: :string,
        usage_columns: DEFAULT_USAGE_COLUMNS,
+       tracer: nil,
        provider: EMPTY_HASH,
        context: EMPTY_HASH
      }.freeze
@@ -52,6 +54,9 @@ module LLM::ActiveRecord
      #   Storage format for the serialized context. Use `:string` for text
      #   columns, or `:json` / `:jsonb` for structured JSON columns with
      #   ActiveRecord JSON typecasting enabled.
+     # @option options [Proc, Symbol, LLM::Tracer, nil] :tracer
+     #   Optional tracer, method name, or proc that resolves to one and is
+     #   assigned through `llm.tracer = ...` on the resolved provider.
      # @return [void]
      def acts_as_llm(options = EMPTY_HASH)
        options = DEFAULTS.merge(options)
@@ -94,6 +99,13 @@ module LLM::ActiveRecord
        ctx.call(...)
      end
 
+     ##
+     # @see LLM::Context#mode
+     # @return [Symbol]
+     def mode
+       ctx.mode
+     end
+
      ##
      # @see LLM::Context#messages
      # @return [Array<LLM::Message>]
@@ -116,6 +128,13 @@ module LLM::ActiveRecord
        ctx.functions
      end
 
+     ##
+     # @see LLM::Context#returns
+     # @return [Array<LLM::Function::Return>]
+     def returns
+       ctx.returns
+     end
+
      ##
      # @see LLM::Context#cost
      # @return [LLM::Cost]
@@ -143,6 +162,50 @@ module LLM::ActiveRecord
        )
      end
 
+     ##
+     # @see LLM::Context#interrupt!
+     # @return [nil]
+     def interrupt!
+       ctx.interrupt!
+     end
+     alias_method :cancel!, :interrupt!
+
+     ##
+     # @see LLM::Context#prompt
+     # @return [LLM::Prompt]
+     def prompt(&)
+       ctx.prompt(&)
+     end
+     alias_method :build_prompt, :prompt
+
+     ##
+     # @see LLM::Context#image_url
+     # @return [LLM::Object]
+     def image_url(...)
+       ctx.image_url(...)
+     end
+
+     ##
+     # @see LLM::Context#local_file
+     # @return [LLM::Object]
+     def local_file(...)
+       ctx.local_file(...)
+     end
+
+     ##
+     # @see LLM::Context#remote_file
+     # @return [LLM::Object]
+     def remote_file(...)
+       ctx.remote_file(...)
+     end
+
+     ##
+     # @see LLM::Context#tracer
+     # @return [LLM::Tracer]
+     def tracer
+       ctx.tracer
+     end
+
      private
 
      ##
@@ -152,7 +215,10 @@ module LLM::ActiveRecord
        options = self.class.llm_plugin_options
        provider = self[columns[:provider_column]]
        kwargs = resolve_options(options[:provider])
-      @llm ||= LLM.method(provider).call(**kwargs)
+      return @llm if @llm
+      @llm = LLM.method(provider).call(**kwargs)
+      @llm.tracer = resolve_option(options[:tracer]) if options[:tracer]
+      @llm
      end
 
      ##
@@ -194,6 +260,7 @@ module LLM::ActiveRecord
      def resolve_option(option)
        case option
        when Proc then instance_exec(&option)
+       when Symbol then send(option)
        when Hash then option.dup
        else option
        end
@@ -1,3 +1,4 @@
  # frozen_string_literal: true
 
  require "llm/active_record/acts_as_llm"
+ require "llm/active_record/acts_as_agent"