ruby-pi 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+   metadata.gz: 766afd01deb469fa6f86f35844e7dd4e100c2de2deb2ce593e07fa75b3ae4637
+   data.tar.gz: 8a72e80d2448359004909ed3afcf5b01fbd41429c2538dc5c8e6b84b6d18326b
+ SHA512:
+   metadata.gz: 0cadf39edb5c0a83e3f1bf27f503bc9d682b9a4f264eb8a0fbb026609ce66508839acef3151ce53541aee62f1068819a1f8953bddca1c076eb7011c148799cb2
+   data.tar.gz: 861befb924a6dbf153f755fff18cf7c579bb175e5ebc227623591e1dfea637e444e041a795d5f750d77de2bec621147e903ae76dae12dd76b87e4936507e94fe
data/CHANGELOG.md ADDED
@@ -0,0 +1,31 @@
+ # Changelog
+
+ All notable changes to this project will be documented in this file.
+
+ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
+ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+
+ ## [0.1.0] - 2026-04-28
+
+ ### Added
+
+ - `RubyPi::LLM` with Gemini, Anthropic, and OpenAI providers
+ - Unified `Response`, `ToolCall`, and `StreamEvent` value objects
+ - `BaseProvider` with automatic retry and exponential backoff with jitter
+ - Streaming support via Faraday SSE parsing for all providers
+ - `RubyPi::LLM::Fallback` for automatic provider failover
+ - `RubyPi::LLM::Model` descriptor for deferred provider construction
+ - `RubyPi::Tools::Definition` with `RubyPi::Tool.define` convenience API
+ - `RubyPi::Schema` DSL for building JSON Schema parameter definitions
+ - `RubyPi::Tools::Registry` -- thread-safe tool store with category filtering and subset extraction
+ - `RubyPi::Tools::Executor` -- parallel and sequential tool dispatch with per-tool timeouts
+ - `RubyPi::Tools::Result` -- execution outcome with value/error and timing
+ - `RubyPi::Agent` with think-act-observe loop
+ - Event system (`text_delta`, `tool_execution_start`, `tool_execution_end`, `turn_start`, `turn_end`, `before_tool_call`, `after_tool_call`, `agent_end`, `error`)
+ - `RubyPi::Context::Compaction` for long-context management
+ - `RubyPi::Context::Transform` helpers for message list preprocessing
+ - `RubyPi::Extensions::Base` hook DSL for cross-cutting agent behavior
+ - `RubyPi::Configuration` with API keys, retry settings, timeouts, and default models
+ - Typed error hierarchy (`ApiError`, `AuthenticationError`, `RateLimitError`, `TimeoutError`, `ProviderError`)
+ - Full RSpec test suite with WebMock (unit + integration)
+ - GitHub Actions CI for Ruby 3.2 and 3.3
data/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 EJ White
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
data/README.md ADDED
@@ -0,0 +1,415 @@
+ # RubyPi
+
+ [![Gem Version](https://badge.fury.io/rb/ruby-pi.svg)](https://rubygems.org/gems/ruby-pi)
+ [![CI](https://github.com/ejwhite7/ruby-pi/actions/workflows/ci.yml/badge.svg)](https://github.com/ejwhite7/ruby-pi/actions/workflows/ci.yml)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)
+ [![Ruby](https://img.shields.io/badge/Ruby-%3E%3D%203.2-red.svg)](https://www.ruby-lang.org)
+
+ **A minimal, composable toolkit for building LLM-powered agents in Ruby.**
+
+ RubyPi is an anti-framework. Instead of imposing a sprawling abstraction layer, it gives you small, focused modules you can compose however you like: swap providers, define tools, run agent loops, manage context -- all without buying into a monolithic architecture. Every piece works on its own, and they snap together when you need them to.
+
+ ---
+
+ ## Table of Contents
+
+ - [Installation](#installation)
+ - [Quick Start](#quick-start)
+ - [API Reference](#api-reference)
+   - [RubyPi::LLM](#rubypillm)
+   - [RubyPi::Tools](#rubypitools)
+   - [RubyPi::Agent](#rubypiagent)
+   - [RubyPi::Context](#rubypicontext)
+   - [RubyPi::Extensions](#rubypiextensions)
+ - [Configuration](#configuration)
+ - [Contributing](#contributing)
+ - [License](#license)
+
+ ---
+
+ ## Installation
+
+ Add to your Gemfile:
+
+ ```ruby
+ gem "ruby-pi"
+ ```
+
+ Then run:
+
+ ```bash
+ bundle install
+ ```
+
+ Or install directly:
+
+ ```bash
+ gem install ruby-pi
+ ```
+
+ ---
+
+ ## Quick Start
+
+ ```ruby
+ require "ruby_pi"
+
+ # 1. Configure
+ RubyPi.configure do |c|
+   c.gemini_api_key = ENV["GEMINI_API_KEY"]
+ end
+
+ # 2. Define a tool
+ weather = RubyPi::Tool.define(
+   name: :get_weather,
+   description: "Get the current weather for a city",
+   parameters: RubyPi::Schema.object(
+     city: RubyPi::Schema.string("City name", required: true)
+   )
+ ) { |args| { temp: 72, condition: "sunny", city: args["city"] } }
+
+ # 3. Build a registry and an agent
+ registry = RubyPi::Tools::Registry.new
+ registry.register(weather)
+
+ model = RubyPi::LLM.model(:gemini, "gemini-2.0-flash")
+ agent = RubyPi::Agent.new(
+   system_prompt: "You are a helpful weather assistant.",
+   model: model,
+   tools: registry
+ )
+
+ # 4. Subscribe to events
+ agent.on(:text_delta) { |e| print e[:data] }
+ agent.on(:tool_execution_end) { |e| puts "\n[Tool] #{e[:name]} => #{e[:result]}" }
+
+ # 5. Run
+ result = agent.run("What's the weather in San Francisco?")
+ puts "\nDone: #{result.output}"
+ ```
+
+ ---
+
+ ## API Reference
+
+ ### RubyPi::LLM
+
+ The LLM module provides a provider-agnostic interface for text generation, streaming, and tool calling across Gemini, Anthropic, and OpenAI.
+
+ #### Model Factory
+
+ ```ruby
+ # Build a provider instance
+ model = RubyPi::LLM.model(:gemini, "gemini-2.0-flash")
+ model = RubyPi::LLM.model(:anthropic, "claude-sonnet-4-20250514")
+ model = RubyPi::LLM.model(:openai, "gpt-4o")
+ ```
+
+ #### Completions
+
+ ```ruby
+ response = model.complete(
+   messages: [{ role: "user", content: "Hello!" }],
+   tools: [],      # optional tool definitions
+   stream: false   # set true for streaming
+ )
+
+ response.content       # => "Hi there!"
+ response.tool_calls    # => [] or [ToolCall, ...]
+ response.tool_calls?   # => false
+ response.usage         # => { prompt_tokens: 10, completion_tokens: 20, total_tokens: 30 }
+ response.finish_reason # => "stop"
+ ```
+
+ #### Streaming
+
+ ```ruby
+ model.complete(messages: messages, stream: true) do |event|
+   case event.type
+   when :text_delta
+     print event.data            # incremental text chunk
+   when :tool_call_delta
+     handle_fragment(event.data) # partial tool call JSON
+   when :done
+     puts "\nStream finished"
+   end
+ end
+ ```
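Under the hood, streaming responses arrive as server-sent events. As a rough, standalone illustration of what SSE parsing involves (the gem uses Faraday middleware for this; the helper below is hypothetical and not part of RubyPi's API):

```ruby
# Minimal SSE chunk parser (illustrative only). SSE frames are separated
# by blank lines; each frame's payload sits on "data: " lines. Returns
# the payloads, skipping the common "[DONE]" sentinel.
def parse_sse(raw)
  raw.split("\n\n").filter_map do |event|
    data = event.lines.grep(/\Adata: /).map { |l| l.sub("data: ", "").chomp }.join
    data unless data.empty? || data == "[DONE]"
  end
end

parse_sse("data: Hel\n\ndata: lo\n\ndata: [DONE]\n\n")
# => ["Hel", "lo"]
```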
+
+ #### Response & ToolCall
+
+ | Class | Attributes |
+ |---|---|
+ | `RubyPi::LLM::Response` | `content`, `tool_calls`, `usage`, `finish_reason`, `tool_calls?` |
+ | `RubyPi::LLM::ToolCall` | `id`, `name`, `arguments` |
+ | `RubyPi::LLM::StreamEvent` | `type`, `data`, `text_delta?`, `tool_call_delta?`, `done?` |
+
+ #### Fallback
+
+ Automatically fail over to a backup provider when the primary is unavailable:
+
+ ```ruby
+ primary = RubyPi::LLM.model(:gemini, "gemini-2.0-flash")
+ backup  = RubyPi::LLM.model(:openai, "gpt-4o")
+ provider = RubyPi::LLM::Fallback.new(primary: primary, fallback: backup)
+
+ # Uses Gemini; if it 500s or times out, retries with OpenAI
+ response = provider.complete(messages: messages)
+ ```
+
+ Authentication errors (401/403) are **not** retried with the fallback -- they indicate a configuration problem, not a transient failure.
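The failover rule can be sketched in isolation. This is a standalone caricature with made-up error classes and callable providers, not the gem's implementation:

```ruby
# Hypothetical stand-ins for the gem's typed errors.
class TransientError < StandardError; end
class AuthError      < StandardError; end

class NaiveFallback
  def initialize(primary:, fallback:)
    @primary  = primary
    @fallback = fallback
  end

  def complete(messages)
    @primary.call(messages)
  rescue AuthError
    raise                     # 401/403: configuration problem, do not fail over
  rescue TransientError
    @fallback.call(messages)  # 5xx / timeout: retry with the backup provider
  end
end
```

The key design point survives the simplification: only errors that a second provider could plausibly avoid trigger the failover path.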
+
+ ---
+
+ ### RubyPi::Tools
+
+ A lightweight DSL for defining tools (functions) that LLMs can call, plus a registry and executor for dispatching them.
+
+ #### Defining Tools
+
+ ```ruby
+ tool = RubyPi::Tool.define(
+   name: :create_post,
+   description: "Create a social media post",
+   category: :content,
+   parameters: RubyPi::Schema.object(
+     content: RubyPi::Schema.string("Post body", required: true),
+     tags: RubyPi::Schema.array(
+       description: "Tags",
+       items: RubyPi::Schema.string("A tag")
+     )
+   )
+ ) do |args|
+   { post_id: SecureRandom.uuid, status: "published" }
+ end
+ ```
+
+ #### Schema DSL
+
+ Build JSON Schema hashes with a fluent Ruby API:
+
+ ```ruby
+ RubyPi::Schema.string("A label", required: true, enum: ["a", "b"])
+ RubyPi::Schema.integer("Count", minimum: 0, maximum: 100)
+ RubyPi::Schema.number("Price", minimum: 0.0)
+ RubyPi::Schema.boolean("Active")
+ RubyPi::Schema.array(description: "Items", items: RubyPi::Schema.string)
+ RubyPi::Schema.object(name: RubyPi::Schema.string("Name", required: true))
+ ```
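Calls like these compose ordinary JSON Schema hashes. As a hand-written illustration of what the `object` example above corresponds to in standard JSON Schema (the gem's exact key style and ordering are not shown here and may differ):

```ruby
# Hand-written JSON Schema equivalent of
#   RubyPi::Schema.object(name: RubyPi::Schema.string("Name", required: true))
# Illustrative only -- not the gem's actual output.
schema = {
  type: "object",
  properties: {
    name: { type: "string", description: "Name" }
  },
  required: ["name"]
}
```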
+
+ #### Registry
+
+ ```ruby
+ registry = RubyPi::Tools::Registry.new
+ registry.register(tool)
+
+ registry.find(:create_post)        # => Definition or nil
+ registry.registered?(:create_post) # => true
+ registry.names                     # => [:create_post]
+ registry.size                      # => 1
+ registry.by_category(:content)     # => [Definition, ...]
+ registry.subset([:create_post])    # => new Registry with just that tool
+ registry.all                       # => [Definition, ...]
+ ```
+
+ #### Executor
+
+ Run tool calls in parallel or sequentially with automatic error handling and timeouts:
+
+ ```ruby
+ executor = RubyPi::Tools::Executor.new(registry, mode: :parallel, timeout: 30)
+
+ results = executor.execute([
+   { name: "create_post", arguments: { content: "Hello" } },
+   { name: "get_analytics", arguments: { period: "7d" } }
+ ])
+
+ results.each do |r|
+   if r.success?
+     puts "#{r.name}: #{r.value}"
+   else
+     puts "#{r.name} failed: #{r.error} (#{r.duration_ms}ms)"
+   end
+ end
+ ```
+
+ | `Result` attribute | Description |
+ |---|---|
+ | `name` | Tool name |
+ | `success?` | Whether the call succeeded |
+ | `value` | Return value (on success) |
+ | `error` | Error message (on failure) |
+ | `duration_ms` | Execution time in milliseconds |
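Parallel dispatch with per-tool timeouts can be sketched with plain threads. This is a hypothetical standalone version under assumed interfaces, not the Executor's actual code:

```ruby
require "timeout"

# Each call runs in its own thread; Timeout bounds its runtime, and
# failures are captured per tool rather than aborting the whole batch.
def parallel_dispatch(calls, timeout: 5)
  calls.map { |name, fn|
    Thread.new do
      value = Timeout.timeout(timeout) { fn.call }
      [name, :ok, value]
    rescue => e
      [name, :error, e.message]
    end
  }.map(&:value)
end

parallel_dispatch([[:sum, -> { 1 + 1 }], [:boom, -> { raise "oops" }]])
# => [[:sum, :ok, 2], [:boom, :error, "oops"]]
```

Note that `Timeout` interrupts via a watchdog thread, which is best-effort; a production executor may prefer cooperative deadlines.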
+
+ ---
+
+ ### RubyPi::Agent
+
+ The Agent implements a **think-act-observe** loop: send messages to the LLM, execute any tool calls it requests, feed results back, and repeat until the model produces a final text response or hits the iteration limit.
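The loop is simple enough to sketch in a few lines. A standalone caricature under assumed interfaces (`llm` returns `{ text: }` or `{ tool_calls: }`, `tools` maps names to callables), not the gem's Loop class:

```ruby
# Think-act-observe skeleton (illustrative only).
def think_act_observe(llm, tools, messages, max_iterations: 10)
  max_iterations.times do
    reply = llm.call(messages)                              # think
    return reply[:text] if reply[:text]                     # final answer
    reply[:tool_calls].each do |tc|                         # act
      result = tools.fetch(tc[:name]).call(tc[:arguments])
      messages << { role: "tool", name: tc[:name], content: result.to_s } # observe
    end
  end
  nil # hit the iteration limit
end
```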
+
+ #### Creating an Agent
+
+ ```ruby
+ agent = RubyPi::Agent.new(
+   system_prompt: "You are helpful.", # required: system instruction prompt
+   model: model,                      # required: an LLM provider instance
+   tools: registry,                   # optional: a Tools::Registry
+   max_iterations: 10,                # optional: loop safety limit
+   transform_context: transform,      # optional: context transform hook (Proc)
+   before_tool_call: before_hook,     # optional: pre-tool-execution hook (Proc)
+   after_tool_call: after_hook,       # optional: post-tool-execution hook (Proc)
+   compaction: compaction,            # optional: Context::Compaction instance
+   user_data: {}                      # optional: data bag for hooks/extensions
+ )
+ ```
+
+ #### Running the Agent
+
+ ```ruby
+ # Single run
+ result = agent.run("What is the weather in Tokyo?")
+ result.output          # => "The weather in Tokyo is..."
+ result.messages        # => full conversation history
+ result.tool_calls_made # => [{ name: "get_weather", ... }, ...]
+ result.iterations      # => 2
+ result.stop_reason     # => :complete or :max_iterations
+
+ # Continue the conversation
+ result2 = agent.continue("And in London?")
+ ```
+
+ #### Event Subscriptions
+
+ Subscribe to lifecycle events for logging, monitoring, or custom behavior:
+
+ ```ruby
+ agent.on(:turn_start)           { |e| puts "Turn #{e[:iteration]} starting" }
+ agent.on(:turn_end)             { |e| puts "Turn #{e[:iteration]} ended" }
+ agent.on(:text_delta)           { |e| print e[:data] }
+ agent.on(:tool_execution_start) { |e| puts "Calling #{e[:tool_name]}" }
+ agent.on(:tool_execution_end)   { |e| puts "#{e[:name]} => #{e[:result]}" }
+ agent.on(:before_tool_call)     { |e| puts "About to call #{e[:tool_name]}" }
+ agent.on(:after_tool_call)      { |e| puts "Finished #{e[:tool_name]}" }
+ agent.on(:agent_end)            { |e| puts "Agent finished" }
+ agent.on(:error)                { |e| warn "Error: #{e[:error].message}" }
+ ```
+
+ ---
+
+ ### RubyPi::Context
+
+ Utilities for managing conversation context, especially in long-running or multi-turn agents.
+
+ #### Compaction
+
+ Prevent unbounded context growth by compacting older messages:
+
+ ```ruby
+ compaction = RubyPi::Context::Compaction.new(
+   max_tokens: 4000,   # trigger compaction above this threshold
+   strategy: :truncate # :truncate removes oldest messages
+ )
+
+ agent = RubyPi::Agent.new(system_prompt: "...", model: model, compaction: compaction)
+ ```
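A naive version of a `:truncate` strategy might look like this. A standalone sketch with a crude chars-to-tokens heuristic, not the gem's implementation:

```ruby
# Drops the oldest non-system messages until the estimated token count
# fits the budget. Uses a rough ~4-characters-per-token estimate.
def truncate_to_budget(messages, max_tokens:)
  estimate = ->(msgs) { msgs.sum { |m| m[:content].length / 4 + 1 } }
  msgs = messages.dup
  while estimate.call(msgs) > max_tokens
    idx = msgs.index { |m| m[:role] != "system" } # never drop the system prompt
    break unless idx
    msgs.delete_at(idx)
  end
  msgs
end
```

More sophisticated strategies (summarizing dropped turns, keeping tool results intact) fit the same shape: a pure function from a message list to a smaller message list.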
+
+ #### Transform
+
+ Apply arbitrary transformations to the message list before each LLM call:
+
+ ```ruby
+ transform = RubyPi::Context::Transform.new do |messages|
+   # Inject a system prompt at the beginning
+   [{ role: "system", content: "You are a helpful Ruby assistant." }] + messages
+ end
+
+ agent = RubyPi::Agent.new(system_prompt: "...", model: model, transform_context: transform)
+ ```
+
+ ---
+
+ ### RubyPi::Extensions
+
+ Extensions hook into the agent's event system to add cross-cutting behavior (logging, metrics, guardrails) without modifying the core loop.
+
+ #### Defining an Extension
+
+ ```ruby
+ class MetricsExtension < RubyPi::Extensions::Base
+   on_event :turn_start do |event|
+     @turn_timer = Process.clock_gettime(Process::CLOCK_MONOTONIC)
+   end
+
+   on_event :turn_end do |event|
+     elapsed = Process.clock_gettime(Process::CLOCK_MONOTONIC) - @turn_timer
+     puts "Turn #{event[:iteration]} took #{elapsed.round(2)}s"
+   end
+
+   on_event :agent_end do |event|
+     puts "Agent completed in #{event[:result].iterations} iterations"
+   end
+ end
+ ```
+
+ #### Registering Extensions
+
+ ```ruby
+ agent = RubyPi::Agent.new(system_prompt: "You are helpful.", model: model)
+ agent.use(MetricsExtension)
+ agent.use(AnotherExtension)
+ ```
+
+ Extensions receive the same event payloads as `agent.on(...)` callbacks. Use them when you want reusable, self-contained behavior modules.
+
+ ---
+
+ ## Configuration
+
+ All settings are managed through a global configuration block:
+
+ ```ruby
+ RubyPi.configure do |config|
+   # API keys
+   config.gemini_api_key    = ENV["GEMINI_API_KEY"]
+   config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
+   config.openai_api_key    = ENV["OPENAI_API_KEY"]
+
+   # Retry behavior
+   config.max_retries      = 3    # retries for transient errors
+   config.retry_base_delay = 1.0  # base delay (seconds) for exponential backoff
+   config.retry_max_delay  = 30.0 # cap on retry delay
+
+   # Timeouts
+   config.request_timeout = 120 # HTTP request timeout (seconds)
+   config.open_timeout    = 10  # connection open timeout (seconds)
+
+   # Default models (used when you omit the model name)
+   config.default_gemini_model    = "gemini-2.0-flash"
+   config.default_anthropic_model = "claude-sonnet-4-20250514"
+   config.default_openai_model    = "gpt-4o"
+
+   # Logging
+   config.logger = Logger.new($stdout)
+ end
+ ```
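With these settings, the delay before each retry plausibly follows exponential backoff with jitter (the changelog mentions jitter). The standalone sketch below assumes the "full jitter" variant, which may differ from the gem's exact schedule:

```ruby
# delay = rand * min(cap, base * 2**attempt)  -- "full jitter" backoff.
# Illustrative only; RubyPi's internal schedule is not shown here.
def retry_delay(attempt, base: 1.0, cap: 30.0)
  rand * [cap, base * (2**attempt)].min
end

# attempt 0 => up to 1s, attempt 3 => up to 8s, attempt 10 => capped at 30s
```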
+
+ You can also reset configuration to defaults:
+
+ ```ruby
+ RubyPi.reset_configuration!
+ ```
+
+ ---
+
+ ## Contributing
+
+ 1. Fork the repository
+ 2. Create a feature branch (`git checkout -b feature/my-feature`)
+ 3. Write tests for your changes
+ 4. Ensure all tests pass: `bundle exec rspec`
+ 5. Commit your changes (`git commit -am 'Add my feature'`)
+ 6. Push to the branch (`git push origin feature/my-feature`)
+ 7. Open a Pull Request
+
+ Please follow the existing code style and include tests for any new functionality.
+
+ ---
+
+ ## License
+
+ Released under the [MIT License](LICENSE).
data/lib/ruby_pi/agent/core.rb ADDED
@@ -0,0 +1,175 @@
+ # frozen_string_literal: true
+
+ # lib/ruby_pi/agent/core.rb
+ #
+ # RubyPi::Agent::Core — The main Agent class and public entry point.
+ #
+ # Core orchestrates the full agent lifecycle: it manages state, includes the
+ # EventEmitter for event subscriptions, delegates to Loop for the think-act-observe
+ # cycle, and supports extensions via the `use` method. This is the class users
+ # instantiate and interact with when building agentic workflows.
+ #
+ # Usage:
+ #   agent = RubyPi::Agent::Core.new(
+ #     system_prompt: "You are a helpful assistant.",
+ #     model: RubyPi::LLM.model(:gemini, "gemini-2.0-flash"),
+ #     tools: registry
+ #   )
+ #   agent.on(:text_delta) { |d| print d[:content] }
+ #   result = agent.run("Tell me about Ruby")
+
+ module RubyPi
+   module Agent
+     # The main agent class. Wraps State, Loop, and EventEmitter into a cohesive
+     # interface for running agentic LLM interactions with tool use, streaming,
+     # and lifecycle hooks.
+     #
+     # @example Full lifecycle
+     #   agent = RubyPi::Agent::Core.new(
+     #     system_prompt: "You are Olli, an AI assistant.",
+     #     model: RubyPi::LLM.model(:gemini, "gemini-2.0-flash"),
+     #     tools: registry,
+     #     max_iterations: 10,
+     #     before_tool_call: ->(tc) { puts "Calling #{tc.name}" },
+     #     after_tool_call: ->(tc, r) { puts "Done: #{r.success?}" }
+     #   )
+     #
+     #   agent.on(:text_delta) { |d| stream.write(d[:content]) }
+     #   agent.on(:agent_end) { |_| stream.close }
+     #
+     #   result = agent.run("Create a LinkedIn post")
+     #   result = agent.continue("Make it shorter")
+     class Core
+       include EventEmitter
+
+       # @return [RubyPi::Agent::State] the agent's mutable state
+       attr_reader :state
+
+       # Creates a new Agent instance.
+       #
+       # @param system_prompt [String] the system-level instruction prompt
+       # @param model [RubyPi::LLM::BaseProvider] the LLM provider instance
+       # @param tools [RubyPi::Tools::Registry, nil] tool registry
+       # @param max_iterations [Integer] max think-act-observe cycles (default: 10)
+       # @param transform_context [Proc, nil] context transform hook
+       # @param before_tool_call [Proc, nil] pre-tool-execution hook
+       # @param after_tool_call [Proc, nil] post-tool-execution hook
+       # @param compaction [RubyPi::Context::Compaction, nil] compaction strategy
+       # @param user_data [Hash] arbitrary data bag for transforms/extensions
+       def initialize(
+         system_prompt:,
+         model:,
+         tools: nil,
+         max_iterations: 10,
+         transform_context: nil,
+         before_tool_call: nil,
+         after_tool_call: nil,
+         compaction: nil,
+         user_data: {}
+       )
+         @state = State.new(
+           system_prompt: system_prompt,
+           model: model,
+           tools: tools,
+           max_iterations: max_iterations,
+           transform_context: transform_context,
+           before_tool_call: before_tool_call,
+           after_tool_call: after_tool_call,
+           user_data: user_data
+         )
+         @compaction = compaction
+         @extensions = []
+       end
+
+       # Runs the agent with an initial user prompt. Adds the prompt to the
+       # conversation history, executes the think-act-observe loop, emits
+       # :agent_end when done, and returns the result.
+       #
+       # @param prompt [String] the user's initial message
+       # @return [RubyPi::Agent::Result] the outcome of the agent run
+       def run(prompt)
+         @state.add_message(role: :user, content: prompt)
+         execute_loop
+       end
+
+       # Continues the conversation with a follow-up user message. Preserves
+       # the existing conversation history and appends the new prompt before
+       # resuming the loop.
+       #
+       # @param prompt [String] the follow-up user message
+       # @return [RubyPi::Agent::Result] the outcome of the continued run
+       def continue(prompt)
+         # Reset the iteration counter for the new run while keeping history
+         @state.instance_variable_set(:@iteration, 0)
+         @state.add_message(role: :user, content: prompt)
+         execute_loop
+       end
+
+       # Registers an extension with this agent. The extension's hooks are
+       # automatically subscribed to the agent's events.
+       #
+       # @param extension_class [Class] a subclass of RubyPi::Extensions::Base
+       # @return [void]
+       # @raise [ArgumentError] if the argument is not a valid extension class
+       def use(extension_class)
+         unless extension_class.respond_to?(:hooks)
+           raise ArgumentError,
+                 "Expected an extension class with a .hooks method, got #{extension_class.inspect}"
+         end
+
+         # Subscribe each hook to the corresponding event
+         extension_class.hooks.each do |event, handlers|
+           handlers.each do |handler|
+             on(event) do |data|
+               handler.call(data, self)
+             end
+           end
+         end
+
+         @extensions << extension_class
+       end
+
+       private
+
+       # Creates a Loop instance and executes it, emitting :agent_end when
+       # the loop completes.
+       #
+       # @return [RubyPi::Agent::Result]
+       def execute_loop
+         loop_runner = Loop.new(
+           state: @state,
+           emitter: self,
+           compaction: @compaction
+         )
+
+         result = loop_runner.run
+
+         emit(:agent_end, result: result, success: result.success?)
+
+         result
+       end
+     end
+   end
+
+   # Module-level convenience method for creating Agent instances without
+   # referencing Agent::Core directly. Allows `RubyPi::Agent.new(...)`.
+   module Agent
+     class << self
+       # Creates a new Agent::Core instance. This is the recommended entry
+       # point for building agents.
+       #
+       # @param args [Hash] constructor arguments forwarded to Agent::Core.new
+       # @return [RubyPi::Agent::Core] a new agent instance
+       #
+       # @example
+       #   agent = RubyPi::Agent.new(
+       #     system_prompt: "You are helpful.",
+       #     model: model,
+       #     tools: registry
+       #   )
+       def new(**args)
+         Core.new(**args)
+       end
+     end
+   end
+ end