ruby-pi 0.1.5 → 0.1.6

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 45a50058497c3e040f81e977ffdbe0030b901ea053e91002e4ddddba06c23a6f
- data.tar.gz: 6066bc184d7f8eb951ae137595a8ed082ec41c3abc8ca84cbfe5b4206fde6f66
+ metadata.gz: c78d37122ed67d80e61cf51b182dcd79a20a7efa77b503c8b0340963ad60b728
+ data.tar.gz: e3b147cb2b01fe28ac15c2a65d6177156992be7560601886296b16941784ee08
  SHA512:
- metadata.gz: d486b3a171dca8c59bf442ba5c03f135ee450a083c47c05f075af6b23d87eb24285c72c734a9579456d71d6ac247ed4357f7bd16b813219e77655a98f44b0cc2
- data.tar.gz: 602ed6dc493731203c9fedb8d69f118c81ce292c653ca42ac4ca0d9f8d793dc281697fa87f9df9f7601022812dd791d4420d0f22ba89ad6051119791e87f33ca
+ metadata.gz: cbc0c9abddf98885bf1a22352a9cd09475c324f9aff4bcdff66ce3a6a87e06eb677ab045c966038666744cf9819d5114714c66ba5b7c676de5958d5d964a6242
+ data.tar.gz: 3f9c28b1a30d0e3ad0f1badd391c95065adea822927c1d334dc5fc5c9867e658b43e339e9307dd3eba8dd5a534043c9fae3ea8d0384bfae8eb35a1a09356f035
data/CHANGELOG.md CHANGED
@@ -5,6 +5,17 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+ ## [0.1.6] - 2026-05-01
+
+ ### Fixed (adversarial review round 4)
+
+ - **Faraday transport errors leaked untyped, bypassed retry (Critical)**: `BaseProvider#complete` rescued only `RubyPi::*` errors, but providers never wrapped Faraday network exceptions. A `Faraday::TimeoutError`, `Faraday::ConnectionFailed`, or `Faraday::SSLError` propagated as the raw Faraday class — breaking the documented error hierarchy and skipping the retry loop entirely (the exact case retries exist for). Added `BaseProvider#with_transport_errors` which translates `Faraday::TimeoutError` → `RubyPi::TimeoutError` and `Faraday::ConnectionFailed`/`SSLError`/other `Faraday::Error` → `RubyPi::ApiError`. Wrapped every `conn.post` call in all three providers (standard and streaming paths). `RubyPi::ProviderError` is now also retryable
+ - **Gemini multi-turn tool use was broken (Critical)**: `Gemini#format_message` rendered assistant messages as text-only and silently dropped the `:tool_calls` field set by the agent loop. The next turn's `functionResponse` had no preceding `functionCall` to bind to, so Gemini rejected any conversation that included a tool call followed by a tool result. Assistant messages now emit one `functionCall` part per tool call (mirroring Anthropic's `tool_use` and OpenAI's `tool_calls` behavior). Empty text parts are also no longer emitted on tool-only assistant turns
+ - **Compaction split tool_use/tool_result pairs (Critical)**: When `preserve_last_n` cut between an assistant `tool_calls` message (in droppable) and its matching `:tool` result (in preserved), Anthropic and OpenAI rejected the conversation with "tool_result without preceding tool_use". Compaction now strips orphan `:tool` messages from the head of preserved (moves them into droppable so they're summarized away). The mirror case, where preserved starts with a tool result whose assistant turn is the last droppable message, is also handled
+ - **`Tools::Executor` swallowed non-StandardError exceptions as nil success (Major)**: The worker thread rescued only `StandardError`. A tool block raising `Interrupt`, `SystemExit`, or any other `Exception` subclass left both `value` and `error` nil; the join then reported a *successful* `nil` result. Now rescues `Exception` and captures it as a failed `Result`. The worker thread also sets `report_on_exception = false` to avoid stderr spam
+ - **Gemini tool_call IDs collided across turns (Major)**: IDs were generated as `"gemini_#{accumulated_tool_calls.length}"` — every response restarted numbering at 0, so a multi-turn conversation produced multiple tool calls all named `"gemini_0"`. Any caller using ID as a hash key (observability, result correlation) saw collisions. IDs now use `SecureRandom.hex(8)` for global uniqueness across both standard and streaming responses
+ - **OpenAI passed malformed tool_call.arguments JSON verbatim (Minor)**: A non-JSON string in `tool_call.arguments` on an assistant message was forwarded unchanged to OpenAI, producing an opaque HTTP 400. Now validated up-front with `JSON.parse`; malformed input raises a typed `RubyPi::ProviderError` with the tool name and parse error before sending the request, matching Anthropic's input validation
+
  ## [0.1.5] - 2026-04-30
 
  ### Fixed (adversarial review round 3)
@@ -75,12 +75,47 @@ module RubyPi
 
  # Split into messages to summarize and messages to keep
  preserved_count = [@preserve_last_n, messages.size].min
- droppable = messages[0...(messages.size - preserved_count)]
- preserved = messages[(messages.size - preserved_count)..]
+ droppable = messages[0...(messages.size - preserved_count)].dup
+ preserved = messages[(messages.size - preserved_count)..].dup
 
  # If there's nothing to drop, we can't compact further
  return nil if droppable.empty?
 
+ # Anthropic and OpenAI both require every tool_result / tool message
+ # to reference a tool_use / tool_call from a preceding assistant
+ # message. If we summarize the assistant turn that originated a tool
+ # call but keep the matching tool_result, the API rejects the
+ # request with "tool_result without preceding tool_use".
+ #
+ # The boundary between droppable and preserved can split a tool
+ # exchange in two ways:
+ #   (a) preserved starts with one or more :tool messages whose
+ #       matching assistant turn is in droppable. Strip those
+ #       orphan tool messages from the head of preserved (move
+ #       them into droppable so they are summarized, not sent).
+ #   (b) the last droppable message is an :assistant with tool_calls,
+ #       but its matching :tool result(s) are in preserved. Pull
+ #       that assistant message back into preserved so the pair
+ #       stays intact.
+ #
+ # We apply (a) first: it's the common case (preserve_last_n=4 cuts
+ # mid-pair, leaving a stranded tool message). Then (b) catches the
+ # mirror case.
+ while preserved.first && preserved.first[:role] == :tool
+   droppable << preserved.shift
+ end
+
+ if droppable.last &&
+    droppable.last[:role] == :assistant &&
+    droppable.last[:tool_calls].is_a?(Array) &&
+    !droppable.last[:tool_calls].empty? &&
+    preserved.first && preserved.first[:role] == :tool
+   preserved.unshift(droppable.pop)
+ end
+
+ # After the boundary fix-ups, droppable may have become empty.
+ return nil if droppable.empty?
+
  # Generate a summary of the dropped messages
  summary = summarize(droppable)
 
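The boundary fix-up above can be exercised on a toy conversation. A minimal, standalone sketch — the message shapes (`:role`, `:tool_calls`) mirror the diff, but `fix_boundary` is a hypothetical helper name, not the gem's API:

```ruby
# Illustrative sketch of the compaction boundary fix-up, runnable standalone.
def fix_boundary(droppable, preserved)
  droppable = droppable.dup
  preserved = preserved.dup

  # (a) strip orphan :tool messages from the head of preserved into
  # droppable so the pair is summarized together, not sent split
  droppable << preserved.shift while preserved.first && preserved.first[:role] == :tool

  # (b) mirror case: pull a trailing tool-calling assistant back into
  # preserved so it stays adjacent to its :tool result
  last = droppable.last
  if last && last[:role] == :assistant &&
     last[:tool_calls].is_a?(Array) && !last[:tool_calls].empty? &&
     preserved.first && preserved.first[:role] == :tool
    preserved.unshift(droppable.pop)
  end

  [droppable, preserved]
end

messages = [
  { role: :user, content: "find the docs" },
  { role: :assistant, content: "", tool_calls: [{ name: "search" }] },
  { role: :tool, content: "{\"hits\":3}" },
  { role: :assistant, content: "Found 3 results." }
]

# preserve_last_n = 2 cuts between the tool call and its result, so
# case (a) moves the stranded :tool result into droppable:
droppable, preserved = fix_boundary(messages[0..1], messages[2..])
# droppable roles => [:user, :assistant, :tool]; preserved roles => [:assistant]
```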
@@ -330,9 +330,11 @@ module RubyPi
    headers: default_headers
  )
 
- response = conn.post("/v1/messages") do |req|
-   req.headers["Content-Type"] = "application/json"
-   req.body = JSON.generate(body)
+ response = with_transport_errors do
+   conn.post("/v1/messages") do |req|
+     req.headers["Content-Type"] = "application/json"
+     req.body = JSON.generate(body)
+   end
  end
 
  handle_error_response(response) unless response.success?
@@ -375,11 +377,12 @@ module RubyPi
  # full body even though on_data consumed the chunks.
  error_body = +""
 
- response = conn.post("/v1/messages") do |req|
-   req.headers["Content-Type"] = "application/json"
-   req.body = JSON.generate(body)
+ response = with_transport_errors do
+   conn.post("/v1/messages") do |req|
+     req.headers["Content-Type"] = "application/json"
+     req.body = JSON.generate(body)
 
-   # Use Faraday's on_data callback for real incremental streaming.
+     # Use Faraday's on_data callback for real incremental streaming.
      # Without this, Faraday buffers the entire response body before
      # returning, which means no deltas reach the caller until the model
      # finishes generating (fake streaming).
@@ -424,7 +427,8 @@ module RubyPi
        finish_reason = stream_state[:finish_reason]
      end
    end
- end
+   end # conn.post
+ end # with_transport_errors
 
  # Check for HTTP errors. When on_data was active, the response body
  # was consumed by the callback, so we pass the accumulated error_body
@@ -78,7 +78,7 @@ module RubyPi
  rescue RubyPi::AuthenticationError
    # Authentication errors are not retryable — raise immediately
    raise
- rescue RubyPi::RateLimitError, RubyPi::ApiError, RubyPi::TimeoutError => e
+ rescue RubyPi::RateLimitError, RubyPi::ApiError, RubyPi::TimeoutError, RubyPi::ProviderError => e
    # Retry up to max_retries times AFTER the initial attempt.
    # With max_retries: 3, attempt goes 1 (initial), 2, 3, 4 — the condition
    # `attempt <= @max_retries` allows retries on attempts 1..3, so we get
@@ -178,6 +178,45 @@ module RubyPi
    end
  end
 
+ # Wraps an HTTP block, translating Faraday transport-level exceptions
+ # (DNS failures, connection resets, TLS handshakes, read/write timeouts)
+ # into the RubyPi typed-error hierarchy so callers and the retry loop
+ # can rescue them uniformly.
+ #
+ # Without this wrapper, a `Faraday::TimeoutError` or
+ # `Faraday::ConnectionFailed` would propagate out of the provider as
+ # the raw Faraday class. That breaks two contracts:
+ #   1. The documented retry policy (BaseProvider#complete) only rescues
+ #      RubyPi errors, so transport failures would not be retried —
+ #      exactly the case retries exist for.
+ #   2. Callers `rescue RubyPi::TimeoutError` per the documented error
+ #      hierarchy and would not catch real network timeouts.
+ #
+ # @yield the HTTP call to wrap
+ # @return [Object] whatever the block returns
+ # @raise [RubyPi::TimeoutError] on Faraday::TimeoutError
+ # @raise [RubyPi::ApiError] on connection failures, SSL errors, or
+ #   any other Faraday::Error not otherwise classified
+ def with_transport_errors
+   yield
+ rescue Faraday::TimeoutError => e
+   raise RubyPi::TimeoutError, "#{provider_name} request timed out: #{e.message}"
+ rescue Faraday::ConnectionFailed, Faraday::SSLError => e
+   raise RubyPi::ApiError.new(
+     "#{provider_name} transport error: #{e.class}: #{e.message}",
+     status_code: nil,
+     response_body: nil
+   )
+ rescue Faraday::Error => e
+   # Catch-all for any other Faraday-level failure (parsing, adapter
+   # issues, etc.) so transport problems never leak provider internals.
+   raise RubyPi::ApiError.new(
+     "#{provider_name} HTTP client error: #{e.class}: #{e.message}",
+     status_code: nil,
+     response_body: nil
+   )
+ end
+
  # Handles HTTP error responses by raising the appropriate RubyPi error.
  # When streaming with on_data, the response body is consumed by the
  # callback and response.body may be empty. Pass override_body with the
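The translate-and-reraise pattern used here can be shown in a self-contained sketch. `FakeTransportTimeout` stands in for `Faraday::TimeoutError` and `WrappedTimeoutError` for `RubyPi::TimeoutError`, since neither Faraday nor the gem is loaded:

```ruby
# Stand-in classes: neither Faraday nor RubyPi is required here.
class FakeTransportTimeout < StandardError; end
class WrappedTimeoutError < StandardError; end

def with_transport_errors
  yield
rescue FakeTransportTimeout => e
  # Re-raise as the library's typed error so a single rescue clause
  # (and the retry loop) handles HTTP-level and transport-level
  # timeouts the same way.
  raise WrappedTimeoutError, "request timed out: #{e.message}"
end

caught = nil
begin
  with_transport_errors { raise FakeTransportTimeout, "read timeout" }
rescue WrappedTimeoutError => e
  caught = e.message
end
caught  # => "request timed out: read timeout"
```

The design point: callers never need to know which HTTP client sits underneath, because every transport failure surfaces as a class from the library's own hierarchy.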
@@ -6,6 +6,8 @@
  # the Gemini REST API for both synchronous and streaming completions, including
  # tool/function calling support.
 
+ require "securerandom"
+
  module RubyPi
    module LLM
      # Google Gemini provider implementation. Communicates with the Gemini
@@ -115,44 +117,116 @@ module RubyPi
 
  # Converts a normalized message hash to Gemini's content format.
  #
+ # Critically, an assistant message that carries `tool_calls` (set by
+ # the agent loop after a tool-using turn) must be rendered with one
+ # `functionCall` part per tool call. Without those parts, Gemini
+ # rejects any subsequent `functionResponse` on the next turn because
+ # the response has nothing to correlate against. Earlier versions
+ # dropped `tool_calls` here, breaking multi-turn tool use.
+ #
  # @param message [Hash] a message with :role and :content keys
  # @return [Hash] Gemini-formatted content object
  def format_message(message)
    role = message[:role]&.to_s || message["role"]&.to_s || "user"
-   content = message[:content] || message["content"] || ""
-
-   # Gemini uses "user" and "model" roles. Map tool results to "user"
-   # role with a functionResponse part when we have the metadata, or
-   # plain text otherwise. System messages should have been extracted
-   # by build_request_body before reaching this method.
-   gemini_role = case role
-                 when "assistant" then "model"
-                 when "tool" then "user"
-                 else role
-                 end
-
-   # Tool-role messages carry function call results. When tool_call_id
-   # and name are present, send as a Gemini functionResponse so the
-   # model can correlate the result with its earlier functionCall.
+   content = message[:content] || message["content"]
+
+   # Tool-role messages carry function-call results. When the tool name
+   # is present, send as a Gemini functionResponse so the model can
+   # correlate the result with its earlier functionCall. System messages
+   # should have been extracted by build_request_body before reaching
+   # this method.
    tool_name = message[:name] || message["name"]
    if role == "tool" && tool_name
+     # Gemini's functionResponse expects a structured `response` object.
+     # Tool results are pre-serialized by the loop as either a JSON
+     # string (success) or an "Error: ..." string (failure). Try to
+     # parse JSON so the model receives structured data; fall back to
+     # wrapping the raw string under :result for plain-text content.
+     response_payload = parse_tool_response(content)
      return {
        role: "user",
        parts: [{
          functionResponse: {
            name: tool_name.to_s,
-           response: { result: content.to_s }
+           response: response_payload
          }
        }]
      }
    end
 
+   # Assistant messages may carry `tool_calls` from a prior turn. Each
+   # one must be emitted as a `functionCall` part on the model turn so
+   # that the next turn's `functionResponse` has something to bind to.
+   if role == "assistant"
+     parts = []
+     text = content.to_s
+     parts << { text: text } unless text.empty?
+
+     tool_calls = message[:tool_calls] || message["tool_calls"]
+     if tool_calls.is_a?(Array)
+       tool_calls.each do |tc|
+         tc_name = (tc[:name] || tc["name"]).to_s
+         tc_args = tc[:arguments] || tc["arguments"] || {}
+         tc_args = parse_tool_arguments(tc_args)
+         parts << { functionCall: { name: tc_name, args: tc_args } }
+       end
+     end
+
+     # Gemini rejects an empty parts array on a model turn. If the
+     # assistant truly had no content and no tool_calls, fall back to
+     # an empty text part.
+     parts << { text: "" } if parts.empty?
+
+     return { role: "model", parts: parts }
+   end
+
    {
-     role: gemini_role,
+     role: role,
      parts: [{ text: content.to_s }]
    }
  end
 
+ # Best-effort parse of a tool-result string into a structured object
+ # for Gemini's `functionResponse.response`. JSON content is returned
+ # as-is (wrapped in a hash if it parsed to a non-hash); non-JSON
+ # content (e.g., "Error: ...") is wrapped under :result.
+ #
+ # @param content [String, Hash, nil]
+ # @return [Hash]
+ def parse_tool_response(content)
+   return { result: "" } if content.nil?
+   return content if content.is_a?(Hash)
+
+   str = content.to_s
+   return { result: str } if str.strip.empty?
+
+   begin
+     parsed = JSON.parse(str)
+     parsed.is_a?(Hash) ? parsed : { result: parsed }
+   rescue JSON::ParserError
+     { result: str }
+   end
+ end
+
+ # Coerce a tool_call.arguments value (Hash, JSON string, or other)
+ # into a Hash suitable for Gemini's `functionCall.args`. Malformed
+ # or non-Hash values become an empty hash so the request is still
+ # well-formed.
+ #
+ # @param args [Hash, String, nil]
+ # @return [Hash]
+ def parse_tool_arguments(args)
+   return args if args.is_a?(Hash)
+   return {} unless args.is_a?(String) && !args.strip.empty?
+
+   begin
+     parsed = JSON.parse(args)
+     parsed.is_a?(Hash) ? parsed : {}
+   rescue JSON::ParserError
+     {}
+   end
+ end
+
  # Converts a tool definition to Gemini's function declaration format.
  # Accepts either a RubyPi::Tools::Definition or a plain Hash.
  #
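The coercion rules for `tool_call.arguments` can be restated as a runnable sketch (same logic as `parse_tool_arguments` in the diff, reproduced standalone): a Hash passes through, a JSON object string parses, and anything else degrades to `{}` so the request body stays well-formed.

```ruby
require "json"

# Standalone copy of the best-effort coercion shown in the diff.
def coerce_args(args)
  return args if args.is_a?(Hash)
  return {} unless args.is_a?(String) && !args.strip.empty?

  begin
    parsed = JSON.parse(args)
    # Valid JSON that is not an object (e.g. an array) is discarded:
    # Gemini's functionCall.args must be an object.
    parsed.is_a?(Hash) ? parsed : {}
  rescue JSON::ParserError
    {}
  end
end

coerce_args({ "q" => 1 })   # => {"q"=>1}
coerce_args('{"q": 1}')     # => {"q"=>1}
coerce_args("[1, 2]")       # => {} (valid JSON, but not an object)
coerce_args("not json")     # => {}
coerce_args(nil)            # => {}
```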
@@ -198,9 +272,11 @@ module RubyPi
    conn = build_connection(base_url: BASE_URL, headers: default_headers)
    url = "/#{API_VERSION}/models/#{@model}:generateContent"
 
- response = conn.post(url) do |req|
-   req.headers["Content-Type"] = "application/json"
-   req.body = JSON.generate(body)
+ response = with_transport_errors do
+   conn.post(url) do |req|
+     req.headers["Content-Type"] = "application/json"
+     req.body = JSON.generate(body)
+   end
  end
 
  handle_error_response(response) unless response.success?
@@ -233,11 +309,12 @@ module RubyPi
  response_status = nil
  error_body = +""
 
- response = conn.post(url) do |req|
-   req.headers["Content-Type"] = "application/json"
-   req.body = JSON.generate(body)
+ response = with_transport_errors do
+   conn.post(url) do |req|
+     req.headers["Content-Type"] = "application/json"
+     req.body = JSON.generate(body)
 
-   # Use Faraday's on_data callback for real incremental streaming.
+     # Use Faraday's on_data callback for real incremental streaming.
      # Without this, Faraday buffers the entire response body before
      # returning — no deltas reach the caller until the model finishes
      # generating (fake streaming).
@@ -281,7 +358,12 @@ module RubyPi
  elsif part.key?("functionCall")
    fc = part["functionCall"]
    tool_call = ToolCall.new(
-     id: "gemini_#{accumulated_tool_calls.length}",
+     # Generate a globally-unique ID per tool call. A simple
+     # length-based counter ("gemini_0", "gemini_1") collides
+     # across turns since each response restarts numbering at
+     # 0, breaking any caller that uses ID as a hash key for
+     # observability or result correlation.
+     id: "gemini_#{SecureRandom.hex(8)}",
      name: fc["name"],
      arguments: fc["args"] || {}
    )
@@ -308,7 +390,8 @@ module RubyPi
        end
      end
    end
- end
+   end # conn.post
+ end # with_transport_errors
 
  # When on_data is active, the response body was consumed by the
  # callback. Pass the accumulated error_body so ApiError carries the
@@ -347,7 +430,9 @@ module RubyPi
  elsif part.key?("functionCall")
    fc = part["functionCall"]
    tool_calls << ToolCall.new(
-     id: "gemini_#{tool_calls.length}",
+     # See note in perform_streaming_request: per-response counters
+     # collide across turns, so we generate a globally-unique ID.
+     id: "gemini_#{SecureRandom.hex(8)}",
      name: fc["name"],
      arguments: fc["args"] || {}
    )
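Why the ID scheme changed, shown concretely: a per-response counter restarts at zero each turn, so two turns each emit `"gemini_0"` and collide when used as hash keys, while `SecureRandom.hex(8)` yields 16 hex characters of randomness per ID. A small illustration (stdlib only):

```ruby
require "securerandom"

# The old counter scheme: each response numbers its own calls from 0.
counter_turn1 = (0...2).map { |i| "gemini_#{i}" }
counter_turn2 = (0...2).map { |i| "gemini_#{i}" }
colliding = counter_turn1 & counter_turn2  # both turns reuse the same IDs

# The new scheme: random hex suffix, unique across turns and responses.
random_ids = Array.new(4) { "gemini_#{SecureRandom.hex(8)}" }
```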
@@ -183,11 +183,31 @@ module RubyPi
    tc_name = tc[:name] || tc["name"]
    tc_args = tc[:arguments] || tc["arguments"] || {}
 
-   # OpenAI requires arguments as a JSON string
-   args_string = if tc_args.is_a?(String)
-                   tc_args
-                 elsif tc_args.is_a?(Hash)
+   # OpenAI requires arguments to be a JSON-encoded string. We
+   # validate up-front so a malformed string fails fast with a
+   # typed error here rather than as an opaque HTTP 400 from
+   # OpenAI. This mirrors Anthropic's input validation in
+   # build_assistant_message.
+   args_string = case tc_args
+                 when Hash
                    JSON.generate(tc_args)
+                 when String
+                   stripped = tc_args.strip
+                   if stripped.empty?
+                     "{}"
+                   else
+                     begin
+                       JSON.parse(tc_args)
+                       tc_args
+                     rescue JSON::ParserError => e
+                       raise RubyPi::ProviderError.new(
+                         "Invalid JSON in assistant tool_call.arguments " \
+                         "for tool '#{tc_name || "unknown"}': #{e.message} " \
+                         "(raw: #{tc_args.inspect})",
+                         provider: :openai
+                       )
+                     end
+                   end
                  else
                    "{}"
                  end
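The fail-fast validation above can be sketched as a standalone snippet. `InvalidToolArguments` and `normalize_args_string` are stand-ins (for `RubyPi::ProviderError` and the in-method case statement) so the example runs without the gem:

```ruby
require "json"

class InvalidToolArguments < StandardError; end

def normalize_args_string(tc_name, tc_args)
  case tc_args
  when Hash then JSON.generate(tc_args)
  when String
    return "{}" if tc_args.strip.empty?
    begin
      JSON.parse(tc_args) # validate only; forward the original string
      tc_args
    rescue JSON::ParserError => e
      raise InvalidToolArguments,
            "Invalid JSON in tool_call.arguments for '#{tc_name}': #{e.message}"
    end
  else
    "{}"
  end
end

normalize_args_string("search", { q: 1 })    # => "{\"q\":1}"
normalize_args_string("search", '{"q": 1}')  # passes through unchanged

begin
  normalize_args_string("search", "oops")
rescue InvalidToolArguments => e
  e.message  # names the tool and the parse error, before any HTTP call
end
```

Note that valid JSON strings are forwarded verbatim rather than re-serialized, so whitespace and key order are preserved exactly as the model produced them.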
@@ -261,9 +281,11 @@ module RubyPi
    headers: default_headers
  )
 
- response = conn.post("/v1/chat/completions") do |req|
-   req.headers["Content-Type"] = "application/json"
-   req.body = JSON.generate(body)
+ response = with_transport_errors do
+   conn.post("/v1/chat/completions") do |req|
+     req.headers["Content-Type"] = "application/json"
+     req.body = JSON.generate(body)
+   end
  end
 
  handle_error_response(response) unless response.success?
@@ -300,11 +322,12 @@ module RubyPi
  response_status = nil
  error_body = +""
 
- response = conn.post("/v1/chat/completions") do |req|
-   req.headers["Content-Type"] = "application/json"
-   req.body = JSON.generate(body)
+ response = with_transport_errors do
+   conn.post("/v1/chat/completions") do |req|
+     req.headers["Content-Type"] = "application/json"
+     req.body = JSON.generate(body)
 
-   # Use Faraday's on_data callback for real incremental streaming.
+     # Use Faraday's on_data callback for real incremental streaming.
      # Without this, Faraday buffers the entire response body before
      # returning — no deltas reach the caller until the model finishes
      # generating (fake streaming).
@@ -389,7 +412,8 @@ module RubyPi
      end
    end
  end
- end
+   end # conn.post
+ end # with_transport_errors
 
  # When on_data is active, the response body was consumed by the
  # callback. Pass the accumulated error_body so ApiError carries the
@@ -195,9 +195,18 @@ module RubyPi
  error = nil
 
  worker = Thread.new do
+   # Don't spam stderr from the rescued worker thread.
+   Thread.current.report_on_exception = false
    begin
      value = tool.call(arguments)
-   rescue StandardError => e
+   rescue Exception => e # rubocop:disable Lint/RescueException
+     # Rescue the full Exception hierarchy (not just StandardError).
+     # If a tool block raises Interrupt, SystemExit, or any other
+     # non-StandardError, rescuing only StandardError leaves both
+     # `value` and `error` nil; the join then reports a successful
+     # nil result — a panic in a tool silently becomes "returned nil".
+     # Capture the failure here; the main thread surfaces it as a
+     # failed Result. The worker thread itself does not propagate.
      error = e
    end
  end
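The worker-thread fix can be demonstrated in isolation. A minimal sketch, where `run_tool` is an illustrative stand-in for the Executor's worker logic (not the gem's API): rescuing `Exception` means a tool raising `Interrupt` is captured as a failure instead of surfacing as a successful `nil`.

```ruby
# Stand-in for the Executor's worker-thread logic shown in the diff.
def run_tool
  value = nil
  error = nil
  worker = Thread.new do
    # Suppress Ruby's default "unhandled exception in thread" stderr noise.
    Thread.current.report_on_exception = false
    begin
      value = yield
    rescue Exception => e # rubocop:disable Lint/RescueException
      # Catches Interrupt, SystemExit, etc. — not just StandardError.
      error = e
    end
  end
  worker.join
  [value, error]
end

ok_value, ok_error = run_tool { 42 }         # => [42, nil]
_, bad_error = run_tool { raise Interrupt }  # bad_error is an Interrupt, not nil
```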
@@ -7,5 +7,5 @@
 
  module RubyPi
    # The current version of the RubyPi gem, following Semantic Versioning.
-   VERSION = "0.1.5"
+   VERSION = "0.1.6"
  end
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: ruby-pi
  version: !ruby/object:Gem::Version
-   version: 0.1.5
+   version: 0.1.6
  platform: ruby
  authors:
  - RubyPi Contributors
  bindir: bin
  cert_chain: []
- date: 1980-01-02 00:00:00.000000000 Z
+ date: 2026-05-01 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: faraday
@@ -157,7 +157,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
    version: '0'
  requirements: []
- rubygems_version: 3.6.9
+ rubygems_version: 3.6.2
  specification_version: 4
  summary: AI agent harness for Ruby — build LLM agents with tool calling, streaming,
    and a unified interface to OpenAI, Anthropic Claude, and Google Gemini.