llm.rb 8.0.0 → 8.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 4d726213f6b63342582738a133f7f82c1158934d6f25a48ae6b6c9e59a8f8262
-  data.tar.gz: 6288d177adc7a07a37368066329c882f746747d5bed9ffba7cb50d2bcbd1d98c
+  metadata.gz: 8aa3ee461642fb157bece63a4ebe00ceda8ec66ce24df5c842efdcc176861a53
+  data.tar.gz: 2d26e36b812704a80e5c8ba4814cfbec770afd5694be71b69d7937422f9a642c
 SHA512:
-  metadata.gz: 4ae089f4117dc384000a70500c40ebadf48f42d1bd820d0840568b3b31b0197e51c65e9f60fe65d0e75c23aa4c7eac977be928a38969580174169bd0efe39912
-  data.tar.gz: 9653135f93b9b2b722102f055dc961346949368dab161a3cff64e99ddfc6781933a94b527151da9a24ff39451814f76c5409389f91c3692852eb17bd5d3d11f9
+  metadata.gz: 3a30bf9d5309bf49c660137ed5e81b74f9b028f8846077f3db0b7c92745a5d96b16115db765a4bd1970ba0cbaaa7bd805e0a4a37c04c7e63aacdf3d019d268ec
+  data.tar.gz: 4e297d159dc459ee9ec228862f271b7a21be48ce06f092773c4b56d9cc007252b1cfeb66a119c7e14f3e683213e5923d34b0c256397b92cfa981cd47fe023008
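The SHA256 values above can be checked locally before trusting a downloaded artifact. A minimal sketch using Ruby's standard library (the file path and digest passed in are up to the caller; `verify_checksum` is a name invented here, not part of any gem):

```ruby
require "digest"

# Compare the SHA256 digest of a downloaded file against a checksum
# published in checksums.yaml. Returns true on a match.
def verify_checksum(path, expected_hex)
  Digest::SHA256.file(path).hexdigest == expected_hex
end
```

For the 8.1.0 release, `verify_checksum("data.tar.gz", "2d26e36b…")` with the full digest from the hunk above should return `true`.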
data/CHANGELOG.md CHANGED
@@ -2,6 +2,51 @@
 
 ## Unreleased
 
+## v8.1.0
+
+Changes since `v8.0.0`.
+
+This release adds Amazon Bedrock provider support through the Converse
+API, including AWS SigV4 request signing, event stream decoding,
+structured output through `schema:`, and a models.dev-backed registry.
+It exposes `llm.models.all` for Bedrock via the ListFoundationModels
+API and adds `LLM::Object#transform_values!` for in-place value
+transformation. Several Bedrock-specific fixes land as well, including
+response id exposure, blank text block suppression in tool turns, and
+DSML tool-marker filtering in streamed text.
+
+### Add
+
+* **Add AWS Bedrock provider support** <br>
+  Add `LLM.bedrock(...)` with Bedrock Converse chat support, AWS SigV4
+  request signing, Bedrock event stream decoding, structured output
+  support through `schema:`, and models.dev-backed `bedrock.json`
+  registry generation.
+
+* **Add AWS Bedrock Models endpoint support** <br>
+  Add `llm.models.all` for Bedrock via the ListFoundationModels API,
+  including SigV4 signing for the control-plane endpoint and normalized
+  `LLM::Model` collection responses.
+
+* **Add `LLM::Object#transform_values!`** <br>
+  Let `LLM::Object` transform stored values in place through
+  `#transform_values!`.
+
+### Fix
+
+* **Expose response ids on Bedrock completion responses** <br>
+  Read the Bedrock request id into `LLM::Response#id` for completion
+  responses adapted from the Converse API.
+
+* **Avoid blank assistant text blocks in Bedrock tool turns** <br>
+  Stop replaying assistant tool-call messages with empty text content
+  blocks that Bedrock rejects.
+
+* **Suppress Bedrock DSML tool markers in streamed text** <br>
+  Filter `"<|DSML|function_calls"` markers out of streamed Bedrock
+  assistant text so tool-call sentinels do not leak into user-visible
+  output.
+
 ## v8.0.0
 
 Changes since `v7.0.0`.
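The changelog entry above mentions AWS SigV4 request signing. As a rough illustration of what that involves — not llm.rb's actual implementation — SigV4 derives a scoped signing key by chaining HMAC-SHA256 over the request date, region, and service, seeded with `"AWS4"` plus the secret access key:

```ruby
require "openssl"

def hmac(key, data)
  OpenSSL::HMAC.digest(OpenSSL::Digest.new("SHA256"), key, data)
end

# Derive the AWS SigV4 signing key: HMAC-SHA256 chained over the
# date (YYYYMMDD), region, service name, and the literal string
# "aws4_request". Illustrative sketch only; inputs are placeholders.
def sigv4_signing_key(secret, date, region, service)
  k_date    = hmac("AWS4" + secret, date)
  k_region  = hmac(k_date, region)
  k_service = hmac(k_region, service)
  hmac(k_service, "aws4_request")
end

key = sigv4_signing_key("EXAMPLEKEY", "20240101", "us-east-1", "bedrock")
```

The request signature is then `OpenSSL::HMAC.hexdigest("SHA256", key, string_to_sign)`, carried in the `Authorization` header alongside the credential scope.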
data/README.md CHANGED
@@ -4,7 +4,7 @@
 <p align="center">
   <a href="https://0x1eef.github.io/x/llm.rb?rebuild=1"><img src="https://img.shields.io/badge/docs-0x1eef.github.io-blue.svg" alt="RubyDoc"></a>
   <a href="https://opensource.org/license/0bsd"><img src="https://img.shields.io/badge/License-0BSD-orange.svg?" alt="License"></a>
-  <a href="https://github.com/llmrb/llm.rb/tags"><img src="https://img.shields.io/badge/version-8.0.0-green.svg?" alt="Version"></a>
+  <a href="https://github.com/llmrb/llm.rb/tags"><img src="https://img.shields.io/badge/version-8.1.0-green.svg?" alt="Version"></a>
 </p>
 
 ## About
@@ -24,13 +24,18 @@ It provides one runtime for providers, agents, tools, skills, MCP servers, strea
 schemas, files, and persisted state, so real systems can be built out of one coherent
 execution model instead of a pile of adapters.
 
+It supports providers including OpenAI, Anthropic, Google Gemini, DeepSeek, xAI,
+Z.ai, and AWS Bedrock.
+
 It provides concurrent tool execution with multiple strategies exposed through a single
 runtime: async-task, threads, fibers, ractors and processes (fork). The first three are
 good for IO-bound work and the last two are good for CPU-bound work. Ractor support is
 experimental and comes with limitations.
 
 Want to see some code? Jump to [the examples](#examples) section. <br>
-Want to see a self-hosted LLM environment built on llm.rb? Check out [Relay](https://github.com/llmrb/relay).
+Want to see a self-hosted LLM environment built on llm.rb? Check out [relay.app](https://github.com/llmrb/relay.app). <br>
+Want to use llm.rb with mruby? Check out [mruby-llm](https://github.com/llmrb/mruby-llm).
+
 
 ## Architecture
 
@@ -442,7 +447,7 @@ worker.join
   preserve OpenAI request shapes but change the API root path.
 - **Provider support is broad** <br>
   Work with OpenAI, OpenAI-compatible endpoints, Anthropic, Google, DeepSeek,
-  Z.ai, xAI, llama.cpp, and Ollama through the same runtime.
+  Z.ai, xAI, AWS Bedrock, llama.cpp, and Ollama through the same runtime.
 - **Tools are explicit** <br>
   Run local tools, provider-native tools, and MCP tools through the same path
   with fewer special cases.
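One of the smaller additions in the changelog above is `LLM::Object#transform_values!`. Assuming it mirrors Ruby's built-in `Hash#transform_values!` — which the name suggests, though the source does not spell out the contract — the semantics can be previewed on a plain Hash:

```ruby
# Hash#transform_values! (Ruby >= 2.4) replaces every value in place
# with the block's result, leaving keys untouched, and returns the
# receiver. The changelog's LLM::Object#transform_values! presumably
# offers the same contract for LLM::Object instances.
params = {"temperature" => "0.7", "top_p" => "0.9"}
params.transform_values! { |v| v.to_f }
# params is now {"temperature" => 0.7, "top_p" => 0.9}
```

Because the mutation happens in place, this avoids allocating a second object when normalizing stored values, which is the usual reason to reach for the bang variant over `#transform_values`.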