llm.rb 4.15.0 → 4.16.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/CHANGELOG.md +38 -0
- data/README.md +34 -12
- data/data/anthropic.json +218 -198
- data/data/deepseek.json +1 -1
- data/data/google.json +481 -429
- data/data/openai.json +742 -704
- data/data/xai.json +277 -277
- data/data/zai.json +160 -126
- data/lib/llm/active_record/acts_as_llm.rb +237 -0
- data/lib/llm/active_record.rb +3 -0
- data/lib/llm/context.rb +9 -6
- data/lib/llm/provider.rb +16 -1
- data/lib/llm/providers/openai/audio.rb +4 -4
- data/lib/llm/providers/openai/files.rb +6 -6
- data/lib/llm/providers/openai/images.rb +4 -4
- data/lib/llm/providers/openai/models.rb +2 -2
- data/lib/llm/providers/openai/moderations.rb +2 -2
- data/lib/llm/providers/openai/responses.rb +4 -4
- data/lib/llm/providers/openai/vector_stores.rb +12 -12
- data/lib/llm/providers/openai.rb +4 -4
- data/lib/llm/sequel/plugin.rb +3 -4
- data/lib/llm/version.rb +1 -1
- metadata +3 -1
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 6119e0673d055f50139fd33a4523bd99ffabb0fc2688a361f23297ef9ac428fd
+  data.tar.gz: bfa5943241a2e9944245f0976458cfdd4895e7b72dbb83fd186f9d7ad2b987ee
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: e139622d5cbaa8f42a1a6739b9792119f9086ae0b11ae896794953d9bb0fa9ed46159da7f10f7a8b619ef468404ae425b2158b359c940a11220dd48b0b230e2e
+  data.tar.gz: 563f48928b6836d2bbcbc8d72bd594aa265fa285ad99cb8ff0c9574f9e67b7da60f9d42c2eeb1bddce0ba72e1425dab260bc65670fa7d812f7a77dc8016262d7
data/CHANGELOG.md
CHANGED
@@ -2,8 +2,46 @@
 
 ## Unreleased
 
+Changes since `v4.16.1`.
+
+## v4.16.1
+
+Changes since `v4.16.0`.
+
+This release tightens ORM persistence by removing an unnecessary JSON
+round-trip when restoring structured `:json` and `:jsonb` context
+payloads.
+
+### Change
+
+* **Restore structured ORM payloads directly** <br>
+  Teach `LLM::Context#restore` to accept parsed data payloads and use
+  that path from the ActiveRecord and Sequel persistence wrappers for
+  `format: :json` and `:jsonb`, avoiding a redundant
+  `Hash -> JSON string -> Hash` round-trip on restore.
+
+## v4.16.0
+
 Changes since `v4.15.0`.
 
+This release expands ORM support with built-in ActiveRecord persistence
+and improves compatibility with OpenAI-compatible gateways, proxies, and
+self-hosted servers that use non-standard API root paths.
+
+### Change
+
+* **Support OpenAI-compatible base paths** <br>
+  Add `base_path:` to provider configuration so OpenAI-compatible
+  endpoints can vary both host and API prefix. This supports providers,
+  proxies, and gateways that keep OpenAI request shapes but use
+  non-standard URL layouts such as DeepInfra's `/v1/openai/...`.
+
+* **Add ActiveRecord context persistence with `acts_as_llm`** <br>
+  Add a built-in ActiveRecord wrapper that mirrors the Sequel plugin
+  API so applications can persist `LLM::Context` state on records with
+  default columns, provider/context hooks, validation-backed writes,
+  and `format: :string`, `:json`, or `:jsonb` storage.
+
 ## v4.15.0
 
 Changes since `v4.14.0`.
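The redundant restore path that the v4.16.1 entry removes can be sketched in plain Ruby. This is an illustration of the `Hash -> JSON string -> Hash` round-trip only; the method names below are hypothetical and are not llm.rb's actual internals:

```ruby
require "json"

# A :jsonb column typically arrives from the ORM already parsed as a Hash.
payload = { "model" => "gpt-4o", "messages" => [{ "role" => "user", "content" => "hi" }] }

# Old path (hypothetical name): re-serialize the parsed payload only to
# parse it again -- a redundant Hash -> JSON string -> Hash round-trip.
def restore_via_string(data)
  JSON.parse(JSON.generate(data))
end

# New path (hypothetical name): accept the parsed payload directly.
def restore_direct(data)
  data
end

# Both yield the same structure, but only one allocates a string and a
# second Hash along the way.
restore_via_string(payload) == restore_direct(payload) # => true
```

The equality holds because `:json`/`:jsonb` payloads are already plain parsed data, which is why the wrappers can hand them to `LLM::Context#restore` as-is.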
data/README.md
CHANGED
@@ -4,7 +4,7 @@
 <p align="center">
 <a href="https://0x1eef.github.io/x/llm.rb?rebuild=1"><img src="https://img.shields.io/badge/docs-0x1eef.github.io-blue.svg" alt="RubyDoc"></a>
 <a href="https://opensource.org/license/0bsd"><img src="https://img.shields.io/badge/License-0BSD-orange.svg?" alt="License"></a>
-<a href="https://github.com/llmrb/llm.rb/tags"><img src="https://img.shields.io/badge/version-4.
+<a href="https://github.com/llmrb/llm.rb/tags"><img src="https://img.shields.io/badge/version-4.16.1-green.svg?" alt="Version"></a>
 </p>
 
 ## About
@@ -17,9 +17,9 @@ state.
 It is built for engineers who want control over how these systems run. llm.rb
 stays close to Ruby, runs on the standard library by default, loads optional
 pieces only when needed, and remains easy to extend. It also works well in
-Rails or ActiveRecord applications,
-support
-
+Rails or ActiveRecord applications, with built-in `acts_as_llm`, and includes
+built-in Sequel support through `plugin :llm`, so long-lived context state can
+be saved and restored across requests, jobs, or retries.
 
 Most LLM libraries stop at request/response APIs. Building real systems means
 stitching together streaming, tools, state, persistence, and external
@@ -87,12 +87,12 @@ same context object.
 - **MCP is built in** <br>
   Connect to MCP servers over stdio or HTTP without bolting on a separate
   integration stack.
-- **Sequel persistence
-  Use `
-  sensible default columns
-  `provider:`
-  `format: :jsonb`
-
+- **ActiveRecord and Sequel persistence are built in** <br>
+  Use `acts_as_llm` on ActiveRecord models or `plugin :llm` on Sequel models
+  to persist `LLM::Context` state with sensible default columns. Both support
+  `provider:` and `context:` hooks, plus `format: :string` for text columns
+  or `format: :jsonb` for native PostgreSQL JSON storage when ORM JSON
+  typecasting support is enabled.
 - **Persistent HTTP pooling is shared process-wide** <br>
   When enabled, separate
   [`LLM::Provider`](https://0x1eef.github.io/x/llm.rb/LLM/Provider.html)
@@ -101,6 +101,10 @@ same context object.
   [`LLM::MCP`](https://0x1eef.github.io/x/llm.rb/LLM/MCP.html)
   instances can do the same, instead of each object creating its own
   isolated per-instance transport.
+- **OpenAI-compatible gateways are supported** <br>
+  Target OpenAI-compatible services such as DeepInfra and OpenRouter, as well
+  as proxies and self-hosted servers, with `host:` and `base_path:` when they
+  preserve OpenAI request shapes but change the API root path.
 - **Provider support is broad** <br>
   Work with OpenAI, OpenAI-compatible endpoints, Anthropic, Google, DeepSeek,
   Z.ai, xAI, llama.cpp, and Ollama through the same runtime.
@@ -203,12 +207,30 @@ ctx.talk("Remember that my favorite language is Ruby")
 puts ctx.talk("What is my favorite language?").content
 ```
 
+**ActiveRecord (ORM)**
+
+See the [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) for more examples.
+
+```ruby
+require "llm"
+require "active_record"
+require "llm/active_record"
+
+class Context < ApplicationRecord
+  acts_as_llm provider: -> { { key: ENV["#{provider.upcase}_SECRET"], persistent: true } }
+end
+
+ctx = Context.create!(provider: "openai", model: "gpt-5.4-mini")
+ctx.talk("Remember that my favorite language is Ruby")
+puts ctx.talk("What is my favorite language?").content
+```
+
 ## Resources
 
 - [deepdive](https://0x1eef.github.io/x/llm.rb/file.deepdive.html) is the
   examples guide.
-- [
-  of llm.rb.
+- [relay](https://github.com/llmrb/relay) shows a real application built on
+  top of llm.rb.
 - [doc site](https://0x1eef.github.io/x/llm.rb?rebuild=1) has the API docs.
 
 ## License