llm.rb 4.11.1 → 4.13.0
This diff shows the changes between publicly released versions of the package, as published to their public registry. It is provided for informational purposes only.
- checksums.yaml +4 -4
- data/CHANGELOG.md +70 -0
- data/README.md +124 -695
- data/lib/llm/context.rb +2 -2
- data/lib/llm/function/task.rb +7 -1
- data/lib/llm/function.rb +14 -3
- data/lib/llm/mcp/error.rb +31 -1
- data/lib/llm/mcp/rpc.rb +8 -3
- data/lib/llm/mcp/transport/http.rb +2 -1
- data/lib/llm/mcp/transport/stdio.rb +1 -0
- data/lib/llm/mcp.rb +43 -1
- data/lib/llm/provider.rb +3 -4
- data/lib/llm/providers/anthropic/request_adapter/completion.rb +8 -1
- data/lib/llm/providers/anthropic/response_adapter/completion.rb +7 -2
- data/lib/llm/providers/anthropic/stream_parser.rb +1 -1
- data/lib/llm/providers/anthropic/utils.rb +23 -0
- data/lib/llm/providers/anthropic.rb +11 -0
- data/lib/llm/providers/openai/request_adapter/respond.rb +11 -5
- data/lib/llm/providers/openai/response_adapter/responds.rb +13 -1
- data/lib/llm/providers/openai/responses/stream_parser.rb +31 -0
- data/lib/llm/stream/queue.rb +15 -2
- data/lib/llm/stream.rb +24 -10
- data/lib/llm/version.rb +1 -1
- data/llm.gemspec +17 -39
- metadata +17 -36
checksums.yaml
CHANGED
```diff
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 7847fee7ea1e63553ad5323750fc2e5ac1b4a9082c2f4c5aba71f4587440ea75
+  data.tar.gz: e63bdae085b2f0f606cbdb4633a7eff93fd6e2428fcb85ff5fe94fc78851bf5d
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: b1c8d8600b3214da5613d152677d13fde796b42e6a29cf8af035e4ad5f28b7cea0466a375b9b444a748e9e063d2e6ad6720b653609cb2b7038e8040cd2b44e39
+  data.tar.gz: c76882f9cd5416312e26f4e25493403df8f9f8c61ee14cba5096383b449bd7a4ce8b9d70834d12176648c3d9206f0f555a1eec4b22bdb6426d88c0c36c8ed592
```
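The SHA256/SHA512 entries above can be reproduced locally with Ruby's standard `digest` library. A minimal sketch (the `gem_checksums` helper and the file path are illustrative, not part of llm.rb):

```ruby
require "digest"

# Compute the same digests RubyGems records in checksums.yaml.
# Point the path at an extracted gem artifact (e.g. data.tar.gz)
# to compare against the values above.
def gem_checksums(path)
  bytes = File.binread(path)
  {
    "SHA256" => Digest::SHA256.hexdigest(bytes),
    "SHA512" => Digest::SHA512.hexdigest(bytes)
  }
end
```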
data/CHANGELOG.md
CHANGED
```diff
@@ -2,10 +2,80 @@
 
 ## Unreleased
 
+Changes since `v4.13.0`.
+
+## v4.13.0
+
+Changes since `v4.12.0`.
+
+This release expands MCP prompt support, improves reasoning support in the
+OpenAI Responses API, and refreshes the docs around llm.rb's runtime model,
+contexts, and advanced workflows.
+
+### Add
+
+- Add `LLM::MCP#prompts` and `LLM::MCP#find_prompt` for MCP prompt support.
+
+### Change
+
+- Rework the README around llm.rb as a runtime for AI systems.
+- Add a dedicated deep dive guide for providers, contexts, persistence,
+  tools, agents, MCP, tracing, multimodal prompts, and retrieval.
+
+### Fix
+
+All of these fixes apply to MCP:
+
+- fix(mcp): raise `LLM::MCP::MismatchError` on mismatched response ids.
+- fix(mcp): normalize prompt message content while preserving the original payload.
+
+All of these fixes apply to OpenAI's Responses API:
+
+- fix(openai): emit `on_reasoning_content` for streamed reasoning summaries.
+- fix(openai): skip `previous_response_id` on `store: false` follow-up calls.
+- fix(openai): fall back to an empty object schema for tools without params.
+- fix(openai): preserve original tool-call payloads on re-sent assistant tool messages.
+- fix(openai): emit `output_text` for assistant-authored response content.
+- fix(openai): return `nil` for `system_fingerprint` on normalized response objects.
+
```
```diff
+## v4.12.0
+
 Changes since `v4.11.1`.
 
+This release expands advanced streaming and MCP execution while reframing
+llm.rb more clearly as a system integration layer for LLMs, tools, MCP
+sources, and application APIs.
+
+### Add
+
+- Add `persistent` as an alias for `persist!` on providers and MCP transports.
+- Add `LLM::Stream#on_tool_return` for observing completed streamed tool work.
+- Add `LLM::Function::Return#error?`.
+
+### Change
+
+- Expect advanced streaming callbacks to use `LLM::Stream` subclasses
+  instead of duck-typing them onto arbitrary objects. Basic `#<<`
+  streaming remains supported.
+
+### Fix
+
+- Fix Anthropic tools without params by always emitting `input_schema`.
+- Fix Anthropic tool-only responses to still produce an assistant message.
+- Fix Anthropic tool results to use the `user` role.
+- Fix Anthropic tool input normalization.
+
```
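The `persistent` alias added in v4.12.0 is plain Ruby method aliasing; both names dispatch to the same definition. A minimal sketch, where the `Transport` class and the body of `persist!` are illustrative rather than llm.rb's actual implementation:

```ruby
# Illustrative only: `alias_method` makes `persistent` and
# `persist!` two names for the same method.
class Transport
  def persist!
    @persistent = true
    self
  end
  alias_method :persistent, :persist!

  def persistent?
    !!@persistent
  end
end
```

Returning `self` keeps the alias chainable, so `provider.persistent.respond(...)`-style call sites read naturally under either name.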
```diff
 ## v4.11.1
 
+Changes since `v4.11.0`.
+
+### Fix
+
+* Cast OpenTelemetry tool-related values to strings. <br>
+  Otherwise they're rejected by opentelemetry-sdk as invalid attributes.
+
+## v4.11.0
+
 Changes since `v4.10.0`.
 
 ### Add
```
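The v4.11.1 fix above works because OpenTelemetry only accepts strings, numerics, booleans, or homogeneous arrays of those as span attribute values; arbitrary objects and symbols get rejected by opentelemetry-sdk. A minimal sketch of that coercion, where the `stringify_attributes` helper is hypothetical rather than llm.rb's actual code:

```ruby
# Hypothetical helper: coerce attribute values opentelemetry-sdk
# would reject (symbols, arbitrary objects) into strings, while
# leaving already-valid scalar types untouched.
def stringify_attributes(attrs)
  attrs.transform_values do |v|
    case v
    when String, Integer, Float, TrueClass, FalseClass then v
    else v.to_s
    end
  end
end
```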