llm_ruby 0.3.0 → 0.3.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/README.md +20 -1
- data/lib/llm.rb +2 -0
- data/lib/llm_ruby.rb +1 -0
- metadata +2 -1
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: d377d13dd257a9b4a6def17668d3bb3badf4692f9e865faba142d5189d746519
+  data.tar.gz: a201962fb8dd4f245face648cc433e127afcc22832cb1499da89db2bc2816d7e
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 10a190d6b7f4aa364c17d91b383a744fd490b94437ba4779c973e1f9bdc65d92f16d03c075d807d2a6aaf9509cf837736a0c48860c6ebc9d7c3dea6fb41eacd4
+  data.tar.gz: a3e63a044316a257c1e429bf85882c86ff78d3af193a87a978e4552f3464cdb417c7ada8aaf35a3556672ea6e79196812dada4c25c2f9a624466981b195c2ad3
data/README.md
CHANGED
@@ -5,7 +5,6 @@
 [](https://opensource.org/licenses/MIT)
 
 
-
 LLMRuby is a Ruby gem that provides a consistent interface for interacting with multiple Large Language Model (LLM) APIs. Most OpenAI, Anthropic and Gemini models are currently supported.
 
 ## Installation
@@ -172,6 +171,26 @@ export ANTHROPIC_API_KEY=your_api_key_here
 export GEMINI_API_KEY=your_api_key_here
 ```
 
+## Structured Outputs
+
+OpenAI and Gemini models can be configured to generate responses that adhere to a provided schema. Even though each use a different format for configuring this schema, `llm_ruby` can handle the translation for you, so that you can share a single schema definition across models.
+
+```ruby
+
+llm = LLM.from_string!("gpt-4o")
+
+# Create a client
+client = llm.client
+
+# Send a chat message
+response_format = LLM::Schema.new("test_schema", {"type" => "object", "properties" => {"name" => {"type" => "string"}, "age" => {"type" => "integer"}}, "additionalProperties" => false, "required" => ["name", "age"]})
+# or load the schema from a file: LLM::Schema.from_file('myschema.json')
+response = client.chat([{role: :user, content: "Hello, world!"}], response_format: response_format)
+
+response.structured_output[:name] # Alex
+response.structured_output_object.name # Alex
+```
+
 ## Development
 
 After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
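The README section added above notes that a single schema definition can be shared across providers. As a hedged sketch of that claim (the Gemini model string and the sample prompt are assumptions for illustration, not taken from this diff), the same `LLM::Schema` could be reused with a Gemini-backed client roughly like this:

```ruby
require "llm"

# Shared schema definition; the field names mirror the README example above.
schema = LLM::Schema.new(
  "person",
  {
    "type" => "object",
    "properties" => {
      "name" => {"type" => "string"},
      "age" => {"type" => "integer"}
    },
    "additionalProperties" => false,
    "required" => ["name", "age"]
  }
)

# The model string below is an assumption for illustration; use any Gemini
# model string that LLM.from_string! accepts in your installed version.
gemini_client = LLM.from_string!("gemini-1.5-flash").client

response = gemini_client.chat(
  [{role: :user, content: "Invent a fictional person."}],
  response_format: schema
)

response.structured_output[:name] # e.g. "Alex"
response.structured_output[:age]  # e.g. 29
```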
data/lib/llm.rb
CHANGED
data/lib/llm_ruby.rb
ADDED
@@ -0,0 +1 @@
+require "llm"
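The single line added above simply loads the gem's existing `llm` entry point. A likely practical effect (an inference on my part, not stated in the diff) is that requiring the gem by its published name now works as well:

```ruby
# New in 0.3.1: lib/llm_ruby.rb is a shim that requires "llm", so the
# gem-name require resolves in addition to the original entry point.
require "llm_ruby"

llm = LLM.from_string!("gpt-4o") # model string taken from the README example
client = llm.client
```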
metadata
CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: llm_ruby
 version: !ruby/object:Gem::Version
-  version: 0.3.0
+  version: 0.3.1
 platform: ruby
 authors:
 - Alex Gamble
@@ -158,6 +158,7 @@ files:
 - lib/llm/schema.rb
 - lib/llm/stop_reason.rb
 - lib/llm/version.rb
+- lib/llm_ruby.rb
 homepage: https://github.com/agamble/llm_ruby
 licenses:
 - MIT
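To pick up this release in an application, a Gemfile entry along these lines would do it (a standard Bundler sketch, not part of the diff):

```ruby
# Gemfile
gem "llm_ruby", "~> 0.3.1"
```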