omniai 1.9.6 → 2.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: 23098dd1fbbb9542265f3d32a9c37642a1a554fa4a789c78fb4e25a364213116
4
- data.tar.gz: 4885f37cc46af15c1e208a6c9c4f0f06b84d4986c1d0b9361fd2b9e451fb3352
3
+ metadata.gz: c58b1fbebfaaaa7bcd77fe9754d0bf0bddc7b55c07b76ae31cbae0ae24b2024f
4
+ data.tar.gz: d4462dbf00a04ce81db7f8bf7c4f45018aaab994fd2699483e8b2de324546ab2
5
5
  SHA512:
6
- metadata.gz: a63e630e0aeeeb8342cceba0d2505c487e9dacfcd76bcc07e75d8c29c132a9806d3d4fdc4346ebcd6a6eb02534a14de75f9bd5530388eec1652ff19a61b2d008
7
- data.tar.gz: a61425c68913a53de2d0300fe3c74ea546491f69fffd53c1c6fbd414924b7204b718c7676de5b604a063532a8149bbaef132032b1cbc49ac0e69d9030bfd456e
6
+ metadata.gz: b7857bee4ec80f63d8e2a8ae3a6534c36f21ae622ba3a672653ae6aa2017e929baa2d4f89d9bcd783ab645bf17c9e65c6dfa8fa18fc757a38106d59375305bb8
7
+ data.tar.gz: d0c5f8a0ab434740df1a6abc4f5f9d96590d067fe5d6e4e1a9849edea9647bcfa4c5768fdfe74c14a520ff6a9849e8b51f56b764ca8f28424bdb614a87d47f0e
data/README.md CHANGED
@@ -16,124 +16,161 @@ OmniAI provides a unified Ruby API for integrating with multiple AI providers, i
16
16
 
17
17
  ## Examples
18
18
 
19
- ### Example #1: [Chat](https://github.com/ksylvest/omniai/blob/main/examples/chat)
19
+ ### Example #1: [Chat w/ Text](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_text)
20
20
 
21
- This example demonstrates using `OmniAI` with **Anthropic** to prompt a “biologist” for an analysis of photos, identifying the animals within each one. A system and user message are provided, and the response is streamed in real time.
21
+ This example demonstrates using `OmniAI` with **Anthropic** to ask for a joke. The response is parsed and printed.
22
22
 
23
23
  ```ruby
24
24
  require 'omniai/anthropic'
25
25
 
26
- CLIENT = OmniAI::Anthropic::Client.new
27
-
28
- CAT_URL = 'https://images.unsplash.com/photo-1472491235688-bdc81a63246e?q=80&w=1024&h=1024&fit=crop&fm=jpg'
29
- DOG_URL = 'https://images.unsplash.com/photo-1517849845537-4d257902454a?q=80&w=1024&h=1024&fit=crop&fm=jpg'
26
+ client = OmniAI::Anthropic::Client.new
30
27
 
31
- CLIENT.chat(stream: $stdout) do |prompt|
32
- prompt.system('You are a helpful biologist with an expertise in animals that responds with the latin names.')
33
- prompt.user do |message|
34
- message.text('What animals are in the attached photos?')
35
- message.url(CAT_URL, 'image/jpeg')
36
- message.url(DOG_URL, 'image/jpeg')
37
- end
38
- end
28
+ puts client.chat("Tell me a joke").text
39
29
  ```
40
30
 
41
31
  ```
42
- The animals in the photos are:
43
-
44
- 1. A cat (*Felis catus*).
45
- 2. A dog (*Canis familiaris*).
32
+ Why don't scientists trust atoms? Because they make up everything!
46
33
  ```
47
34
 
48
- ### Example #2: [Text-to-Speech](https://github.com/ksylvest/omniai/blob/main/examples/text_to_speech)
35
+ ### Example #2: [Chat w/ Prompt](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_prompt)
49
36
 
50
- This example demonstrates using `OmniAI` with **OpenAI** to convert text to speech and save it to a file.
37
+ This example demonstrates using `OmniAI` with **Mistral** to ask for the fastest animal. It includes a system and user message in the prompt. The response is streamed in real time.
51
38
 
52
39
  ```ruby
53
- require 'omniai/openai'
40
+ require "omniai/mistral"
54
41
 
55
- CLIENT = OmniAI::OpenAI::Client.new
42
+ client = OmniAI::Mistral::Client.new
56
43
 
57
- File.open(File.join(__dir__, 'audio.wav'), 'wb') do |file|
58
- CLIENT.speak('Sally sells seashells by the seashore.', format: OmniAI::Speak::Format::WAV) do |chunk|
59
- file << chunk
60
- end
44
+ client.chat(stream: $stdout) do |prompt|
45
+ prompt.system "Respond in both English and French."
46
+ prompt.user "What is the fastest animal?"
61
47
  end
62
48
  ```
63
49
 
64
- ### Example #3: [Speech-to-Text](https://github.com/ksylvest/omniai/blob/main/examples/speech_to_text)
50
+ ```
51
+ **English**: The peregrine falcon is generally considered the fastest animal, reaching speeds of over 390 km/h.
52
+ **French**: Le faucon pèlerin est généralement considéré comme l'animal le plus rapide, atteignant des vitesses de plus de 390 km/h.
53
+ ```
65
54
 
66
- This example demonstrates using `OmniAI` with **OpenAI** to convert speech to text.
55
+ ### Example #3: [Chat w/ Vision](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_vision)
56
+
57
+ This example demonstrates using `OmniAI` with **OpenAI** to prompt a “biologist” for an analysis of photos, identifying the animals within each one. A system and user message are provided, and the response is streamed in real time.
67
58
 
68
59
  ```ruby
69
- require 'omniai/openai'
60
+ require "omniai/openai"
70
61
 
71
- CLIENT = OmniAI::OpenAI::Client.new
62
+ client = OmniAI::OpenAI::Client.new
72
63
 
73
- File.open(File.join(__dir__, 'audio.wav'), 'rb') do |file|
74
- transcription = CLIENT.transcribe(file)
75
- puts(transcription.text)
64
+ CAT_URL = "https://images.unsplash.com/photo-1472491235688-bdc81a63246e?q=80&w=1024&h=1024&fit=crop&fm=jpg"
65
+ DOG_URL = "https://images.unsplash.com/photo-1517849845537-4d257902454a?q=80&w=1024&h=1024&fit=crop&fm=jpg"
66
+
67
+ client.chat(stream: $stdout) do |prompt|
68
+ prompt.system("You are a helpful biologist with expertise in animals who responds with the Latin names.")
69
+ prompt.user do |message|
70
+ message.text("What animals are in the attached photos?")
71
+ message.url(CAT_URL, "image/jpeg")
72
+ message.url(DOG_URL, "image/jpeg")
73
+ end
76
74
  end
77
75
  ```
78
76
 
79
- ### Example #4: [Tools](https://github.com/ksylvest/omniai/blob/main/examples/tools)
77
+ ```
78
+ The first photo is of a cat, *Felis Catus*.
79
+ The second photo is of a dog, *Canis Familiaris*.
80
+ ```
81
+
82
+ ### Example #4: [Chat w/ Tools](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_tools)
80
83
 
81
- This example demonstrates how to use `OmniAI` with **Google** to call a custom tool that generates a weather report. The tool accepts a list of locations (city and country) and returns a temperature randomly generated for demonstration purposes based on the provided parameters.
84
+ This example demonstrates using `OmniAI` with **Google** to ask for the weather. A “Weather” tool is provided that accepts a location and a unit (Celsius or Fahrenheit) and then calculates the weather. The LLM makes multiple tool-call requests and is automatically given each tool-call response before the result is streamed in real time.
82
85
 
83
86
  ```ruby
84
87
  require 'omniai/google'
85
88
 
86
- CLIENT = OmniAI::Google::Client.new
89
+ client = OmniAI::Google::Client.new
87
90
 
88
- LOCATION = OmniAI::Tool::Property.object(
89
- properties: {
90
- city: OmniAI::Tool::Property.string(description: 'e.g. "Toronto"'),
91
- country: OmniAI::Tool::Property.string(description: 'e.g. "Canada"'),
92
- },
93
- required: %i[city country]
94
- )
91
+ class Weather < OmniAI::Tool
92
+ description "Lookup the weather for a location"
95
93
 
96
- LOCATIONS = OmniAI::Tool::Property.array(
97
- min_items: 1,
98
- max_items: 5,
99
- items: LOCATION
100
- )
94
+ parameter :location, :string, description: "A location (e.g. 'Toronto, Canada')."
95
+ parameter :unit, :string, enum: %w[Celsius Fahrenheit], description: "The unit of measurement."
96
+ required %i[location]
101
97
 
102
- UNIT = OmniAI::Tool::Property.string(enum: %w[celcius fahrenheit])
98
+ # @param location [String] required
99
+ # @param unit [String] optional - "Celsius" or "Fahrenheit"
100
+ # @return [String]
101
+ def execute(location:, unit: "Celsius")
102
+ puts "[weather] location=#{location} unit=#{unit}"
103
+ "#{rand(20..50)}° #{unit} at #{location}"
104
+ end
105
+ end
103
106
 
104
- WEATHER = proc do |locations:, unit: 'celsius'|
105
- locations.map do |location|
106
- "#{rand(20..50)}° #{unit} in #{location[:city]}, #{location[:country]}"
107
- end.join("\n")
107
+ client.chat(stream: $stdout, tools: [Weather.new]) do |prompt|
108
+ prompt.system "You are an expert in weather."
109
+ prompt.user 'What is the weather in "London" in Celsius and "Madrid" in Fahrenheit?'
108
110
  end
111
+ ```
109
112
 
110
- TOOL = OmniAI::Tool.new(
111
- WEATHER,
112
- name: 'Weather',
113
- description: 'Lookup the weather in a location',
114
- parameters: OmniAI::Tool::Parameters.new(
115
- properties: {
116
- locations: LOCATIONS,
117
- unit: UNIT,
118
- },
119
- required: %i[locations]
120
- )
121
- )
113
+ ```
114
+ [weather] location=London unit=Celsius
115
+ [weather] location=Madrid unit=Fahrenheit
116
+ ```
122
117
 
123
- completion = CLIENT.chat(tools: [TOOL]) do |prompt|
124
- prompt.user do |message|
125
- message.text('What is the weather in "London" in celcius and "Seattle" in fahrenheit?')
126
- end
127
- end
118
+ ```
119
+ The weather is 24° Celsius in London and 42° Fahrenheit in Madrid.
120
+ ```
121
+
122
+ ### Example #5: Chat w/ CLI
123
+
124
+ The `OmniAI` gem also ships with a CLI to simplify quick tests.
128
125
 
129
- puts(completion.text)
126
+ ```bash
127
+ omniai chat "Who designed the Ruby programming language?"
130
128
  ```
131
129
 
132
130
  ```
133
- The weather is 24° celcius in London and 42° fahrenheit in Seattle.
131
+ The Ruby programming language was created by Yukihiro Matsumoto, often known as "Matz."
132
+ ```
133
+
134
+ ```bash
135
+ omniai chat --provider="google" --model="gemini-2.0-flash" "Who are you?"
136
+ ```
137
+
138
+ ```
139
+ I am a large language model, trained by Google.
140
+ ```
141
+
142
+ ### Example #6: [Text-to-Speech](https://github.com/ksylvest/omniai/blob/main/examples/text_to_speech)
143
+
144
+ This example demonstrates using `OmniAI` with **OpenAI** to convert text to speech and save it to a file.
145
+
146
+ ```ruby
147
+ require 'omniai/openai'
148
+
149
+ client = OmniAI::OpenAI::Client.new
150
+
151
+ File.open(File.join(__dir__, 'audio.wav'), 'wb') do |file|
152
+ client.speak('Sally sells seashells by the seashore.', format: OmniAI::Speak::Format::WAV) do |chunk|
153
+ file << chunk
154
+ end
155
+ end
156
+ ```
157
+
158
+ ### Example #7: [Speech-to-Text](https://github.com/ksylvest/omniai/blob/main/examples/speech_to_text)
159
+
160
+ This example demonstrates using `OmniAI` with **OpenAI** to convert speech to text.
161
+
162
+ ```ruby
163
+ require 'omniai/openai'
164
+
165
+ client = OmniAI::OpenAI::Client.new
166
+
167
+ File.open(File.join(__dir__, 'audio.wav'), 'rb') do |file|
168
+ transcription = client.transcribe(file)
169
+ puts(transcription.text)
170
+ end
134
171
  ```
135
172
 
136
- ### Example #5: [Embeddings](https://github.com/ksylvest/omniai/blob/main/examples/embeddings)
173
+ ### Example #8: [Embeddings](https://github.com/ksylvest/omniai/blob/main/examples/embeddings)
137
174
 
138
175
  This example demonstrates using `OmniAI` with **Mistral** to generate embeddings for a dataset. It defines a set of entries (e.g. "George is a teacher." or "Ringo is a doctor.") and then compares the embeddings generated from a query (e.g. "What does George do?" or "Who is a doctor?") to rank the entries by relevance.
139
176
 
@@ -282,7 +319,7 @@ logger = Logger.new(STDOUT)
282
319
  client = OmniAI::OpenAI::Client.new(timeout: 8) # i.e. 8 seconds
283
320
  ```
284
321
 
285
- Timeouts are also be configurable by passing a `timeout` hash with `timeout` / `read` / `write` / `keys using:
322
+ Timeouts are also configurable by passing a `timeout` hash with `timeout` / `read` / `write` keys using:
286
323
 
287
324
  ```ruby
288
325
  require 'omniai/openai'
@@ -351,18 +388,18 @@ A chat can also be initialized with tools:
351
388
 
352
389
  ```ruby
353
390
  tool = OmniAI::Tool.new(
354
- proc { |location:, unit: 'celsius'| "#{rand(20..50)}° #{unit} in #{location}" },
391
+ proc { |location:, unit: 'Celsius'| "#{rand(20..50)}° #{unit} in #{location}" },
355
392
  name: 'Weather',
356
393
  description: 'Lookup the weather in a location',
357
394
  parameters: OmniAI::Tool::Parameters.new(
358
395
  properties: {
359
396
  location: OmniAI::Tool::Property.string(description: 'e.g. Toronto'),
360
- unit: OmniAI::Tool::Property.string(enum: %w[celcius fahrenheit]),
397
+ unit: OmniAI::Tool::Property.string(enum: %w[Celsius Fahrenheit]),
361
398
  },
362
399
  required: %i[location]
363
400
  )
364
401
  )
365
- client.chat('What is the weather in "London" in celcius and "Paris" in fahrenheit?', tools: [tool])
402
+ client.chat('What is the weather in "London" in Celsius and "Paris" in Fahrenheit?', tools: [tool])
366
403
  ```
367
404
 
368
405
  ### Transcribe
@@ -54,13 +54,6 @@ module OmniAI
54
54
  }
55
55
  end
56
56
 
57
- # @param other [OmniAI::Chat::Choice]
58
- #
59
- # @return [OmniAI::Chat::Choice]
60
- def merge(other)
61
- self.class.new(index: @index, message: @message.merge(other.message))
62
- end
63
-
64
57
  # @return [Message]
65
58
  def delta
66
59
  message
@@ -0,0 +1,23 @@
1
+ # frozen_string_literal: true
2
+
3
+ module OmniAI
4
+ class Chat
5
+ # An `OmniAI::Chat::Delta` is used to assist with streaming. It represents a chunk of a conversation that is yielded
6
+ # back to the caller.
7
+ class Delta
8
+ # @!attribute [rw] text
9
+ #
10
+ attr_accessor :text
11
+
12
+ # @param text [String]
13
+ def initialize(text:)
14
+ @text = text
15
+ end
16
+
17
+ # @return [Boolean]
18
+ def text?
19
+ !text.empty?
20
+ end
21
+ end
22
+ end
23
+ end
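For reference, the new `Delta` chunks are what streaming callers now receive. A minimal sketch of how one might be consumed (a stripped-down `Delta` is re-declared here so the sketch runs in isolation; it is not the full class above):

```ruby
# Stripped-down stand-in for OmniAI::Chat::Delta, re-declared for illustration.
class Delta
  attr_accessor :text

  def initialize(text:)
    @text = text
  end

  # A chunk may arrive with an empty string when it carries no content.
  def text?
    !text.empty?
  end
end

# Accumulate streamed chunks, skipping empty ones.
buffer = +""
[Delta.new(text: "Hello"), Delta.new(text: ""), Delta.new(text: " world")].each do |delta|
  buffer << delta.text if delta.text?
end
buffer # => "Hello world"
```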
@@ -4,10 +4,12 @@ module OmniAI
4
4
  class Chat
5
5
  # A function that includes a name / arguments.
6
6
  class Function
7
- # @return [String]
7
+ # @!attribute [rw] name
8
+ # @return [String]
8
9
  attr_accessor :name
9
10
 
10
- # @return [Hash]
11
+ # @!attribute [rw] arguments
12
+ # @return [Hash]
11
13
  attr_accessor :arguments
12
14
 
13
15
  # @param name [String]
@@ -47,12 +47,12 @@ module OmniAI
47
47
  attr_accessor :role
48
48
 
49
49
  # @!attribute [rw] tool_call_list
50
- # @return [Array<ToolCall>, nil]
50
+ # @return [ToolCallList, nil]
51
51
  attr_accessor :tool_call_list
52
52
 
53
53
  # @param content [String, nil]
54
54
  # @param role [String]
55
- # @param tool_call_list [Array<ToolCall>, nil]
55
+ # @param tool_call_list [ToolCallList, nil]
56
56
  def initialize(content:, role: Role::USER, tool_call_list: nil)
57
57
  @content = content
58
58
  @role = role
@@ -86,7 +86,7 @@ module OmniAI
86
86
 
87
87
  role = data["role"]
88
88
  content = Content.deserialize(data["content"], context:)
89
- tool_call_list = data["tool_calls"]&.map { |subdata| ToolCall.deserialize(subdata, context:) }
89
+ tool_call_list = ToolCallList.deserialize(data["tool_calls"], context:)
90
90
 
91
91
  new(content:, role:, tool_call_list:)
92
92
  end
@@ -104,32 +104,11 @@ module OmniAI
104
104
  return serializer.call(self, context:) if serializer
105
105
 
106
106
  content = @content.is_a?(Array) ? @content.map { |content| content.serialize(context:) } : @content
107
- tool_calls = @tool_call_list&.map { |tool_call| tool_call.serialize(context:) }
107
+ tool_calls = @tool_call_list&.serialize(context:)
108
108
 
109
109
  { role: @role, content:, tool_calls: }.compact
110
110
  end
111
111
 
112
- # @param other [OmniAI::Chat::Message]
113
- #
114
- # @return [OmniAI::Chat::Message]
115
- def merge(other)
116
- content =
117
- if @content && other.content
118
- @content + other.content
119
- else
120
- @content || other.content
121
- end
122
-
123
- tool_call_list =
124
- if @tool_call_list && other.tool_call_list
125
- @tool_call_list + other.tool_call_list
126
- else
127
- @tool_call_list || other.tool_call_list
128
- end
129
-
130
- self.class.new(content:, role:, tool_call_list:)
131
- end
132
-
133
112
  # @return [Boolean]
134
113
  def role?(role)
135
114
  String(@role).eql?(String(role))
@@ -2,68 +2,116 @@
2
2
 
3
3
  module OmniAI
4
4
  class Chat
5
- # Used when processing everything at once.
5
+ # An `OmniAI::Chat::Response` encapsulates the result of generating a chat completion.
6
6
  class Response
7
7
  # @return [Hash]
8
8
  attr_accessor :data
9
9
 
10
+ # @return [Array<Choice>]
11
+ attr_accessor :choices
12
+
13
+ # @return [Usage, nil]
14
+ attr_accessor :usage
15
+
10
16
  # @param data [Hash]
11
- # @param context [Context, nil]
12
- def initialize(data:, context: nil)
17
+ # @param choices [Array<Choice>]
18
+ # @param usage [Usage, nil]
19
+ def initialize(data:, choices: [], usage: nil)
13
20
  @data = data
14
- @context = context
21
+ @choices = choices
22
+ @usage = usage
15
23
  end
16
24
 
17
- # @return [Payload]
18
- def completion
19
- @completion ||= Payload.deserialize(@data, context: @context)
25
+ # @return [String]
26
+ def inspect
27
+ "#<#{self.class.name} choices=#{choices.inspect} usage=#{usage.inspect}>"
20
28
  end
21
29
 
22
- # @return [Usage, nil]
23
- def usage
24
- completion.usage
25
- end
30
+ # @param data [Hash]
31
+ # @param context [OmniAI::Context] optional
32
+ def self.deserialize(data, context: nil)
33
+ deserialize = context&.deserializer(:response)
34
+ return deserialize.call(data, context:) if deserialize
26
35
 
27
- # @return [Array<Choice>]
28
- def choices
29
- completion.choices
36
+ choices = data["choices"].map { |choice_data| Choice.deserialize(choice_data, context:) }
37
+ usage = Usage.deserialize(data["usage"], context:) if data["usage"]
38
+
39
+ new(data:, choices:, usage:)
30
40
  end
31
41
 
32
- # @return [Array<Message>]
33
- def messages
34
- completion.messages
42
+ # @param context [OmniAI::Context] optional
43
+ # @return [Hash]
44
+ def serialize(context:)
45
+ serialize = context&.serializer(:response)
46
+ return serialize.call(self, context:) if serialize
47
+
48
+ {
49
+ choices: choices.map { |choice| choice.serialize(context:) },
50
+ usage: usage&.serialize(context:),
51
+ }
35
52
  end
36
53
 
37
54
  # @param index [Integer]
38
- # @return [Choice]
55
+ #
56
+ # @return [Choice, nil]
39
57
  def choice(index: 0)
40
- completion.choice(index:)
58
+ @choices[index]
59
+ end
60
+
61
+ # @param index [Integer]
62
+ #
63
+ # @return [Boolean]
64
+ def choice?(index: 0)
65
+ !choice(index:).nil?
41
66
  end
42
67
 
43
68
  # @param index [Integer]
44
- # @return [Message]
69
+ #
70
+ # @return [Message, nil]
45
71
  def message(index: 0)
46
- completion.message(index:)
72
+ choice(index:)&.message
47
73
  end
48
74
 
49
- # @return [String]
50
- def text
51
- message.text
75
+ # @param index [Integer]
76
+ #
77
+ # @return [Boolean]
78
+ def message?(index: 0)
79
+ !message(index:).nil?
80
+ end
81
+
82
+ # @return [Array<Message>]
83
+ def messages
84
+ @choices.map(&:message)
85
+ end
86
+
87
+ # @param index [Integer]
88
+ #
89
+ # @return [String, nil]
90
+ def text(index: 0)
91
+ message(index:)&.text
52
92
  end
53
93
 
94
+ # @param index [Integer]
95
+ #
54
96
  # @return [Boolean]
55
- def text?
56
- message.text?
97
+ def text?(index: 0)
98
+ message = message(index:)
99
+
100
+ !message.nil? && message.text?
57
101
  end
58
102
 
59
- # @return [Array<ToolCall>]
60
- def tool_call_list
61
- choice.tool_call_list
103
+ # @param index [Integer]
104
+ #
105
+ # @return [ToolCallList, nil]
106
+ def tool_call_list(index: 0)
107
+ message(index:)&.tool_call_list
62
108
  end
63
109
 
64
110
  # @return [Boolean]
65
- def tool_call_list?
66
- tool_call_list&.any?
111
+ def tool_call_list?(index: 0)
112
+ tool_call_list = tool_call_list(index:)
113
+
114
+ !tool_call_list.nil? && tool_call_list.any?
67
115
  end
68
116
  end
69
117
  end
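The rewritten `Response` deserializes provider data directly rather than going through a `Payload`. A sketch of the payload shape it consumes, using hypothetical `Struct` stand-ins for `Choice` and `Message` (the real classes carry more behaviour):

```ruby
# Hypothetical stand-ins so the sketch runs standalone.
Choice  = Struct.new(:index, :message, keyword_init: true)
Message = Struct.new(:content, :role, keyword_init: true)

# The shape of data a chat completion response provides.
data = {
  "choices" => [
    { "index" => 0, "message" => { "role" => "assistant", "content" => "Hi!" } },
  ],
  "usage" => { "prompt_tokens" => 2, "completion_tokens" => 1, "total_tokens" => 3 },
}

# Mirrors the per-choice mapping performed by Response.deserialize.
choices = data["choices"].map do |choice_data|
  Choice.new(
    index: choice_data["index"],
    message: Message.new(
      content: choice_data.dig("message", "content"),
      role: choice_data.dig("message", "role")
    )
  )
end

choices[0].message.content # => "Hi!"
```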
@@ -13,30 +13,32 @@ module OmniAI
13
13
  class Stream
14
14
  # @param logger [OmniAI::Client]
15
15
  # @param chunks [Enumerable<String>]
16
- # @param context [Context, nil]
17
- def initialize(chunks:, logger: nil, context: nil)
16
+ def initialize(chunks:, logger: nil)
18
17
  @chunks = chunks
19
18
  @logger = logger
20
- @context = context
21
19
  end
22
20
 
23
- # @yield [payload]
24
- # @yieldparam payload [OmniAI::Chat::Payload]
21
+ # @yield [delta]
22
+ # @yieldparam delta [OmniAI::Chat::Delta]
25
23
  #
26
- # @return [OmniAI::Chat::Payload]
24
+ # @return [Hash]
27
25
  def stream!(&block)
28
- OmniAI::Chat::Payload.new.tap do |payload|
29
- @chunks.map do |chunk|
30
- parser.feed(chunk) do |type, data, id|
31
- result = process!(type, data, id, &block)
32
- payload.merge!(result) if result.is_a?(OmniAI::Chat::Payload)
33
- end
26
+ @data = { "choices" => [] }
27
+
28
+ @chunks.map do |chunk|
29
+ parser.feed(chunk) do |type, data, id|
30
+ process!(type, data, id, &block)
34
31
  end
35
32
  end
33
+
34
+ @data
36
35
  end
37
36
 
38
37
  protected
39
38
 
39
+ # @yield [delta]
40
+ # @yieldparam delta [OmniAI::Chat::Delta]
41
+ #
40
42
  # @param type [String]
41
43
  # @param data [String]
42
44
  # @param id [String]
@@ -50,19 +52,78 @@ module OmniAI
50
52
  @logger&.debug("Stream#process! #{arguments.join(' ')}")
51
53
  end
52
54
 
55
+ # @yield [delta]
56
+ # @yieldparam delta [OmniAI::Chat::Delta]
57
+ #
53
58
  # @param type [String]
54
59
  # @param data [String]
55
60
  # @param id [String]
56
- #
57
- # @return [OmniAI::Chat::Payload, nil]
58
- def process!(type, data, id, &block)
61
+ def process!(type, data, id, &)
59
62
  log(type, data, id)
60
63
 
61
64
  return if data.eql?("[DONE]")
62
65
 
63
- payload = Payload.deserialize(JSON.parse(data), context: @context)
64
- block&.call(payload)
65
- payload
66
+ process_data!(data: JSON.parse(data), &)
67
+ end
68
+
69
+ # @yield [delta]
70
+ # @yieldparam delta [OmniAI::Chat::Delta]
71
+ #
72
+ # @param data [Hash]
73
+ def process_data!(data:, &block)
74
+ data.each do |key, value|
75
+ @data[key] = value unless key.eql?("choices") || key.eql?("object")
76
+ end
77
+
78
+ data["choices"].each do |choice|
79
+ merge_choice!(choice:)
80
+
81
+ text = choice["delta"]["content"]
82
+ block&.call(Delta.new(text:)) if text
83
+ end
84
+ end
85
+
86
+ # @param choice [Hash]
87
+ def merge_choice!(choice:)
88
+ index = choice["index"]
89
+ delta = choice["delta"]
90
+
91
+ if @data["choices"][index].nil?
92
+ @data["choices"][index] = {
93
+ "index" => index,
94
+ "message" => delta,
95
+ }
96
+ else
97
+ message = @data["choices"][index]["message"]
98
+
99
+ message["content"] += delta["content"] if delta["content"]
100
+
101
+ merge_tool_call_list!(tool_call_list: delta["tool_calls"], message:)
102
+ end
103
+ end
104
+
105
+ # @param tool_call_list [Array<Hash>, nil]
106
+ # @param message [Hash]
107
+ def merge_tool_call_list!(tool_call_list:, message:)
108
+ return unless tool_call_list
109
+
110
+ message["tool_calls"] ||= []
111
+
112
+ tool_call_list.each do |tool_call|
113
+ merge_tool_call_data!(tool_call:, message:)
114
+ end
115
+ end
116
+
117
+ # @param tool_call [Hash]
118
+ # @param message [Hash]
119
+ def merge_tool_call_data!(tool_call:, message:)
120
+ tool_call_index = tool_call["index"]
121
+
122
+ if message["tool_calls"][tool_call_index].nil?
123
+ message["tool_calls"][tool_call_index] = tool_call
124
+ else
125
+ message["tool_calls"][tool_call_index]["function"]["arguments"] += tool_call["function"]["arguments"]
126
+ end
66
127
  end
67
128
 
68
129
  # @return [EventStreamParser::Parser]
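The merging strategy above can be sketched standalone: each streamed chunk carries a partial `"delta"` that is accumulated into a single message per choice index. This is a simplified re-statement of `merge_choice!` (content only, no tool calls):

```ruby
# Simplified stand-in for Stream#merge_choice!: accumulate partial deltas
# into one message per choice index.
def merge_choice!(data, choice)
  index = choice["index"]
  delta = choice["delta"]

  if data["choices"][index].nil?
    # First chunk for this index seeds the message.
    data["choices"][index] = { "index" => index, "message" => delta }
  else
    # Subsequent chunks append their content.
    message = data["choices"][index]["message"]
    message["content"] += delta["content"] if delta["content"]
  end
  data
end

data = { "choices" => [] }
merge_choice!(data, { "index" => 0, "delta" => { "role" => "assistant", "content" => "Hello" } })
merge_choice!(data, { "index" => 0, "delta" => { "content" => ", world!" } })
data["choices"][0]["message"]["content"] # => "Hello, world!"
```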
@@ -4,10 +4,12 @@ module OmniAI
4
4
  class Chat
5
5
  # A tool-call that includes an ID / function.
6
6
  class ToolCall
7
- # @return [String]
7
+ # @!attribute [rw] id
8
+ # @return [String]
8
9
  attr_accessor :id
9
10
 
10
- # @return [Function]
11
+ # @!attribute [rw] function
12
+ # @return [Function]
11
13
  attr_accessor :function
12
14
 
13
15
  # @param id [String]
@@ -25,7 +27,7 @@ module OmniAI
25
27
  # @param data [Hash]
26
28
  # @param context [Context] optional
27
29
  #
28
- # @return [Function]
30
+ # @return [ToolCall]
29
31
  def self.deserialize(data, context: nil)
30
32
  deserialize = context&.deserializer(:tool_call)
31
33
  return deserialize.call(data, context:) if deserialize
@@ -0,0 +1,52 @@
1
+ # frozen_string_literal: true
2
+
3
+ module OmniAI
4
+ class Chat
5
+ # An `OmniAI::Chat::ToolCallList` is a collection designed to handle the merging of multiple tool call arrays. LLMs
6
+ # provide a subset of tool call items when chunking, so merging requires the use of the tool call index to combine.
7
+ class ToolCallList
8
+ include Enumerable
9
+
10
+ # @param entries [Array<ToolCall>]
11
+ def initialize(entries: [])
12
+ @entries = entries
13
+ end
14
+
15
+ # Usage:
16
+ #
17
+ # ToolCallList.deserialize([]) # => #<ToolCallList ...>
18
+ #
19
+ # @param data [Array]
20
+ # @param context [Context] optional
21
+ #
22
+ # @return [ToolCallList]
23
+ def self.deserialize(data, context: nil)
24
+ return unless data
25
+
26
+ new(entries: data.map { |subdata| ToolCall.deserialize(subdata, context:) })
27
+ end
28
+
29
+ # Usage:
30
+ #
31
+ # tool_call_list.serialize # => [...]
32
+ #
33
+ # @param context [Context] optional
34
+ #
35
+ # @return [Array<Hash>]
36
+ def serialize(context: nil)
37
+ @entries.compact.map { |tool_call| tool_call.serialize(context:) }
38
+ end
39
+
40
+ # @return [String]
41
+ def inspect
42
+ "#<#{self.class.name} entries=#{@entries.inspect}>"
43
+ end
44
+
45
+ # @yield toolcall
46
+ # @yieldparam toolcall [ToolCall]
47
+ def each(&)
48
+ @entries.each(&)
49
+ end
50
+ end
51
+ end
52
+ end
@@ -28,7 +28,7 @@ module OmniAI
28
28
  #
29
29
  # @return [ToolMessage]
30
30
  def self.deserialize(data, context: nil)
31
- deserialize = context&.deserializer(:tool_message)
31
+ deserialize = context&.deserializer(:tool_call_message)
32
32
  return deserialize.call(data, context:) if deserialize
33
33
 
34
34
  role = data["role"]
@@ -47,7 +47,7 @@ module OmniAI
47
47
  #
48
48
  # @return [Hash]
49
49
  def serialize(context: nil)
50
- serializer = context&.serializer(:tool_message)
50
+ serializer = context&.serializer(:tool_call_message)
51
51
  return serializer.call(self, context:) if serializer
52
52
 
53
53
  role = @role
data/lib/omniai/chat.rb CHANGED
@@ -86,12 +86,25 @@ module OmniAI
86
86
  end
87
87
 
88
88
  # @raise [HTTPError]
89
+ #
90
+ # @return [OmniAI::Chat::Response]
89
91
  def process!
90
92
  response = request!
91
93
 
92
94
  raise HTTPError, response.flush unless response.status.ok?
93
95
 
94
- parse!(response:)
96
+ completion = parse!(response:)
97
+
98
+ if @tools && completion.tool_call_list?
99
+ spawn!(
100
+ @prompt.dup.tap do |prompt|
101
+ prompt.messages += completion.messages
102
+ prompt.messages += build_tool_call_messages(completion.tool_call_list)
103
+ end
104
+ ).process!
105
+ else
106
+ completion
107
+ end
95
108
  end
96
109
 
97
110
  protected
@@ -135,6 +148,7 @@ module OmniAI
135
148
  end
136
149
 
137
150
  # @param response [HTTP::Response]
151
+ #
138
152
  # @return [OmniAI::Chat::Response]
139
153
  def parse!(response:)
140
154
  if @stream
@@ -145,32 +159,22 @@ module OmniAI
145
159
  end
146
160
 
147
161
  # @param response [HTTP::Response]
162
+ #
148
163
  # @return [OmniAI::Chat::Response]
149
164
  def complete!(response:)
150
- completion = self.class::Response.new(data: response.parse, context:)
151
-
152
- if @tools && completion.tool_call_list?
153
- spawn!(
154
- @prompt.dup.tap do |prompt|
155
- prompt.messages += completion.messages
156
- prompt.messages += build_tool_call_messages(completion.tool_call_list)
157
- end
158
- ).process!
159
- else
160
- completion
161
- end
165
+ OmniAI::Chat::Response.deserialize(response.parse, context:)
162
166
  end
163
167
 
164
168
  # @param response [HTTP::Response]
165
169
  #
166
- # @return [OmniAI::Chat::Payload]
170
+ # @return [OmniAI::Chat::Response]
167
171
  def stream!(response:)
168
172
  raise Error, "#{self.class.name}#stream! unstreamable" unless @stream
169
173
 
170
- payload = self.class::Stream.new(chunks: response.body, logger:, context:).stream! do |chunk|
174
+ data = self.class::Stream.new(chunks: response.body, logger:).stream! do |chunk|
171
175
  case @stream
172
176
  when IO, StringIO
173
- if chunk.text
177
+ if chunk.text?
174
178
  @stream << chunk.text
175
179
  @stream.flush
176
180
  end
@@ -178,9 +182,11 @@ module OmniAI
178
182
  end
179
183
  end
180
184
 
181
- @stream.puts if @stream.is_a?(IO) || @stream.is_a?(StringIO)
185
+ response = OmniAI::Chat::Response.deserialize(data, context:)
186
+
187
+ @stream.puts if response.text? && (@stream.is_a?(IO) || @stream.is_a?(StringIO))
182
188
 
183
- payload
189
+ response
184
190
  end
185
191
 
186
192
  # @return [HTTP::Response]
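The tool-call loop moved into `process!` above re-submits the prompt with tool results until the model stops requesting tools. A hypothetical sketch of that control flow, with a fake provider round trip standing in for the real request:

```ruby
# Hypothetical sketch of the Chat#process! tool-call loop. `fake_llm` stands
# in for a provider round trip; the message hashes are illustrative only.
def run(prompt, fake_llm)
  loop do
    completion = fake_llm.call(prompt)
    # No tool calls requested: the completion is final.
    return completion[:text] unless completion[:tool_calls]&.any?

    # Append a tool-call result for each request, then re-submit.
    completion[:tool_calls].each do |tool_call|
      prompt << { role: "tool", content: "result for #{tool_call}" }
    end
  end
end

responses = [
  { tool_calls: ["weather(location: 'London')"] },
  { tool_calls: [], text: "It is 24° Celsius in London." },
]
llm = ->(_prompt) { responses.shift }

run([], llm) # => "It is 24° Celsius in London."
```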
data/lib/omniai/client.rb CHANGED
@@ -134,7 +134,7 @@ module OmniAI
134
134
  # @yield [prompt] optional
135
135
  # @yieldparam prompt [OmniAI::Chat::Prompt]
136
136
  #
137
- # @return [OmniAI::Chat::Completion]
137
+ # @return [OmniAI::Chat::Response]
138
138
  def chat(messages = nil, model:, temperature: nil, format: nil, stream: nil, tools: nil, &)
139
139
  raise NotImplementedError, "#{self.class.name}#chat undefined"
140
140
  end
@@ -28,6 +28,19 @@ module OmniAI
28
28
  # @return [Array<String>, nil]
29
29
  attr_reader :enum
30
30
 
31
+ # @param kind [Symbol]
32
+ # @return [OmniAI::Tool::Property]
33
+ def self.build(kind, **args)
34
+ case kind
35
+ when :array then array(**args)
36
+ when :object then object(**args)
37
+ when :boolean then boolean(**args)
38
+ when :integer then integer(**args)
39
+ when :string then string(**args)
40
+ when :number then number(**args)
41
+ end
42
+ end
43
+
31
44
  # @example
32
45
  # property = OmniAI::Tool::Property.array(
33
46
  # items: OmniAI::Tool::Property.string(description: 'The name of the person.'),
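The new `Property.build` is a thin kind-based dispatch over the existing per-kind constructors. A standalone sketch of the same pattern (the `Property` struct and single constructor here are stand-ins, not the real `OmniAI::Tool::Property` API):

```ruby
# Stand-in property record for illustration.
Property = Struct.new(:type, :description, :enum, keyword_init: true)

# Dispatch on the kind symbol, mirroring Property.build.
def build(kind, **args)
  case kind
  when :string  then Property.new(type: "string", **args)
  when :integer then Property.new(type: "integer", **args)
  when :number  then Property.new(type: "number", **args)
  when :boolean then Property.new(type: "boolean", **args)
  else raise ArgumentError, "unknown kind=#{kind.inspect}"
  end
end

build(:string, description: "A location").type # => "string"
```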
data/lib/omniai/tool.rb CHANGED
@@ -3,40 +3,82 @@
3
3
  module OmniAI
4
4
  # Usage:
5
5
  #
6
- # fibonacci = proc do |n:|
7
- # next(0) if n == 0
8
- # next(1) if n == 1
9
- # fibonacci.call(n: n - 1) + fibonacci.call(n: n - 2)
10
- # end
6
+ # class Weather < OmniAI::Tool
7
+ # description 'Find the weather for a location'
8
+ #
9
+ # parameter :location, :string, description: 'The location to find the weather for (e.g. "Toronto, Canada").'
10
+ # parameter :unit, :string, description: 'The unit of measurement (e.g. "Celcius" or "Fahrenheit").'
11
+ # required %i[location]
11
12
  #
12
- # tool = OmniAI::Tool.new(fibonacci,
13
- # name: 'Fibonacci',
14
- # description: 'Cacluate the nth Fibonacci',
15
- # parameters: OmniAI::Tool::Parameters.new(
16
- # properties: {
17
- # n: OmniAI::Tool::Property.integer(description: 'The nth Fibonacci number to calculate')
18
- # },
19
- # required: %i[n],
20
- # )
21
- # )
13
+ # def execute!(location:)
14
+ # # ...
15
+ # end
16
+ # end
22
17
  class Tool
23
- # @return [Proc]
18
+ class << self
19
+ # @param description [String]
20
+ def description(description = nil)
21
+ return @description if description.nil?
22
+
23
+ @description = description
24
+ end
25
+
26
+ # @return [OmniAI::Tool::Parameters]
27
+ def parameters
28
+ @parameters ||= Parameters.new
29
+ end
30
+
31
+ # @param name [Symbol]
32
+ # @param kind [Symbol]
33
+ def parameter(name, kind, **)
34
+ parameters.properties[name] = Property.build(kind, **)
35
+ end
36
+
37
+ # @param names [Array<Symbol>]
38
+ def required(names)
39
+ parameters.required = names
40
+ end
41
+
42
+ # Converts a class name to a tool:
43
+ # - e.g. "IBM::Watson::SearchTool" => "ibm_watson_search"
44
+ #
45
+ # @return [String]
46
+ def namify
47
+ name
48
+ .gsub("::", "_")
49
+ .gsub(/(?<prefix>[A-Z+])(?<suffix>[A-Z][a-z])/, '\k<prefix>_\k<suffix>')
50
+ .gsub(/(?<prefix>[a-z])(?<suffix>[A-Z])/, '\k<prefix>_\k<suffix>')
51
+ .gsub(/_tool$/i, "")
52
+ .downcase
53
+ end
54
+ end
55
+
56
+ # @!attribute [rw] function
57
+ # @return [Proc]
24
58
  attr_accessor :function
25
59
 
26
- # @return [String]
60
+ # @!attribute [rw] name
61
+ # @return [String]
27
62
  attr_accessor :name
28
63
 
29
- # @return [String, nil]
64
+ # @!attribute [description]
65
+ # @return [String, nil]
30
66
  attr_accessor :description
31
67
 
32
- # @return [Hash, nil]
68
+ # @!attribute[parameters]
69
+ # @return [Hash, nil]
33
70
  attr_accessor :parameters
34
71
 
35
72
  # @param function [Proc]
36
73
  # @param name [String]
37
74
  # @param description [String]
38
75
  # @param parameters [Hash]
39
- def initialize(function, name:, description: nil, parameters: nil)
76
+ def initialize(
77
+ function = method(:execute),
78
+ name: self.class.namify,
79
+ description: self.class.description,
80
+ parameters: self.class.parameters
81
+ )
40
82
  @function = function
41
83
  @name = name
42
84
  @description = description
@@ -79,6 +121,10 @@ module OmniAI
79
121
  }
80
122
  end
81
123
 
124
+ def execute(...)
125
+ raise NotImplementedError, "#{self.class}#execute undefined"
126
+ end
127
+
82
128
  # @example
83
129
  # tool.call({ "n" => 6 })
84
130
  # #=> 8
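The `namify` conversion introduced above is plain string manipulation, so it can be exercised standalone. The `gsub` chain below is copied from the added code; wrapping it in a free-standing method (instead of the gem's class-level method on `OmniAI::Tool`) is only for illustration:

```ruby
# Standalone sketch of the class-name-to-tool-name conversion added as
# OmniAI::Tool.namify. The regexes are copied verbatim from the diff.
def namify(name)
  name
    .gsub("::", "_")                                                         # namespace separators -> "_"
    .gsub(/(?<prefix>[A-Z+])(?<suffix>[A-Z][a-z])/, '\k<prefix>_\k<suffix>') # split acronym boundaries (e.g. "HTTPServer")
    .gsub(/(?<prefix>[a-z])(?<suffix>[A-Z])/, '\k<prefix>_\k<suffix>')       # split camelCase boundaries
    .gsub(/_tool$/i, "")                                                     # drop a trailing "Tool" suffix
    .downcase
end

puts namify("IBM::Watson::SearchTool") # => "ibm_watson_search"
puts namify("Weather")                 # => "weather"
```

This is what lets a subclass such as the `Weather` example omit an explicit `name:` argument — the default in the new `initialize` falls back to `self.class.namify`.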
data/lib/omniai/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  module OmniAI
-   VERSION = "1.9.6"
+   VERSION = "2.1.0"
  end
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: omniai
  version: !ruby/object:Gem::Version
-   version: 1.9.6
+   version: 2.1.0
  platform: ruby
  authors:
  - Kevin Sylvestre
  bindir: exe
  cert_chain: []
- date: 2025-02-27 00:00:00.000000000 Z
+ date: 2025-03-17 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: event_stream_parser
@@ -83,17 +83,18 @@ files:
  - lib/omniai/chat.rb
  - lib/omniai/chat/choice.rb
  - lib/omniai/chat/content.rb
+ - lib/omniai/chat/delta.rb
  - lib/omniai/chat/file.rb
  - lib/omniai/chat/function.rb
  - lib/omniai/chat/media.rb
  - lib/omniai/chat/message.rb
  - lib/omniai/chat/message/builder.rb
- - lib/omniai/chat/payload.rb
  - lib/omniai/chat/prompt.rb
  - lib/omniai/chat/response.rb
  - lib/omniai/chat/stream.rb
  - lib/omniai/chat/text.rb
  - lib/omniai/chat/tool_call.rb
+ - lib/omniai/chat/tool_call_list.rb
  - lib/omniai/chat/tool_call_message.rb
  - lib/omniai/chat/tool_call_result.rb
  - lib/omniai/chat/url.rb
data/lib/omniai/chat/payload.rb DELETED
@@ -1,107 +0,0 @@
- # frozen_string_literal: true
-
- module OmniAI
-   class Chat
-     # A chunk or completion.
-     class Payload
-       # @return [Array<Choice>]
-       attr_accessor :choices
-
-       # @return [Usage, nil]
-       attr_accessor :usage
-
-       # @param choices [Array<Choice>]
-       # @param usage [Usage, nil]
-       def initialize(choices: [], usage: nil)
-         @id = SecureRandom.alphanumeric
-         @choices = choices
-         @usage = usage
-       end
-
-       # @return [String]
-       def inspect
-         "#<#{self.class.name} choices=#{choices.inspect} usage=#{usage.inspect}>"
-       end
-
-       # @param data [Hash]
-       # @param context [OmniAI::Context] optional
-       def self.deserialize(data, context: nil)
-         deserialize = context&.deserializer(:payload)
-         return deserialize.call(data, context:) if deserialize
-
-         choices = data["choices"].map { |choice_data| Choice.deserialize(choice_data, context:) }
-         usage = Usage.deserialize(data["usage"], context:) if data["usage"]
-
-         new(choices:, usage:)
-       end
-
-       # @param context [OmniAI::Context] optional
-       # @return [Hash]
-       def serialize(context:)
-         serialize = context&.serializer(:payload)
-         return serialize.call(self, context:) if serialize
-
-         {
-           choices: choices.map { |choice| choice.serialize(context:) },
-           usage: usage&.serialize(context:),
-         }
-       end
-
-       # @param other [OmniAI::Chat::Payload]
-       def merge!(other)
-         return unless other
-
-         @usage = other.usage if other.usage
-
-         other.choices.each do |choice|
-           if choice?(index: choice.index)
-             choices[choice.index] = choices[choice.index].merge(choice)
-           else
-             @choices[choice.index] = choice
-           end
-         end
-       end
-
-       # @param index [Integer]
-       # @return [Choice]
-       def choice(index: 0)
-         @choices[index]
-       end
-
-       # @param index [Integer]
-       # @return [Boolean]
-       def choice?(index: 0)
-         !choice(index:).nil?
-       end
-
-       # @param index [Integer]
-       # @return [Message]
-       def message(index: 0)
-         choice(index:).message
-       end
-
-       # @return [Array<Message>]
-       def messages
-         @choices.map(&:message)
-       end
-
-       # @param index [Integer]
-       # @return [String, nil]
-       def text(index: 0)
-         message(index:).text
-       end
-
-       # @param index [Integer]
-       # @return [Boolean]
-       def text?(index: 0)
-         message(index:).text?
-       end
-
-       # @param index [Integer]
-       # @return [Array<ToolCall>]
-       def tool_call_list(index:)
-         message(index:).tool_call_list
-       end
-     end
-   end
- end
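For context on what the deleted `Payload` provided: its `merge!` folded streamed chunks together by choice index, concatenating partial content and taking the latest usage. A minimal standalone sketch of that merge rule, using a hypothetical stand-in `Choice` struct rather than the gem's class:

```ruby
# Sketch of the merge-by-index rule from the removed
# OmniAI::Chat::Payload#merge!. `Choice` is a minimal stand-in:
# the real class merges structured deltas, not just text.
Choice = Struct.new(:index, :text) do
  def merge(other)
    Choice.new(index, text + other.text) # concatenate streamed fragments
  end
end

# Fold one chunk's choices into the accumulated list, keyed by index.
def merge_choices(choices, chunk_choices)
  chunk_choices.each do |choice|
    if choices[choice.index]
      choices[choice.index] = choices[choice.index].merge(choice)
    else
      choices[choice.index] = choice
    end
  end
  choices
end

choices = []
merge_choices(choices, [Choice.new(0, "Hel")])
merge_choices(choices, [Choice.new(0, "lo")])
puts choices[0].text # => "Hello"
```

In 2.x this accumulation moves into the new `lib/omniai/chat/delta.rb` and `lib/omniai/chat/tool_call_list.rb` listed in the metadata diff above.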