omniai 1.9.5 → 2.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: c57a422d4d3213f3d952940908890c4da086668972a6596f4ce69530af1bf547
-  data.tar.gz: a66bb5326ce64219cc58ab1f072ef40741ae03a83aecd86d96a6417767911c5b
+  metadata.gz: 116095cbb73cb4e826fe28a4c644e37ea1ecb887d601b0ca4cbdc3a70e0ca81e
+  data.tar.gz: d17b5078d80e7cf6858b0e77e22d5949c157841672aa5bbe54a2782e8e10df31
 SHA512:
-  metadata.gz: 832c57105c19df9bf3675a1050cae1bdb775cd117a4222b81ee7032abf5356007fa04f7282017585da7563855d0e6ddc55aa8f330a9dfdceb22b0b9f632c2f96
-  data.tar.gz: d202c4bc08eaefa7b5ec08c3192c0fa08f6938fbd3feade8018d65c4a46cb56d078194b072b2b483388c6fca736fa334e3f717fb14d607123425ce250f969488
+  metadata.gz: 5b3a610780db624592fab65b2437ac552de7dd9f7d5c4f6ec62aff546720db832a40f4a7e24e68d64f42ce84dff24ed73a3105a08c37c61f00faa83d65fd55ca
+  data.tar.gz: 5000a20feca11ee1e29b03d71885d4e75aa6792a39ac828ae78b3fcae5783430f65fe93b209350aa3399dfc7cce1ccbadc0a844518e0ccb772d0b8704f304cc3
data/README.md CHANGED
@@ -6,7 +6,7 @@
 [![Yard](https://img.shields.io/badge/docs-site-blue.svg)](https://omniai.ksylvest.com)
 [![CircleCI](https://img.shields.io/circleci/build/github/ksylvest/omniai)](https://circleci.com/gh/ksylvest/omniai)
 
-OmniAI standardizes the APIs of various AI / LLM companies such as Anthropic, Google, Mistral and OpenAI for the generation of text, the conversion of text-to-speech, the conversion of speech-to-text, the generation of embeddings, and more. It offers a unified API regardless of the provider and task.
+OmniAI provides a unified Ruby API for integrating with multiple AI providers, including Anthropic, DeepSeek, Google, Mistral, and OpenAI. It streamlines AI development by offering a consistent interface for features such as chat, text-to-speech, speech-to-text, and embeddings—ensuring seamless interoperability across platforms. Switching between providers is effortless, making any integration more flexible and reliable.
 
 - [OmniAI::Anthropic](https://github.com/ksylvest/omniai-anthropic)
 - [OmniAI::DeepSeek](https://github.com/ksylvest/omniai-deepseek)
@@ -18,6 +18,8 @@ OmniAI standardizes the APIs of various AI / LLM companies such as Anthropic, Go
 
 ### Example #1: [Chat](https://github.com/ksylvest/omniai/blob/main/examples/chat)
 
+This example demonstrates using `OmniAI` with **Anthropic** to prompt a “biologist” for an analysis of photos, identifying the animals within each one. A system and user message are provided, and the response is streamed in real time.
+
 ```ruby
 require 'omniai/anthropic'
 
@@ -45,6 +47,8 @@ The animals in the photos are:
 
 ### Example #2: [Text-to-Speech](https://github.com/ksylvest/omniai/blob/main/examples/text_to_speech)
 
+This example demonstrates using `OmniAI` with **OpenAI** to convert text to speech and save it to a file.
+
 ```ruby
 require 'omniai/openai'
 
@@ -59,6 +63,8 @@ end
 
 ### Example #3: [Speech-to-Text](https://github.com/ksylvest/omniai/blob/main/examples/speech_to_text)
 
+This example demonstrates using `OmniAI` with **OpenAI** to convert speech to text.
+
 ```ruby
 require 'omniai/openai'
 
@@ -72,6 +78,8 @@ end
 
 ### Example #4: [Tools](https://github.com/ksylvest/omniai/blob/main/examples/tools)
 
+This example demonstrates how to use `OmniAI` with **Google** to call a custom tool that generates a weather report. The tool accepts a list of locations (city and country) and returns a temperature — randomly generated for demonstration purposes — based on the provided parameters.
+
 ```ruby
 require 'omniai/google'
 
@@ -127,6 +135,8 @@ The weather is 24° celcius in London and 42° fahrenheit in Seattle.
 
 ### Example #5: [Embeddings](https://github.com/ksylvest/omniai/blob/main/examples/embeddings)
 
+This example demonstrates using `OmniAI` with **Mistral** to generate embeddings for a dataset. It defines a set of entries (e.g. "George is a teacher." or "Ringo is a doctor.") and then compares the embeddings generated from a query (e.g. "What does George do?" or "Who is a doctor?") to rank the entries by relevance.
+
 ```ruby
 require 'omniai/mistral'
 
@@ -5,15 +5,20 @@ module OmniAI
     # A choice wraps a message and index returned by an LLM. The default is to generate a single choice. Some LLMs
     # support generating multiple choices at once (e.g. giving you multiple options to choose from).
     class Choice
-      # @return [Integer]
+      DEFAULT_INDEX = 0
+      private_constant :DEFAULT_INDEX
+
+      # @!attribute [rw] index
+      # @return [Integer]
       attr_accessor :index
 
-      # @return [Message]
+      # @!attribute [rw] message
+      # @return [Message]
       attr_accessor :message
 
       # @param message [Message]
       # @param index [Integer]
-      def initialize(message:, index: 0)
+      def initialize(message:, index: DEFAULT_INDEX)
        @message = message
        @index = index
      end
@@ -31,7 +36,7 @@ module OmniAI
        deserialize = context&.deserializer(:choice)
        return deserialize.call(data, context:) if deserialize
 
-        index = data["index"]
+        index = data["index"] || DEFAULT_INDEX
        message = Message.deserialize(data["message"] || data["delta"], context:)
 
        new(message:, index:)
@@ -0,0 +1,23 @@
+# frozen_string_literal: true
+
+module OmniAI
+  class Chat
+    # An `OmniAI::Chat::Delta` is used to assist with streaming. It represents a chunk of a conversation that is yielded
+    # back to the caller.
+    class Delta
+      # @!attribute [rw] text
+      #
+      attr_accessor :text
+
+      # @param text [String]
+      def initialize(text:)
+        @text = text
+      end
+
+      # @return [Boolean]
+      def text?
+        !text.empty?
+      end
+    end
+  end
+end
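For orientation, the behaviour of the new `Delta` can be shown with a standalone sketch; the class is re-declared here (rather than loaded from the gem) so the snippet is self-contained:

```ruby
# Standalone re-creation of the new OmniAI::Chat::Delta, for illustration only.
class Delta
  attr_accessor :text

  def initialize(text:)
    @text = text
  end

  # A delta carries text when the streamed chunk contained any content.
  def text?
    !text.empty?
  end
end

Delta.new(text: "Hello").text? # => true
Delta.new(text: "").text?      # => false
```

Per the diff, the streaming block now yields these lightweight deltas in place of the removed `Payload`.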
@@ -4,6 +4,8 @@ module OmniAI
   class Chat
     # A file is media that can be sent to many LLMs.
     class File < Media
+      # @!attribute [rw] io
+      # @return [IO, Pathname, String]
       attr_accessor :io
 
       # @param io [IO, Pathname, String]
@@ -4,10 +4,12 @@ module OmniAI
   class Chat
     # A function that includes a name / arguments.
     class Function
-      # @return [String]
+      # @!attribute [rw] name
+      # @return [String]
       attr_accessor :name
 
-      # @return [Hash]
+      # @!attribute [rw] arguments
+      # @return [Hash]
       attr_accessor :arguments
 
       # @param name [String]
@@ -38,18 +38,21 @@ module OmniAI
        end
      end
 
-      # @return [Array<Content>, String]
+      # @!attribute [rw] content
+      # @return [Array<Content>, String]
      attr_accessor :content
 
-      # @return [String]
+      # @!attribute [rw] role
+      # @return [String]
      attr_accessor :role
 
-      # @return [Array<ToolCall>, nil]
+      # @!attribute [rw] tool_call_list
+      # @return [ToolCallList, nil]
      attr_accessor :tool_call_list
 
      # @param content [String, nil]
      # @param role [String]
-      # @param tool_call_list [Array<ToolCall>, nil]
+      # @param tool_call_list [ToolCallList, nil]
      def initialize(content:, role: Role::USER, tool_call_list: nil)
        @content = content
        @role = role
@@ -83,7 +86,7 @@ module OmniAI
 
        role = data["role"]
        content = Content.deserialize(data["content"], context:)
-        tool_call_list = data["tool_calls"]&.map { |subdata| ToolCall.deserialize(subdata, context:) }
+        tool_call_list = ToolCallList.deserialize(data["tool_calls"], context:)
 
        new(content:, role:, tool_call_list:)
      end
@@ -101,7 +104,7 @@ module OmniAI
        return serializer.call(self, context:) if serializer
 
        content = @content.is_a?(Array) ? @content.map { |content| content.serialize(context:) } : @content
-        tool_calls = @tool_call_list&.map { |tool_call| tool_call.serialize(context:) }
+        tool_calls = @tool_call_list&.serialize(context:)
 
        { role: @role, content:, tool_calls: }.compact
      end
@@ -2,68 +2,116 @@
 
 module OmniAI
   class Chat
-    # Used when processing everything at once.
+    # An `OmniAI::Chat::Response` encapsulates the result of generating a chat completion.
     class Response
       # @return [Hash]
       attr_accessor :data
 
+      # @return [Array<Choice>]
+      attr_accessor :choices
+
+      # @return [Usage, nil]
+      attr_accessor :usage
+
       # @param data [Hash]
-      # @param context [Context, nil]
-      def initialize(data:, context: nil)
+      # @param choices [Array<Choice>]
+      # @param usage [Usage, nil]
+      def initialize(data:, choices: [], usage: nil)
        @data = data
-        @context = context
+        @choices = choices
+        @usage = usage
      end
 
-      # @return [Payload]
-      def completion
-        @completion ||= Payload.deserialize(@data, context: @context)
+      # @return [String]
+      def inspect
+        "#<#{self.class.name} choices=#{choices.inspect} usage=#{usage.inspect}>"
      end
 
-      # @return [Usage, nil]
-      def usage
-        completion.usage
-      end
+      # @param data [Hash]
+      # @param context [OmniAI::Context] optional
+      def self.deserialize(data, context: nil)
+        deserialize = context&.deserializer(:response)
+        return deserialize.call(data, context:) if deserialize
 
-      # @return [Array<Choice>]
-      def choices
-        completion.choices
+        choices = data["choices"].map { |choice_data| Choice.deserialize(choice_data, context:) }
+        usage = Usage.deserialize(data["usage"], context:) if data["usage"]
+
+        new(data:, choices:, usage:)
      end
 
-      # @return [Array<Message>]
-      def messages
-        completion.messages
+      # @param context [OmniAI::Context] optional
+      # @return [Hash]
+      def serialize(context:)
+        serialize = context&.serializer(:response)
+        return serialize.call(self, context:) if serialize
+
+        {
+          choices: choices.map { |choice| choice.serialize(context:) },
+          usage: usage&.serialize(context:),
+        }
      end
 
      # @param index [Integer]
-      # @return [Choice]
+      #
+      # @return [Choice, nil]
      def choice(index: 0)
-        completion.choice(index:)
+        @choices[index]
+      end
+
+      # @param index [Integer]
+      #
+      # @return [Boolean]
+      def choice?(index: 0)
+        !choice(index:).nil?
      end
 
      # @param index [Integer]
-      # @return [Message]
+      #
+      # @return [Message, nil]
      def message(index: 0)
-        completion.message(index:)
+        choice(index:)&.message
      end
 
-      # @return [String]
-      def text
-        message.text
+      # @param index [Integer]
+      #
+      # @return [Boolean]
+      def message?(index: 0)
+        !message(index:).nil?
+      end
+
+      # @return [Array<Message>]
+      def messages
+        @choices.map(&:message)
+      end
+
+      # @param index [Integer]
+      #
+      # @return [String, nil]
+      def text(index: 0)
+        message(index:)&.text
      end
 
+      # @param index [Integer]
+      #
      # @return [Boolean]
-      def text?
-        message.text?
+      def text?(index: 0)
+        message = message(index:)
+
+        !message.nil? && message.text?
      end
 
-      # @return [Array<ToolCall>]
-      def tool_call_list
-        choice.tool_call_list
+      # @param index [Integer]
+      #
+      # @return [ToolCallList]
+      def tool_call_list(index: 0)
+        message(index:)&.tool_call_list
      end
 
      # @return [Boolean]
-      def tool_call_list?
-        tool_call_list&.any?
+      def tool_call_list?(index: 0)
+        tool_call_list = tool_call_list(index:)
+
+        !tool_call_list.nil? && tool_call_list.any?
      end
    end
  end
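The reworked accessors are index-aware and nil-safe. A minimal standalone mirror (the `Message`/`Choice` structs below are stand-ins, not the gem's own classes) shows the effect:

```ruby
# Stand-in structs mirroring the shapes Response works with (illustration only).
Message = Struct.new(:text) do
  def text?
    !text.nil? && !text.empty?
  end
end
Choice = Struct.new(:index, :message)

class Response
  def initialize(choices:)
    @choices = choices
  end

  # Index-aware, nil-safe accessors in the style introduced by 2.0.0.
  def choice(index: 0)
    @choices[index]
  end

  def choice?(index: 0)
    !choice(index:).nil?
  end

  def message(index: 0)
    choice(index:)&.message
  end

  def text(index: 0)
    message(index:)&.text
  end

  def text?(index: 0)
    message = message(index:)
    !message.nil? && message.text?
  end
end

response = Response.new(choices: [Choice.new(0, Message.new("Hello!"))])
response.text              # => "Hello!"
response.text(index: 1)    # => nil (an out-of-range index no longer raises)
response.choice?(index: 1) # => false
```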
@@ -2,29 +2,43 @@
 
 module OmniAI
   class Chat
-    # Used when streaming to process chunks of data.
+    # A stream is used to process a series of chunks of data. It converts the following into a combined payload:
+    #
+    #   { "id":"...", "choices": [{ "index": 0,"delta": { "role" :"assistant", "content":"" } }] }
+    #   { "id":"...", "choices": [{ "index": 0,"delta": { "content" :"A" } }] }
+    #   { "id":"...", "choices": [{ "index": 0,"delta": { "content" :"B" } }] }
+    #   ...
+    #
+    # Every
     class Stream
       # @param logger [OmniAI::Client]
-      # @param body [HTTP::Response::Body]
-      # @param context [Context, nil]
-      def initialize(body:, logger: nil, context: nil)
-        @body = body
+      # @param chunks [Enumerable<String>]
+      def initialize(chunks:, logger: nil)
+        @chunks = chunks
        @logger = logger
-        @context = context
      end
 
-      # @yield [payload]
-      # @yieldparam payload [OmniAI::Chat::Payload]
+      # @yield [delta]
+      # @yieldparam delta [OmniAI::Chat::Delta]
+      #
+      # @return [Hash]
      def stream!(&block)
-        @body.each do |chunk|
+        @data = { "choices" => [] }
+
+        @chunks.map do |chunk|
          parser.feed(chunk) do |type, data, id|
            process!(type, data, id, &block)
          end
        end
+
+        @data
      end
 
      protected
 
@@ -38,18 +52,78 @@ module OmniAI
        @logger&.debug("Stream#process! #{arguments.join(' ')}")
      end
 
+      # @yield [delta]
+      # @yieldparam delta [OmniAI::Chat::Delta]
+      #
      # @param type [String]
      # @param data [String]
      # @param id [String]
-      #
-      # @yield [payload]
-      # @yieldparam payload [OmniAI::Chat::Payload]
-      def process!(type, data, id, &block)
+      def process!(type, data, id, &)
        log(type, data, id)
 
        return if data.eql?("[DONE]")
 
-        block.call(Payload.deserialize(JSON.parse(data), context: @context))
+        process_data!(data: JSON.parse(data), &)
+      end
+
+      # @yield [delta]
+      # @yieldparam delta [OmniAI::Chat::Delta]
+      #
+      # @param data [Hash]
+      def process_data!(data:, &block)
+        data.each do |key, value|
+          @data[key] = value unless key.eql?("choices") || key.eql?("object")
+        end
+
+        data["choices"].each do |choice|
+          merge_choice!(choice:)
+
+          text = choice["delta"]["content"]
+          block&.call(Delta.new(text:)) if text
+        end
+      end
+
+      # @param choice [Hash]
+      def merge_choice!(choice:)
+        index = choice["index"]
+        delta = choice["delta"]
+
+        if @data["choices"][index].nil?
+          @data["choices"][index] = {
+            "index" => index,
+            "message" => delta,
+          }
+        else
+          message = @data["choices"][index]["message"]
+
+          message["content"] += delta["content"] if delta["content"]
+
+          merge_tool_call_list!(tool_call_list: delta["tool_calls"], message:)
+        end
+      end
+
+      # @param tool_call_list [Array<Hash>, nil]
+      # @param message [Hash]
+      def merge_tool_call_list!(tool_call_list:, message:)
+        return unless tool_call_list
+
+        message["tool_calls"] ||= []
+
+        tool_call_list.each do |tool_call|
+          merge_tool_call_data!(tool_call:, message:)
+        end
+      end
+
+      # @param tool_call [Hash]
+      # @param message [Hash]
+      def merge_tool_call_data!(tool_call:, message:)
+        tool_call_index = tool_call["index"]
+
+        if message["tool_calls"][tool_call_index].nil?
+          message["tool_calls"][tool_call_index] = tool_call
+        else
+          message["tool_calls"][tool_call_index]["function"]["arguments"] += tool_call["function"]["arguments"]
+        end
      end
 
      # @return [EventStreamParser::Parser]
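The heart of the new streaming logic is the merge step: each chunk's delta content is appended to the accumulated message at the same choice index. Below is a simplified standalone stand-in for the private `merge_choice!` (tool-call merging omitted), exercised against the chunk shapes shown in the class comment:

```ruby
# Simplified stand-in for Stream#merge_choice!: fold one streamed choice delta
# into the combined payload, keyed by the choice index.
def merge_choice!(data, choice)
  index = choice["index"]
  delta = choice["delta"]

  if data["choices"][index].nil?
    # First chunk for this index seeds the message.
    data["choices"][index] = { "index" => index, "message" => delta }
  else
    # Later chunks append their content to the existing message.
    message = data["choices"][index]["message"]
    message["content"] += delta["content"] if delta["content"]
  end
end

data = { "choices" => [] }
chunks = [
  { "index" => 0, "delta" => { "role" => "assistant", "content" => "" } },
  { "index" => 0, "delta" => { "content" => "A" } },
  { "index" => 0, "delta" => { "content" => "B" } },
]
chunks.each { |choice| merge_choice!(data, choice) }
data["choices"][0]["message"] # => { "role" => "assistant", "content" => "AB" }
```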
@@ -4,10 +4,12 @@ module OmniAI
   class Chat
     # A tool-call that includes an ID / function.
     class ToolCall
-      # @return [String]
+      # @!attribute [rw] id
+      # @return [String]
       attr_accessor :id
 
-      # @return [Function]
+      # @!attribute [rw] function
+      # @return [Function]
       attr_accessor :function
 
       # @param id [String]
@@ -25,7 +27,7 @@ module OmniAI
      # @param data [Hash]
      # @param context [Context] optional
      #
-      # @return [Function]
+      # @return [ToolCall]
      def self.deserialize(data, context: nil)
        deserialize = context&.deserializer(:tool_call)
        return deserialize.call(data, context:) if deserialize
@@ -0,0 +1,52 @@
+# frozen_string_literal: true
+
+module OmniAI
+  class Chat
+    # An `OmniAI::Chat::ToolCallList` is a collection designed to handle the merging of multiple tool call arrays. LLMs
+    # provide a subset of tool call items when chunking, so merging requires the use of the tool call index to combine.
+    class ToolCallList
+      include Enumerable
+
+      # @param entries [Array<ToolCall>]
+      def initialize(entries: [])
+        @entries = entries
+      end
+
+      # Usage:
+      #
+      #   ToolCallList.deserialize([]) # => #<ToolCallList ...>
+      #
+      # @param data [Array]
+      # @param context [Context] optional
+      #
+      # @return [ToolCallList]
+      def self.deserialize(data, context: nil)
+        return unless data
+
+        new(entries: data.map { |subdata| ToolCall.deserialize(subdata, context:) })
+      end
+
+      # Usage:
+      #
+      #   tool_call_list.serialize # => [...]
+      #
+      # @param context [Context] optional
+      #
+      # @return [Array<Hash>]
+      def serialize(context: nil)
+        @entries.compact.map { |tool_call| tool_call.serialize(context:) }
+      end
+
+      # @return [String]
+      def inspect
+        "#<#{self.class.name} entries=#{@entries.inspect}>"
+      end
+
+      # @yield toolcall
+      # @yieldparam toolcall [ToolCall]
+      def each(&)
+        @entries.each(&)
+      end
+    end
+  end
+end
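Because the class includes `Enumerable` and defines `#each`, the entire Enumerable API comes along for free. A standalone mirror (entries are plain strings here rather than real `ToolCall` objects) demonstrates:

```ruby
# Standalone mirror of ToolCallList's Enumerable behaviour (illustration only).
class ToolCallList
  include Enumerable

  def initialize(entries: [])
    @entries = entries
  end

  # Defining #each is all Enumerable requires; any?, count, map, etc. follow.
  def each(&block)
    @entries.each(&block)
  end
end

list = ToolCallList.new(entries: %w[tool_a tool_b])
list.any?          # => true (what Response#tool_call_list? relies on)
list.count         # => 2
list.map(&:upcase) # => ["TOOL_A", "TOOL_B"]
```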
@@ -28,7 +28,7 @@ module OmniAI
      #
      # @return [ToolMessage]
      def self.deserialize(data, context: nil)
-        deserialize = context&.deserializer(:tool_message)
+        deserialize = context&.deserializer(:tool_call_message)
        return deserialize.call(data, context:) if deserialize
 
        role = data["role"]
@@ -47,7 +47,7 @@ module OmniAI
      #
      # @return [Hash]
      def serialize(context: nil)
-        serializer = context&.serializer(:tool_message)
+        serializer = context&.serializer(:tool_call_message)
        return serializer.call(self, context:) if serializer
 
        role = @role
data/lib/omniai/chat.rb CHANGED
@@ -86,12 +86,25 @@ module OmniAI
    end
 
    # @raise [HTTPError]
+    #
+    # @return [OmniAI::Chat::Response]
    def process!
      response = request!
 
      raise HTTPError, response.flush unless response.status.ok?
 
-      parse!(response:)
+      completion = parse!(response:)
+
+      if @tools && completion.tool_call_list?
+        spawn!(
+          @prompt.dup.tap do |prompt|
+            prompt.messages += completion.messages
+            prompt.messages += build_tool_call_messages(completion.tool_call_list)
+          end
+        ).process!
+      else
+        completion
+      end
    end
 
    protected
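With this change the tool-call loop lives in `process!` rather than `complete!`, so tool execution is triggered for streamed and non-streamed completions alike. A hypothetical simulation of the control flow (the `Completion` struct and the queue of turns are stand-ins for real provider responses, not the gem's API):

```ruby
# Hypothetical stand-in for a parsed completion.
Completion = Struct.new(:tool_calls, :text) do
  def tool_call_list?
    !tool_calls.empty?
  end
end

# Simplified simulation of Chat#process!: keep re-prompting (the gem's spawn!)
# while the latest completion still contains tool calls, then return the final
# completion to the caller.
def process!(turns)
  completion = turns.shift # stand-in for request! + parse!
  if completion.tool_call_list?
    process!(turns)        # stand-in for spawn!(prompt + tool results).process!
  else
    completion
  end
end

turns = [Completion.new(["get_weather"], nil), Completion.new([], "It is sunny.")]
process!(turns).text # => "It is sunny."
```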
@@ -135,6 +148,7 @@ module OmniAI
    end
 
    # @param response [HTTP::Response]
+    #
    # @return [OmniAI::Chat::Response]
    def parse!(response:)
      if @stream
@@ -145,31 +159,22 @@ module OmniAI
    end
 
    # @param response [HTTP::Response]
+    #
    # @return [OmniAI::Chat::Response]
    def complete!(response:)
-      completion = self.class::Response.new(data: response.parse, context:)
-
-      if @tools && completion.tool_call_list?
-        spawn!(
-          @prompt.dup.tap do |prompt|
-            prompt.messages += completion.messages
-            prompt.messages += build_tool_call_messages(completion.tool_call_list)
-          end
-        ).process!
-      else
-        completion
-      end
+      OmniAI::Chat::Response.deserialize(response.parse, context:)
    end
 
    # @param response [HTTP::Response]
-    # @return [OmniAI::Chat::Stream]
+    #
+    # @return [OmniAI::Chat::Response]
    def stream!(response:)
      raise Error, "#{self.class.name}#stream! unstreamable" unless @stream
 
-      self.class::Stream.new(body: response.body, logger:, context:).stream! do |chunk|
+      data = self.class::Stream.new(chunks: response.body, logger:).stream! do |chunk|
        case @stream
        when IO, StringIO
-          if chunk.text
+          if chunk.text?
            @stream << chunk.text
            @stream.flush
          end
@@ -177,7 +182,11 @@ module OmniAI
        end
      end
 
-      @stream.puts if @stream.is_a?(IO) || @stream.is_a?(StringIO)
+      response = OmniAI::Chat::Response.deserialize(data, context:)
+
+      @stream.puts if response.text? && (@stream.is_a?(IO) || @stream.is_a?(StringIO))
+
+      response
    end
 
    # @return [HTTP::Response]
data/lib/omniai/cli.rb CHANGED
@@ -10,8 +10,6 @@ module OmniAI
  #   cli = OmniAI::CLI.new
  #   cli.parse
  class CLI
-    ChatArgs = Struct.new(:provider, :model, :temperature)
-
    # @param stdin [IO] a stream
    # @param stdout [IO] a stream
    # @param provider [String] a provider
data/lib/omniai/client.rb CHANGED
@@ -134,7 +134,7 @@ module OmniAI
    # @yield [prompt] optional
    # @yieldparam prompt [OmniAI::Chat::Prompt]
    #
-    # @return [OmniAI::Chat::Completion]
+    # @return [OmniAI::Chat::Response]
    def chat(messages = nil, model:, temperature: nil, format: nil, stream: nil, tools: nil, &)
      raise NotImplementedError, "#{self.class.name}#chat undefined"
    end
@@ -58,7 +58,7 @@ module OmniAI
    #   )
    #
    # @param properties [Hash<String, OmniAI::Tool::Property>] required - the properties of the object
-    # @param requird [Array<Symbol>] optional - the required properties
+    # @param required [Array<Symbol>] optional - the required properties
    # @param description [String] optional - a description of the object
    #
    # @return [OmniAI::Tool::Array]
@@ -4,7 +4,17 @@ module OmniAI
  class Transcribe
    # A transcription returned by the API.
    class Transcription
-      attr_accessor :text, :model, :format
+      # @!attribute [rw] text
+      # @return [String]
+      attr_accessor :text
+
+      # @!attribute [rw] model
+      # @return [String]
+      attr_accessor :model
+
+      # @!attribute [rw] format
+      # @return [String]
+      attr_accessor :format
 
      # @param text [String]
      # @param model [String]
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module OmniAI
-  VERSION = "1.9.5"
+  VERSION = "2.0.0"
 end
data/lib/omniai.rb CHANGED
@@ -27,6 +27,7 @@ module OmniAI
      @response = response
    end
 
+    # @return [String]
    def inspect
      "#<#{self.class} status=#{@response.status} body=#{@response.body}>"
    end
metadata CHANGED
@@ -1,13 +1,13 @@
 --- !ruby/object:Gem::Specification
 name: omniai
 version: !ruby/object:Gem::Version
-  version: 1.9.5
+  version: 2.0.0
 platform: ruby
 authors:
 - Kevin Sylvestre
 bindir: exe
 cert_chain: []
-date: 2025-01-30 00:00:00.000000000 Z
+date: 2025-03-03 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: event_stream_parser
@@ -83,17 +83,18 @@ files:
 - lib/omniai/chat.rb
 - lib/omniai/chat/choice.rb
 - lib/omniai/chat/content.rb
+- lib/omniai/chat/delta.rb
 - lib/omniai/chat/file.rb
 - lib/omniai/chat/function.rb
 - lib/omniai/chat/media.rb
 - lib/omniai/chat/message.rb
 - lib/omniai/chat/message/builder.rb
-- lib/omniai/chat/payload.rb
 - lib/omniai/chat/prompt.rb
 - lib/omniai/chat/response.rb
 - lib/omniai/chat/stream.rb
 - lib/omniai/chat/text.rb
 - lib/omniai/chat/tool_call.rb
+- lib/omniai/chat/tool_call_list.rb
 - lib/omniai/chat/tool_call_message.rb
 - lib/omniai/chat/tool_call_result.rb
 - lib/omniai/chat/url.rb
@@ -140,7 +141,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
 - !ruby/object:Gem::Version
   version: '0'
 requirements: []
-rubygems_version: 3.6.2
+rubygems_version: 3.6.3
 specification_version: 4
 summary: A generalized framework for interacting with many AI services
 test_files: []
@@ -1,85 +0,0 @@
-# frozen_string_literal: true
-
-module OmniAI
-  class Chat
-    # A chunk or completion.
-    class Payload
-      # @return [Array<Choice>]
-      attr_accessor :choices
-
-      # @return [Usage, nil]
-      attr_accessor :usage
-
-      # @param choices [Array<Choice>]
-      # @param usage [Usage, nil]
-      def initialize(choices:, usage: nil)
-        @choices = choices
-        @usage = usage
-      end
-
-      # @return [String]
-      def inspect
-        "#<#{self.class.name} choices=#{choices.inspect} usage=#{usage.inspect}>"
-      end
-
-      # @param data [Hash]
-      # @param context [OmniAI::Context] optional
-      def self.deserialize(data, context: nil)
-        deserialize = context&.deserializer(:payload)
-        return deserialize.call(data, context:) if deserialize
-
-        choices = data["choices"].map { |choice_data| Choice.deserialize(choice_data, context:) }
-        usage = Usage.deserialize(data["usage"], context:) if data["usage"]
-
-        new(choices:, usage:)
-      end
-
-      # @param context [OmniAI::Context] optional
-      # @return [Hash]
-      def serialize(context:)
-        serialize = context&.serializer(:payload)
-        return serialize.call(self, context:) if serialize
-
-        {
-          choices: choices.map { |choice| choice.serialize(context:) },
-          usage: usage&.serialize(context:),
-        }
-      end
-
-      # @param index [Integer]
-      # @return [Choice]
-      def choice(index: 0)
-        @choices[index]
-      end
-
-      # @param index [Integer]
-      # @return [Message]
-      def message(index: 0)
-        choice(index:).message
-      end
-
-      # @return [Array<Message>]
-      def messages
-        @choices.map(&:message)
-      end
-
-      # @param index [Integer]
-      # @return [String, nil]
-      def text(index: 0)
-        message(index:).text
-      end
-
-      # @param index [Integer]
-      # @return [Boolean]
-      def text?(index: 0)
-        message(index:).text?
-      end
-
-      # @param index [Integer]
-      # @return [Array<ToolCall>]
-      def tool_call_list(index:)
-        message(index:).tool_call_list
-      end
-    end
-  end
-end