omniai 1.9.5 → 1.9.6

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: c57a422d4d3213f3d952940908890c4da086668972a6596f4ce69530af1bf547
-  data.tar.gz: a66bb5326ce64219cc58ab1f072ef40741ae03a83aecd86d96a6417767911c5b
+  metadata.gz: 23098dd1fbbb9542265f3d32a9c37642a1a554fa4a789c78fb4e25a364213116
+  data.tar.gz: 4885f37cc46af15c1e208a6c9c4f0f06b84d4986c1d0b9361fd2b9e451fb3352
 SHA512:
-  metadata.gz: 832c57105c19df9bf3675a1050cae1bdb775cd117a4222b81ee7032abf5356007fa04f7282017585da7563855d0e6ddc55aa8f330a9dfdceb22b0b9f632c2f96
-  data.tar.gz: d202c4bc08eaefa7b5ec08c3192c0fa08f6938fbd3feade8018d65c4a46cb56d078194b072b2b483388c6fca736fa334e3f717fb14d607123425ce250f969488
+  metadata.gz: a63e630e0aeeeb8342cceba0d2505c487e9dacfcd76bcc07e75d8c29c132a9806d3d4fdc4346ebcd6a6eb02534a14de75f9bd5530388eec1652ff19a61b2d008
+  data.tar.gz: a61425c68913a53de2d0300fe3c74ea546491f69fffd53c1c6fbd414924b7204b718c7676de5b604a063532a8149bbaef132032b1cbc49ac0e69d9030bfd456e
data/README.md CHANGED
@@ -6,7 +6,7 @@
 [![Yard](https://img.shields.io/badge/docs-site-blue.svg)](https://omniai.ksylvest.com)
 [![CircleCI](https://img.shields.io/circleci/build/github/ksylvest/omniai)](https://circleci.com/gh/ksylvest/omniai)

-OmniAI standardizes the APIs of various AI / LLM companies such as Anthropic, Google, Mistral and OpenAI for the generation of text, the conversion of text-to-speech, the conversion of speech-to-text, the generation of embeddings, and more. It offers a unified API regardless of the provider and task.
+OmniAI provides a unified Ruby API for integrating with multiple AI providers, including Anthropic, DeepSeek, Google, Mistral, and OpenAI. It streamlines AI development by offering a consistent interface for features such as chat, text-to-speech, speech-to-text, and embeddings—ensuring seamless interoperability across platforms. Switching between providers is effortless, making any integration more flexible and reliable.

 - [OmniAI::Anthropic](https://github.com/ksylvest/omniai-anthropic)
 - [OmniAI::DeepSeek](https://github.com/ksylvest/omniai-deepseek)
@@ -18,6 +18,8 @@ OmniAI standardizes the APIs of various AI / LLM companies such as Anthropic, Go

 ### Example #1: [Chat](https://github.com/ksylvest/omniai/blob/main/examples/chat)

+This example demonstrates using `OmniAI` with **Anthropic** to prompt a “biologist” for an analysis of photos, identifying the animals within each one. A system and user message are provided, and the response is streamed in real time.
+
 ```ruby
 require 'omniai/anthropic'

@@ -45,6 +47,8 @@ The animals in the photos are:

 ### Example #2: [Text-to-Speech](https://github.com/ksylvest/omniai/blob/main/examples/text_to_speech)

+This example demonstrates using `OmniAI` with **OpenAI** to convert text to speech and save it to a file.
+
 ```ruby
 require 'omniai/openai'

@@ -59,6 +63,8 @@ end

 ### Example #3: [Speech-to-Text](https://github.com/ksylvest/omniai/blob/main/examples/speech_to_text)

+This example demonstrates using `OmniAI` with **OpenAI** to convert speech to text.
+
 ```ruby
 require 'omniai/openai'

@@ -72,6 +78,8 @@ end

 ### Example #4: [Tools](https://github.com/ksylvest/omniai/blob/main/examples/tools)

+This example demonstrates how to use `OmniAI` with **Google** to call a custom tool that generates a weather report. The tool accepts a list of locations (city and country) and returns a temperature — randomly generated for demonstration purposes — based on the provided parameters.
+
 ```ruby
 require 'omniai/google'

@@ -127,6 +135,8 @@ The weather is 24° celcius in London and 42° fahrenheit in Seattle.

 ### Example #5: [Embeddings](https://github.com/ksylvest/omniai/blob/main/examples/embeddings)

+This example demonstrates using `OmniAI` with **Mistral** to generate embeddings for a dataset. It defines a set of entries (e.g. "George is a teacher." or "Ringo is a doctor.") and then compares the embeddings generated from a query (e.g. "What does George do?" or "Who is a doctor?") to rank the entries by relevance.
+
 ```ruby
 require 'omniai/mistral'

@@ -5,15 +5,20 @@ module OmniAI
     # A choice wraps a message and index returned by an LLM. The default is to generate a single choice. Some LLMs
     # support generating multiple choices at once (e.g. giving you multiple options to choose from).
     class Choice
-      # @return [Integer]
+      DEFAULT_INDEX = 0
+      private_constant :DEFAULT_INDEX
+
+      # @!attribute [rw] index
+      # @return [Integer]
       attr_accessor :index

-      # @return [Message]
+      # @!attribute [rw] message
+      # @return [Message]
       attr_accessor :message

       # @param message [Message]
       # @param index [Integer]
-      def initialize(message:, index: 0)
+      def initialize(message:, index: DEFAULT_INDEX)
         @message = message
         @index = index
       end
@@ -31,7 +36,7 @@ module OmniAI
         deserialize = context&.deserializer(:choice)
         return deserialize.call(data, context:) if deserialize

-        index = data["index"]
+        index = data["index"] || DEFAULT_INDEX
         message = Message.deserialize(data["message"] || data["delta"], context:)

         new(message:, index:)
@@ -49,6 +54,13 @@ module OmniAI
         }
       end

+      # @param other [OmniAI::Chat::Choice]
+      #
+      # @return [OmniAI::Chat::Choice]
+      def merge(other)
+        self.class.new(index: @index, message: @message.merge(other.message))
+      end
+
       # @return [Message]
       def delta
         message
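The `DEFAULT_INDEX` fallback above guards against providers that omit `"index"` from streamed deltas. A minimal standalone sketch of that deserialization rule (the `deserialize_choice` helper is hypothetical, not the gem's API):

```ruby
require "json"

DEFAULT_INDEX = 0 # mirrors the private constant introduced above

# Hypothetical helper mirroring the shape of Choice.deserialize: fall back
# to index 0 when a provider omits "index", and accept either a full
# "message" or a streaming "delta".
def deserialize_choice(data)
  {
    index: data["index"] || DEFAULT_INDEX,
    message: data["message"] || data["delta"],
  }
end

chunk = JSON.parse('{"delta":{"content":"Hi"}}')
choice = deserialize_choice(chunk)
choice[:index] # => 0 (previously this would have been nil)
```

Before this release, a chunk without `"index"` produced a `nil` index, which breaks index-keyed accumulation during streaming.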
@@ -4,6 +4,8 @@ module OmniAI
   class Chat
     # A file is media that can be sent to many LLMs.
     class File < Media
+      # @!attribute [rw] io
+      # @return [IO, Pathname, String]
       attr_accessor :io

       # @param io [IO, Pathname, String]
@@ -38,13 +38,16 @@ module OmniAI
         end
       end

-      # @return [Array<Content>, String]
+      # @!attribute [rw] content
+      # @return [Array<Content>, String]
       attr_accessor :content

-      # @return [String]
+      # @!attribute [rw] role
+      # @return [String]
       attr_accessor :role

-      # @return [Array<ToolCall>, nil]
+      # @!attribute [rw] tool_call_list
+      # @return [Array<ToolCall>, nil]
       attr_accessor :tool_call_list

       # @param content [String, nil]
@@ -106,6 +109,27 @@ module OmniAI
         { role: @role, content:, tool_calls: }.compact
       end

+      # @param other [OmniAI::Chat::Message]
+      #
+      # @return [OmniAI::Chat::Message]
+      def merge(other)
+        content =
+          if @content && other.content
+            @content + other.content
+          else
+            @content || other.content
+          end
+
+        tool_call_list =
+          if @tool_call_list && other.tool_call_list
+            @tool_call_list + other.tool_call_list
+          else
+            @tool_call_list || other.tool_call_list
+          end
+
+        self.class.new(content:, role:, tool_call_list:)
+      end
+
       # @return [Boolean]
       def role?(role)
         String(@role).eql?(String(role))
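The nil-safe combination in the new `Message#merge` above can be sketched standalone. The `Message` struct here is a simplified stand-in for the gem's real class: content and tool calls concatenate when both sides are present, and otherwise whichever side exists is kept.

```ruby
# Simplified stand-in (NOT the real OmniAI::Chat::Message) illustrating the
# merge rule added in this release: a + b when both exist, else a || b.
Message = Struct.new(:content, :tool_call_list, keyword_init: true) do
  def merge(other)
    Message.new(
      content: combine(content, other.content),
      tool_call_list: combine(tool_call_list, other.tool_call_list)
    )
  end

  private

  # Works for both String content and Array tool call lists, since both
  # support `+` for concatenation.
  def combine(a, b)
    a && b ? a + b : a || b
  end
end

a = Message.new(content: "Hello")
b = Message.new(content: ", world", tool_call_list: [:call])
merged = a.merge(b)
merged.content        # => "Hello, world"
merged.tool_call_list # => [:call]
```

This is the primitive that lets streamed deltas (each carrying a fragment of content) fold into one complete message.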
@@ -12,7 +12,8 @@ module OmniAI

       # @param choices [Array<Choice>]
       # @param usage [Usage, nil]
-      def initialize(choices:, usage: nil)
+      def initialize(choices: [], usage: nil)
+        @id = SecureRandom.alphanumeric
         @choices = choices
         @usage = usage
       end
@@ -46,12 +47,33 @@ module OmniAI
         }
       end

+      # @param other [OmniAI::Chat::Payload]
+      def merge!(other)
+        return unless other
+
+        @usage = other.usage if other.usage
+
+        other.choices.each do |choice|
+          if choice?(index: choice.index)
+            choices[choice.index] = choices[choice.index].merge(choice)
+          else
+            @choices[choice.index] = choice
+          end
+        end
+      end
+
       # @param index [Integer]
       # @return [Choice]
       def choice(index: 0)
         @choices[index]
       end

+      # @param index [Integer]
+      # @return [Boolean]
+      def choice?(index: 0)
+        !choice(index:).nil?
+      end
+
       # @param index [Integer]
       # @return [Message]
       def message(index: 0)
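The accumulation pattern that `merge!` implements above can be sketched minimally: each incoming choice either merges into the existing slot at its index or claims an empty slot. The `StreamAccumulator` class and its `(index, text)` shape are hypothetical simplifications, not the gem's API.

```ruby
# Hypothetical, simplified sketch of index-keyed accumulation: the real
# Payload#merge! does the same thing with Choice objects instead of strings.
class StreamAccumulator
  attr_reader :choices

  def initialize
    @choices = []
  end

  # Merge text into the slot at `index`, or claim the slot if empty.
  def merge!(index, text)
    if @choices[index]
      @choices[index] += text
    else
      @choices[index] = text
    end
  end
end

acc = StreamAccumulator.new
acc.merge!(0, "Hel")
acc.merge!(0, "lo")
acc.merge!(1, "Hi")
acc.choices # => ["Hello", "Hi"]
```

Keying by index (rather than appending) is what makes multi-choice streams, where chunks for several choices interleave, fold correctly.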
@@ -2,23 +2,35 @@

 module OmniAI
   class Chat
-    # Used when streaming to process chunks of data.
+    # A stream is used to process a series of chunks of data. It converts the following into a combined payload:
+    #
+    #   { "id":"...", "choices": [{ "index": 0,"delta": { "role" :"assistant", "content":"" } }] }
+    #   { "id":"...", "choices": [{ "index": 0,"delta": { "content" :"A" } }] }
+    #   { "id":"...", "choices": [{ "index": 0,"delta": { "content" :"B" } }] }
+    #   ...
+    #
+    # Every
     class Stream
       # @param logger [OmniAI::Client]
-      # @param body [HTTP::Response::Body]
+      # @param chunks [Enumerable<String>]
       # @param context [Context, nil]
-      def initialize(body:, logger: nil, context: nil)
-        @body = body
+      def initialize(chunks:, logger: nil, context: nil)
+        @chunks = chunks
         @logger = logger
         @context = context
       end

       # @yield [payload]
       # @yieldparam payload [OmniAI::Chat::Payload]
+      #
+      # @return [OmniAI::Chat::Payload]
       def stream!(&block)
-        @body.each do |chunk|
-          parser.feed(chunk) do |type, data, id|
-            process!(type, data, id, &block)
+        OmniAI::Chat::Payload.new.tap do |payload|
+          @chunks.map do |chunk|
+            parser.feed(chunk) do |type, data, id|
+              result = process!(type, data, id, &block)
+              payload.merge!(result) if result.is_a?(OmniAI::Chat::Payload)
+            end
           end
         end
       end
@@ -42,14 +54,15 @@ module OmniAI
       # @param data [String]
       # @param id [String]
       #
-      # @yield [payload]
-      # @yieldparam payload [OmniAI::Chat::Payload]
+      # @return [OmniAI::Chat::Payload, nil]
       def process!(type, data, id, &block)
         log(type, data, id)

         return if data.eql?("[DONE]")

-        block.call(Payload.deserialize(JSON.parse(data), context: @context))
+        payload = Payload.deserialize(JSON.parse(data), context: @context)
+        block&.call(payload)
+        payload
       end

       # @return [EventStreamParser::Parser]
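The overall shape of the new `stream!` behavior can be sketched with plain JSON lines: each chunk is parsed, yielded to the caller as before, and also folded into a combined result that is now returned. This standalone `stream!` function is a hypothetical simplification of the gem's Stream class (no SSE parser, single choice, content only).

```ruby
require "json"

# Hypothetical sketch: parse each data line, yield the per-chunk delta to the
# caller (the pre-existing behavior), and accumulate a combined result (the
# behavior added in this release).
def stream!(lines)
  combined = ""
  lines.each do |data|
    next if data == "[DONE]" # terminal sentinel carries no payload
    delta = JSON.parse(data).dig("choices", 0, "delta", "content").to_s
    yield delta if block_given? # per-chunk callback, as before
    combined += delta           # NEW: fold chunks into a combined payload
  end
  combined
end

chunks = [
  '{"choices":[{"index":0,"delta":{"role":"assistant","content":""}}]}',
  '{"choices":[{"index":0,"delta":{"content":"A"}}]}',
  '{"choices":[{"index":0,"delta":{"content":"B"}}]}',
  "[DONE]",
]
stream!(chunks) # => "AB"
```

This is why `Chat#stream!` below can now hand the caller a complete payload after the stream finishes instead of returning nothing useful.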
data/lib/omniai/chat.rb CHANGED
@@ -162,11 +162,12 @@ module OmniAI
     end

     # @param response [HTTP::Response]
-    # @return [OmniAI::Chat::Stream]
+    #
+    # @return [OmniAI::Chat::Payload]
     def stream!(response:)
       raise Error, "#{self.class.name}#stream! unstreamable" unless @stream

-      self.class::Stream.new(body: response.body, logger:, context:).stream! do |chunk|
+      payload = self.class::Stream.new(chunks: response.body, logger:, context:).stream! do |chunk|
         case @stream
         when IO, StringIO
           if chunk.text
@@ -178,6 +179,8 @@ module OmniAI
       end

       @stream.puts if @stream.is_a?(IO) || @stream.is_a?(StringIO)
+
+      payload
     end

     # @return [HTTP::Response]
data/lib/omniai/cli.rb CHANGED
@@ -10,8 +10,6 @@ module OmniAI
   #   cli = OmniAI::CLI.new
   #   cli.parse
   class CLI
-    ChatArgs = Struct.new(:provider, :model, :temperature)
-
     # @param stdin [IO] a stream
     # @param stdout [IO] a stream
     # @param provider [String] a provider
@@ -58,7 +58,7 @@ module OmniAI
       #   )
       #
       # @param properties [Hash<String, OmniAI::Tool::Property>] required - the properties of the object
-      # @param requird [Array<Symbol>] optional - the required properties
+      # @param required [Array<Symbol>] optional - the required properties
      # @param description [String] optional - a description of the object
       #
       # @return [OmniAI::Tool::Array]
@@ -4,7 +4,17 @@ module OmniAI
   class Transcribe
     # A transcription returned by the API.
     class Transcription
-      attr_accessor :text, :model, :format
+      # @!attribute [rw] text
+      # @return [String]
+      attr_accessor :text
+
+      # @!attribute [rw] model
+      # @return [String]
+      attr_accessor :model
+
+      # @!attribute [rw] format
+      # @return [String]
+      attr_accessor :format

       # @param text [String]
       # @param model [String]
@@ -1,5 +1,5 @@
 # frozen_string_literal: true

 module OmniAI
-  VERSION = "1.9.5"
+  VERSION = "1.9.6"
 end
data/lib/omniai.rb CHANGED
@@ -27,6 +27,7 @@ module OmniAI
       @response = response
     end

+    # @return [String]
     def inspect
       "#<#{self.class} status=#{@response.status} body=#{@response.body}>"
     end
metadata CHANGED
@@ -1,13 +1,13 @@
 --- !ruby/object:Gem::Specification
 name: omniai
 version: !ruby/object:Gem::Version
-  version: 1.9.5
+  version: 1.9.6
 platform: ruby
 authors:
 - Kevin Sylvestre
 bindir: exe
 cert_chain: []
-date: 2025-01-30 00:00:00.000000000 Z
+date: 2025-02-27 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: event_stream_parser
@@ -140,7 +140,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
 - !ruby/object:Gem::Version
   version: '0'
 requirements: []
-rubygems_version: 3.6.2
+rubygems_version: 3.6.3
 specification_version: 4
 summary: A generalized framework for interacting with many AI services
 test_files: []