omniai 2.6.0 → 2.8.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 6b8a243fb8313c7c1fe3a483cdac33892f1b764cac19c38465a880e3f8a790f0
- data.tar.gz: b24f59696a3befee77cde24b8fd25b88331b3664a327e8e207d9f57a08abb524
+ metadata.gz: a9cab46e52805a61d08bf4638ef7ed9ab46549c016ada8740abb7471f3ee5d07
+ data.tar.gz: 3a34ea68b515d8759a76d689f2474648c7e55d51b42a93ee022c18df271e049e
  SHA512:
- metadata.gz: 985aafb6c64874122c77c172ba7dccc1a371a982855b33add51c3371d872db7008391a28382aa7442d640a2f76df1e3ca95426428fb7cd3b12a177d69ed3605e
- data.tar.gz: 42edecdd416fdc53562ea82caf09b94886e761ea2858c28db76dc4e909dd4da87835fe152bbebf3cc87a64a72eded21706c60ba13beba2293fa5470f11e822bd
+ metadata.gz: b241dca1c95f27efd023c09cb6ed9e8193f94e2487490b376a816e5c50d8a6a42e6cb0bc54e47ec3aeba0b12f23de7b3a258d9ca29d222ce6bf414469e2ad777
+ data.tar.gz: b74ba2a9950c821b7ca6bd5d1348165daab7dea3b7372db702d40cdda8d2669c057b32913cf5713b6d75cf3d97a234acf25c1f11af3357cc465740b18ec5990e
data/README.md CHANGED
@@ -147,9 +147,11 @@ end
  The weather is 24° Celsius in London and 42° Fahrenheit in Madrid.
  ```

+ _For a set of pre-built tools for interacting with browsers, databases, Docker, and more, try the [OmniAI::Tools](https://omniai-tools.ksylvest.com/) project._
+
  ### Example #5: [Chat w/ History](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_history)

- Tracking a prompt history over multiple user and assistant messages is especially helpful when building an agent like conversation experience. A prompt can be used to track this back and forth conversation:
+ Tracking a prompt history over multiple user and assistant messages is especially helpful when building an agent-like conversation experience. A prompt can be used to track this back-and-forth conversation:

  ```ruby
  require "omniai/openai"
@@ -176,6 +178,8 @@ end

  ### Example #6 [Chat w/ Schema](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_schema)

+ Requesting structured data back from an LLM is possible by defining a schema, then passing the schema into the chat. The following example defines a structured schema using `OmniAI::Schema` to model a `Contact`. The results of the LLM call are then parsed using the schema to ensure all types are correct.
+
  ```ruby
  format = OmniAI::Schema.format(name: "Contact", schema: OmniAI::Schema.object(
    description: "A contact with a name, relationship, and addresses.",
@@ -224,24 +228,27 @@ puts format.parse(response.text)
  }
  ```

- ### Example #7: [Chat w/ CLI](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_cli)
+ ### Example #7: [CLI](https://github.com/ksylvest/omniai/blob/main/examples/cli)

  The `OmniAI` gem also ships with a CLI to simplify quick tests.

  ```bash
+ # Chat
+
  omniai chat "Who designed the Ruby programming language?"
- ```
+ omniai chat --provider="google" --model="gemini-2.0-flash" "Who are you?"

- ```
- The Ruby programming language was created by Yukihiro Matsumoto, often known as "Matz."
- ```
+ # Text to Speech

- ```bash
- omniai chat --provider="google" --model="gemini-2.0-flash" "Who are you?"
- ```
+ omniai speak "Sally sells sea shells by the sea shore." > ./files/audio.wav

- ```
- I am a large language model, trained by Google.
+ # Speech to Text
+
+ omniai transcribe "./files/audio.wav"
+
+ # Embed
+
+ omniai embed "What is the capital of France?"
  ```

  ### Example #8: [Text-to-Speech](https://github.com/ksylvest/omniai/blob/main/examples/text_to_speech)
@@ -421,10 +428,10 @@ require 'omniai/openai'
  require 'logger'

  logger = Logger.new(STDOUT)
- client = OmniAI::OpenAI::Client.new(timeout: 8) # i.e. 8 seconds
+ client = OmniAI::OpenAI::Client.new(timeout: 8) # 8 seconds
  ```

- Timeouts are also configurable by passing a `timeout` hash with `timeout` / `read` / `write` / keys using:
+ Timeouts are also configurable by passing a `timeout` hash with `timeout` / `read` / `write` keys using:

  ```ruby
  require 'omniai/openai'
@@ -481,7 +488,7 @@ client.chat('Tell me a joke.', stream:)

  #### Completion using Streaming via IO

- The above code can also be supplied any IO (e.g. `File`, `$stdout`, `$stdin`, etc):
+ The above code can also be supplied with any IO object (e.g., `File`, `$stdout`, `$stdin`, etc.):

  ```ruby
  client.chat('Tell me a story', stream: $stdout)
@@ -489,11 +496,11 @@ client.chat('Tell me a story', stream: $stdout)

  #### Completion with Tools

- A chat can also be initialized with tools:
+ A chat can also be initialized using tools:

  ```ruby
  class WeatherTool
- description "Lookup the weather at a location in either Celsius for Fahrenheit."
+ description "Lookup the weather at a location in either Celsius or Fahrenheit."

  parameter :location, :string, description: "The location to find the weather."
  parameter :unit, :string, enum: %w[Celsius Fahrenheit], description: "The unit of measurement."
@@ -504,11 +511,11 @@ class WeatherTool
  #
  # @return [Hash]
  def execute(location:, unit: "Celsius")
- puts "[weather] location=#{locaiton} unit=#{unit}"
+ puts "[weather] location=#{location} unit=#{unit}"

  {
  temperature: "#{rand(20..50)}°",
- humidty: rand(0..100),
+ humidity: rand(0..100),
  }
  end
  end
@@ -538,7 +545,7 @@ end

  ### Speak

- Clients that support speak (e.g. OpenAI w/ "Whisper") convert text to recordings via the following calls:
+ Clients that support speak (e.g. OpenAI w/ "Whisper") convert text to speech via the following calls:

  #### Speech with Stream

@@ -568,12 +575,12 @@ response.usage # <OmniAI::Embed::Usage prompt_tokens=5 total_tokens=5>
  response.embedding # [0.1, 0.2, ...] >
  ```

- Batches of text can also be converted to embeddings via the following:
+ These APIs support generating embeddings in batches using the following code:

  ```ruby
  response = client.embed([
- '',
- '',
+ 'The quick brown fox jumps over a lazy dog',
+ 'Pack my box with five dozen liquor jugs',
  ])
  response.usage # <OmniAI::Embed::Usage prompt_tokens=5 total_tokens=5>
  response.embeddings.each do |embedding|
@@ -583,7 +590,7 @@ end

  ## CLI

- OmniAI packages a basic command line interface (CLI) to allow for exploration of various APIs. A detailed CLI documentation can be found via help:
+ OmniAI packages a basic command line interface (CLI) to allow for exploration of various APIs. CLI documentation is available with the `--help` flag:

  ```bash
  omniai --help
@@ -646,6 +653,18 @@ The capital of Spain is **Madrid**.
  ...
  ```

+ #### Text-to-Speech
+
+ ```bash
+ omniai speak "Sally sells sea shells on the sea shore." > audio.aac
+ ```
+
+ #### Speech-to-Text
+
+ ```bash
+ omniai transcribe ./audio.aac
+ ```
+
  ### MCP

  [MCP](https://modelcontextprotocol.io/introduction) is an open protocol designed to standardize giving context to LLMs. The OmniAI implementation supports building an MCP server that operates via the [stdio](https://modelcontextprotocol.io/docs/concepts/transports) transport.
@@ -31,7 +31,7 @@ module OmniAI

  # @return [OmniAI::Client]
  def client
- @client ||= OmniAI::Client.find(provider: @provider)
+ @client ||= OmniAI.client(provider: @provider)
  end
  end
  end
@@ -0,0 +1,59 @@
+ # frozen_string_literal: true
+
+ module OmniAI
+   class CLI
+     # Used by CLI to process commands like:
+     #
+     #   omniai speak "Sally sells seashells by the seashore." > ./audio.aac
+     class SpeakHandler < BaseHandler
+       # @param argv [Array<String>]
+       def handle!(argv:)
+         parser.parse!(argv)
+
+         speak(argv.join(" "))
+       end
+
+       private
+
+       # @param input [String]
+       def speak(input)
+         client.speak(input, **@args) do |chunk|
+           @stdout.write(chunk)
+         end
+         @stdout.flush
+       end
+
+       # @return [OptionParser]
+       def parser
+         OptionParser.new do |options|
+           options.banner = 'usage: omniai speak [options] "<prompt>"'
+
+           options.on("-h", "--help", "help") do
+             @stdout.puts(options)
+             exit
+           end
+
+           options.on("-p", "--provider=PROVIDER", "provider") do |provider|
+             @provider = provider
+           end
+
+           options.on("-m", "--model=MODEL", "model") do |model|
+             @args[:model] = model
+           end
+
+           options.on("-v", "--voice=VOICE", "voice") do |voice|
+             @args[:voice] = voice
+           end
+
+           options.on("-s", "--speed=SPEED", Float, "speed") do |speed|
+             @args[:speed] = speed
+           end
+
+           options.on("-f", "--format=FORMAT", "format") do |format|
+             @args[:format] = format
+           end
+         end
+       end
+     end
+   end
+ end
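The handler above leans on Ruby's stdlib `OptionParser`: flags are collected into an args hash while `parse!` strips them from `argv`, leaving the positional words that form the prompt. A minimal standalone sketch of the same pattern (the flag values below are illustrative, not from the gem):

```ruby
require "optparse"

# Collect CLI flags into a hash, mutating argv so only positional
# arguments remain -- the same shape as SpeakHandler#parser above.
args = {}
parser = OptionParser.new do |options|
  options.banner = 'usage: speak [options] "<prompt>"'

  options.on("-m", "--model=MODEL", "model") { |model| args[:model] = model }
  options.on("-v", "--voice=VOICE", "voice") { |voice| args[:voice] = voice }
  options.on("-s", "--speed=SPEED", Float, "speed") { |speed| args[:speed] = speed }
end

argv = ["--model=tts-1", "--speed=1.5", "Hello", "world"]
parser.parse!(argv) # removes recognized flags in place; Float coerces "1.5"

puts args.inspect   # the flags become keyword arguments for the client
puts argv.join(" ") # the leftover words become the prompt
```

`parse!` (with the bang) consumes recognized options destructively, which is why the handler can then treat the remaining `argv` as the prompt text.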
@@ -0,0 +1,56 @@
+ # frozen_string_literal: true
+
+ module OmniAI
+   class CLI
+     # Used by CLI to process commands like:
+     #
+     #   omniai transcribe ./audio.wav
+     class TranscribeHandler < BaseHandler
+       # @param argv [Array<String>]
+       def handle!(argv:)
+         parser.parse!(argv)
+
+         argv.each do |path|
+           transcribe(path)
+         end
+       end
+
+       private
+
+       # @param input [String]
+       def transcribe(input)
+         File.open(input, "rb") do |file|
+           @stdout.puts(client.transcribe(file, **@args).text)
+         end
+       end
+
+       # @return [OptionParser]
+       def parser
+         OptionParser.new do |options|
+           options.banner = 'usage: omniai transcribe [options] "<file>"'
+
+           options.on("-h", "--help", "help") do
+             @stdout.puts(options)
+             exit
+           end
+
+           options.on("-p", "--provider=PROVIDER", "provider") do |provider|
+             @provider = provider
+           end
+
+           options.on("-m", "--model=MODEL", "model") do |model|
+             @args[:model] = model
+           end
+
+           options.on("-l", "--language=LANGUAGE", "language") do |language|
+             @args[:language] = language
+           end
+
+           options.on("-f", "--format=FORMAT", "format") do |format|
+             @args[:format] = format
+           end
+         end
+       end
+     end
+   end
+ end
data/lib/omniai/cli.rb CHANGED
@@ -13,7 +13,7 @@ module OmniAI
  # @param stdin [IO] a stream
  # @param stdout [IO] a stream
  # @param provider [String] a provider
- def initialize(stdin: $stdin, stdout: $stdout, provider: "openai")
+ def initialize(stdin: $stdin, stdout: $stdout, provider: nil)
  @stdin = stdin
  @stdout = stdout
  @provider = provider
@@ -30,6 +30,8 @@ module OmniAI
  case command
  when "chat" then ChatHandler
  when "embed" then EmbedHandler
+ when "speak" then SpeakHandler
+ when "transcribe" then TranscribeHandler
  else raise Error, "unsupported command=#{command.inspect}"
  end

@@ -60,6 +62,9 @@ module OmniAI
  options.separator <<~COMMANDS
  commands:
  chat
+ embed
+ speak
+ transcribe
  COMMANDS
  end
  end
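The dispatch in `cli.rb` is a plain `case` expression mapping a command word to a handler class and raising on anything unrecognized. The pattern can be sketched standalone (the handler classes here are empty stubs, and `ArgumentError` stands in for the gem's own `Error`):

```ruby
# Stub handler classes standing in for the real CLI handlers.
ChatHandler       = Class.new
EmbedHandler      = Class.new
SpeakHandler      = Class.new
TranscribeHandler = Class.new

# Map a command word to its handler class, raising on unknown commands,
# mirroring the case/when dispatch in OmniAI::CLI.
def handler_for(command)
  case command
  when "chat" then ChatHandler
  when "embed" then EmbedHandler
  when "speak" then SpeakHandler
  when "transcribe" then TranscribeHandler
  else raise ArgumentError, "unsupported command=#{command.inspect}"
  end
end

puts handler_for("speak") # → SpeakHandler
```

Returning the class (rather than an instance) lets the caller decide how to construct the handler, which keeps the dispatch table free of constructor arguments.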
data/lib/omniai/client.rb CHANGED
@@ -34,69 +34,93 @@ module OmniAI

  # Initialize a client for Anthropic. This method requires the provider if it is undefined.
  #
- # @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+ # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
  #
  # @return [Class<OmniAI::Client>]
  def self.anthropic
  require "omniai/anthropic" unless defined?(OmniAI::Anthropic::Client)
  OmniAI::Anthropic::Client
  rescue ::LoadError
- raise Error, "requires 'omniai-anthropic': `gem install omniai-anthropic`"
+ raise LoadError, "requires 'omniai-anthropic': `gem install omniai-anthropic`"
  end

  # Initialize a client for DeepSeek. This method requires the provider if it is undefined.
  #
- # @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+ # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
  #
  # @return [Class<OmniAI::Client>]
  def self.deepseek
  require "omniai/deepseek" unless defined?(OmniAI::DeepSeek::Client)
  OmniAI::DeepSeek::Client
  rescue ::LoadError
- raise Error, "requires 'omniai-deepseek': `gem install omniai-deepseek`"
+ raise LoadError, "requires 'omniai-deepseek': `gem install omniai-deepseek`"
  end

  # Lookup the `OmniAI::Google::Client`. This method requires the provider if it is undefined.
  #
- # @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+ # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
  #
  # @return [Class<OmniAI::Client>]
  def self.google
  require "omniai/google" unless defined?(OmniAI::Google::Client)
  OmniAI::Google::Client
  rescue ::LoadError
- raise Error, "requires 'omniai-google': `gem install omniai-google`"
+ raise LoadError, "requires 'omniai-google': `gem install omniai-google`"
  end

  # Initialize a client for Mistral. This method requires the provider if it is undefined.
  #
- # @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+ # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
  #
  # @return [Class<OmniAI::Client>]
  def self.mistral
  require "omniai/mistral" unless defined?(OmniAI::Mistral::Client)
  OmniAI::Mistral::Client
  rescue ::LoadError
- raise Error, "requires 'omniai-mistral': `gem install omniai-mistral`"
+ raise LoadError, "requires 'omniai-mistral': `gem install omniai-mistral`"
  end

  # Initialize a client for OpenAI. This method requires the provider if it is undefined.
  #
- # @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+ # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
  #
  # @return [Class<OmniAI::Client>]
  def self.openai
  require "omniai/openai" unless defined?(OmniAI::OpenAI::Client)
  OmniAI::OpenAI::Client
  rescue ::LoadError
- raise Error, "requires 'omniai-openai': `gem install omniai-openai`"
+ raise LoadError, "requires 'omniai-openai': `gem install omniai-openai`"
+ end
+
+ # Discover a client by provider ('openai' then 'anthropic' then 'google' then 'mistral' then 'deepseek').
+ #
+ # @raise [OmniAI::LoadError] if no providers are installed
+ #
+ # @return [OmniAI::Client]
+ def self.discover(**)
+   %i[openai anthropic google mistral deepseek].each do |provider|
+     return find(provider:, **)
+   rescue LoadError
+     next
+   end
+
+   raise LoadError, <<~TEXT
+     requires 'omniai-openai' or 'omniai-anthropic' or 'omniai-deepseek' or 'omniai-google' or 'omniai-mistral':
+
+     `gem install omniai-openai`
+     `gem install omniai-anthropic`
+     `gem install omniai-deepseek`
+     `gem install omniai-google`
+     `gem install omniai-mistral`
+   TEXT
  end

  # Initialize a client by provider (e.g. 'openai'). This method attempts to require the provider.
  #
- # @param provider [String, Symbol] required (e.g. 'anthropic', 'deepsek', 'google', 'mistral', 'openai', etc)

- # @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+ # @param provider [String, Symbol] required (e.g. 'anthropic', 'deepseek', 'google', 'mistral', 'openai', etc)
+ #
+ # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
  #
  # @return [OmniAI::Client]
  def self.find(provider:, **)
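The new `discover` method tries each provider in a fixed order and falls through on `LoadError` until one client loads. Its control flow (a `return` inside an `each` block plus a block-level `rescue`/`next`) can be sketched standalone, with lambdas standing in for the real provider gems:

```ruby
# Try a list of loaders in order and return the first that succeeds,
# mimicking OmniAI::Client.discover's rescue-and-next loop.
class NotInstalled < StandardError; end

LOADERS = {
  openai:    -> { raise NotInstalled },   # pretend this gem is missing
  anthropic: -> { :anthropic_client },    # pretend this gem is installed
  google:    -> { raise NotInstalled },
}.freeze

def discover
  LOADERS.each_value do |loader|
    return loader.call # `return` exits discover, not just the block
  rescue NotInstalled
    next               # fall through to the next provider
  end

  raise NotInstalled, "no provider gems are installed"
end

puts discover # → anthropic_client
```

The block-level `rescue` without an explicit `begin` requires Ruby 2.6+, which the gem's use of `provider:` shorthand (3.0+) already implies.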
@@ -0,0 +1,7 @@
+ # frozen_string_literal: true
+
+ module OmniAI
+   # An error that wraps a ::LoadError for a missing library.
+   class LoadError < Error
+   end
+ end
data/lib/omniai/speak.rb CHANGED
@@ -23,7 +23,11 @@ module OmniAI
  # end
  # end
  #
- # client.transcribe(File.open("..."), model: "...", format: :json)
+ # File.open('audio.wav', 'wb') do |file|
+ #   client.speak('Sally sells seashells by the seashore.', format: OmniAI::Speak::Format::WAV) do |chunk|
+ #     file << chunk
+ #   end
+ # end
  class Speak
  module Format
  AAC = "aac"
@@ -34,6 +38,8 @@ module OmniAI
  WAV = "wav"
  end

+ DEFAULT_FORMAT = Format::AAC
+
  # @raise [HTTPError]
  #
  # @param client [OmniAI::Client] required
@@ -52,7 +58,7 @@ module OmniAI
  # @yield [chunk]
  #
  # @return [Tempfile]
- def self.process!(input, client:, model:, voice:, speed: nil, format: nil, &)
+ def self.process!(input, client:, model:, voice:, speed: nil, format: DEFAULT_FORMAT, &)
  new(input, client:, model:, voice:, speed:, format:).process!(&)
  end

@@ -68,7 +74,7 @@ module OmniAI
  # - "opus"
  # - "pcm"
  # - "wav"
- def initialize(input, client:, model:, voice:, speed: nil, format: nil)
+ def initialize(input, client:, model:, voice:, speed: nil, format: DEFAULT_FORMAT)
  @input = input
  @client = client
  @model = model
@@ -23,7 +23,9 @@ module OmniAI
  # end
  # end
  #
- # client.transcribe(File.open("..."), model: "...", format: :json)
+ # File.open('audio.wav', 'rb') do |file|
+ #   client.transcribe(file, model: "...", format: :json)
+ # end
  class Transcribe
  module Language
  AFRIKAANS = "af"
@@ -1,5 +1,5 @@
  # frozen_string_literal: true

  module OmniAI
- VERSION = "2.6.0"
+ VERSION = "2.8.0"
  end
data/lib/omniai.rb CHANGED
@@ -16,6 +16,68 @@ loader.inflector.inflect "http_error" => "HTTPError"
  loader.inflector.inflect "ssl_error" => "SSLError"
  loader.setup

+ # @example
+ #
+ #   OmniAI.chat(...)
+ #   OmniAI.transcribe(...)
+ #   OmniAI.speak(...)
+ #   OmniAI.embed(...)
  module OmniAI
  class Error < StandardError; end
+
+ # Discover a client by provider ('openai' then 'anthropic' then 'google' then 'mistral' then 'deepseek').
+ #
+ # @param provider [Symbol] the provider to use (e.g. :openai, :anthropic, :google, :mistral, :deepseek) - optional
+ #
+ # @raise [OmniAI::LoadError] if no providers are installed
+ #
+ # @return [OmniAI::Client]
+ def self.client(provider: nil, **)
+   provider ? OmniAI::Client.find(provider:, **) : OmniAI::Client.discover(**)
+ end
+
+ # @example
+ #   response = OmniAI.chat("What is the capital of Spain?")
+ #   puts response.text
+ #
+ # @example
+ #   OmniAI.chat(stream: $stdout) do |prompt|
+ #     prompt.system("Answer in both English and French.")
+ #     prompt.user("How many people live in Tokyo?")
+ #   end
+ #
+ # @see OmniAI::Client#chat
+ def self.chat(...)
+   client.chat(...)
+ end
+
+ # @example
+ #   File.open("audio.wav", "rb") do |file|
+ #     puts OmniAI.transcribe(file).text
+ #   end
+ #
+ # @see OmniAI::Client#transcribe
+ def self.transcribe(...)
+   client.transcribe(...)
+ end
+
+ # @example
+ #   File.open("audio.wav", "wb") do |file|
+ #     OmniAI.speak("Sally sells seashells by the seashore.") do |chunk|
+ #       file << chunk
+ #     end
+ #   end
+ #
+ # @see OmniAI::Client#speak
+ def self.speak(...)
+   client.speak(...)
+ end
+
+ # @example
+ #   embedding = OmniAI.embed("The quick brown fox jumps over the lazy dog.").embedding
+ #
+ # @see OmniAI::Client#embed
+ def self.embed(...)
+   client.embed(...)
+ end
  end
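The module-level helpers added to `omniai.rb` delegate to a lazily-built client using Ruby's `...` argument-forwarding shorthand (Ruby 3.0+), which passes positional, keyword, and block arguments through unchanged. A standalone sketch of the pattern, with a stub in place of a real provider client:

```ruby
# Delegate module-level helpers to a memoized client object,
# forwarding all arguments with `...` as the OmniAI facade does.
module Facade
  # Stub standing in for a provider-backed client.
  class StubClient
    def chat(prompt, upcase: false)
      upcase ? prompt.upcase : prompt
    end
  end

  def self.client
    @client ||= StubClient.new # memoized, built on first use
  end

  def self.chat(...)
    client.chat(...) # `...` forwards positional, keyword, and block args
  end
end

puts Facade.chat("hello", upcase: true) # → HELLO
```

Because `...` forwards everything verbatim, the facade methods never need updating when the underlying client's signatures change.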
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: omniai
  version: !ruby/object:Gem::Version
- version: 2.6.0
+ version: 2.8.0
  platform: ruby
  authors:
  - Kevin Sylvestre
  bindir: exe
  cert_chain: []
- date: 2025-05-08 00:00:00.000000000 Z
+ date: 2025-05-22 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: event_stream_parser
@@ -103,6 +103,8 @@ files:
  - lib/omniai/cli/base_handler.rb
  - lib/omniai/cli/chat_handler.rb
  - lib/omniai/cli/embed_handler.rb
+ - lib/omniai/cli/speak_handler.rb
+ - lib/omniai/cli/transcribe_handler.rb
  - lib/omniai/client.rb
  - lib/omniai/config.rb
  - lib/omniai/context.rb
@@ -112,6 +114,7 @@ files:
  - lib/omniai/files.rb
  - lib/omniai/http_error.rb
  - lib/omniai/instrumentation.rb
+ - lib/omniai/load_error.rb
  - lib/omniai/mcp/jrpc.rb
  - lib/omniai/mcp/jrpc/error.rb
  - lib/omniai/mcp/jrpc/request.rb