omniai 2.7.0 → 2.8.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 488be723c0a59ca0095eb3a1f64081cceff14ae982208e598da5058ee0ea0c4b
- data.tar.gz: 47805f82344429912eaa2e0f00299cc95868d35d04a6d424f698b719bf9d0fe8
+ metadata.gz: a80f4da0021307bce467f92f209309aaea62060e93fbb4201a7fc059b0376cea
+ data.tar.gz: 329e4a51e2e8b1d66ed5eca4bf7979acc1111f720d62fdfb638d850ca9ff53d8
  SHA512:
- metadata.gz: 1c0d66e74c7a0e1eb0296e371b0339d44cedbb7a747c2beeb7cb0547303140d0a6550fd8d7fc6629c09804c57cfcf780dec38f01f6c8aac49004ab28e9968872
- data.tar.gz: aeb129f452a6ea0a9c2e33a25afbe5f8be1e7b27445fb58399556e7bdd0475b0840b36465c6b33edbdf857c6d943c10824b0cc21b7e6a917d161829e4ca033a6
+ metadata.gz: 5ce5f9abf61e35856183ebfdf344f41d2961f67140abdd19b48464203a10358e368b606ec8d775e32a515e202c87b69c205a3dc688743ddb072ef70962753948
+ data.tar.gz: f517cf093a2304dd4d6562bdf8a96c78480523063e9ff21463b08b2214064efdd65782af50ef070c836502e54dccce37d66f3636027c41409a62dff1abd2eb83
data/README.md CHANGED
@@ -10,13 +10,14 @@ OmniAI provides a unified Ruby API for integrating with multiple AI providers, i
 
  - [OmniAI::Anthropic](https://github.com/ksylvest/omniai-anthropic)
  - [OmniAI::DeepSeek](https://github.com/ksylvest/omniai-deepseek)
+ - [OmniAI::Llama](https://github.com/ksylvest/omniai-llama)
  - [OmniAI::Google](https://github.com/ksylvest/omniai-google)
  - [OmniAI::Mistral](https://github.com/ksylvest/omniai-mistral)
  - [OmniAI::OpenAI](https://github.com/ksylvest/omniai-openai)
 
- ## Examples
+ ## 📄 Examples
 
- ### Example #1: [Chat w/ Text](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_text)
+ ### Example #1: [💬 Chat w/ Text](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_text)
 
  This example demonstrates using `OmniAI` with **Anthropic** to ask for a joke. The response is parsed and printed.
 
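The example file itself is elided between hunks; a minimal sketch of what this description refers to, assuming the Anthropic client is constructed the same way as the other provider clients shown in the Usage section below, is:

```ruby
require 'omniai/anthropic'

# Build an Anthropic-backed client and print the text of a single completion
# (the chat call matches the context line shown in the next hunk).
client = OmniAI::Anthropic::Client.new

puts client.chat("Tell me a joke").text
```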
@@ -32,7 +33,7 @@ puts client.chat("Tell me a joke").text
  Why don't scientists trust atoms? Because they make up everything!
  ```
 
- ### Example #2: [Chat w/ Prompt](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_prompt)
+ ### Example #2: [💬 Chat w/ Prompt](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_prompt)
 
  This example demonstrates using `OmniAI` with **Mistral** to ask for the fastest animal. It includes a system and user message in the prompt. The response is streamed in real time.
 
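The streaming example body is likewise elided; a sketch under the assumption that the prompt-builder block exposes `prompt.system` and `prompt.user` (an assumption based on this example's description, not shown in this diff) looks roughly like:

```ruby
require 'omniai/mistral'

client = OmniAI::Mistral::Client.new

# Stream the reply to STDOUT as chunks arrive; the block builds the prompt
# from a system message and a user message (method names are assumptions).
client.chat(stream: $stdout) do |prompt|
  prompt.system 'Respond in both English and French.'
  prompt.user 'What is the fastest animal?'
end
```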
@@ -52,7 +53,7 @@ end
  **French**: Le faucon pèlerin est généralement considéré comme l'animal le plus rapide, atteignant des vitesses de plus de 390 km/h.
  ```
 
- ### Example #3: [Chat w/ Vision](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_vision)
+ ### Example #3: [💬 Chat w/ Vision](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_vision)
 
  This example demonstrates using `OmniAI` with **OpenAI** to prompt a “biologist” for an analysis of photos, identifying the animals within each one. A system and user message are provided, and the response is streamed in real time.
 
@@ -79,7 +80,7 @@ The first photo is of a cat, *Felis Catus*.
  The second photo is of a dog, *Canis Familiaris*.
  ```
 
- ### Example #4: [Chat w/ Tools](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_tools)
+ ### Example #4: [💬 Chat w/ Tools](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_tools)
 
  This example demonstrates using `OmniAI` with **Google** to ask for the weather. A tool “Weather” is provided. The tool accepts a location and unit (Celsius or Fahrenheit) then calculates the weather. The LLM makes multiple tool-call requests and is automatically provided with a tool-call response prior to streaming in real-time the result.
 
@@ -149,7 +150,7 @@ The weather is 24° Celsius in London and 42° Fahrenheit in Madrid.
 
  _For a set of pre-built tools for interacting with browsers, databases, docker, and more try the [OmniAI::Tools](https://omniai-tools.ksylvest.com/) project._
 
- ### Example #5: [Chat w/ History](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_history)
+ ### Example #5: [💬 Chat w/ History](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_history)
 
  Tracking a prompt history over multiple user and assistant messages is especially helpful when building an agent like conversation experience. A prompt can be used to track this back-and-forth conversation:
 
@@ -176,7 +177,7 @@ loop do
  end
  ```
 
- ### Example #6 [Chat w/ Schema](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_schema)
+ ### Example #6 [💬 Chat w/ Schema](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_schema)
 
  Requesting structured data back from an LLM is possible by defining a schema, then passing the schema into the chat. The following example defines a structured schema using `OmniAI::Schema` to model a `Contact`. The results of the LLM call are then parsed using the schema to ensure all types are correct.
 
@@ -228,27 +229,30 @@ puts format.parse(response.text)
  }
  ```
 
- ### Example #7: [Chat w/ CLI](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_cli)
+ ### Example #7: [🐚 CLI](https://github.com/ksylvest/omniai/blob/main/examples/cli)
 
  The `OmniAI` gem also ships with a CLI to simplify quick tests.
 
  ```bash
+ # Chat
+
  omniai chat "Who designed the Ruby programming language?"
- ```
+ omniai chat --provider="google" --model="gemini-2.0-flash" "Who are you?"
 
- ```
- The Ruby programming language was created by Yukihiro Matsumoto, often known as "Matz."
- ```
+ # Text to Speech
 
- ```bash
- omniai chat --provider="google" --model="gemini-2.0-flash" "Who are you?"
- ```
+ omniai speak "Sally sells sea shells by the sea shore." > ./files/audio.wav
 
- ```
- I am a large language model, trained by Google.
+ # Speech to Text
+
+ omniai transcribe "./files/audio.wav"
+
+ # Embed
+
+ omniai embed "What is the capital of France?"
  ```
 
- ### Example #8: [Text-to-Speech](https://github.com/ksylvest/omniai/blob/main/examples/text_to_speech)
+ ### Example #8: [🔈 Text-to-Speech](https://github.com/ksylvest/omniai/blob/main/examples/text_to_speech)
 
  This example demonstrates using `OmniAI` with **OpenAI** to convert text to speech and save it to a file.
 
@@ -264,7 +268,7 @@ File.open(File.join(__dir__, 'audio.wav'), 'wb') do |file|
  end
  ```
 
- ### Example #9: [Speech-to-Text](https://github.com/ksylvest/omniai/blob/main/examples/speech_to_text)
+ ### Example #9: [🎤 Speech-to-Text](https://github.com/ksylvest/omniai/blob/main/examples/speech_to_text)
 
  This example demonstrates using `OmniAI` with **OpenAI** to convert speech to text.
 
@@ -279,7 +283,7 @@ File.open(File.join(__dir__, 'audio.wav'), 'rb') do |file|
  end
  ```
 
- ### Example #10: [Embeddings](https://github.com/ksylvest/omniai/blob/main/examples/embeddings)
+ ### Example #10: [💻 Embeddings](https://github.com/ksylvest/omniai/blob/main/examples/embeddings)
 
  This example demonstrates using `OmniAI` with **Mistral** to generate embeddings for a dataset. It defines a set of entries (e.g. "George is a teacher." or "Ringo is a doctor.") and then compares the embeddings generated from a query (e.g. "What does George do?" or "Who is a doctor?") to rank the entries by relevance.
 
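The ranking this example describes reduces to cosine similarity between the query embedding and each entry embedding; a self-contained sketch (the `similarity` helper and sample entries are illustrative, and `client.embed(...).embeddings` is assumed to return one vector per input, as in the Embeddings usage section later in this file):

```ruby
require 'omniai/mistral'

client = OmniAI::Mistral::Client.new

# Cosine similarity between two embedding vectors (illustrative helper).
def similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

entries = ['George is a teacher.', 'Ringo is a doctor.', 'Paul is a plumber.']
vectors = entries.map { |entry| client.embed(entry).embeddings.first }

query = client.embed('Who is a doctor?').embeddings.first
best_entry, = entries.zip(vectors).max_by { |_entry, vector| similarity(query, vector) }
puts best_entry
```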
@@ -322,7 +326,7 @@ search('Who do you call to fix a toilet?')
  'Who do you call to fix a toilet?': 'Paul is a plumber.'
  ```
 
- ## Installation
+ ## 📦 Installation
 
  The main `omniai` gem is installed with:
 
@@ -340,7 +344,7 @@ gem install omniai-google
  gem install omniai-openai
  ```
 
- ## Usage
+ ## 📖 Usage
 
  OmniAI implements APIs for a number of popular clients by default. A client can be initialized using the specific gem (e.g. `omniai-openai` for `OmniAI::OpenAI`). Vendor specific docs can be found within each repo.
 
@@ -362,6 +366,14 @@ require 'omniai/deepseek'
  client = OmniAI::DeepSeek::Client.new
  ```
 
+ #### [OmniAI::Llama](https://github.com/ksylvest/omniai-llama)
+
+ ```ruby
+ require 'omniai/llama'
+
+ client = OmniAI::Llama::Client.new
+ ```
+
  #### [OmniAI::Google](https://github.com/ksylvest/omniai-google)
 
  ```ruby
@@ -442,7 +454,7 @@ client = OmniAI::OpenAI::Client.new(timeout: {
  })
  ```
 
- ### Chat
+ ### 💬 Chat
 
  Clients that support chat (e.g. Anthropic w/ "Claude", Google w/ "Gemini", Mistral w/ "LeChat", OpenAI w/ "ChatGPT", etc) generate completions using the following calls:
 
@@ -520,7 +532,7 @@ end
  client.chat('What is the weather in "London" in Celsius and "Paris" in Fahrenheit?', tools: [WeatherTool.new])
  ```
 
- ### Transcribe
+ ### 🎤 Speech to Text
 
  Clients that support transcribe (e.g. OpenAI w/ "Whisper") convert recordings to text via the following calls:
 
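The transcription call shape is visible from the surrounding context lines and the new CLI handler later in this diff; a minimal sketch combining them (the `example.ogg` path is illustrative):

```ruby
require 'omniai/openai'

client = OmniAI::OpenAI::Client.new

# Convert a recording to text; `.text` is the same accessor the new
# `omniai transcribe` handler uses on the transcription result.
File.open('example.ogg', 'rb') do |file|
  puts client.transcribe(file).text
end
```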
@@ -540,7 +552,7 @@ File.open("example.ogg", "rb") do |file|
  end
  ```
 
- ### Speak
+ ### 🔈 Text to Speech
 
  Clients that support speak (e.g. OpenAI w/ "Whisper") convert text to speech via the following calls:
 
@@ -562,7 +574,7 @@ tempfile.close
  tempfile.unlink
  ```
 
- ### Embeddings
+ ### 💻 Embeddings
 
  Clients that support generating embeddings (e.g. OpenAI, Mistral, etc.) convert text to embeddings via the following:
 
@@ -585,7 +597,7 @@ response.embeddings.each do |embedding|
  end
  ```
 
- ## CLI
+ ## 🐚 CLI
 
  OmniAI packages a basic command line interface (CLI) to allow for exploration of various APIs. CLI documentation is available with the `--help` flag:
 
@@ -650,7 +662,19 @@ The capital of Spain is **Madrid**.
  ...
  ```
 
- ### MCP
+ #### Text-to-Speech
+
+ ```bash
+ omniai speak "Sally sells sea shells on the sea shore." > audio.aac
+ ```
+
+ #### Speech-to-Text
+
+ ```bash
+ omniai transcribe ./audio.aac
+ ```
+
+ ## MCP
 
  [MCP](https://modelcontextprotocol.io/introduction) is an open protocol designed to standardize giving context to LLMs. The OmniAI implementation supports building an MCP server that operates via the [stdio](https://modelcontextprotocol.io/docs/concepts/transports) transport.
 
data/lib/omniai/cli/base_handler.rb CHANGED
@@ -31,7 +31,7 @@ module OmniAI
 
  # @return [OmniAI::Client]
  def client
- @client ||= OmniAI::Client.find(provider: @provider)
+ @client ||= OmniAI.client(provider: @provider)
  end
  end
  end
data/lib/omniai/cli/speak_handler.rb ADDED
@@ -0,0 +1,59 @@
+ # frozen_string_literal: true
+
+ module OmniAI
+ class CLI
+ # Used by CLI to process commands like:
+ #
+ # omniai speak "Sally sells seashells by the seashore." > ./audio.aac
+ class SpeakHandler < BaseHandler
+ # @param argv [Array<String>]
+ def handle!(argv:)
+ parser.parse!(argv)
+
+ speak(argv.join(" "))
+ end
+
+ private
+
+ # @param input [String]
+ def speak(input)
+ client.speak(input, **@args) do |chunk|
+ @stdout.write(chunk)
+ end
+ @stdout.flush
+ end
+
+ # @return [OptionParser]
+ def parser
+ OptionParser.new do |options|
+ options.banner = 'usage: omniai speak [options] "<prompt>"'
+
+ options.on("-h", "--help", "help") do
+ @stdout.puts(options)
+ exit
+ end
+
+ options.on("-p", "--provider=PROVIDER", "provider") do |provider|
+ @provider = provider
+ end
+
+ options.on("-m", "--model=MODEL", "model") do |model|
+ @args[:model] = model
+ end
+
+ options.on("-v", "--voice=VOICE", "voice") do |voice|
+ @args[:voice] = voice
+ end
+
+ options.on("-s", "--speed=SPEED", Float, "speed") do |speed|
+ @args[:speed] = speed
+ end
+
+ options.on("-f", "--format=FORMAT", "format") do |format|
+ @args[:format] = format
+ end
+ end
+ end
+ end
+ end
+ end
data/lib/omniai/cli/transcribe_handler.rb ADDED
@@ -0,0 +1,56 @@
+ # frozen_string_literal: true
+
+ module OmniAI
+ class CLI
+ # Used by CLI to process commands like:
+ #
+ # omniai transcribe ./audio.wav
+ class TranscribeHandler < BaseHandler
+ # @param argv [Array<String>]
+ def handle!(argv:)
+ parser.parse!(argv)
+
+ argv.each do |path|
+ transcribe(path)
+ end
+ end
+
+ private
+
+ # @param input [String]
+ def transcribe(input)
+ File.open(input, "rb") do |file|
+ @stdout.puts(client.transcribe(file, **@args).text)
+ end
+ end
+
+ # @return [OptionParser]
+ def parser
+ OptionParser.new do |options|
+ options.banner = 'usage: omniai transcribe [options] <path>'
+
+ options.on("-h", "--help", "help") do
+ @stdout.puts(options)
+ exit
+ end
+
+ options.on("-p", "--provider=PROVIDER", "provider") do |provider|
+ @provider = provider
+ end
+
+ options.on("-m", "--model=MODEL", "model") do |model|
+ @args[:model] = model
+ end
+
+ options.on("-l", "--language=LANGUAGE", "language") do |language|
+ @args[:language] = language
+ end
+
+ options.on("-f", "--format=FORMAT", "format") do |format|
+ @args[:format] = format
+ end
+ end
+ end
+ end
+ end
+ end
data/lib/omniai/cli.rb CHANGED
@@ -13,7 +13,7 @@ module OmniAI
  # @param stdin [IO] a stream
  # @param stdout [IO] a stream
  # @param provider [String] a provider
- def initialize(stdin: $stdin, stdout: $stdout, provider: "openai")
+ def initialize(stdin: $stdin, stdout: $stdout, provider: nil)
  @stdin = stdin
  @stdout = stdout
  @provider = provider
@@ -30,6 +30,8 @@ module OmniAI
  case command
  when "chat" then ChatHandler
  when "embed" then EmbedHandler
+ when "speak" then SpeakHandler
+ when "transcribe" then TranscribeHandler
  else raise Error, "unsupported command=#{command.inspect}"
  end
 
@@ -60,6 +62,9 @@ module OmniAI
  options.separator <<~COMMANDS
  commands:
  chat
+ embed
+ speak
+ transcribe
  COMMANDS
  end
  end
data/lib/omniai/client.rb CHANGED
@@ -56,6 +56,18 @@ module OmniAI
  raise LoadError, "requires 'omniai-deepseek': `gem install omniai-deepseek`"
  end
 
+ # Lookup the `OmniAI::Llama::Client``. This method requires the provider if it is undefined.
+ #
+ # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
+ #
+ # @return [Class<OmniAI::Client>]
+ def self.llama
+ require "omniai/llama" unless defined?(OmniAI::Llama::Client)
+ OmniAI::Llama::Client
+ rescue ::LoadError
+ raise LoadError, "requires 'omniai-llama': `gem install omniai-llama`"
+ end
+
  # Lookup the `OmniAI::Google::Client``. This method requires the provider if it is undefined.
  #
  # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
@@ -105,11 +117,12 @@ module OmniAI
  end
 
  raise LoadError, <<~TEXT
- requires 'omniai-openai' or 'omniai-anthropic' or 'openai-deepseek' or 'omniai-google' or 'omniai-mistral':
+ Please run one of the following commands to install a provider specific gem:
 
  `gem install omniai-openai`
  `gem install omniai-anthropic`
  `gem install omniai-deepseek`
+ `gem install omniai-llama`
  `gem install omniai-google`
  `gem install omniai-mistral`
  TEXT
@@ -118,7 +131,7 @@ module OmniAI
  # Initialize a client by provider (e.g. 'openai'). This method attempts to require the provider.
  #
  #
- # @param provider [String, Symbol] required (e.g. 'anthropic', 'deepseek', 'google', 'mistral', 'openai', etc)
+ # @param provider [String, Symbol] required (e.g. 'anthropic', 'deepseek', 'google', 'mistral', 'openai', 'llama')
  #
  # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
  #
@@ -129,6 +142,7 @@ module OmniAI
  when :anthropic, "anthropic" then anthropic
  when :deepseek, "deepseek" then deepseek
  when :google, "google" then google
+ when :llama, "llama" then llama
  when :mistral, "mistral" then mistral
  when :openai, "openai" then openai
  else raise Error, "unknown provider=#{provider.inspect}"
data/lib/omniai/schema/object.rb CHANGED
@@ -64,6 +64,16 @@ module OmniAI
  @required = required
  end
 
+ # @return [OmniAI::Schema::Object]
+ def dup
+ self.class.new(
+ title: @title,
+ description: @description,
+ properties: @properties.dup,
+ required: @required.dup
+ )
+ end
+
  # @return [Hash]
  def serialize
  {
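`dup` hands back a copy whose `properties` and `required` collections are themselves duplicated, so a copy can be extended without mutating the schema it was copied from; a tiny sketch of the intent (the constructor arguments shown are illustrative):

```ruby
require 'omniai'

parent = OmniAI::Schema::Object.new(title: 'Contact', required: %i[name])
child = parent.dup

# The copy owns its own `required` array, so appending here leaves the
# parent schema untouched.
child.required << :email

parent.required # => [:name]
child.required  # => [:name, :email]
```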
data/lib/omniai/speak.rb CHANGED
@@ -23,7 +23,11 @@ module OmniAI
  # end
  # end
  #
- # client.transcribe(File.open("..."), model: "...", format: :json)
+ # File.open('audio.wav', 'wb') do |file|
+ # client.speak('Sally sells seashells by the seashore.', format: OmniAI::Speak::Format::WAV) do |chunk|
+ # file << chunk
+ # end
+ # end
  class Speak
  module Format
  AAC = "aac"
@@ -34,6 +38,8 @@ module OmniAI
  WAV = "wav"
  end
 
+ DEFAULT_FORMAT = Format::AAC
+
  # @raise [HTTPError]
  #
  # @param client [OmniAI::Client] required
@@ -52,7 +58,7 @@ module OmniAI
  # @yield [chunk]
  #
  # @return [Tempfile]
- def self.process!(input, client:, model:, voice:, speed: nil, format: nil, &)
+ def self.process!(input, client:, model:, voice:, speed: nil, format: DEFAULT_FORMAT, &)
  new(input, client:, model:, voice:, speed:, format:).process!(&)
  end
 
@@ -68,7 +74,7 @@ module OmniAI
  # - "opus"
  # - "pcm"
  # - "wav"
- def initialize(input, client:, model:, voice:, speed: nil, format: nil)
+ def initialize(input, client:, model:, voice:, speed: nil, format: DEFAULT_FORMAT)
  @input = input
  @client = client
  @model = model
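With `DEFAULT_FORMAT` in place, callers that omit `format:` now get AAC output instead of passing `nil` through to the provider; a sketch adapted from the doc comment above (the output path and reliance on the OpenAI client's default model and voice are assumptions):

```ruby
require 'omniai/openai'

client = OmniAI::OpenAI::Client.new

# No `format:` given, so OmniAI::Speak::DEFAULT_FORMAT ("aac") applies.
File.open('greeting.aac', 'wb') do |file|
  client.speak('Sally sells seashells by the seashore.') do |chunk|
    file << chunk
  end
end
```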
data/lib/omniai/tool.rb CHANGED
@@ -16,18 +16,20 @@ module OmniAI
  # end
  class Tool
  class << self
- # @param description [String]
+ # @param description [String] optional
  #
  # @return [String]
  def description(description = nil)
- return @description if description.nil?
-
- @description = description
+ if description.nil?
+ @description || (superclass.description unless superclass.eql?(OmniAI::Tool))
+ else
+ @description = description
+ end
  end
 
  # @return [OmniAI::Schema::Object]
  def parameters
- @parameters ||= OmniAI::Schema::Object.new
+ @parameters ||= superclass.eql?(OmniAI::Tool) ? superclass.parameters.dup : OmniAI::Schema::Object.new
  end
 
  # @param name [Symbol]
@@ -38,7 +40,7 @@ module OmniAI
 
  # @param names [Array<Symbol>]
  def required(names)
- parameters.required = names
+ parameters.required += names
  end
 
  # Converts a class name to a tool:
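The net effect of these changes is that a tool subclass now falls back to its parent's `description`, and repeated `required` calls accumulate names instead of overwriting them; a sketch, assuming the `parameter` class helper follows the signature used in the gem's README examples:

```ruby
class LocationTool < OmniAI::Tool
  description 'Looks up data for a location.'

  parameter :location, :string, description: 'A city (e.g. "Toronto").'
  required %i[location]

  parameter :unit, :string, description: 'Either "Celsius" or "Fahrenheit".'
  required %i[unit] # appended, so both :location and :unit remain required

  def execute(location:, unit:)
    "It is 24° #{unit} in #{location}." # illustrative stub
  end
end

class DetailedLocationTool < LocationTool
  # No description of its own: `description` now returns the one defined on
  # LocationTool rather than nil.
end
```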
data/lib/omniai/transcribe.rb CHANGED
@@ -23,7 +23,9 @@ module OmniAI
  # end
  # end
  #
- # client.transcribe(File.open("..."), model: "...", format: :json)
+ # File.open('audio.wav', 'rb') do |file|
+ # client.transcribe(file, model: "...", format: :json)
+ # end
  class Transcribe
  module Language
  AFRIKAANS = "af"
data/lib/omniai/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  module OmniAI
- VERSION = "2.7.0"
+ VERSION = "2.8.1"
  end
data/lib/omniai.rb CHANGED
@@ -25,9 +25,9 @@ loader.setup
  module OmniAI
  class Error < StandardError; end
 
- # Discover a client by provider ('openai' then 'anthropic' then 'google' then 'mistral' then 'deepseek').
+ # Discover a client by provider ('openai' then 'anthropic' then 'google' then 'mistral' then 'deepseek' then 'llama').
  #
- # @param provider [Symbol] the provider to use (e.g. :openai, :anthropic, :google, :mistral, :deepseek) - optional
+ # @param provider [Symbol] the provider to use (e.g. :openai, :anthropic, :google, :mistral, :deepseek, :llama)
  #
  # @raise [OmniAI::LoadError] if no providers are installed
  #
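The BaseHandler change earlier in this diff (calling `OmniAI.client` instead of `OmniAI::Client.find`) leans on this discovery helper; a short sketch of both spellings:

```ruby
require 'omniai'

# Resolve a specific provider by name; raises OmniAI::LoadError when the
# matching gem (here 'omniai-llama') is not installed.
client = OmniAI.client(provider: :llama)

# With a nil provider (what the CLI now passes by default), discovery walks
# 'openai', 'anthropic', 'google', 'mistral', 'deepseek', then 'llama' and
# returns a client for the first provider gem that loads.
client = OmniAI.client(provider: nil)
```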
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: omniai
  version: !ruby/object:Gem::Version
- version: 2.7.0
+ version: 2.8.1
  platform: ruby
  authors:
  - Kevin Sylvestre
  bindir: exe
  cert_chain: []
- date: 2025-05-22 00:00:00.000000000 Z
+ date: 2025-05-30 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: event_stream_parser
@@ -103,6 +103,8 @@ files:
  - lib/omniai/cli/base_handler.rb
  - lib/omniai/cli/chat_handler.rb
  - lib/omniai/cli/embed_handler.rb
+ - lib/omniai/cli/speak_handler.rb
+ - lib/omniai/cli/transcribe_handler.rb
  - lib/omniai/client.rb
  - lib/omniai/config.rb
  - lib/omniai/context.rb