omniai 2.8.0 → 2.8.1

This diff shows the changes between publicly released versions of this package as they appear in their respective public registries. It is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: a9cab46e52805a61d08bf4638ef7ed9ab46549c016ada8740abb7471f3ee5d07
- data.tar.gz: 3a34ea68b515d8759a76d689f2474648c7e55d51b42a93ee022c18df271e049e
+ metadata.gz: a80f4da0021307bce467f92f209309aaea62060e93fbb4201a7fc059b0376cea
+ data.tar.gz: 329e4a51e2e8b1d66ed5eca4bf7979acc1111f720d62fdfb638d850ca9ff53d8
  SHA512:
- metadata.gz: b241dca1c95f27efd023c09cb6ed9e8193f94e2487490b376a816e5c50d8a6a42e6cb0bc54e47ec3aeba0b12f23de7b3a258d9ca29d222ce6bf414469e2ad777
- data.tar.gz: b74ba2a9950c821b7ca6bd5d1348165daab7dea3b7372db702d40cdda8d2669c057b32913cf5713b6d75cf3d97a234acf25c1f11af3357cc465740b18ec5990e
+ metadata.gz: 5ce5f9abf61e35856183ebfdf344f41d2961f67140abdd19b48464203a10358e368b606ec8d775e32a515e202c87b69c205a3dc688743ddb072ef70962753948
+ data.tar.gz: f517cf093a2304dd4d6562bdf8a96c78480523063e9ff21463b08b2214064efdd65782af50ef070c836502e54dccce37d66f3636027c41409a62dff1abd2eb83
data/README.md CHANGED
@@ -10,13 +10,14 @@ OmniAI provides a unified Ruby API for integrating with multiple AI providers, i
 
  - [OmniAI::Anthropic](https://github.com/ksylvest/omniai-anthropic)
  - [OmniAI::DeepSeek](https://github.com/ksylvest/omniai-deepseek)
+ - [OmniAI::Llama](https://github.com/ksylvest/omniai-llama)
  - [OmniAI::Google](https://github.com/ksylvest/omniai-google)
  - [OmniAI::Mistral](https://github.com/ksylvest/omniai-mistral)
  - [OmniAI::OpenAI](https://github.com/ksylvest/omniai-openai)
 
- ## Examples
+ ## 📄 Examples
 
- ### Example #1: [Chat w/ Text](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_text)
+ ### Example #1: [💬 Chat w/ Text](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_text)
 
  This example demonstrates using `OmniAI` with **Anthropic** to ask for a joke. The response is parsed and printed.
 
@@ -32,7 +33,7 @@ puts client.chat("Tell me a joke").text
  Why don't scientists trust atoms? Because they make up everything!
  ```
 
- ### Example #2: [Chat w/ Prompt](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_prompt)
+ ### Example #2: [💬 Chat w/ Prompt](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_prompt)
 
  This example demonstrates using `OmniAI` with **Mistral** to ask for the fastest animal. It includes a system and user message in the prompt. The response is streamed in real time.
 
@@ -52,7 +53,7 @@ end
  **French**: Le faucon pèlerin est généralement considéré comme l'animal le plus rapide, atteignant des vitesses de plus de 390 km/h.
  ```
 
- ### Example #3: [Chat w/ Vision](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_vision)
+ ### Example #3: [💬 Chat w/ Vision](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_vision)
 
  This example demonstrates using `OmniAI` with **OpenAI** to prompt a “biologist” for an analysis of photos, identifying the animals within each one. A system and user message are provided, and the response is streamed in real time.
 
@@ -79,7 +80,7 @@ The first photo is of a cat, *Felis Catus*.
  The second photo is of a dog, *Canis Familiaris*.
  ```
 
- ### Example #4: [Chat w/ Tools](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_tools)
+ ### Example #4: [💬 Chat w/ Tools](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_tools)
 
  This example demonstrates using `OmniAI` with **Google** to ask for the weather. A tool “Weather” is provided. The tool accepts a location and unit (Celsius or Fahrenheit) then calculates the weather. The LLM makes multiple tool-call requests and is automatically provided with a tool-call response prior to streaming in real-time the result.
 
@@ -149,7 +150,7 @@ The weather is 24° Celsius in London and 42° Fahrenheit in Madrid.
 
  _For a set of pre-built tools for interacting with browsers, databases, docker, and more try the [OmniAI::Tools](https://omniai-tools.ksylvest.com/) project._
 
- ### Example #5: [Chat w/ History](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_history)
+ ### Example #5: [💬 Chat w/ History](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_history)
 
  Tracking a prompt history over multiple user and assistant messages is especially helpful when building an agent like conversation experience. A prompt can be used to track this back-and-forth conversation:
 
@@ -176,7 +177,7 @@ loop do
  end
  ```
 
- ### Example #6 [Chat w/ Schema](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_schema)
+ ### Example #6 [💬 Chat w/ Schema](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_schema)
 
  Requesting structured data back from an LLM is possible by defining a schema, then passing the schema into the chat. The following example defines a structured schema using `OmniAI::Schema` to model a `Contact`. The results of the LLM call are then parsed using the schema to ensure all types are correct.
 
@@ -228,7 +229,7 @@ puts format.parse(response.text)
  }
  ```
 
- ### Example #7: [CLI](https://github.com/ksylvest/omniai/blob/main/examples/cli)
+ ### Example #7: [🐚 CLI](https://github.com/ksylvest/omniai/blob/main/examples/cli)
 
  The `OmniAI` gem also ships with a CLI to simplify quick tests.
 
@@ -251,7 +252,7 @@ omniai transcribe "./files/audio.wav"
  omniai embed "What is the capital of France?"
  ```
 
- ### Example #8: [Text-to-Speech](https://github.com/ksylvest/omniai/blob/main/examples/text_to_speech)
+ ### Example #8: [🔈 Text-to-Speech](https://github.com/ksylvest/omniai/blob/main/examples/text_to_speech)
 
  This example demonstrates using `OmniAI` with **OpenAI** to convert text to speech and save it to a file.
 
@@ -267,7 +268,7 @@ File.open(File.join(__dir__, 'audio.wav'), 'wb') do |file|
  end
  ```
 
- ### Example #9: [Speech-to-Text](https://github.com/ksylvest/omniai/blob/main/examples/speech_to_text)
+ ### Example #9: [🎤 Speech-to-Text](https://github.com/ksylvest/omniai/blob/main/examples/speech_to_text)
 
  This example demonstrates using `OmniAI` with **OpenAI** to convert speech to text.
 
@@ -282,7 +283,7 @@ File.open(File.join(__dir__, 'audio.wav'), 'rb') do |file|
  end
  ```
 
- ### Example #10: [Embeddings](https://github.com/ksylvest/omniai/blob/main/examples/embeddings)
+ ### Example #10: [💻 Embeddings](https://github.com/ksylvest/omniai/blob/main/examples/embeddings)
 
  This example demonstrates using `OmniAI` with **Mistral** to generate embeddings for a dataset. It defines a set of entries (e.g. "George is a teacher." or "Ringo is a doctor.") and then compares the embeddings generated from a query (e.g. "What does George do?" or "Who is a doctor?") to rank the entries by relevance.
 
@@ -325,7 +326,7 @@ search('Who do you call to fix a toilet?')
  'Who do you call to fix a toilet?': 'Paul is a plumber.'
  ```
 
- ## Installation
+ ## 📦 Installation
 
  The main `omniai` gem is installed with:
 
@@ -343,7 +344,7 @@ gem install omniai-google
  gem install omniai-openai
  ```
 
- ## Usage
+ ## 📖 Usage
 
  OmniAI implements APIs for a number of popular clients by default. A client can be initialized using the specific gem (e.g. `omniai-openai` for `OmniAI::OpenAI`). Vendor specific docs can be found within each repo.
 
@@ -365,6 +366,14 @@ require 'omniai/deepseek'
  client = OmniAI::DeepSeek::Client.new
  ```
 
+ #### [OmniAI::Llama](https://github.com/ksylvest/omniai-llama)
+
+ ```ruby
+ require 'omniai/llama'
+
+ client = OmniAI::Llama::Client.new
+ ```
+
  #### [OmniAI::Google](https://github.com/ksylvest/omniai-google)
 
  ```ruby
@@ -445,7 +454,7 @@ client = OmniAI::OpenAI::Client.new(timeout: {
  })
  ```
 
- ### Chat
+ ### 💬 Chat
 
  Clients that support chat (e.g. Anthropic w/ "Claude", Google w/ "Gemini", Mistral w/ "LeChat", OpenAI w/ "ChatGPT", etc) generate completions using the following calls:
 
@@ -523,7 +532,7 @@ end
  client.chat('What is the weather in "London" in Celsius and "Paris" in Fahrenheit?', tools: [WeatherTool.new])
  ```
 
- ### Transcribe
+ ### 🎤 Speech to Text
 
  Clients that support transcribe (e.g. OpenAI w/ "Whisper") convert recordings to text via the following calls:
 
@@ -543,7 +552,7 @@ File.open("example.ogg", "rb") do |file|
  end
  ```
 
- ### Speak
+ ### 🔈 Text to Speech
 
  Clients that support speak (e.g. OpenAI w/ "Whisper") convert text to speech via the following calls:
 
@@ -565,7 +574,7 @@ tempfile.close
  tempfile.unlink
  ```
 
- ### Embeddings
+ ### 💻 Embeddings
 
  Clients that support generating embeddings (e.g. OpenAI, Mistral, etc.) convert text to embeddings via the following:
 
@@ -588,7 +597,7 @@ response.embeddings.each do |embedding|
  end
  ```
 
- ## CLI
+ ## 🐚 CLI
 
  OmniAI packages a basic command line interface (CLI) to allow for exploration of various APIs. CLI documentation is available with the `--help` flag:
 
@@ -665,7 +674,7 @@ omniai speak "Sally sells sea shells on the sea shore." > audio.aac
  omniai transcribe ./audio.aac
  ```
 
- ### MCP
+ ## MCP
 
  [MCP](https://modelcontextprotocol.io/introduction) is an open protocol designed to standardize giving context to LLMs. The OmniAI implementation supports building an MCP server that operates via the [stdio](https://modelcontextprotocol.io/docs/concepts/transports) transport.
data/lib/omniai/client.rb CHANGED
@@ -56,6 +56,18 @@ module OmniAI
  raise LoadError, "requires 'omniai-deepseek': `gem install omniai-deepseek`"
  end
 
+ # Lookup the `OmniAI::LLama::Client``. This method requires the provider if it is undefined.
+ #
+ # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
+ #
+ # @return [Class<OmniAI::Client>]
+ def self.llama
+ require "omniai/llama" unless defined?(OmniAI::Llama::Client)
+ OmniAI::Llama::Client
+ rescue ::LoadError
+ raise LoadError, "requires 'omniai-llama': `gem install omniai-llama`"
+ end
+
  # Lookup the `OmniAI::Google::Client``. This method requires the provider if it is undefined.
  #
  # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
@@ -105,11 +117,12 @@ module OmniAI
  end
 
  raise LoadError, <<~TEXT
- requires 'omniai-openai' or 'omniai-anthropic' or 'openai-deepseek' or 'omniai-google' or 'omniai-mistral':
+ Please run one of the following commands to install a provider specific gem:
 
  `gem install omniai-openai`
  `gem install omniai-anthropic`
  `gem install omniai-deepseek`
+ `gem install omniai-llama`
  `gem install omniai-google`
  `gem install omniai-mistral`
  TEXT
@@ -118,7 +131,7 @@ module OmniAI
  # Initialize a client by provider (e.g. 'openai'). This method attempts to require the provider.
  #
  #
- # @param provider [String, Symbol] required (e.g. 'anthropic', 'deepseek', 'google', 'mistral', 'openai', etc)
+ # @param provider [String, Symbol] required (e.g. 'anthropic', 'deepseek', 'google', 'mistral', 'openai', 'llama')
  #
  # @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
  #
@@ -129,6 +142,7 @@ module OmniAI
  when :anthropic, "anthropic" then anthropic
  when :deepseek, "deepseek" then deepseek
  when :google, "google" then google
+ when :llama, "llama" then llama
  when :mistral, "mistral" then mistral
  when :openai, "openai" then openai
  else raise Error, "unknown provider=#{provider.inspect}"
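A minimal sketch of how the new `llama` lookup is exercised, assuming both the `omniai` and `omniai-llama` gems are installed (per the README snippet above, `OmniAI::Llama::Client.new` takes no required arguments):

```ruby
require "omniai"

# Resolves the provider class, requiring "omniai/llama" on demand;
# raises OmniAI::LoadError if the omniai-llama gem is not installed.
client_class = OmniAI::Client.llama #=> OmniAI::Llama::Client

client = client_class.new
puts client.chat("Tell me a joke").text
```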
@@ -64,6 +64,16 @@ module OmniAI
  @required = required
  end
 
+ # @return [OmniAI::Schema::Object]
+ def dup
+ self.class.new(
+ title: @title,
+ description: @description,
+ properties: @properties.dup,
+ required: @required.dup
+ )
+ end
+
  # @return [Hash]
  def serialize
  {
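The new `OmniAI::Schema::Object#dup` copies the `properties` and `required` collections, so a duplicated schema can be modified without mutating the original (the tool changes below appear to rely on this). A minimal sketch, assuming the keyword arguments of `OmniAI::Schema::Object.new` default to empty values, as the no-argument call in `tool.rb` suggests:

```ruby
require "omniai"

original = OmniAI::Schema::Object.new(required: %i[location])
copy = original.dup

copy.required += %i[unit]

original.required #=> [:location] (the copy owns its own array)
copy.required     #=> [:location, :unit]
```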
data/lib/omniai/tool.rb CHANGED
@@ -16,18 +16,20 @@ module OmniAI
  # end
  class Tool
  class << self
- # @param description [String]
+ # @param description [String] optional
  #
  # @return [String]
  def description(description = nil)
- return @description if description.nil?
-
- @description = description
+ if description.nil?
+ @description || (superclass.description unless superclass.eql?(OmniAI::Tool))
+ else
+ @description = description
+ end
  end
 
  # @return [OmniAI::Schema::Object]
  def parameters
- @parameters ||= OmniAI::Schema::Object.new
+ @parameters ||= superclass.eql?(OmniAI::Tool) ? superclass.parameters.dup : OmniAI::Schema::Object.new
  end
 
  # @param name [Symbol]
@@ -38,7 +40,7 @@ module OmniAI
 
  # @param names [Array<Symbol>]
  def required(names)
- parameters.required = names
+ parameters.required += names
  end
 
  # Converts a class name to a tool:
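A hedged sketch of what these tool changes enable: a subclass with no `description` of its own now falls back to its parent's description, and repeated `required` calls append names instead of replacing them (the tool classes below are hypothetical, and `required` is assumed to default to an empty array):

```ruby
require "omniai"

class SearchTool < OmniAI::Tool
  description "Searches a data source"

  required %i[query]
  required %i[limit] # appended; previously this call would have replaced [:query]
end

class WebSearchTool < SearchTool
end

SearchTool.parameters.required #=> [:query, :limit]
WebSearchTool.description      #=> "Searches a data source" (inherited from SearchTool)
```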
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  module OmniAI
- VERSION = "2.8.0"
+ VERSION = "2.8.1"
  end
data/lib/omniai.rb CHANGED
@@ -25,9 +25,9 @@ loader.setup
  module OmniAI
  class Error < StandardError; end
 
- # Discover a client by provider ('openai' then 'anthropic' then 'google' then 'mistral' then 'deepseek').
+ # Discover a client by provider ('openai' then 'anthropic' then 'google' then 'mistral' then 'deepseek' then 'llama').
  #
- # @param provider [Symbol] the provider to use (e.g. :openai, :anthropic, :google, :mistral, :deepseek) - optional
+ # @param provider [Symbol] the provider to use (e.g. :openai, :anthropic, :google, :mistral, :deepseek, :llama)
  #
  # @raise [OmniAI::LoadError] if no providers are installed
  #
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: omniai
  version: !ruby/object:Gem::Version
- version: 2.8.0
+ version: 2.8.1
  platform: ruby
  authors:
  - Kevin Sylvestre
  bindir: exe
  cert_chain: []
- date: 2025-05-22 00:00:00.000000000 Z
+ date: 2025-05-30 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: event_stream_parser