omniai 2.6.0 → 2.7.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 6b8a243fb8313c7c1fe3a483cdac33892f1b764cac19c38465a880e3f8a790f0
-  data.tar.gz: b24f59696a3befee77cde24b8fd25b88331b3664a327e8e207d9f57a08abb524
+  metadata.gz: 488be723c0a59ca0095eb3a1f64081cceff14ae982208e598da5058ee0ea0c4b
+  data.tar.gz: 47805f82344429912eaa2e0f00299cc95868d35d04a6d424f698b719bf9d0fe8
 SHA512:
-  metadata.gz: 985aafb6c64874122c77c172ba7dccc1a371a982855b33add51c3371d872db7008391a28382aa7442d640a2f76df1e3ca95426428fb7cd3b12a177d69ed3605e
-  data.tar.gz: 42edecdd416fdc53562ea82caf09b94886e761ea2858c28db76dc4e909dd4da87835fe152bbebf3cc87a64a72eded21706c60ba13beba2293fa5470f11e822bd
+  metadata.gz: 1c0d66e74c7a0e1eb0296e371b0339d44cedbb7a747c2beeb7cb0547303140d0a6550fd8d7fc6629c09804c57cfcf780dec38f01f6c8aac49004ab28e9968872
+  data.tar.gz: aeb129f452a6ea0a9c2e33a25afbe5f8be1e7b27445fb58399556e7bdd0475b0840b36465c6b33edbdf857c6d943c10824b0cc21b7e6a917d161829e4ca033a6
data/README.md CHANGED
@@ -147,9 +147,11 @@ end
 The weather is 24° Celsius in London and 42° Fahrenheit in Madrid.
 ```
 
+_For a set of pre-built tools for interacting with browsers, databases, docker, and more try the [OmniAI::Tools](https://omniai-tools.ksylvest.com/) project._
+
 ### Example #5: [Chat w/ History](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_history)
 
-Tracking a prompt history over multiple user and assistant messages is especially helpful when building an agent like conversation experience. A prompt can be used to track this back and forth conversation:
+Tracking a prompt history over multiple user and assistant messages is especially helpful when building an agent like conversation experience. A prompt can be used to track this back-and-forth conversation:
 
 ```ruby
 require "omniai/openai"
@@ -176,6 +178,8 @@ end
 
 ### Example #6 [Chat w/ Schema](https://github.com/ksylvest/omniai/blob/main/examples/chat_with_schema)
 
+Requesting structured data back from an LLM is possible by defining a schema, then passing the schema into the chat. The following example defines a structured schema using `OmniAI::Schema` to model a `Contact`. The results of the LLM call are then parsed using the schema to ensure all types are correct.
+
 ```ruby
 format = OmniAI::Schema.format(name: "Contact", schema: OmniAI::Schema.object(
   description: "A contact with a name, relationship, and addresses.",
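The "parse and ensure all types are correct" step described in the hunk above can be illustrated outside of OmniAI. The following is a hypothetical sketch — `CONTACT_TYPES`, `coerce`, and `parse_contact` are illustrative names, not part of the `OmniAI::Schema` API — of taking an LLM's JSON reply and coercing each field to a declared type:

```ruby
require "json"

# Illustrative type map for a Contact-like object (not the OmniAI::Schema API).
CONTACT_TYPES = { "name" => :string, "age" => :integer }.freeze

# Coerce a single value to a declared type.
def coerce(value, type)
  case type
  when :string then value.to_s
  when :integer then Integer(value)
  else value
  end
end

# Parse a JSON reply and coerce every declared field.
def parse_contact(json)
  data = JSON.parse(json)
  CONTACT_TYPES.to_h { |key, type| [key, coerce(data[key], type)] }
end

parse_contact('{"name":"Ringo","age":"3"}') # => {"name"=>"Ringo", "age"=>3}
```

Even when a model returns `"age"` as a string, the declared `:integer` type forces it back to a number, which is the practical benefit of round-tripping results through a schema.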
@@ -421,10 +425,10 @@ require 'omniai/openai'
 require 'logger'
 
 logger = Logger.new(STDOUT)
-client = OmniAI::OpenAI::Client.new(timeout: 8) # i.e. 8 seconds
+client = OmniAI::OpenAI::Client.new(timeout: 8) # 8 seconds
 ```
 
-Timeouts are also configurable by passing a `timeout` hash with `timeout` / `read` / `write` / keys using:
+Timeouts are also configurable by passing a `timeout` hash with `timeout` / `read` / `write` keys using:
 
 ```ruby
 require 'omniai/openai'
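The two timeout forms described in this hunk — a single number of seconds or a per-phase hash — can be sketched as a normalization step. This is a hypothetical helper (`normalize_timeout` is not an OmniAI API) showing how both shapes might be accepted:

```ruby
# Hypothetical helper: accept either a single number of seconds or a hash
# with per-phase keys, and normalize both to a per-phase hash.
def normalize_timeout(timeout)
  case timeout
  when Numeric
    { timeout: timeout, read: timeout, write: timeout }
  when Hash
    { timeout: nil, read: nil, write: nil }.merge(timeout)
  else
    raise ArgumentError, "expected Numeric or Hash, got #{timeout.class}"
  end
end

normalize_timeout(8)       # => {timeout: 8, read: 8, write: 8}
normalize_timeout(read: 2) # => {timeout: nil, read: 2, write: nil}
```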
@@ -481,7 +485,7 @@ client.chat('Tell me a joke.', stream:)
 
 #### Completion using Streaming via IO
 
-The above code can also be supplied any IO (e.g. `File`, `$stdout`, `$stdin`, etc):
+The above code can also be supplied with any IO object (e.g., `File`, `$stdout`, `$stdin`, etc.):
 
 ```ruby
 client.chat('Tell me a story', stream: $stdout)
@@ -489,11 +493,11 @@ client.chat('Tell me a story', stream: $stdout)
 
 #### Completion with Tools
 
-A chat can also be initialized with tools:
+A chat can also be initialized using tools:
 
 ```ruby
 class WeatherTool
-  description "Lookup the weather at a location in either Celsius for Fahrenheit."
+  description "Lookup the weather at a location in either Celsius or Fahrenheit."
 
   parameter :location, :string, description: "The location to find the weather."
   parameter :unit, :string, enum: %w[Celsius Fahrenheit], description: "The unit of measurement."
@@ -504,11 +508,11 @@ class WeatherTool
   #
   # @return [Hash]
   def execute(location:, unit: "Celsius")
-    puts "[weather] location=#{locaiton} unit=#{unit}"
+    puts "[weather] location=#{location} unit=#{unit}"
 
     {
       temperature: "#{rand(20..50)}°",
-      humidty: rand(0..100),
+      humidity: rand(0..100),
     }
   end
 end
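The corrected `execute` logic from the hunks above can be exercised without any provider gem installed. Below is a standalone sketch: the `description`/`parameter` class macros are stubbed out as no-ops (in OmniAI they are supplied by the tool DSL), leaving only the callable core:

```ruby
# Standalone sketch: stub the tool DSL so the execute logic runs on its own.
class WeatherTool
  def self.description(_text) = nil # no-op stand-in for the OmniAI DSL
  def self.parameter(*, **) = nil   # no-op stand-in for the OmniAI DSL

  description "Lookup the weather at a location in either Celsius or Fahrenheit."

  parameter :location, :string, description: "The location to find the weather."
  parameter :unit, :string, enum: %w[Celsius Fahrenheit], description: "The unit of measurement."

  # @return [Hash]
  def execute(location:, unit: "Celsius")
    puts "[weather] location=#{location} unit=#{unit}"

    {
      temperature: "#{rand(20..50)}°",
      humidity: rand(0..100),
    }
  end
end

WeatherTool.new.execute(location: "London") # prints "[weather] location=London unit=Celsius"
```

Stubbing the macros this way is handy for unit-testing a tool's behavior in isolation from any LLM call.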
@@ -538,7 +542,7 @@ end
 
 ### Speak
 
-Clients that support speak (e.g. OpenAI w/ "Whisper") convert text to recordings via the following calls:
+Clients that support speak (e.g. OpenAI w/ "Whisper") convert text to speech via the following calls:
 
 #### Speech with Stream
 
@@ -568,12 +572,12 @@ response.usage # <OmniAI::Embed::Usage prompt_tokens=5 total_tokens=5>
 response.embedding # [0.1, 0.2, ...] >
 ```
 
-Batches of text can also be converted to embeddings via the following:
+Theese APIs support generating embeddings in batches using the following code:
 
 ```ruby
 response = client.embed([
-  '',
-  '',
+  'The quick brown fox jumps over a lazy dog',
+  'Pack my box with five dozen liquor jugs',
 ])
 response.usage # <OmniAI::Embed::Usage prompt_tokens=5 total_tokens=5>
 response.embeddings.each do |embedding|
@@ -583,7 +587,7 @@ end
 
 ## CLI
 
-OmniAI packages a basic command line interface (CLI) to allow for exploration of various APIs. A detailed CLI documentation can be found via help:
+OmniAI packages a basic command line interface (CLI) to allow for exploration of various APIs. CLI documentation is available with the `--help` flag:
 
 ```bash
 omniai --help
data/lib/omniai/client.rb CHANGED
@@ -34,69 +34,93 @@ module OmniAI
 
 # Initialize a client for Anthropic. This method requires the provider if it is undefined.
 #
-# @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+# @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
 #
 # @return [Class<OmniAI::Client>]
 def self.anthropic
   require "omniai/anthropic" unless defined?(OmniAI::Anthropic::Client)
   OmniAI::Anthropic::Client
 rescue ::LoadError
-  raise Error, "requires 'omniai-anthropic': `gem install omniai-anthropic`"
+  raise LoadError, "requires 'omniai-anthropic': `gem install omniai-anthropic`"
 end
 
 # Initialize a client for DeepSeek. This method requires the provider if it is undefined.
 #
-# @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+# @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
 #
 # @return [Class<OmniAI::Client>]
 def self.deepseek
   require "omniai/deepseek" unless defined?(OmniAI::DeepSeek::Client)
   OmniAI::DeepSeek::Client
 rescue ::LoadError
-  raise Error, "requires 'omniai-deepseek': `gem install omniai-deepseek`"
+  raise LoadError, "requires 'omniai-deepseek': `gem install omniai-deepseek`"
 end
 
 # Lookup the `OmniAI::Google::Client``. This method requires the provider if it is undefined.
 #
-# @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+# @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
 #
 # @return [Class<OmniAI::Client>]
 def self.google
   require "omniai/google" unless defined?(OmniAI::Google::Client)
   OmniAI::Google::Client
 rescue ::LoadError
-  raise Error, "requires 'omniai-google': `gem install omniai-google`"
+  raise LoadError, "requires 'omniai-google': `gem install omniai-google`"
 end
 
 # Initialize a client for Mistral. This method requires the provider if it is undefined.
 #
-# @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+# @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
 #
 # @return [Class<OmniAI::Client>]
 def self.mistral
   require "omniai/mistral" unless defined?(OmniAI::Mistral::Client)
   OmniAI::Mistral::Client
 rescue ::LoadError
-  raise Error, "requires 'omniai-mistral': `gem install omniai-mistral`"
+  raise LoadError, "requires 'omniai-mistral': `gem install omniai-mistral`"
 end
 
 # Initialize a client for OpenAI. This method requires the provider if it is undefined.
 #
-# @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+# @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
 #
 # @return [Class<OmniAI::Client>]
 def self.openai
   require "omniai/openai" unless defined?(OmniAI::OpenAI::Client)
   OmniAI::OpenAI::Client
 rescue ::LoadError
-  raise Error, "requires 'omniai-openai': `gem install omniai-openai`"
+  raise LoadError, "requires 'omniai-openai': `gem install omniai-openai`"
+end
+
+# Discover a client by provider ('openai' then 'anthropic' then 'google' then 'mistral' then 'deepseek').
+#
+# @raise [OmniAI::LoadError] if no providers are installed
+#
+# @return [OmniAI::Client]
+def self.discover(**)
+  %i[openai anthropic google mistral deepseek].each do |provider|
+    return find(provider:, **)
+  rescue LoadError
+    next
+  end
+
+  raise LoadError, <<~TEXT
+    requires 'omniai-openai' or 'omniai-anthropic' or 'openai-deepseek' or 'omniai-google' or 'omniai-mistral':
+
+    `gem install omniai-openai`
+    `gem install omniai-anthropic`
+    `gem install omniai-deepseek`
+    `gem install omniai-google`
+    `gem install omniai-mistral`
+  TEXT
 end
 
 # Initialize a client by provider (e.g. 'openai'). This method attempts to require the provider.
 #
-# @param provider [String, Symbol] required (e.g. 'anthropic', 'deepsek', 'google', 'mistral', 'openai', etc)
-#
-# @raise [OmniAI::Error] if the provider is not defined and the gem is not installed
+# @param provider [String, Symbol] required (e.g. 'anthropic', 'deepseek', 'google', 'mistral', 'openai', etc)
+#
+# @raise [OmniAI::LoadError] if the provider is not defined and the gem is not installed
 #
 # @return [OmniAI::Client]
 def self.find(provider:, **)
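The fallback behavior of the new `discover` method can be illustrated with a self-contained sketch: attempt each candidate `require` in priority order, swallow `LoadError`, and raise only when every candidate fails. The `discover_feature` helper and its candidate list are illustrative, not OmniAI code:

```ruby
# Sketch of the discovery pattern: try each candidate require in order and
# return the first name that loads; raise if none are installed.
def discover_feature(candidates)
  candidates.each do |name, path|
    require path
    return name
  rescue LoadError
    next
  end

  raise LoadError, "none of #{candidates.keys.join(', ')} are installed"
end

# "definitely_missing_gem" fails to load; "json" (stdlib) succeeds.
discover_feature(missing: "definitely_missing_gem", json: "json") # => :json
```

The block-level `rescue` inside `each` (no explicit `begin`) is the same idiom the `discover` implementation above relies on.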
data/lib/omniai/load_error.rb ADDED
@@ -0,0 +1,7 @@
+# frozen_string_literal: true
+
+module OmniAI
+  # An error that wraps a ::LoadError for a missing library.
+  class LoadError < Error
+  end
+end
data/lib/omniai/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module OmniAI
-  VERSION = "2.6.0"
+  VERSION = "2.7.0"
 end
data/lib/omniai.rb CHANGED
@@ -16,6 +16,68 @@ loader.inflector.inflect "http_error" => "HTTPError"
 loader.inflector.inflect "ssl_error" => "SSLError"
 loader.setup
 
+# @example
+#
+#   OmniAI.chat(...)
+#   OmniAI.transcribe(...)
+#   OmniAI.speak(...)
+#   OmniAI.embed(...)
 module OmniAI
   class Error < StandardError; end
+
+  # Discover a client by provider ('openai' then 'anthropic' then 'google' then 'mistral' then 'deepseek').
+  #
+  # @param provider [Symbol] the provider to use (e.g. :openai, :anthropic, :google, :mistral, :deepseek) - optional
+  #
+  # @raise [OmniAI::LoadError] if no providers are installed
+  #
+  # @return [OmniAI::Client]
+  def self.client(provider: nil, **)
+    provider ? OmniAI::Client.find(provider:, **) : OmniAI::Client.discover(**)
+  end
+
+  # @example
+  #   response = OmniAI.chat("What is the capital of Spain?")
+  #   puts response.text
+  #
+  # @example
+  #   OmniAI.chat(stream: $stdout) do |prompt|
+  #     prompt.system("Answer in both English and French.")
+  #     prompt.user("How many people live in Tokyo?")
+  #   end
+  #
+  # @see OmniAI::Client#chat
+  def self.chat(...)
+    client.chat(...)
+  end
+
+  # @example
+  #   File.open("audio.wav", "rb") do |file|
+  #     puts OmniAI.transcribe(file).text
+  #   end
+  #
+  # @see OmniAI::Client#transcribe
+  def self.transcribe(...)
+    client.transcribe(...)
+  end
+
+  # @example
+  #   File.open("audio.wav", "wb") do |file|
+  #     OmniAI.speak("Sally sells seashells by the seashore.") do |chunk|
+  #       file << chunk
+  #     end
+  #   end
+  #
+  # @see OmniAI::Client#speak
+  def self.speak(...)
+    client.speak(...)
+  end
+
+  # @example
+  #   embedding = OmniAI.embed("The quick brown fox jumps over the lazy dog.").embedding
+  #
+  # @see OmniAI::Client#embed
+  def self.embed(...)
+    client.embed(...)
+  end
 end
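The module-level helpers added in this hunk all follow one pattern: memoize a lazily discovered client and forward every argument with `...`. A minimal self-contained sketch of that pattern (the `Facade` and `EchoBackend` names are illustrative, not OmniAI's):

```ruby
# A stand-in backend with a chat method (illustrative only).
class EchoBackend
  def chat(text)
    "echo: #{text}"
  end
end

# Facade mirroring the OmniAI module pattern: a memoized client plus a thin
# delegator using triple-dot argument forwarding.
module Facade
  def self.client
    @client ||= EchoBackend.new
  end

  def self.chat(...)
    client.chat(...)
  end
end

Facade.chat("hello") # => "echo: hello"
```

Because `...` forwards positionals, keywords, and blocks alike, the facade never needs updating when the backend's signature changes.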
metadata CHANGED
@@ -1,13 +1,13 @@
 --- !ruby/object:Gem::Specification
 name: omniai
 version: !ruby/object:Gem::Version
-  version: 2.6.0
+  version: 2.7.0
 platform: ruby
 authors:
 - Kevin Sylvestre
 bindir: exe
 cert_chain: []
-date: 2025-05-08 00:00:00.000000000 Z
+date: 2025-05-22 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: event_stream_parser
@@ -112,6 +112,7 @@ files:
 - lib/omniai/files.rb
 - lib/omniai/http_error.rb
 - lib/omniai/instrumentation.rb
+- lib/omniai/load_error.rb
 - lib/omniai/mcp/jrpc.rb
 - lib/omniai/mcp/jrpc/error.rb
 - lib/omniai/mcp/jrpc/request.rb