ai_client 0.2.1 → 0.2.2

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: c938cc076640fa6952b1140252b0c234a44e7dfd6ee24d2485c344d3b2c98880
- data.tar.gz: 1dc5cb21b49c2a731689e7ab7d4f6cce56392b06ecb83e2d54b535ec0d0b3d27
+ metadata.gz: 07a66886ce84dd916574f7a38e292626124ed1a048da92eec33cfdc0e66435e4
+ data.tar.gz: dc9c6734ae4a6a479ed8f6b76fc8edea31dbb588f67ae801b915a067346c5735
  SHA512:
- metadata.gz: d2d81aa7bc979a8c75856965261ce54470ef3df1586eb65e809c2e4d32f8a4771a22a85640e026260826ab691e54742666ce0b9d71f2ba992b60db3d5c689cc8
- data.tar.gz: 7347641ccb974d62c0393a364cb09b214019c0e99a2514478632a9b0c4736bf530c0ba415bae316b0438230691418265b21aed46ef32ca122935d73e30085f41
+ metadata.gz: ced4f6a3834132d0c1d127ad3fcd931e9600d748a28d6593fc9e382fb6bb98b9fd31de56d44d5a216c5f0bb8394c360f04779995fd2345ae1043e137cd994091
+ data.tar.gz: be77228ac195d1f3de41f1ec0e82f2d2e2aa0f280e51354156c00b9c40bdb466581c87b4ba73b22c57fa86300df9555b7a2230854ac6ff93e365028b946275ba
data/CHANGELOG.md CHANGED
@@ -1,6 +1,11 @@
  ## [Unreleased]
 
  ## Released
+ ### [0.2.2] - 2024-10-07
+ - Added support for open_router.ai with extensions/omniai-open_router.rb
+ - Removed the `model_type` object
+ - Added ability to dump a config to a YAML file
+
  ### [0.2.1] - 2024-10-05
  - Added support for YAML configuration files
 
data/README.md CHANGED
@@ -1,20 +1,35 @@
  # AiClient
 
- First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com/) for his gem [OmniAI](https://github.com/ksylvest/omniai) upon which this effort depends.
-
- **This is a work in progress** I could use your help extending its capability.
-
- AiClien` is working. I've used early versions of it in several projects.
+ First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com/) for his gem [OmniAI](https://github.com/ksylvest/omniai) and [Olympia](https://olympia.chat/) for their [open_router gem](https://github.com/OlympiaAI/open_router) upon which this effort depends.
 
  See the [change log](CHANGELOG.md) for recent modifications.
 
  ## Summary
 
- `ai_client` is a versatile Ruby gem that serves as a generic client for interacting with various AI service providers through a unified API provided by Kevin's gem `OmniAI`. The `AiClient` class is designed to simplify the integration of large language models (LLMs) into applications. `AiClient` allows developers to create instances using just the model name, greatly reducing configuration overhead.
+ Are you ready to supercharge your applications with cutting-edge AI capabilities?
+ Introducing `ai_client`, the ultimate Ruby gem that provides a seamless interface
+ for interacting with a multitude of AI service providers through a single,
+ unified API.
+
+ With `ai_client`, you can effortlessly integrate large language models (LLMs)
+ into your projects—simply specify the model name and let the gem handle the
+ rest! Say goodbye to tedious configuration and hello to rapid development.
+
+ This gem comes packed with built-in support for leading AI providers, including
+ OpenAI, Anthropic, Google, Mistral, LocalAI, and Ollama. Whether you need to
+ implement chatbots, transcription services, speech synthesis, or embeddings,
+ `ai_client` abstracts the complexities of API interactions, allowing you to focus
+ on what truly matters: building amazing applications.
 
- With built-in support for popular AI providers—including OpenAI, Anthropic, Google, Mistral, LocalAI and Ollama—the gem abstracts the complexities of API interactions, offering methods for tasks such as chatting, transcription, speech synthesis, and embedding.
+ Plus, with its flexible middleware architecture, you can easily customize request
+ and response processing—implement logging, retry logic, and more with minimal effort.
+ And thanks to its seamless integration with the `OmniAI` framework, you can leverage
+ the latest AI advancements without worrying about vendor lock-in.
+
+ Join the growing community of developers who are transforming their applications
+ with `ai_client`. Install it today and unlock the full potential of AI in your
+ projects!
 
- The middleware architecture enables customizable processing of requests and responses, making it easy to implement features like logging and retry logic. Seamlessly integrated with the `OmniAI` framework, `ai_client` empowers developers to leverage cutting-edge AI capabilities without vendor lock-in, making it an essential tool for modern AI-driven applications.
 
  ## Installation
 
 
@@ -30,6 +45,32 @@ If bundler is not being used to manage dependencies, install the gem by executin
  gem install ai_client
  ```
 
+ ## Providers Supported
+
+ To explicitly designate a provider to use with an AiClient instance,
+ use the parameter `provider: :your_provider` with the Symbol of the supported
+ provider you want to use with the model you specify. The following providers
+ are supported by the OmniAI gem upon which AiClient depends, along with a few
+ extensions.
+
+ | Symbol | Envar API Key | Client Source |
+ | --- | --- | --- |
+ | :anthropic | [ANTHROPIC_API_KEY](https://www.anthropic.com/) | OmniAI |
+ | :google | [GOOGLE_API_KEY](https://cloud.google.com/gemini) | OmniAI |
+ | :localai | [LOCALAI_API_KEY](https://localai.io/) | AiClient Extension |
+ | :mistral | [MISTRAL_API_KEY](https://mistral.ai/) | OmniAI |
+ | :ollama | [OLLAMA_API_KEY](https://ollama.com/) | AiClient Extension |
+ | :open_router | [OPEN_ROUTER_API_KEY](https://openrouter.ai/) | AiClient Extension |
+ | :openai | [OPENAI_API_KEY](https://www.openai.com/) | OmniAI |
+
+ In case you are using a different environment variable for your access token than the ones shown above, you can use the `api_key:` parameter.
+
+ ```ruby
+ client = AiClient.new('provider/model_name', api_key: ENV['OPENROUTER_API_KEY'])
+ ```
+
+ This way, if you are using `AiClient` inside of a Rails application, you can retrieve your access token from a secrets file.
+
  ## Usage
 
  Basic usage:
@@ -45,6 +86,13 @@ c1 = AiClient.new('nomic-embeddings-text')
  c2 = AiClient.new('gpt-4o-mini')
  ```
 
+ You can also use the `provider:` parameter in the event that the model you want to use is available through multiple providers, or AiClient cannot automatically associate the model name with a provider.
+
+ ```ruby
+ AI = AiClient.new('nomic-embed-text', provider: :ollama)
+ ```
+
  ### Configuration
 
  There are three levels of configuration, each inheriting from the level above. The following sections
@@ -136,6 +184,21 @@ TODO: list all providers supported and their envar
 
  TODO: document the options like `provider: :ollama`
 
+ ## Best ?? Practices
+
+ If you are going to be using one model for multiple purposes in different parts of your application, you can assign the instance of `AiClient` to a constant so that the same client can be used everywhere.
+
+ ```ruby
+ AI = AiClient.new 'gpt-4o'
+ ...
+ AI.chat "do something with this #{stuff}"
+ ...
+ AI.speak "warning Will Robinson! #{bad_things_happened}"
+ ...
+ ```
+
+ Using a constant for the instance allows you to reference the same client instance inside any method throughout your application. Of course, you are not limited to a single instance. You could assign multiple instances for different models/providers. For example, you could have `AI` for your primary client and `AIbackup` for a fallback client in case you have a problem on the primary; or maybe `Vectorizer` as a client name tied to a model specializing in embedding vectorization.
+
  ## Extensions for OmniAI
 
  The AiClient makes use of extensions to the OmniAI gem that define
@@ -145,6 +208,12 @@ additional providers and protocols.
  2. **OmniAI::LocalAI** which also wraps the OmniAI::OpenAI class
  3. **OmniAI::OpenRouter** TODO: Still under development
 
+ ## OmniAI and OpenRouter
+
+ OmniAI is a Ruby gem that supports specific providers directly using a common-ish API. You incur costs directly from those providers for which you have individual API keys (aka access tokens). OpenRouter, on the other hand, is a web service that also establishes a common API for many providers and models; however, OpenRouter adds a small fee on top of the fee charged by those providers. You trade off cost for flexibility. With OpenRouter you only need one API key (OPEN_ROUTER_API_KEY) to access all of its supported services.
+
+ The advantage of AiClient is that you have the added flexibility to choose, on a client-by-client basis, where you want your model to be processed. You get free local processing through Ollama and LocalAI, less costly direct access to some providers via OmniAI, and slightly more costly wide-spread access via OpenRouter.
+
  ## Contributing
 
  I can sure use your help. This industry is moving faster than I can keep up with. If you have a bug fix or new feature idea then have at it. Send me a pull request so we all can benefit from your efforts.
data/examples/speak.rb CHANGED
@@ -24,7 +24,7 @@ end
  title "Default Configuration Text-to-Speech"
 
  clients.each do |c|
- puts "\nModel: #{c.model} (#{c.model_type}) Provider: #{c.provider}"
+ puts "\nModel: #{c.model} Provider: #{c.provider}"
  text = "Text to speach example using the #{c.model} by provider #{c.provider} with the default voice."
  result = c.speak(text)
  puts "Audio generated. Tempfile: #{result.path}"
data/examples/text.rb CHANGED
@@ -48,7 +48,7 @@ end
  title "Default Configuration Response to 'hello'"
 
  clients.each do |c|
- puts "\nModel: #{c.model} (#{c.model_type}) Provider: #{c.provider}"
+ puts "\nModel: #{c.model} Provider: #{c.provider}"
  begin
  response = c.chat('hello')
  puts response
@@ -73,7 +73,7 @@ puts
  title "Raw Configuration Response to 'hello'"
 
  raw_clients.each do |c|
- puts "\nModel: #{c.model} (#{c.model_type}) Provider: #{c.provider}"
+ puts "\nModel: #{c.model} Provider: #{c.provider}"
  begin
  result = c.chat('hello')
  puts result.pretty_inspect
data/examples/transcribe.rb CHANGED
@@ -21,7 +21,7 @@ title "Default Configuration Speech-to-Text"
 
 
  clients.each do |c|
- puts "\nModel: #{c.model} (#{c.model_type}) Provider: #{c.provider}"
+ puts "\nModel: #{c.model} Provider: #{c.provider}"
  result = c.transcribe(audio_file)
  puts "Transcription: #{result.pretty_inspect}"
  end
data/lib/ai_client/chat.rb ADDED
@@ -0,0 +1,24 @@
+ # lib/ai_client/chat.rb
+
+ class AiClient
+
+ ######################################
+ # OmniAI Params
+ # model: @model [String] optional
+ # format: @format [Symbol] optional :text or :json
+ # stream: @stream [Proc, nil] optional
+ # tools: @tools [Array<OmniAI::Tool>] optional
+ # temperature: @temperature [Float, nil] optional
+
+ def chat(messages, **params)
+ result = call_with_middlewares(:chat_without_middlewares, messages, **params)
+ @last_response = result
+ raw? ? result : content
+ end
+
+
+ def chat_without_middlewares(messages, **params)
+ @client.chat(messages, model: @model, **params)
+ end
+
+ end
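The `chat` method above threads each request through `call_with_middlewares` before it reaches the underlying OmniAI client. A minimal standalone sketch of that wrap-the-next-step pattern (the helper and middleware here are illustrative, not AiClient's actual internals):

```ruby
# Minimal sketch of a middleware chain: each middleware receives the next
# step and the arguments, and may alter either the args or the result.
def call_with_middlewares(middlewares, *args, &final_step)
  chain = middlewares.reverse.reduce(final_step) do |next_step, middleware|
    ->(*a) { middleware.call(next_step, *a) }
  end
  chain.call(*args)
end

# A toy "logging" middleware that tags the result on the way back out.
logging = ->(next_step, *args) { next_step.call(*args) + ' (logged)' }

result = call_with_middlewares([logging], 'hello') { |msg| "echo: #{msg}" }
```

The reverse/reduce builds the chain so that the first middleware in the list is the outermost wrapper, which is the usual convention.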
data/lib/ai_client/config.yml CHANGED
@@ -21,13 +21,9 @@
  :providers: {}
  :provider_patterns:
  :anthropic: !ruby/regexp /^claude/i
- :openai: !ruby/regexp /^(gpt|davinci|curie|babbage|ada|whisper|tts|dall-e)/i
- :google: !ruby/regexp /^(gemini|palm)/i
- :mistral: !ruby/regexp /^(mistral|codestral)/i
+ :openai: !ruby/regexp /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i
+ :google: !ruby/regexp /^(gemini|gemma|palm)/i
+ :mistral: !ruby/regexp /^(mistral|codestral|mixtral)/i
  :localai: !ruby/regexp /^local-/i
  :ollama: !ruby/regexp /(llama|nomic)/i
- :model_types:
- :text_to_text: !ruby/regexp /^(nomic|gpt|davinci|curie|babbage|ada|claude|gemini|palm|command|generate|j2-|mistral|codestral)/i
- :speech_to_text: !ruby/regexp /^whisper/i
- :text_to_speech: !ruby/regexp /^tts/i
- :text_to_image: !ruby/regexp /^dall-e/i
+ :open_router: !ruby/regexp /\//
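The new `:open_router` pattern means any model name containing a slash falls through to OpenRouter; because lookup takes the first matching pattern, hash order matters. A standalone sketch of the resolution logic, with the patterns copied from the diff above:

```ruby
# Standalone sketch of how the updated provider_patterns resolve a model
# name to a provider; the first matching pattern wins, so order matters.
PROVIDER_PATTERNS = {
  anthropic:   /^claude/i,
  openai:      /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
  google:      /^(gemini|gemma|palm)/i,
  mistral:     /^(mistral|codestral|mixtral)/i,
  localai:     /^local-/i,
  ollama:      /(llama|nomic)/i,
  open_router: /\//   # any "provider/model" style name falls through to here
}.freeze

def determine_provider(model)
  PROVIDER_PATTERNS.find { |_provider, pattern| model.match?(pattern) }&.first or
    raise(ArgumentError, "Unsupported model: #{model}")
end
```

For example, `'anthropic/claude-2.1'` resolves to `:open_router` rather than `:anthropic` because the anchored `/^claude/i` pattern does not match the prefixed name.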
data/lib/ai_client/configuration.rb CHANGED
@@ -65,6 +65,18 @@
  # configuration you can do this.
  # client = AiClient.new('your_model')
  # client.config = AiClient::Config.load('path/to/file.yml')
+ #
+ # OmniAI Configuration Items
+ # OmniAI::OpenAI
+ # config.api_key = '...'
+ # config.host = 'http://localhost:8080'
+ # config.logger = Logger.new(STDOUT)
+ # config.timeout = 15
+ # config.chat_options = { ... }
+ # config.transcribe_options = { ... }
+ # config.speak_options = { ... }
+ #
+
 
 
  require 'hashie'
@@ -90,6 +102,14 @@ class AiClient
  # end
  # end
 
+
+ def save(filepath=ENV['HOME']+'/aiclient_config.yml')
+ filepath = Pathname.new(filepath) unless filepath.is_a? Pathname
+
+ filepath.write(YAML.dump(to_hash))
+ end
+
+
  class << self
  def load(filepath=DEFAULT_CONFIG_FILEPATH)
  filepath = Pathname.new(filepath) unless Pathname == filepath.class
@@ -119,19 +139,15 @@ class AiClient
  providers: {},
  provider_patterns: {
  anthropic: /^claude/i,
- openai: /^(gpt|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
- google: /^(gemini|palm)/i,
- mistral: /^(mistral|codestral)/i,
+ openai: /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
+ google: /^(gemini|gemma|palm)/i,
+ mistral: /^(mistral|codestral|mixtral)/i,
  localai: /^local-/i,
- ollama: /(llama|nomic)/i
- },
- model_types: {
- text_to_text: /^(nomic|gpt|davinci|curie|babbage|ada|claude|gemini|palm|command|generate|j2-|mistral|codestral)/i,
- speech_to_text: /^whisper/i,
- text_to_speech: /^tts/i,
- text_to_image: /^dall-e/i
+ ollama: /(llama|nomic)/i,
+ open_router: /\//
  }
  )
+
  @class_config = @default_config.dup
  end
  end
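The new `save` method dumps the config to YAML. Because the config embeds Regexp objects, reading the file back needs a loader that permits `!ruby/regexp` tags; Psych's safe loader rejects them. A hypothetical round-trip sketch (the file name and config shape here are illustrative):

```ruby
require 'yaml'
require 'tempfile'

# A config-like Hash containing a Regexp, mirroring provider_patterns.
config = { provider_patterns: { anthropic: /^claude/i } }

file = Tempfile.new(['aiclient_config', '.yml'])
file.write(YAML.dump(config))   # emits !ruby/regexp tags, as in config.yml
file.close

# Psych's safe loader rejects !ruby/regexp, so a trusted-input unsafe
# load is needed to round-trip the patterns.
loaded = YAML.unsafe_load(File.read(file.path))
```

`unsafe_load` is appropriate only because the file is one this process wrote itself; never use it on untrusted YAML.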
data/lib/ai_client/embed.rb ADDED
@@ -0,0 +1,21 @@
+ # lib/ai_client/embed.rb
+
+ class AiClient
+
+ ######################################
+ # OmniAI Params
+ # model [String] required
+ #
+
+ def embed(input, **params)
+ @client.embed(input, model: @model, **params)
+ end
+
+ def batch_embed(inputs, batch_size: 100, **params)
+ inputs.each_slice(batch_size).flat_map do |batch|
+ sleep 1 # DEBUG rate limits being exceeded
+ embed(batch, **params)
+ end
+ end
+
+ end
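`batch_embed` slices the inputs into batches and flattens the per-batch results back into one array; the `sleep 1` is a crude rate-limit guard. The slicing logic in isolation, with the provider call stubbed out and the sleep omitted:

```ruby
# Standalone sketch of the slicing/flattening in batch_embed, with the
# provider call stubbed out as a block (and the rate-limit sleep omitted).
def batch_embed(inputs, batch_size: 100, &embed_batch)
  inputs.each_slice(batch_size).flat_map { |batch| embed_batch.call(batch) }
end

# Stub "embedding": one fake vector value (the string length) per input.
vectors = batch_embed(%w[alpha beta gamma delta], batch_size: 2) do |batch|
  batch.map(&:length)
end
```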
data/lib/ai_client/speak.rb ADDED
@@ -0,0 +1,26 @@
+ # lib/ai_client/speak.rb
+
+ class AiClient
+
+ ######################################
+ # OmniAI Params
+ # input [String] required
+ # model [String] required
+ # voice [String] required
+ # speed [Float] optional
+ # format [String] optional (default "aac")
+ # aac mp3 flac opus pcm wav
+ #
+ # @yield [output] optional
+ #
+ # @return [Tempfile]
+
+ def speak(text, **params)
+ call_with_middlewares(:speak_without_middlewares, text, **params)
+ end
+
+ def speak_without_middlewares(text, **params)
+ @client.speak(text, model: @model, **params)
+ end
+
+ end
data/lib/ai_client/transcribe.rb ADDED
@@ -0,0 +1,21 @@
+ # lib/ai_client/transcribe.rb
+
+ class AiClient
+
+ ######################################
+ # OmniAI Params
+ # model [String]
+ # language [String, nil] optional
+ # prompt [String, nil] optional
+ # format [Symbol] :text, :srt, :vtt, or :json (default)
+ # temperature [Float, nil] optional
+
+ def transcribe(audio, format: nil, **params)
+ call_with_middlewares(:transcribe_without_middlewares, audio, format: format, **params)
+ end
+
+ def transcribe_without_middlewares(audio, format: nil, **params)
+ @client.transcribe(audio, model: @model, format: format, **params)
+ end
+
+ end
data/lib/ai_client/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  class AiClient
- VERSION = "0.2.1"
+ VERSION = "0.2.2"
  end
data/lib/ai_client.rb CHANGED
@@ -13,8 +13,17 @@ require 'omniai/anthropic'
  require 'omniai/google'
  require 'omniai/mistral'
  require 'omniai/openai'
- require_relative 'extensions/omniai-ollama'
+
+ require 'open_router'
+
  require_relative 'extensions/omniai-localai'
+ require_relative 'extensions/omniai-ollama'
+ require_relative 'extensions/omniai-open_router'
+
+ require_relative 'ai_client/chat'
+ require_relative 'ai_client/embed'
+ require_relative 'ai_client/speak'
+ require_relative 'ai_client/transcribe'
 
  require_relative 'ai_client/configuration'
  require_relative 'ai_client/middleware'
@@ -28,13 +37,41 @@ require_relative 'ai_client/version'
  # AiClient.use(LoggingMiddleware.new(AiClient.configuration.logger))
  #
 
-
  class AiClient
 
+ # Define the refinement for Hash
+ module HashRefinement
+ refine Hash do
+ def tunnel(target_key)
+ queue = [self] # Initialize the queue with the current hash
+
+ until queue.empty?
+ current = queue.shift # Dequeue the front hash
+
+ # Check if the current hash contains the target key
+ return current[target_key] if current.key?(target_key)
+
+ # Enqueue sub-hashes and arrays to the queue for further searching
+ current.each_value do |value|
+ case value
+ when Hash
+ queue << value
+ when Array
+ queue.concat(value.select { |v| v.is_a?(Hash) }) # Add sub-hashes from the array
+ end
+ end
+ end
+
+ nil # Return nil if the key is not found
+ end
+ end
+ end
+
+ using HashRefinement
+
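The `tunnel` refinement above does a breadth-first search through nested Hashes (and Hashes inside Arrays) for the first occurrence of a key, which is what lets `content` work across differing provider response shapes. The same logic as a plain method, with the refinement machinery dropped for brevity:

```ruby
# Plain-method version of the `tunnel` refinement: breadth-first search
# through nested Hashes (and Hashes inside Arrays) for a target key.
def tunnel(hash, target_key)
  queue = [hash]
  until queue.empty?
    current = queue.shift
    return current[target_key] if current.key?(target_key)
    current.each_value do |value|
      case value
      when Hash  then queue << value
      when Array then queue.concat(value.select { |v| v.is_a?(Hash) })
      end
    end
  end
  nil # key not found anywhere in the structure
end

# An OpenAI-style response body: 'content' is buried three levels deep.
response = { 'choices' => [{ 'message' => { 'content' => 'Hello!' } }] }
found = tunnel(response, 'content')
```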
  attr_reader :client, # OmniAI's client instance
  :provider, # [Symbol]
  :model, # [String]
- :model_type, # [Symbol]
  :logger,
  :last_response,
  :timeout,
@@ -71,7 +108,6 @@ class AiClient
  explicit_provider = options.fetch(:provider, config.provider)
 
  @provider = validate_provider(explicit_provider) || determine_provider(model)
- @model_type = determine_model_type(model)
 
  provider_config = @config.providers[@provider] || {}
 
@@ -86,8 +122,6 @@ class AiClient
  @last_response = nil
  end
 
-
-
  def response = last_response
  def raw? = config.return_raw
 
@@ -95,73 +129,33 @@
  config.return_raw = value
  end
 
-
-
- ######################################
- def chat(messages, **params)
- result = call_with_middlewares(:chat_without_middlewares, messages, **params)
- @last_response = result
- raw? ? result : content
- end
-
-
- def chat_without_middlewares(messages, **params)
- @client.chat(messages, model: @model, **params)
- end
-
- ######################################
- def transcribe(audio, format: nil, **params)
- call_with_middlewares(:transcribe_without_middlewares, audio, format: format, **params)
- end
-
- def transcribe_without_middlewares(audio, format: nil, **params)
- @client.transcribe(audio, model: @model, format: format, **params)
- end
-
- ######################################
- def speak(text, **params)
- call_with_middlewares(:speak_without_middlewares, text, **params)
- end
-
- def speak_without_middlewares(text, **params)
- @client.speak(text, model: @model, **params)
- end
-
-
- ######################################
- def embed(input, **params)
- @client.embed(input, model: @model, **params)
- end
-
- def batch_embed(inputs, batch_size: 100, **params)
- inputs.each_slice(batch_size).flat_map do |batch|
- sleep 1 # DEBUG rate limits being exceeded
- embed(batch, **params)
- end
- end
-
- ######################################
- ## Utilities
-
  def content
  case @provider
  when :openai, :localai, :ollama
- last_response.data.dig('choices', 0, 'message', 'content')
+ # last_response.data.dig('choices', 0, 'message', 'content')
+ last_response.data.tunnel 'content'
+
  when :anthropic
- last_response.data.dig('content',0,'text')
+ # last_response.data.dig('content',0,'text')
+ last_response.data.tunnel 'text'
+
  when :google
- last_response.data.dig('candidates', 0, 'content', 'parts', 0, 'text')
+ # last_response.data.dig('candidates', 0, 'content', 'parts', 0, 'text')
+ last_response.data.tunnel 'text'
+
  when :mistral
- last_response.data.dig('choices', 0, 'message', 'content')
+ # last_response.data.dig('choices', 0, 'message', 'content')
+ last_response.data.tunnel 'content'
+
+ when :open_router
+ last_response.data.tunnel 'content'
+
  else
  raise NotImplementedError, "Content extraction not implemented for provider: #{@provider}"
  end
  end
  alias_method :text, :content
 
- ##############################################
- ## Public Class Methods
-
  def method_missing(method_name, *args, &block)
  if @client.respond_to?(method_name)
  result = @client.send(method_name, *args, &block)
@@ -205,16 +199,25 @@
  case provider
  when :openai
  OmniAI::OpenAI::Client.new(**client_options)
+
  when :anthropic
  OmniAI::Anthropic::Client.new(**client_options)
+
  when :google
  OmniAI::Google::Client.new(**client_options)
+
  when :mistral
  OmniAI::Mistral::Client.new(**client_options)
+
  when :ollama
  OmniAI::Ollama::Client.new(**client_options)
+
  when :localai
  OmniAI::LocalAI::Client.new(**client_options)
+
+ when :open_router
+ OmniAI::OpenRouter::Client.new(**client_options)
+
  else
  raise ArgumentError, "Unsupported provider: #{@provider}"
  end
@@ -239,12 +242,6 @@
  config.provider_patterns.find { |provider, pattern| model.match?(pattern) }&.first ||
  raise(ArgumentError, "Unsupported model: #{model}")
  end
-
-
- def determine_model_type(model)
- config.model_types.find { |type, pattern| model.match?(pattern) }&.first ||
- raise(ArgumentError, "Unable to determine model type for: #{model}")
- end
  end
 
 
data/lib/extensions/omniai-open_router.rb ADDED
@@ -0,0 +1,92 @@
+ # lib/extensions/omniai-open_router.rb
+ # frozen_string_literal: true
+
+ require 'omniai'
+ require 'omniai/openai'
+
+ module OmniAI
+
+ # Create an alias for OmniAI::OpenAI module
+ module OpenRouter
+ extend OmniAI::OpenAI
+
+ # Alias classes from OmniAI::OpenAI
+ class Client < OmniAI::OpenAI::Client
+ def initialize(**options)
+ options[:host] = 'https://openrouter.ai/api/v1' unless options.has_key?(:host)
+ super(**options)
+ end
+
+ def self.openrouter
+ OmniAI::OpenRouter::Client
+ end
+
+ def self.open_router
+ OmniAI::OpenRouter::Client
+ end
+
+ def self.find(provider:, **)
+ return OmniAI.open_router.new(**) if :open_router == provider
+
+ super(provider: provider.to_s, **)
+ end
+ end
+
+ Chat = OmniAI::OpenAI::Chat
+
+ class Chat
+ def path
+ "/api/v1/chat/completions"
+ end
+ end
+
+ Config = OmniAI::OpenAI::Config
+
+ # Alias the Thread class and its nested classes
+ Thread = OmniAI::OpenAI::Thread
+ Thread::Annotation = OmniAI::OpenAI::Thread::Annotation
+ Thread::Attachment = OmniAI::OpenAI::Thread::Attachment
+ Thread::Message = OmniAI::OpenAI::Thread::Message
+ Thread::Run = OmniAI::OpenAI::Thread::Run
+ end
+ end
+
+ ######################################################
+ ## Extend Capabilities Using OpenRouter
+ #
+ # TODO: cache the models db
+ # TODO: consider wrapping the models database in an ActiveModel
+ #
+ class AiClient
+ class << self
+ def orc_models
+ @orc_models ||= ORC.models if defined?(ORC)
+ end
+
+ def orc_model_names(provider=nil)
+ if provider.nil?
+ orc_models.map{|e| e['id']}
+ else
+ orc_models
+ .map{|e| e['id']}
+ .select{|name| name.start_with? provider.to_s.downcase}
+ .map{|e| e.split('/')[1]}
+ end
+ end
+
+ def orc_model_details(model)
+ orc_models.select{|e| e['id'].include?(model)}
+ end
+ end
+ end
+
+ if ENV.fetch('OPEN_ROUTER_API_KEY', nil)
+ OpenRouter.configure do |config|
+ config.access_token = ENV.fetch('OPEN_ROUTER_API_KEY', nil)
+ end
+
+ # Use a default provider/model
+ AiClient::ORC = OpenRouter::Client.new
+ end
+
+
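The extension above reuses `OmniAI::OpenAI` wholesale and only overrides the default host, since OpenRouter exposes an OpenAI-compatible endpoint. That subclass-and-default-an-option pattern in isolation (class names and hosts here are a toy illustration, not the real OmniAI classes):

```ruby
# Isolated sketch of the pattern used by OmniAI::OpenRouter::Client:
# subclass a client and default the :host option unless the caller set one.
class BaseClient
  attr_reader :host

  def initialize(host: 'https://api.openai.com')
    @host = host
  end
end

class RouterClient < BaseClient
  def initialize(**options)
    options[:host] = 'https://openrouter.ai/api/v1' unless options.key?(:host)
    super(**options)
  end
end
```

Because the default is injected before `super`, every inherited method sees the OpenRouter host while an explicit `host:` still wins.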
data/lib/extensions/open_router.md ADDED
@@ -0,0 +1,97 @@
+ # Notes on OpenRouter
+
+ OpenRouter is a web service that has a common API to many
+ back-end LLM processors. Its goal is basically the same as the
+ OmniAI gem - provide the flexibility of using multiple models
+ processed by multiple providers.
+
+ ```ruby
+ OpenRouter.configure do |config|
+ config.access_token = ENV.fetch('OPEN_ROUTER_API_KEY', nil)
+ end
+
+ # Use a default provider/model
+ AI = OpenRouter::Client.new
+
+ # Returns an Array of Hash for supported
+ # models/providers
+ Models = AI.models
+ ```
+
+ Models with a "/" are targeted to OpenRouter;
+ before the "/" is the provider, after it is the model name.
+
+ Will need to add this entry to the AiClient::Config `provider_patterns` Hash:
+
+ ```ruby
+ open_router: /\//, # /(.*)\/(.*)/ provider / model name
+ ```
+
+ Models can be an Array of Strings. The first is the primary while
+ the rest are fallbacks in case the one before fails.
+
+ ```ruby
+ {
+ "models": ["anthropic/claude-2.1", "gryphe/mythomax-l2-13b"],
+ "route": "fallback",
+ ... // Other params
+ }
+ ```
+
+ You can have OpenRouter send your prompt to the best
+ provider/model for the prompt like this:
+
+ ```ruby
+ require "open_router"
+
+ OpenRouter.configure do |config|
+ config.access_token = ENV["ACCESS_TOKEN"]
+ config.site_name = "YOUR_APP_NAME"
+ config.site_url = "YOUR_SITE_URL"
+ end
+
+ OpenRouter::Client.new.complete(
+ model: "openrouter/auto",
+ messages: [
+ {
+ "role": "user",
+ "content": "What is the meaning of life?"
+ }
+ ]
+ ).then do |response|
+ puts response.dig("choices", 0, "message", "content")
+ end
+ ```
+
+ OpenRouter can also support OpenAI's API by using this
+ base_url: "https://openrouter.ai/api/v1",
+
+ Request Format Documentation
+ https://openrouter.ai/docs/requests
+
+ Simple Quick Start ...
+
+ ```ruby
+ OpenRouter::Client.new.complete(
+ model: "openai/gpt-3.5-turbo",
+ messages: [
+ {
+ "role": "user",
+ "content": "What is the meaning of life?"
+ }
+ ]
+ ).then do |response|
+ puts response.dig("choices", 0, "message", "content")
+ end
+ ```
+
+ ## Design Approaches
+
+ There are at least two different approaches to
+ integrate the OpenRouter capability: 1) use the open_router gem
+ and forget about using the same common-ish
+ API established by OmniAI; or 2) take advantage
+ of OpenRouter's OpenAI-compatible API and do for it
+ the same thing that was done for Ollama and LocalAI.
+
+
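The notes above describe OpenRouter's "fallback" routing: the first model in the array is primary and the rest are tried in order when a request fails. OpenRouter does this server-side; a toy client-side simulation of the same behavior (the model names and failure are illustrative):

```ruby
# Toy simulation of fallback routing: try each model in order, moving on
# when a request fails, as OpenRouter does with "route": "fallback".
def complete_with_fallback(models)
  failures = []
  models.each do |model|
    begin
      return yield(model)
    rescue StandardError => e
      failures << "#{model}: #{e.message}"
    end
  end
  raise "all models failed: #{failures.join('; ')}"
end

answer = complete_with_fallback(['anthropic/claude-2.1', 'gryphe/mythomax-l2-13b']) do |model|
  raise 'rate limited' if model == 'anthropic/claude-2.1' # simulate a failure
  "answered by #{model}"
end
```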
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: ai_client
  version: !ruby/object:Gem::Version
- version: 0.2.1
+ version: 0.2.2
  platform: ruby
  authors:
  - Dewayne VanHoozer
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2024-10-05 00:00:00.000000000 Z
+ date: 2024-10-08 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: hashie
@@ -94,6 +94,20 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
+ - !ruby/object:Gem::Dependency
+ name: open_router
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
  - !ruby/object:Gem::Dependency
  name: amazing_print
  requirement: !ruby/object:Gem::Requirement
@@ -150,18 +164,15 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
- description: "`ai_client` is a versatile Ruby gem that serves as a generic client
- \nfor interacting with various AI service providers through a unified \nAPI. Designed
- to simplify the integration of large language models \n(LLMs) into applications,
- `ai_client` allows developers to create \ninstances using just the model name, greatly
- reducing configuration \noverhead. With built-in support for popular AI providers—including
- \nOpenAI, Anthropic, Google, Mistral, LocalAI and Ollama—the gem abstracts the \ncomplexities
- of API interactions, offering methods for tasks such \nas chatting, transcription,
- speech synthesis, and embedding. The \nmiddleware architecture enables customizable
- processing of requests \nand responses, making it easy to implement features like
- logging and \nretry logic. Seamlessly integrated with the `OmniAI` framework, \n`ai_client`
- empowers developers to leverage cutting-edge AI capabilities \nwithout vendor lock-in,
- making it an essential tool for modern AI-driven \napplications.\n"
+ description: "`ai_client` is a versatile Ruby gem that offers a seamless interface
+ \nfor integrating a wide range of AI service providers through a single, \nunified
+ API. With `ai_client`, you can simply specify the model name \nand quickly leverage
+ advanced features such as chatting, transcription, \nand speech synthesis. The gem
+ supports prominent providers, including \nOpenAI, Anthropic, Google, Mistral, LocalAI,
+ Ollama, and OpenRouter, \nmaking it easier than ever to build innovative applications.
+ Plus, its \nflexible middleware architecture allows for easy customization of request
+ \nand response handling, ensuring that you can harness the power of AI \nwithout
+ vendor lock-in.\n"
  email:
  - dvanhoozer@gmail.com
  executables: []
@@ -182,14 +193,20 @@ files:
  - examples/text.rb
  - examples/transcribe.rb
  - lib/ai_client.rb
+ - lib/ai_client/chat.rb
  - lib/ai_client/config.yml
  - lib/ai_client/configuration.rb
+ - lib/ai_client/embed.rb
  - lib/ai_client/logger_middleware.rb
  - lib/ai_client/middleware.rb
  - lib/ai_client/retry_middleware.rb
+ - lib/ai_client/speak.rb
+ - lib/ai_client/transcribe.rb
  - lib/ai_client/version.rb
  - lib/extensions/omniai-localai.rb
  - lib/extensions/omniai-ollama.rb
+ - lib/extensions/omniai-open_router.rb
+ - lib/extensions/open_router.md
  - sig/ai_client.rbs
  - the_ollama_model_problem.md
  homepage: https://github.com/MadBomber/ai_client
@@ -215,7 +232,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.5.20
+ rubygems_version: 3.5.21
  signing_key:
  specification_version: 4
  summary: A generic AI Client for many providers