ai_client 0.2.0 → 0.2.2

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 107e9d57f175adc72178f48365a833a4dab7fbe71088169774429f43624d7d9c
-  data.tar.gz: d308008540c4fbf309235c31d60b1a98b14a608a0f69d7ee1bac6911f34dad47
+  metadata.gz: 07a66886ce84dd916574f7a38e292626124ed1a048da92eec33cfdc0e66435e4
+  data.tar.gz: dc9c6734ae4a6a479ed8f6b76fc8edea31dbb588f67ae801b915a067346c5735
 SHA512:
-  metadata.gz: b001c640758c846f2372354660d4a4f4c7ef4a6c4671bbd743fcc713d543faff1fe54ff2e341285ec57dbaf284db3e3a18a50cbd6b420fd01ac911112a93ae8a
-  data.tar.gz: 0055623e374bb4cf111a234712e9ac1441d6ddb541f2f7ebadc604c255e27bdd6ad14d1b2a81d56cb6e2797e01f3d3398eb4b4b3e6c7d04802dadf620ecb0390
+  metadata.gz: ced4f6a3834132d0c1d127ad3fcd931e9600d748a28d6593fc9e382fb6bb98b9fd31de56d44d5a216c5f0bb8394c360f04779995fd2345ae1043e137cd994091
+  data.tar.gz: be77228ac195d1f3de41f1ec0e82f2d2e2aa0f280e51354156c00b9c40bdb466581c87b4ba73b22c57fa86300df9555b7a2230854ac6ff93e365028b946275ba
data/CHANGELOG.md CHANGED
@@ -1,6 +1,14 @@
 ## [Unreleased]
 
 ## Released
+### [0.2.2] - 2024-10-07
+- Added support for open_router.ai with extensions/omniai-open_router.rb
+- Removed the `model_type` object
+- Added ability to dump a config to a YAML file
+
+### [0.2.1] - 2024-10-05
+- Added support for YAML configuration files
+
 ### [0.2.0] - 2024-10-04
 - Configuration is more robust. Still room for improvement.
 
data/README.md CHANGED
@@ -1,16 +1,35 @@
 # AiClient
 
-First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com/) for his gem [OmniAI](https://github.com/ksylvest/omniai) upon which this effort depends.
+First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com/) for his gem [OmniAI](https://github.com/ksylvest/omniai) and to [Olympia](https://olympia.chat/) for their [open_router gem](https://github.com/OlympiaAI/open_router), upon which this effort depends.
 
-**This is a work in progress** Its implemented as a class rather than the typical module for most gems. The `AiClient::Configuration` class is a little first draft-ish. I'm looking to bulk it up a lot. At this point I think some of the current tests are failing; but, over all `AiClien` is working. I've used early versions of it in several projects.
+See the [change log](CHANGELOG.md) for recent modifications.
 
 ## Summary
 
-`ai_client` is a versatile Ruby gem that serves as a generic client for interacting with various AI service providers through a unified API provided by Kevin's gem `OmniAI`. The `AiClient` class is designed to simplify the integration of large language models (LLMs) into applications. `AiClient` allows developers to create instances using just the model name, greatly reducing configuration overhead.
+Are you ready to supercharge your applications with cutting-edge AI capabilities?
+Introducing `ai_client`, the ultimate Ruby gem that provides a seamless interface
+for interacting with a multitude of AI service providers through a single,
+unified API.
 
-With built-in support for popular AI providers—including OpenAI, Anthropic, Google, Mistral, LocalAI and Ollama—the gem abstracts the complexities of API interactions, offering methods for tasks such as chatting, transcription, speech synthesis, and embedding.
+With `ai_client`, you can effortlessly integrate large language models (LLMs)
+into your projects—simply specify the model name and let the gem handle the
+rest! Say goodbye to tedious configuration and hello to rapid development.
+
+This gem comes packed with built-in support for leading AI providers, including
+OpenAI, Anthropic, Google, Mistral, LocalAI, and Ollama. Whether you need to
+implement chatbots, transcription services, speech synthesis, or embeddings,
+`ai_client` abstracts the complexities of API interactions, allowing you to focus
+on what truly matters: building amazing applications.
+
+Plus, with its flexible middleware architecture, you can easily customize request
+and response processing—implement logging, retry logic, and more with minimal effort.
+And thanks to its seamless integration with the `OmniAI` framework, you can leverage
+the latest AI advancements without worrying about vendor lock-in.
+
+Join the growing community of developers who are transforming their applications
+with `ai_client`. Install it today and unlock the full potential of AI in your
+projects!
 
-The middleware architecture enables customizable processing of requests and responses, making it easy to implement features like logging and retry logic. Seamlessly integrated with the `OmniAI` framework, `ai_client` empowers developers to leverage cutting-edge AI capabilities without vendor lock-in, making it an essential tool for modern AI-driven applications.
 
 ## Installation
 
@@ -26,6 +45,32 @@ If bundler is not being used to manage dependencies, install the gem by executing:
 gem install ai_client
 ```
 
+## Providers Supported
+
+To explicitly designate a provider to use with an AiClient instance,
+use the parameter `provider: :your_provider` with the Symbol for the supported
+provider you want to use with the model you specify. The following providers
+are supported by the OmniAI gem upon which AiClient depends, along with a few
+extensions.
+
+| Symbol | Envar API Key | Client Source |
+| --- | --- | --- |
+| :anthropic | [ANTHROPIC_API_KEY](https://www.anthropic.com/) | OmniAI |
+| :google | [GOOGLE_API_KEY](https://cloud.google.com/gemini) | OmniAI |
+| :localai | [LOCALAI_API_KEY](https://localai.io/) | AiClient Extension |
+| :mistral | [MISTRAL_API_KEY](https://mistral.ai/) | OmniAI |
+| :ollama | [OLLAMA_API_KEY](https://ollama.com/) | AiClient Extension |
+| :open_router | [OPEN_ROUTER_API_KEY](https://openrouter.ai/) | AiClient Extension |
+| :openai | [OPENAI_API_KEY](https://www.openai.com/) | OmniAI |
+
+In case you are using a different environment variable for your access token than the ones shown above, you can use the `api_key:` parameter.
+
+```ruby
+client = AiClient.new('provider/model_name', api_key: ENV['OPENROUTER_API_KEY'])
+```
+
+This way, if you are using `AiClient` inside of a Rails application, you can retrieve your access token from a secrets file.
+
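The `api_key:` override shown above boils down to "use the explicit key if given, otherwise fall back to the provider's conventional environment variable." A minimal sketch of that lookup (the `resolve_api_key` helper and the envar map are illustrative, not part of AiClient's API):

```ruby
# Conventional environment variable per provider symbol (subset of the table above).
DEFAULT_ENVAR = {
  anthropic:   'ANTHROPIC_API_KEY',
  openai:      'OPENAI_API_KEY',
  open_router: 'OPEN_ROUTER_API_KEY'
}.freeze

# An explicit api_key: wins; otherwise read the provider's conventional envar.
def resolve_api_key(provider, api_key: nil)
  api_key || ENV[DEFAULT_ENVAR.fetch(provider)]
end

resolve_api_key(:openai, api_key: 'sk-from-credentials')  # explicit key wins
```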
 ## Usage
 
 Basic usage:
@@ -41,30 +86,80 @@ c1 = AiClient.new('nomic-embeddings-text')
 c2 = AiClient.new('gpt-4o-mini')
 ```
 
+You can also use the `provider:` parameter in the event that the model you want to use is available through multiple providers, or when AiClient cannot automatically associate the model name with a provider.
+
+```ruby
+AI = AiClient.new('nomic-embed-text', provider: :ollama)
+```
+
 ### Configuration
 
-There is an internal hard-coded configuration default. That default is duppled into a class-level configuration which can be over-ridden with a class-level config block like this ...
+There are three levels of configuration, each inheriting from the level above. The following sections
+describe those configuration levels.
+
+#### Default Configuration
+
+The file [lib/ai_client/configuration.rb] hard-codes the default configuration. This is used to
+update the [lib/ai_client/config.yml] file during development. If you have
+some changes for this configuration, please send me a pull request so we
+can all benefit from your efforts.
+
+#### Class Configuration
+
+The class configuration is derived initially from the default configuration. It
+can be changed in three ways.
+
+1. Class Configuration Block
 
 ```ruby
-AiClient.configure do |config|
+AiClient.configuration do |config|
   config.some_item = some_value
+  ...
 end
 ```
 
-Every instance of the AiClient inherents the class-level configuration; however, the instance configuration can also be over-ridden also with a block like this ...
+2. Set by a Config File
+
+```ruby
+AiClient.class_config = AiClient::Config.load('path/to/file.yml')
+```
+
+3. Supplemented by a Config File
+
+```ruby
+AiClient.class_config.merge! AiClient::Config.load('path/to/file.yml')
+```
+
+#### Instance Configuration
+
+All instances have a configuration. Initially that configuration is the same
+as the class configuration; however, each instance can have its own separate
+configuration. For an instance, the class configuration can either be supplemented
+or completely overridden.
+
+1. Supplement from a Constructor Block
 
 ```ruby
 client = AiClient.new('super-ai-overlord-model') do |config|
   config.some_item = some_value
+  ...
 end
 ```
 
-But wait, there's more. You can also load a YAML file as a configuration of an instance like this ...
+2. Supplement from a YAML File
 
 ```ruby
 client = AiClient.new('baby-model', config: 'path/to/file.yml')
 ```
 
+3. Load a Complete Configuration from a YAML File
+
+```ruby
+client = AiClient.new('your-model')
+client.config = AiClient::Config.load('path/to/file.yml')
+```
+
 ### What Now?
 
 TODO: Document the methods and their options.
@@ -77,7 +172,7 @@ AI.embed(...)
 AI.batch_embed(...)
 ```
 
-TODO: see the [examples] directory.
+See the [examples directory](examples/README.md) for some ideas on how to use AiClient.
 
 ### System Environment Variables
 
@@ -89,6 +184,36 @@ TODO: list all providers supported and their envar
 
 TODO: document the options like `provider: :ollama`
 
+## Best ?? Practices
+
+If you are going to be using one model for multiple purposes in different parts of your application, you can assign the instance of `AiClient` to a constant so that the same client can be used everywhere.
+
+```ruby
+AI = AiClient.new 'gpt-4o'
+...
+AI.chat "do something with this #{stuff}"
+...
+AI.speak "warning Will Robinson! #{bad_things_happened}"
+...
+```
+
+Using a constant for the instance allows you to reference the same client instance inside any method throughout your application. Of course, it does not apply to only one instance. You could assign multiple instances for different models/providers. For example, you could have `AI` for your primary client and `AIbackup` for a fallback client in case you have a problem with the primary; or maybe `Vectorizer` as a client name tied to a model specializing in embedding vectorization.
+
+## Extensions for OmniAI
+
+The AiClient makes use of extensions to the OmniAI gem that define
+additional providers and protocols.
+
+1. **OmniAI::Ollama** which wraps the OmniAI::OpenAI class
+2. **OmniAI::LocalAI** which also wraps the OmniAI::OpenAI class
+3. **OmniAI::OpenRouter** TODO: Still under development
+
+## OmniAI and OpenRouter
+
+OmniAI is a Ruby gem that supports specific providers directly using a common-ish API. You incur costs directly from those providers for which you have individual API keys (aka access tokens). OpenRouter, on the other hand, is a web service that also establishes a common API for many providers and models; however, OpenRouter adds a small fee on top of the fee charged by those providers. You trade off cost for flexibility. With OpenRouter you only need one API key (OPEN_ROUTER_API_KEY) to access all of its supported services.
+
+The advantage of AiClient is that you have the added flexibility to choose, on a client-by-client basis, where you want your model to be processed. You get free local processing through Ollama and LocalAI. You get less costly direct access to some providers via OmniAI. You get slightly more costly widespread access via OpenRouter.
+
 ## Contributing
 
 I can sure use your help. This industry is moving faster than I can keep up with. If you have a bug fix or new feature idea then have at it. Send me a pull request so we all can benefit from your efforts.
data/examples/README.md ADDED
@@ -0,0 +1,13 @@
+# Examples
+
+| File | Content |
+| --- | --- |
+| Bethany Hamilton.m4a | Audio file used to demonstrate transcribe |
+| common.rb | Stuff used by each example |
+| embed.rb | Demonstrates using Ollama locally to vectorize text for embeddings |
+| speak.rb | Demonstrates using OpenAI's text-to-speech models |
+| text.rb | Demonstrates text-to-text transformers "chat" |
+| transcribe.rb | Uses OpenAI's audio-to-text model |
+
+Many of these example programs show both the raw response object as well as just the
+content from the response.
data/examples/speak.rb CHANGED
@@ -24,7 +24,7 @@ end
 title "Default Configuration Text-to-Speech"
 
 clients.each do |c|
-  puts "\nModel: #{c.model} (#{c.model_type})  Provider: #{c.provider}"
+  puts "\nModel: #{c.model}  Provider: #{c.provider}"
   text = "Text to speech example using the #{c.model} by provider #{c.provider} with the default voice."
   result = c.speak(text)
   puts "Audio generated. Tempfile: #{result.path}"
data/examples/text.rb CHANGED
@@ -48,7 +48,7 @@ end
 title "Default Configuration Response to 'hello'"
 
 clients.each do |c|
-  puts "\nModel: #{c.model} (#{c.model_type})  Provider: #{c.provider}"
+  puts "\nModel: #{c.model}  Provider: #{c.provider}"
   begin
     response = c.chat('hello')
     puts response
@@ -73,7 +73,7 @@ puts
 title "Raw Configuration Response to 'hello'"
 
 raw_clients.each do |c|
-  puts "\nModel: #{c.model} (#{c.model_type})  Provider: #{c.provider}"
+  puts "\nModel: #{c.model}  Provider: #{c.provider}"
   begin
     result = c.chat('hello')
     puts result.pretty_inspect
data/examples/transcribe.rb CHANGED
@@ -21,7 +21,7 @@ title "Default Configuration Speech-to-Text"
 
 
 clients.each do |c|
-  puts "\nModel: #{c.model} (#{c.model_type})  Provider: #{c.provider}"
+  puts "\nModel: #{c.model}  Provider: #{c.provider}"
   result = c.transcribe(audio_file)
   puts "Transcription: #{result.pretty_inspect}"
 end
data/lib/ai_client/chat.rb ADDED
@@ -0,0 +1,24 @@
+# lib/ai_client/chat.rb
+
+class AiClient
+
+  ######################################
+  # OmniAI Params
+  #   model:       @model  [String] optional
+  #   format:      @format [Symbol] optional :text or :json
+  #   stream:      @stream [Proc, nil] optional
+  #   tools:       @tools  [Array<OmniAI::Tool>] optional
+  #   temperature: @temperature [Float, nil] optional
+
+  def chat(messages, **params)
+    result = call_with_middlewares(:chat_without_middlewares, messages, **params)
+    @last_response = result
+    raw? ? result : content
+  end
+
+
+  def chat_without_middlewares(messages, **params)
+    @client.chat(messages, model: @model, **params)
+  end
+
+end
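The `chat` method above routes through `call_with_middlewares`, which is defined in `ai_client/middleware.rb` and is not part of this changeset. As a hedged sketch of the general pattern such a method implies, where each middleware wraps the next callable and the raw client call sits innermost:

```ruby
# Each middleware receives the next callable and returns a wrapped callable.
logging  = ->(nxt) { ->(msg) { "log(#{nxt.call(msg)})" } }  # hypothetical logging middleware
upcasing = ->(nxt) { ->(msg) { nxt.call(msg.upcase) } }     # hypothetical transform middleware

base  = ->(msg) { "reply:#{msg}" }  # stand-in for the raw @client.chat call
chain = [logging, upcasing].reverse.reduce(base) { |nxt, mw| mw.call(nxt) }

chain.call('hi')  # => "log(reply:HI)"
```

Building the chain with `reverse.reduce` keeps the first middleware in the list outermost, which is the usual Rack-style convention.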
data/lib/ai_client/config.yml ADDED
@@ -0,0 +1,29 @@
+---
+:logger: !ruby/object:Logger
+  level: 0
+  progname:
+  default_formatter: !ruby/object:Logger::Formatter
+    datetime_format:
+  formatter:
+  logdev: !ruby/object:Logger::LogDevice
+    shift_period_suffix:
+    shift_size:
+    shift_age:
+    filename:
+    dev: !ruby/object:IO {}
+    binmode: false
+    reraise_write_errors: []
+    mon_data: !ruby/object:Monitor {}
+    mon_data_owner_object_id: 920
+  level_override: {}
+:timeout:
+:return_raw: false
+:providers: {}
+:provider_patterns:
+  :anthropic: !ruby/regexp /^claude/i
+  :openai: !ruby/regexp /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i
+  :google: !ruby/regexp /^(gemini|gemma|palm)/i
+  :mistral: !ruby/regexp /^(mistral|codestral|mixtral)/i
+  :localai: !ruby/regexp /^local-/i
+  :ollama: !ruby/regexp /(llama|nomic)/i
+  :open_router: !ruby/regexp /\//
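The `provider_patterns` entries in the YAML above drive model-to-provider resolution: the first pattern that matches the model name wins. A standalone sketch of that matching (patterns copied from the YAML; `guess_provider` is an illustrative name, not AiClient's API):

```ruby
PROVIDER_PATTERNS = {
  anthropic:   /^claude/i,
  openai:      /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
  google:      /^(gemini|gemma|palm)/i,
  mistral:     /^(mistral|codestral|mixtral)/i,
  localai:     /^local-/i,
  ollama:      /(llama|nomic)/i,
  open_router: %r{/}  # OpenRouter models are named "provider/model"
}.freeze

# First matching pattern wins; nil when nothing matches.
def guess_provider(model)
  PROVIDER_PATTERNS.find { |_provider, pattern| model.match?(pattern) }&.first
end

guess_provider('gpt-4o-mini')      # => :openai
guess_provider('nomic-embed-text') # => :ollama
```

Note that hash insertion order matters: a name like `mistralai/mistral-nemo` matches `:mistral` before the catch-all `:open_router` slash pattern is tried.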
@@ -1,27 +1,93 @@
 # ai_client/configuration.rb
 #
-# Design Objective:
-#   AiClient.configure do |config|
-#     # global config items that over-ride the defaults
-#   end
+# AiClient and AiClient::Config
+#
+# The AiClient class provides a configurable client for interacting with various AI service providers.
+# It allows users to set global configurations and provider-specific settings via a block.
+#
+# There are three levels of configuration:
+#   * default_config .. the starting point
+#   * class_config .... for all instances
+#   * config .......... for an instance
+#
+# Class Configuration
+#   starts with the default configuration but can
+#   be changed in three different ways.
+#   1. Use the configuration block
+#        AiClient.configuration do |config|
+#          some_item = some_value
+#          ...
+#        end
+#
+#   2. Automatic YAML configuration file
+#      Set the system environment variable AI_CLIENT_CONFIG_FILE
+#      to an existing configuration file. The contents of that
+#      file will be automatically merged on top of the
+#      default configuration.
+#
+#   3. Manual YAML configuration file
+#      You can completely replace the class configuration
+#        AiClient.class_config = AiClient::Config.load('path/to/file.yml')
+#      You can supplement the existing class config
+#        AiClient.class_config.merge!(AiClient::Config.load('path/to/file.yml'))
+#
+# Instance Configuration
+#   AiClient is set up so that you can have multiple instances
+#   of clients, each using a different model / provider and having
+#   a different configuration. There are several ways you
+#   can manipulate an instance's configuration.
+#
+#   1. The default instance configuration inherits from
+#      the class configuration.
+#        client = AiClient.new('your_model')
+#      You can access the instance configuration using
+#        client.config.some_item
+#        client.config[:some_item]
+#        client.config['some_item']
+#      All three ways return the value for that configuration item.
+#      To change the value of an item, it's just as simple:
+#        client.config.some_item = some_value
+#        client.config[:some_item] = some_value
+#        client.config['some_item'] = some_value
+#   2. Instance constructor block
+#        client = AiClient.new('your_model') do |config|
+#          config.some_item = some_value
+#          ...
+#        end
+#
+#   3. Like the class configuration, you can replace or
+#      supplement an instance's configuration from a YAML file.
+#        client = AiClient.new('your_model', config: 'path/to/file.yml')
+#        client.config.merge!(AiClient::Config.load('path/to/file.yml'))
+#      Both of those examples supplement / override items in
+#      the class configuration to become the instance's
+#      configuration. To completely replace the instance's
+#      configuration you can do this:
+#        client = AiClient.new('your_model')
+#        client.config = AiClient::Config.load('path/to/file.yml')
+#
+# OmniAI Configuration Items
+#   OmniAI::OpenAI
+#     config.api_key = '...'
+#     config.host = 'http://localhost:8080'
+#     config.logger = Logger.new(STDOUT)
+#     config.timeout = 15
+#     config.chat_options = { ... }
+#     config.transcribe_options = { ... }
+#     config.speak_options = { ... }
 #
-# client = AiClient.new(...) do
-#   # client specific config items that over-ride the global config
-# end
+
+
 
 require 'hashie'
 require 'logger'
+require 'yaml'
+require 'pathname'
 
 class AiClient
-  # TODO: Use system environment variables
-  #       AI_CLIENT_CONFIG_FILE
-  #
-  # TODO: Config.load('path/to/some_file.yml')
-  #       @@default_config (on require from lib/config.yml)
-  #       @@config (if the envar exists ?? merge with default)
-  #       @config ... done
-
   class Config < Hashie::Mash
+    DEFAULT_CONFIG_FILEPATH = Pathname.new(__dir__) + 'config.yml'
+
     include Hashie::Extensions::Mash::PermissiveRespondTo
     include Hashie::Extensions::Mash::SymbolizeKeys
     include Hashie::Extensions::Mash::DefineAccessors
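The three configuration levels documented in the comment block above compose by successive merging: the class config starts as a copy of the defaults, and each instance config starts as a copy of the class config. A hedged sketch of that layering using plain hashes (AiClient's real `Config` is a `Hashie::Mash`, which behaves like this for flat keys):

```ruby
default_config = { timeout: nil, return_raw: false, providers: {} }

# Class level: a copy of the defaults, then user overrides applied on top.
class_config = default_config.merge(timeout: 15)

# Instance level: a copy of the class config, then per-client overrides.
instance_config = class_config.merge(return_raw: true)

instance_config  # => { timeout: 15, return_raw: true, providers: {} }
```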
@@ -36,48 +102,63 @@ class AiClient
   #     end
   #   end
 
-  end
 
+    def save(filepath=ENV['HOME']+'/aiclient_config.yml')
+      filepath = Pathname.new(filepath) unless filepath.is_a? Pathname
+
+      filepath.write(YAML.dump(to_hash))
+    end
 
-  # Class variables to hold default and current config
-  @@default_config = Config.new(
-    logger: Logger.new(STDOUT),
-    timeout: nil,
-    return_raw: false,
-    providers: {},
-    provider_patterns: {
-      anthropic: /^claude/i,
-      openai: /^(gpt|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
-      google: /^(gemini|palm)/i,
-      mistral: /^(mistral|codestral)/i,
-      localai: /^local-/i,
-      ollama: /(llama|nomic)/i
-    },
-    model_types: {
-      text_to_text: /^(nomic|gpt|davinci|curie|babbage|ada|claude|gemini|palm|command|generate|j2-|mistral|codestral)/i,
-      speech_to_text: /^whisper/i,
-      text_to_speech: /^tts/i,
-      text_to_image: /^dall-e/i
-    }
-  )
-
-  @@class_config = @@default_config.dup
+
+    class << self
+      def load(filepath=DEFAULT_CONFIG_FILEPATH)
+        filepath = Pathname.new(filepath) unless Pathname == filepath.class
+        if filepath.exist?
+          new(YAML.parse(filepath.read).to_ruby)
+        else
+          raise ArgumentError, "#{filepath} does not exist"
+        end
+      end
+    end
+  end
 
   class << self
+    attr_accessor :class_config, :default_config
+
     def configure(&block)
       yield(class_config)
     end
 
-    def class_config
-      @@class_config
-    end
-
-    def class_config=(value)
-      @@class_config = value
-    end
+    private
 
-    def default_config
-      @@default_config
+    def initialize_defaults
+      @default_config = Config.new(
+        logger: Logger.new(STDOUT),
+        timeout: nil,
+        return_raw: false,
+        providers: {},
+        provider_patterns: {
+          anthropic: /^claude/i,
+          openai: /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
+          google: /^(gemini|gemma|palm)/i,
+          mistral: /^(mistral|codestral|mixtral)/i,
+          localai: /^local-/i,
+          ollama: /(llama|nomic)/i,
+          open_router: /\//
+        }
+      )
+
+      @class_config = @default_config.dup
     end
   end
+
+  initialize_defaults
+end
+
+
+AiClient.default_config = AiClient::Config.load
+AiClient.class_config = AiClient.default_config.dup
+
+if config_file = ENV.fetch('AI_CLIENT_CONFIG_FILE', nil)
+  AiClient.class_config.merge!(AiClient::Config.load(config_file))
 end
data/lib/ai_client/embed.rb ADDED
@@ -0,0 +1,21 @@
+# lib/ai_client/embed.rb
+
+class AiClient
+
+  ######################################
+  # OmniAI Params
+  #   model [String] required
+  #
+
+  def embed(input, **params)
+    @client.embed(input, model: @model, **params)
+  end
+
+  def batch_embed(inputs, batch_size: 100, **params)
+    inputs.each_slice(batch_size).flat_map do |batch|
+      sleep 1 # DEBUG rate limits being exceeded
+      embed(batch, **params)
+    end
+  end
+
+end
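`batch_embed` above is just `each_slice` plus `flat_map`, with a sleep between batches as a crude rate limiter. The slicing behavior itself is plain Ruby:

```ruby
inputs = %w[a b c d e f g]

batch_sizes = []
results = inputs.each_slice(3).flat_map do |batch|
  batch_sizes << batch.size       # observe how the 7 inputs are sliced
  batch.map(&:upcase)             # stand-in for one embed call per batch
end

batch_sizes  # => [3, 3, 1]
results      # => ["A", "B", "C", "D", "E", "F", "G"]
```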
data/lib/ai_client/speak.rb ADDED
@@ -0,0 +1,26 @@
+# lib/ai_client/speak.rb
+
+class AiClient
+
+  ######################################
+  # OmniAI Params
+  #   input  [String] required
+  #   model  [String] required
+  #   voice  [String] required
+  #   speed  [Float] optional
+  #   format [String] optional (default "aac")
+  #     aac mp3 flac opus pcm wav
+  #
+  # @yield [output] optional
+  #
+  # @return [Tempfile]
+
+  def speak(text, **params)
+    call_with_middlewares(:speak_without_middlewares, text, **params)
+  end
+
+  def speak_without_middlewares(text, **params)
+    @client.speak(text, model: @model, **params)
+  end
+
+end
data/lib/ai_client/transcribe.rb ADDED
@@ -0,0 +1,21 @@
+# lib/ai_client/transcribe.rb
+
+class AiClient
+
+  ######################################
+  # OmniAI Params
+  #   model       [String]
+  #   language    [String, nil] optional
+  #   prompt      [String, nil] optional
+  #   format      [Symbol] :text, :srt, :vtt, or :json (default)
+  #   temperature [Float, nil] optional
+
+  def transcribe(audio, format: nil, **params)
+    call_with_middlewares(:transcribe_without_middlewares, audio, format: format, **params)
+  end
+
+  def transcribe_without_middlewares(audio, format: nil, **params)
+    @client.transcribe(audio, model: @model, format: format, **params)
+  end
+
+end
data/lib/ai_client/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 class AiClient
-  VERSION = "0.2.0"
+  VERSION = "0.2.2"
 end
data/lib/ai_client.rb CHANGED
@@ -13,8 +13,17 @@ require 'omniai/anthropic'
 require 'omniai/google'
 require 'omniai/mistral'
 require 'omniai/openai'
-require_relative 'extensions/omniai-ollama'
+
+require 'open_router'
+
 require_relative 'extensions/omniai-localai'
+require_relative 'extensions/omniai-ollama'
+require_relative 'extensions/omniai-open_router'
+
+require_relative 'ai_client/chat'
+require_relative 'ai_client/embed'
+require_relative 'ai_client/speak'
+require_relative 'ai_client/transcribe'
 
 require_relative 'ai_client/configuration'
 require_relative 'ai_client/middleware'
@@ -28,13 +37,41 @@ require_relative 'ai_client/version'
 #   AiClient.use(LoggingMiddleware.new(AiClient.configuration.logger))
 #
 
-
 class AiClient
 
+  # Define the refinement for Hash
+  module HashRefinement
+    refine Hash do
+      def tunnel(target_key)
+        queue = [self] # Initialize the queue with the current hash
+
+        until queue.empty?
+          current = queue.shift # Dequeue the front hash
+
+          # Check if the current hash contains the target key
+          return current[target_key] if current.key?(target_key)
+
+          # Enqueue sub-hashes and arrays to the queue for further searching
+          current.each_value do |value|
+            case value
+            when Hash
+              queue << value
+            when Array
+              queue.concat(value.select { |v| v.is_a?(Hash) }) # Add sub-hashes from the array
+            end
+          end
+        end
+
+        nil # Return nil if the key is not found
+      end
+    end
+  end
+
+  using HashRefinement
+
   attr_reader :client,        # OmniAI's client instance
               :provider,      # [Symbol]
               :model,         # [String]
-              :model_type,    # [Symbol]
               :logger,
               :last_response,
               :timeout,
@@ -71,7 +108,6 @@ class AiClient
     explicit_provider = options.fetch(:provider, config.provider)
 
     @provider = validate_provider(explicit_provider) || determine_provider(model)
-    @model_type = determine_model_type(model)
 
     provider_config = @config.providers[@provider] || {}
 
@@ -86,8 +122,6 @@ class AiClient
     @last_response = nil
   end
 
-
-
   def response = last_response
   def raw?     = config.return_raw
 
@@ -95,73 +129,33 @@ class AiClient
     config.return_raw = value
   end
 
-
-
-  ######################################
-  def chat(messages, **params)
-    result = call_with_middlewares(:chat_without_middlewares, messages, **params)
-    @last_response = result
-    raw? ? result : content
-  end
-
-  def chat_without_middlewares(messages, **params)
-    @client.chat(messages, model: @model, **params)
-  end
-
-  ######################################
-  def transcribe(audio, format: nil, **params)
-    call_with_middlewares(:transcribe_without_middlewares, audio, format: format, **params)
-  end
-
-  def transcribe_without_middlewares(audio, format: nil, **params)
-    @client.transcribe(audio, model: @model, format: format, **params)
-  end
-
-  ######################################
-  def speak(text, **params)
-    call_with_middlewares(:speak_without_middlewares, text, **params)
-  end
-
-  def speak_without_middlewares(text, **params)
-    @client.speak(text, model: @model, **params)
-  end
-
-  ######################################
-  def embed(input, **params)
-    @client.embed(input, model: @model, **params)
-  end
-
-  def batch_embed(inputs, batch_size: 100, **params)
-    inputs.each_slice(batch_size).flat_map do |batch|
-      sleep 1 # DEBUG rate limits being exceeded
-      embed(batch, **params)
-    end
-  end
-
-  ######################################
-  ## Utilities
-
   def content
     case @provider
     when :openai, :localai, :ollama
-      last_response.data.dig('choices', 0, 'message', 'content')
+      # last_response.data.dig('choices', 0, 'message', 'content')
+      last_response.data.tunnel 'content'
+
     when :anthropic
-      last_response.data.dig('content',0,'text')
+      # last_response.data.dig('content',0,'text')
+      last_response.data.tunnel 'text'
+
     when :google
-      last_response.data.dig('candidates', 0, 'content', 'parts', 0, 'text')
+      # last_response.data.dig('candidates', 0, 'content', 'parts', 0, 'text')
+      last_response.data.tunnel 'text'
+
     when :mistral
-      last_response.data.dig('choices', 0, 'message', 'content')
+      # last_response.data.dig('choices', 0, 'message', 'content')
+      last_response.data.tunnel 'content'
+
+    when :open_router
+      last_response.data.tunnel 'content'
+
     else
       raise NotImplementedError, "Content extraction not implemented for provider: #{@provider}"
     end
   end
   alias_method :text, :content
 
-  ##############################################
-  ## Public Class Methods
-
   def method_missing(method_name, *args, &block)
     if @client.respond_to?(method_name)
       result = @client.send(method_name, *args, &block)
@@ -205,16 +199,25 @@ class AiClient
     case provider
     when :openai
       OmniAI::OpenAI::Client.new(**client_options)
+
     when :anthropic
       OmniAI::Anthropic::Client.new(**client_options)
+
     when :google
       OmniAI::Google::Client.new(**client_options)
+
     when :mistral
       OmniAI::Mistral::Client.new(**client_options)
+
     when :ollama
       OmniAI::Ollama::Client.new(**client_options)
+
     when :localai
       OmniAI::LocalAI::Client.new(**client_options)
+
+    when :open_router
+      OmniAI::OpenRouter::Client.new(**client_options)
+
     else
       raise ArgumentError, "Unsupported provider: #{@provider}"
     end
@@ -239,12 +242,6 @@ class AiClient
     config.provider_patterns.find { |provider, pattern| model.match?(pattern) }&.first ||
       raise(ArgumentError, "Unsupported model: #{model}")
   end
-
-
-  def determine_model_type(model)
-    config.model_types.find { |type, pattern| model.match?(pattern) }&.first ||
-      raise(ArgumentError, "Unable to determine model type for: #{model}")
-  end
 end
 
 
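The new `content` method above replaces the provider-specific `dig` paths with `Hash#tunnel`, the breadth-first key search added by `HashRefinement`. A standalone version of the same traversal (written as a plain method, since refinements only apply where `using` is in effect):

```ruby
# Breadth-first search for the first hash (at any depth) holding target_key.
def tunnel(hash, target_key)
  queue = [hash]
  until queue.empty?
    current = queue.shift
    return current[target_key] if current.key?(target_key)
    current.each_value do |value|
      case value
      when Hash  then queue << value
      when Array then queue.concat(value.select { |v| v.is_a?(Hash) })
      end
    end
  end
  nil
end

# An OpenAI-style response body: 'content' is nested under choices[0].message.
response = { 'choices' => [{ 'message' => { 'content' => 'Hello!' } }] }
tunnel(response, 'content')  # => "Hello!"
```

Because the search is breadth-first, the shallowest occurrence of the key wins, which is what makes one `tunnel 'content'` call work across the differently nested OpenAI, LocalAI, Ollama, and OpenRouter response shapes.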
@@ -0,0 +1,92 @@
+ # lib/extensions/omniai-open_router.rb
+ # frozen_string_literal: true
+
+ require 'omniai'
+ require 'omniai/openai'
+
+ module OmniAI
+
+   # Create an alias for the OmniAI::OpenAI module
+   module OpenRouter
+     extend OmniAI::OpenAI
+
+     # Alias classes from OmniAI::OpenAI
+     class Client < OmniAI::OpenAI::Client
+       def initialize(**options)
+         options[:host] = 'https://openrouter.ai/api/v1' unless options.has_key?(:host)
+         super(**options)
+       end
+
+       def self.openrouter
+         OmniAI::OpenRouter::Client
+       end
+
+       def self.open_router
+         OmniAI::OpenRouter::Client
+       end
+
+       def self.find(provider:, **)
+         return OmniAI.open_router.new(**) if :open_router == provider
+
+         super(provider: provider.to_s, **)
+       end
+     end
+
+     Chat = OmniAI::OpenAI::Chat
+
+     class Chat
+       def path
+         "/api/v1/chat/completions"
+       end
+     end
+
+     Config = OmniAI::OpenAI::Config
+
+     # Alias the Thread class and its nested classes
+     Thread = OmniAI::OpenAI::Thread
+     Thread::Annotation = OmniAI::OpenAI::Thread::Annotation
+     Thread::Attachment = OmniAI::OpenAI::Thread::Attachment
+     Thread::Message = OmniAI::OpenAI::Thread::Message
+     Thread::Run = OmniAI::OpenAI::Thread::Run
+   end
+ end
+
+ ######################################################
+ ## Extend Capabilities Using OpenRouter
+ #
+ # TODO: cache the models db
+ # TODO: consider wrapping the models database in an ActiveModel
+ #
+ class AiClient
+   class << self
+     def orc_models
+       @orc_models ||= ORC.models if defined?(ORC)
+     end
+
+     def orc_model_names(provider=nil)
+       if provider.nil?
+         orc_models.map { |e| e['id'] }
+       else
+         orc_models
+           .map { |e| e['id'] }
+           .select { |name| name.start_with? provider.to_s.downcase }
+           .map { |e| e.split('/')[1] }
+       end
+     end
+
+     def orc_model_details(model)
+       orc_models.select { |e| e['id'].include?(model) }
+     end
+   end
+ end
+
+ if ENV.fetch('OPEN_ROUTER_API_KEY', nil)
+   OpenRouter.configure do |config|
+     config.access_token = ENV.fetch('OPEN_ROUTER_API_KEY', nil)
+   end
+
+   # Use a default provider/model
+   AiClient::ORC = OpenRouter::Client.new
+ end
+
+
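The filtering in `orc_model_names` above can be sketched in isolation. This is a stand-alone illustration with made-up sample data, not the live OpenRouter models list:

```ruby
# Illustrative sample of what ORC.models entries look like (fabricated ids).
models = [
  { 'id' => 'openai/gpt-3.5-turbo' },
  { 'id' => 'openai/gpt-4o' },
  { 'id' => 'anthropic/claude-2.1' }
]

# Mirror of the orc_model_names(:openai) pipeline: keep ids for the
# requested provider, then strip the "provider/" prefix.
names = models
          .map { |e| e['id'] }
          .select { |name| name.start_with?('openai') }
          .map { |e| e.split('/')[1] }
# names == ["gpt-3.5-turbo", "gpt-4o"]
```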
@@ -0,0 +1,97 @@
+ # Notes on OpenRouter
+
+ OpenRouter is a web service with a common API to many
+ back-end LLM processors. Its goal is essentially the same as the
+ OmniAI gem's: provide the flexibility of using multiple models
+ processed by multiple providers.
+
+ ```ruby
+ OpenRouter.configure do |config|
+   config.access_token = ENV.fetch('OPEN_ROUTER_API_KEY', nil)
+ end
+
+ # Use a default provider/model
+ AI = OpenRouter::Client.new
+
+ # Returns an Array of Hashes describing the supported
+ # models/providers
+ Models = AI.models
+ ```
+
+ Model names containing a "/" are targeted to OpenRouter:
+ the part before the "/" is the provider; the part after it is the model name.
+
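Splitting such an id is a one-liner; `split_model_id` below is a hypothetical helper name, not part of the gem:

```ruby
# Hypothetical helper: split an OpenRouter-style id into its two parts.
def split_model_id(id)
  provider, model = id.split('/', 2)
  { provider: provider, model: model }
end

split_model_id('anthropic/claude-2.1')
# => { provider: "anthropic", model: "claude-2.1" }
```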
+ We will need to add this entry to the AiClient::Config `provider_patterns` Hash:
+
+ ```ruby
+ open_router: /\//, # /(.*)\/(.*)/ provider / model name
+ ```
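A sketch of how such a pattern could route an id to a provider. The pattern table below is illustrative; only the `:open_router` entry mirrors the one proposed above, and `determine_provider` is a stand-in for the gem's actual lookup:

```ruby
# Illustrative pattern table (simplified stand-ins for the real entries).
provider_patterns = {
  openai: /\Agpt/i,
  anthropic: /\Aclaude/i,
  open_router: /\//   # any id containing "/" routes to OpenRouter
}

# Find the first provider whose pattern matches the model id.
def determine_provider(model, patterns)
  patterns.find { |_provider, pattern| model.match?(pattern) }&.first
end

determine_provider('gpt-4o-mini', provider_patterns)          # => :openai
determine_provider('anthropic/claude-2.1', provider_patterns) # => :open_router
```

Hash insertion order matters here: anchored provider patterns are checked first, so the catch-all `/\//` only fires for "provider/model" style ids.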
+
+ `models` can be an Array of Strings. The first is the primary model, while
+ the rest are fallbacks in case the one before it fails:
+
+ ```ruby
+ {
+   "models": ["anthropic/claude-2.1", "gryphe/mythomax-l2-13b"],
+   "route": "fallback",
+   ... // Other params
+ }
+ ```
+
+ You can have OpenRouter send your prompt to the best
+ provider/model for the prompt like this:
+
+ ```ruby
+ require "open_router"
+
+ OpenRouter.configure do |config|
+   config.access_token = ENV["ACCESS_TOKEN"]
+   config.site_name = "YOUR_APP_NAME"
+   config.site_url = "YOUR_SITE_URL"
+ end
+
+ OpenRouter::Client.new.complete(
+   model: "openrouter/auto",
+   messages: [
+     {
+       "role": "user",
+       "content": "What is the meaning of life?"
+     }
+   ]
+ ).then do |response|
+   puts response.dig("choices", 0, "message", "content")
+ end
+ ```
+
+ OpenRouter also exposes an OpenAI-compatible API when you use this
+ base_url: "https://openrouter.ai/api/v1"
+
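That OpenAI-compatible surface can be exercised with nothing but the Ruby standard library. A sketch, assuming the documented `/api/v1/chat/completions` endpoint and payload shape; the API key value is a placeholder:

```ruby
require 'net/http'
require 'uri'
require 'json'

# Sketch: build a request against OpenRouter's OpenAI-compatible endpoint.
# 'sk-placeholder' is not a real key.
uri = URI('https://openrouter.ai/api/v1/chat/completions')
req = Net::HTTP::Post.new(uri)
req['Authorization'] = "Bearer #{ENV.fetch('OPEN_ROUTER_API_KEY', 'sk-placeholder')}"
req['Content-Type']  = 'application/json'
req.body = JSON.generate(
  model: 'openai/gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'What is the meaning of life?' }]
)

# Uncomment to actually send the request:
# res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
# puts JSON.parse(res.body).dig('choices', 0, 'message', 'content')
```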
+ Request Format Documentation:
+ https://openrouter.ai/docs/requests
+
+ Simple Quick Start ...
+
+ ```ruby
+ OpenRouter::Client.new.complete(
+   model: "openai/gpt-3.5-turbo",
+   messages: [
+     {
+       "role": "user",
+       "content": "What is the meaning of life?"
+     }
+   ]
+ ).then do |response|
+   puts response.dig("choices", 0, "message", "content")
+ end
+ ```
+
+ ## Design Approaches
+
+ There are at least two different approaches to
+ integrating the OpenRouter capability: 1) use the open_router gem
+ and forget about using the common-ish
+ API established by OmniAI; or 2) take advantage
+ of OpenRouter's OpenAI-compatible API and do for it
+ the same thing that was done for Ollama and LocalAI.
+
+
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: ai_client
  version: !ruby/object:Gem::Version
- version: 0.2.0
+ version: 0.2.2
  platform: ruby
  authors:
  - Dewayne VanHoozer
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2024-10-05 00:00:00.000000000 Z
+ date: 2024-10-08 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: hashie
@@ -94,6 +94,20 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
+ - !ruby/object:Gem::Dependency
+ name: open_router
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
  - !ruby/object:Gem::Dependency
  name: amazing_print
  requirement: !ruby/object:Gem::Requirement
@@ -122,6 +136,20 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
+ - !ruby/object:Gem::Dependency
+ name: hashdiff
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
+ type: :development
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - ">="
+ - !ruby/object:Gem::Version
+ version: '0'
  - !ruby/object:Gem::Dependency
  name: mocha
  requirement: !ruby/object:Gem::Requirement
@@ -136,18 +164,15 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
- description: "`ai_client` is a versatile Ruby gem that serves as a generic client
- \nfor interacting with various AI service providers through a unified \nAPI. Designed
- to simplify the integration of large language models \n(LLMs) into applications,
- `ai_client` allows developers to create \ninstances using just the model name, greatly
- reducing configuration \noverhead. With built-in support for popular AI providers—including
- \nOpenAI, Anthropic, Google, Mistral, LocalAI and Ollama—the gem abstracts the \ncomplexities
- of API interactions, offering methods for tasks such \nas chatting, transcription,
- speech synthesis, and embedding. The \nmiddleware architecture enables customizable
- processing of requests \nand responses, making it easy to implement features like
- logging and \nretry logic. Seamlessly integrated with the `OmniAI` framework, \n`ai_client`
- empowers developers to leverage cutting-edge AI capabilities \nwithout vendor lock-in,
- making it an essential tool for modern AI-driven \napplications.\n"
+ description: "`ai_client` is a versatile Ruby gem that offers a seamless interface
+ \nfor integrating a wide range of AI service providers through a single, \nunified
+ API. With `ai_client`, you can simply specify the model name \nand quickly leverage
+ advanced features such as chatting, transcription, \nand speech synthesis. The gem
+ supports prominent providers, including \nOpenAI, Anthropic, Google, Mistral, LocalAI,
+ Ollama, and OpenRouter, \nmaking it easier than ever to build innovative applications.
+ Plus, its \nflexible middleware architecture allows for easy customization of request
+ \nand response handling, ensuring that you can harness the power of AI \nwithout
+ vendor lock-in.\n"
  email:
  - dvanhoozer@gmail.com
  executables: []
@@ -161,19 +186,27 @@ files:
  - README.md
  - Rakefile
  - examples/Bethany Hamilton.m4a
+ - examples/README.md
  - examples/common.rb
  - examples/embed.rb
  - examples/speak.rb
  - examples/text.rb
  - examples/transcribe.rb
  - lib/ai_client.rb
+ - lib/ai_client/chat.rb
+ - lib/ai_client/config.yml
  - lib/ai_client/configuration.rb
+ - lib/ai_client/embed.rb
  - lib/ai_client/logger_middleware.rb
  - lib/ai_client/middleware.rb
  - lib/ai_client/retry_middleware.rb
+ - lib/ai_client/speak.rb
+ - lib/ai_client/transcribe.rb
  - lib/ai_client/version.rb
  - lib/extensions/omniai-localai.rb
  - lib/extensions/omniai-ollama.rb
+ - lib/extensions/omniai-open_router.rb
+ - lib/extensions/open_router.md
  - sig/ai_client.rbs
  - the_ollama_model_problem.md
  homepage: https://github.com/MadBomber/ai_client
@@ -199,7 +232,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
  version: '0'
  requirements: []
- rubygems_version: 3.5.20
+ rubygems_version: 3.5.21
  signing_key:
  specification_version: 4
  summary: A generic AI Client for many providers