ai_client 0.2.2 → 0.2.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 07a66886ce84dd916574f7a38e292626124ed1a048da92eec33cfdc0e66435e4
-  data.tar.gz: dc9c6734ae4a6a479ed8f6b76fc8edea31dbb588f67ae801b915a067346c5735
+  metadata.gz: a4128ef9024e8bc84820346570886aa9b3e25a4e8e442556591d45e0900cf9ea
+  data.tar.gz: 64d339534320c27eb060b02ae927f39a829f3792319dc1cd71e3e0920d6aa0ed
 SHA512:
-  metadata.gz: ced4f6a3834132d0c1d127ad3fcd931e9600d748a28d6593fc9e382fb6bb98b9fd31de56d44d5a216c5f0bb8394c360f04779995fd2345ae1043e137cd994091
-  data.tar.gz: be77228ac195d1f3de41f1ec0e82f2d2e2aa0f280e51354156c00b9c40bdb466581c87b4ba73b22c57fa86300df9555b7a2230854ac6ff93e365028b946275ba
+  metadata.gz: ef72dd20224bd2f887686c1c75881e8bd9195c60ac2a416c9acee3a93cf09335b852381ce3bf314cab72e9b2a7136b4d2bf3f0f622f39f55f1555500796fca17
+  data.tar.gz: 33cc0e5fcf58ebadb5269138342f4e6c49d42604e2ee2fd22630764c1d288bfafdc3c7999467d2dfe890cd9daae573165f497bd1044d807f80d3a89e97f3de33
data/CHANGELOG.md CHANGED
@@ -1,6 +1,11 @@
 ## [Unreleased]
 
 ## Released
+### [0.2.3] - 2024-10-08
+- Refactored the OmniAI extensions for Ollama, LocalAI, and OpenRouter
+- Added a file for OpenRouter extensions
+- Added the LLM class
+
 ### [0.2.2] - 2024-10-07
 - Added support for open_router.ai with extensions/omniai-open_router.rb
 - Removed the `model_type` object
data/README.md CHANGED
@@ -4,32 +4,55 @@ First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com
 
 See the [change log](CHANGELOG.md) for recent modifications.
 
-## Summary
 
-Are you ready to supercharge your applications with cutting-edge AI capabilities?
-Introducing `ai_client`, the ultimate Ruby gem that provides a seamless interface
-for interacting with a multitude of AI service providers through a single,
-unified API.
+<!-- Tocer[start]: Auto-generated, don't remove. -->
+
+## Table of Contents
+
+- [Summary](#summary)
+- [Installation](#installation)
+- [Environment Variables for Provider Access](#environment-variables-for-provider-access)
+- [Changing Envar API Key Names](#changing-envar-api-key-names)
+- [api_key: Parameter](#api_key-parameter)
+- [provider: Parameter](#provider-parameter)
+- [Usage](#usage)
+- [Configuration](#configuration)
+- [Default Configuration](#default-configuration)
+- [Class Configuration](#class-configuration)
+- [1. Class Configuration Block](#1-class-configuration-block)
+- [2. Set by a Config File](#2-set-by-a-config-file)
+- [3. Supplemented by a Config File](#3-supplemented-by-a-config-file)
+- [Instance Configuration](#instance-configuration)
+- [1. Supplement from a Constructor Block](#1-supplement-from-a-constructor-block)
+- [2. Supplement from a YAML File](#2-supplement-from-a-yaml-file)
+- [3. Load Complete Configuration from a YAML File](#3-load-complete-configuration-from-a-yaml-file)
+- [Top-level Client Methods](#top-level-client-methods)
+- [chat](#chat)
+- [embed](#embed)
+- [speak](#speak)
+- [transcribe](#transcribe)
+- [Options](#options)
+- [Advanced Prompts](#advanced-prompts)
+- [Advanced Prompts with Tools](#advanced-prompts-with-tools)
+- [Best ?? Practices](#best--practices)
+- [OmniAI and OpenRouter](#omniai-and-openrouter)
+- [Contributing](#contributing)
+- [License](#license)
+
+<!-- Tocer[finish]: Auto-generated, don't remove. -->
+
+
+## Summary
 
-With `ai_client`, you can effortlessly integrate large language models (LLMs)
-into your projects—simply specify the model name and let the gem handle the
-rest! Say goodbye to tedious configuration and hello to rapid development.
+Are you ready to supercharge your applications with cutting-edge AI capabilities? Introducing `ai_client`, the ultimate Ruby gem that provides a seamless interface for interacting with a multitude of AI service providers through a single, unified API.
 
-This gem comes packed with built-in support for leading AI providers, including
-OpenAI, Anthropic, Google, Mistral, LocalAI, and Ollama. Whether you need to
-implement chatbots, transcription services, speech synthesis, or embeddings,
-`ai_client` abstracts the complexities of API interactions, allowing you to focus
-on what truly matters: building amazing applications.
+With `ai_client`, you can effortlessly integrate large language models (LLMs) into your projects—simply specify the model name and let the gem handle the rest! Say goodbye to tedious configuration and hello to rapid development.
 
-Plus, with its flexible middleware architecture, you can easily customize request
-and response processing—implement logging, retry logic, and more with minimal effort.
-And thanks to its seamless integration with the `OmniAI` framework, you can leverage
-the latest AI advancements without worrying about vendor lock-in.
+This gem comes packed with built-in support for leading AI providers, including OpenAI, Anthropic, Google, Mistral, LocalAI, and Ollama. Whether you need to implement chatbots, transcription services, speech synthesis, or embeddings, `ai_client` abstracts the complexities of API interactions, allowing you to focus on what truly matters: building amazing applications.
 
-Join the growing community of developers who are transforming their applications
-with `ai_client`. Install it today and unlock the full potential of AI in your
-projects!
+Plus, with its flexible middleware architecture, you can easily customize request and response processing—implement logging, retry logic, and more with minimal effort. And thanks to its seamless integration with the `OmniAI` framework, you can leverage the latest AI advancements without worrying about vendor lock-in.
 
+Join the growing community of developers who are transforming their applications with `ai_client`. Install it today and unlock the full potential of AI in your projects!
 
 ## Installation
 
@@ -45,13 +68,9 @@ If bundler is not being used to manage dependencies, install the gem by executin
 gem install ai_client
 ```
 
-## Providers Supported
+## Environment Variables for Provider Access
 
-To explicitely designate a provider to use with an AiClient instance
-use the parameter `provider: :your_provider` with the Symbol for the supported
-provider you want to use with the model you specify. The following providers
-are supported by the OmniAI gem upon which AiClient depends along with a few
-extensions.
+For-fee providers require an account and provide an access token to allow the use of their LLM models. The value of these access tokens is typically saved in system environment variables or some other secure data store. AiClient has a default set of system environment variable names for these access tokens, based upon the pattern `provider_api_key`, which can be overridden.
 
 | Symbol | Envar API Key | Client Source |
 | --- | --- | --- |
@@ -63,6 +82,25 @@ extensions.
 | :open_router | [OPEN_ROUTER_API_KEY](https://openrouter.ai/) | AiClient Extension |
 | :openai | [OPENAI_API_KEY](https://www.openai.com/) | OmniAI |
 
+
+### Changing Envar API Key Names
+
+You can also configure the system environment variable names to match your own standards at the class level.
+
+```ruby
+AiClient.class_config.envar_api_key_names = {
+  anthropic: 'your_envar_name',
+  google: 'your_envar_name',
+  mistral: 'your_envar_name',
+  open_router: 'your_envar_name',
+  openai: 'your_envar_name'
+}
+
+AiClient.class_config.save('path/to/file.yml')
+```
+
+### api_key: Parameter
+
 In case you are using a different environment variable for your access token than the ones shown above, you can use the `api_key:` parameter.
 
 ```ruby
@@ -71,6 +109,12 @@ client = AiClient.new('provider/model_name', api_key: ENV['OPENROUTER_API_KEY'])
 
 This way if you are using `AiClient` inside of a Rails application, you can retrieve your access token from a secrets file.
 
+
+### provider: Parameter
+
+To explicitly designate a provider to use with an AiClient instance, use the parameter `provider: :your_provider` with the Symbol for the supported provider you want to use with the model you specify. The following providers are supported by the OmniAI gem, upon which AiClient depends, along with a few extensions.
+
+
 ## Usage
 
 Basic usage:
@@ -95,22 +139,17 @@ AI = AiClient.new('nomic-embed-text', provider: :ollama)
 
 ### Configuration
 
-There are three levels of configuration, each inherenting from the level above. The following sections
-describe those configuration levels.
+There are three levels of configuration, each inheriting from the level above. The following sections describe those configuration levels.
 
 #### Default Configuration
 
-The file [lib/ai_client/configuration.rb] hard codes the default configuration. This is used to
-update the [lib/ai_client/config.yml] file during development. If you have
-some changes for this configuration please send me a pull request so we
-can all benefit from your efforts.
+The file [lib/ai_client/configuration.rb] hard codes the default configuration. This is used to update the [lib/ai_client/config.yml] file during development. If you have some changes for this configuration, please send me a pull request so we can all benefit from your efforts.
 
 #### Class Configuration
 
-The class configuration is derived initially from the default configuration. It
-can be changed in three ways.
+The class configuration is initially derived from the default configuration. It can be changed in three ways.
 
-1. Class Configuration Block
+##### 1. Class Configuration Block
 
 ```ruby
 AiClient.configuration do |config|
@@ -119,13 +158,13 @@ AiClient.configuration do |config|
 end
 ```
 
-2. Set by a Config File
+##### 2. Set by a Config File
 
 ```ruby
 AiClient.class_config = AiClient::Config.load('path/to/file.yml')
 ```
 
-3. Supplemented by a Config File
+##### 3. Supplemented by a Config File
 
 ```ruby
 AiClient.class_config.merge! AiClient::Config.load('path/to/file.yml')
@@ -133,12 +172,9 @@ AiClient.class_config.merge! AiClient::Config.load('path/to/file.yml')
 
 #### Instance Configuration
 
-All instances have a configuration. Initially that configuration is the same
-as the class configuration; however, each instance can have its own separate
-configuration. For an instance the class configuration can either be supplemented
-or complete over-ridden.
+All instances have a configuration. Initially that configuration is the same as the class configuration; however, each instance can have its own separate configuration. For an instance, the class configuration can either be supplemented or completely overridden.
 
-1. Supplement from a Constructor Block
+##### 1. Supplement from a Constructor Block
 
 ```ruby
 client = AiClient.new('super-ai-overlord-model') do |config|
@@ -147,43 +183,111 @@ client = AiClient.new('super-ai-overlord-model') do |config|
 end
 ```
 
-2. Suppliment from a YAML File
+##### 2. Supplement from a YAML File
 
 ```ruby
 client = AiClient.new('baby-model', config: 'path/to/file.yml')
 ```
 
-3. Load Complete Configuration from a YAML File
+##### 3. Load Complete Configuration from a YAML File
 
 ```ruby
 client = AiClient.new('your-model')
 client.config = AiClient::Config.load('path/to/file.yml')
 ```
 
-### What Now?
+### Top-level Client Methods
+
+See the [examples directory](examples/README.md) for some ideas on how to use AiClient.
+
+The following examples are based upon the same client configuration.
+
+```ruby
+AI = AiClient.new(...) do ... end
+```
+
+##### chat
+
+Typically `chat(...)` is the most used top-level method. It is sometimes referred to as completion: you give a prompt to an LLM and expect the LLM to respond (i.e., complete its transformation). If you consider the prompt to be a question, the response would be the answer. If the prompt were a task, the response would be the completion of that task.
+
+```ruby
+response = AI.chat(...)
+```
+
+The simplest form is a string prompt. The prompt can come from anywhere: a literal, a variable, or content retrieved from a database or a file.
+
+```ruby
+response = AI.chat("Is there anything simpler than this?")
+```
+
+The response will be a simple string or a response object based upon the setting of your `config.return_raw` item. If `true`, you get the whole shebang. If `false`, you get just the string.
+
+See the [Advanced Prompts](#advanced-prompts) section to learn how to configure a complex prompt message.
+
+
+##### embed
 
-TODO: Document the methods and their options.
+Embeddings (as in 'embed additional information') are how retrieval-augmented generation (RAG) works, which is a deeper subject for another place. Basically, when using an LLM that supports the vectorization of stuff to create embeddings, you can use `embed(stuff)` to return the vector associated with the stuff you gave the model. This vector (an Array of floating-point numbers) is a mathematical representation of the stuff that can be used to compare, mathematically, one piece of stuff against a collection of stuff to find other stuff in that collection that closely resembles the stuff for which you are looking. Q: What is stuff? A: You know; it's just stuff.
 
 ```ruby
-AI.chat(...)
-AI.transcribe(...)
-AI.speak(...)
 AI.embed(...)
-AI.batch_embed(...)
+response = AI.batch_embed(...)
 ```
 
-See the [examples directory](examples/README.md) for some ideas on how to use AiClient.
+Recommendation: Use PostgreSQL, pg_vector and the neighbor gem.
+
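Embedding vectors are compared with a distance metric; cosine similarity is one common choice. A minimal standalone sketch (the three-dimensional vectors are made up for illustration; real vectors returned by `embed` have hundreds or thousands of dimensions):

```ruby
# Cosine similarity between two embedding vectors: 1.0 means same
# direction, values near 0.0 mean unrelated. The vectors are fake toy data.
def cosine_similarity(a, b)
  dot   = a.zip(b).sum { |x, y| x * y }
  mag_a = Math.sqrt(a.sum { |x| x * x })
  mag_b = Math.sqrt(b.sum { |x| x * x })
  dot / (mag_a * mag_b)
end

cat = [0.9, 0.1, 0.0]   # pretend embedding of "cat"
dog = [0.8, 0.2, 0.1]   # pretend embedding of "dog"
car = [0.0, 0.1, 0.9]   # pretend embedding of "car"

cosine_similarity(cat, dog) > cosine_similarity(cat, car) #=> true
```

In practice you would store the vectors in a database with vector support (hence the PostgreSQL recommendation above) and let it run this comparison at scale.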
+##### speak
 
-### System Environment Variables
+```ruby
+response = AI.speak("Isn't it nice to have a computer that will talk to you?")
+```
+
+The response will contain audio data that can be played, manipulated, or saved to a file.
+
+##### transcribe
+
+```ruby
+response = AI.transcribe(...)
+```
 
-The API keys used with each LLM provider have the pattern `XXX_API_KEY` where XXX is the name of the provided. For example `OPENAI_API_KEY1` and `ANTROPIC_API_KEY` etc.
 
-TODO: list all providers supported and their envar
 
 ### Options
 
 TODO: document the options like `provider: :ollama`
 
+### Advanced Prompts
+
+In more complex applications, providing a simple string as your prompt is not sufficient. AiClient can take advantage of OmniAI's complex message builder.
+
+```ruby
+client = AiClient.new 'some_model_name'
+
+completion = client.chat do |prompt|
+  prompt.system('You are an expert biologist with an expertise in animals.')
+  prompt.user do |message|
+    message.text 'What species are in the attached photos?'
+    message.url('https://.../cat.jpeg', "image/jpeg")
+    message.url('https://.../dog.jpeg', "image/jpeg")
+    message.file('./hamster.jpeg', "image/jpeg")
+  end
+end
+
+completion #=> 'The photos are of a cat, a dog, and a hamster.'
+```
+
+Of course, if `client.config.return_raw` is true, the completion value will be the complete response object.
+
+### Advanced Prompts with Tools
+
+One of the latest innovations in LLMs is the ability to use functions (aka tools) as callbacks to gather more information or to execute a task at the direction of the LLM prompt processing.
+
+See this [blog post](https://ksylvest.com/posts/2024-08-16/using-omniai-to-leverage-tools-with-llms) by Kevin Sylvestre, author of the OmniAI gem.
+
+
+TODO: Need to create an example RAG that does not need another access token to a service
+
+
 ## Best ?? Practices
 
 If you are going to be using one model for multiple purposes in different parts of your application you can assign the instance of `AiClient` to a constant so that the same client can be used everywhere.
@@ -199,18 +303,9 @@ AI.speak "warning Will Robinson! #{bad_things_happened}"
 
 Using the constant for the instance allows you to reference the same client instance inside any method throughout your application. Of course it does not apply to only one instance. You could assign multiple instances for different models/providers. For example, you could have `AI` for your primary client and `AIbackup` for a fallback client in case you have a problem on the primary; or maybe `Vectorizer` as a client name tied to a model specializing in embedding vectorization.
 
-## Extensions for OmniAI
-
-The AiClient makes use of extensions to the OmniAI gem that define
-additional providers and protocols.
-
-1. **OmniAI::Ollama^** which wraps the OmniAI::OpenAI class
-2. **OmniAI::LocalAI** which also wraps the OmniAI::OpenAI class
-3. **OmniAI::OpenRouter** TODO: Still under development
-
 ## OmniAI and OpenRouter
 
-OmniAI is a Ruby gem that supports specific providers directly using a common-ish API. You incur costs directly from those providers for which you have individual API keys (aka access tokens.) OpenRouter, on the other hand, is a web service that also establishes a common API for many providers and models; however, OpenRouter adds a small fee on top of the fee charged by those providers. You trade off cost for flexibility. With OpenRouter you only need one API key (OPEN_ROUTER_API_KEY) to access all of its supported services.
+Both OmniAI and OpenRouter have similar goals: to provide a common interface to multiple providers and LLMs. OmniAI is a Ruby gem that supports specific providers directly using a common-ish API. You incur costs directly from those providers for which you have individual API keys (aka access tokens). OpenRouter, on the other hand, is a web service that also establishes a common API for many providers and models; however, OpenRouter adds a small fee on top of the fee charged by those providers. You trade off cost for flexibility. With OpenRouter you only need one API key (OPEN_ROUTER_API_KEY) to access all of its supported services.
 
 The advantage of AiClient is that you have the added flexibility to choose, on a client-by-client basis, where you want your model to be processed. You get free local processing through Ollama and LocalAI. You get less costly direct access to some providers via OmniAI. You get slightly more costly wide-spread access via OpenRouter.
 
data/Rakefile CHANGED
@@ -3,6 +3,14 @@
 require "bundler/gem_tasks"
 require "minitest/test_task"
 
+begin
+  require "tocer/rake/register"
+rescue LoadError => error
+  puts error.message
+end
+
+Tocer::Rake::Register.call
+
 Minitest::TestTask.create
 
 task default: :test
data/lib/ai_client/config.yml CHANGED
@@ -14,11 +14,23 @@
   binmode: false
   reraise_write_errors: []
   mon_data: !ruby/object:Monitor {}
-  mon_data_owner_object_id: 920
+  mon_data_owner_object_id: 700
   level_override: {}
 :timeout:
 :return_raw: false
 :providers: {}
+:envar_api_key_names:
+  :anthropic:
+  - ANTHROPIC_API_KEY
+  :google:
+  - GOOGLE_API_KEY
+  :mistral:
+  - MISTRAL_API_KEY
+  :open_router:
+  - OPEN_ROUTER_API_KEY
+  - OPENROUTER_API_KEY
+  :openai:
+  - OPENAI_API_KEY
 :provider_patterns:
   :anthropic: !ruby/regexp /^claude/i
   :openai: !ruby/regexp /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i
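The `provider_patterns` above drive AiClient's model-name-to-provider lookup (`determine_provider` in `lib/ai_client.rb`). A standalone sketch of that lookup, using only the two patterns visible in this diff:

```ruby
# The first pattern whose regexp matches the model name wins;
# an unknown model raises, just as determine_provider does.
provider_patterns = {
  anthropic: /^claude/i,
  openai:    /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i
}

def determine_provider(model, patterns)
  provider, _pattern = patterns.find { |_name, pattern| model.match?(pattern) }
  provider or raise(ArgumentError, "Unsupported model: #{model}")
end

determine_provider('claude-3-5-sonnet', provider_patterns) #=> :anthropic
determine_provider('gpt-4o-mini', provider_patterns)       #=> :openai
```

This is why `AiClient.new('gpt-4o-mini')` needs no `provider:` parameter, while a name that matches no pattern does.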
data/lib/ai_client/configuration.rb CHANGED
@@ -91,16 +91,6 @@ class AiClient
     include Hashie::Extensions::Mash::PermissiveRespondTo
     include Hashie::Extensions::Mash::SymbolizeKeys
     include Hashie::Extensions::Mash::DefineAccessors
-
-
-    # I'm not sure about this ...
-    # def provider(name, &block)
-    #   if block_given?
-    #     providers[name] = block.call
-    #   else
-    #     providers[name] || {}
-    #   end
-    # end
 
 
     def save(filepath=ENV['HOME']+'/aiclient_config.yml')
@@ -137,6 +127,13 @@ class AiClient
       timeout: nil,
       return_raw: false,
       providers: {},
+      envar_api_key_names: {
+        anthropic: ['ANTHROPIC_API_KEY'],
+        google: ['GOOGLE_API_KEY'],
+        mistral: ['MISTRAL_API_KEY'],
+        open_router: ['OPEN_ROUTER_API_KEY', 'OPENROUTER_API_KEY'],
+        openai: ['OPENAI_API_KEY']
+      },
       provider_patterns: {
         anthropic: /^claude/i,
         openai: /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
data/lib/ai_client/llm.rb ADDED
@@ -0,0 +1,26 @@
+# lib/ai_client/llm.rb
+
+require 'active_hash'
+
+class AiClient
+
+  # TODO: Think about this for the OpenRouter models DB
+  #       Might change this to ActiveYaml
+
+  class LLM < ActiveHash::Base
+    self.data = AiClient.models
+
+    def model    = id.split('/')[1]
+    def provider = id.split('/')[0]
+
+    class << self
+      def import(path_to_yml_file) # TODO: load
+        raise "TODO: Not Implemented: #{path_to_yml_file}"
+      end
+
+      def export(path_to_yml_file) # TODO: dump
+        raise "TODO: Not Implemented: #{path_to_yml_file}"
+      end
+    end
+  end
+end
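The `model` and `provider` accessors above simply split an OpenRouter-style id string on `/`. A standalone sketch with a made-up id:

```ruby
# OpenRouter-style ids have the form "provider/model"; LLM#provider and
# LLM#model each take one half. The id here is fake, for illustration only.
id = 'openai/gpt-4o-mini'

provider, model = id.split('/')

provider #=> "openai"
model    #=> "gpt-4o-mini"
```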
data/lib/ai_client/open_router_extensions.rb ADDED
@@ -0,0 +1,79 @@
+# lib/ai_client/open_router_extensions.rb
+# frozen_string_literal: true
+
+# These extensions to AiClient are only available with
+# a valid API Key for the open_router.ai web-service
+
+require 'open_router'
+
+class AiClient
+
+  def models                        = self.class.models
+  def providers                     = self.class.providers
+  def model_names(a_provider=nil)   = self.class.model_names(a_provider)
+  def model_details(a_model)        = self.class.model_details(a_model)
+  def find_model(a_model_substring) = self.class.find_model(a_model_substring)
+
+  class << self
+    def add_open_router_extensions
+      access_token = fetch_access_token
+
+      return unless access_token
+
+      configure_open_router(access_token)
+      initialize_orc_client
+    end
+
+    def orc_client
+      @orc_client ||= add_open_router_extensions || raise("OpenRouter extensions are not available")
+    end
+
+    def models
+      @models ||= orc_client.models
+    end
+
+    # TODO: Refactor these DB-like methods to take
+    #       advantage of AiClient::LLM
+
+    def model_names(provider=nil)
+      model_ids = models.map { _1['id'] }
+
+      return model_ids unless provider
+
+      model_ids.filter_map { _1.split('/')[1] if _1.start_with?(provider.to_s.downcase) }
+    end
+
+    def model_details(model)
+      models.find { _1['id'].include?(model) }
+    end
+
+    def providers
+      @providers ||= models.map { _1['id'].split('/')[0] }.sort.uniq
+    end
+
+    def find_model(a_model_substring)
+      model_names.select { _1.include?(a_model_substring) }
+    end
+
+    private
+
+    # Similar to fetch_api_key but for the class_config
+    def fetch_access_token
+      class_config.envar_api_key_names.open_router
+                  .map { |key| ENV[key] }
+                  .compact
+                  .first
+    end
+
+    def configure_open_router(access_token)
+      OpenRouter.configure { |config| config.access_token = access_token }
+    end
+
+    def initialize_orc_client
+      @orc_client ||= OpenRouter::Client.new
+    end
+  end
+end
+
+
+AiClient.add_open_router_extensions
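The `model_names` filtering above can be exercised without the OpenRouter client by substituting a small fake catalog (the model ids below are made up):

```ruby
# Same filter_map logic as model_names: keep the model half of each id
# whose provider half matches. The catalog is a fake stand-in for the
# list returned by orc_client.models.
models = [
  { 'id' => 'openai/gpt-4o' },
  { 'id' => 'anthropic/claude-3-5-sonnet' },
  { 'id' => 'openai/o1-mini' }
]

model_ids = models.map { _1['id'] }

def model_names(model_ids, provider = nil)
  return model_ids unless provider

  model_ids.filter_map { _1.split('/')[1] if _1.start_with?(provider.to_s.downcase) }
end

model_names(model_ids, :openai) #=> ["gpt-4o", "o1-mini"]
```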
data/lib/ai_client/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 class AiClient
-  VERSION = "0.2.2"
+  VERSION = "0.2.3"
 end
data/lib/ai_client.rb CHANGED
@@ -16,10 +16,6 @@ require 'omniai/openai'
 
 require 'open_router'
 
-require_relative 'extensions/omniai-localai'
-require_relative 'extensions/omniai-ollama'
-require_relative 'extensions/omniai-open_router'
-
 require_relative 'ai_client/chat'
 require_relative 'ai_client/embed'
 require_relative 'ai_client/speak'
@@ -29,6 +25,9 @@ require_relative 'ai_client/configuration'
 require_relative 'ai_client/middleware'
 require_relative 'ai_client/version'
 
+require_relative 'ai_client/open_router_extensions'
+require_relative 'ai_client/llm' # SMELL: must come after the open router stuff
+
 # Create a generic client instance using only model name
 #   client = AiClient.new('gpt-3.5-turbo')
 #
@@ -131,25 +130,12 @@ class AiClient
 
   def content
     case @provider
-    when :openai, :localai, :ollama
-      # last_response.data.dig('choices', 0, 'message', 'content')
+    when :localai, :mistral, :ollama, :open_router, :openai
       last_response.data.tunnel 'content'
 
-    when :anthropic
-      # last_response.data.dig('content',0,'text')
+    when :anthropic, :google
      last_response.data.tunnel 'text'
 
-    when :google
-      # last_response.data.dig('candidates', 0, 'content', 'parts', 0, 'text')
-      last_response.data.tunnel 'text'
-
-    when :mistral
-      # last_response.data.dig('choices', 0, 'message', 'content')
-      last_response.data.tunnel 'content'
-
-    when :open_router
-      last_response.data.tunnel 'content'
-
     else
       raise NotImplementedError, "Content extraction not implemented for provider: #{@provider}"
     end
@@ -187,9 +173,8 @@ class AiClient
 
 
   def create_client
-    api_key = fetch_api_key # Fetching the API key should only happen for valid providers
     client_options = {
-      api_key: api_key,
+      api_key: fetch_api_key,
       logger: @logger,
       timeout: @timeout
    }
@@ -210,13 +195,13 @@ class AiClient
       OmniAI::Mistral::Client.new(**client_options)
 
     when :ollama
-      OmniAI::Ollama::Client.new(**client_options)
+      OmniAI::OpenAI::Client.new(host: 'http://localhost:11434', api_key: nil, **client_options)
 
     when :localai
-      OmniAI::LocalAI::Client.new(**client_options)
+      OmniAI::OpenAI::Client.new(host: 'http://localhost:8080', api_key: nil, **client_options)
 
     when :open_router
-      OmniAI::OpenRouter::Client.new(**client_options)
+      OmniAI::OpenAI::Client.new(host: 'https://openrouter.ai', api_prefix: 'api', **client_options)
 
     else
       raise ArgumentError, "Unsupported provider: #{@provider}"
@@ -224,20 +209,14 @@ class AiClient
   end
 
 
+  # Similar to fetch_access_token but for the instance config
   def fetch_api_key
-    env_var_name = "#{@provider.upcase}_API_KEY"
-    api_key = ENV[env_var_name]
-
-    if api_key.nil? || api_key.empty?
-      unless [:localai, :ollama].include? provider
-        raise ArgumentError, "API key not found in environment variable #{env_var_name}"
-      end
-    end
-
-    api_key
+    config.envar_api_key_names[@provider]
+      &.map { |key| ENV[key] }
+      &.compact
+      &.first
   end
 
-
   def determine_provider(model)
     config.provider_patterns.find { |provider, pattern| model.match?(pattern) }&.first ||
       raise(ArgumentError, "Unsupported model: #{model}")
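The new `fetch_api_key` walks the configured environment variable names and returns the first one that is actually set. A standalone sketch of that fallback (the `DEMO_` variable names and token value are made up so as not to touch real credentials):

```ruby
# The first configured envar that is present wins; unset names map to
# nil and are dropped by compact. All values here are fakes.
ENV['DEMO_OPENROUTER_API_KEY'] = 'demo-token'

envar_api_key_names = {
  open_router: ['DEMO_OPEN_ROUTER_API_KEY', 'DEMO_OPENROUTER_API_KEY']
}

api_key = envar_api_key_names[:open_router]&.map { |key| ENV[key] }&.compact&.first

api_key #=> "demo-token"
```

The safe-navigation chain also explains why `fetch_api_key` now quietly returns `nil` for providers with no configured names (such as a local Ollama server) instead of raising, as the old implementation did.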
metadata CHANGED
@@ -1,15 +1,29 @@
 --- !ruby/object:Gem::Specification
 name: ai_client
 version: !ruby/object:Gem::Version
-  version: 0.2.2
+  version: 0.2.3
 platform: ruby
 authors:
 - Dewayne VanHoozer
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2024-10-08 00:00:00.000000000 Z
+date: 2024-10-10 00:00:00.000000000 Z
 dependencies:
+- !ruby/object:Gem::Dependency
+  name: active_hash
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
 - !ruby/object:Gem::Dependency
   name: hashie
   requirement: !ruby/object:Gem::Requirement
@@ -164,6 +178,20 @@ dependencies:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
+- !ruby/object:Gem::Dependency
+  name: tocer
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+  type: :development
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
 description: "`ai_client` is a versatile Ruby gem that offers a seamless interface
   \nfor integrating a wide range of AI service providers through a single, \nunified
   API. With `ai_client`, you can simply specify the model name \nand quickly leverage
@@ -197,16 +225,14 @@ files:
 - lib/ai_client/config.yml
 - lib/ai_client/configuration.rb
 - lib/ai_client/embed.rb
+- lib/ai_client/llm.rb
 - lib/ai_client/logger_middleware.rb
 - lib/ai_client/middleware.rb
+- lib/ai_client/open_router_extensions.rb
 - lib/ai_client/retry_middleware.rb
 - lib/ai_client/speak.rb
 - lib/ai_client/transcribe.rb
 - lib/ai_client/version.rb
-- lib/extensions/omniai-localai.rb
-- lib/extensions/omniai-ollama.rb
-- lib/extensions/omniai-open_router.rb
-- lib/extensions/open_router.md
 - sig/ai_client.rbs
 - the_ollama_model_problem.md
 homepage: https://github.com/MadBomber/ai_client
data/lib/extensions/omniai-localai.rb DELETED
@@ -1,31 +0,0 @@
-# extensions/omniai-localai.rb
-# frozen_string_literal: true
-
-require 'omniai'
-require 'omniai/openai'
-
-module OmniAI
-
-  # Create an alias for OmniAI::OpenAI module
-  module LocalAI
-    extend OmniAI::OpenAI
-
-    # Alias classes from OmniAI::OpenAI
-    class Client < OmniAI::OpenAI::Client
-      def initialize(**options)
-        options[:host] = 'http://localhost:8080' unless options.has_key?(:host)
-        super(**options)
-      end
-    end
-
-
-    Config = OmniAI::OpenAI::Config
-
-    # Alias the Thread class and its nested classes
-    Thread = OmniAI::OpenAI::Thread
-    Annotation = OmniAI::OpenAI::Thread::Annotation
-    Attachment = OmniAI::OpenAI::Thread::Attachment
-    Message = OmniAI::OpenAI::Thread::Message
-    Run = OmniAI::OpenAI::Thread::Run
-  end
-end
@@ -1,30 +0,0 @@
- # extensions/omniai-ollama.rb
- # frozen_string_literal: true
-
- require 'omniai'
- require 'omniai/openai'
-
- module OmniAI
-
-   # Create an alias for OmniAI::OpenAI module
-   module Ollama
-     extend OmniAI::OpenAI
-
-     # Alias classes from OmniAI::OpenAI
-     class Client < OmniAI::OpenAI::Client
-       def initialize(**options)
-         options[:host] = 'http://localhost:11434' unless options.has_key?(:host)
-         super(**options)
-       end
-     end
-
-     Config = OmniAI::OpenAI::Config
-
-     # Alias the Thread class and its nested classes
-     Thread = OmniAI::OpenAI::Thread
-     Annotation = OmniAI::OpenAI::Thread::Annotation
-     Attachment = OmniAI::OpenAI::Thread::Attachment
-     Message = OmniAI::OpenAI::Thread::Message
-     Run = OmniAI::OpenAI::Thread::Run
-   end
- end
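The two removed extension files above share one small pattern: subclass the base client to inject a provider-specific default `:host`, and alias the remaining constants from the base namespace. A condensed, stand-alone sketch of that pattern follows; `Base` here is an illustrative stand-in for `OmniAI::OpenAI`, not a real OmniAI constant.

```ruby
# Sketch of the default-host aliasing pattern (Base is a stand-in).
module Base
  class Client
    attr_reader :host

    def initialize(**options)
      @host = options.fetch(:host, 'https://api.example.com')
    end
  end

  Config = Struct.new(:api_key)
end

module Ollama
  # Same client behavior, but pointed at the local Ollama server by default.
  class Client < Base::Client
    def initialize(**options)
      options[:host] = 'http://localhost:11434' unless options.key?(:host)
      super(**options)
    end
  end

  # Constant assignment aliases the very same object, not a copy.
  Config = Base::Config
end

puts Ollama::Client.new.host                        # => http://localhost:11434
puts Ollama::Client.new(host: 'http://e:1').host    # => http://e:1
puts Ollama::Config.equal?(Base::Config)            # => true
```

Callers get an unchanged client API; only the default endpoint differs per provider, which is why 0.2.3 could fold these files into `lib/ai_client/` without changing behavior.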
@@ -1,92 +0,0 @@
- # lib/extensions/omniai-open_router.rb
- # frozen_string_literal: true
-
- require 'omniai'
- require 'omniai/openai'
-
- module OmniAI
-
-   # Create an alias for OmniAI::OpenAI module
-   module OpenRouter
-     extend OmniAI::OpenAI
-
-     # Alias classes from OmniAI::OpenAI
-     class Client < OmniAI::OpenAI::Client
-       def initialize(**options)
-         options[:host] = 'https://openrouter.ai/api/v1' unless options.has_key?(:host)
-         super(**options)
-       end
-
-       def self.openrouter
-         OmniAI::OpenRouter::Client
-       end
-
-       def self.open_router
-         OmniAI::OpenRouter::Client
-       end
-
-       def self.find(provider:, **)
-         return OmniAI.open_router.new(**) if :open_reouter == provider
-
-         super(provider: provider.to_s, **)
-       end
-     end
-
-     Chat = OmniAI::OpenAI::Chat
-
-     class Chat
-       def path
-         "/api/v1/chat/completions"
-       end
-     end
-
-     Config = OmniAI::OpenAI::Config
-
-     # Alias the Thread class and its nested classes
-     Thread = OmniAI::OpenAI::Thread
-     Thread::Annotation = OmniAI::OpenAI::Thread::Annotation
-     Thread::Attachment = OmniAI::OpenAI::Thread::Attachment
-     Thread::Message = OmniAI::OpenAI::Thread::Message
-     Thread::Run = OmniAI::OpenAI::Thread::Run
-   end
- end
-
- ######################################################
- ## Extend Capabilities Using OpenRouter
- #
- # TODO: catch the models db
- # TODO: consider wrapping the models database in an ActiveModel
- #
- class AiClient
-   class << self
-     def orc_models
-       @orc_models ||= ORC.models if defined?(ORC)
-     end
-
-     def orc_model_names(provider=nil)
-       if provider.nil?
-         orc_models.map{|e| e['id']}
-       else
-         orc_models
-           .map{|e| e['id']}
-           .select{|name| name.start_with? provider.to_s.downcase}
-           .map{|e| e.split('/')[1]}
-       end
-     end
-
-     def orc_model_details(model)
-       orc_models.select{|e| e['id'].include?(model)}
-     end
-   end
- end
-
- if ENV.fetch('OPEN_ROUTER_API_KEY', nil)
-   OpenRouter.configure do |config|
-     config.access_token = ENV.fetch('OPEN_ROUTER_API_KEY', nil)
-   end
-
-   # Use a default provider/model
-   AiClient::ORC = OpenRouter::Client.new
- end
-
-
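The `orc_model_names` method in the removed file above is plain `Enumerable` filtering over the model records that OpenRouter returns. A data-only sketch, using a hard-coded stand-in for the array that `ORC.models` would fetch:

```ruby
# Hypothetical stand-in for the Array of Hash returned by ORC.models.
models = [
  { 'id' => 'openai/gpt-3.5-turbo' },
  { 'id' => 'openai/gpt-4o' },
  { 'id' => 'anthropic/claude-2.1' }
]

# Same pipeline as orc_model_names(:openai): collect ids, keep the
# chosen provider's entries, then strip the "provider/" prefix.
names       = models.map { |e| e['id'] }
openai_only = names.select { |n| n.start_with?('openai') }
                   .map { |n| n.split('/')[1] }

p openai_only  # => ["gpt-3.5-turbo", "gpt-4o"]
```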
@@ -1,97 +0,0 @@
- # Notes on OpenRouter
-
- OpenROuter is a web service that has a common API to many
- back-end LLM processors. Its goal is basically the same as the
- OmniAI gem - provide the flexibility of using multiple models
- processed by myltiple providers.
-
- ```ruby
- OpenRouter.configure do |config|
-   config.access_token = ENV.fetch('OPEN_ROUTER_API_KEY', nil)
- end
-
- # Use a default provider/model
- AI = OpenRouter::Client.new
-
- # Returns an Array of Hash for supported
- # models/providers
- Models = AI.models
- ```
-
- models with a "/" are targeted to open router.
- before the "/" is the provider after it is the model name
-
- Will need to add this entriy to the AiClient::Config `provider_patterns` Hash:
-
- ```ruby
- open_router: /\//, # /(.*)\/(.*)/ provider / model name
- ```
-
- models can be an Array of Strings. The first is the primary while
- the rest are fallbacks in case there is one before fails
-
- ```ruby
- {
-   "models": ["anthropic/claude-2.1", "gryphe/mythomax-l2-13b"],
-   "route": "fallback",
-   ... // Other params
- }
- ```
-
- You can have OpenRouter send your prompt to the best
- provider/model for the prompt like this:
-
- ```ruby
- require "open_router"
-
- OpenRouter.configure do |config|
-   config.access_token = ENV["ACCESS_TOKEN"]
-   config.site_name = "YOUR_APP_NAME"
-   config.site_url = "YOUR_SITE_URL"
- end
-
- OpenRouter::Client.new.complete(
-   model: "openrouter/auto",
-   messages: [
-     {
-       "role": "user",
-       "content": "What is the meaning of life?"
-     }
-   ]
- ).then do |response|
-   puts response.dig("choices", 0, "message", "content")
- end
- ```
-
- OpenRouter can also support OpenAI's API by using this
- base_url: "https://openrouter.ai/api/v1",
-
- Request Format Documentation
- https://openrouter.ai/docs/requests
-
- Simple Quick Start ...
-
- ```ruby
- OpenRouter::Client.new.complete(
-   model: "openai/gpt-3.5-turbo",
-   messages: [
-     {
-       "role": "user",
-       "content": "What is the meaning of life?"
-     }
-   ]
- ).then do |response|
-   puts response.dig("choices", 0, "message", "content")
- end
- ```
-
- ## Design Approaches
-
- There are at least two different approaches to
- integrate the OpenRouter capability. 1) Use the open_router gem
- and forget about using the same common-ish
- API established by OmniAI; or 2) Take advantage
- or OpenRouter's OpenAI's API and do for it
- the same thing that was done of Ollama and LocalAI.
-
-
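The deleted notes above propose adding an `open_router: /\//` entry to the `AiClient::Config` `provider_patterns` Hash, so that any "provider/model" id routes to OpenRouter. A minimal sketch of how such first-match pattern routing could work; everything beyond the `open_router` entry is an assumption for illustration, not the gem's actual table.

```ruby
# Hypothetical provider_patterns table; first matching regex wins,
# so the open_router "/" rule is checked before the others.
PROVIDER_PATTERNS = {
  open_router: %r{/},      # "provider/model" ids belong to OpenRouter
  openai:      /^gpt/,     # assumed entry for illustration
  anthropic:   /^claude/   # assumed entry for illustration
}.freeze

# Return the first provider whose pattern matches the model id, or nil.
def provider_for(model, patterns = PROVIDER_PATTERNS)
  entry = patterns.find { |_provider, pattern| model.match?(pattern) }
  entry&.first
end

puts provider_for('anthropic/claude-2.1')   # => open_router (the "/" wins)
puts provider_for('claude-2.1')             # => anthropic
puts provider_for('mystery-model').inspect  # => nil
```

Because Ruby Hashes preserve insertion order, putting `open_router` first guarantees that slash-qualified names never fall through to a plain provider pattern.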