ai_client 0.2.2 → 0.2.4

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 07a66886ce84dd916574f7a38e292626124ed1a048da92eec33cfdc0e66435e4
- data.tar.gz: dc9c6734ae4a6a479ed8f6b76fc8edea31dbb588f67ae801b915a067346c5735
+ metadata.gz: 7b12150b0f3f29804d520aa319a99cc71adc96f023ec99bccc0a9a6682bdba6e
+ data.tar.gz: 59d811a6b9de974f76672a9a1386d6efd3e4547d53ae7b4d24c532a250c00c86
  SHA512:
- metadata.gz: ced4f6a3834132d0c1d127ad3fcd931e9600d748a28d6593fc9e382fb6bb98b9fd31de56d44d5a216c5f0bb8394c360f04779995fd2345ae1043e137cd994091
- data.tar.gz: be77228ac195d1f3de41f1ec0e82f2d2e2aa0f280e51354156c00b9c40bdb466581c87b4ba73b22c57fa86300df9555b7a2230854ac6ff93e365028b946275ba
+ metadata.gz: 0beacebae38d5c2101498a59245c5112469ff8fddda2723b5d916ae6d5f731fa6e9e55de46422a6daf463242356e1575c0bbdf77612f9edf9fd05d6b4f885407
+ data.tar.gz: 36b2b5f5fd3d51577b31ac8cbcd57980cb7d0a912428f37e8f345764620aa02198a65b905505136ea40ea2866e274b1f8d6b9b1c9390ccd3608eaad138c37ebb
data/CHANGELOG.md CHANGED
@@ -1,6 +1,18 @@
  ## [Unreleased]
 
  ## Released
+ ### [0.2.4] - 2024-10-10
+ - constrained gem omniai-openai to version 1.8.3+ for access to open_router.ai
+ - caching models database from open_router.ai
+ - added class methods reset_default_config and reset_llm_data
+ - support for open_router.ai should be fully integrated now that omniai-openai is at version 1.8.3
+
+ ### [0.2.3] - 2024-10-08
+ - refactored the OmniAI extensions for Ollama, LocalAI and OpenRouter
+ - added a file for OpenRouter extensions
+ - added the LLM class
+
  ### [0.2.2] - 2024-10-07
  - Added support for open_router.ai with extensions/omniai-open_router.rb
  - Removed the `model_type` object
data/README.md CHANGED
@@ -4,32 +4,55 @@ First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com
 
  See the [change log](CHANGELOG.md) for recent modifications.
 
- ## Summary
 
- Are you ready to supercharge your applications with cutting-edge AI capabilities?
- Introducing `ai_client`, the ultimate Ruby gem that provides a seamless interface
- for interacting with a multitude of AI service providers through a single,
- unified API.
+ <!-- Tocer[start]: Auto-generated, don't remove. -->
+
+ ## Table of Contents
+
+ - [Summary](#summary)
+ - [Installation](#installation)
+ - [Environment Variables for Provider Access](#environment-variables-for-provider-access)
+ - [Changing Envar API Key Names](#changing-envar-api-key-names)
+ - [api_key: Parameter](#api_key-parameter)
+ - [provider: Parameter](#provider-parameter)
+ - [Usage](#usage)
+ - [Configuration](#configuration)
+ - [Default Configuration](#default-configuration)
+ - [Class Configuration](#class-configuration)
+ - [1. Class Configuration Block](#1-class-configuration-block)
+ - [2. Set by a Config File](#2-set-by-a-config-file)
+ - [3. Supplemented by a Config File](#3-supplemented-by-a-config-file)
+ - [Instance Configuration](#instance-configuration)
+ - [1. Supplement from a Constructor Block](#1-supplement-from-a-constructor-block)
+ - [2. Supplement from a YAML File](#2-supplement-from-a-yaml-file)
+ - [3. Load Complete Configuration from a YAML File](#3-load-complete-configuration-from-a-yaml-file)
+ - [Top-level Client Methods](#top-level-client-methods)
+ - [chat](#chat)
+ - [embed](#embed)
+ - [speak](#speak)
+ - [transcribe](#transcribe)
+ - [Options](#options)
+ - [Advanced Prompts](#advanced-prompts)
+ - [Advanced Prompts with Tools](#advanced-prompts-with-tools)
+ - [Best ?? Practices](#best--practices)
+ - [OmniAI and OpenRouter](#omniai-and-openrouter)
+ - [Contributing](#contributing)
+ - [License](#license)
+
+ <!-- Tocer[finish]: Auto-generated, don't remove. -->
+
+ ## Summary
 
- With `ai_client`, you can effortlessly integrate large language models (LLMs)
- into your projects—simply specify the model name and let the gem handle the
- rest! Say goodbye to tedious configuration and hello to rapid development.
+ Are you ready to supercharge your applications with cutting-edge AI capabilities? Introducing `ai_client`, the ultimate Ruby gem that provides a seamless interface for interacting with a multitude of AI service providers through a single, unified API.
 
- This gem comes packed with built-in support for leading AI providers, including
- OpenAI, Anthropic, Google, Mistral, LocalAI, and Ollama. Whether you need to
- implement chatbots, transcription services, speech synthesis, or embeddings,
- `ai_client` abstracts the complexities of API interactions, allowing you to focus
- on what truly matters: building amazing applications.
+ With `ai_client`, you can effortlessly integrate large language models (LLMs) into your projects—simply specify the model name and let the gem handle the rest! Say goodbye to tedious configuration and hello to rapid development.
 
- Plus, with its flexible middleware architecture, you can easily customize request
- and response processing—implement logging, retry logic, and more with minimal effort.
- And thanks to its seamless integration with the `OmniAI` framework, you can leverage
- the latest AI advancements without worrying about vendor lock-in.
+ This gem comes packed with built-in support for leading AI providers, including OpenAI, Anthropic, Google, Mistral, LocalAI, and Ollama. Whether you need to implement chatbots, transcription services, speech synthesis, or embeddings, `ai_client` abstracts the complexities of API interactions, allowing you to focus on what truly matters: building amazing applications.
 
- Join the growing community of developers who are transforming their applications
- with `ai_client`. Install it today and unlock the full potential of AI in your
- projects!
+ Plus, with its flexible middleware architecture, you can easily customize request and response processing—implement logging, retry logic, and more with minimal effort. And thanks to its seamless integration with the `OmniAI` framework, you can leverage the latest AI advancements without worrying about vendor lock-in.
 
+ Join the growing community of developers who are transforming their applications with `ai_client`. Install it today and unlock the full potential of AI in your projects!
 
  ## Installation
 
@@ -45,13 +68,9 @@ If bundler is not being used to manage dependencies, install the gem by executin
  gem install ai_client
  ```
 
- ## Providers Supported
+ ## Environment Variables for Provider Access
 
- To explicitely designate a provider to use with an AiClient instance
- use the parameter `provider: :your_provider` with the Symbol for the supported
- provider you want to use with the model you specify. The following providers
- are supported by the OmniAI gem upon which AiClient depends along with a few
- extensions.
+ For-fee providers require an account and issue an access token that allows the use of their LLM models. These access tokens are typically saved in system environment variables or some other secure data store. AiClient has a default set of system environment variable names for these access tokens, based upon the pattern `provider_api_key`, which can be overridden.
 
  | Symbol | Envar API Key | Client Source |
  | --- | --- | --- |
@@ -63,6 +82,25 @@ extensions.
  | :open_router | [OPEN_ROUTER_API_KEY](https://openrouter.ai/) | AiClient Extension |
  | :openai | [OPENAI_API_KEY](https://www.openai.com/) | OmniAI |
 
+ ### Changing Envar API Key Names
+
+ You can also configure the system environment variable names to match your own standards at the class level.
+
+ ```ruby
+ AiClient.class_config.envar_api_key_names = {
+   anthropic: 'your_envar_name',
+   google: 'your_envar_name',
+   mistral: 'your_envar_name',
+   open_router: 'your_envar_name',
+   openai: 'your_envar_name'
+ }
+
+ AiClient.class_config.save('path/to/file.yml')
+ ```
+
+ ### api_key: Parameter
+
  In case you are using a different environment variable for your access token than the ones shown above you can use the `api_key:` parameter.
 
  ```ruby
@@ -71,6 +109,12 @@ client = AiClient.new('provider/model_name', api_key: ENV['OPENROUTER_API_KEY'])
  ```
 
  This way if you are using `AiClient` inside of a Rails application you can retrieve your access token from a secrets file.
 
+ ### provider: Parameter
+
+ To explicitly designate a provider to use with an AiClient instance, use the parameter `provider: :your_provider` with the Symbol for the supported provider you want to use with the model you specify. The providers listed in the table above are supported by the OmniAI gem, upon which AiClient depends, along with a few extensions.
+
  ## Usage
 
  Basic usage:
@@ -95,22 +139,17 @@ AI = AiClient.new('nomic-embed-text', provider: :ollama)
 
  ### Configuration
 
- There are three levels of configuration, each inherenting from the level above. The following sections
- describe those configuration levels.
+ There are three levels of configuration, each inheriting from the level above. The following sections describe those configuration levels.
 
  #### Default Configuration
 
- The file [lib/ai_client/configuration.rb] hard codes the default configuration. This is used to
- update the [lib/ai_client/config.yml] file during development. If you have
- some changes for this configuration please send me a pull request so we
- can all benefit from your efforts.
+ The file [lib/ai_client/configuration.rb] hard codes the default configuration. This is used to update the [lib/ai_client/config.yml] file during development. If you have some changes for this configuration, please send me a pull request so we can all benefit from your efforts.
 
  #### Class Configuration
 
- The class configuration is derived initially from the default configuration. It
- can be changed in three ways.
+ The class configuration is derived initially from the default configuration. It can be changed in three ways.
 
- 1. Class Configuration Block
+ ##### 1. Class Configuration Block
 
  ```ruby
  AiClient.configuration do |config|
@@ -119,13 +158,13 @@ AiClient.configuration do |config|
  end
  ```
 
- 2. Set by a Config File
+ ##### 2. Set by a Config File
 
  ```ruby
  AiClient.class_config = AiClient::Config.load('path/to/file.yml')
  ```
 
- 3. Supplemented by a Config File
+ ##### 3. Supplemented by a Config File
 
  ```ruby
  AiClient.class_config.merge! AiClient::Config.load('path/to/file.yml')
@@ -133,12 +172,9 @@ AiClient.class_config.merge! AiClient::Config.load('path/to/file.yml')
  ```
 
  #### Instance Configuration
 
- All instances have a configuration. Initially that configuration is the same
- as the class configuration; however, each instance can have its own separate
- configuration. For an instance the class configuration can either be supplemented
- or complete over-ridden.
+ All instances have a configuration. Initially that configuration is the same as the class configuration; however, each instance can have its own separate configuration. For an instance, the class configuration can either be supplemented or completely overridden.
 
- 1. Supplement from a Constructor Block
+ ##### 1. Supplement from a Constructor Block
 
  ```ruby
  client = AiClient.new('super-ai-overlord-model') do |config|
@@ -147,43 +183,111 @@ client = AiClient.new('super-ai-overlord-model') do |config|
  end
  ```
 
- 2. Suppliment from a YAML File
+ ##### 2. Supplement from a YAML File
 
  ```ruby
  client = AiClient.new('baby-model', config: 'path/to/file.yml')
  ```
 
- 3. Load Complete Configuration from a YAML File
+ ##### 3. Load Complete Configuration from a YAML File
 
  ```ruby
  client = AiClient.new('your-model')
  client.config = AiClient::Config.load('path/to/file.yml')
  ```
 
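As a concrete illustration of the YAML-based options above, an override file might look like the following. This is a sketch: the keys mirror the `:timeout`, `:return_raw`, and `:providers` entries that appear in the gem's own config.yml (which serializes symbol keys with a leading colon), and the values here are made up.

```yaml
# path/to/file.yml -- hypothetical override file for AiClient::Config.load
:timeout: 30
:return_raw: true
:providers: {}
```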
- ### What Now?
+ ### Top-level Client Methods
+
+ See the [examples directory](examples/README.md) for some ideas on how to use AiClient.
+
+ The following examples are based upon the same client configuration.
+
+ ```ruby
+ AI = AiClient.new(...) do ... end
+ ```
+
+ ##### chat
+
+ Typically `chat(...)` is the most used top-level method. It is sometimes referred to as completion: you give a prompt to an LLM and expect the LLM to respond (i.e. complete its transformation). If you consider the prompt to be a question, the response would be the answer. If the prompt were a task, the response would be the completion of that task.
+
+ ```ruby
+ response = AI.chat(...)
+ ```
+
+ The simplest form is a string prompt. The prompt can come from anywhere - a literal, a variable, a database or a file.
+
+ ```ruby
+ response = AI.chat("Is there anything simpler than this?")
+ ```
+
+ The response will be a simple string or a response object, based upon the setting of your `config.return_raw` item. If `true` you get the whole shebang. If `false` you get just the string.
+
+ See the [Advanced Prompts] section to learn how to configure a complex prompt message.
+
+ ##### embed
 
- TODO: Document the methods and their options.
+ Embeddings (as in 'embed additional information') are how retrieval augmented generation (RAG) works - a deeper subject for another place. Basically, when using an LLM that supports the vectorization of stuff, you can use `embed(stuff)` to return the vector associated with the stuff you gave the model. This vector (an Array of Floating Point Numbers) is a mathematical representation of the stuff that can be used to compare, mathematically, one piece of stuff against a collection of stuff to find other stuff in that collection that closely resembles the stuff for which you are looking. Q: What is stuff? A: You know; it's just stuff.
 
  ```ruby
- AI.chat(...)
- AI.transcribe(...)
- AI.speak(...)
  AI.embed(...)
- AI.batch_embed(...)
+ response = AI.batch_embed(...)
  ```
 
- See the [examples directory](examples/README.md) for some ideas on how to use AiClient.
+ Recommendation: Use PostgreSQL, pg_vector and the neighbor gem.
+
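The "compare stuff mathematically" idea above is usually cosine similarity between embedding vectors. A minimal sketch in plain Ruby; the three-element vectors are invented stand-ins for what `embed` would return:

```ruby
# Cosine similarity: near 1.0 means the vectors point in the same direction.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

query  = [0.9, 0.1, 0.0]                                 # pretend embedding of a query
corpus = { cat: [0.8, 0.2, 0.1], car: [0.0, 0.1, 0.9] }  # pretend embeddings of documents

# Pick the document whose vector is most similar to the query's.
best = corpus.max_by { |_name, vec| cosine_similarity(query, vec) }
puts best.first  # prints "cat" -- the closest document
```

At scale you would store those vectors in a database and let it do the nearest-neighbor search; hence the PostgreSQL/pg_vector recommendation.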
+ ##### speak
+
+ ```ruby
+ response = AI.speak("Isn't it nice to have a computer that will talk to you?")
+ ```
+
+ The response will contain audio data that can be played, manipulated or saved to a file.
+
+ ##### transcribe
+
+ ```ruby
+ response = AI.transcribe(...)
+ ```
 
- ### System Environment Variables
-
- The API keys used with each LLM provider have the pattern `XXX_API_KEY` where XXX is the name of the provided. For example `OPENAI_API_KEY1` and `ANTROPIC_API_KEY` etc.
-
- TODO: list all providers supported and their envar
 
  ### Options
 
  TODO: document the options like `provider: :ollama`
 
+ ### Advanced Prompts
+
+ In more complex applications, providing a simple string as your prompt is not sufficient. AiClient can take advantage of OmniAI's complex message builder.
+
+ ```ruby
+ client = AiClient.new 'some_model_name'
+
+ completion = client.chat do |prompt|
+   prompt.system('You are an expert biologist with an expertise in animals.')
+   prompt.user do |message|
+     message.text 'What species are in the attached photos?'
+     message.url('https://.../cat.jpeg', "image/jpeg")
+     message.url('https://.../dog.jpeg', "image/jpeg")
+     message.file('./hamster.jpeg', "image/jpeg")
+   end
+ end
+
+ completion #=> 'The photos are of a cat, a dog, and a hamster.'
+ ```
+
+ Of course if `client.config.return_raw` is true, the completion value will be the complete response object.
+
+ ### Advanced Prompts with Tools
+
+ One of the latest innovations in LLMs is the ability to use functions (aka tools) as `callbacks` to gather more information or to execute a task at the direction of the LLM prompt processing.
+
+ See this [blog post](https://ksylvest.com/posts/2024-08-16/using-omniai-to-leverage-tools-with-llms) by Kevin Sylvestre, author of the OmniAI gem.
+
+ TODO: Need to create an example RAG that does not need another access token to a service
+
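The callback loop described above can be illustrated without any LLM at all: the model replies with the name of a function plus arguments, the client executes it, and the result is fed back into the conversation. A toy dispatcher in plain Ruby; nothing here is OmniAI's actual tool API, and the tool name and arguments are invented:

```ruby
# A registry of callables the "model" is allowed to invoke.
TOOLS = {
  'weather' => ->(city:) { "It is sunny in #{city}." }
}

# Pretend the LLM's response asked for this tool call.
tool_call = { name: 'weather', args: { city: 'Paris' } }

# The client dispatches the call; the result would be sent back to the model.
result = TOOLS.fetch(tool_call[:name]).call(**tool_call[:args])
puts result  # prints "It is sunny in Paris."
```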
  ## Best ?? Practices
 
  If you are going to be using one model for multiple purposes in different parts of your application you can assign the instance of `AiClient` to a constant so that the same client can be used everywhere.
@@ -199,18 +303,9 @@ AI.speak "warning Will Robinson! #{bad_things_happened}"
 
  Using the constant for the instance allows you to reference the same client instance inside any method throughout your application. Of course it does not apply to only one instance. You could assign multiple instances for different models/providers. For example you could have `AI` for your primary client and `AIbackup` for a fallback client in case you have a problem on the primary; or, maybe `Vectorizer` as a client name tied to a model specializing in embedding vectorization.
 
- ## Extensions for OmniAI
-
- The AiClient makes use of extensions to the OmniAI gem that define
- additional providers and protocols.
-
- 1. **OmniAI::Ollama** which wraps the OmniAI::OpenAI class
- 2. **OmniAI::LocalAI** which also wraps the OmniAI::OpenAI class
- 3. **OmniAI::OpenRouter** TODO: Still under development
-
  ## OmniAI and OpenRouter
 
- OmniAI is a Ruby gem that supports specific providers directly using a common-ish API. You incur costs directly from those providers for which you have individual API keys (aka access tokens.) OpenRouter, on the other hand, is a web service that also establishes a common API for many providers and models; however, OpenRouter adds a small fee on top of the fee charged by those providers. You trade off cost for flexibility. With OpenRouter you only need one API key (OPEN_ROUTER_API_KEY) to access all of its supported services.
+ Both OmniAI and OpenRouter have similar goals - to provide a common interface to multiple providers and LLMs. OmniAI is a Ruby gem that supports specific providers directly using a common-ish API. You incur costs directly from those providers for which you have individual API keys (aka access tokens). OpenRouter, on the other hand, is a web service that also establishes a common API for many providers and models; however, OpenRouter adds a small fee on top of the fee charged by those providers. You trade off cost for flexibility. With OpenRouter you only need one API key (OPEN_ROUTER_API_KEY) to access all of its supported services.
 
  The advantage of AiClient is that you have the added flexibility to choose on a client-by-client basis where you want your model to be processed. You get free local processing through Ollama and LocalAI. You get less costly direct access to some providers via OmniAI. You get slightly more costly widespread access via OpenRouter.
 
data/Rakefile CHANGED
@@ -3,6 +3,14 @@
  require "bundler/gem_tasks"
  require "minitest/test_task"
 
+ begin
+   require "tocer/rake/register"
+   Tocer::Rake::Register.call
+ rescue LoadError => error
+   puts error.message
+ end
+
  Minitest::TestTask.create
 
  task default: :test
@@ -14,11 +14,23 @@
  binmode: false
  reraise_write_errors: []
  mon_data: !ruby/object:Monitor {}
- mon_data_owner_object_id: 920
+ mon_data_owner_object_id: 45380
  level_override: {}
  :timeout:
  :return_raw: false
  :providers: {}
+ :envar_api_key_names:
+   :anthropic:
+   - ANTHROPIC_API_KEY
+   :google:
+   - GOOGLE_API_KEY
+   :mistral:
+   - MISTRAL_API_KEY
+   :open_router:
+   - OPEN_ROUTER_API_KEY
+   - OPENROUTER_API_KEY
+   :openai:
+   - OPENAI_API_KEY
  :provider_patterns:
    :anthropic: !ruby/regexp /^claude/i
    :openai: !ruby/regexp /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i
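The `:provider_patterns` entries above are what let AiClient infer a provider from a bare model name. A sketch of that lookup in plain Ruby, using the two patterns shown; the method name `guess_provider` is made up, not the gem's API:

```ruby
# The same regexps that appear in the default configuration.
PROVIDER_PATTERNS = {
  anthropic: /^claude/i,
  openai:    /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i
}

# Return the first provider whose pattern matches the model name, else nil.
def guess_provider(model)
  PROVIDER_PATTERNS.find { |_provider, pattern| model.match?(pattern) }&.first
end

puts guess_provider('claude-3-5-sonnet')  # prints "anthropic"
puts guess_provider('gpt-4o')             # prints "openai"
```

This is why `AiClient.new('gpt-4o')` needs no `provider:` parameter while an unrecognized model name does.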
@@ -91,16 +91,6 @@ class AiClient
  include Hashie::Extensions::Mash::PermissiveRespondTo
  include Hashie::Extensions::Mash::SymbolizeKeys
  include Hashie::Extensions::Mash::DefineAccessors
-
-
- # I'm not sure about this ...
- # def provider(name, &block)
- #   if block_given?
- #     providers[name] = block.call
- #   else
- #     providers[name] || {}
- #   end
- # end
 
 
  def save(filepath=ENV['HOME']+'/aiclient_config.yml')
@@ -129,6 +119,11 @@ class AiClient
  yield(class_config)
  end
 
+ def reset_default_config
+   initialize_defaults
+     .save(Config::DEFAULT_CONFIG_FILEPATH)
+ end
+
  private
 
  def initialize_defaults
@@ -137,6 +132,13 @@ class AiClient
  timeout: nil,
  return_raw: false,
  providers: {},
+ envar_api_key_names: {
+   anthropic: ['ANTHROPIC_API_KEY'],
+   google: ['GOOGLE_API_KEY'],
+   mistral: ['MISTRAL_API_KEY'],
+   open_router: ['OPEN_ROUTER_API_KEY', 'OPENROUTER_API_KEY'],
+   openai: ['OPENAI_API_KEY']
+ },
  provider_patterns: {
    anthropic: /^claude/i,
    openai: /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
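Note that `:open_router` lists two candidate envar names. Presumably the lookup takes the first environment variable that is actually set; a sketch of that resolution in plain Ruby (`api_key_for` is a made-up name, not the gem's code):

```ruby
# Candidate environment variable names per provider, first match wins.
ENVAR_API_KEY_NAMES = {
  open_router: ['OPEN_ROUTER_API_KEY', 'OPENROUTER_API_KEY'],
  openai:      ['OPENAI_API_KEY']
}

# Return the value of the first candidate envar that is set, else nil.
def api_key_for(provider)
  ENVAR_API_KEY_NAMES.fetch(provider, []).filter_map { |name| ENV[name] }.first
end

ENV.delete('OPEN_ROUTER_API_KEY')
ENV['OPENROUTER_API_KEY'] = 'sk-demo'  # pretend only the alternate name is set
puts api_key_for(:open_router)         # prints "sk-demo"
```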
@@ -0,0 +1,24 @@
+ # lib/ai_client/llm.rb
+
+ require 'active_hash'
+ require 'pathname'
+ require 'yaml'
+
+ class AiClient
+   class LLM < ActiveHash::Base
+     DATA_PATH = Pathname.new(__dir__ + '/models.yml')
+     self.data = YAML.parse(DATA_PATH.read).to_ruby
+
+     def model    = id.split('/')[1]
+     def provider = id.split('/')[0]
+   end
+
+   class << self
+     def reset_llm_data
+       orc_models = AiClient.orc_client.models
+       AiClient::LLM.data = orc_models
+       AiClient::LLM::DATA_PATH.write(orc_models.to_yaml)
+     end
+   end
+ end
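The one-line `model` and `provider` readers above assume each cached id is a `provider/model` string, which is how open_router.ai reports its models. A standalone illustration of that split, using a hypothetical id:

```ruby
id = 'mistralai/mistral-7b-instruct'  # hypothetical OpenRouter-style model id

provider, model = id.split('/')
puts provider  # prints "mistralai"
puts model     # prints "mistral-7b-instruct"
```

An id without a `/` would leave `model` as nil, so the class depends on the cached data keeping that shape.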