ai_client 0.2.1 → 0.2.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: c938cc076640fa6952b1140252b0c234a44e7dfd6ee24d2485c344d3b2c98880
4
- data.tar.gz: 1dc5cb21b49c2a731689e7ab7d4f6cce56392b06ecb83e2d54b535ec0d0b3d27
3
+ metadata.gz: a4128ef9024e8bc84820346570886aa9b3e25a4e8e442556591d45e0900cf9ea
4
+ data.tar.gz: 64d339534320c27eb060b02ae927f39a829f3792319dc1cd71e3e0920d6aa0ed
5
5
  SHA512:
6
- metadata.gz: d2d81aa7bc979a8c75856965261ce54470ef3df1586eb65e809c2e4d32f8a4771a22a85640e026260826ab691e54742666ce0b9d71f2ba992b60db3d5c689cc8
7
- data.tar.gz: 7347641ccb974d62c0393a364cb09b214019c0e99a2514478632a9b0c4736bf530c0ba415bae316b0438230691418265b21aed46ef32ca122935d73e30085f41
6
+ metadata.gz: ef72dd20224bd2f887686c1c75881e8bd9195c60ac2a416c9acee3a93cf09335b852381ce3bf314cab72e9b2a7136b4d2bf3f0f622f39f55f1555500796fca17
7
+ data.tar.gz: 33cc0e5fcf58ebadb5269138342f4e6c49d42604e2ee2fd22630764c1d288bfafdc3c7999467d2dfe890cd9daae573165f497bd1044d807f80d3a89e97f3de33
data/CHANGELOG.md CHANGED
@@ -1,6 +1,16 @@
1
1
  ## [Unreleased]
2
2
 
3
3
  ## Released
4
+ ### [0.2.3] - 2024-10-08
5
+ - Refactored the OmniAI extensions for Ollama, LocalAI, and OpenRouter
6
+ - Added a file for OpenRouter extensions
7
+ - Added the LLM class
8
+
9
+ ### [0.2.2] - 2024-10-07
10
+ - Added support for open_router.ai with extensions/omniai-open_router.rb
11
+ - Removed the `model_type` object
12
+ - Added ability to dump a config to a YAML file
13
+
4
14
  ### [0.2.1] - 2024-10-05
5
15
  - Added support for YAML configuration files
6
16
 
data/README.md CHANGED
@@ -1,20 +1,58 @@
1
1
  # AiClient
2
2
 
3
- First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com/) for his gem [OmniAI](https://github.com/ksylvest/omniai) upon which this effort depends.
3
+ First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com/) for his gem [OmniAI](https://github.com/ksylvest/omniai) and to [Olympia](https://olympia.chat/) for their [open_router gem](https://github.com/OlympiaAI/open_router), upon which this effort depends.
4
4
 
5
- **This is a work in progress** I could use your help extending its capability.
5
+ See the [change log](CHANGELOG.md) for recent modifications.
6
6
 
7
- AiClien` is working. I've used early versions of it in several projects.
8
7
 
9
- See the [change log](CHANGELOG.md) for recent modifications.
8
+ <!-- Tocer[start]: Auto-generated, don't remove. -->
9
+
10
+ ## Table of Contents
11
+
12
+ - [Summary](#summary)
13
+ - [Installation](#installation)
14
+ - [Environment Variables for Provider Access](#environment-variables-for-provider-access)
15
+ - [Changing Envar API Key Names](#changing-envar-api-key-names)
16
+ - [api_key: Parameter](#api_key-parameter)
17
+ - [provider: Parameter](#provider-parameter)
18
+ - [Usage](#usage)
19
+ - [Configuration](#configuration)
20
+ - [Default Configuration](#default-configuration)
21
+ - [Class Configuration](#class-configuration)
22
+ - [1. Class Configuration Block](#1-class-configuration-block)
23
+ - [2. Set by a Config File](#2-set-by-a-config-file)
24
+ - [3. Supplemented by a Config File](#3-supplemented-by-a-config-file)
25
+ - [Instance Configuration](#instance-configuration)
26
+ - [1. Supplement from a Constructor Block](#1-supplement-from-a-constructor-block)
27
+ - [2. Supplement from a YAML File](#2-supplement-from-a-yaml-file)
28
+ - [3. Load Complete Configuration from a YAML File](#3-load-complete-configuration-from-a-yaml-file)
29
+ - [Top-level Client Methods](#top-level-client-methods)
30
+ - [chat](#chat)
31
+ - [embed](#embed)
32
+ - [speak](#speak)
33
+ - [transcribe](#transcribe)
34
+ - [Options](#options)
35
+ - [Advanced Prompts](#advanced-prompts)
36
+ - [Advanced Prompts with Tools](#advanced-prompts-with-tools)
37
+ - [Best ?? Practices](#best--practices)
38
+ - [OmniAI and OpenRouter](#omniai-and-openrouter)
39
+ - [Contributing](#contributing)
40
+ - [License](#license)
41
+
42
+ <!-- Tocer[finish]: Auto-generated, don't remove. -->
43
+
10
44
 
11
45
  ## Summary
12
46
 
13
- `ai_client` is a versatile Ruby gem that serves as a generic client for interacting with various AI service providers through a unified API provided by Kevin's gem `OmniAI`. The `AiClient` class is designed to simplify the integration of large language models (LLMs) into applications. `AiClient` allows developers to create instances using just the model name, greatly reducing configuration overhead.
47
+ Are you ready to supercharge your applications with cutting-edge AI capabilities? Introducing `ai_client`, the ultimate Ruby gem that provides a seamless interface for interacting with a multitude of AI service providers through a single, unified API.
48
+
49
+ With `ai_client`, you can effortlessly integrate large language models (LLMs) into your projects—simply specify the model name and let the gem handle the rest! Say goodbye to tedious configuration and hello to rapid development.
50
+
51
+ This gem comes packed with built-in support for leading AI providers, including OpenAI, Anthropic, Google, Mistral, LocalAI, Ollama, and OpenRouter. Whether you need to implement chatbots, transcription services, speech synthesis, or embeddings, `ai_client` abstracts the complexities of API interactions, allowing you to focus on what truly matters: building amazing applications.
14
52
 
15
- With built-in support for popular AI providersincluding OpenAI, Anthropic, Google, Mistral, LocalAI and Ollama—the gem abstracts the complexities of API interactions, offering methods for tasks such as chatting, transcription, speech synthesis, and embedding.
53
+ Plus, with its flexible middleware architecture, you can easily customize request and response processing: implement logging, retry logic, and more with minimal effort. And thanks to its seamless integration with the `OmniAI` framework, you can leverage the latest AI advancements without worrying about vendor lock-in.
16
54
 
17
- The middleware architecture enables customizable processing of requests and responses, making it easy to implement features like logging and retry logic. Seamlessly integrated with the `OmniAI` framework, `ai_client` empowers developers to leverage cutting-edge AI capabilities without vendor lock-in, making it an essential tool for modern AI-driven applications.
55
+ Join the growing community of developers who are transforming their applications with `ai_client`. Install it today and unlock the full potential of AI in your projects!
18
56
 
19
57
  ## Installation
20
58
 
@@ -30,6 +68,53 @@ If bundler is not being used to manage dependencies, install the gem by executin
30
68
  gem install ai_client
31
69
  ```
32
70
 
71
+ ## Environment Variables for Provider Access
72
+
73
+ For-fee providers require an account and issue an access token to allow the use of their LLM models. The value of these access tokens is typically saved in system environment variables or some other secure data store. AiClient has a default set of system environment variable names for these access tokens based upon the pattern `PROVIDER_API_KEY`, which can be overridden.
74
+
75
+ | Symbol | Envar API Key | Client Source |
76
+ | --- | --- | --- |
77
+ | :anthropic | [ANTHROPIC_API_KEY](https://www.anthropic.com/) | OmniAI |
78
+ | :google | [GOOGLE_API_KEY](https://cloud.google.com/gemini) | OmniAI |
79
+ | :localai | [LOCALAI_API_KEY](https://localai.io/) | AiClient Extension |
80
+ | :mistral | [MISTRAL_API_KEY](https://mistral.ai/) | OmniAI |
81
+ | :ollama | [OLLAMA_API_KEY](https://ollama.com/) | AiClient Extension |
82
+ | :open_router | [OPEN_ROUTER_API_KEY](https://openrouter.ai/) | AiClient Extension |
83
+ | :openai | [OPENAI_API_KEY](https://www.openai.com/) | OmniAI |
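
As a sketch of how such a fallback lookup can work, here is the pattern in plain Ruby. The token value is a made-up example; AiClient's actual lookup lives in its extension code.

```ruby
# Resolve an access token from a list of candidate environment variable
# names, mirroring the OPEN_ROUTER_API_KEY / OPENROUTER_API_KEY fallback.
ENV['OPENROUTER_API_KEY'] ||= 'sk-or-example-token'  # made-up value for illustration

candidate_names = ['OPEN_ROUTER_API_KEY', 'OPENROUTER_API_KEY']
access_token    = candidate_names.map { |name| ENV[name] }.compact.first
# access_token is the first candidate that is actually set, or nil if none are
```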
84
+
85
+
86
+ ### Changing Envar API Key Names
87
+
88
+ You can also configure the system environment variable names to match your own standards at the class level.
89
+
90
+ ```ruby
91
+ AiClient.class_config.envar_api_key_names = {
92
+ anthropic: 'your_envar_name',
93
+ google: 'your_envar_name',
94
+ mistral: 'your_envar_name',
95
+ open_router: 'your_envar_name',
96
+ openai: 'your_envar_name'
97
+ }
98
+
99
+ AiClient.class_config.save('path/to/file.yml')
100
+ ```
101
+
102
+ ### api_key: Parameter
103
+
104
+ In case you are using a different environment variable for your access token than the ones shown above, you can use the `api_key:` parameter.
105
+
106
+ ```ruby
107
+ client = AiClient.new('provider/model_name', api_key: ENV['OPENROUTER_API_KEY'])
108
+ ```
109
+
110
+ This way, if you are using `AiClient` inside a Rails application, you can retrieve your access token from a secrets file.
111
+
112
+
113
+ ### provider: Parameter
114
+
115
+ To explicitly designate a provider to use with an AiClient instance, use the parameter `provider: :your_provider` with the Symbol for the supported provider you want to use with the model you specify. The providers listed above are supported by the OmniAI gem, upon which AiClient depends, along with a few extensions.
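
When `provider:` is omitted, AiClient infers the provider from the model name using regex patterns. Here is a simplified sketch of that lookup using the default `provider_patterns` from the shipped config; ordering matters, since the first match wins.

```ruby
# Simplified sketch of provider inference from a model name.
# Patterns are the defaults from lib/ai_client/config.yml; first match wins.
PROVIDER_PATTERNS = {
  anthropic:   /^claude/i,
  openai:      /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
  google:      /^(gemini|gemma|palm)/i,
  mistral:     /^(mistral|codestral|mixtral)/i,
  localai:     /^local-/i,
  ollama:      /(llama|nomic)/i,
  open_router: /\//  # "provider/model" style names fall through to OpenRouter
}.freeze

def determine_provider(model)
  PROVIDER_PATTERNS.find { |_provider, pattern| model =~ pattern }&.first
end

determine_provider('claude-3-haiku')  #=> :anthropic
determine_provider('qwen/qwen-2.5')   #=> :open_router
```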
116
+
117
+
33
118
  ## Usage
34
119
 
35
120
  Basic usage:
@@ -45,24 +130,26 @@ c1 = AiClient.new('nomic-embeddings-text')
45
130
  c2 = AiClient.new('gpt-4o-mini')
46
131
  ```
47
132
 
133
+ You can also use the `provider:` parameter in the event that the model you want to use is available through multiple providers or AiClient cannot automatically associate the model name with a provider.
134
+
135
+
136
+ ```ruby
137
+ AI = AiClient.new('nomic-embed-text', provider: :ollama)
138
+ ```
139
+
48
140
  ### Configuration
49
141
 
50
- There are three levels of configuration, each inherenting from the level above. The following sections
51
- describe those configuration levels.
142
+ There are three levels of configuration, each inheriting from the level above. The following sections describe those configuration levels.
52
143
 
53
144
  #### Default Configuration
54
145
 
55
- The file [lib/ai_client/configuration.rb] hard codes the default configuration. This is used to
56
- update the [lib/ai_client/config.yml] file during development. If you have
57
- some changes for this configuration please send me a pull request so we
58
- can all benefit from your efforts.
146
+ The file [lib/ai_client/configuration.rb] hard codes the default configuration. This is used to update the [lib/ai_client/config.yml] file during development. If you have some changes for this configuration please send me a pull request so we can all benefit from your efforts.
59
147
 
60
148
  #### Class Configuration
61
149
 
62
- The class configuration is derived initially from the default configuration. It
63
- can be changed in three ways.
150
+ The class configuration is derived initially from the default configuration. It can be changed in three ways.
64
151
 
65
- 1. Class Configuration Block
152
+ ##### 1. Class Configuration Block
66
153
 
67
154
  ```ruby
68
155
  AiClient.configuration do |config|
@@ -71,13 +158,13 @@ AiClient.configuration do |config|
71
158
  end
72
159
  ```
73
160
 
74
- 2. Set by a Config File
161
+ ##### 2. Set by a Config File
75
162
 
76
163
  ```ruby
77
164
  AiClient.class_config = AiClient::Config.load('path/to/file.yml')
78
165
  ```
79
166
 
80
- 3. Supplemented by a Config File
167
+ ##### 3. Supplemented by a Config File
81
168
 
82
169
  ```ruby
83
170
  AiClient.class_config.merge! AiClient::Config.load('path/to/file.yml')
@@ -85,12 +172,9 @@ AiClient.class_config.merge! AiClient::Config.load('path/to/file.yml')
85
172
 
86
173
  #### Instance Configuration
87
174
 
88
- All instances have a configuration. Initially that configuration is the same
89
- as the class configuration; however, each instance can have its own separate
90
- configuration. For an instance the class configuration can either be supplemented
91
- or complete over-ridden.
175
+ All instances have a configuration. Initially that configuration is the same as the class configuration; however, each instance can have its own separate configuration. For an instance, the class configuration can be either supplemented or completely overridden.
92
176
 
93
- 1. Supplement from a Constructor Block
177
+ ##### 1. Supplement from a Constructor Block
94
178
 
95
179
  ```ruby
96
180
  client = AiClient.new('super-ai-overlord-model') do |config|
@@ -99,51 +183,131 @@ client = AiClient.new('super-ai-overlord-model') do |config|
99
183
  end
100
184
  ```
101
185
 
102
- 2. Suppliment from a YAML File
186
+ ##### 2. Supplement from a YAML File
103
187
 
104
188
  ```ruby
105
189
  client = AiClient.new('baby-model', config: 'path/to/file.yml')
106
190
  ```
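
A minimal sketch of what such a YAML file might contain. The keys come from the configuration items shown elsewhere in this README; the values are made-up examples.

```yaml
:timeout: 30
:return_raw: true
:envar_api_key_names:
  :open_router:
    - OPEN_ROUTER_API_KEY
```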
107
191
 
108
- 3. Load Complete Configuration from a YAML File
192
+ ##### 3. Load Complete Configuration from a YAML File
109
193
 
110
194
  ```ruby
111
195
  client = AiClient.new('your-model')
112
196
  client.config = AiClient::Config.load('path/to/file.yml')
113
197
  ```
114
198
 
115
- ### What Now?
199
+ ### Top-level Client Methods
116
200
 
117
- TODO: Document the methods and their options.
201
+ See the [examples directory](examples/README.md) for some ideas on how to use AiClient.
202
+
203
+ The following examples are based upon the same client configuration.
204
+
205
+ ```ruby
206
+ AI = AiClient.new(...) do ... end
207
+ ```
208
+
209
+ #### chat
210
+
211
+ Typically `chat(...)` is the most used top-level method. It is sometimes referred to as completion: you give a prompt to an LLM and expect the LLM to respond (i.e., complete its transformation). If you consider the prompt to be a question, the response would be the answer. If the prompt were a task, the response would be the completion of that task.
212
+
213
+ ```ruby
214
+ response = AI.chat(...)
215
+ ```
216
+
217
+ The simplest form is a string prompt. The prompt can come from anywhere: a literal, a variable, or something retrieved from a database or a file.
218
+
219
+ ```ruby
220
+ response = AI.chat("Is there anything simpler than this?")
221
+ ```
222
+
223
+ The response will be a simple string or a response object, based upon the setting of your `config.return_raw` item. If `true`, you get the whole shebang. If `false`, you get just the string.
224
+
225
+ See the [Advanced Prompts](#advanced-prompts) section to learn how to configure a complex prompt message.
226
+
227
+
228
+ #### embed
229
+
230
+ Embeddings (as in 'embed additional information') are how retrieval augmented generation (RAG) works, which is a deeper subject for another place. Basically, when using an LLM that supports the vectorization of stuff to create embeddings, you can use `embed(stuff)` to return the vector associated with the stuff you gave the model. This vector (an Array of floating point numbers) is a mathematical representation of the stuff that can be used to compare, mathematically, one piece of stuff to a collection of stuff to find other stuff in that collection that closely resembles the stuff for which you are looking. Q: What is stuff? A: You know; it's just stuff.
118
231
 
119
232
  ```ruby
120
- AI.chat(...)
121
- AI.transcribe(...)
122
- AI.speak(...)
123
233
  AI.embed(...)
124
- AI.batch_embed(...)
234
+ response = AI.batch_embed(...)
125
235
  ```
126
236
 
127
- See the [examples directory](examples/README.md) for some ideas on how to use AiClient.
237
+ Recommendation: Use PostgreSQL, pg_vector and the neighbor gem.
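
To make the "compare stuff mathematically" part concrete, here is a tiny, provider-independent sketch of cosine similarity, a common way two embedding vectors are compared. The vectors below are made up; real ones come back from `embed`.

```ruby
# Cosine similarity between two embedding vectors (plain Ruby, no provider).
def cosine_similarity(a, b)
  dot   = a.zip(b).sum { |x, y| x * y }
  mag_a = Math.sqrt(a.sum { |x| x * x })
  mag_b = Math.sqrt(b.sum { |x| x * x })
  dot / (mag_a * mag_b)
end

v1 = [0.1, 0.2, 0.7]   # made-up "stuff" vector
v2 = [0.1, 0.2, 0.6]   # similar made-up stuff
cosine_similarity(v1, v1)  # ≈ 1.0 (identical stuff)
cosine_similarity(v1, v2)  # close to 1.0 (similar stuff)
```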
128
238
 
129
- ### System Environment Variables
239
+ #### speak
240
+
241
+ ```ruby
242
+ response = AI.speak("Isn't it nice to have a computer that will talk to you?")
243
+ ```
244
+
245
+ The response will contain audio data that can be played, manipulated or saved to a file.
246
+
247
+ #### transcribe
248
+
249
+ ```ruby
250
+ response = AI.transcribe(...)
251
+ ```
130
252
 
131
- The API keys used with each LLM provider have the pattern `XXX_API_KEY` where XXX is the name of the provided. For example `OPENAI_API_KEY1` and `ANTROPIC_API_KEY` etc.
132
253
 
133
- TODO: list all providers supported and their envar
134
254
 
135
255
  ### Options
136
256
 
137
257
  TODO: document the options like `provider: :ollama`
138
258
 
139
- ## Extensions for OmniAI
259
+ ### Advanced Prompts
260
+
261
+ In more complex applications, providing a simple string as your prompt is not sufficient. AiClient can take advantage of OmniAI's complex message builder.
262
+
263
+ ```ruby
264
+ client = AiClient.new 'some_model_name'
265
+
266
+ completion = client.chat do |prompt|
267
+ prompt.system('You are an expert biologist with an expertise in animals.')
268
+ prompt.user do |message|
269
+ message.text 'What species are in the attached photos?'
270
+ message.url('https://.../cat.jpeg', "image/jpeg")
271
+ message.url('https://.../dog.jpeg', "image/jpeg")
272
+ message.file('./hamster.jpeg', "image/jpeg")
273
+ end
274
+ end
275
+
276
+ completion #=> 'The photos are of a cat, a dog, and a hamster.'
277
+ ```
278
+
279
+ Of course if `client.config.return_raw` is true, the completion value will be the complete response object.
280
+
281
+ ### Advanced Prompts with Tools
282
+
283
+ One of the latest innovations in LLMs is the ability to use functions (aka tools) as `callbacks` to gather more information or to execute a task at the direction of the LLM prompt processing.
284
+
285
+ See [blog post](https://ksylvest.com/posts/2024-08-16/using-omniai-to-leverage-tools-with-llms) by Kevin Sylvestre, author of the OmniAI gem.
286
+
287
+
288
+ TODO: Need to create an example RAG that does not need another access token to a service
289
+
290
+
291
+ ## Best ?? Practices
292
+
293
+ If you are going to be using one model for multiple purposes in different parts of your application you can assign the instance of `AiClient` to a constant so that the same client can be used everywhere.
294
+
295
+ ```ruby
296
+ AI = AiClient.new 'gpt-4o'
297
+ ...
298
+ AI.chat "do something with this #{stuff}"
299
+ ...
300
+ AI.speak "warning Will Robinson! #{bad_things_happened}"
301
+ ...
302
+ ```
303
+
304
+ Using a constant for the instance allows you to reference the same client instance inside any method throughout your application. Of course you are not limited to only one instance. You could assign multiple constants for different models/providers. For example, you could have `AI` for your primary client and `AIbackup` for a fallback client in case you have a problem with the primary; or maybe `Vectorizer` as a client name tied to a model specializing in embedding vectorization.
305
+
306
+ ## OmniAI and OpenRouter
140
307
 
141
- The AiClient makes use of extensions to the OmniAI gem that define
142
- additional providers and protocols.
308
+ Both OmniAI and OpenRouter have similar goals: to provide a common interface to multiple providers and LLMs. OmniAI is a Ruby gem that supports specific providers directly using a common-ish API. You incur costs directly from those providers, for which you have individual API keys (aka access tokens). OpenRouter, on the other hand, is a web service that also establishes a common API for many providers and models; however, OpenRouter adds a small fee on top of the fee charged by those providers. You trade off cost for flexibility. With OpenRouter you only need one API key (OPEN_ROUTER_API_KEY) to access all of its supported services.
143
309
 
144
- 1. **OmniAI::Ollama^** which wraps the OmniAI::OpenAI class
145
- 2. **OmniAI::LocalAI** which also wraps the OmniAI::OpenAI class
146
- 3. **OmniAI::OpenRouter** TODO: Still under development
310
+ The advantage of AiClient is that you have the added flexibility to choose, on a client-by-client basis, where you want your model to be processed. You get free local processing through Ollama and LocalAI. You get less costly direct access to some providers via OmniAI. You get slightly more costly, widespread access via OpenRouter.
147
311
 
148
312
  ## Contributing
149
313
 
data/Rakefile CHANGED
@@ -3,6 +3,14 @@
3
3
  require "bundler/gem_tasks"
4
4
  require "minitest/test_task"
5
5
 
6
+ begin
7
+ require "tocer/rake/register"
8
+ rescue LoadError => error
9
+ puts error.message
10
+ end
11
+
12
+ Tocer::Rake::Register.call
13
+
6
14
  Minitest::TestTask.create
7
15
 
8
16
  task default: :test
data/examples/speak.rb CHANGED
@@ -24,7 +24,7 @@ end
24
24
  title "Default Configuration Text-to-Speech"
25
25
 
26
26
  clients.each do |c|
27
- puts "\nModel: #{c.model} (#{c.model_type}) Provider: #{c.provider}"
27
+ puts "\nModel: #{c.model} Provider: #{c.provider}"
28
28
  text = "Text to speach example using the #{c.model} by provider #{c.provider} with the default voice."
29
29
  result = c.speak(text)
30
30
  puts "Audio generated. Tempfile: #{result.path}"
data/examples/text.rb CHANGED
@@ -48,7 +48,7 @@ end
48
48
  title "Default Configuration Response to 'hello'"
49
49
 
50
50
  clients.each do |c|
51
- puts "\nModel: #{c.model} (#{c.model_type}) Provider: #{c.provider}"
51
+ puts "\nModel: #{c.model} Provider: #{c.provider}"
52
52
  begin
53
53
  response = c.chat('hello')
54
54
  puts response
@@ -73,7 +73,7 @@ puts
73
73
  title "Raw Configuration Response to 'hello'"
74
74
 
75
75
  raw_clients.each do |c|
76
- puts "\nModel: #{c.model} (#{c.model_type}) Provider: #{c.provider}"
76
+ puts "\nModel: #{c.model} Provider: #{c.provider}"
77
77
  begin
78
78
  result = c.chat('hello')
79
79
  puts result.pretty_inspect
@@ -21,7 +21,7 @@ title "Default Configuration Speech-to-Text"
21
21
 
22
22
 
23
23
  clients.each do |c|
24
- puts "\nModel: #{c.model} (#{c.model_type}) Provider: #{c.provider}"
24
+ puts "\nModel: #{c.model} Provider: #{c.provider}"
25
25
  result = c.transcribe(audio_file)
26
26
  puts "Transcription: #{result.pretty_inspect}"
27
27
  end
@@ -0,0 +1,24 @@
1
+ # lib/ai_client/chat.rb
2
+
3
+ class AiClient
4
+
5
+ ######################################
6
+ # OmniAI Params
7
+ # model: @model [String] optional
8
+ # format: @format [Symbol] optional :text or :json
9
+ # stream: @stream [Proc, nil] optional
10
+ # tools: @tools [Array<OmniAI::Tool>] optional
11
+ # temperature: @temperature [Float, nil] optional
12
+
13
+ def chat(messages, **params)
14
+ result = call_with_middlewares(:chat_without_middlewares, messages, **params)
15
+ @last_response = result
16
+ raw? ? result : content
17
+ end
18
+
19
+
20
+ def chat_without_middlewares(messages, **params)
21
+ @client.chat(messages, model: @model, **params)
22
+ end
23
+
24
+ end
@@ -14,20 +14,28 @@
14
14
  binmode: false
15
15
  reraise_write_errors: []
16
16
  mon_data: !ruby/object:Monitor {}
17
- mon_data_owner_object_id: 920
17
+ mon_data_owner_object_id: 700
18
18
  level_override: {}
19
19
  :timeout:
20
20
  :return_raw: false
21
21
  :providers: {}
22
+ :envar_api_key_names:
23
+ :anthropic:
24
+ - ANTHROPIC_API_KEY
25
+ :google:
26
+ - GOOGLE_API_KEY
27
+ :mistral:
28
+ - MISTRAL_API_KEY
29
+ :open_router:
30
+ - OPEN_ROUTER_API_KEY
31
+ - OPENROUTER_API_KEY
32
+ :openai:
33
+ - OPENAI_API_KEY
22
34
  :provider_patterns:
23
35
  :anthropic: !ruby/regexp /^claude/i
24
- :openai: !ruby/regexp /^(gpt|davinci|curie|babbage|ada|whisper|tts|dall-e)/i
25
- :google: !ruby/regexp /^(gemini|palm)/i
26
- :mistral: !ruby/regexp /^(mistral|codestral)/i
36
+ :openai: !ruby/regexp /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i
37
+ :google: !ruby/regexp /^(gemini|gemma|palm)/i
38
+ :mistral: !ruby/regexp /^(mistral|codestral|mixtral)/i
27
39
  :localai: !ruby/regexp /^local-/i
28
40
  :ollama: !ruby/regexp /(llama|nomic)/i
29
- :model_types:
30
- :text_to_text: !ruby/regexp /^(nomic|gpt|davinci|curie|babbage|ada|claude|gemini|palm|command|generate|j2-|mistral|codestral)/i
31
- :speech_to_text: !ruby/regexp /^whisper/i
32
- :text_to_speech: !ruby/regexp /^tts/i
33
- :text_to_image: !ruby/regexp /^dall-e/i
41
+ :open_router: !ruby/regexp /\//
@@ -65,6 +65,18 @@
65
65
  # configuration you can do this.
66
66
  # client = AiClient.new('your_model')
67
67
  # client.config = AiClient::Config.load('path/to/file.yml')
68
+ #
69
+ # OmniAI Configuration Items
70
+ # OmniAI::OpenAI
71
+ # config.api_key = '...'
72
+ # config.host = 'http://localhost:8080'
73
+ # config.logger = Logger.new(STDOUT)
74
+ # config.timeout = 15
75
+ # config.chat_options = { ... }
76
+ # config.transcribe_options = { ... }
77
+ # config.speak_options = { ... }
78
+ #
79
+
68
80
 
69
81
 
70
82
  require 'hashie'
@@ -79,16 +91,14 @@ class AiClient
79
91
  include Hashie::Extensions::Mash::PermissiveRespondTo
80
92
  include Hashie::Extensions::Mash::SymbolizeKeys
81
93
  include Hashie::Extensions::Mash::DefineAccessors
82
-
83
94
 
84
- # I'm not sure about this ...
85
- # def provider(name, &block)
86
- # if block_given?
87
- # providers[name] = block.call
88
- # else
89
- # providers[name] || {}
90
- # end
91
- # end
95
+
96
+ def save(filepath=ENV['HOME']+'/aiclient_config.yml')
97
+ filepath = Pathname.new(filepath) unless filepath.is_a? Pathname
98
+
99
+ filepath.write(YAML.dump(to_hash))
100
+ end
101
+
92
102
 
93
103
  class << self
94
104
  def load(filepath=DEFAULT_CONFIG_FILEPATH)
@@ -117,21 +127,24 @@ class AiClient
117
127
  timeout: nil,
118
128
  return_raw: false,
119
129
  providers: {},
130
+ envar_api_key_names: {
131
+ anthropic: ['ANTHROPIC_API_KEY'],
132
+ google: ['GOOGLE_API_KEY'],
133
+ mistral: ['MISTRAL_API_KEY'],
134
+ open_router: ['OPEN_ROUTER_API_KEY', 'OPENROUTER_API_KEY'],
135
+ openai: ['OPENAI_API_KEY']
136
+ },
120
137
  provider_patterns: {
121
138
  anthropic: /^claude/i,
122
- openai: /^(gpt|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
123
- google: /^(gemini|palm)/i,
124
- mistral: /^(mistral|codestral)/i,
139
+ openai: /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
140
+ google: /^(gemini|gemma|palm)/i,
141
+ mistral: /^(mistral|codestral|mixtral)/i,
125
142
  localai: /^local-/i,
126
- ollama: /(llama|nomic)/i
127
- },
128
- model_types: {
129
- text_to_text: /^(nomic|gpt|davinci|curie|babbage|ada|claude|gemini|palm|command|generate|j2-|mistral|codestral)/i,
130
- speech_to_text: /^whisper/i,
131
- text_to_speech: /^tts/i,
132
- text_to_image: /^dall-e/i
143
+ ollama: /(llama|nomic)/i,
144
+ open_router: /\//
133
145
  }
134
146
  )
147
+
135
148
  @class_config = @default_config.dup
136
149
  end
137
150
  end
@@ -0,0 +1,21 @@
1
+ # lib/ai_client/embed.rb
2
+
3
+ class AiClient
4
+
5
+ ######################################
6
+ # OmniAI Params
7
+ # model [String] required
8
+ #
9
+
10
+ def embed(input, **params)
11
+ @client.embed(input, model: @model, **params)
12
+ end
13
+
14
+ def batch_embed(inputs, batch_size: 100, **params)
15
+ inputs.each_slice(batch_size).flat_map do |batch|
16
+ sleep 1 # DEBUG rate limits being exceeded
17
+ embed(batch, **params)
18
+ end
19
+ end
20
+
21
+ end
@@ -0,0 +1,26 @@
1
+ # lib/ai_client/llm.rb
2
+
3
+ require 'active_hash'
4
+
5
+ class AiClient
6
+
7
+ # TODO: Think about this for the OpenRouter models DB
8
+ # Might change this to ActiveYaml
9
+
10
+ class LLM < ActiveHash::Base
11
+ self.data = AiClient.models
12
+
13
+ def model = id.split('/')[1]
14
+ def provider = id.split('/')[0]
15
+
16
+ class << self
17
+ def import(path_to_yml_file) # TODO: load
18
+ raise "TODO: Not Implemented: #{path_to_yml_file}"
19
+ end
20
+
21
+ def export(path_to_yml_file) # TODO: Dump
22
+ raise "TODO: Not Implemented: #{path_to_yml_file}"
23
+ end
24
+ end
25
+ end
26
+ end
@@ -0,0 +1,79 @@
1
+ # lib/ai_client/open_router_extensions.rb
2
+ # frozen_string_literal: true
3
+
4
+ # These extensions to AiClient are only available with
5
+ # a valid API Key for the open_router.ai web-service
6
+
7
+ require 'open_router'
8
+
9
+ class AiClient
10
+
11
+ def models = self.class.models
12
+ def providers = self.class.providers
13
+ def model_names(a_provider=nil) = self.class.model_names(a_provider)
14
+ def model_details(a_model) = self.class.model_details(a_model)
15
+ def find_model(a_model_substring) = self.class.find_model(a_model_substring)
16
+
17
+ class << self
18
+ def add_open_router_extensions
19
+ access_token = fetch_access_token
20
+
21
+ return unless access_token
22
+
23
+ configure_open_router(access_token)
24
+ initialize_orc_client
25
+ end
26
+
27
+ def orc_client
28
+ @orc_client ||= add_open_router_extensions || raise("OpenRouter extensions are not available")
29
+ end
30
+
31
+ def models
32
+ @models ||= orc_client.models
33
+ end
34
+
35
+ # TODO: Refactor these DB like methods to take
36
+ # advantage of AiClient::LLM
37
+
38
+ def model_names(provider=nil)
39
+ model_ids = models.map { _1['id'] }
40
+
41
+ return model_ids unless provider
42
+
43
+ model_ids.filter_map { _1.split('/')[1] if _1.start_with?(provider.to_s.downcase) }
44
+ end
45
+
46
+ def model_details(model)
47
+ models.find { _1['id'].include?(model) }
48
+ end
49
+
50
+ def providers
51
+ @providers ||= models.map{ _1['id'].split('/')[0] }.sort.uniq
52
+ end
53
+
54
+ def find_model(a_model_substring)
55
+ model_names.select{ _1.include?(a_model_substring) }
56
+ end
57
+
58
+ private
59
+
60
+ # Similar to fetch_api_key but for the class_config
61
+ def fetch_access_token
62
+ class_config.envar_api_key_names.open_router
63
+ .map { |key| ENV[key] }
64
+ .compact
65
+ .first
66
+ end
67
+
68
+ def configure_open_router(access_token)
69
+ OpenRouter.configure { |config| config.access_token = access_token }
70
+ end
71
+
72
+ def initialize_orc_client
73
+ @orc_client ||= OpenRouter::Client.new
74
+ end
75
+ end
76
+ end
77
+
78
+
79
+ AiClient.add_open_router_extensions
@@ -0,0 +1,26 @@
1
+ # lib/ai_client/speak.rb
2
+
3
+ class AiClient
4
+
5
+ ######################################
6
+ # OmniAI Params
7
+ # input [String] required
8
+ # model [String] required
9
+ # voice [String] required
10
+ # speed [Float] optional
11
+ # format [String] optional (default "aac")
12
+ # aac mp3 flac opus pcm wav
13
+ #
14
+ # @yield [output] optional
15
+ #
16
+ # @return [Tempfile]
17
+
18
+ def speak(text, **params)
19
+ call_with_middlewares(:speak_without_middlewares, text, **params)
20
+ end
21
+
22
+ def speak_without_middlewares(text, **params)
23
+ @client.speak(text, model: @model, **params)
24
+ end
25
+
26
+ end
@@ -0,0 +1,21 @@
1
+ # lib/ai_client/transcribe.rb
2
+
3
+ class AiClient
4
+
5
+ ######################################
6
+ # OmniAI Params
7
+ # model [String]
8
+ # language [String, nil] optional
9
+ # prompt [String, nil] optional
10
+ # format [Symbol] :text, :srt, :vtt, or :json (default)
11
+ # temperature [Float, nil] optional
12
+
13
+ def transcribe(audio, format: nil, **params)
14
+ call_with_middlewares(:transcribe_without_middlewares, audio, format: format, **params)
15
+ end
16
+
17
+ def transcribe_without_middlewares(audio, format: nil, **params)
18
+ @client.transcribe(audio, model: @model, format: format, **params)
19
+ end
20
+
21
+ end
@@ -1,5 +1,5 @@
1
1
  # frozen_string_literal: true
2
2
 
3
3
  class AiClient
4
- VERSION = "0.2.1"
4
+ VERSION = "0.2.3"
5
5
  end
data/lib/ai_client.rb CHANGED
@@ -13,13 +13,21 @@ require 'omniai/anthropic'
13
13
  require 'omniai/google'
14
14
  require 'omniai/mistral'
15
15
  require 'omniai/openai'
16
- require_relative 'extensions/omniai-ollama'
17
- require_relative 'extensions/omniai-localai'
16
+
17
+ require 'open_router'
18
+
19
+ require_relative 'ai_client/chat'
20
+ require_relative 'ai_client/embed'
21
+ require_relative 'ai_client/speak'
22
+ require_relative 'ai_client/transcribe'
18
23
 
19
24
  require_relative 'ai_client/configuration'
20
25
  require_relative 'ai_client/middleware'
21
26
  require_relative 'ai_client/version'
22
27
 
28
+ require_relative 'ai_client/open_router_extensions'
29
+ require_relative 'ai_client/llm' # SMELL: must come after the open router stuff
30
+
23
31
  # Create a generic client instance using only model name
24
32
  # client = AiClient.new('gpt-3.5-turbo')
25
33
  #
@@ -28,13 +36,41 @@ require_relative 'ai_client/version'
 # AiClient.use(LoggingMiddleware.new(AiClient.configuration.logger))
 #
 
-
 class AiClient
 
+  # Define the refinement for Hash
+  module HashRefinement
+    refine Hash do
+      def tunnel(target_key)
+        queue = [self] # Initialize the queue with the current hash
+
+        until queue.empty?
+          current = queue.shift # Dequeue the front hash
+
+          # Check if the current hash contains the target key
+          return current[target_key] if current.key?(target_key)
+
+          # Enqueue sub-hashes and arrays to the queue for further searching
+          current.each_value do |value|
+            case value
+            when Hash
+              queue << value
+            when Array
+              queue.concat(value.select { |v| v.is_a?(Hash) }) # Add sub-hashes from the array
+            end
+          end
+        end
+
+        nil # Return nil if the key is not found
+      end
+    end
+  end
+
+  using HashRefinement
+
   attr_reader :client,        # OmniAI's client instance
               :provider,      # [Symbol]
               :model,         # [String]
-              :model_type,    # [Symbol]
               :logger,
               :last_response,
               :timeout,
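The `tunnel` refinement added above performs a breadth-first search through a nested Hash/Array structure and returns the first value stored under the target key, at whatever depth it appears. A quick demonstration using the same method body (the sample response hashes are abbreviated, illustrative shapes, not complete provider payloads):

```ruby
# Breadth-first key lookup in nested Hash/Array data, as in the diff.
module HashRefinement
  refine Hash do
    def tunnel(target_key)
      queue = [self]
      until queue.empty?
        current = queue.shift
        return current[target_key] if current.key?(target_key)
        current.each_value do |value|
          case value
          when Hash  then queue << value
          when Array then queue.concat(value.select { |v| v.is_a?(Hash) })
          end
        end
      end
      nil # key not found anywhere in the structure
    end
  end
end

using HashRefinement

# OpenAI-style chat response (abbreviated)
openai = { 'choices' => [{ 'message' => { 'content' => 'Hi there!' } }] }
# Anthropic-style response (abbreviated)
anthropic = { 'content' => [{ 'text' => 'Hello!' }] }

puts openai.tunnel('content')   # => Hi there!
puts anthropic.tunnel('text')   # => Hello!
```

The breadth-first order explains the provider split in `content` below: an Anthropic-style payload has a top-level `'content'` key holding an array, so tunneling for `'content'` would return that array; tunneling for `'text'` reaches the actual message string instead.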
@@ -71,7 +107,6 @@ class AiClient
     explicit_provider = options.fetch(:provider, config.provider)
 
     @provider = validate_provider(explicit_provider) || determine_provider(model)
-    @model_type = determine_model_type(model)
 
     provider_config = @config.providers[@provider] || {}
 
@@ -86,8 +121,6 @@ class AiClient
     @last_response = nil
   end
 
-
-
   def response = last_response
   def raw?     = config.return_raw
 
@@ -95,73 +128,20 @@ class AiClient
   def raw=(value)
     config.return_raw = value
   end
 
-
-
-  ######################################
-  def chat(messages, **params)
-    result = call_with_middlewares(:chat_without_middlewares, messages, **params)
-    @last_response = result
-    raw? ? result : content
-  end
-
-
-  def chat_without_middlewares(messages, **params)
-    @client.chat(messages, model: @model, **params)
-  end
-
-  ######################################
-  def transcribe(audio, format: nil, **params)
-    call_with_middlewares(:transcribe_without_middlewares, audio, format: format, **params)
-  end
-
-  def transcribe_without_middlewares(audio, format: nil, **params)
-    @client.transcribe(audio, model: @model, format: format, **params)
-  end
-
-  ######################################
-  def speak(text, **params)
-    call_with_middlewares(:speak_without_middlewares, text, **params)
-  end
-
-  def speak_without_middlewares(text, **params)
-    @client.speak(text, model: @model, **params)
-  end
-
-
-  ######################################
-  def embed(input, **params)
-    @client.embed(input, model: @model, **params)
-  end
-
-  def batch_embed(inputs, batch_size: 100, **params)
-    inputs.each_slice(batch_size).flat_map do |batch|
-      sleep 1 # DEBUG rate limits being exceeded
-      embed(batch, **params)
-    end
-  end
-
-  ######################################
-  ## Utilities
-
   def content
     case @provider
-    when :openai, :localai, :ollama
-      last_response.data.dig('choices', 0, 'message', 'content')
-    when :anthropic
-      last_response.data.dig('content', 0, 'text')
-    when :google
-      last_response.data.dig('candidates', 0, 'content', 'parts', 0, 'text')
-    when :mistral
-      last_response.data.dig('choices', 0, 'message', 'content')
+    when :localai, :mistral, :ollama, :open_router, :openai
+      last_response.data.tunnel 'content'
+
+    when :anthropic, :google
+      last_response.data.tunnel 'text'
+
     else
       raise NotImplementedError, "Content extraction not implemented for provider: #{@provider}"
     end
   end
   alias_method :text, :content
 
-  ##############################################
-  ## Public Class Methods
-
   def method_missing(method_name, *args, &block)
     if @client.respond_to?(method_name)
       result = @client.send(method_name, *args, &block)
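The chat, transcribe, speak, and embed bodies deleted in this hunk were relocated into the per-method files now required at the top of `ai_client.rb`. The removed `batch_embed` helper among them illustrates a simple slice-and-flatten batching pattern, sketched here with a stand-in `fake_embed` instead of a real embedding call:

```ruby
# Sketch of the each_slice batching from the removed batch_embed:
# split inputs into fixed-size batches and flatten the per-batch
# results back into one array. fake_embed is a hypothetical stand-in.
def fake_embed(batch)
  batch.map { |s| s.length } # pretend an "embedding" is just the length
end

def batch_embed(inputs, batch_size: 100)
  inputs.each_slice(batch_size).flat_map do |batch|
    # A real client might sleep here to stay under provider rate limits.
    fake_embed(batch)
  end
end

p batch_embed(%w[a bb ccc dddd], batch_size: 2) # => [1, 2, 3, 4]
```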
@@ -193,9 +173,8 @@ class AiClient
 
 
   def create_client
-    api_key = fetch_api_key # Fetching the API key should only happen for valid providers
     client_options = {
-      api_key: api_key,
+      api_key: fetch_api_key,
       logger: @logger,
       timeout: @timeout
     }
@@ -205,46 +184,43 @@ class AiClient
     case provider
     when :openai
       OmniAI::OpenAI::Client.new(**client_options)
+
     when :anthropic
       OmniAI::Anthropic::Client.new(**client_options)
+
     when :google
       OmniAI::Google::Client.new(**client_options)
+
     when :mistral
       OmniAI::Mistral::Client.new(**client_options)
+
     when :ollama
-      OmniAI::Ollama::Client.new(**client_options)
+      OmniAI::OpenAI::Client.new(host: 'http://localhost:11434', api_key: nil, **client_options)
+
     when :localai
-      OmniAI::LocalAI::Client.new(**client_options)
+      OmniAI::OpenAI::Client.new(host: 'http://localhost:8080', api_key: nil, **client_options)
+
+    when :open_router
+      OmniAI::OpenAI::Client.new(host: 'https://openrouter.ai', api_prefix: 'api', **client_options)
+
     else
       raise ArgumentError, "Unsupported provider: #{@provider}"
     end
   end
 
 
+  # Similar to fetch_access_token but for the instance config
   def fetch_api_key
-    env_var_name = "#{@provider.upcase}_API_KEY"
-    api_key = ENV[env_var_name]
-
-    if api_key.nil? || api_key.empty?
-      unless [:localai, :ollama].include? provider
-        raise ArgumentError, "API key not found in environment variable #{env_var_name}"
-      end
-    end
-
-    api_key
+    config.envar_api_key_names[@provider]
+      &.map { |key| ENV[key] }
+      &.compact
+      &.first
   end
 
-
  def determine_provider(model)
    config.provider_patterns.find { |provider, pattern| model.match?(pattern) }&.first ||
      raise(ArgumentError, "Unsupported model: #{model}")
  end
-
-
-  def determine_model_type(model)
-    config.model_types.find { |type, pattern| model.match?(pattern) }&.first ||
-      raise(ArgumentError, "Unable to determine model type for: #{model}")
-  end
 end
 
 
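The rewritten `fetch_api_key` replaces the single `#{provider.upcase}_API_KEY` convention with a per-provider list of candidate environment variable names, returning the first one that is set (and `nil`, rather than raising, when none are). The pattern in isolation (the `ENVAR_API_KEY_NAMES` table and variable names below are illustrative, not the gem's actual configuration):

```ruby
# First-match environment variable lookup, as in the new fetch_api_key.
# Hypothetical per-provider candidate lists:
ENVAR_API_KEY_NAMES = {
  open_router: %w[OPEN_ROUTER_API_KEY OPENROUTER_API_KEY],
  openai:      %w[OPENAI_API_KEY]
}.freeze

# `env` defaults to ENV but is injectable for testing.
def fetch_api_key(provider, env = ENV)
  ENVAR_API_KEY_NAMES[provider]
    &.map { |name| env[name] }   # look up every candidate
    &.compact                    # drop the unset ones
    &.first                      # keep the highest-priority hit
end

env = { 'OPENROUTER_API_KEY' => 'sk-test' }
p fetch_api_key(:open_router, env) # => "sk-test"
p fetch_api_key(:mistral, env)     # => nil (no entry for the provider)
```

The `&.` safe-navigation chain means an unknown provider (no entry in the table) simply yields `nil` instead of a `NoMethodError`, matching the keyless operation of local providers like Ollama and LocalAI.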
metadata CHANGED
@@ -1,15 +1,29 @@
 --- !ruby/object:Gem::Specification
 name: ai_client
 version: !ruby/object:Gem::Version
-  version: 0.2.1
+  version: 0.2.3
 platform: ruby
 authors:
 - Dewayne VanHoozer
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2024-10-05 00:00:00.000000000 Z
+date: 2024-10-10 00:00:00.000000000 Z
 dependencies:
+- !ruby/object:Gem::Dependency
+  name: active_hash
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
 - !ruby/object:Gem::Dependency
   name: hashie
   requirement: !ruby/object:Gem::Requirement
@@ -94,6 +108,20 @@ dependencies:
     - - ">="
       - !ruby/object:Gem::Version
         version: '0'
+- !ruby/object:Gem::Dependency
+  name: open_router
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
 - !ruby/object:Gem::Dependency
   name: amazing_print
   requirement: !ruby/object:Gem::Requirement
@@ -150,18 +178,29 @@ dependencies:
     - - ">="
       - !ruby/object:Gem::Version
        version: '0'
+- !ruby/object:Gem::Dependency
+  name: tocer
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
+  type: :development
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - ">="
+      - !ruby/object:Gem::Version
+        version: '0'
-description: "`ai_client` is a versatile Ruby gem that serves as a generic client
-  \nfor interacting with various AI service providers through a unified \nAPI. Designed
-  to simplify the integration of large language models \n(LLMs) into applications,
-  `ai_client` allows developers to create \ninstances using just the model name, greatly
-  reducing configuration \noverhead. With built-in support for popular AI providers—including
-  \nOpenAI, Anthropic, Google, Mistral, LocalAI and Ollama—the gem abstracts the \ncomplexities
-  of API interactions, offering methods for tasks such \nas chatting, transcription,
-  speech synthesis, and embedding. The \nmiddleware architecture enables customizable
-  processing of requests \nand responses, making it easy to implement features like
-  logging and \nretry logic. Seamlessly integrated with the `OmniAI` framework, \n`ai_client`
-  empowers developers to leverage cutting-edge AI capabilities \nwithout vendor lock-in,
-  making it an essential tool for modern AI-driven \napplications.\n"
+description: "`ai_client` is a versatile Ruby gem that offers a seamless interface
+  \nfor integrating a wide range of AI service providers through a single, \nunified
+  API. With `ai_client`, you can simply specify the model name \nand quickly leverage
+  advanced features such as chatting, transcription, \nand speech synthesis. The gem
+  supports prominent providers, including \nOpenAI, Anthropic, Google, Mistral, LocalAI,
+  Ollama, and OpenRouter, \nmaking it easier than ever to build innovative applications.
+  Plus, its \nflexible middleware architecture allows for easy customization of request
+  \nand response handling, ensuring that you can harness the power of AI \nwithout
+  vendor lock-in.\n"
 email:
 - dvanhoozer@gmail.com
 executables: []
@@ -182,14 +221,18 @@ files:
 - examples/text.rb
 - examples/transcribe.rb
 - lib/ai_client.rb
+- lib/ai_client/chat.rb
 - lib/ai_client/config.yml
 - lib/ai_client/configuration.rb
+- lib/ai_client/embed.rb
+- lib/ai_client/llm.rb
 - lib/ai_client/logger_middleware.rb
 - lib/ai_client/middleware.rb
+- lib/ai_client/open_router_extensions.rb
 - lib/ai_client/retry_middleware.rb
+- lib/ai_client/speak.rb
+- lib/ai_client/transcribe.rb
 - lib/ai_client/version.rb
-- lib/extensions/omniai-localai.rb
-- lib/extensions/omniai-ollama.rb
 - sig/ai_client.rbs
 - the_ollama_model_problem.md
 homepage: https://github.com/MadBomber/ai_client
@@ -215,7 +258,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
   - !ruby/object:Gem::Version
     version: '0'
 requirements: []
-rubygems_version: 3.5.20
+rubygems_version: 3.5.21
 signing_key:
 specification_version: 4
 summary: A generic AI Client for many providers
data/lib/extensions/omniai-localai.rb DELETED
@@ -1,31 +0,0 @@
-# extensions/omniai-localai.rb
-# frozen_string_literal: true
-
-require 'omniai'
-require 'omniai/openai'
-
-module OmniAI
-
-  # Create an alias for OmniAI::OpenAI module
-  module LocalAI
-    extend OmniAI::OpenAI
-
-    # Alias classes from OmniAI::OpenAI
-    class Client < OmniAI::OpenAI::Client
-      def initialize(**options)
-        options[:host] = 'http://localhost:8080' unless options.has_key?(:host)
-        super(**options)
-      end
-    end
-
-
-    Config = OmniAI::OpenAI::Config
-
-    # Alias the Thread class and its nested classes
-    Thread     = OmniAI::OpenAI::Thread
-    Annotation = OmniAI::OpenAI::Thread::Annotation
-    Attachment = OmniAI::OpenAI::Thread::Attachment
-    Message    = OmniAI::OpenAI::Thread::Message
-    Run        = OmniAI::OpenAI::Thread::Run
-  end
-end
data/lib/extensions/omniai-ollama.rb DELETED
@@ -1,30 +0,0 @@
-# extensions/omniai-ollama.rb
-# frozen_string_literal: true
-
-require 'omniai'
-require 'omniai/openai'
-
-module OmniAI
-
-  # Create an alias for OmniAI::OpenAI module
-  module Ollama
-    extend OmniAI::OpenAI
-
-    # Alias classes from OmniAI::OpenAI
-    class Client < OmniAI::OpenAI::Client
-      def initialize(**options)
-        options[:host] = 'http://localhost:11434' unless options.has_key?(:host)
-        super(**options)
-      end
-    end
-
-    Config = OmniAI::OpenAI::Config
-
-    # Alias the Thread class and its nested classes
-    Thread     = OmniAI::OpenAI::Thread
-    Annotation = OmniAI::OpenAI::Thread::Annotation
-    Attachment = OmniAI::OpenAI::Thread::Attachment
-    Message    = OmniAI::OpenAI::Thread::Message
-    Run        = OmniAI::OpenAI::Thread::Run
-  end
-end