ai_client 0.3.1 → 0.4.1
- checksums.yaml +4 -4
- data/CHANGELOG.md +16 -3
- data/README.md +104 -8
- data/lib/ai_client/chat.rb +79 -7
- data/lib/ai_client/config.yml +11 -17
- data/lib/ai_client/configuration.rb +12 -1
- data/lib/ai_client/middleware.rb +1 -1
- data/lib/ai_client/models.yml +526 -416
- data/lib/ai_client/open_router_extensions.rb +4 -1
- data/lib/ai_client/version.rb +4 -1
- data/lib/ai_client.rb +58 -32
- metadata +2 -2
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: a6a570561363cbf57df995b7154be4214cb7e7488f0e826bd92b9d523affe52b
+  data.tar.gz: 42c3545fd727a240261c7df1d39528fcb5bf8c6cf6dce5820763d3cef22ccd20
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: a20e587c840fc83a5a3253cead493b15672d64ed0994245da1f54b2fc0e74852170637ff7c2d745b1cec5776cdfc3716bd210a664667632db3700075c5b974d5
+  data.tar.gz: 2e25503d39cbb956905c46cea2c519d6b086a0887469bd5b7c237040063aac2c886bdd57ffca365376ad908b53f407c3ef090a731f83551452dbcb4a782105a6
data/CHANGELOG.md CHANGED
@@ -1,12 +1,25 @@
 ## [Unreleased]
 
-
+## Released
+
+### [0.4.1] - 2024-10-21
+- Fixed the context problem; the chatbot method now works.
+
+### [0.4.0] - 2024-10-20
+- Removed Logger.new(STDOUT) from the default configuration
+> config.logger now returns nil. If you want a class or instance logger, set config.logger = Logger.new(STDOUT) or whatever you need.
+- Added basic @context for chat-bots
+- Added `context_length` to the configuration as the number of responses to remember as context
+- Added a default model for each major provider; using "auto" for open_router.ai
+- Added a default provider (OpenAI)
+> AiClient.new() will use config.default_provider and that provider's default_model
+- Fixed a problem with advanced block-based prompt construction for chat
+
 ### [0.3.1] - 2024-10-19
 - updated the open_router_extensions file
 - added simplecov to see code coverage
 - updated README with options doc
 
-## Released
-
 ### [0.3.0] - 2024-10-13
 - Breaking Change
 - Added new class AiClient::Function to encapsulate the callback functions used as tools in chats.
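For upgraders, the two 0.4.0 behavior changes called out above take only a couple of configuration lines to reconcile. A minimal sketch using the `class_config` accessors documented in the README below, opting back into STDOUT logging and opting out of the new context window:

```ruby
require 'logger'
require 'ai_client'

# 0.4.0 no longer installs a default logger; add one back explicitly.
AiClient.class_config.logger = Logger.new(STDOUT)

# chat() now remembers the last `context_length` exchanges (default 5);
# zero or nil restores the old stateless behavior.
AiClient.class_config.context_length = 0
```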
data/README.md
CHANGED
@@ -2,11 +2,13 @@
 
 First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com/) for his gem [OmniAI](https://github.com/ksylvest/omniai) and [Olympia](https://olympia.chat/) for their [open_router gem](https://github.com/OlympiaAI/open_router) upon which this effort depends.
 
-Version 0.
+Version 0.4.0 has two changes which may break your existing application.
+1. The default configuration no longer has a Logger instance. You will need to add your own instance to either the class or the instance configuration using `AiClient.class_config.logger = YourLogger` and/or `client.config.logger = YourLogger`.
+2. The `chat` method now keeps a context window. The window length is defined by the configuration item `context_length`. If you do not want to maintain a context window, set `context_length` to either nil or zero.
 
 See the [change log](CHANGELOG.md) for recent modifications.
 
-You should also check out the [raix gem](https://github.com/OlympiaAI/raix). I like the way that Obie's API is set up for callback functions.
+You should also check out the [raix gem](https://github.com/OlympiaAI/raix). I like the way that Obie's API is set up for callback functions. [raix-rails](https://github.com/OlympiaAI/raix-rails) is also available.
 
 
 <!-- Tocer[start]: Auto-generated, don't remove. -->
@@ -32,6 +34,7 @@ You should also check out the [raix gem](https://github.com/OlympiaAI/raix). I l
   - [3. Load Complete Configuration from a YAML File](#3-load-complete-configuration-from-a-yaml-file)
   - [Top-level Client Methods](#top-level-client-methods)
     - [chat](#chat)
+      - [Context](#context)
     - [embed](#embed)
     - [speak](#speak)
     - [transcribe](#transcribe)
@@ -112,7 +115,7 @@ AiClient.class_config.envar_api_key_names = {
   google: 'your_envar_name',
   mistral: 'your_envar_name',
   open_router: 'your_envar_name',
-
+  openai: 'your_envar_name'
 }
 
 AiClient.class_config.save('path/to/file.yml')
@@ -139,13 +142,49 @@ To explicitly designate a provider to use with an AiClient instance use the para
 Basic usage:
 
 ```ruby
-
+require 'ai_client'
+ai = AiClient.new  # use default model and provider
+ai.model           #=> 'gpt-4o' is the default
+ai.provider        #=> :openai is the default
+#
+# To change the class defaults:
+#
+AiClient.default_provider = :anthropic
+AiClient.default_model[:openai] = 'gpt-4o-mini'
+#
+# To get an Array of models and providers
+#
+AiClient.models    # from open_router.ai
+AiClient.providers # from open_router.ai
+#
+# To get details about a specific provider/model pair:
+#
+AiClient.model_details('openai/gpt-4o-mini') # from open_router.ai
 ```
 
-
+You can specify which model you want to use and `AiClient` will use the provider associated with that model.
+
+
+```ruby
+AI = AiClient.new('gpt-4o-mini')        # sets provider to :openai
+#
+# If you want to use the open_router.ai service instead of
+# going directly to OpenAI do it this way:
+#
+AI = AiClient.new('openai/gpt-4o-mini') # sets provider to :open_router
+```
+
+Of course you could specify both the model and the provider that you want to use:
+
 
 ```ruby
-
+AI = AiClient.new('mistral', provider: :ollama)
+```
+
+That's it. What could be simpler? If your application is using more than one model, no worries, just create multiple `AiClient` instances.
+
+```ruby
+c1 = AiClient.new('nomic-embed-text')
 c2 = AiClient.new('gpt-4o-mini')
 ```
 
@@ -164,6 +203,53 @@ There are three levels of configuration, each inheriting from the level above.
 
 The file [lib/ai_client/configuration.rb] hard codes the default configuration. This is used to update the [lib/ai_client/config.yml] file during development. If you have some changes for this configuration please send me a pull request so we can all benefit from your efforts.
 
+```ruby
+{
+  :logger => nil,
+  :timeout => nil,
+  :return_raw => false,
+  :context_length => 5,
+  :providers => {},
+  :envar_api_key_names => {
+    :anthropic => [
+      "ANTHROPIC_API_KEY"
+    ],
+    :google => [
+      "GOOGLE_API_KEY"
+    ],
+    :mistral => [
+      "MISTRAL_API_KEY"
+    ],
+    :open_router => [
+      "OPEN_ROUTER_API_KEY",
+      "OPENROUTER_API_KEY"
+    ],
+    :openai => [
+      "OPENAI_API_KEY"
+    ]
+  },
+  :provider_patterns => {
+    :anthropic => /^claude/i,
+    :openai => /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
+    :google => /^(gemini|gemma|palm)/i,
+    :mistral => /^(mistral|codestral|mixtral)/i,
+    :localai => /^local-/i,
+    :ollama => /(llama|nomic)/i,
+    :open_router => /\//
+  },
+  :default_provider => :openai,
+  :default_model => {
+    :anthropic => "claude-3-5-sonnet-20240620",
+    :openai => "gpt-4o",
+    :google => "gemini-pro-1.5",
+    :mistral => "mistral-large",
+    :localai => "llama3.2",
+    :ollama => "llama3.2",
+    :open_router => "auto"
+  }
+}
+```
+
 #### Class Configuration
 
 The class configuration is derived initially from the default configuration. It can be changed in three ways.
@@ -243,6 +329,16 @@ The response will be a simple string or a response object based upon the setting
 
 See the [Advanced Prompts] section to learn how to configure a complex prompt message.
 
+###### Context
+
+**context_length**
+
+The `context_length` configuration item keeps the last `context_length` responses within the chat context window. If you do not want to keep a context window, set `config.context_length = 0` at either the class or the instance level. When you do, the chat response will be provided without the LLM knowing any prior context. If you are implementing a chat-bot, you will want it to have the context of the current conversation.
+
+```ruby
+AiClient.config.context_length     #=> 5
+AiClient.config.context_length = 0 # Turns off the context window
+```
 
 ##### embed
 
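The same knob can also be set on a single client rather than class-wide, using the instance `config` accessor shown in the breaking-changes note at the top of this README. A brief sketch; the model name is just an example:

```ruby
bot = AiClient.new('gpt-4o-mini')
bot.config.context_length = 2 # this client remembers only the last two exchanges
```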
@@ -258,7 +354,7 @@ Recommendation: Use PostgreSQL, pg_vector and the neighbor gem.
 ##### speak
 
 ```ruby
-
+response = AI.speak("Isn't it nice to have a computer that will talk to you?")
 ```
 
 The response will contain audio data that can be played, manipulated or saved to a file.
@@ -313,7 +409,7 @@ Note: The availability and exact names of these options may vary depending on th
 In more complex applications, providing a simple string as your prompt is not sufficient. AiClient can take advantage of OmniAI's complex message builder.
 
 ```ruby
-client = AiClient.new '
+client = AiClient.new 'some_model_name'
 
 completion = client.chat do |prompt|
   prompt.system('You are an expert biologist with an expertise in animals.')
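The hunk ends mid-block. For orientation, here is a sketch of how such a builder block typically completes under OmniAI's prompt DSL; the `prompt.user` call and the question text are illustrative assumptions, not part of this diff:

```ruby
client = AiClient.new 'some_model_name'

completion = client.chat do |prompt|
  prompt.system('You are an expert biologist with an expertise in animals.')
  prompt.user('Which land animal is the fastest sprinter?') # hypothetical user turn
end
```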
data/lib/ai_client/chat.rb CHANGED
@@ -9,8 +9,17 @@ class AiClient
   # stream: @stream [Proc, nil] optional
   # tools: @tools [Array<OmniAI::Tool>] optional
   # temperature: @temperature [Float, nil] optional
-
-
+  #
+  # Initiates a chat session.
+  #
+  # @param messages [Array<String>] the messages to send.
+  # @param params [Hash] optional parameters.
+  # @option params [Array<OmniAI::Tool>] :tools an array of tools to use.
+  # @return [String] the result from the chat.
+  #
+  # @raise [RuntimeError] if tools parameter is invalid.
+  #
+  def chat(messages='', **params, &block)
     if params.has_key? :tools
       tools = params[:tools]
       if tools.is_a? Array
@@ -23,14 +32,77 @@ class AiClient
       params[:tools] = tools
     end
 
-
-
-
+    @last_messages = messages
+    messages = add_context(messages)
+    result = call_with_middlewares(
+      :chat_without_middlewares,
+      messages,
+      **params,
+      &block
+    )
+    @last_response = result
+    result = raw? ? result : content
+
+    @context = @context.push({
+      user: @last_messages,
+      bot: result
+    }).last(config.context_length)
+
+    result
+  end
+
+
+  # Adds context to the current prompt.
+  #
+  # @param prompt [String, Array<String>] the current prompt.
+  # @return [String, Array<String>] the prompt with context added.
+  #
+  def add_context(my_prompt)
+    return(my_prompt) if @config.context_length.nil? ||
+                         0 == @config.context_length ||
+                         my_prompt.is_a?(Array)      ||
+                         @context.empty?
+
+    prompt = "\nUser: #{my_prompt} Bot: "
+    prompt.prepend(
+      @context.map{|entry|
+        "User: #{entry[:user]} Bot: #{entry[:bot]}"
+      }.join("\n")
+    )
+
+    prompt
+  end
+
+
+  # Clears the current context.
+  #
+  # @return [void]
+  #
+  def clear_context
+    @context = []
   end
 
 
-
-
+  # Chats with the client without middleware processing.
+  #
+  # @param messages [Array<String>] the messages to send.
+  # @param params [Hash] optional parameters.
+  # @return [String] the result from the chat.
+  #
+  def chat_without_middlewares(messages, **params, &block)
     @client.chat(messages, model: @model, **params, &block)
   end
+
+
+  def chatbot
+    prompt = "hello"
+    until prompt.empty? do
+      response = chat prompt
+      puts
+      puts content
+      puts
+      print "Follow Up: "
+      prompt = gets.chomp
+    end
+  end
 end
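Net effect of these chat.rb changes: each `chat` call now records the exchange and, via `add_context`, folds the remembered turns into the next outgoing prompt string. A sketch of the observable behavior with a fresh client and the default `context_length` of 5:

```ruby
ai = AiClient.new

ai.chat('My name is Alice.') # first turn: sent as-is, @context is still empty
ai.chat('What is my name?')  # add_context rewrites the prompt to roughly:
                             # "User: My name is Alice. Bot: <reply>\nUser: What is my name? Bot: "

ai.clear_context             # drop the remembered exchanges
ai.chatbot                   # the 0.4.1 interactive loop: prints each reply,
                             # reads a follow-up until you enter a blank line
```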
data/lib/ai_client/config.yml CHANGED
@@ -1,23 +1,8 @@
 ---
-:logger:
-  level: 0
-  progname:
-  default_formatter: !ruby/object:Logger::Formatter
-    datetime_format:
-  formatter:
-  logdev: !ruby/object:Logger::LogDevice
-    shift_period_suffix:
-    shift_size:
-    shift_age:
-    filename:
-    dev: !ruby/object:IO {}
-    binmode: false
-    reraise_write_errors: []
-    mon_data: !ruby/object:Monitor {}
-    mon_data_owner_object_id: 45380
-  level_override: {}
+:logger:
 :timeout:
 :return_raw: false
+:context_length: 5
 :providers: {}
 :envar_api_key_names:
   :anthropic:
@@ -39,3 +24,12 @@
   :localai: !ruby/regexp /^local-/i
   :ollama: !ruby/regexp /(llama|nomic)/i
   :open_router: !ruby/regexp /\//
+:default_provider: :openai
+:default_model:
+  :anthropic: claude-3-5-sonnet-20240620
+  :openai: gpt-4o
+  :google: gemini-pro-1.5
+  :mistral: mistral-large
+  :localai: llama3.2
+  :ollama: llama3.2
+  :open_router: auto
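Dropping the serialized Logger matters because the old file carried live-object dumps (an open IO, a Monitor, a machine-specific object id) that cannot round-trip through YAML. The new file is plain data plus tagged regexps, so it can even be loaded with safe parsing. A sketch using stdlib Psych; the allowlist is an assumption about how one might read this file, not code from the gem:

```ruby
require 'yaml'

config = YAML.safe_load_file(
  'lib/ai_client/config.yml',
  permitted_classes: [Symbol, Regexp] # :symbol keys and !ruby/regexp values
)
config[:context_length]         #=> 5
config[:default_model][:openai] #=> "gpt-4o"
```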
data/lib/ai_client/configuration.rb CHANGED
@@ -152,9 +152,10 @@ class AiClient
   #
   def initialize_defaults
     @default_config = Config.new(
-      logger: Logger.new(STDOUT),
+      logger: nil, # Logger.new(STDOUT),
       timeout: nil,
       return_raw: false,
+      context_length: 5, # number of responses to add as context
       providers: {},
       envar_api_key_names: {
         anthropic: ['ANTHROPIC_API_KEY'],
@@ -171,6 +172,16 @@ class AiClient
         localai: /^local-/i,
         ollama: /(llama|nomic)/i,
         open_router: /\//
+      },
+      default_provider: :openai,
+      default_model: {
+        anthropic: 'claude-3-5-sonnet-20240620',
+        openai: 'gpt-4o',
+        google: 'gemini-pro-1.5',
+        mistral: 'mistral-large',
+        localai: 'llama3.2',
+        ollama: 'llama3.2',
+        open_router: 'auto'
       }
     )
 
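These hard-coded defaults are what back the zero-argument constructor shown in the README. A sketch of the resolution they imply; the inference step uses the provider_patterns hash above:

```ruby
ai = AiClient.new
ai.provider #=> :openai  (config.default_provider)
ai.model    #=> 'gpt-4o' (config.default_model[:openai])

ai = AiClient.new('gemini-pro-1.5')
ai.provider #=> :google  (matched by provider_patterns /^(gemini|gemma|palm)/i)
```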
data/lib/ai_client/middleware.rb CHANGED
@@ -23,7 +23,7 @@ class AiClient
   #
   def call_with_middlewares(method, *args, **kwargs, &block)
     stack = self.class.middlewares.reverse.reduce(-> { send(method, *args, **kwargs, &block) }) do |next_middleware, middleware|
-      -> { middleware.call(self, next_middleware, *args, **kwargs) }
+      -> { middleware.call(self, next_middleware, *args, **kwargs, &block) }
     end
     stack.call
   end
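The one-line fix forwards the caller's block into each middleware invocation, so a streaming block given to `chat` survives the whole chain. A hypothetical middleware showing the calling convention this method expects; `AiClient.use` is assumed registration plumbing from the gem and is not part of this diff:

```ruby
# Anything responding to #call with this signature can sit in the stack.
class TimingMiddleware
  def self.call(client, next_middleware, *args, **kwargs, &block)
    started = Time.now
    result  = next_middleware.call # the innermost lambda already closes over &block
    puts "chat round-trip: #{Time.now - started}s"
    result
  end
end

AiClient.use(TimingMiddleware) # assumed registration API
```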