ai_client 0.3.1 → 0.4.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 320868a04996fe7ce2b99555b9d5b7c39579eff8be6e73a2ce60534720bd6dbb
- data.tar.gz: 87b7cf31ba5be1a8acb03a0b56f54ef741e8e1ba0cc03b34045c10768186729d
+ metadata.gz: 72666a60d1c49346ce8512fbfe97fbc163801f5cc22e9f7ed3883c3c8e28901b
+ data.tar.gz: bbcaee95e9ef2a6fd8bad5159ff8aa6d8cc88f443302dd37573c0bf7da512a66
  SHA512:
- metadata.gz: c5c96015690e48c578cf88eee643a6b5c4fea551418177b7d2f883114514a469761a502d57846976efac27f2e470b0d7cf37d646e01add5539bc40da3ab50b72
- data.tar.gz: ab0583961a965fbc7fd51b6159bbde629d19db1ef0105c6d6567f692fc64ca42ac714b488e01aacafa4165345f74a7b9482d1f4107651e591cb337a6a8a79d27
+ metadata.gz: 04045a6f8a78671930429f9481e4319b36cf5955d802f8f6ae5b4ec6c4bd1b4c88d4854bcaa44fbb3b4a0f4b7d68c7a4f3d0c4045c767c590cf417efac905e5f
+ data.tar.gz: d26f1770c0dd38d6c83fed91eca4e8489ac485415b91bb0e0d5ef0496f09af20f0e532974b984fcabedd997562494f20881966b3b5bee0e1a091013431f5e18d
data/CHANGELOG.md CHANGED
@@ -1,12 +1,22 @@
  ## [Unreleased]
 
- ### [0.3.0] - 2024-10-13
+ ## Released
+
+ ### [0.4.0] - 2024-10-20
+ - Removed Logger.new(STDOUT) from the default configuration
+ > config.logger now returns nil. If you want a class- or instance-level logger, set one up yourself, e.g. config.logger = Logger.new(STDOUT).
+ - Added basic @context for chat-bots
+ - Added `context_length` to configuration as the number of responses to remember as context
+ - Added a default model for each major provider; using "auto" for open_router.ai
+ - Added a default provider (OpenAI)
+ > AiClient.new() will use the config.default_provider and that provider's default_model
+ - Fixed a problem with advanced block-based prompt construction for chat
+
+ ### [0.3.1] - 2024-10-19
  - updated the open_routed_extensions file
  - added simplecov to see code coverage
  - updated README with options doc
 
- ## Released
-
  ### [0.3.0] - 2024-10-13
  - Breaking Change
  - Added new class AiClient::Function to encapsulate the callback functions used as tools in chats.
data/README.md CHANGED
@@ -2,11 +2,13 @@
 
  First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com/) for his gem [OmniAI](https://github.com/ksylvest/omniai) and [Olympia](https://olympia.chat/) for their [open_router gem](https://github.com/OlympiaAI/open_router) upon which this effort depends.
 
- Version 0.3.0 has a breaking change w/r/t how [Callback Functions (aka Tools)](#callback-functions-aka-tools) are defined and used.
+ Version 0.4.0 has two changes that may break your existing application.
+ 1. The default configuration no longer has a Logger instance. You will need to add your own instance to either the class or the instance configuration using `AiClient.class_config.logger = YourLogger` and/or `client.config.logger = YourLogger`.
+ 2. The `chat` method now keeps a context window. The window length is defined by the configuration item `context_length`. If you do not want to maintain a context window, set the `context_length` configuration item to either nil or zero.
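A minimal migration sketch covering both points, using the `AiClient.class_config` accessors named above:

```ruby
require 'ai_client'
require 'logger'

# 1. Restore logging; the default configuration no longer supplies a Logger.
AiClient.class_config.logger = Logger.new(STDOUT)

# 2. Opt out of the new context window if you depended on stateless chats.
AiClient.class_config.context_length = 0 # nil works as well
```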
 
  See the [change log](CHANGELOG.md) for recent modifications.
 
- You should also checkout the [raix gem](https://github.com/OlympiaAI/raix). I like the way that Obie's API is setup for callback functions. `raix-rails` is also available.
+ You should also check out the [raix gem](https://github.com/OlympiaAI/raix). I like the way Obie's API is set up for callback functions. [raix-rails](https://github.com/OlympiaAI/raix-rails) is also available.
 
 
  <!-- Tocer[start]: Auto-generated, don't remove. -->
@@ -32,6 +34,7 @@ You should also check out the [raix gem](https://github.com/OlympiaAI/raix). I l
  - [3. Load Complete Configuration from a YAML File](#3-load-complete-configuration-from-a-yaml-file)
  - [Top-level Client Methods](#top-level-client-methods)
  - [chat](#chat)
+ - [Context](#context)
  - [embed](#embed)
  - [speak](#speak)
  - [transcribe](#transcribe)
@@ -112,7 +115,7 @@ AiClient.class_config.envar_api_key_names = {
  google: 'your_envar_name',
  mistral: 'your_envar_name',
  open_router: 'your_envar_name',
- opena: 'your_envar_name'
+ openai: 'your_envar_name'
  }
 
  AiClient.class_config.save('path/to/file.yml')
@@ -139,13 +142,49 @@ To explicitly designate a provider to use with an AiClient instance use the para
  Basic usage:
 
  ```ruby
- AI = AiClient.new('gpt-4o')
+ require 'ai_client'
+ ai = AiClient.new # use default model and provider
+ ai.model #=> 'gpt-4o' is the default
+ ai.provider #=> :openai is the default
+ #
+ # To change the class defaults:
+ #
+ AiClient.default_provider = :anthropic
+ AiClient.default_model[:openai] = 'gpt-4o-mini'
+ #
+ # To get an Array of models and providers
+ #
+ AiClient.models # from open_router.ai
+ AiClient.providers # from open_router.ai
+ #
+ # To get details about a specific provider/model pair:
+ #
+ AiClient.model_details('openai/gpt-4o-mini') # from open_router.ai
  ```
 
- That's it. Just provide the model name that you want to use. If you application is using more than one model, no worries, just create multiple AiClient instances.
+ You can specify which model you want to use and `AiClient` will use the provider associated with that model.
+
+
+ ```ruby
+ AI = AiClient.new('gpt-4o-mini') # sets provider to :openai
+ #
+ # If you want to use the open_router.ai service instead of
+ # going directly to OpenAI do it this way:
+ #
+ AI = AiClient.new('openai/gpt-4o-mini') # sets provider to :open_router
+ ```
+
+ Of course you could specify both the model and the provider that you want to use:
+
 
  ```ruby
- c1 = AiClient.new('nomic-embeddings-text')
+ AI = AiClient.new('mistral', provider: :ollama)
+ ```
+
+ That's it. What could be simpler? If your application is using more than one model, no worries, just create multiple `AiClient` instances.
+
+ ```ruby
+ c1 = AiClient.new('nomic-embed-text')
  c2 = AiClient.new('gpt-4o-mini')
  ```
 
@@ -164,6 +203,53 @@ There are three levels of configuration, each inheriting from the level above.
 
  The file [lib/ai_client/configuration.rb] hard-codes the default configuration. This is used to update the [lib/ai_client/config.yml] file during development. If you have changes for this configuration, please send me a pull request so we can all benefit from your efforts.
 
+ ```ruby
+ {
+ :logger => nil,
+ :timeout => nil,
+ :return_raw => false,
+ :context_length => 5,
+ :providers => {},
+ :envar_api_key_names => {
+ :anthropic => [
+ "ANTHROPIC_API_KEY"
+ ],
+ :google => [
+ "GOOGLE_API_KEY"
+ ],
+ :mistral => [
+ "MISTRAL_API_KEY"
+ ],
+ :open_router => [
+ "OPEN_ROUTER_API_KEY",
+ "OPENROUTER_API_KEY"
+ ],
+ :openai => [
+ "OPENAI_API_KEY"
+ ]
+ },
+ :provider_patterns => {
+ :anthropic => /^claude/i,
+ :openai => /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
+ :google => /^(gemini|gemma|palm)/i,
+ :mistral => /^(mistral|codestral|mixtral)/i,
+ :localai => /^local-/i,
+ :ollama => /(llama|nomic)/i,
+ :open_router => /\//
+ },
+ :default_provider => :openai,
+ :default_model => {
+ :anthropic => "claude-3-5-sonnet-20240620",
+ :openai => "gpt-4o",
+ :google => "gemini-pro-1.5",
+ :mistral => "mistral-large",
+ :localai => "llama3.2",
+ :ollama => "llama3.2",
+ :open_router => "auto"
+ }
+ }
+ ```
+
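A short sketch of overriding a few of these defaults at the class level, using the `AiClient.class_config` accessor and the `save` method shown earlier (which keys you touch is up to you):

```ruby
require 'ai_client'

# Tune selected defaults before creating any clients.
AiClient.class_config.context_length = 10                    # remember more responses
AiClient.class_config.default_model[:openai] = 'gpt-4o-mini' # cheaper OpenAI default

# Persist the customized configuration for reuse.
AiClient.class_config.save('path/to/file.yml')
```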
  #### Class Configuration
 
  The class configuration is derived initially from the default configuration. It can be changed in three ways.
@@ -243,6 +329,16 @@ The response will be a simple string or a response object based upon the setting
 
  See the [Advanced Prompts] section to learn how to configure a complex prompt message.
 
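Whether `chat` returns the simple string or the full response object is governed by the `return_raw` configuration item from the defaults above; a small sketch (the raw object's class depends on the underlying OmniAI provider):

```ruby
ai = AiClient.new('gpt-4o-mini')

ai.config.return_raw = false # the default
ai.chat('Hello')             #=> a plain String with the content

ai.config.return_raw = true
ai.chat('Hello')             #=> the provider's full response object
```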
+ ###### Context
+
+ **context_length**
+
+ The `context_length` configuration item keeps the last `context_length` responses within the chat context window. If you do not want a context window, set `config.context_length` to zero (or nil) at either the class or the instance level; chat responses will then be generated without the LLM seeing any prior context. If you are implementing a chat-bot, you will want it to keep the context of the current conversation.
+
+ ```ruby
+ AiClient.config.context_length #=> 5
+ AiClient.config.context_length = 0 # Turns off the context window
+ ```
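A sketch of a chat-bot exchange that leans on the context window, plus the `clear_context` method added in this release (the responses are illustrative):

```ruby
ai = AiClient.new # defaults: :openai / gpt-4o

ai.chat('My name is Alice.')
ai.chat('What is my name?') # the previous response is supplied as context

ai.clear_context            # start a fresh conversation
ai.chat('What is my name?') # no prior context is sent
```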
 
  ##### embed
 
@@ -258,7 +354,7 @@ Recommendation: Use PostgreSQL, pg_vector and the neighbor gem.
  ##### speak
 
  ```ruby
- res[pmse = AI.speak("Isn't it nice to have a computer that will talk to you?")
+ response = AI.speak("Isn't it nice to have a computer that will talk to you?")
  ```
 
  The response will contain audio data that can be played, manipulated or saved to a file.
@@ -313,7 +409,7 @@ Note: The availability and exact names of these options may vary depending on th
  In more complex applications, providing a simple string as your prompt is not sufficient. AiClient can take advantage of OmniAI's complex message builder.
 
  ```ruby
- client = AiClient.new 'some_model_bane'
+ client = AiClient.new 'some_model_name'
 
  completion = client.chat do |prompt|
  prompt.system('You are an expert biologist with an expertise in animals.')
@@ -9,8 +9,17 @@ class AiClient
  # stream: @stream [Proc, nil] optional
  # tools: @tools [Array<OmniAI::Tool>] optional
  # temperature: @temperature [Float, nil] optional
-
+ #
+ # Initiates a chat session.
+ #
+ # @param messages [Array<String>] the messages to send.
+ # @param params [Hash] optional parameters.
+ # @option params [Array<OmniAI::Tool>] :tools an array of tools to use.
+ # @return [String] the result from the chat.
+ #
+ # @raise [RuntimeError] if tools parameter is invalid.
+ #
- def chat(messages, **params)
+ def chat(messages='', **params, &block)
  if params.has_key? :tools
  tools = params[:tools]
  if tools.is_a? Array
@@ -23,14 +23,62 @@ class AiClient
  params[:tools] = tools
  end
 
- result = call_with_middlewares(:chat_without_middlewares, messages, **params)
- @last_response = result
- raw? ? result : content
+ @last_messages = messages
+ messages = add_context(messages)
+ result = call_with_middlewares(
+ :chat_without_middlewares,
+ messages,
+ **params,
+ &block
+ )
+ @last_response = result
+ result = raw? ? result : content
+
+ @context.push(@last_response)
+
+ result
  end
 
 
- def chat_without_middlewares(messages, **params)
- @client.chat(messages, model: @model, **params)
+ # Adds context to the current prompt.
+ #
+ # @param prompt [String, Array<String>] the current prompt.
+ # @return [String, Array<String>] the prompt with context added.
+ #
+ def add_context(prompt)
+ return(prompt) if @config.context_length.nil? ||
+ 0 == @config.context_length ||
+ prompt.is_a?(Array) ||
+ @context.empty?
+
+
+ prompt << "\nUse the following context in crafting your response.\n"
+
+ @context[..config.context_length].each do |result|
+ prompt << "You previously responded with:\n"
+ prompt << "#{raw? ? result.inspect : content(result)}"
+ end
+
+ prompt
+ end
+
+
+ # Clears the current context.
+ #
+ # @return [void]
+ #
+ def clear_context
+ @context = []
  end
 
+
+ # Chats with the client without middleware processing.
+ #
+ # @param messages [Array<String>] the messages to send.
+ # @param params [Hash] optional parameters.
+ # @return [String] the result from the chat.
+ #
+ def chat_without_middlewares(messages, **params, &block)
+ @client.chat(messages, model: @model, **params, &block)
+ end
  end
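With the new signature, a block passes all the way through to OmniAI; a usage sketch (the exact object yielded while streaming comes from OmniAI, so treat the block body as an assumption):

```ruby
ai = AiClient.new('gpt-4o-mini')

# Plain call; the add_context/@context bookkeeping happens automatically.
answer = ai.chat('Summarize Hamlet in one sentence.')

# Streaming: the &block is now forwarded through the middleware stack.
ai.chat('Tell me a longer story.') do |chunk|
  print chunk # chunk shape depends on OmniAI's streaming API
end
```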
@@ -1,23 +1,8 @@
  ---
- :logger: !ruby/object:Logger
- level: 0
- progname:
- default_formatter: !ruby/object:Logger::Formatter
- datetime_format:
- formatter:
- logdev: !ruby/object:Logger::LogDevice
- shift_period_suffix:
- shift_size:
- shift_age:
- filename:
- dev: !ruby/object:IO {}
- binmode: false
- reraise_write_errors: []
- mon_data: !ruby/object:Monitor {}
- mon_data_owner_object_id: 45380
- level_override: {}
+ :logger:
  :timeout:
  :return_raw: false
+ :context_length: 5
  :providers: {}
  :envar_api_key_names:
  :anthropic:
@@ -39,3 +24,12 @@
  :localai: !ruby/regexp /^local-/i
  :ollama: !ruby/regexp /(llama|nomic)/i
  :open_router: !ruby/regexp /\//
+ :default_provider: :openai
+ :default_model:
+ :anthropic: claude-3-5-sonnet-20240620
+ :openai: gpt-4o
+ :google: gemini-pro-1.5
+ :mistral: mistral-large
+ :localai: llama3.2
+ :ollama: llama3.2
+ :open_router: auto
@@ -152,9 +152,10 @@ class AiClient
  #
  def initialize_defaults
  @default_config = Config.new(
- logger: Logger.new(STDOUT),
+ logger: nil, # Logger.new(STDOUT),
  timeout: nil,
  return_raw: false,
+ context_length: 5, # number of responses to add as context
  providers: {},
  envar_api_key_names: {
  anthropic: ['ANTHROPIC_API_KEY'],
@@ -171,6 +172,16 @@ class AiClient
  localai: /^local-/i,
  ollama: /(llama|nomic)/i,
  open_router: /\//
+ },
+ default_provider: :openai,
+ default_model: {
+ anthropic: 'claude-3-5-sonnet-20240620',
+ openai: 'gpt-4o',
+ google: 'gemini-pro-1.5',
+ mistral: 'mistral-large',
+ localai: 'llama3.2',
+ ollama: 'llama3.2',
+ open_router: 'auto'
  }
  )
 
@@ -23,7 +23,7 @@ class AiClient
  #
  def call_with_middlewares(method, *args, **kwargs, &block)
  stack = self.class.middlewares.reverse.reduce(-> { send(method, *args, **kwargs, &block) }) do |next_middleware, middleware|
- -> { middleware.call(self, next_middleware, *args, **kwargs) }
+ -> { middleware.call(self, next_middleware, *args, **kwargs, &block) }
  end
  stack.call
  end
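Since middlewares are invoked as `middleware.call(self, next_middleware, *args, **kwargs, &block)`, any callable with that shape will work; a hypothetical timing middleware as a sketch (the registration call is assumed, not shown in this diff):

```ruby
# Hypothetical middleware: times each call and logs it if a logger is set.
class TimingMiddleware
  def self.call(client, next_middleware, *_args, **_kwargs, &_block)
    started = Time.now
    result  = next_middleware.call # run the rest of the stack
    elapsed = Time.now - started
    client.config.logger&.info("call took #{elapsed.round(3)}s")
    result
  end
end

# Registration sketched as AiClient.use(TimingMiddleware);
# check the gem for the actual registration API.
```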