ai_client 0.3.0 → 0.3.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 2453f748447f37e755f0087615c845570fe3ea3c4bd06a687947dceedee3e89b
- data.tar.gz: 79359bcd209448add248514d9a8ee5dc4e0fa69833666df0e779715b574d450a
+ metadata.gz: 320868a04996fe7ce2b99555b9d5b7c39579eff8be6e73a2ce60534720bd6dbb
+ data.tar.gz: 87b7cf31ba5be1a8acb03a0b56f54ef741e8e1ba0cc03b34045c10768186729d
  SHA512:
- metadata.gz: 28007804ea1e223b22846cc199c4bd14d7349f5e051bbc7007ced3641e7174c3fac59b7a225bc4926ffc68d5018a35ed96a8809bf959c8007e978bc831770b9a
- data.tar.gz: 18ce04f9f83d0c12caadab051b81c48f7dbafa73ff0b2a34df28e3ddfb5ff3c088dcec019a15bc2f2dfc909bb6db0a308b512b6711baaab058629705a1448040
+ metadata.gz: c5c96015690e48c578cf88eee643a6b5c4fea551418177b7d2f883114514a469761a502d57846976efac27f2e470b0d7cf37d646e01add5539bc40da3ab50b72
+ data.tar.gz: ab0583961a965fbc7fd51b6159bbde629d19db1ef0105c6d6567f692fc64ca42ac714b488e01aacafa4165345f74a7b9482d1f4107651e591cb337a6a8a79d27
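
The SHA256 values above can be checked locally against a downloaded `.gem` file. A minimal sketch using Ruby's standard `Digest` library (on a real artifact you would use `Digest::SHA256.file(path).hexdigest`; here the API is shown on an in-memory string so the snippet is self-contained):

```ruby
require 'digest'

# Hash an in-memory string; Digest::SHA256.file('ai_client-0.3.1.gem').hexdigest
# (hypothetical local path) would do the same for a downloaded gem.
digest = Digest::SHA256.hexdigest('')

# The well-known SHA256 of the empty string:
digest == 'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855'
```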
data/CHANGELOG.md CHANGED
@@ -1,5 +1,10 @@
  ## [Unreleased]
 
+ ### [0.3.0] - 2024-10-13
+ - updated the open_routed_extensions file
+ - added simplecov to see code coverage
+ - updated README with options doc
+
  ## Released
 
  ### [0.3.0] - 2024-10-13
data/README.md CHANGED
@@ -6,6 +6,8 @@ Version 0.3.0 has a breaking change w/r/t how [Callback Functions (aka Tools)](#
 
  See the [change log](CHANGELOG.md) for recent modifications.
 
+ You should also check out the [raix gem](https://github.com/OlympiaAI/raix). I like the way Obie's API is set up for callback functions. `raix-rails` is also available.
+
 
  <!-- Tocer[start]: Auto-generated, don't remove. -->
 
@@ -34,9 +36,23 @@ See the [change log](CHANGELOG.md) for recent modifications.
    - [speak](#speak)
    - [transcribe](#transcribe)
  - [Options](#options)
+   - [Common Options for All Methods](#common-options-for-all-methods)
+   - [Chat-specific Options](#chat-specific-options)
+   - [Embed-specific Options](#embed-specific-options)
+   - [Speak-specific Options](#speak-specific-options)
+   - [Transcribe-specific Options](#transcribe-specific-options)
  - [Advanced Prompts](#advanced-prompts)
  - [Callback Functions (aka Tools)](#callback-functions-aka-tools)
    - [Defining a Callback Function](#defining-a-callback-function)
+ - [OpenRouter Extensions and AiClient::LLM](#openrouter-extensions-and-aiclientllm)
+   - [Instance Methods](#instance-methods)
+   - [Class Methods](#class-methods)
+ - [AiClient::LLM Data Table](#aiclientllm-data-table)
+   - [Key Features](#key-features)
+   - [Class Methods](#class-methods-1)
+   - [Instance Methods](#instance-methods-1)
+   - [Usage Example](#usage-example)
+   - [Integration with ActiveHash](#integration-with-activehash)
  - [Best ?? Practices](#best--practices)
  - [OmniAI and OpenRouter](#omniai-and-openrouter)
  - [Contributing](#contributing)
@@ -254,10 +270,43 @@ response = AI.transcribe(...)
  ```
 
 
-
  ### Options
 
- TODO: document the options like `provider: :ollama`
+ The four major methods (chat, embed, speak, and transcribe) support various options that can be passed to the underlying client code. Here's a breakdown of the common options for each method:
+
+ ##### Common Options for All Methods
+
+ - `provider:` - Specifies the AI provider to use (e.g., `:openai`, `:anthropic`, `:google`, `:mistral`, `:ollama`, `:localai`).
+ - `model:` - Specifies the model to use within the chosen provider.
+ - `api_key:` - Allows passing a specific API key, overriding the default environment variable.
+ - `temperature:` - Controls the randomness of the output (typically a float between 0 and 1).
+ - `max_tokens:` - Limits the length of the generated response.
+
+ ##### Chat-specific Options
+
+ - `messages:` - An array of message objects for multi-turn conversations.
+ - `functions:` - An array of available functions/tools for the model to use.
+ - `function_call:` - Specifies how the model should use functions ("auto", "none", or a specific function name).
+ - `stream:` - Boolean to enable streaming responses.
+
+ ##### Embed-specific Options
+
+ - `input:` - The text or array of texts to embed.
+ - `dimensions:` - The desired dimensionality of the resulting embeddings (if supported by the model).
+
+ ##### Speak-specific Options
+
+ - `voice:` - Specifies the voice to use for text-to-speech (provider-dependent).
+ - `speed:` - Adjusts the speaking rate (typically a float, where 1.0 is normal speed).
+ - `format:` - Specifies the audio format of the output (e.g., "mp3", "wav").
+
+ ##### Transcribe-specific Options
+
+ - `file:` - The audio file to transcribe (can be a file path or audio data).
+ - `language:` - Specifies the language of the audio (if known).
+ - `prompt:` - Provides context or specific words to aid in transcription accuracy.
+
+ Note: The availability and exact names of these options may vary depending on the specific provider and model being used. Always refer to the documentation of the chosen provider for the most up-to-date and accurate information on supported options.
 
  ### Advanced Prompts
 
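As a standalone sketch of how the method-specific options described in the Options section layer over the common ones (plain Ruby only; the option keys follow the README's lists, but this does not call into the gem):

```ruby
# Shared defaults that apply to every method.
common = {
  provider:    :openai,
  model:       'gpt-3.5-turbo',
  temperature: 0.7,
  max_tokens:  256
}

# Chat adds its own options on top of the shared defaults.
chat_options = common.merge(stream: false, function_call: 'auto')
```

The same merge pattern works for embed, speak, and transcribe, each with their own option keys.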
@@ -331,6 +380,130 @@ In this example:
 
  See the [examples/tools.rb file](examples/tools.rb) for additional examples.
 
+ ### OpenRouter Extensions and AiClient::LLM
+
+ The `open_router.ai` API provides a service that allows you to download a JSON file containing detailed information about all of the providers and their available models. `AiClient` has saved an old copy of this information in the [`models.yml`](lib/ai_client/models.yml) file. If you want to update this file with the latest information from `open_router.ai`, you must have a valid API key.
+
+ You can still use the included `models.yml` file with the `AiClient::LLM` class. The following sections describe the convenient instance and class methods that are available. See the section on `AiClient::LLM` for complete details.
+
+ ##### Instance Methods
+
+ - **`model_details`**: Retrieves details for the current model. Returns a hash containing the model's attributes or `nil` if not found.
+
+ ```ruby
+ client = AiClient.new('gpt-3.5-turbo')
+ details = client.model_details
+ details #=>
+ {
+   :id => "openai/gpt-3.5-turbo",
+   :name => "OpenAI: GPT-3.5 Turbo",
+   :created => 1685232000,
+   :description => "GPT-3.5 Turbo is OpenAI's fastest model. It can understand and generate natural language or code, and is optimized for chat and traditional completion tasks.\n\nTraining data up to Sep 2021.",
+   :context_length => 16385,
+   :architecture => {
+     "modality" => "text->text",
+     "tokenizer" => "GPT",
+     "instruct_type" => nil
+   },
+   :pricing => {
+     "prompt" => "0.0000005",
+     "completion" => "0.0000015",
+     "image" => "0",
+     "request" => "0"
+   },
+   :top_provider => {
+     "context_length" => 16385,
+     "max_completion_tokens" => 4096,
+     "is_moderated" => true
+   },
+   :per_request_limits => {
+     "prompt_tokens" => "40395633",
+     "completion_tokens" => "13465211"
+   }
+ }
+ ```
+
+ - **`models`**: Retrieves model names for the current provider. Returns an array of strings with the names of the models.
+
+ ```ruby
+ models = client.models
+ ```
+
+ ##### Class Methods
+
+ - **`providers`**: Retrieves all available providers. Returns an array of unique symbols representing provider names.
+
+ ```ruby
+ available_providers = AiClient.providers
+ ```
+
+ - **`models(substring = nil)`**: Retrieves model IDs, optionally filtered by a substring.
+
+ ```ruby
+ available_models = AiClient.models('turbo')
+ ```
+
+ - **`model_details(model_id)`**: Retrieves details for a specific model using its ID. Accepts a string representing the model ID. Returns a hash containing the model's attributes or `nil` if not found.
+
+ ```ruby
+ model_info = AiClient.model_details('openai/gpt-3.5-turbo')
+ ```
+
+ - **`reset_llm_data`**: Resets the LLM data with the available ORC models. Returns `void`.
+
+ ```ruby
+ AiClient.reset_llm_data
+ ```
+
+ ### AiClient::LLM Data Table
+
+ The `AiClient::LLM` class serves as a central point for managing information about large language models (LLMs) available via the `open_router.ai` API service. The YAML file (`models.yml`) contains information about various LLMs and their providers. To update this information to the latest available, you must have an access API key for the `open_router.ai` service.
+
+ `AiClient::LLM` is a subclass of `ActiveHash::Base`, enabling it to act like and interact with `ActiveRecord::Base` defined models. Each entry in this data store is uniquely identified by an `id` in the pattern "provider/model", all lowercase without spaces.
+
+ ##### Key Features
+
+ - **Model and Provider Extraction**:
+   - The class provides methods to extract the model name and provider from the LLM's ID.
+   - The `model` method returns the model ID derived from the ID.
+   - The `provider` method extracts the provider name as a Symbol.
+
+ ##### Class Methods
+
+ - **`reset_llm_data`**:
+   - A class-level method that fetches the latest model data from the open_router.ai service and updates the `models.yml` file accordingly.
+
+ ##### Instance Methods
+
+ - **`model`**:
+   - Returns the name of the model derived from the LLM's ID.
+
+ ```ruby
+ llm_instance = AiClient::LLM.find('openai/gpt-3.5-turbo')
+ puts llm_instance.model # Output: gpt-3.5-turbo
+ ```
+
+ - **`provider`**:
+   - Returns the name of the provider associated with the LLM's ID.
+
+ ```ruby
+ llm_instance = AiClient::LLM.find('openai/gpt-3.5-turbo')
+ puts llm_instance.provider # Output: :openai
+ ```
+
+ ##### Usage Example
+
+ The `AiClient::LLM` class is predominantly used to interact with different providers of LLMs. By utilizing the `model` and `provider` methods, users can seamlessly retrieve and utilize models in their applications.
+
+ ```ruby
+ llm_instance = AiClient::LLM.find('google/bard')
+ puts "Model: #{llm_instance.model}, Provider: #{llm_instance.provider}"
+ ```
+
+ ##### Integration with ActiveHash
+
+ The `AiClient::LLM` class inherits from `ActiveHash::Base`, which provides an easy way to define a set of data and allows for lookups and easy manipulation of the data structure. The use of ActiveHash makes it easier to manage the LLM data effectively without needing a full database.
+
 
  ## Best ?? Practices
 
data/lib/ai_client/llm.rb CHANGED
@@ -9,6 +9,13 @@ class AiClient
    DATA_PATH = Pathname.new( __dir__ + '/models.yml')
    self.data = YAML.parse(DATA_PATH.read).to_ruby
 
+   scope :providers, -> { all.map(&:provider).uniq.map(&:to_sym) }
+
+   scope :models, ->(substring = nil) do
+     (substring.nil? ? all : all.where(id: /#{substring}/i))
+       .map(&:model).sort.uniq
+   end
+
    # Extracts the model name from the LLM ID.
    #
    # @return [String] the model name.
@@ -17,9 +24,12 @@ class AiClient
 
    # Extracts the provider name from the LLM ID.
    #
-   # @return [String] the provider name.
+   # @return [Symbol] the provider name.
    #
-   def provider = id.split('/')[0]
+   def provider = id.split('/')[0].to_sym
+
+   def to_h = attributes
+
  end
 
  class << self
@@ -34,5 +44,6 @@ class AiClient
      AiClient::LLM.data = orc_models
      AiClient::LLM::DATA_PATH.write(orc_models.to_yaml)
    end
+
  end
  end
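
The new `provider`/`model` readers and the `:models` scope above boil down to splitting the "provider/model" id and doing a case-insensitive substring match. A standalone sketch of both ideas in plain Ruby (no ActiveHash, illustrative ids only):

```ruby
ids = ['openai/gpt-3.5-turbo', 'openai/gpt-4', 'anthropic/claude-2']

# provider/model extraction, as in LLM#provider and LLM#model
provider = ids.first.split('/')[0].to_sym   # left of the slash, as a Symbol
model    = ids.first.split('/')[1]          # right of the slash

# case-insensitive substring filter, as in the new :models scope
turbo_models = ids.select { |id| id.match?(/turbo/i) }
```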
data/lib/ai_client/middleware.rb CHANGED
@@ -1,6 +1,6 @@
  # lib/ai_client/middleware.rb
 
- # TODO: As concurrently designed the middleware must
+ # TODO: As currently designed the middleware must
  #       be set before an instance of AiClient is created.
  #       Any `use` commands for middleware made after
  #       the instance is created will not be available
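
The TODO above says middleware must be registered before an `AiClient` instance is created. A toy, standalone sketch of why a snapshot-at-construction design behaves that way (`ToyClient` is hypothetical, not the gem's implementation):

```ruby
class ToyClient
  @middlewares = []

  class << self
    attr_reader :middlewares

    def use(middleware)
      @middlewares << middleware
    end
  end

  attr_reader :stack

  def initialize
    # The middleware stack is captured when the instance is built;
    # later class-level `use` calls do not reach this instance.
    @stack = self.class.middlewares.dup
  end
end

ToyClient.use(:retry)
client = ToyClient.new
ToyClient.use(:logging)  # registered too late to affect `client`

client.stack  #=> [:retry]
```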
data/lib/ai_client/open_router_extensions.rb CHANGED
@@ -1,58 +1,74 @@
  # lib/ai_client/open_router_extensions.rb
- # frozen_string_literal: true
 
- # These extensions to AiClient are only available with
- # a valid API Key for the open_router.ai web-service
+ # OpenRouter Extensions for AiClient
+ #
+ # This file adds several public instance and class methods to the AiClient class
+ # to provide information about AI models and providers.
+ #
+ # Instance Methods:
+ # - model_details: Retrieves details for the current model.
+ # - models: Retrieves model names for the current provider.
+ #
+ # Class Methods:
+ # - providers: Retrieves all available providers.
+ # - models: Retrieves model names, optionally filtered by provider.
+ # - model_details: Retrieves details for a specific model.
+ #
+ # These methods utilize the AiClient::LLM class and the models.yml file
+ # for model information.
 
  require 'open_router'
  require 'yaml'
 
  class AiClient
-
-   # Retrieves the available models.
-   #
-   # @return [Array<String>] List of model IDs.
-   #
-   def models
-     self.class.models
-   end
 
-   # Retrieves the available providers.
+   # Retrieves details for the current model.
    #
-   # @return [Array<String>] List of provider names.
-   def providers
-     self.class.providers
+   # @return [Hash, nil] Details of the current model or nil if not found.
+   def model_details
+     id = "#{@provider}/#{@model}"
+     LLM.find(id.downcase)
    end
 
-   # Retrieves model names, optionally filtered by provider.
+   # Retrieves model names for the current provider.
    #
-   # @param provider [String, nil] The provider to filter models by.
-   # @return [Array<String>] List of model names.
-   def model_names(provider = nil)
-     self.class.model_names(provider)
-   end
+   # @return [Array<String>] List of model names for the current provider.
+   def models = LLM.models(@provider)
 
-   # Retrieves details for a specific model.
-   #
-   # @param a_model [String] The model ID to retrieve details for.
-   # @return [Hash, nil] Details of the model or nil if not found.
-   def model_details(a_model)
-     self.class.model_details(a_model)
-   end
 
-   # Finds models matching a given substring.
-   #
-   # @param a_model_substring [String] The substring to search for.
-   # @return [Array<String>] List of matching model names.
-   def find_model(a_model_substring)
-     self.class.find_model(a_model_substring)
-   end
+   class << self
 
+     # Retrieves all available providers.
+     #
+     # @return [Array<Symbol>] List of all provider names.
+     def providers = LLM.providers
 
-   class << self
-
-     # Adds OpenRouter extensions to AiClient.
+
+     # Retrieves model names, optionally filtered by provider.
+     #
+     # @param substring [String, nil] Optional substring to filter models by.
+     # @return [Array<String>] List of model names.
+     def models(substring = nil) = LLM.models(substring)
+
+     # Retrieves details for a specific model.
+     #
+     # @param model_id [String] The model ID to retrieve details for,
+     #   in the pattern "provider/model".downcase
+     # @return [AiClient::LLM, nil] Details of the model or nil if not found.
+     def model_details(model_id) = LLM.find(model_id.downcase)
+
+
+     # Resets LLM data with the available ORC models.
+     #
+     # @return [void]
      #
+     def reset_llm_data = LLM.reset_llm_data
+
+
+     # Initializes OpenRouter extensions for AiClient.
+     #
+     # This sets up the access token and initializes the ORC client.
+     #
      # @return [void]
      #
      def add_open_router_extensions
@@ -64,7 +80,10 @@ class AiClient
        initialize_orc_client
      end
 
-     # Retrieves ORC client instance.
+
+     private
+
+     # Retrieves the ORC client instance.
      #
      # @return [OpenRouter::Client] Instance of the OpenRouter client.
      #
@@ -80,59 +99,7 @@ class AiClient
        @orc_models ||= orc_client.models
      end
 
-     # TODO: Refactor these DB like methods to take
-     #       advantage of AiClient::LLM
-
-     # Retrieves model names associated with a provider.
-     #
-     # @param provider [String, nil] The provider to filter models by.
-     # @return [Array<String>] List of model names.
-     #
-     def model_names(provider=nil)
-       model_ids = models.map { _1['id'] }
-
-       return model_ids unless provider
-
-       model_ids.filter_map { _1.split('/')[1] if _1.start_with?(provider.to_s.downcase) }
-     end
-
-     # Retrieves details of a specific model.
-     #
-     # @param model [String] The model ID to retrieve details for.
-     # @return [Hash, nil] Details of the model or nil if not found.
-     #
-     def model_details(model)
-       orc_models.find { _1['id'].include?(model) }
-     end
-
-     # Retrieves the available providers.
-     #
-     # @return [Array<String>] List of unique provider names.
-     #
-     def providers
-       @providers ||= models.map{ _1['id'].split('/')[0] }.sort.uniq
-     end
-
-     # Finds models matching a given substring.
-     #
-     # @param a_model_substring [String] The substring to search for.
-     # @return [Array<String>] List of matching model names.
-     #
-     def find_model(a_model_substring)
-       model_names.select{ _1.include?(a_model_substring) }
-     end
-
-     # Resets LLM data with the available ORC models.
-     #
-     # @return [void]
-     #
-     def reset_llm_data
-       LLM.data = orc_models
-       LLM::DATA_PATH.write(orc_models.to_yaml)
-     end
-
 
-     private
 
      # Fetches the access token from environment variables.
      #
@@ -154,10 +121,9 @@ class AiClient
        OpenRouter.configure { |config| config.access_token = access_token }
      end
 
-     # Initializes the ORC client.
-     #
-     # @return [void]
+     # Initializes the ORC client instance.
      #
+     # @return [OpenRouter::Client] Instance of the OpenRouter client.
      def initialize_orc_client
        @orc_client ||= OpenRouter::Client.new
      end
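
Both `orc_client` and `orc_models` above rely on `||=` memoization: the expensive object is built on first access and cached thereafter. A standalone sketch of the pattern (toy class, not the gem's code):

```ruby
class Lazy
  attr_reader :builds

  def initialize
    @builds = 0
  end

  def client
    # Built on first call only; subsequent calls return the cached object.
    @client ||= begin
      @builds += 1   # count how many times the expensive object is built
      Object.new
    end
  end
end

lazy   = Lazy.new
first  = lazy.client
second = lazy.client
first.equal?(second)  #=> true, built only once
```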
data/lib/ai_client/tool.rb CHANGED
@@ -1,14 +1,11 @@
  # lib/ai_client/tool.rb
 
- # TODO: Turn this into a Function class using the pattern
- #       in examples/tools.rb
- #       put the function names as symbols into a class Array
- #       In the AiClient class transform the tools: []
- #       parameter from an Array of Symbols into an Array
- #       of Function instances.
-
  class AiClient::Tool < OmniAI::Tool
 
+   # TODO: Is there any additional functionality that
+   #       needs to be added to the Tool class that would
+   #       be helpful?
+
    def xyzzy = self.class.xyzzy
 
    class << self
data/lib/ai_client/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true
 
  class AiClient
-   VERSION = "0.3.0"
+   VERSION = "0.3.1"
  end
data/lib/ai_client.rb CHANGED
@@ -1,6 +1,17 @@
  # ai_client.rb
- # WIP: a generic client to access LLM providers
- #      kinda like the SaaS "open router"
+
+ # A generic client to access various LLM providers
+ # Inspired by the SaaS "open router" concept
+
+
+ # AiClient: A unified interface for interacting with various LLM providers
+ #
+ # Usage:
+ #   client = AiClient.new('gpt-3.5-turbo')
+ #
+ # Add middlewares:
+ #   AiClient.use(RetryMiddleware.new(max_retries: 5, base_delay: 2, max_delay: 30))
+ #   AiClient.use(LoggingMiddleware.new(AiClient.configuration.logger))
  #
 
  unless defined?(DebugMe)
@@ -134,29 +145,30 @@ class AiClient
    @last_response = nil
  end
 
- # TODO: Review these raw-ish methods are they really needed?
- #       raw? should be a private method ??
-
- # Returns the last response received from the client.
- #
- # @return [OmniAI::Response] The last response.
- #
- def response = last_response
 
  # Checks if the client is set to return raw responses.
  #
  # @return [Boolean] True if raw responses are to be returned.
- def raw? = config.return_raw
-
+ def raw?
+   config.return_raw
+ end
+
 
  # Sets whether to return raw responses.
  #
  # @param value [Boolean] The value to set for raw responses return.
- #
  def raw=(value)
    config.return_raw = value
  end
 
+
+ # Returns the last response received from the client.
+ #
+ # @return [OmniAI::Response] The last response.
+ #
+ def response = last_response
+
+
  # Extracts the content from the last response based on the provider.
  #
  # @return [String] The extracted content.
@@ -288,5 +300,3 @@ class AiClient
    raise(ArgumentError, "Unsupported model: #{model}")
  end
  end
-
-
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: ai_client
  version: !ruby/object:Gem::Version
-   version: 0.3.0
+   version: 0.3.1
  platform: ruby
  authors:
  - Dewayne VanHoozer
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2024-10-14 00:00:00.000000000 Z
+ date: 2024-10-20 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: active_hash
@@ -262,7 +262,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  - !ruby/object:Gem::Version
    version: '0'
  requirements: []
- rubygems_version: 3.5.21
+ rubygems_version: 3.5.22
  signing_key:
  specification_version: 4
  summary: A generic AI Client for many providers