ai_client 0.2.5 → 0.3.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: f2b330f59ff7f380306ec354ce5f7197adce8dd101a4b506ab137568f098242a
- data.tar.gz: f169296b9f20479b1c608f2202d9ab8a8eb416aa80b07f4308ec4648b9c86fcc
+ metadata.gz: 320868a04996fe7ce2b99555b9d5b7c39579eff8be6e73a2ce60534720bd6dbb
+ data.tar.gz: 87b7cf31ba5be1a8acb03a0b56f54ef741e8e1ba0cc03b34045c10768186729d
  SHA512:
- metadata.gz: ce0e2cddd8ec6935a5bea24125358fbcc8fda902340d7767972142505af41eaeef7d7f6f5791bd7d26469fd45e59020237fd0061e75d6a987d8a4181597cdbe9
- data.tar.gz: 4b2513fba82802eda13cf26c1af0e73f0903a2140765401ed686540d80480d887f2152e342a5fe3546c89482320f61cc6bd04b4e41b8851668bbda31b2496fc9
+ metadata.gz: c5c96015690e48c578cf88eee643a6b5c4fea551418177b7d2f883114514a469761a502d57846976efac27f2e470b0d7cf37d646e01add5539bc40da3ab50b72
+ data.tar.gz: ab0583961a965fbc7fd51b6159bbde629d19db1ef0105c6d6567f692fc64ca42ac714b488e01aacafa4165345f74a7b9482d1f4107651e591cb337a6a8a79d27
data/CHANGELOG.md CHANGED
@@ -1,9 +1,19 @@
  ## [Unreleased]
 
+ ### [0.3.1] - 2024-10-13
+ - updated the open_router_extensions file
+ - added simplecov to see code coverage
+ - updated README with options doc
+
  ## Released
 
+ ### [0.3.0] - 2024-10-13
+ - Breaking Change
+ - Added new class AiClient::Function to encapsulate the callback functions used as tools in chats.
+ - Updated the examples/tools.rb file to use the new function class
+
  ### [0.2.5] - 2024-10-11
- - Added examples/tool.rb to demonstrate use of function callbacks to provide information to the LLM when it needs it.
+ - Added examples/tools.rb to demonstrate use of function callbacks to provide information to the LLM when it needs it.
 
  ### [0.2.4] - 2024-10-10
  - constrained gem omniai-openai to version 1.8.3+ for access to open_router.ai
data/README.md CHANGED
@@ -2,8 +2,12 @@
 
  First and foremost a big **THANK YOU** to [Kevin Sylvestre](https://ksylvest.com/) for his gem [OmniAI](https://github.com/ksylvest/omniai) and [Olympia](https://olympia.chat/) for their [open_router gem](https://github.com/OlympiaAI/open_router) upon which this effort depends.
 
+ Version 0.3.0 has a breaking change with respect to how [Callback Functions (aka Tools)](#callback-functions-aka-tools) are defined and used.
+
  See the [change log](CHANGELOG.md) for recent modifications.
 
+ You should also check out the [raix gem](https://github.com/OlympiaAI/raix). I like the way Obie's API is set up for callback functions. `raix-rails` is also available.
+
 
  <!-- Tocer[start]: Auto-generated, don't remove. -->
 
@@ -32,8 +36,23 @@ See the [change log](CHANGELOG.md) for recent modifications.
    - [speak](#speak)
    - [transcribe](#transcribe)
  - [Options](#options)
+   - [Common Options for All Methods](#common-options-for-all-methods)
+   - [Chat-specific Options](#chat-specific-options)
+   - [Embed-specific Options](#embed-specific-options)
+   - [Speak-specific Options](#speak-specific-options)
+   - [Transcribe-specific Options](#transcribe-specific-options)
  - [Advanced Prompts](#advanced-prompts)
- - [Advanced Prompts with Tools](#advanced-prompts-with-tools)
+ - [Callback Functions (aka Tools)](#callback-functions-aka-tools)
+   - [Defining a Callback Function](#defining-a-callback-function)
+ - [OpenRouter Extensions and AiClient::LLM](#openrouter-extensions-and-aiclientllm)
+   - [Instance Methods](#instance-methods)
+   - [Class Methods](#class-methods)
+ - [AiClient::LLM Data Table](#aiclientllm-data-table)
+   - [Key Features](#key-features)
+   - [Class Methods](#class-methods-1)
+   - [Instance Methods](#instance-methods-1)
+   - [Usage Example](#usage-example)
+   - [Integration with ActiveHash](#integration-with-activehash)
  - [Best ?? Practices](#best--practices)
  - [OmniAI and OpenRouter](#omniai-and-openrouter)
  - [Contributing](#contributing)
@@ -251,10 +270,43 @@ response = AI.transcribe(...)
  ```
 
 
-
  ### Options
 
- TODO: document the options like `provider: :ollama`
+ The four major methods (chat, embed, speak, and transcribe) support various options that can be passed to the underlying client code. Here's a breakdown of the common options for each method:
+
+ ##### Common Options for All Methods
+
+ - `provider:` - Specifies the AI provider to use (e.g., `:openai`, `:anthropic`, `:google`, `:mistral`, `:ollama`, `:localai`).
+ - `model:` - Specifies the model to use within the chosen provider.
+ - `api_key:` - Allows passing a specific API key, overriding the default environment variable.
+ - `temperature:` - Controls the randomness of the output (typically a float between 0 and 1).
+ - `max_tokens:` - Limits the length of the generated response.
+
+ ##### Chat-specific Options
+
+ - `messages:` - An array of message objects for multi-turn conversations.
+ - `functions:` - An array of available functions/tools for the model to use.
+ - `function_call:` - Specifies how the model should use functions ("auto", "none", or a specific function name).
+ - `stream:` - Boolean to enable streaming responses.
+
+ ##### Embed-specific Options
+
+ - `input:` - The text or array of texts to embed.
+ - `dimensions:` - The desired dimensionality of the resulting embeddings (if supported by the model).
+
+ ##### Speak-specific Options
+
+ - `voice:` - Specifies the voice to use for text-to-speech (provider-dependent).
+ - `speed:` - Adjusts the speaking rate (typically a float, where 1.0 is normal speed).
+ - `format:` - Specifies the audio format of the output (e.g., "mp3", "wav").
+
+ ##### Transcribe-specific Options
+
+ - `file:` - The audio file to transcribe (can be a file path or audio data).
+ - `language:` - Specifies the language of the audio (if known).
+ - `prompt:` - Provides context or specific words to aid in transcription accuracy.
+
+ Note: The availability and exact names of these options may vary depending on the specific provider and model being used. Always refer to the documentation of the chosen provider for the most up-to-date and accurate information on supported options.
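The forwarding model behind these options can be illustrated with a dependency-free sketch. `FakeBackend` and the top-level `chat` below are invented stand-ins (not the gem's internals); the point is only the keyword-splat pass-through that lets per-call options reach the underlying provider client:

```ruby
# Illustrative only: FakeBackend stands in for a provider client such as OmniAI's.
class FakeBackend
  def chat(text, temperature: 0.7, max_tokens: nil, **rest)
    "temp=#{temperature} max=#{max_tokens} extra=#{rest}"
  end
end

# A thin wrapper that forwards any per-call options untouched.
def chat(text, **params)
  FakeBackend.new.chat(text, **params)
end

puts chat("hello", temperature: 0.2, max_tokens: 60)
# => temp=0.2 max=60 extra={}
```

Unknown options land in the backend's `**rest`, which is how provider-specific options can flow through a generic interface.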
 
  ### Advanced Prompts
 
@@ -278,13 +330,179 @@ completion #=> 'The photos are of a cat, a dog, and a hamster.'
 
  Of course if `client.config.return_raw` is true, the completion value will be the complete response object.
 
- ### Advanced Prompts with Tools
 
- One of the latest innovations in LLMs is the ability to use functions (aka tools) as `callbacks` to gather more information or to execute a task at the direction of the LLM prompt processing.
+ ### Callback Functions (aka Tools)
+
+ With the release of version 0.3.0, the way callback functions (also referred to as tools) are defined in the `ai_client` gem has changed significantly. These changes create a clearer and more robust interface for developers working with callback functions. If you encounter any issues while updating your functions, please consult the official documentation or raise an issue in the repository.
+
+ ##### Defining a Callback Function
+
+ To define a callback function, create a subclass of `AiClient::Function` and implement both the `call` and `details` class methods.
+
+ **Example**
+
+ Here's an example illustrating how to define a callback function using the new convention:
+
+ ```ruby
+ class WeatherFunction < AiClient::Function
+   # The call class method returns a String to be used by the LLM
+   def self.call(location:, unit: 'Celsius')
+     "#{rand(20..50)}° #{unit} in #{location}"
+   end
+
+   # The details method must return a hash with metadata about the function.
+   def self.details
+     {
+       name: 'weather',
+       description: "Lookup the weather in a location",
+       parameters: AiClient::Tool::Parameters.new(
+         properties: {
+           location: AiClient::Tool::Property.string(description: 'e.g. Toronto'),
+           unit: AiClient::Tool::Property.string(enum: %w[Celsius Fahrenheit]),
+         },
+         required: [:location]
+       )
+     }
+   end
+ end
+
+ # Register the WeatherFunction for use.
+ WeatherFunction.register
+
+ # Use the *.details[:name] value to reference the tools available for
+ # the LLM to use in processing the prompt.
+ response = AI.chat("what is the weather in London", tools: ['weather'])
+ ```
+
+ In this example:
+ - The `call` method is defined to accept named parameters: `location` and `unit`. The default value for `unit` is set to `'Celsius'`.
+ - The `details` method provides metadata about the function, ensuring that the parameters section clearly indicates which parameters are required.
+
+ See the [examples/tools.rb file](examples/tools.rb) for additional examples.
+
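The register/lookup cycle shown above boils down to a class-level registry keyed by `details[:name]`. Here is a dependency-free sketch of that pattern in plain Ruby; the class names mirror the gem's convention, but the implementation is illustrative (the real `AiClient::Function.register` wraps each entry in an `AiClient::Tool`):

```ruby
# Minimal registry pattern: subclasses register themselves under details[:name].
class Function
  @@registry = {}

  def self.register
    # `details` resolves on the subclass doing the registering.
    @@registry[details[:name]] = self
  end

  def self.registry
    @@registry
  end
end

class WeatherFunction < Function
  def self.call(location:, unit: 'Celsius')
    "22° #{unit} in #{location}"   # fixed value instead of rand, for determinism
  end

  def self.details
    { name: 'weather', description: 'Lookup the weather' }
  end
end

WeatherFunction.register
puts Function.registry['weather'].call(location: 'London')
# => 22° Celsius in London
```

At chat time, the gem only needs the `'weather'` string to find and invoke the registered class.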
+ ### OpenRouter Extensions and AiClient::LLM
+
+ The `open_router.ai` API provides a service that allows you to download a JSON file containing detailed information about all of the providers and their available models. `AiClient` ships a snapshot of this information in the [`models.yml`](lib/ai_client/models.yml) file. If you want to update this file with the latest information from `open_router.ai`, you must have a valid API key.
+
+ You can still use the included `models.yml` file with the `AiClient::LLM` class. The following sections describe the convenient instance and class methods that are available. See the section on `AiClient::LLM` for complete details.
+
+ ##### Instance Methods
+
+ - **`model_details`**: Retrieves details for the current model. Returns a hash containing the model's attributes or `nil` if not found.
+
+ ```ruby
+ client = AiClient.new('gpt-3.5-turbo')
+ details = client.model_details
+ details #=>
+ {
+   :id => "openai/gpt-3.5-turbo",
+   :name => "OpenAI: GPT-3.5 Turbo",
+   :created => 1685232000,
+   :description => "GPT-3.5 Turbo is OpenAI's fastest model. It can understand and generate natural language or code, and is optimized for chat and traditional completion tasks.\n\nTraining data up to Sep 2021.",
+   :context_length => 16385,
+   :architecture => {
+     "modality" => "text->text",
+     "tokenizer" => "GPT",
+     "instruct_type" => nil
+   },
+   :pricing => {
+     "prompt" => "0.0000005",
+     "completion" => "0.0000015",
+     "image" => "0",
+     "request" => "0"
+   },
+   :top_provider => {
+     "context_length" => 16385,
+     "max_completion_tokens" => 4096,
+     "is_moderated" => true
+   },
+   :per_request_limits => {
+     "prompt_tokens" => "40395633",
+     "completion_tokens" => "13465211"
+   }
+ }
+ ```
+
+ - **`models`**: Retrieves model names for the current provider. Returns an array of strings with the names of the models.
+
+ ```ruby
+ models = client.models
+ ```
+
+ ##### Class Methods
+
+ - **`providers`**: Retrieves all available providers. Returns an array of unique symbols representing provider names.
+
+ ```ruby
+ available_providers = AiClient.providers
+ ```
+
+ - **`models(substring = nil)`**: Retrieves model IDs, optionally filtered by a substring.
+
+ ```ruby
+ available_models = AiClient.models('turbo')
+ ```
+
+ - **`model_details(model_id)`**: Retrieves details for a specific model using its ID. Accepts a string representing the model ID. Returns a hash containing the model's attributes or `nil` if not found.
+
+ ```ruby
+ model_info = AiClient.model_details('openai/gpt-3.5-turbo')
+ ```
+
+ - **`reset_llm_data`**: Resets the LLM data with the available ORC models. Returns `void`.
+
+ ```ruby
+ AiClient.reset_llm_data
+ ```
+
+ ### AiClient::LLM Data Table
+
+ The `AiClient::LLM` class serves as a central point for managing information about large language models (LLMs) available via the `open_router.ai` API service. The YAML file (`models.yml`) contains information about various LLMs and their providers. To update this information to the latest available, you must have an access API key for the `open_router.ai` service.
 
- See [blog post](https://ksylvest.com/posts/2024-08-16/using-omniai-to-leverage-tools-with-llms) by Kevin Sylvestre, author of the OmniAI gem.
+ `AiClient::LLM` is a subclass of `ActiveHash::Base`, enabling it to behave like, and interact with, `ActiveRecord::Base`-defined models. Each entry in this data store is uniquely identified by an `id` in the pattern "provider/model", all lowercase, without spaces.
 
- Take a look at the [examples/tools.rb](examples/tools.rb) file to see different ways in which these callable processes can be defined.
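The "provider/model" id convention used throughout this data table can be demonstrated with a one-line split, mirroring what the `model` and `provider` helpers do:

```ruby
# The id convention: "provider/model", all lowercase.
id = 'openai/gpt-3.5-turbo'
provider, model = id.split('/', 2)   # split on the first "/" only

puts provider.to_sym.inspect  # => :openai
puts model                    # => gpt-3.5-turbo
```

The `2` limit on `split` keeps any further slashes inside the model name, which matters for ids like `meta-llama/llama-3.1`.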
+ ##### Key Features
+
+ - **Model and Provider Extraction**:
+   - The class provides methods to extract the model name and provider from the LLM's ID.
+   - The `model` method returns the model name derived from the ID.
+   - The `provider` method extracts the provider name as a Symbol.
+
+ ##### Class Methods
+
+ - **`reset_llm_data`**:
+   - A class-level method that fetches the latest model data from the open_router.ai service and updates the `models.yml` file accordingly.
+
+ ##### Instance Methods
+
+ - **`model`**:
+   - Returns the name of the model derived from the LLM's ID.
+
+ ```ruby
+ llm_instance = AiClient::LLM.find('openai/gpt-3.5-turbo')
+ puts llm_instance.model # Output: gpt-3.5-turbo
+ ```
+
+ - **`provider`**:
+   - Returns the provider associated with the LLM's ID, as a Symbol.
+
+ ```ruby
+ llm_instance = AiClient::LLM.find('openai/gpt-3.5-turbo')
+ llm_instance.provider #=> :openai
+ ```
+
+ ##### Usage Example
+
+ The `AiClient::LLM` class is predominantly used to interact with different providers of LLMs. By utilizing the `model` and `provider` methods, users can seamlessly retrieve and utilize models in their applications.
+
+ ```ruby
+ llm_instance = AiClient::LLM.find('google/bard')
+ puts "Model: #{llm_instance.model}, Provider: #{llm_instance.provider}"
+ ```
+
+ ##### Integration with ActiveHash
+
+ The `AiClient::LLM` class inherits from `ActiveHash::Base`, which provides an easy way to define a set of data and allows for lookups and easy manipulation of the data structure. The use of ActiveHash makes it easier to manage the LLM data effectively without needing a full database.
 
 
  ## Best ?? Practices
@@ -302,6 +520,7 @@ AI.speak "warning Will Robinson! #{bad_things_happened}"
 
  Using the constant for the instance allows you to reference the same client instance inside any method throughout your application. Of course it does not apply to only one instance: you could assign multiple instances for different models/providers. For example, you could have `AI` for your primary client and `AIbackup` for a fallback client in case you have a problem on the primary; or maybe `Vectorizer` as a client name tied to a model specializing in embedding vectorization.
 
+
  ## OmniAI and OpenRouter
 
  Both OmniAI and OpenRouter have similar goals - to provide a common interface to multiple providers and LLMs. OmniAI is a Ruby gem that supports specific providers directly using a common-ish API. You incur costs directly from those providers for which you have individual API keys (aka access tokens). OpenRouter, on the other hand, is a web service that also establishes a common API for many providers and models; however, OpenRouter adds a small fee on top of the fee charged by those providers. You trade off cost for flexibility. With OpenRouter you only need one API key (`OPEN_ROUTER_API_KEY`) to access all of its supported services.
data/examples/tools.rb CHANGED
@@ -1,90 +1,123 @@
  #!/usr/bin/env ruby
  # examples/tools.rb
- # See: https://ksylvest.com/posts/2024-08-16/using-omniai-to-leverage-tools-with-llms
+ #
+ # Uses the AiClient::Function class to encapsulate the
+ # tools used as callback functions when specified in a
+ # chat prompt.
+
 
  require_relative 'common'
 
  AI = AiClient.new('gpt-4o')
 
- box "omniai-openai's random temp example"
+ box "Random Weather (temperature) Example"
+ title "Uses two named parameters to the callback function"
+
+ # Example subclass implementation
+ class WeatherFunction < AiClient::Function
+   def self.call(location:, unit: 'Celsius')
+     "#{rand(20..50)}° #{unit} in #{location}"
+   end
 
- my_weather_function = Proc.new do |location:, unit: 'Celsius'|
-   "#{rand(20..50)}° #{unit} in #{location}"
+   # Encapsulates registration details for the function
+   def self.details
+     # SMELL: reconcile register_tool and details
+     {
+       name: 'weather', # Must be a String
+       description: "Lookup the weather in a location",
+       parameters: AiClient::Tool::Parameters.new(
+         properties: {
+           location: AiClient::Tool::Property.string(description: 'e.g. Toronto'),
+           unit: AiClient::Tool::Property.string(enum: %w[Celsius Fahrenheit]),
+         },
+         required: [:location]
+       )
+     }
+   end
  end
 
- weather = AiClient::Tool.new(
-   my_weather_function,
-   name: 'weather',
-   description: 'Lookup the weather in a location',
-   parameters: AiClient::Tool::Parameters.new(
-     properties: {
-       location: AiClient::Tool::Property.string(description: 'e.g. Toronto'),
-       unit: AiClient::Tool::Property.string(enum: %w[Celsius Fahrenheit]),
-     },
-     required: %i[location]
-   )
- )
+ # Register the tool for WeatherFunction
+ WeatherFunction.register
 
  simple_prompt = <<~TEXT
    What is the weather in "London" in Celsius and "Paris" in Fahrenheit?
    Also what are some ideas for activities in both cities given the weather?
  TEXT
 
- response = AI.chat(simple_prompt, tools: [weather])
+ response = AI.chat(
+   simple_prompt,
+   tools: ['weather'] # must match the details[:name] value
+ )
+
  puts response
 
+
  ##########################################
  box "Accessing a database to get information"
+ title "Uses one named parameter to the callback function"
 
- llm_db_function = Proc.new do |params|
-   records = AiClient::LLM.where(id: /#{params[:model_name]}/i)
-   records.inspect
+ class LLMDetailedFunction < AiClient::Function
+   def self.call(model_name:)
+     record = AiClient::LLM.where(id: /#{model_name}/i)&.first
+     record.inspect
+   end
+
+   def self.details
+     {
+       name: 'llm_db',
+       description: 'lookup details about an LLM model name',
+       parameters: AiClient::Tool::Parameters.new(
+         properties: {
+           model_name: AiClient::Tool::Property.string
+         },
+         required: %i[model_name]
+       )
+     }
+   end
  end
 
+ # Registering the LLM detail function
+ LLMDetailedFunction.register
 
- llm_db = AiClient::Tool.new(
-   llm_db_function,
-   name: 'llm_db',
-   description: 'lookup details about an LLM model name',
-   parameters: AiClient::Tool::Parameters.new(
-     properties: {
-       model_name: AiClient::Tool::Property.string
-     },
-     required: %i[model_name]
-   )
- )
+ simple_prompt = <<~PROMPT
+   Get the details on an LLM model named 'bison' from the models database
+   of #{AiClient::LLM.count} models. Format the details for the model
+   using markdown. Format pricing information in terms of number of
+   tokens per US dollar.
+ PROMPT
 
- response = AI.chat("Get details on an LLM model named bison. Which one is the cheapest per prompt token.", tools: [llm_db])
+ response = AI.chat(simple_prompt, tools: ['llm_db'])
  puts response
 
- ##########################################
 
- # TODO: Look at creating a better function
- #       process such that the tools parameter
- #       is an Array of Symbols which is
- #       maintained as a class variable.
- #       The symboles are looked up and the
- #       proper instance is inserted in its
- #       place.
+
+ ##########################################
 
  box "Using a function class and multiple tools"
+ title "Callback function has no parameters; but uses two functions"
 
- class FunctionClass
+ class PerfectDateFunction < AiClient::Function
    def self.call
-     "April 25th its not to hot nor too cold."
+     "April 25th, it's not too hot nor too cold."
    end
 
-   def function(my_name)
-     AiClient::Tool.new(
-       self.class, # with a self.call method
-       name: my_name,
-       description: 'what is the perfect date'
-     )
+   def self.details
+     {
+       name: 'perfect_date',
+       description: 'what is the perfect date'
+     }
    end
  end
 
- perfect_date = FunctionClass.new.function('perfect_date')
+ # Registering the perfect date function
+ PerfectDateFunction.register
 
- response = AI.chat("what is the perfect date for paris weather?", tools: [weather, perfect_date])
+ response = AI.chat("what is the perfect date for current weather in Paris?",
+                    tools: %w[weather perfect_date])
  puts response
  puts
+
+ debug_me{[
+   #'AiClient::Function.registry',
+   'AiClient::Function.functions'
+ ]}
@@ -10,7 +10,19 @@ class AiClient
  #   tools: @tools [Array<OmniAI::Tool>] optional
  #   temperature: @temperature [Float, nil] optional
 
- def chat(messages, **params)
+ def chat(messages, **params)
+   if params.has_key? :tools
+     tools = params[:tools]
+     if tools.is_a? Array
+       tools.map!{|function_name| AiClient::Function.registry[function_name]}
+     elsif true == tools
+       tools = AiClient::Function.registry.values
+     else
+       raise ArgumentError, "tools: must be an Array of function names or true"
+     end
+     params[:tools] = tools
+   end
+
    result = call_with_middlewares(:chat_without_middlewares, messages, **params)
    @last_response = result
    raw? ? result : content
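Stripped of the gem's plumbing, the branch logic added to `chat` above is a name-to-object lookup. The sketch below is runnable on its own; the `REGISTRY` hash contents are invented stand-ins for registered `AiClient::Tool` instances:

```ruby
# REGISTRY stands in for AiClient::Function.registry; contents are illustrative.
REGISTRY = { 'weather' => :weather_tool, 'llm_db' => :llm_db_tool }

# Mirrors the three branches in chat: Array of names, literal true, or an error.
def resolve_tools(params)
  return params unless params.key?(:tools)

  tools = params[:tools]
  tools =
    if tools.is_a?(Array)
      tools.map { |name| REGISTRY[name] }   # look each name up in the registry
    elsif tools == true
      REGISTRY.values                       # true means "enable every registered tool"
    else
      raise ArgumentError, "tools: must be an Array of function names or true"
    end

  params.merge(tools: tools)
end

p resolve_tools(tools: ['weather'])
p resolve_tools(tools: true)
```

Note that the callers in examples/tools.rb pass string names (`tools: ['weather']`), which this lookup turns back into the tool objects the underlying client expects.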
@@ -0,0 +1,100 @@
+ # lib/ai_client/function.rb
+
+ class AiClient
+
+   # The Function class serves as a base class for creating functions
+   # that can be registered and managed within the AiClient.
+   #
+   # Subclasses must implement the `call` and `details` methods
+   # to define their specific behavior and properties.
+   #
+   class Function
+     @@registry = {} # keyed by the known_by name (from details); value is the AiClient::Tool
+
+     class << self
+
+       # Calls the function with the provided parameters.
+       #
+       # @param params [Hash] Named parameters required by the function.
+       # @raise [NotImplementedError] if not implemented in the subclass.
+       #
+       def call(**params)
+         raise NotImplementedError, "You must implement the call method"
+       end
+
+       # Provides the details about the function including its metadata.
+       #
+       # @return [Hash] Metadata containing details about the function.
+       # @raise [NotImplementedError] if not implemented in the subclass.
+       #
+       def details
+         raise NotImplementedError, "You must implement the details method"
+       end
+
+
+       # Registers a tool with the specified properties and parameters.
+       #
+       # This method creates an instance of AiClient::Tool with the
+       # function class and its details and adds it to the registry.
+       #
+       def register
+         this_tool = AiClient::Tool.new(
+           self, # This is the sub-class
+           **details
+         )
+
+         registry[known_by] = this_tool
+       end
+       alias_method :enable, :register
+
+
+       # Disables the function by removing its name from the registry.
+       #
+       # @return [void]
+       #
+       def disable
+         registry.delete(known_by)
+       end
+
+
+       # Returns a list of enabled functions.
+       #
+       # @return [Array<String>] Sorted list of function names.
+       #
+       def functions
+         registry.keys.sort
+       end
+
+
+       # Returns the registry of currently registered functions.
+       #
+       # @return [Hash] The registry of registered functions.
+       #
+       def registry
+         @@registry
+       end
+
+
+       # Returns the name under which the function is known.
+       #
+       # @return [String] The function's name from details[:name].
+       #
+       def known_by
+         details[:name]
+       end
+
+
+       private
+
+       # Converts the class name to snake_case (e.g., MyFunction -> "my_function").
+       #
+       # @return [String] The function name derived from the class name.
+       #
+       def function_name
+         name.split('::').last.gsub(/([a-z])([A-Z])/, '\1_\2').downcase
+       end
+     end
+   end
+ end
+
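The private `function_name` helper above performs a minimal CamelCase-to-snake_case conversion. Extracted on its own (with an explicit argument instead of the class's `name`), it behaves like this:

```ruby
# Same regex as the helper: insert "_" at each lower-to-upper boundary, then downcase.
def function_name(klass_name)
  klass_name.split('::').last.gsub(/([a-z])([A-Z])/, '\1_\2').downcase
end

puts function_name('WeatherFunction')               # => weather_function
puts function_name('AiClient::PerfectDateFunction') # => perfect_date_function
```

Note this simple regex only handles lower-to-upper boundaries; runs of capitals (e.g. `LLMFunction`) are not split the way a full inflector would.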
data/lib/ai_client/llm.rb CHANGED
@@ -9,6 +9,13 @@ class AiClient
    DATA_PATH = Pathname.new( __dir__ + '/models.yml')
    self.data = YAML.parse(DATA_PATH.read).to_ruby
 
+   scope :providers, -> { all.map(&:provider).uniq.map(&:to_sym) }
+
+   scope :models, ->(substring = nil) do
+     (substring.nil? ? all : all.where(id: /#{substring}/i))
+       .map(&:model).sort.uniq
+   end
+
    # Extracts the model name from the LLM ID.
    #
    # @return [String] the model name.
@@ -17,9 +24,12 @@ class AiClient
 
    # Extracts the provider name from the LLM ID.
    #
-   # @return [String] the provider name.
+   # @return [Symbol] the provider name.
    #
-   def provider = id.split('/')[0]
+   def provider = id.split('/')[0].to_sym
+
+   def to_h = attributes
+
  end
 
  class << self
@@ -34,5 +44,6 @@ class AiClient
      AiClient::LLM.data = orc_models
      AiClient::LLM::DATA_PATH.write(orc_models.to_yaml)
    end
+
  end
  end
@@ -1,6 +1,6 @@
  # lib/ai_client/middleware.rb
 
- # TODO: As concurrently designed the middleware must
+ # TODO: As currently designed the middleware must
  #       be set before an instance of AiClient is created.
  #       Any `use` commands for middleware made after
  #       the instance is created will not be available
@@ -1,58 +1,74 @@
1
1
  # lib/ai_client/open_router_extensions.rb
2
- # frozen_string_literal: true
3
2
 
4
- # These extensions to AiClient are only available with
5
- # a valid API Key for the open_router.ai web-service
3
+ # OpenRouter Extensions for AiClient
4
+ #
5
+ # This file adds several public instance and class methods to the AiClient class
6
+ # to provide information about AI models and providers.
7
+ #
8
+ # Instance Methods:
9
+ # - model_details: Retrieves details for the current model.
10
+ # - models: Retrieves model names for the current provider.
11
+ #
12
+ # Class Methods:
13
+ # - providers: Retrieves all available providers.
14
+ # - models: Retrieves model names, optionally filtered by provider.
15
+ # - model_details: Retrieves details for a specific model.
16
+ #
17
+ # These methods utilize the AiClient::LLM class and the models.yml file
18
+ # for model information.
6
19
 
7
20
  require 'open_router'
8
21
  require 'yaml'
9
22
 
10
23
  class AiClient
11
-
12
- # Retrieves the available models.
13
- #
14
- # @return [Array<String>] List of model IDs.
15
- #
16
- def models
17
- self.class.models
18
- end
19
24
 
20
- # Retrieves the available providers.
25
+ # Retrieves details for the current model.
21
26
  #
22
- # @return [Array<String>] List of provider names.
23
- def providers
24
- self.class.providers
27
+ # @return [Hash, nil] Details of the current model or nil if not found.
28
+ def model_details
29
+ id = "#{@provider}/#{@model}"
30
+ LLM.find(id.downcase)
25
31
  end
26
32
 
27
- # Retrieves model names, optionally filtered by provider.
33
+ # Retrieves model names for the current provider.
28
34
  #
29
- # @param provider [String, nil] The provider to filter models by.
30
- # @return [Array<String>] List of model names.
31
- def model_names(provider = nil)
32
- self.class.model_names(provider)
33
- end
35
+ # @return [Array<String>] List of model names for the current provider.
36
+ def models = LLM.models(@provider)
34
37
 
35
- # Retrieves details for a specific model.
36
- #
37
- # @param a_model [String] The model ID to retrieve details for.
38
- # @return [Hash, nil] Details of the model or nil if not found.
39
- def model_details(a_model)
40
- self.class.model_details(a_model)
41
- end
42
38
 
43
- # Finds models matching a given substring.
44
- #
45
- # @param a_model_substring [String] The substring to search for.
46
- # @return [Array<String>] List of matching model names.
47
- def find_model(a_model_substring)
48
- self.class.find_model(a_model_substring)
49
- end
39
+ class << self
50
40
 
41
+ # Retrieves all available providers.
42
+ #
43
+ # @return [Array<Symbol>] List of all provider names.
44
+ def providers = LLM.providers
51
45
 
52
- class << self
53
-
54
- # Adds OpenRouter extensions to AiClient.
46
+
47
+ # Retrieves model names, optionally filtered by provider.
48
+ #
49
+ # @param substring [String, nil] Optional substring to filter models by.
50
+ # @return [Array<String>] List of model names.
51
+ def models(substring = nil) = LLM.models(substring)
52
+
53
+ # Retrieves details for a specific model.
54
+ #
55
+  # @param model_id [String] The model ID to retrieve details for,
+  #   in the pattern "provider/model".downcase
+  # @return [AiClient::LLM, nil] Details of the model or nil if not found.
+  def model_details(model_id) = LLM.find(model_id.downcase)
+
+
+  # Resets LLM data with the available ORC models.
+  #
+  # @return [void]
   #
+  def reset_llm_data = LLM.reset_llm_data
+
+
+  # Initializes OpenRouter extensions for AiClient.
+  #
+  # This sets up the access token and initializes the ORC client.
+  #
   # @return [void]
   #
   def add_open_router_extensions
@@ -64,7 +80,10 @@ class AiClient
     initialize_orc_client
   end
 
-  # Retrieves ORC client instance.
+
+  private
+
+  # Retrieves the ORC client instance.
   #
   # @return [OpenRouter::Client] Instance of the OpenRouter client.
   #
@@ -80,59 +99,7 @@ class AiClient
     @orc_models ||= orc_client.models
   end
 
-  # TODO: Refactor these DB like methods to take
-  #       advantage of AiClient::LLM
-
-  # Retrieves model names associated with a provider.
-  #
-  # @param provider [String, nil] The provider to filter models by.
-  # @return [Array<String>] List of model names.
-  #
-  def model_names(provider=nil)
-    model_ids = models.map { _1['id'] }
-
-    return model_ids unless provider
-
-    model_ids.filter_map { _1.split('/')[1] if _1.start_with?(provider.to_s.downcase) }
-  end
-
-  # Retrieves details of a specific model.
-  #
-  # @param model [String] The model ID to retrieve details for.
-  # @return [Hash, nil] Details of the model or nil if not found.
-  #
-  def model_details(model)
-    orc_models.find { _1['id'].include?(model) }
-  end
-
-  # Retrieves the available providers.
-  #
-  # @return [Array<String>] List of unique provider names.
-  #
-  def providers
-    @providers ||= models.map{ _1['id'].split('/')[0] }.sort.uniq
-  end
-
-  # Finds models matching a given substring.
-  #
-  # @param a_model_substring [String] The substring to search for.
-  # @return [Array<String>] List of matching model names.
-  #
-  def find_model(a_model_substring)
-    model_names.select{ _1.include?(a_model_substring) }
-  end
-
-  # Resets LLM data with the available ORC models.
-  #
-  # @return [void]
-  #
-  def reset_llm_data
-    LLM.data = orc_models
-    LLM::DATA_PATH.write(orc_models.to_yaml)
-  end
-
 
-  private
 
   # Fetches the access token from environment variables.
   #
@@ -154,10 +121,9 @@ class AiClient
     OpenRouter.configure { |config| config.access_token = access_token }
   end
 
-  # Initializes the ORC client.
-  #
-  # @return [void]
+  # Initializes the ORC client instance.
   #
+  # @return [OpenRouter::Client] Instance of the OpenRouter client.
   def initialize_orc_client
     @orc_client ||= OpenRouter::Client.new
   end
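The refactor above replaces hand-rolled lookup methods with one-liners that delegate to `AiClient::LLM`. Both the old and new code lean on two newer Ruby idioms: endless method definitions (`def m(x) = expr`, Ruby 3.0+) and the `_1` numbered block parameter. A standalone sketch of those idioms, using an invented `FakeLLM` stand-in rather than the gem's real `AiClient::LLM` model:

```ruby
# Standalone illustration of the idioms in this diff: Ruby 3 endless method
# definitions and the _1 numbered block parameter. FakeLLM is a hypothetical
# stand-in for AiClient::LLM, not the gem's actual class.
class FakeLLM
  RECORDS = [
    { 'id' => 'openai/gpt-3.5-turbo' },
    { 'id' => 'openai/gpt-4o' },
    { 'id' => 'anthropic/claude-3-haiku' }
  ].freeze

  # Same shape as `def model_details(model_id) = LLM.find(model_id.downcase)`
  def self.find(model_id) = RECORDS.find { _1['id'] == model_id.downcase }

  # Same shape as the removed providers helper.
  def self.providers = RECORDS.map { _1['id'].split('/')[0] }.sort.uniq
end

p FakeLLM.find('OpenAI/GPT-4o')  # case-insensitive lookup via downcase
p FakeLLM.providers              # => ["anthropic", "openai"]
```

Note that endless method definitions require Ruby 3.0 or later; numbered block parameters arrived in 2.7.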
@@ -1,14 +1,11 @@
 # lib/ai_client/tool.rb
 
-# TODO:  Turn this into a Function class using the pattern
-#        in examples/tools.rb
-#        put the function names as symbols into a class Array
-#        In the AiClient class transform the tools: []
-#        parameter from an Array of Symbols into an Array
-#        of FUnction instances.
-
 class AiClient::Tool < OmniAI::Tool
 
+  # TODO: Is there any additional functionality that
+  #       needs to be added to the Tool class that would
+  #       be helpful?
+
   def xyzzy = self.class.xyzzy
 
   class << self
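The TODO removed above was resolved by the new `AiClient::Function` class, which (per the 0.3.0 changelog) encapsulates the callback functions used as tools in chats. A hypothetical sketch of that "function as tool" pattern — the class names and registry API below are invented for illustration and are not `AiClient::Function`'s actual interface:

```ruby
# Hypothetical sketch of the "function as tool" pattern the 0.3.0 changelog
# describes: subclasses register themselves under a symbolic name so a chat
# call can turn an array of symbols into callable tool instances.
# These names are invented; they are not AiClient::Function's real API.
class SketchFunction
  def self.registry = (@registry ||= {})

  # Subclasses register an instance under a symbolic name.
  def self.register(name)
    SketchFunction.registry[name] = new
  end

  # Resolve a `tools: [:weather]`-style symbol list into callable instances.
  def self.resolve(names) = names.map { SketchFunction.registry.fetch(_1) }
end

class WeatherFunction < SketchFunction
  def call(city:) = "Sunny in #{city}"
end

WeatherFunction.register(:weather)
tools = SketchFunction.resolve([:weather])
puts tools.first.call(city: 'Paris')  # prints "Sunny in Paris"
```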
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 class AiClient
-  VERSION = "0.2.5"
+  VERSION = "0.3.1"
 end
data/lib/ai_client.rb CHANGED
@@ -1,6 +1,17 @@
 # ai_client.rb
-# WIP: a generic client to access LLM providers
-#      kinda like the SaaS "open router"
+
+# A generic client to access various LLM providers
+# Inspired by the SaaS "open router" concept
+
+
+# AiClient: A unified interface for interacting with various LLM providers
+#
+# Usage:
+#   client = AiClient.new('gpt-3.5-turbo')
+#
+# Add middlewares:
+#   AiClient.use(RetryMiddleware.new(max_retries: 5, base_delay: 2, max_delay: 30))
+#   AiClient.use(LoggingMiddleware.new(AiClient.configuration.logger))
 #
 
 unless defined?(DebugMe)
@@ -29,6 +40,7 @@ require_relative 'ai_client/middleware'
 require_relative 'ai_client/open_router_extensions'
 require_relative 'ai_client/llm'  # SMELL: must come after the open router stuff
 require_relative 'ai_client/tool'
+require_relative 'ai_client/function'
 
 # Create a generic client instance using only model name
 #   client = AiClient.new('gpt-3.5-turbo')
@@ -133,29 +145,30 @@ class AiClient
     @last_response = nil
   end
 
-  # TODO: Review these raw-ish methods are they really needed?
-  #       raw? should be a private method ??
-
-  # Returns the last response received from the client.
-  #
-  # @return [OmniAI::Response] The last response.
-  #
-  def response = last_response
 
   # Checks if the client is set to return raw responses.
   #
   # @return [Boolean] True if raw responses are to be returned.
-  def raw? = config.return_raw
-
+  def raw?
+    config.return_raw
+  end
+
 
   # Sets whether to return raw responses.
   #
   # @param value [Boolean] The value to set for raw responses return.
-  #
   def raw=(value)
     config.return_raw = value
   end
 
+
+  # Returns the last response received from the client.
+  #
+  # @return [OmniAI::Response] The last response.
+  #
+  def response = last_response
+
+
   # Extracts the content from the last response based on the provider.
   #
   # @return [String] The extracted content.
@@ -287,5 +300,3 @@ class AiClient
     raise(ArgumentError, "Unsupported model: #{model}")
   end
 end
-
-
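The new header comments above document the `AiClient.use(...)` middleware hooks. A minimal, self-contained sketch of that registration-and-wrap pattern — the middleware class names come from the comments, but this registry and call-chain protocol is an assumption for illustration, not the gem's actual internals:

```ruby
# Minimal sketch of the `AiClient.use(middleware)` idea described in the new
# header comments. The call/chain protocol here is assumed for illustration;
# it is not AiClient's real implementation.
class MiniClient
  def self.middlewares = (@middlewares ||= [])

  # Register a middleware, outermost first -- mirrors AiClient.use(...).
  def self.use(middleware) = middlewares << middleware

  # Wrap the core request with each registered middleware.
  def chat(prompt)
    core = ->(p) { "echo: #{p}" }
    chain = self.class.middlewares.reverse.reduce(core) do |nxt, mw|
      ->(p) { mw.call(p, &nxt) }
    end
    chain.call(prompt)
  end
end

# A toy logging middleware; it yields to the next link in the chain.
class ToyLogging
  def call(prompt)
    warn "request: #{prompt}"
    yield(prompt).tap { |r| warn "response: #{r}" }
  end
end

MiniClient.use(ToyLogging.new)
puts MiniClient.new.chat('hello')  # prints "echo: hello"
```

Each middleware receives the prompt plus a block for the rest of the chain, so it can act before and after the request, the same way Rack middleware wraps an app.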
metadata CHANGED
@@ -1,14 +1,14 @@
 --- !ruby/object:Gem::Specification
 name: ai_client
 version: !ruby/object:Gem::Version
-  version: 0.2.5
+  version: 0.3.1
 platform: ruby
 authors:
 - Dewayne VanHoozer
 autorequire:
 bindir: bin
 cert_chain: []
-date: 2024-10-11 00:00:00.000000000 Z
+date: 2024-10-20 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: active_hash
@@ -226,6 +226,7 @@ files:
 - lib/ai_client/config.yml
 - lib/ai_client/configuration.rb
 - lib/ai_client/embed.rb
+- lib/ai_client/function.rb
 - lib/ai_client/llm.rb
 - lib/ai_client/logger_middleware.rb
 - lib/ai_client/middleware.rb
@@ -261,7 +262,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
   - !ruby/object:Gem::Version
     version: '0'
 requirements: []
-rubygems_version: 3.5.21
+rubygems_version: 3.5.22
 signing_key:
 specification_version: 4
 summary: A generic AI Client for many providers