ai_client 0.4.1 → 0.4.3

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: a6a570561363cbf57df995b7154be4214cb7e7488f0e826bd92b9d523affe52b
- data.tar.gz: 42c3545fd727a240261c7df1d39528fcb5bf8c6cf6dce5820763d3cef22ccd20
+ metadata.gz: 602b3197f09d1a0cb0525073ee456c54e6de6f835488d36c0ebf56259b6a72e9
+ data.tar.gz: 59c7d27f2c56fff21041cdd68ed09e3f3e624b7dead6059871dfe54cfcd46e58
  SHA512:
- metadata.gz: a20e587c840fc83a5a3253cead493b15672d64ed0994245da1f54b2fc0e74852170637ff7c2d745b1cec5776cdfc3716bd210a664667632db3700075c5b974d5
- data.tar.gz: 2e25503d39cbb956905c46cea2c519d6b086a0887469bd5b7c237040063aac2c886bdd57ffca365376ad908b53f407c3ef090a731f83551452dbcb4a782105a6
+ metadata.gz: f98d17165dc760a2a4f81293a344b4edf65521e22228e8d908c809bd0c941c7c102c4b9de71779f7b9e2587278ded7e440b9b018fd4ef7054c4ab6d0ce2c563d
+ data.tar.gz: be3e0798c3894579417907732a9fd867d6d1eeae3de9dbe2dad1e89b2eb29146978fe015784920186773373b57dfff7fa8434a853e4e60668336066e2a9e1e81
data/.envrc CHANGED
@@ -1,3 +1,6 @@
  # ai_client/,envrc

  export RR=`pwd`
+
+ PATH_add $RR/bin
+
data/CHANGELOG.md CHANGED
@@ -1,5 +1,16 @@
  ## [Unreleased]

+ ### [0.4.3] - 2025-01-05
+ - updated models.yml file with latest dump from open_router
+
+ ### [0.4.2] - 2025-01-05
+ - updated models.yml file with latest dump from open_router
+ - noted some SMELLS w/r/t ollama processing from a host that is not localhost
+ - updated test cases for new open_router model dump
+ - increasing test coverage
+ - tweaking stuff
+
+
  ## Released

  ### [0.4.1] - 2024-10-21
data/README.md CHANGED
@@ -22,8 +22,10 @@ You should also checkout the [raix gem](https://github.com/OlympiaAI/raix). I l
  - [api_key: Parameter](#api_key-parameter)
  - [provider: Parameter](#provider-parameter)
  - [Usage](#usage)
+ - [Basic Top-level Methods](#basic-top-level-methods)
+ - [Utility Methods](#utility-methods)
  - [Configuration](#configuration)
- - [Default Configuration](#default-configuration)
+ - [Default Configuration](#default-configuration)
  - [Class Configuration](#class-configuration)
  - [1. Class Configuration Block](#1-class-configuration-block)
  - [2. Set by a Config File](#2-set-by-a-config-file)
@@ -58,6 +60,7 @@ You should also checkout the [raix gem](https://github.com/OlympiaAI/raix). I l
  - [Integration with ActiveHash](#integration-with-activehash)
  - [Best ?? Practices](#best--practices)
  - [OmniAI and OpenRouter](#omniai-and-openrouter)
+ - [The `chatbot` Method](#the-chatbot-method)
  - [Contributing](#contributing)
  - [License](#license)

@@ -66,15 +69,18 @@ You should also checkout the [raix gem](https://github.com/OlympiaAI/raix). I l

  ## Summary

- Are you ready to supercharge your applications with cutting-edge AI capabilities? Introducing `ai_client`, the ultimate Ruby gem that provides a seamless interface for interacting with a multitude of AI service providers through a single, unified API.
+ AiClient is a Ruby gem that simplifies integration with AI services by providing:

- With `ai_client`, you can effortlessly integrate large language models (LLMs) into your projects—simply specify the model name and let the gem handle the rest! Say goodbye to tedious configuration and hello to rapid development.
+ - Unified interface across multiple LLM providers
+ - Automatic provider detection from model names
+ - Context management for chat conversations
+ - Function calling (tools) support
+ - Middleware for logging, retries, and customization
+ - Batch processing for embeddings
+ - Text-to-speech and transcription
+ - Interactive chatbot mode

- This gem comes packed with built-in support for leading AI providers, including OpenAI, Anthropic, Google, Mistral, LocalAI, and Ollama. Whether you need to implement chatbots, transcription services, speech synthesis, or embeddings, `ai_client` abstracts the complexities of API interactions, allowing you to focus on what truly matters: building amazing applications.
-
- Plus, with its flexible middleware architecture, you can easily customize request and response processing—implement logging, retry logic, and more with minimal effort. And thanks to its seamless integration with the `OmniAI` framework, you can leverage the latest AI advancements without worrying about vendor lock-in.
-
- Join the growing community of developers who are transforming their applications with `ai_client`. Install it today and unlock the full potential of AI in your projects!
+ The gem supports leading providers including OpenAI, Anthropic, Google, Mistral, LocalAI, and Ollama. Whether you're building a chatbot, implementing RAG, or need speech services, AiClient handles the complexity so you can focus on your application logic.

  ## Installation

@@ -139,7 +145,35 @@ To explicitly designate a provider to use with an AiClient instance use the para

  ## Usage

- Basic usage:
+ ### Basic Top-level Methods
+
+ ```ruby
+ require 'ai_client'
+
+ # Simple chat with default model
+ ai = AiClient.new
+ response = ai.chat('Hello!')
+
+ # Interactive chatbot mode
+ ai.chatbot('Start conversation')
+
+ # Batch embeddings
+ texts = ['text1', 'text2', 'text3']
+ embeddings = ai.batch_embed(texts)
+
+ # Text-to-speech
+ audio = ai.speak("Convert this to speech")
+
+ # Transcription
+ text = ai.transcribe("path/to/audio.mp3")
+
+ # Function calling
+ WeatherFunction.register
+ response = ai.chat("What's the weather?", tools: ['weather'])
+ ```
+
+
+ ### Utility Methods

  ```ruby
  require 'ai_client'
@@ -166,19 +200,12 @@ You can specify which model you want to use and `AiClient` will use the provider


  ```ruby
- AI = AiClient.new('gpt-4o-mini') # sets provider to :openai
+ ai = AiClient.new('gpt-4o-mini') # sets provider to :openai
  #
  # If you want to use the open_router.ai service instead of
  # going directly to OpenAI do it this way:
  #
- AI = AiClient.new('openai/gpt-4o-mini') # sets provider to :open_router
- ```
-
- Of course you could specify both the model and the provider that you want to use:
-
-
- ```ruby
- AI = AiClient.new('mistral', provider: :ollama)
+ ai = AiClient.new('openai/gpt-4o-mini') # sets provider to :open_router
  ```

  That's it. What could be simpler? If your application is using more than one model, no worries, just create multiple `AiClient` instances.
@@ -192,14 +219,14 @@ You can also use the `provider:` parameter in the event that the model you want


  ```ruby
- AI = AiClient.new('nomic-embed-text', provider: :ollama)
+ ai = AiClient.new('nomic-embed-text', provider: :ollama)
  ```

  ### Configuration

  There are three levels of configuration, each inheriting from the level above. The following sections describe those configuration levels.

- #### Default Configuration
+ ##### Default Configuration

  The file [lib/ai_client/configuration.rb] hard codes the default configuration. This is used to update the [lib/ai_client/config.yml] file during development. If you have some changes for this configuration please send me a pull request so we can all benefit from your efforts.

@@ -553,9 +580,8 @@ AiClient.reset_llm_data

  ### AiClient::LLM Data Table

- The `AiClient::LLM` class serves as a central point for managing information about large language models (LLMs) available via the `open_router.ai` API service. The YAML file (`models.yml`) contains information about various LLMs and their providers. To update this information to the latest available, you must have an access API key for the `open_router.ai` service.
+ AiClient makes use of the [active_hash](http://github.com/active-hash/active_hash) gem to provide an ActiveRecord-like capability within its AiClient::LLM class. This class provides database-like access to the [models.yml](lib/ai_client/models.yml) YAML file. Using the AiClient::LLM class you have access to details on the models maintained and available through the [open_router.ai](https://open_router.ai) service. With an API key for OpenRouter you can update your local copy of the `models.yml` file.

- `AiClient::LLM` is a subclass of `ActiveHash::Base`, enabling it to act like and interact with `ActiveRecord::Base` defined models. Each entry in this data store is uniquely identified by an `id` in the pattern "provider/model" all lowercase without spaces.

  ##### Key Features

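The database-like access described above can be sketched in plain Ruby. The records and finder names below are illustrative stand-ins; `AiClient::LLM` itself inherits its finders from `ActiveHash::Base` and loads real data from `models.yml`:

```ruby
# Illustrative records; real entries come from lib/ai_client/models.yml and
# use the lowercase "provider/model" id pattern.
MODELS = [
  { id: 'openai/gpt-4o-mini',   provider: 'openai',    name: 'gpt-4o-mini' },
  { id: 'anthropic/claude-2.1', provider: 'anthropic', name: 'claude-2.1' }
].freeze

# Hand-rolled equivalents of the find_by / where finders ActiveHash supplies.
def find_by_id(id)
  MODELS.find { |m| m[:id] == id }
end

def where_provider(provider)
  MODELS.select { |m| m[:provider] == provider }
end
```

With ActiveHash, the same lookups read like ActiveRecord calls (`LLM.find_by(id: ...)`, `LLM.where(provider: ...)`) without a database behind them.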
@@ -623,6 +649,16 @@ Both OmniAI and OpenRouter have similar goals - to provide a common interface to

  The advantage of AiClient is that you have the added flexibility to choose on a client-by-client basis where you want your model to be processed. You get free local processing through Ollama and LocalAI. You get less costly direct access to some providers via OmniAI. You get slightly more costly wide-spread access via OpenRouter.

+ ## The `chatbot` Method
+
+ Just for grins I added a `chatbot` instance method. Try this out in IRB:
+
+ ```ruby
+ require 'ai_client'
+ ai = AiClient.new
+ ai.chatbot
+ ```
+
  ## Contributing

  I can sure use your help. This industry is moving faster than I can keep up with. If you have a bug fix or new feature idea then have at it. Send me a pull request so we all can benefit from your efforts.
data/examples/text.rb CHANGED
@@ -1,51 +1,50 @@
  #!/usr/bin/env ruby
  # examples/text.rb
+ #
+ # This example demonstrates basic text chat functionality with different models
+ # and providers. It shows both default (processed) and raw responses.

  require_relative 'common'

-
  ###################################
- ## Working with Ollama
+ ## Basic Chat with Local Model

- # This is the default configuration which returns
- # text content from the client.
- #
+ title "1. Basic Chat with Local Ollama Model"
+
+ # Configure client to return processed text content
  AiClient.configure do |o|
  o.return_raw = false
  end

- title "Using Mistral model with Ollama locally"
-
+ # Create client using local Ollama model
  ollama_client = AiClient.new('mistral', provider: :ollama)

- puts "\nModel: mistral Provider: Ollama (local)"
+ puts "\nSimple chat example:"
  result = ollama_client.chat('Hello, how are you?')
  puts result

- puts "\nRaw response:"
+ puts "\nRaw response for inspection:"
  puts ollama_client.response.pretty_inspect
  puts

+ ###################################
+ ## Multi-Provider Example

+ title "2. Chat Across Different Providers"

- ###############################################################
- ## Lets look an generic configurations based upon model name ##
- ###############################################################
-
- models = [
+ # Example models from different providers
+ models = [
  'gpt-3.5-turbo', # OpenAI
  'claude-2.1', # Anthropic
  'gemini-1.5-flash', # Google
- 'mistral-large-latest', # Mistral - La Platform
+ 'mistral-large-latest', # Mistral
  ]
- clients = []
-
- models.each do |model|
- clients << AiClient.new(model)
- end

+ # Create clients for each model
+ clients = models.map { |model| AiClient.new(model) }

- title "Default Configuration Response to 'hello'"
+ # Test with default configuration (processed responses)
+ title "Default Configuration (Processed Text)"

  clients.each do |c|
  puts "\nModel: #{c.model} Provider: #{c.provider}"
@@ -53,32 +52,31 @@ clients.each do |c|
  response = c.chat('hello')
  puts response
  rescue => e
- puts e
+ puts "Error: #{e.message}"
  end
  end

  ###################################
+ ## Raw Response Example

+ title "3. Raw Response Example"
+
+ # Configure for raw responses
  AiClient.configure do |o|
  o.return_raw = true
  end

- raw_clients = []
-
- models.each do |model|
- raw_clients << AiClient.new(model)
- end
-
- puts
- title "Raw Configuration Response to 'hello'"
+ # Create new clients with raw configuration
+ raw_clients = models.map { |model| AiClient.new(model) }

+ puts "\nRaw Configuration Responses:"
  raw_clients.each do |c|
  puts "\nModel: #{c.model} Provider: #{c.provider}"
  begin
  result = c.chat('hello')
  puts result.pretty_inspect
  rescue => e
- puts e
+ puts "Error: #{e.message}"
  end
  end

@@ -51,7 +51,6 @@ class AiClient
  result
  end

-
  # Adds context to the current prompt.
  #
  # @param prompt [String, Array<String>] the current prompt.
@@ -82,7 +81,6 @@ class AiClient
  @context = []
  end

-
  # Chats with the client without middleware processing.
  #
  # @param messages [Array<String>] the messages to send.
@@ -94,14 +92,11 @@ class AiClient
  end


- def chatbot
- prompt = "hello"
+ # Just an example ...
+ def chatbot(prompt='hello')
  until prompt.empty? do
  response = chat prompt
- puts
- puts content
- puts
- print "Follow Up: "
+ print "\n\n#{content}\n\nFollow Up: "
  prompt = gets.chomp
  end
  end
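The rewritten `chatbot` above folds three `puts` calls into a single `print` and takes the opening prompt as a parameter. A standalone sketch of the same loop, with a stubbed `chat` call and injectable streams so it can run outside the class (the stub and the keyword parameters are illustrative, not part of AiClient):

```ruby
require 'stringio'

# Stand-in for AiClient#chat / #content; the real client calls the provider.
def stub_chat(prompt)
  "echo: #{prompt}"
end

# Same until-loop shape as the diff's chatbot method, with injectable
# input/output streams so the loop can run non-interactively.
def chatbot(prompt = 'hello', input: $stdin, output: $stdout)
  until prompt.empty?
    content = stub_chat(prompt)
    output.print "\n\n#{content}\n\nFollow Up: "
    prompt = (input.gets || '').chomp
  end
end
```

An empty follow-up line ends the conversation, matching the `until prompt.empty?` guard in the diff.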
@@ -166,7 +166,7 @@ class AiClient
  },
  provider_patterns: {
  anthropic: /^claude/i,
- openai: /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e)/i,
+ openai: /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e|text-embedding)/i,
  google: /^(gemini|gemma|palm)/i,
  mistral: /^(mistral|codestral|mixtral)/i,
  localai: /^local-/i,
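The `provider_patterns` change above adds `text-embedding` to the OpenAI alternation so embedding model names resolve to a provider. A minimal sketch of how such a pattern table can drive provider detection (the pattern hash mirrors the `+` side of the hunk; the lookup method itself is illustrative, not AiClient's actual implementation):

```ruby
# Pattern table mirroring the updated provider_patterns in the diff.
PROVIDER_PATTERNS = {
  anthropic: /^claude/i,
  openai:    /^(gpt|chatgpt|o1|davinci|curie|babbage|ada|whisper|tts|dall-e|text-embedding)/i,
  google:    /^(gemini|gemma|palm)/i,
  mistral:   /^(mistral|codestral|mixtral)/i,
  localai:   /^local-/i
}.freeze

# Return the first provider whose pattern matches the model name, or nil.
def detect_provider(model)
  entry = PROVIDER_PATTERNS.find { |_name, pattern| model.match?(pattern) }
  entry && entry.first
end
```

With the added alternative, a name like `text-embedding-3-small` now matches `:openai` instead of falling through with no provider.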