llm_conductor 1.0.1 → 1.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 00742db531ea95e6247acbb8b7ea7e92619df851b0d558fd33a4f20d8cf23873
-  data.tar.gz: 9492982bc533d2552b9b55f8e0892fc523c2340d75086ade217651bf9d72730a
+  metadata.gz: 77bf332827087c373c19d318c72b21532974ae510b6b7958b8a5d77fc3e85833
+  data.tar.gz: 365717c91358d62d4a44264087d6500829d53ca1a9d1dc4550890282152ad75b
 SHA512:
-  metadata.gz: 8d23aa8d06cc823d77ec4992605208f693a592971d13f88266d587cf06fe78cd42111b7904404cc998215736df1d2b9acf452c5474d070401619d6774a870993
-  data.tar.gz: 154240f949c96860372b6cc6eb097267c1a4e9a52b6da968dc30f53d18069f4034fdadfe12fe38b51e5be99d54f75988f1aa48864b624635ce63077bc7f8da6f
+  metadata.gz: 547cb6b093784591aa0944e36cab482a380a2519fd73324a5dcefaae38fb4eade745f564ada7daee05cd7359515022f49492fee330fc0eee4e592eaf2e5d37bf
+  data.tar.gz: 1118f23b62a725962c7576ba368b5ddadd5fee7f5244f5aea1405d52bacd56708a14cbe50d833afc8e81e6916be8db5c50bb424c238f134957b7965272504944
data/LICENSE ADDED
@@ -0,0 +1,9 @@
+The MIT License (MIT)
+
+Copyright (c) 2025 Ekohe
+
+Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
data/README.md CHANGED
@@ -1,10 +1,10 @@
 # LLM Conductor
 
-A powerful Ruby gem from [Ekohe](https://ekohe.com) for orchestrating multiple Language Model providers with a unified, modern interface. LLM Conductor provides seamless integration with OpenAI GPT, Anthropic Claude, Google Gemini, and Ollama with advanced prompt management, data building patterns, and comprehensive response handling.
+A powerful Ruby gem from [Ekohe](https://ekohe.com) for orchestrating multiple Language Model providers with a unified, modern interface. LLM Conductor provides seamless integration with OpenAI GPT, Anthropic Claude, Google Gemini, Groq, and Ollama with advanced prompt management, data building patterns, and comprehensive response handling.
 
 ## Features
 
-🚀 **Multi-Provider Support** - OpenAI GPT, Anthropic Claude, Google Gemini, and Ollama with automatic vendor detection
+🚀 **Multi-Provider Support** - OpenAI GPT, Anthropic Claude, Google Gemini, Groq, and Ollama with automatic vendor detection
 🎯 **Unified Modern API** - Simple `LlmConductor.generate()` interface with rich Response objects
 📝 **Advanced Prompt Management** - Registrable prompt classes with inheritance and templating
 🏗️ **Data Builder Pattern** - Structured data preparation for complex LLM inputs
@@ -106,6 +106,10 @@ LlmConductor.configure do |config|
     api_key: ENV['GEMINI_API_KEY']
   )
 
+  config.groq(
+    api_key: ENV['GROQ_API_KEY']
+  )
+
   config.ollama(
     base_url: ENV['OLLAMA_ADDRESS'] || 'http://localhost:11434'
   )
@@ -147,6 +151,7 @@ The gem automatically detects these environment variables:
 - `OPENAI_ORG_ID` - OpenAI organization ID (optional)
 - `ANTHROPIC_API_KEY` - Anthropic API key
 - `GEMINI_API_KEY` - Google Gemini API key
+- `GROQ_API_KEY` - Groq API key
 - `OLLAMA_ADDRESS` - Ollama server address
 
 ## Supported Providers & Models
@@ -189,10 +194,31 @@ response = LlmConductor.generate(
 )
 ```
 
-### Ollama (Default for non-GPT/Claude/Gemini models)
+### Groq (Automatic for Llama, Mixtral, Gemma, Qwen models)
+```ruby
+response = LlmConductor.generate(
+  model: 'llama-3.1-70b-versatile', # Auto-detects Groq
+  prompt: 'Your prompt here'
+)
+
+# Supported Groq models
+response = LlmConductor.generate(
+  model: 'mixtral-8x7b-32768', # Auto-detects Groq
+  prompt: 'Your prompt here'
+)
+
+# Or explicitly specify vendor
+response = LlmConductor.generate(
+  model: 'qwen-2.5-72b-instruct',
+  vendor: :groq,
+  prompt: 'Your prompt here'
+)
+```
+
+### Ollama (Default for other models)
 ```ruby
 response = LlmConductor.generate(
-  model: 'llama3.2', # Auto-detects Ollama for non-GPT/Claude/Gemini models
+  model: 'deepseek-r1',
   prompt: 'Your prompt here'
 )
 ```
@@ -204,13 +230,15 @@ The gem automatically detects the appropriate provider based on model names:
 - **OpenAI**: Models starting with `gpt-` (e.g., `gpt-4`, `gpt-3.5-turbo`)
 - **Anthropic**: Models starting with `claude-` (e.g., `claude-3-5-sonnet-20241022`)
 - **Google Gemini**: Models starting with `gemini-` (e.g., `gemini-2.5-flash`, `gemini-2.0-flash`)
+- **Groq**: Models starting with `llama`, `mixtral`, `gemma`, or `qwen` (e.g., `llama-3.1-70b-versatile`, `mixtral-8x7b-32768`, `gemma-7b-it`, `qwen-2.5-72b-instruct`)
 - **Ollama**: All other models (e.g., `llama3.2`, `mistral`, `codellama`)
 
 You can also explicitly specify the vendor:
 
 ```ruby
 response = LlmConductor.generate(
-  model: 'llama3.2', # Auto-detects Ollama for non-GPT models
+  model: 'llama-3.1-70b-versatile',
+  vendor: :groq, # Explicitly use Groq
   prompt: 'Your prompt here'
 )
 ```
@@ -453,6 +481,8 @@ Check the `/examples` directory for comprehensive usage examples:
 - `prompt_registration.rb` - Custom prompt classes
 - `data_builder_usage.rb` - Data structuring patterns
 - `rag_usage.rb` - RAG implementation examples
+- `gemini_usage.rb` - Google Gemini integration
+- `groq_usage.rb` - Groq integration with various models
 
 ## Development
 
data/examples/groq_usage.rb ADDED
@@ -0,0 +1,77 @@
+# frozen_string_literal: true
+
+require 'llm_conductor'
+
+# Configure Groq
+LlmConductor.configure do |config|
+  config.groq(api_key: ENV['GROQ_API_KEY'])
+end
+
+# Example 1: Using Groq with automatic model detection
+puts '=== Example 1: Automatic model detection ==='
+client = LlmConductor::ClientFactory.build(
+  model: 'llama-3.1-70b-versatile',
+  type: :summarize_text
+)
+
+data = {
+  text: 'Groq is a cloud-based AI platform that provides fast inference for large language models. ' \
+        'It offers various models including Llama, Mixtral, Gemma, and Qwen models with ' \
+        'optimized performance for production use cases.'
+}
+
+response = client.generate(data:)
+puts "Model: #{response.model}"
+puts "Vendor: #{response.metadata[:vendor]}"
+puts "Input tokens: #{response.input_tokens}"
+puts "Output tokens: #{response.output_tokens}"
+puts "Summary: #{response.output}"
+puts
+
+# Example 2: Using Groq with explicit vendor specification
+puts '=== Example 2: Explicit vendor specification ==='
+client = LlmConductor::ClientFactory.build(
+  model: 'mixtral-8x7b-32768',
+  type: :summarize_text,
+  vendor: :groq
+)
+
+response = client.generate(data:)
+puts "Model: #{response.model}"
+puts "Vendor: #{response.metadata[:vendor]}"
+puts "Summary: #{response.output}"
+puts
+
+# Example 3: Using Groq with simple generation
+puts '=== Example 3: Simple generation ==='
+client = LlmConductor::ClientFactory.build(
+  model: 'gemma-7b-it',
+  type: :summarize_text,
+  vendor: :groq
+)
+
+response = client.generate_simple(prompt: 'Explain what Groq is in one sentence.')
+puts "Model: #{response.model}"
+puts "Response: #{response.output}"
+puts
+
+# Example 4: Using different Groq models
+puts '=== Example 4: Different Groq models ==='
+models = [
+  'llama-3.1-70b-versatile',
+  'mixtral-8x7b-32768',
+  'gemma-7b-it',
+  'qwen-2.5-72b-instruct'
+]
+
+models.each do |model|
+  client = LlmConductor::ClientFactory.build(
+    model:,
+    type: :summarize_text,
+    vendor: :groq
+  )
+
+  response = client.generate_simple(prompt: 'What is artificial intelligence?')
+  puts "#{model}: #{response.output[0..100]}..."
+  puts
+end
data/lib/llm_conductor/client_factory.rb CHANGED
@@ -18,14 +18,13 @@ module LlmConductor
         openrouter: Clients::OpenrouterClient,
         ollama: Clients::OllamaClient,
         gemini: Clients::GeminiClient,
-        google: Clients::GeminiClient
+        google: Clients::GeminiClient,
+        groq: Clients::GroqClient
       }
 
-      client_classes[vendor] || raise(
-        ArgumentError,
-        "Unsupported vendor: #{vendor}. " \
-        'Supported vendors: anthropic, openai, openrouter, ollama, gemini'
-      )
+      client_classes.fetch(vendor) do
+        raise ArgumentError, "Unsupported vendor: #{vendor}. Supported vendors: #{client_classes.keys.uniq.join(', ')}"
+      end
     end
 
     def self.determine_vendor(model)
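The refactor above replaces `client_classes[vendor] || raise(...)` with `Hash#fetch` plus a block, so the "Supported vendors" list in the error message is derived from the hash keys instead of being maintained by hand. A minimal standalone sketch of the pattern (the `REGISTRY` hash and `lookup` helper are illustrative, not the gem's actual API):

```ruby
# Stand-in for the gem's client_classes mapping.
REGISTRY = {
  openai: 'OpenAI client',
  groq: 'Groq client'
}.freeze

def lookup(vendor)
  # fetch runs the block only when the key is absent, so the
  # error message always reflects the registry's real contents.
  REGISTRY.fetch(vendor) do
    raise ArgumentError,
          "Unsupported vendor: #{vendor}. Supported vendors: #{REGISTRY.keys.join(', ')}"
  end
end
```

A side benefit of `fetch` over `hash[key] || raise`: it behaves correctly even when a key legitimately maps to `nil` or `false`.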
@@ -36,6 +35,8 @@ module LlmConductor
         :openai
       when /^gemini/i
         :gemini
+      when /^(llama|mixtral|gemma|qwen)/i
+        :groq
       else
         :ollama # Default to Ollama for non-specific model names
       end
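With the new `when /^(llama|mixtral|gemma|qwen)/i` branch, any model name starting with one of those prefixes is routed to Groq before the `else` falls through to Ollama; note this also catches names like `llama3.2`. A standalone sketch of the dispatch (a hypothetical `detect_vendor` helper mirroring the case expression; the `claude` branch is assumed from the gem's documented prefixes):

```ruby
# Mirrors ClientFactory's model-name-to-vendor case expression.
def detect_vendor(model)
  case model
  when /^claude/i then :anthropic
  when /^gpt/i then :openai
  when /^gemini/i then :gemini
  when /^(llama|mixtral|gemma|qwen)/i then :groq
  else :ollama # default for non-specific model names
  end
end
```

Because `case`/`when` evaluates branches in order, the Groq branch wins over the Ollama default for any matching prefix, which is exactly why the README's Ollama examples moved to names like `deepseek-r1`.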
data/lib/llm_conductor/clients/groq_client.rb ADDED
@@ -0,0 +1,24 @@
+# frozen_string_literal: true
+
+module LlmConductor
+  module Clients
+    # Groq client implementation for accessing Groq models via Groq API
+    class GroqClient < BaseClient
+      private
+
+      def generate_content(prompt)
+        client.chat(
+          messages: [{ role: 'user', content: prompt }],
+          model:
+        ).dig('choices', 0, 'message', 'content')
+      end
+
+      def client
+        @client ||= begin
+          config = LlmConductor.configuration.provider_config(:groq)
+          Groq::Client.new(api_key: config[:api_key])
+        end
+      end
+    end
+  end
+end
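The `@client ||= begin … end` in `GroqClient` is the standard lazy-memoization idiom: the underlying SDK client is constructed on first use and then reused. The idiom in isolation (a generic `LazyHolder`, not part of the gem):

```ruby
# Builds its value at most once, on first access.
class LazyHolder
  def initialize(&builder)
    @builder = builder
  end

  def client
    # ||= assigns only when @client is nil or false, so the
    # builder block never runs a second time here.
    @client ||= @builder.call
  end
end
```

One caveat: `||=` re-runs the builder if it legitimately returns `nil` or `false`; that is harmless in the client above, since the builder always returns a client object.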
data/lib/llm_conductor/configuration.rb CHANGED
@@ -64,6 +64,14 @@ module LlmConductor
       }
     end
 
+    # Configure Groq provider
+    def groq(api_key: nil, **options)
+      @providers[:groq] = {
+        api_key: api_key || ENV['GROQ_API_KEY'],
+        **options
+      }
+    end
+
     # Get provider configuration
     def provider_config(provider)
       @providers[provider.to_sym] || {}
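The new `config.groq` writer follows the same precedence as the other providers: an explicit `api_key:` argument wins, otherwise `ENV['GROQ_API_KEY']` is the fallback. A minimal sketch of that precedence (a hypothetical `ProviderConfig` class reduced from the configuration code above):

```ruby
# Sketch of explicit-argument-over-ENV precedence for provider config.
class ProviderConfig
  def initialize
    @providers = {}
  end

  def groq(api_key: nil, **options)
    @providers[:groq] = {
      api_key: api_key || ENV['GROQ_API_KEY'], # ENV is the fallback
      **options
    }
  end

  def provider_config(provider)
    @providers[provider.to_sym] || {}
  end
end
```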
@@ -110,6 +118,14 @@ module LlmConductor
       gemini(api_key: value)
     end
 
+    def groq_api_key
+      provider_config(:groq)[:api_key]
+    end
+
+    def groq_api_key=(value)
+      groq(api_key: value)
+    end
+
     private
 
     def setup_defaults_from_env
@@ -118,6 +134,7 @@ module LlmConductor
       openai if ENV['OPENAI_API_KEY']
       openrouter if ENV['OPENROUTER_API_KEY']
       gemini if ENV['GEMINI_API_KEY']
+      groq if ENV['GROQ_API_KEY']
       ollama # Always configure Ollama with default URL
     end
   end
data/lib/llm_conductor/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module LlmConductor
-  VERSION = '1.0.1'
+  VERSION = '1.1.0'
 end
data/lib/llm_conductor.rb CHANGED
@@ -10,13 +10,14 @@ require_relative 'llm_conductor/prompt_manager'
 require_relative 'llm_conductor/clients/base_client'
 require_relative 'llm_conductor/clients/anthropic_client'
 require_relative 'llm_conductor/clients/gpt_client'
+require_relative 'llm_conductor/clients/groq_client'
 require_relative 'llm_conductor/clients/ollama_client'
 require_relative 'llm_conductor/clients/openrouter_client'
 require_relative 'llm_conductor/clients/gemini_client'
 require_relative 'llm_conductor/client_factory'
 
 # LLM Conductor provides a unified interface for multiple Language Model providers
-# including OpenAI GPT, Anthropic Claude, Google Gemini, OpenRouter, and Ollama
+# including OpenAI GPT, Anthropic Claude, Google Gemini, Groq, OpenRouter, and Ollama
 # with built-in prompt templates, token counting, and extensible client architecture.
 module LlmConductor
   class Error < StandardError; end
@@ -61,16 +62,17 @@ module LlmConductor
         when :openrouter then Clients::OpenrouterClient
         when :ollama then Clients::OllamaClient
         when :gemini, :google then Clients::GeminiClient
+        when :groq then Clients::GroqClient
         else
           raise ArgumentError,
                 "Unsupported vendor: #{vendor}. " \
-                'Supported vendors: anthropic, openai, openrouter, ollama, gemini'
+                'Supported vendors: anthropic, openai, openrouter, ollama, gemini, groq'
         end
       end
     end
 
   # List of supported vendors
-  SUPPORTED_VENDORS = %i[anthropic openai openrouter ollama gemini].freeze
+  SUPPORTED_VENDORS = %i[anthropic openai openrouter ollama gemini groq].freeze
 
   # List of supported prompt types
   SUPPORTED_PROMPT_TYPES = %i[
metadata CHANGED
@@ -1,13 +1,13 @@
 --- !ruby/object:Gem::Specification
 name: llm_conductor
 version: !ruby/object:Gem::Version
-  version: 1.0.1
+  version: 1.1.0
 platform: ruby
 authors:
 - Ben Zheng
 bindir: exe
 cert_chain: []
-date: 2025-10-01 00:00:00.000000000 Z
+date: 2025-10-15 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: activesupport
@@ -51,6 +51,20 @@ dependencies:
     - - "~>"
       - !ruby/object:Gem::Version
         version: '4.3'
+- !ruby/object:Gem::Dependency
+  name: groq
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '0.3'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '0.3'
 - !ruby/object:Gem::Dependency
   name: ollama-ai
   requirement: !ruby/object:Gem::Requirement
@@ -122,24 +136,26 @@ dependencies:
   - !ruby/object:Gem::Version
     version: '3.0'
 description: LLM Conductor provides a clean, unified interface for working with multiple
-  Language Model providers including OpenAI GPT, OpenRouter, and Ollama. Features
-  include prompt templating, token counting, and extensible client architecture.
+  Language Model providers including OpenAI GPT, Anthropic Claude, Google Gemini,
+  Groq, OpenRouter, and Ollama. Features include prompt templating, token counting,
+  and extensible client architecture.
 email:
 - ben@ekohe.com
 executables: []
 extensions: []
 extra_rdoc_files: []
 files:
-- ".DS_Store"
 - ".rspec"
 - ".rubocop.yml"
 - ".rubocop_todo.yml"
 - ".ruby-version"
+- LICENSE
 - README.md
 - Rakefile
 - config/initializers/llm_conductor.rb
 - examples/data_builder_usage.rb
 - examples/gemini_usage.rb
+- examples/groq_usage.rb
 - examples/prompt_registration.rb
 - examples/rag_usage.rb
 - examples/simple_usage.rb
@@ -149,6 +165,7 @@ files:
 - lib/llm_conductor/clients/base_client.rb
 - lib/llm_conductor/clients/gemini_client.rb
 - lib/llm_conductor/clients/gpt_client.rb
+- lib/llm_conductor/clients/groq_client.rb
 - lib/llm_conductor/clients/ollama_client.rb
 - lib/llm_conductor/clients/openrouter_client.rb
 - lib/llm_conductor/configuration.rb
@@ -160,7 +177,8 @@ files:
 - lib/llm_conductor/version.rb
 - sig/llm_conductor.rbs
 homepage: https://github.com/ekohe/llm_conductor
-licenses: []
+licenses:
+- MIT
 metadata:
   allowed_push_host: https://rubygems.org
   homepage_uri: https://github.com/ekohe/llm_conductor
data/.DS_Store DELETED
Binary file