llm_conductor 0.1.0 → 0.1.1

This diff shows the changes between publicly released versions of this package as they appear in their public registry, and is provided for informational purposes only.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 94848876574eca5294236d61d79175b6e28fc918c6bcbd0af969d70fc50cc6c8
-  data.tar.gz: 18865a4228201006c9ebf84e0aac4bd6b946e1c551a121e323eed827477bb3ca
+  metadata.gz: c1b006030ac01906369f830f2c11d958f40c3bf3a48404179d78c6b1e75a82e5
+  data.tar.gz: 7ccf09702230675a381f164f4ccb91034aec880c5769cb8887711c6737385c46
 SHA512:
-  metadata.gz: f16ee4255f54f3daef35a2557cd19fc171dbc238c2525475d95c3209ca5771f07f9ccd7d6e485f6adef979625008c0fb360d5705adea23bd14d987f051b236c0
-  data.tar.gz: 62f464b4dcc76ce676f735db190e9a16ba9bee5f10b4dbdbd4c212badbfb7c8a736465bc40cd8228c11e2e1fed6f7afe823d05dfcd2dfc6c0cb689870e0e4acc
+  metadata.gz: 4c3283602c064e5c7e67ac5a1b26353c936b9d5d351ad21d87a89058aa045af1f8f916f350ac07db4a0405a6fe9091b09ecc00b091afd57039d3ae560bd199e8
+  data.tar.gz: 571400d81fa8029ecf9f9d4a53c48ac26392c13de927994c358b0e953d27257b94ff929f1d65f13b64ccddaada8785e33f271b4cf99ebdad74c63abef3dd4a97
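These published digests can be checked locally before installing. A minimal sketch using Ruby's stdlib `Digest` (the helper name and the file path in the usage comment are illustrative, not part of the gem):

```ruby
require 'digest'

# Compare a downloaded artifact's SHA256 digest against the value
# published in checksums.yaml. Returns true only on an exact match.
def checksum_ok?(path, expected_sha256)
  Digest::SHA256.file(path).hexdigest == expected_sha256
end

# Hypothetical usage -- substitute the real artifact path and digest:
# checksum_ok?('data.tar.gz', '7ccf0970...')
```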
data/README.md CHANGED
@@ -1,10 +1,10 @@
 # LLM Conductor
 
-A powerful Ruby gem for orchestrating multiple Language Model providers with a unified, modern interface. LLM Conductor provides seamless integration with OpenAI GPT and Ollama with advanced prompt management, data building patterns, and comprehensive response handling.
+A powerful Ruby gem from [Ekohe](https://ekohe.com) for orchestrating multiple Language Model providers with a unified, modern interface. LLM Conductor provides seamless integration with OpenAI GPT, Anthropic Claude, Google Gemini, and Ollama, with advanced prompt management, data building patterns, and comprehensive response handling.
 
 ## Features
 
-🚀 **Multi-Provider Support** - OpenAI GPT and Ollama with automatic vendor detection
+🚀 **Multi-Provider Support** - OpenAI GPT, Anthropic Claude, Google Gemini, and Ollama with automatic vendor detection
 🎯 **Unified Modern API** - Simple `LlmConductor.generate()` interface with rich Response objects
 📝 **Advanced Prompt Management** - Registrable prompt classes with inheritance and templating
 🏗️ **Data Builder Pattern** - Structured data preparation for complex LLM inputs
@@ -57,8 +57,8 @@ response = LlmConductor.generate(
   model: 'gpt-5-mini',
   type: :summarize_description,
   data: {
-    name: 'TechCorp',
-    domain_name: 'techcorp.com',
+    name: 'Ekohe',
+    domain_name: 'ekohe.com',
     description: 'An AI company specializing in...'
   }
 )
@@ -94,6 +94,14 @@ LlmConductor.configure do |config|
     organization: ENV['OPENAI_ORG_ID'] # Optional
   )
 
+  config.anthropic(
+    api_key: ENV['ANTHROPIC_API_KEY']
+  )
+
+  config.gemini(
+    api_key: ENV['GEMINI_API_KEY']
+  )
+
   config.ollama(
     base_url: ENV['OLLAMA_ADDRESS'] || 'http://localhost:11434'
   )
@@ -106,6 +114,8 @@ The gem automatically detects these environment variables:
 
 - `OPENAI_API_KEY` - OpenAI API key
 - `OPENAI_ORG_ID` - OpenAI organization ID (optional)
+- `ANTHROPIC_API_KEY` - Anthropic API key
+- `GEMINI_API_KEY` - Google Gemini API key
 - `OLLAMA_ADDRESS` - Ollama server address
 
 ## Supported Providers & Models
@@ -118,7 +128,80 @@ response = LlmConductor.generate(
 )
 ```
 
-### Ollama (Default for non-GPT models)
+### Anthropic Claude (Automatic for Claude models)
+```ruby
+response = LlmConductor.generate(
+  model: 'claude-3-5-sonnet-20241022', # Auto-detects Anthropic
+  prompt: 'Your prompt here'
+)
+
+# Or explicitly specify vendor
+response = LlmConductor.generate(
+  model: 'claude-3-5-sonnet-20241022',
+  vendor: :anthropic,
+  prompt: 'Your prompt here'
+)
+```
+
+**Supported Claude Models:**
+- `claude-3-5-sonnet-20241022` (Latest Claude 3.5 Sonnet)
+- `claude-3-5-haiku-20241022` (Claude 3.5 Haiku)
+- `claude-3-opus-20240229` (Claude 3 Opus)
+- `claude-3-sonnet-20240229` (Claude 3 Sonnet)
+- `claude-3-haiku-20240307` (Claude 3 Haiku)
+
+**Why Choose Claude?**
+- **Superior Reasoning**: Excellent for complex analysis and problem-solving
+- **Code Generation**: Outstanding performance for programming tasks
+- **Long Context**: Support for large documents and conversations
+- **Safety**: Built with safety and helpfulness in mind
+- **Cost Effective**: Competitive pricing for high-quality outputs
+
+### Google Gemini (Automatic for Gemini models)
+```ruby
+response = LlmConductor.generate(
+  model: 'gemini-2.5-flash', # Auto-detects Gemini
+  prompt: 'Your prompt here'
+)
+
+# Or explicitly specify vendor
+response = LlmConductor.generate(
+  model: 'gemini-2.5-flash',
+  vendor: :gemini,
+  prompt: 'Your prompt here'
+)
+```
+
+**Supported Gemini Models:**
+- `gemini-2.5-flash` (Latest Gemini 2.5 Flash)
+- `gemini-2.0-flash` (Gemini 2.0 Flash)
+
+**Why Choose Gemini?**
+- **Multimodal**: Native support for text, images, and other modalities
+- **Long Context**: Massive context windows for large documents
+- **Fast Performance**: Optimized for speed and efficiency
+- **Google Integration**: Seamless integration with Google services
+- **Competitive Pricing**: Cost-effective for high-volume usage
+
+### Ollama (Default for non-GPT/Claude/Gemini models)
+```ruby
+response = LlmConductor.generate(
+  model: 'llama3.2', # Auto-detects Ollama for non-GPT/Claude/Gemini models
+  prompt: 'Your prompt here'
+)
+```
+
+### Vendor Detection
+
+The gem automatically detects the appropriate provider based on model names:
+
+- **OpenAI**: Models starting with `gpt-` (e.g., `gpt-4`, `gpt-3.5-turbo`)
+- **Anthropic**: Models starting with `claude-` (e.g., `claude-3-5-sonnet-20241022`)
+- **Google Gemini**: Models starting with `gemini-` (e.g., `gemini-2.5-flash`, `gemini-2.0-flash`)
+- **Ollama**: All other models (e.g., `llama3.2`, `mistral`, `codellama`)
+
+You can also explicitly specify the vendor:
+
 ```ruby
 response = LlmConductor.generate(
   model: 'llama3.2', # Auto-detects Ollama for non-GPT models
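The prefix rules listed in the new Vendor Detection section can be sketched as a single case expression; this mirrors the `determine_vendor` logic that appears later in this diff, extracted as a standalone method for illustration only:

```ruby
# Map a model name to its vendor symbol by prefix; anything
# unrecognized falls through to Ollama.
def determine_vendor(model)
  case model
  when /^claude/i then :anthropic
  when /^gpt/i    then :openai
  when /^gemini/i then :gemini
  else                 :ollama
  end
end
```

Note the case-insensitive match: `GPT-4o` and `gpt-4o` resolve to the same vendor.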
@@ -159,8 +243,8 @@ response = LlmConductor.generate(
   model: 'gpt-5-mini',
   type: :detailed_analysis,
   data: {
-    name: 'TechCorp',
-    domain_name: 'techcorp.com',
+    name: 'Ekohe',
+    domain_name: 'ekohe.com',
     description: 'A leading AI company...'
   }
 )
@@ -385,7 +469,7 @@ bin/console
 
 ## Testing
 
-The gem includes comprehensive test coverage with unit, integration, and performance tests. See `spec/TESTING_GUIDE.md` for detailed testing information.
+The gem includes comprehensive test coverage with unit, integration, and performance tests.
 
 ## Performance
 
@@ -407,7 +491,3 @@ Bug reports and pull requests are welcome on GitHub at https://github.com/ekohe/
 ## License
 
 The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
-
-## Code of Conduct
-
-This project is intended to be a safe, welcoming space for collaboration. Contributors are expected to adhere to the [code of conduct](https://github.com/ekohe/llm_conductor/blob/main/CODE_OF_CONDUCT.md).
data/examples/gemini_usage.rb ADDED
@@ -0,0 +1,18 @@
+# frozen_string_literal: true
+
+require_relative '../lib/llm_conductor'
+
+# Configure Gemini API key
+LlmConductor.configure do |config|
+  config.gemini_api_key = ENV['GEMINI_API_KEY'] || 'your_gemini_api_key_here'
+end
+
+# Example usage
+response = LlmConductor.generate(
+  model: 'gemini-2.5-flash',
+  prompt: 'Explain how AI works in a few words'
+)
+
+puts "Model: #{response.model}"
+puts "Output: #{response.output}"
+puts "Vendor: #{response.metadata[:vendor]}"
data/lib/llm_conductor/client_factory.rb CHANGED
@@ -5,26 +5,37 @@ module LlmConductor
   class ClientFactory
     def self.build(model:, type:, vendor: nil)
       vendor ||= determine_vendor(model)
+      client_class = client_class_for_vendor(vendor)
+      client_class.new(model:, type:)
+    end
 
-      client_class = case vendor
-                     when :openai, :gpt
-                       Clients::GptClient
-                     when :openrouter
-                       Clients::OpenrouterClient
-                     when :ollama
-                       Clients::OllamaClient
-                     else
-                       raise ArgumentError,
-                             "Unsupported vendor: #{vendor}. Supported vendors: openai, openrouter, ollama"
-                     end
+    def self.client_class_for_vendor(vendor)
+      client_classes = {
+        anthropic: Clients::AnthropicClient,
+        claude: Clients::AnthropicClient,
+        openai: Clients::GptClient,
+        gpt: Clients::GptClient,
+        openrouter: Clients::OpenrouterClient,
+        ollama: Clients::OllamaClient,
+        gemini: Clients::GeminiClient,
+        google: Clients::GeminiClient
+      }
 
-      client_class.new(model:, type:)
+      client_classes[vendor] || raise(
+        ArgumentError,
+        "Unsupported vendor: #{vendor}. " \
+        'Supported vendors: anthropic, openai, openrouter, ollama, gemini'
+      )
     end
 
     def self.determine_vendor(model)
       case model
+      when /^claude/i
+        :anthropic
       when /^gpt/i
         :openai
+      when /^gemini/i
+        :gemini
       else
         :ollama # Default to Ollama for non-specific model names
       end
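The refactor above swaps a `case` expression for a hash lookup, which keeps vendor aliases (`:claude` for `:anthropic`, `:google` for `:gemini`) cheap to add. A standalone sketch of the same dispatch pattern, with stand-in client classes rather than the real `LlmConductor::Clients` ones:

```ruby
# Stand-in client classes; the real ones live under LlmConductor::Clients.
class AnthropicClient; end
class GeminiClient; end

# Aliases simply map to the same class as the canonical vendor symbol.
VENDOR_CLIENTS = {
  anthropic: AnthropicClient,
  claude: AnthropicClient, # alias
  gemini: GeminiClient,
  google: GeminiClient     # alias
}.freeze

def client_class_for_vendor(vendor)
  VENDOR_CLIENTS[vendor] ||
    raise(ArgumentError, "Unsupported vendor: #{vendor}")
end
```

Because unknown keys return `nil`, the `|| raise(...)` clause replaces the old `else` branch.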
data/lib/llm_conductor/clients/anthropic_client.rb ADDED
@@ -0,0 +1,31 @@
+# frozen_string_literal: true
+
+require 'anthropic'
+
+module LlmConductor
+  module Clients
+    # Anthropic Claude client implementation for accessing Claude models via Anthropic API
+    class AnthropicClient < BaseClient
+      private
+
+      def generate_content(prompt)
+        response = client.messages.create(
+          model:,
+          max_tokens: 4096,
+          messages: [{ role: 'user', content: prompt }]
+        )
+
+        response.content.first.text
+      rescue Anthropic::Errors::APIError => e
+        raise StandardError, "Anthropic API error: #{e.message}"
+      end
+
+      def client
+        @client ||= begin
+          config = LlmConductor.configuration.provider_config(:anthropic)
+          Anthropic::Client.new(api_key: config[:api_key])
+        end
+      end
+    end
+  end
+end
data/lib/llm_conductor/clients/base_client.rb CHANGED
@@ -90,7 +90,7 @@ module LlmConductor
       def build_metadata
         {
           vendor: self.class.name.split('::').last.gsub('Client', '').downcase.to_sym,
-          timestamp: Time.zone.now.iso8601
+          timestamp: Time.now.utc.iso8601
         }
       end
     end
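`Time.zone.now` only works when ActiveSupport is loaded with a configured time zone; the replacement uses the plain Ruby stdlib. A minimal sketch of the changed method, with the vendor derivation simplified to a parameter for illustration:

```ruby
require 'time' # stdlib; adds Time#iso8601

# Build response metadata with a dependency-free UTC timestamp,
# e.g. "2025-09-25T00:00:00Z".
def build_metadata(vendor)
  { vendor: vendor, timestamp: Time.now.utc.iso8601 }
end
```

Since the time is already UTC, `iso8601` emits a trailing `Z` rather than a numeric offset.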
data/lib/llm_conductor/clients/gemini_client.rb ADDED
@@ -0,0 +1,36 @@
+# frozen_string_literal: true
+
+require 'gemini-ai'
+
+module LlmConductor
+  module Clients
+    # Google Gemini client implementation for accessing Gemini models via Google AI API
+    class GeminiClient < BaseClient
+      private
+
+      def generate_content(prompt)
+        payload = {
+          contents: [
+            { parts: [{ text: prompt }] }
+          ]
+        }
+
+        response = client.generate_content(payload)
+        response.dig('candidates', 0, 'content', 'parts', 0, 'text')
+      end
+
+      def client
+        @client ||= begin
+          config = LlmConductor.configuration.provider_config(:gemini)
+          Gemini.new(
+            credentials: {
+              service: 'generative-language-api',
+              api_key: config[:api_key]
+            },
+            options: { model: }
+          )
+        end
+      end
+    end
+  end
+end
data/lib/llm_conductor/configuration.rb CHANGED
@@ -22,6 +22,14 @@ module LlmConductor
       setup_defaults_from_env
     end
 
+    # Configure Anthropic provider
+    def anthropic(api_key: nil, **options)
+      @providers[:anthropic] = {
+        api_key: api_key || ENV['ANTHROPIC_API_KEY'],
+        **options
+      }
+    end
+
     # Configure OpenAI provider
     def openai(api_key: nil, organization: nil, **options)
       @providers[:openai] = {
@@ -47,12 +55,28 @@ module LlmConductor
       }
     end
 
+    # Configure Google Gemini provider
+    def gemini(api_key: nil, **options)
+      @providers[:gemini] = {
+        api_key: api_key || ENV['GEMINI_API_KEY'],
+        **options
+      }
+    end
+
     # Get provider configuration
     def provider_config(provider)
      @providers[provider.to_sym] || {}
     end
 
     # Legacy compatibility methods
+    def anthropic_api_key
+      provider_config(:anthropic)[:api_key]
+    end
+
+    def anthropic_api_key=(value)
+      anthropic(api_key: value)
+    end
+
     def openai_api_key
       provider_config(:openai)[:api_key]
     end
@@ -77,12 +101,22 @@ module LlmConductor
       ollama(base_url: value)
     end
 
+    def gemini_api_key
+      provider_config(:gemini)[:api_key]
+    end
+
+    def gemini_api_key=(value)
+      gemini(api_key: value)
+    end
+
     private
 
     def setup_defaults_from_env
       # Auto-configure providers if environment variables are present
+      anthropic if ENV['ANTHROPIC_API_KEY']
       openai if ENV['OPENAI_API_KEY']
       openrouter if ENV['OPENROUTER_API_KEY']
+      gemini if ENV['GEMINI_API_KEY']
       ollama # Always configure Ollama with default URL
     end
   end
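Each provider method in the configuration above follows the same shape: store a per-provider options hash, falling back to the matching environment variable for the API key, with `provider_config` returning `{}` for anything unconfigured. A stripped-down sketch of that pattern (the class name and single `gemini` method here are illustrative, not the gem's full API):

```ruby
# Minimal provider registry mirroring the configuration pattern above.
class ProviderConfig
  def initialize
    @providers = {}
  end

  # Explicit api_key wins; otherwise fall back to the environment.
  def gemini(api_key: nil, **options)
    @providers[:gemini] = { api_key: api_key || ENV['GEMINI_API_KEY'], **options }
  end

  # Unconfigured providers yield an empty hash rather than nil.
  def provider_config(provider)
    @providers[provider.to_sym] || {}
  end
end
```

The empty-hash fallback is what lets clients write `config[:api_key]` without nil checks.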
data/lib/llm_conductor/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module LlmConductor
-  VERSION = '0.1.0'
+  VERSION = '0.1.1'
 end
data/lib/llm_conductor.rb CHANGED
@@ -8,14 +8,16 @@ require_relative 'llm_conductor/prompts'
 require_relative 'llm_conductor/prompts/base_prompt'
 require_relative 'llm_conductor/prompt_manager'
 require_relative 'llm_conductor/clients/base_client'
+require_relative 'llm_conductor/clients/anthropic_client'
 require_relative 'llm_conductor/clients/gpt_client'
 require_relative 'llm_conductor/clients/ollama_client'
 require_relative 'llm_conductor/clients/openrouter_client'
+require_relative 'llm_conductor/clients/gemini_client'
 require_relative 'llm_conductor/client_factory'
 
 # LLM Conductor provides a unified interface for multiple Language Model providers
-# including OpenAI GPT, OpenRouter, and Ollama with built-in prompt templates,
-# token counting, and extensible client architecture.
+# including OpenAI GPT, Anthropic Claude, Google Gemini, OpenRouter, and Ollama
+# with built-in prompt templates, token counting, and extensible client architecture.
 module LlmConductor
   class Error < StandardError; end
 
@@ -54,17 +56,21 @@ module LlmConductor
 
     def client_class_for_vendor(vendor)
       case vendor
+      when :anthropic, :claude then Clients::AnthropicClient
       when :openai, :gpt then Clients::GptClient
       when :openrouter then Clients::OpenrouterClient
       when :ollama then Clients::OllamaClient
+      when :gemini, :google then Clients::GeminiClient
       else
-        raise ArgumentError, "Unsupported vendor: #{vendor}. Supported vendors: openai, openrouter, ollama"
+        raise ArgumentError,
+              "Unsupported vendor: #{vendor}. " \
+              'Supported vendors: anthropic, openai, openrouter, ollama, gemini'
       end
     end
   end
 
   # List of supported vendors
-  SUPPORTED_VENDORS = %i[openai openrouter ollama].freeze
+  SUPPORTED_VENDORS = %i[anthropic openai openrouter ollama gemini].freeze
 
   # List of supported prompt types
   SUPPORTED_PROMPT_TYPES = %i[
metadata CHANGED
@@ -1,13 +1,13 @@
 --- !ruby/object:Gem::Specification
 name: llm_conductor
 version: !ruby/object:Gem::Version
-  version: 0.1.0
+  version: 0.1.1
 platform: ruby
 authors:
 - Ben Zheng
 bindir: exe
 cert_chain: []
-date: 2025-09-19 00:00:00.000000000 Z
+date: 2025-09-25 00:00:00.000000000 Z
 dependencies:
 - !ruby/object:Gem::Dependency
   name: activesupport
@@ -23,6 +23,34 @@ dependencies:
     - - ">="
       - !ruby/object:Gem::Version
         version: '6.0'
+- !ruby/object:Gem::Dependency
+  name: anthropic
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.7'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '1.7'
+- !ruby/object:Gem::Dependency
+  name: gemini-ai
+  requirement: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '4.3'
+  type: :runtime
+  prerelease: false
+  version_requirements: !ruby/object:Gem::Requirement
+    requirements:
+    - - "~>"
+      - !ruby/object:Gem::Version
+        version: '4.3'
 - !ruby/object:Gem::Dependency
   name: ollama-ai
   requirement: !ruby/object:Gem::Requirement
@@ -111,12 +139,15 @@ files:
 - Rakefile
 - config/initializers/llm_conductor.rb
 - examples/data_builder_usage.rb
+- examples/gemini_usage.rb
 - examples/prompt_registration.rb
 - examples/rag_usage.rb
 - examples/simple_usage.rb
 - lib/llm_conductor.rb
 - lib/llm_conductor/client_factory.rb
+- lib/llm_conductor/clients/anthropic_client.rb
 - lib/llm_conductor/clients/base_client.rb
+- lib/llm_conductor/clients/gemini_client.rb
 - lib/llm_conductor/clients/gpt_client.rb
 - lib/llm_conductor/clients/ollama_client.rb
 - lib/llm_conductor/clients/openrouter_client.rb