llm_conductor 0.1.0 → 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 94848876574eca5294236d61d79175b6e28fc918c6bcbd0af969d70fc50cc6c8
- data.tar.gz: 18865a4228201006c9ebf84e0aac4bd6b946e1c551a121e323eed827477bb3ca
+ metadata.gz: c692236c5d26b598a2275a3a6591022312aec4e4a4d4ee57a186abc774bc39ce
+ data.tar.gz: 3c20edc85442b8e5227fb6896d3899f08495fa1296306042c07c8bc62a883941
  SHA512:
- metadata.gz: f16ee4255f54f3daef35a2557cd19fc171dbc238c2525475d95c3209ca5771f07f9ccd7d6e485f6adef979625008c0fb360d5705adea23bd14d987f051b236c0
- data.tar.gz: 62f464b4dcc76ce676f735db190e9a16ba9bee5f10b4dbdbd4c212badbfb7c8a736465bc40cd8228c11e2e1fed6f7afe823d05dfcd2dfc6c0cb689870e0e4acc
+ metadata.gz: 8ed23461382c3a8ee2fddea05fdaef5c72111bac8e4b820e4310a40105a8b0bd78b8f1c36c9176f9ac7ddf8c7b4300e9208b45711a129ab0531dd579f9a52964
+ data.tar.gz: d446efb23252acee7ea453361d402089b87c56450e20ed7e45b1aab2cd0381bff442ed51ea7e2a52a544c39f3117795ad17ac2208dc12514f211d12ba150bb2c
data/.rubocop.yml CHANGED
@@ -2,8 +2,6 @@
  plugins:
  - rubocop-performance
  - rubocop-rake
-
- require:
  - rubocop-capybara
  - rubocop-factory_bot
  - rubocop-rspec
@@ -27,21 +25,14 @@ Style/StringLiteralsInInterpolation:

  Style/HashSyntax:
  EnforcedShorthandSyntax: always
- # This is not rails application.
- # Rails/Blank:
- # Enabled: false
-
- # Rails/Present:
- # Enabled: false
-
- # Rails/TimeZone:
- # Enabled: false

  Lint/ConstantDefinitionInBlock:
  Enabled: false

  Metrics/MethodLength:
  Max: 15
+ Exclude:
+ - 'lib/llm_conductor/prompts.rb'

  RSpec/ExampleLength:
  Enabled: false
@@ -67,7 +58,7 @@ RSpec/MultipleDescribes:
  RSpec/SpecFilePathFormat:
  Enabled: false

- RSpec/FilePath:
+ RSpec/SpecFilePathSuffix:
  Enabled: false

  RSpec/UnspecifiedException:
@@ -94,6 +85,19 @@ Metrics/BlockLength:
  Exclude:
  - '*.gemspec'

+ # Prompt template methods naturally have high complexity due to conditional string building
+ Metrics/AbcSize:
+ Exclude:
+ - 'lib/llm_conductor/prompts.rb'
+
+ Metrics/CyclomaticComplexity:
+ Exclude:
+ - 'lib/llm_conductor/prompts.rb'
+
+ Metrics/PerceivedComplexity:
+ Exclude:
+ - 'lib/llm_conductor/prompts.rb'
+
  Layout/LineLength:
  Max: 120

data/README.md CHANGED
@@ -1,10 +1,10 @@
  # LLM Conductor

- A powerful Ruby gem for orchestrating multiple Language Model providers with a unified, modern interface. LLM Conductor provides seamless integration with OpenAI GPT and Ollama with advanced prompt management, data building patterns, and comprehensive response handling.
+ A powerful Ruby gem from [Ekohe](https://ekohe.com) for orchestrating multiple Language Model providers with a unified, modern interface. LLM Conductor provides seamless integration with OpenAI GPT, Anthropic Claude, Google Gemini, and Ollama with advanced prompt management, data building patterns, and comprehensive response handling.

  ## Features

- 🚀 **Multi-Provider Support** - OpenAI GPT and Ollama with automatic vendor detection
+ 🚀 **Multi-Provider Support** - OpenAI GPT, Anthropic Claude, Google Gemini, and Ollama with automatic vendor detection
  🎯 **Unified Modern API** - Simple `LlmConductor.generate()` interface with rich Response objects
  📝 **Advanced Prompt Management** - Registrable prompt classes with inheritance and templating
  🏗️ **Data Builder Pattern** - Structured data preparation for complex LLM inputs
@@ -52,14 +52,18 @@ puts response.estimated_cost # Cost in USD
  ### 2. Template-Based Generation

  ```ruby
- # Use built-in templates with structured data
+ # Use built-in text summarization template
  response = LlmConductor.generate(
  model: 'gpt-5-mini',
- type: :summarize_description,
+ type: :summarize_text,
  data: {
- name: 'TechCorp',
- domain_name: 'techcorp.com',
- description: 'An AI company specializing in...'
+ text: 'Ekohe (ee-koh-hee) means "boundless possibility." Our way is to make AI practical, achievable, and most importantly, useful for you — and we prove it every day. With almost 16 years of wins under our belt, a market-leading 24-hr design & development cycle, and 5 offices in the most vibrant cities in the world, we surf the seas of innovation. We create efficient, elegant, and scalable digital products — delivering the right interactive solutions to achieve your audience and business goals. We help you transform. We break new ground across the globe — from AI and ML automation that drives the enterprise, to innovative customer experiences and mobile apps for startups. Our special sauce is the care, curiosity, and dedication we offer to solve for your needs. We focus on your success and deliver the most impactful experiences in the most efficient manner. Our clients tell us we partner with them in a trusted and capable way, driving the right design and technical choices.',
+ max_length: '20 words',
+ style: 'professional and engaging',
+ focus_areas: ['core business', 'expertise', 'target market'],
+ audience: 'potential investors',
+ include_key_points: true,
+ output_format: 'paragraph'
  }
  )

@@ -67,7 +71,7 @@ response = LlmConductor.generate(
  if response.success?
  puts "Generated: #{response.output}"
  puts "Tokens: #{response.total_tokens}"
- puts "Cost: $#{response.estimated_cost}"
+ puts "Cost: $#{response.estimated_cost || 'N/A (free model)'}"
  else
  puts "Error: #{response.metadata[:error]}"
  end
@@ -94,6 +98,14 @@ LlmConductor.configure do |config|
  organization: ENV['OPENAI_ORG_ID'] # Optional
  )

+ config.anthropic(
+ api_key: ENV['ANTHROPIC_API_KEY']
+ )
+
+ config.gemini(
+ api_key: ENV['GEMINI_API_KEY']
+ )
+
  config.ollama(
  base_url: ENV['OLLAMA_ADDRESS'] || 'http://localhost:11434'
  )
@@ -106,6 +118,8 @@ The gem automatically detects these environment variables:

  - `OPENAI_API_KEY` - OpenAI API key
  - `OPENAI_ORG_ID` - OpenAI organization ID (optional)
+ - `ANTHROPIC_API_KEY` - Anthropic API key
+ - `GEMINI_API_KEY` - Google Gemini API key
  - `OLLAMA_ADDRESS` - Ollama server address

  ## Supported Providers & Models
@@ -118,7 +132,55 @@ response = LlmConductor.generate(
  )
  ```

- ### Ollama (Default for non-GPT models)
+ ### Anthropic Claude (Automatic for Claude models)
+ ```ruby
+ response = LlmConductor.generate(
+ model: 'claude-3-5-sonnet-20241022', # Auto-detects Anthropic
+ prompt: 'Your prompt here'
+ )
+
+ # Or explicitly specify vendor
+ response = LlmConductor.generate(
+ model: 'claude-3-5-sonnet-20241022',
+ vendor: :anthropic,
+ prompt: 'Your prompt here'
+ )
+ ```
+
+ ### Google Gemini (Automatic for Gemini models)
+ ```ruby
+ response = LlmConductor.generate(
+ model: 'gemini-2.5-flash', # Auto-detects Gemini
+ prompt: 'Your prompt here'
+ )
+
+ # Or explicitly specify vendor
+ response = LlmConductor.generate(
+ model: 'gemini-2.5-flash',
+ vendor: :gemini,
+ prompt: 'Your prompt here'
+ )
+ ```
+
+ ### Ollama (Default for non-GPT/Claude/Gemini models)
+ ```ruby
+ response = LlmConductor.generate(
+ model: 'llama3.2', # Auto-detects Ollama for non-GPT/Claude/Gemini models
+ prompt: 'Your prompt here'
+ )
+ ```
+
+ ### Vendor Detection
+
+ The gem automatically detects the appropriate provider based on model names:
+
+ - **OpenAI**: Models starting with `gpt-` (e.g., `gpt-4`, `gpt-3.5-turbo`)
+ - **Anthropic**: Models starting with `claude-` (e.g., `claude-3-5-sonnet-20241022`)
+ - **Google Gemini**: Models starting with `gemini-` (e.g., `gemini-2.5-flash`, `gemini-2.0-flash`)
+ - **Ollama**: All other models (e.g., `llama3.2`, `mistral`, `codellama`)
+
+ You can also explicitly specify the vendor:
+
  ```ruby
  response = LlmConductor.generate(
  model: 'llama3.2', # Auto-detects Ollama for non-GPT models
@@ -159,8 +221,8 @@ response = LlmConductor.generate(
  model: 'gpt-5-mini',
  type: :detailed_analysis,
  data: {
- name: 'TechCorp',
- domain_name: 'techcorp.com',
+ name: 'Ekohe',
+ domain_name: 'ekohe.com',
  description: 'A leading AI company...'
  }
  )
@@ -385,7 +447,7 @@ bin/console

  ## Testing

- The gem includes comprehensive test coverage with unit, integration, and performance tests. See `spec/TESTING_GUIDE.md` for detailed testing information.
+ The gem includes comprehensive test coverage with unit, integration, and performance tests.

  ## Performance

@@ -407,7 +469,3 @@ Bug reports and pull requests are welcome on GitHub at https://github.com/ekohe/
  ## License

  The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
-
- ## Code of Conduct
-
- This project is intended to be a safe, welcoming space for collaboration. Contributors are expected to adhere to the [code of conduct](https://github.com/ekohe/llm_conductor/blob/main/CODE_OF_CONDUCT.md).
data/examples/gemini_usage.rb ADDED
@@ -0,0 +1,18 @@
+ # frozen_string_literal: true
+
+ require_relative '../lib/llm_conductor'
+
+ # Configure Gemini API key
+ LlmConductor.configure do |config|
+ config.gemini_api_key = ENV['GEMINI_API_KEY'] || 'your_gemini_api_key_here'
+ end
+
+ # Example usage
+ response = LlmConductor.generate(
+ model: 'gemini-2.5-flash',
+ prompt: 'Explain how AI works in a few words'
+ )
+
+ puts "Model: #{response.model}"
+ puts "Output: #{response.output}"
+ puts "Vendor: #{response.metadata[:vendor]}"
data/lib/llm_conductor/client_factory.rb CHANGED
@@ -5,26 +5,37 @@ module LlmConductor
  class ClientFactory
  def self.build(model:, type:, vendor: nil)
  vendor ||= determine_vendor(model)
+ client_class = client_class_for_vendor(vendor)
+ client_class.new(model:, type:)
+ end

- client_class = case vendor
- when :openai, :gpt
- Clients::GptClient
- when :openrouter
- Clients::OpenrouterClient
- when :ollama
- Clients::OllamaClient
- else
- raise ArgumentError,
- "Unsupported vendor: #{vendor}. Supported vendors: openai, openrouter, ollama"
- end
+ def self.client_class_for_vendor(vendor)
+ client_classes = {
+ anthropic: Clients::AnthropicClient,
+ claude: Clients::AnthropicClient,
+ openai: Clients::GptClient,
+ gpt: Clients::GptClient,
+ openrouter: Clients::OpenrouterClient,
+ ollama: Clients::OllamaClient,
+ gemini: Clients::GeminiClient,
+ google: Clients::GeminiClient
+ }

- client_class.new(model:, type:)
+ client_classes[vendor] || raise(
+ ArgumentError,
+ "Unsupported vendor: #{vendor}. " \
+ 'Supported vendors: anthropic, openai, openrouter, ollama, gemini'
+ )
  end

  def self.determine_vendor(model)
  case model
+ when /^claude/i
+ :anthropic
  when /^gpt/i
  :openai
+ when /^gemini/i
+ :gemini
  else
  :ollama # Default to Ollama for non-specific model names
  end
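The prefix-based vendor detection in the hunk above can be sketched as a standalone snippet (a minimal reimplementation for illustration only; the gem's real entry point is `LlmConductor::ClientFactory.determine_vendor`):

```ruby
# Minimal standalone sketch of the regex-prefix vendor detection shown
# in the diff above (illustrative; not the gem's actual module).
def determine_vendor(model)
  case model
  when /^claude/i then :anthropic # e.g. claude-3-5-sonnet-20241022
  when /^gpt/i    then :openai    # e.g. gpt-5-mini
  when /^gemini/i then :gemini    # e.g. gemini-2.5-flash
  else :ollama                    # fallback for llama3.2, mistral, ...
  end
end

puts determine_vendor('claude-3-5-sonnet-20241022') # => anthropic
```

Note the `/i` flag: detection is case-insensitive, so `GPT-4` and `gpt-4` both map to `:openai`.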
data/lib/llm_conductor/clients/anthropic_client.rb ADDED
@@ -0,0 +1,31 @@
+ # frozen_string_literal: true
+
+ require 'anthropic'
+
+ module LlmConductor
+ module Clients
+ # Anthropic Claude client implementation for accessing Claude models via Anthropic API
+ class AnthropicClient < BaseClient
+ private
+
+ def generate_content(prompt)
+ response = client.messages.create(
+ model:,
+ max_tokens: 4096,
+ messages: [{ role: 'user', content: prompt }]
+ )
+
+ response.content.first.text
+ rescue Anthropic::Errors::APIError => e
+ raise StandardError, "Anthropic API error: #{e.message}"
+ end
+
+ def client
+ @client ||= begin
+ config = LlmConductor.configuration.provider_config(:anthropic)
+ Anthropic::Client.new(api_key: config[:api_key])
+ end
+ end
+ end
+ end
+ end
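The client above wraps provider-specific exceptions as `StandardError` with a prefixed message. That pattern can be sketched without the `anthropic` gem (here `FakeAPIError` is a hypothetical stand-in for `Anthropic::Errors::APIError`):

```ruby
# Sketch of the error-wrapping pattern used in AnthropicClient above:
# a provider-specific exception is caught and re-raised as a plain
# StandardError with a vendor-prefixed message, so callers only need
# one rescue clause. FakeAPIError stands in for the real SDK error.
class FakeAPIError < StandardError; end

def generate_content
  raise FakeAPIError, 'overloaded'
rescue FakeAPIError => e
  raise StandardError, "Anthropic API error: #{e.message}"
end

begin
  generate_content
rescue StandardError => e
  puts e.message # => Anthropic API error: overloaded
end
```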
data/lib/llm_conductor/clients/base_client.rb CHANGED
@@ -90,7 +90,7 @@ module LlmConductor
  def build_metadata
  {
  vendor: self.class.name.split('::').last.gsub('Client', '').downcase.to_sym,
- timestamp: Time.zone.now.iso8601
+ timestamp: Time.now.utc.iso8601
  }
  end
  end
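The timestamp change above swaps ActiveSupport's `Time.zone.now` (which requires a configured time zone) for plain Ruby:

```ruby
require 'time' # adds Time#iso8601 from the stdlib

# Plain-Ruby timestamp as used in build_metadata above; unlike
# Time.zone.now, Time.now.utc works without any ActiveSupport
# time-zone configuration and always yields a "Z"-suffixed string.
timestamp = Time.now.utc.iso8601
puts timestamp
```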
data/lib/llm_conductor/clients/gemini_client.rb ADDED
@@ -0,0 +1,36 @@
+ # frozen_string_literal: true
+
+ require 'gemini-ai'
+
+ module LlmConductor
+ module Clients
+ # Google Gemini client implementation for accessing Gemini models via Google AI API
+ class GeminiClient < BaseClient
+ private
+
+ def generate_content(prompt)
+ payload = {
+ contents: [
+ { parts: [{ text: prompt }] }
+ ]
+ }
+
+ response = client.generate_content(payload)
+ response.dig('candidates', 0, 'content', 'parts', 0, 'text')
+ end
+
+ def client
+ @client ||= begin
+ config = LlmConductor.configuration.provider_config(:gemini)
+ Gemini.new(
+ credentials: {
+ service: 'generative-language-api',
+ api_key: config[:api_key]
+ },
+ options: { model: }
+ )
+ end
+ end
+ end
+ end
+ end
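The response extraction above relies on `Hash#dig`, which walks nested hashes and arrays and returns `nil` if any step is missing. A sketch against a hypothetical Gemini-style response hash (the real one comes from the `gemini-ai` gem):

```ruby
# Sketch of the Hash#dig extraction in GeminiClient#generate_content,
# run against a hand-built sample of the candidates structure.
response = {
  'candidates' => [
    { 'content' => { 'parts' => [{ 'text' => 'AI learns patterns from data.' }] } }
  ]
}

text = response.dig('candidates', 0, 'content', 'parts', 0, 'text')
puts text # => AI learns patterns from data.

# dig never raises on a missing path; it just returns nil:
missing = response.dig('candidates', 1, 'content')
```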
data/lib/llm_conductor/configuration.rb CHANGED
@@ -22,6 +22,14 @@ module LlmConductor
  setup_defaults_from_env
  end

+ # Configure Anthropic provider
+ def anthropic(api_key: nil, **options)
+ @providers[:anthropic] = {
+ api_key: api_key || ENV['ANTHROPIC_API_KEY'],
+ **options
+ }
+ end
+
  # Configure OpenAI provider
  def openai(api_key: nil, organization: nil, **options)
  @providers[:openai] = {
@@ -47,12 +55,28 @@ module LlmConductor
  }
  end

+ # Configure Google Gemini provider
+ def gemini(api_key: nil, **options)
+ @providers[:gemini] = {
+ api_key: api_key || ENV['GEMINI_API_KEY'],
+ **options
+ }
+ end
+
  # Get provider configuration
  def provider_config(provider)
  @providers[provider.to_sym] || {}
  end

  # Legacy compatibility methods
+ def anthropic_api_key
+ provider_config(:anthropic)[:api_key]
+ end
+
+ def anthropic_api_key=(value)
+ anthropic(api_key: value)
+ end
+
  def openai_api_key
  provider_config(:openai)[:api_key]
  end
@@ -77,12 +101,22 @@ module LlmConductor
  ollama(base_url: value)
  end

+ def gemini_api_key
+ provider_config(:gemini)[:api_key]
+ end
+
+ def gemini_api_key=(value)
+ gemini(api_key: value)
+ end
+
  private

  def setup_defaults_from_env
  # Auto-configure providers if environment variables are present
+ anthropic if ENV['ANTHROPIC_API_KEY']
  openai if ENV['OPENAI_API_KEY']
  openrouter if ENV['OPENROUTER_API_KEY']
+ gemini if ENV['GEMINI_API_KEY']
  ollama # Always configure Ollama with default URL
  end
  end
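The configuration changes above follow one pattern per provider: a setter method stores an options hash keyed by provider symbol, falling back to an environment variable, with legacy accessors reading back through `provider_config`. A minimal sketch (`MiniConfiguration` is a hypothetical stand-in for `LlmConductor::Configuration`, showing only the `:anthropic` branch):

```ruby
# Sketch of the provider-registry pattern shown in the diff above.
class MiniConfiguration
  def initialize
    @providers = {}
  end

  # Provider setter: explicit api_key wins, ENV is the fallback.
  def anthropic(api_key: nil, **options)
    @providers[:anthropic] = { api_key: api_key || ENV['ANTHROPIC_API_KEY'], **options }
  end

  # Unconfigured providers read back as an empty hash, never nil.
  def provider_config(provider)
    @providers[provider.to_sym] || {}
  end

  # Legacy-style flat accessor, mirroring anthropic_api_key above.
  def anthropic_api_key
    provider_config(:anthropic)[:api_key]
  end
end

config = MiniConfiguration.new
config.anthropic(api_key: 'sk-test')
puts config.anthropic_api_key # => sk-test
```

Returning `{}` for unknown providers keeps client code like `config[:api_key]` safe without nil checks.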
data/lib/llm_conductor/prompts.rb CHANGED
@@ -1,124 +1,140 @@
  # frozen_string_literal: true

  module LlmConductor
- # Collection of pre-built prompt templates for common LLM tasks including
- # content analysis, link extraction, and data summarization.
+ # Collection of general-purpose prompt templates for common LLM tasks
  module Prompts
- def prompt_featured_links(data)
+ # General prompt for extracting links from HTML content
+ # More flexible and applicable to various use cases
+ def prompt_extract_links(data)
+ criteria = data[:criteria] || 'relevant and useful'
+ max_links = data[:max_links] || 10
+ link_types = data[:link_types] || %w[navigation content footer]
+
  <<~PROMPT
- You are an AI assistant tasked with analyzing a webpage's HTML content to extract the most valuable links. Your goal is to identify links related to features, products, solutions, pricing, and social media profiles, prioritizing those from the same domain as the current page. Here are your instructions:
+ Analyze the provided HTML content and extract links based on the specified criteria.
+
+ HTML Content:
+ #{data[:html_content] || data[:htmls]}
+
+ Extraction Criteria: #{criteria}
+ Maximum Links: #{max_links}
+ Link Types to Consider: #{link_types.join(', ')}

- - You will be provided with the HTML content of the current page in the following format:
- <page_html>
- #{data[:htmls]}
- </page_html>
+ #{"Domain Filter: Only include links from domain #{data[:domain_filter]}" if data[:domain_filter]}

- - Parse the HTML content and extract all hyperlinks (a href attributes). Pay special attention to links in the navigation menu, footer, and main content areas.
+ Instructions:
+ 1. Parse the HTML content and identify all hyperlinks
+ 2. Filter links based on the provided criteria
+ 3. Prioritize links from specified areas: #{link_types.join(', ')}
+ 4. Return up to #{max_links} most relevant links
+ #{if data[:format] == :json
+ '5. Format output as a JSON array of URLs'
+ else
+ '5. Format output as a newline-separated list of URLs'
+ end}

- - Filter and prioritize the extracted links based on the following criteria:
- a. The link must be from the same domain as the current URL.
- b. Prioritize links containing keywords such as "features", "products", "solutions", "pricing", "about", "contact", or similar variations.
- c. Include social media profile links (e.g., LinkedIn, Instagram, Twitter, Facebook) if available.
- d. Exclude links to login pages, search pages, or other utility pages.
+ Provide only the links without additional commentary.
+ PROMPT
+ end

- - Select the top 3 most valuable links based on the above criteria.
+ # General prompt for content analysis and data extraction
+ # Flexible template for various content analysis tasks
+ def prompt_analyze_content(data)
+ content_type = data[:content_type] || 'webpage content'
+ analysis_fields = data[:fields] || %w[summary key_points entities]
+ output_format = data[:output_format] || 'structured text'

- - Format your output as a JSON array of strings, where each string is a full URL. Use the following format:
- <output_format>
- ["https://example.com/about-us", "https://example.com/products", "https://example.com/services"]
- </output_format>
+ <<~PROMPT
+ Analyze the provided #{content_type} and extract the requested information.

- - The links must be the same domain of following
- <domain>
- #{data[:current_url]}
- </domain>
+ Content:
+ #{data[:content] || data[:htmls] || data[:text]}

- If fewer than 3 relevant links are found, include only the available links in the output array.
+ Analysis Fields:
+ #{analysis_fields.map { |field| "- #{field}" }.join("\n")}

- Remember to use the full URL for each link, including the domain name. If you encounter relative URLs, combine them with the domain from the current URL to create absolute URLs.
+ #{"Additional Instructions:\n#{data[:instructions]}" if data[:instructions]}

- Provide your final output without any additional explanation or commentary.
+ #{if output_format == 'json'
+ json_structure = analysis_fields.map { |field| "  \"#{field}\": \"value or array\"" }.join(",\n")
+ "Output Format: JSON with the following structure:\n{\n#{json_structure}\n}"
+ else
+ "Output Format: #{output_format}"
+ end}
+
+ #{"Constraints:\n#{data[:constraints]}" if data[:constraints]}
+
+ Provide a comprehensive analysis focusing on the requested fields.
  PROMPT
  end

- def prompt_summarize_htmls(data)
+ # General prompt for text summarization
+ # Applicable to various types of text content
+ def prompt_summarize_text(data)
+ max_length = data[:max_length] || '200 words'
+ focus_areas = data[:focus_areas] || []
+ style = data[:style] || 'concise and informative'
+
  <<~PROMPT
- Extract useful information from the webpage including a domain, detailed description of what the company does, founding year, country, business model, product description and features, customers and partners, development stage, and social media links. output will be json
-
- You are tasked with extracting useful information about a company from a given webpage content. Your goal is to analyze the content and extract specific details about the company, its products, and its operations.
-
- You will be provided with raw HTML content in the following format:
-
- <html_content>
- #{data[:htmls]}
- </html_content>
-
- Carefully read through the webpage content and extract the following information about the company:
-
- - Name(field name): The company's name
- - Domain name(field domain_name): The company's domain
- - Description(field description): A comprehensive explanation of what the company does
- - Country(field country): The company's country
- - Region(field region): The company's region
- - Location(field location): The company's location
- - Founding on(field founded_on): Which year the company was established
- - Business model(field business_model): How the company generates revenue
- - Product description(product_description): A brief overview of the company's main product(s) or service(s)
- - Product features(product_features): Key features or capabilities of the product(s) or service(s)
- - Customers and partners(field customers_and_partners): Notable clients or business partners
- - Development stage(field development_stage): The current phase of the company (e.g., startup, growth, established)
- - Social media links(field social_media_links): URLs to the company's social media profiles
- - instagram_url
- - linkedin_url
- - twitter_url
-
- If any of the above information is not available in the webpage content, use "Not available" as the value for that field.
-
- Present your findings in a JSON format. Here's an example of the expected structure:
-
- <output_format>
- {
- "name": "AI-powered customer service",
- "domain_name": "example.com",
- "description": "XYZ Company develops AI chatbots that help businesses automate customer support...",
- "founding_on": 2018,
- "country": "United States",
- "Region": "SA",
- "Location": "SFO",
- "business_model": "SaaS subscription",
- "product_description": "AI-powered chatbot platform for customer service automation",
- "product_features": ["Natural language processing", "Multi-language support", "Integration with CRM systems"],
- "customers_and_partners": ["ABC Corp", "123 Industries", "Big Tech Co."],
- "development_stage": "Growth",
- "social_media_links": {
- "linkedin_url": "https://www.linkedin.com/company/xyzcompany",
- "twitter_url": "https://twitter.com/xyzcompany",
- "instagram_url": "https://www.instagram.com/xyzcompany"
- }
- }
- </output_format>
-
- Remember to use only the information provided in the webpage content. Do not include any external information or make assumptions beyond what is explicitly stated or strongly implied in the given content.
-
- Present your final output in JSON format, enclosed within <json_output> tags.
+ Summarize the following text content.
+
+ Text:
+ #{data[:text] || data[:content] || data[:description]}
+
+ Summary Requirements:
+ - Maximum Length: #{max_length}
+ - Style: #{style}
+ #{"- Focus Areas: #{focus_areas.join(', ')}" if focus_areas.any?}
+ #{"- Target Audience: #{data[:audience]}" if data[:audience]}
+
+ #{'Include key points and main themes.' if data[:include_key_points]}
+
+ #{if data[:output_format] == 'bullet_points'
+ 'Format as bullet points.'
+ elsif data[:output_format] == 'paragraph'
+ 'Format as a single paragraph.'
+ end}
+
+ Provide a clear and accurate summary.
  PROMPT
  end

- def prompt_summarize_description(data)
+ # General prompt for data classification and categorization
+ # Useful for various classification tasks
+ def prompt_classify_content(data)
+ categories = data[:categories] || []
+ classification_type = data[:classification_type] || 'content'
+ confidence_scores = data[:include_confidence] || false
+
  <<~PROMPT
- Given the company's name, domain, description, and a list of industry-related keywords,
- please summarize the company's core business and identify the three most relevant industries.
- Highlight the company's unique value proposition, its primary market focus,
- and any distinguishing features that set it apart within the identified industries.
- Be as objective as possible.
-
- Name: #{data[:name]}
- Domain Name: #{data[:domain_name]}
- Industry: #{data[:industries]}
- Description: #{data[:description]}
+ Classify the provided #{classification_type} into the most appropriate category.
+
+ Content to Classify:
+ #{data[:content] || data[:text] || data[:description]}
+
+ Available Categories:
+ #{categories.map.with_index(1) { |cat, i| "#{i}. #{cat}" }.join("\n")}
+
+ #{"Classification Criteria:\n#{data[:classification_criteria]}" if data[:classification_criteria]}
+
+ #{if confidence_scores
+ 'Output Format: JSON with category and confidence score (0-1)'
+ else
+ 'Output Format: Return the most appropriate category name'
+ end}
+
+ #{if data[:multiple_categories]
+ "Note: Multiple categories may apply - select up to #{data[:max_categories] || 3} most relevant."
+ else
+ 'Note: Select only the single most appropriate category.'
+ end}
+
+ Provide your classification based on the content analysis.
  PROMPT
  end

+ # Flexible custom prompt template
+ # Allows for dynamic prompt creation with variable substitution
  def prompt_custom(data)
  template = data.fetch(:template)
  template % data
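The `prompt_custom` template above is plain Ruby format-string interpolation: `String#%` with a hash fills `%{key}` placeholders, ignoring extra keys (such as `:template` itself) and raising `KeyError` for missing ones. The sample template and values below are illustrative:

```ruby
# Sketch of the prompt_custom mechanism: template % data, where the
# template and its substitution values live in the same hash.
data = {
  template: 'Summarize the following in %{max_length}: %{text}',
  max_length: '20 words',
  text: 'Ekohe builds practical AI products.'
}

prompt = data.fetch(:template) % data
puts prompt
```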
data/lib/llm_conductor/version.rb CHANGED
@@ -1,5 +1,5 @@
  # frozen_string_literal: true

  module LlmConductor
- VERSION = '0.1.0'
+ VERSION = '1.0.0'
  end
data/lib/llm_conductor.rb CHANGED
@@ -8,14 +8,16 @@ require_relative 'llm_conductor/prompts'
  require_relative 'llm_conductor/prompts/base_prompt'
  require_relative 'llm_conductor/prompt_manager'
  require_relative 'llm_conductor/clients/base_client'
+ require_relative 'llm_conductor/clients/anthropic_client'
  require_relative 'llm_conductor/clients/gpt_client'
  require_relative 'llm_conductor/clients/ollama_client'
  require_relative 'llm_conductor/clients/openrouter_client'
+ require_relative 'llm_conductor/clients/gemini_client'
  require_relative 'llm_conductor/client_factory'

  # LLM Conductor provides a unified interface for multiple Language Model providers
- # including OpenAI GPT, OpenRouter, and Ollama with built-in prompt templates,
- # token counting, and extensible client architecture.
+ # including OpenAI GPT, Anthropic Claude, Google Gemini, OpenRouter, and Ollama
+ # with built-in prompt templates, token counting, and extensible client architecture.
  module LlmConductor
  class Error < StandardError; end

@@ -54,23 +56,28 @@ module LlmConductor

  def client_class_for_vendor(vendor)
  case vendor
+ when :anthropic, :claude then Clients::AnthropicClient
  when :openai, :gpt then Clients::GptClient
  when :openrouter then Clients::OpenrouterClient
  when :ollama then Clients::OllamaClient
+ when :gemini, :google then Clients::GeminiClient
  else
- raise ArgumentError, "Unsupported vendor: #{vendor}. Supported vendors: openai, openrouter, ollama"
+ raise ArgumentError,
+ "Unsupported vendor: #{vendor}. " \
+ 'Supported vendors: anthropic, openai, openrouter, ollama, gemini'
  end
  end
  end

  # List of supported vendors
- SUPPORTED_VENDORS = %i[openai openrouter ollama].freeze
+ SUPPORTED_VENDORS = %i[anthropic openai openrouter ollama gemini].freeze

  # List of supported prompt types
  SUPPORTED_PROMPT_TYPES = %i[
- featured_links
- summarize_htmls
- summarize_description
+ extract_links
+ analyze_content
+ summarize_text
+ classify_content
  custom
  ].freeze
  end
metadata CHANGED
@@ -1,13 +1,13 @@
  --- !ruby/object:Gem::Specification
  name: llm_conductor
  version: !ruby/object:Gem::Version
- version: 0.1.0
+ version: 1.0.0
  platform: ruby
  authors:
  - Ben Zheng
  bindir: exe
  cert_chain: []
- date: 2025-09-19 00:00:00.000000000 Z
+ date: 2025-09-26 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  name: activesupport
@@ -23,6 +23,34 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '6.0'
+ - !ruby/object:Gem::Dependency
+ name: anthropic
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.7'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.7'
+ - !ruby/object:Gem::Dependency
+ name: gemini-ai
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '4.3'
+ type: :runtime
+ prerelease: false
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '4.3'
  - !ruby/object:Gem::Dependency
  name: ollama-ai
  requirement: !ruby/object:Gem::Requirement
@@ -111,12 +139,15 @@ files:
  - Rakefile
  - config/initializers/llm_conductor.rb
  - examples/data_builder_usage.rb
+ - examples/gemini_usage.rb
  - examples/prompt_registration.rb
  - examples/rag_usage.rb
  - examples/simple_usage.rb
  - lib/llm_conductor.rb
  - lib/llm_conductor/client_factory.rb
+ - lib/llm_conductor/clients/anthropic_client.rb
  - lib/llm_conductor/clients/base_client.rb
+ - lib/llm_conductor/clients/gemini_client.rb
  - lib/llm_conductor/clients/gpt_client.rb
  - lib/llm_conductor/clients/ollama_client.rb
  - lib/llm_conductor/clients/openrouter_client.rb