llm_gateway 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+   metadata.gz: 912c28c117bad46986f56b7571497ff276cb3ac5b9e39b442c7b29aeb50692e6
+   data.tar.gz: 2a087a8344992f50e0a4258d6aecbbeb7b5670cba62cfefed7c8fe054ed98c40
+ SHA512:
+   metadata.gz: 541fb18a405c58505913a7b5dc57fe109dacec72644224cdfb9746e5da290ca93cab2670731ae41f865cc79a20f28d6ee5dab3ea6690ab2161f82fe82b3c81f2
+   data.tar.gz: 2d6b3294fb5140e7e8b3754ae4c16bd983a133db5a3dcf61fd1adc0f07eed18003f4be724e2b90d98ca98f0aa78678d87648d434893836b52934e7da4fb7a979
data/.rubocop.yml ADDED
@@ -0,0 +1,5 @@
+ inherit_gem:
+   rubocop-rails-omakase: rubocop.yml
+
+ AllCops:
+   TargetRubyVersion: 3.1
data/CHANGELOG.md ADDED
@@ -0,0 +1,5 @@
+ ## [Unreleased]
+
+ ## [0.1.0] - 2025-08-04
+
+ - Initial release
data/CODE_OF_CONDUCT.md ADDED
@@ -0,0 +1,132 @@
+ # Contributor Covenant Code of Conduct
+
+ ## Our Pledge
+
+ We as members, contributors, and leaders pledge to make participation in our
+ community a harassment-free experience for everyone, regardless of age, body
+ size, visible or invisible disability, ethnicity, sex characteristics, gender
+ identity and expression, level of experience, education, socio-economic status,
+ nationality, personal appearance, race, caste, color, religion, or sexual
+ identity and orientation.
+
+ We pledge to act and interact in ways that contribute to an open, welcoming,
+ diverse, inclusive, and healthy community.
+
+ ## Our Standards
+
+ Examples of behavior that contributes to a positive environment for our
+ community include:
+
+ * Demonstrating empathy and kindness toward other people
+ * Being respectful of differing opinions, viewpoints, and experiences
+ * Giving and gracefully accepting constructive feedback
+ * Accepting responsibility and apologizing to those affected by our mistakes,
+   and learning from the experience
+ * Focusing on what is best not just for us as individuals, but for the overall
+   community
+
+ Examples of unacceptable behavior include:
+
+ * The use of sexualized language or imagery, and sexual attention or advances of
+   any kind
+ * Trolling, insulting or derogatory comments, and personal or political attacks
+ * Public or private harassment
+ * Publishing others' private information, such as a physical or email address,
+   without their explicit permission
+ * Other conduct which could reasonably be considered inappropriate in a
+   professional setting
+
+ ## Enforcement Responsibilities
+
+ Community leaders are responsible for clarifying and enforcing our standards of
+ acceptable behavior and will take appropriate and fair corrective action in
+ response to any behavior that they deem inappropriate, threatening, offensive,
+ or harmful.
+
+ Community leaders have the right and responsibility to remove, edit, or reject
+ comments, commits, code, wiki edits, issues, and other contributions that are
+ not aligned to this Code of Conduct, and will communicate reasons for moderation
+ decisions when appropriate.
+
+ ## Scope
+
+ This Code of Conduct applies within all community spaces, and also applies when
+ an individual is officially representing the community in public spaces.
+ Examples of representing our community include using an official email address,
+ posting via an official social media account, or acting as an appointed
+ representative at an online or offline event.
+
+ ## Enforcement
+
+ Instances of abusive, harassing, or otherwise unacceptable behavior may be
+ reported to the community leaders responsible for enforcement at
+ [INSERT CONTACT METHOD].
+ All complaints will be reviewed and investigated promptly and fairly.
+
+ All community leaders are obligated to respect the privacy and security of the
+ reporter of any incident.
+
+ ## Enforcement Guidelines
+
+ Community leaders will follow these Community Impact Guidelines in determining
+ the consequences for any action they deem in violation of this Code of Conduct:
+
+ ### 1. Correction
+
+ **Community Impact**: Use of inappropriate language or other behavior deemed
+ unprofessional or unwelcome in the community.
+
+ **Consequence**: A private, written warning from community leaders, providing
+ clarity around the nature of the violation and an explanation of why the
+ behavior was inappropriate. A public apology may be requested.
+
+ ### 2. Warning
+
+ **Community Impact**: A violation through a single incident or series of
+ actions.
+
+ **Consequence**: A warning with consequences for continued behavior. No
+ interaction with the people involved, including unsolicited interaction with
+ those enforcing the Code of Conduct, for a specified period of time. This
+ includes avoiding interactions in community spaces as well as external channels
+ like social media. Violating these terms may lead to a temporary or permanent
+ ban.
+
+ ### 3. Temporary Ban
+
+ **Community Impact**: A serious violation of community standards, including
+ sustained inappropriate behavior.
+
+ **Consequence**: A temporary ban from any sort of interaction or public
+ communication with the community for a specified period of time. No public or
+ private interaction with the people involved, including unsolicited interaction
+ with those enforcing the Code of Conduct, is allowed during this period.
+ Violating these terms may lead to a permanent ban.
+
+ ### 4. Permanent Ban
+
+ **Community Impact**: Demonstrating a pattern of violation of community
+ standards, including sustained inappropriate behavior, harassment of an
+ individual, or aggression toward or disparagement of classes of individuals.
+
+ **Consequence**: A permanent ban from any sort of public interaction within the
+ community.
+
+ ## Attribution
+
+ This Code of Conduct is adapted from the [Contributor Covenant][homepage],
+ version 2.1, available at
+ [https://www.contributor-covenant.org/version/2/1/code_of_conduct.html][v2.1].
+
+ Community Impact Guidelines were inspired by
+ [Mozilla's code of conduct enforcement ladder][Mozilla CoC].
+
+ For answers to common questions about this code of conduct, see the FAQ at
+ [https://www.contributor-covenant.org/faq][FAQ]. Translations are available at
+ [https://www.contributor-covenant.org/translations][translations].
+
+ [homepage]: https://www.contributor-covenant.org
+ [v2.1]: https://www.contributor-covenant.org/version/2/1/code_of_conduct.html
+ [Mozilla CoC]: https://github.com/mozilla/diversity
+ [FAQ]: https://www.contributor-covenant.org/faq
+ [translations]: https://www.contributor-covenant.org/translations
data/LICENSE.txt ADDED
@@ -0,0 +1,21 @@
+ The MIT License (MIT)
+
+ Copyright (c) 2025 billybonks
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in
+ all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ THE SOFTWARE.
data/README.md ADDED
@@ -0,0 +1,128 @@
+ # LlmGateway
+
+ Provides the nuts and bolts for working with LLM APIs. The goal is to offer a unified interface across multiple LLM provider APIs while letting developers keep as much control as they want.
+
+ You can use the clients directly, or use the gateway for interoperability between providers.
+
+ ## Supported Providers
+
+ Anthropic, OpenAI, Groq
+
+ ## Installation
+
+ Add the gem to your application's Gemfile:
+
+ ```bash
+ bundle add llm_gateway
+ ```
+
+ Or install it yourself:
+
+ ```bash
+ gem install llm_gateway
+ ```
+
+ ## Usage
+
+ ### Basic Chat
+
+ ```ruby
+ require 'llm_gateway'
+
+ # Simple text completion
+ result = LlmGateway::Client.chat(
+   'claude-sonnet-4-20250514',
+   'What is the capital of France?'
+ )
+
+ # With a system message
+ result = LlmGateway::Client.chat(
+   'gpt-4',
+   'What is the capital of France?',
+   system: 'You are a helpful geography teacher.'
+ )
+ ```
+
+ ### Tool Usage (Function Calling)
+
+ ```ruby
+ # Define a tool
+ weather_tool = {
+   name: 'get_weather',
+   description: 'Get current weather for a location',
+   input_schema: {
+     type: 'object',
+     properties: {
+       location: { type: 'string', description: 'City name' }
+     },
+     required: ['location']
+   }
+ }
+
+ # Use the tool
+ result = LlmGateway::Client.chat(
+   'claude-sonnet-4-20250514',
+   "What's the weather in Singapore?",
+   tools: [weather_tool],
+   system: 'You are a helpful weather assistant.'
+ )
+ ```
+
+ ### Response Format
+
+ All providers return responses in a consistent format:
+
+ ```ruby
+ {
+   choices: [
+     {
+       content: [
+         { type: 'text', text: 'The capital of France is Paris.' }
+       ],
+       finish_reason: 'end_turn',
+       role: 'assistant'
+     }
+   ],
+   usage: {
+     input_tokens: 15,
+     output_tokens: 8,
+     total_tokens: 23
+   },
+   model: 'claude-sonnet-4-20250514',
+   id: 'msg_abc123'
+ }
+ ```
+
+ ### Error Handling
+
+ LlmGateway provides consistent error handling across all providers:
+
+ ```ruby
+ begin
+   result = LlmGateway::Client.chat('invalid-model', 'Hello')
+ rescue LlmGateway::Errors::UnsupportedModel => e
+   puts "Unsupported model: #{e.message}"
+ rescue LlmGateway::Errors::AuthenticationError => e
+   puts "Authentication failed: #{e.message}"
+ rescue LlmGateway::Errors::RateLimitError => e
+   puts "Rate limit exceeded: #{e.message}"
+ end
+ ```
+
+ ## Development
+
+ After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake test` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
+
+ To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and the created tag, and push the `.gem` file to [rubygems.org](https://rubygems.org).
+
+ ## Contributing
+
+ Bug reports and pull requests are welcome on GitHub at https://github.com/Hyper-Unearthing/llm_gateway. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [code of conduct](https://github.com/Hyper-Unearthing/llm_gateway/blob/master/CODE_OF_CONDUCT.md).
+
+ ## License
+
+ The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
+
+ ## Code of Conduct
+
+ Everyone interacting in the LlmGateway project's codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/Hyper-Unearthing/llm_gateway/blob/master/CODE_OF_CONDUCT.md).
data/Rakefile ADDED
@@ -0,0 +1,29 @@
+ # frozen_string_literal: true
+
+ require "bundler/gem_tasks"
+ require "minitest/test_task"
+
+ Minitest::TestTask.create
+
+ require "rubocop/rake_task"
+
+ RuboCop::RakeTask.new
+
+ begin
+   require "gem-release"
+
+   desc "Release with changelog"
+   task :release do
+     # Generate changelog first
+     sh "github_changelog_generator" # or your preferred changelog tool
+     sh "git add CHANGELOG.md"
+     sh "git commit -m 'Update changelog' || echo 'No changelog changes'"
+
+     # Release
+     sh "gem bump --version patch --tag --push --release"
+   end
+ rescue LoadError
+   # gem-release not available in this environment
+ end
+
+ task default: %i[test rubocop]
@@ -0,0 +1,56 @@
+ # frozen_string_literal: true
+
+ require_relative "../../base_client"
+
+ module LlmGateway
+   module Adapters
+     module Claude
+       class Client < BaseClient
+         def initialize(model_key: "claude-3-7-sonnet-20250219", api_key: ENV["ANTHROPIC_API_KEY"])
+           @base_endpoint = "https://api.anthropic.com/v1"
+           super(model_key: model_key, api_key: api_key)
+         end
+
+         def chat(messages, response_format: { type: "text" }, tools: nil, system: [], max_completion_tokens: 4096)
+           body = {
+             model: model_key,
+             max_tokens: max_completion_tokens,
+             messages: messages
+           }
+
+           body.merge!(tools: tools) if LlmGateway::Utils.present?(tools)
+           body.merge!(system: system) if LlmGateway::Utils.present?(system)
+
+           post("messages", body)
+         end
+
+         def download_file(file_id)
+           get("files/#{file_id}/content")
+         end
+
+         private
+
+         def build_headers
+           {
+             "anthropic-version" => "2023-06-01",
+             "content-type" => "application/json",
+             "x-api-key" => api_key,
+             "anthropic-beta" => "code-execution-2025-05-22,files-api-2025-04-14"
+           }
+         end
+
+         def handle_client_specific_errors(response, error)
+           case response.code.to_i
+           when 400
+             if error["message"]&.start_with?("prompt is too long")
+               raise Errors::PromptTooLong.new(error["message"], error["type"])
+             end
+           end
+
+           # If we get here, we didn't handle it specifically
+           raise Errors::APIStatusError.new(error["message"], error["type"])
+         end
+       end
+     end
+   end
+ end
@@ -0,0 +1,34 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   module Adapters
+     module Claude
+       class InputMapper
+         extend LlmGateway::FluentMapper
+
+         map :messages do |_, value|
+           value.map do |msg|
+             msg = msg.merge(role: "user") if msg[:role] == "developer"
+             msg.slice(:role, :content)
+           end
+         end
+
+         map :system do |_, value|
+           if value.empty?
+             nil
+           elsif value.length == 1 && value.first[:role] == "system"
+             # If we have a single system message, convert to Claude format
+             [{ type: "text", text: value.first[:content] }]
+           else
+             # For multiple messages or non-standard format, pass through
+             value
+           end
+         end
+
+         map :tools do |_, value|
+           value
+         end
+       end
+     end
+   end
+ end
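The `map :system` rule above is the heart of this adapter: the gateway's normalized system messages become Claude-style content blocks. As a standalone sketch of just that rule (the `convert_system` helper is hypothetical, not part of the gem's API):

```ruby
# Hypothetical standalone sketch mirroring Claude::InputMapper's map :system block.
def convert_system(messages)
  if messages.empty?
    nil # Claude's API omits the system field entirely
  elsif messages.length == 1 && messages.first[:role] == "system"
    # A single system message becomes a Claude-style text content block
    [{ type: "text", text: messages.first[:content] }]
  else
    messages # multiple or non-standard messages pass through unchanged
  end
end

puts convert_system([{ role: "system", content: "Be terse." }]).inspect
```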
@@ -0,0 +1,24 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   module Adapters
+     module Claude
+       class OutputMapper
+         extend LlmGateway::FluentMapper
+
+         map :id
+         map :model
+         map :usage
+         map :choices do |_, _|
+           # Claude returns content at the root of the response, not in a
+           # choices array, so construct the choices array from the full data
+           [{
+             content: @data[:content] || [],
+             finish_reason: @data[:stop_reason],
+             role: "assistant"
+           }]
+         end
+       end
+     end
+   end
+ end
@@ -0,0 +1,58 @@
+ # frozen_string_literal: true
+
+ require_relative "../../base_client"
+
+ module LlmGateway
+   module Adapters
+     module Groq
+       class Client < BaseClient
+         def initialize(model_key: "gemma2-9b-it", api_key: ENV["GROQ_API_KEY"])
+           @base_endpoint = "https://api.groq.com/openai/v1"
+           super(model_key: model_key, api_key: api_key)
+         end
+
+         def chat(messages, response_format: { type: "text" }, tools: nil, system: [], max_completion_tokens: 4096)
+           body = {
+             model: model_key,
+             messages: system + messages,
+             temperature: 0,
+             max_completion_tokens: max_completion_tokens,
+             response_format: response_format,
+             tools: tools
+           }
+
+           post("chat/completions", body)
+         end
+
+         private
+
+         def build_headers
+           {
+             "content-type" => "application/json",
+             "Authorization" => "Bearer #{api_key}"
+           }
+         end
+
+         def handle_client_specific_errors(response, error)
+           # Groq likely uses 'code' like OpenAI, since its API is OpenAI-compatible
+           error_code = error["code"]
+
+           case response.code.to_i
+           when 413
+             if error["message"]&.start_with?("Request too large")
+               raise Errors::PromptTooLong.new(error["message"], error["type"])
+             end
+           when 429
+             raise Errors::RateLimitError.new(error["type"], error_code) if error_code == "rate_limit_exceeded"
+
+             raise Errors::OverloadError.new(error["message"], error_code)
+           end
+
+           # If we get here, we didn't handle it specifically
+           raise Errors::APIStatusError.new(error["message"], error_code)
+         end
+       end
+     end
+   end
+ end
@@ -0,0 +1,78 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   module Adapters
+     module Groq
+       class InputMapper
+         extend LlmGateway::FluentMapper
+
+         map :system
+         map :response_format
+
+         mapper :tool_usage do
+           map :role, default: "assistant"
+           map :content do
+             nil
+           end
+           map :tool_calls, from: :content do |_, value|
+             value.map do |content|
+               {
+                 id: content[:id],
+                 type: "function",
+                 function: {
+                   name: content[:name],
+                   arguments: content[:input].to_json
+                 }
+               }
+             end
+           end
+         end
+
+         mapper :tool_result_message do
+           map :role, default: "tool"
+           map :tool_call_id, from: "tool_use_id"
+           map :content
+         end
+
+         map :messages do |_, value|
+           value.map do |msg|
+             if msg[:role] == "user"
+               msg
+             elsif msg[:content].is_a?(Array)
+               results = []
+               # Handle tool_use messages
+               tool_uses = msg[:content].select { |c| c[:type] == "tool_use" }
+               results << map_single(msg, with: :tool_usage) if tool_uses.any?
+               # Handle tool_result messages
+               tool_results = msg[:content].select { |c| c[:type] == "tool_result" }
+               tool_results.each do |content|
+                 results << map_single(content, with: :tool_result_message)
+               end
+
+               results
+             else
+               msg
+             end
+           end.flatten
+         end
+
+         map :tools do |_, value|
+           if value
+             value.map do |tool|
+               {
+                 type: "function",
+                 function: {
+                   name: tool[:name],
+                   description: tool[:description],
+                   parameters: tool[:input_schema]
+                 }
+               }
+             end
+           else
+             value
+           end
+         end
+       end
+     end
+   end
+ end
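The `map :tools` block above translates Anthropic-style tool definitions (with `input_schema`) into OpenAI-style function tools (with `parameters`), which is the shape Groq's OpenAI-compatible API expects. As a standalone sketch of that translation (the `translate_tools` helper is hypothetical, mirroring the block's logic):

```ruby
# Hypothetical standalone sketch of Groq::InputMapper's map :tools block:
# Anthropic-style tool definitions become OpenAI-style function tools.
def translate_tools(tools)
  return tools unless tools # nil passes through untouched

  tools.map do |tool|
    {
      type: "function",
      function: {
        name: tool[:name],
        description: tool[:description],
        parameters: tool[:input_schema] # input_schema is renamed to parameters
      }
    }
  end
end

weather = { name: "get_weather", description: "Get current weather", input_schema: { type: "object" } }
puts translate_tools([weather]).first[:function][:name] # get_weather
```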
@@ -0,0 +1,46 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   module Adapters
+     module Groq
+       class OutputMapper
+         extend LlmGateway::FluentMapper
+
+         mapper :tool_call do
+           map :id
+           map :type do
+             "tool_use" # Always return 'tool_use' regardless of input
+           end
+           map :name, from: "function.name"
+           map :input, from: "function.arguments" do |_, value|
+             value.is_a?(String) ? JSON.parse(value) : value
+           end
+         end
+
+         mapper :content_item do
+           map :text, from: "content"
+           map :type, default: "text"
+         end
+
+         map :id
+         map :model
+         map :usage
+         map :choices do |_, value|
+           value.map do |choice|
+             message = choice[:message] || {}
+             content_item = map_single(message, with: :content_item, default: {})
+             tool_calls = map_collection(message[:tool_calls], with: :tool_call, default: [])
+
+             # Only include content_item if it has actual text content
+             content_array = []
+             content_array << content_item if LlmGateway::Utils.present?(content_item["text"])
+             content_array += tool_calls
+
+             { content: content_array }
+           end
+         end
+       end
+     end
+   end
+ end
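The `:tool_call` mapper above is the inverse of the input-side translation: an OpenAI-style tool call (nested under `function`, with JSON-string arguments) is flattened into the gateway's Claude-like `tool_use` shape. A standalone sketch of that reshaping (the `normalize_tool_call` helper is hypothetical):

```ruby
require "json"

# Hypothetical standalone sketch of Groq::OutputMapper's :tool_call mapper:
# an OpenAI-style tool call is reshaped into the gateway's tool_use form.
def normalize_tool_call(call)
  args = call[:function][:arguments]
  {
    id: call[:id],
    type: "tool_use", # always tool_use on the way out
    name: call[:function][:name],
    input: args.is_a?(String) ? JSON.parse(args) : args
  }
end

call = { id: "call_1", function: { name: "get_weather", arguments: '{"location":"Singapore"}' } }
puts normalize_tool_call(call)[:input]["location"] # Singapore
```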
@@ -0,0 +1,59 @@
+ # frozen_string_literal: true
+
+ require_relative "../../base_client"
+
+ module LlmGateway
+   module Adapters
+     module OpenAi
+       class Client < BaseClient
+         def initialize(model_key: "gpt-4o", api_key: ENV["OPENAI_API_KEY"])
+           @base_endpoint = "https://api.openai.com/v1"
+           super(model_key: model_key, api_key: api_key)
+         end
+
+         def chat(messages, response_format: { type: "text" }, tools: nil, system: [], max_completion_tokens: 4096)
+           body = {
+             model: model_key,
+             messages: system + messages,
+             max_completion_tokens: max_completion_tokens
+           }
+           body[:tools] = tools if tools
+
+           post("chat/completions", body)
+         end
+
+         def generate_embeddings(input)
+           body = {
+             input:,
+             model: model_key
+           }
+           post("embeddings", body)
+         end
+
+         private
+
+         def build_headers
+           {
+             "content-type" => "application/json",
+             "Authorization" => "Bearer #{api_key}"
+           }
+         end
+
+         def handle_client_specific_errors(response, error)
+           # OpenAI uses 'code' instead of 'type' for error codes
+           error_code = error["code"]
+
+           case response.code.to_i
+           when 429
+             raise Errors::RateLimitError.new(error["message"], error_code)
+           when 503
+             raise Errors::OverloadError.new(error["message"], error_code)
+           end
+
+           # If we get here, we didn't handle it specifically
+           raise Errors::APIStatusError.new(error["message"], error_code)
+         end
+       end
+     end
+   end
+ end
@@ -0,0 +1,19 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   module Adapters
+     module OpenAi
+       class InputMapper < LlmGateway::Adapters::Groq::InputMapper
+         map :system do |_, value|
+           if value.empty?
+             []
+           else
+             value.map do |msg|
+               msg[:role] == "system" ? msg.merge(role: "developer") : msg
+             end
+           end
+         end
+       end
+     end
+   end
+ end
@@ -0,0 +1,10 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   module Adapters
+     module OpenAi
+       class OutputMapper < LlmGateway::Adapters::Groq::OutputMapper
+       end
+     end
+   end
+ end
@@ -0,0 +1,104 @@
+ # frozen_string_literal: true
+
+ require "net/http"
+ require "json"
+
+ module LlmGateway
+   class BaseClient
+     attr_reader :api_key, :model_key, :base_endpoint
+
+     def initialize(model_key:, api_key:)
+       @model_key = model_key
+       @api_key = api_key
+     end
+
+     def get(url_part, extra_headers = {})
+       endpoint = "#{base_endpoint}/#{url_part.sub(%r{^/}, "")}"
+       response = make_request(endpoint, Net::HTTP::Get, nil, extra_headers)
+       process_response(response)
+     end
+
+     def post(url_part, body = nil, extra_headers = {})
+       endpoint = "#{base_endpoint}/#{url_part.sub(%r{^/}, "")}"
+       response = make_request(endpoint, Net::HTTP::Post, body, extra_headers)
+       process_response(response)
+     end
+
+     protected
+
+     def make_request(endpoint, method, params = nil, extra_headers = {})
+       uri = URI(endpoint)
+       http = Net::HTTP.new(uri.host, uri.port)
+       http.use_ssl = true
+       http.read_timeout = 480
+       http.open_timeout = 10
+
+       request = method.new(uri)
+       headers = build_headers.merge(extra_headers)
+       headers.each { |key, value| request[key] = value }
+       request.body = params.to_json if params
+
+       http.request(request)
+     end
+
+     def process_response(response)
+       case response.code.to_i
+       when 200
+         content_type = response["content-type"]
+         if content_type&.include?("application/json")
+           LlmGateway::Utils.deep_symbolize_keys(JSON.parse(response.body))
+         else
+           response.body
+         end
+       else
+         handle_error(response)
+       end
+     end
+
+     def handle_error(response)
+       error_body = begin
+         JSON.parse(response.body)
+       rescue StandardError
+         {}
+       end
+       error = error_body["error"] || {}
+
+       # Try client-specific error handling first
+       begin
+         handle_client_specific_errors(response, error)
+       rescue Errors::APIStatusError => e
+         # If the client doesn't handle it, fall back to standard HTTP status
+         # codes, reusing the message and code already carried by the error
+         case response.code.to_i
+         when 400
+           raise Errors::BadRequestError.new(e.message, e.code)
+         when 401
+           raise Errors::AuthenticationError.new(e.message, e.code)
+         when 403
+           raise Errors::PermissionDeniedError.new(e.message, e.code)
+         when 404
+           raise Errors::NotFoundError.new(e.message, e.code)
+         when 409
+           raise Errors::ConflictError.new(e.message, e.code)
+         when 422
+           raise Errors::UnprocessableEntityError.new(e.message, e.code)
+         when 429
+           raise Errors::RateLimitError.new(e.message, e.code)
+         when 503
+           raise Errors::OverloadError.new(e.message, e.code)
+         when 500..599
+           raise Errors::InternalServerError.new(e.message, e.code)
+         else
+           raise e # Re-raise the original APIStatusError
+         end
+       end
+     end
+
+     def build_headers
+       raise NotImplementedError, "Subclasses must implement build_headers"
+     end
+
+     def handle_client_specific_errors(response, error); end
+   end
+ end
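The fallback branch of `BaseClient#handle_error` is a fixed mapping from HTTP status code to error class. As a standalone sketch of just that dispatch table, returning class names as strings so the snippet needs nothing from the gem (the `error_class_for` helper is hypothetical):

```ruby
# Hypothetical standalone sketch of BaseClient#handle_error's fallback mapping.
# The class names assume the LlmGateway::Errors hierarchy defined in errors.rb.
STATUS_ERRORS = {
  400 => "BadRequestError", 401 => "AuthenticationError",
  403 => "PermissionDeniedError", 404 => "NotFoundError",
  409 => "ConflictError", 422 => "UnprocessableEntityError",
  429 => "RateLimitError", 503 => "OverloadError"
}.freeze

def error_class_for(status)
  # 503 is matched before the generic 5xx range, as in the original case/when
  STATUS_ERRORS[status] ||
    ((500..599).cover?(status) ? "InternalServerError" : "APIStatusError")
end

puts error_class_for(429) # RateLimitError
```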
@@ -0,0 +1,77 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   class Client
+     def self.chat(model, message, response_format: "text", tools: nil, system: nil)
+       client_klass = client_class(model)
+       client = client_klass.new(model_key: model)
+
+       input_mapper = input_mapper_for_client(client)
+       normalized_input = input_mapper.map({
+         messages: normalize_messages(message),
+         response_format: normalize_response_format(response_format),
+         tools: tools,
+         system: normalize_system(system)
+       })
+       result = client.chat(
+         normalized_input[:messages],
+         response_format: normalized_input[:response_format],
+         tools: normalized_input[:tools],
+         system: normalized_input[:system]
+       )
+       result_mapper(client).map(result)
+     end
+
+     def self.client_class(model)
+       return LlmGateway::Adapters::Claude::Client if model.start_with?("claude")
+       return LlmGateway::Adapters::Groq::Client if model.start_with?("llama")
+       return LlmGateway::Adapters::OpenAi::Client if model.start_with?("gpt") ||
+                                                      model.start_with?("o4-") ||
+                                                      model.start_with?("openai")
+
+       raise LlmGateway::Errors::UnsupportedModel, model
+     end
+
+     def self.input_mapper_for_client(client)
+       return LlmGateway::Adapters::Claude::InputMapper if client.is_a?(LlmGateway::Adapters::Claude::Client)
+       return LlmGateway::Adapters::OpenAi::InputMapper if client.is_a?(LlmGateway::Adapters::OpenAi::Client)
+
+       LlmGateway::Adapters::Groq::InputMapper if client.is_a?(LlmGateway::Adapters::Groq::Client)
+     end
+
+     def self.result_mapper(client)
+       return LlmGateway::Adapters::Claude::OutputMapper if client.is_a?(LlmGateway::Adapters::Claude::Client)
+       return LlmGateway::Adapters::OpenAi::OutputMapper if client.is_a?(LlmGateway::Adapters::OpenAi::Client)
+
+       LlmGateway::Adapters::Groq::OutputMapper if client.is_a?(LlmGateway::Adapters::Groq::Client)
+     end
+
+     def self.normalize_system(system)
+       if system.nil?
+         []
+       elsif system.is_a?(String)
+         [{ role: "system", content: system }]
+       elsif system.is_a?(Array)
+         system
+       else
+         raise ArgumentError, "System parameter must be a string or array, got #{system.class}"
+       end
+     end
+
+     def self.normalize_messages(message)
+       if message.is_a?(String)
+         [{ role: "user", content: message }]
+       else
+         message
+       end
+     end
+
+     def self.normalize_response_format(response_format)
+       if response_format.is_a?(String)
+         { type: response_format }
+       else
+         response_format
+       end
+     end
+   end
+ end
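The `normalize_*` helpers above explain why `LlmGateway::Client.chat` can accept a bare string where the adapters expect message arrays: strings are promoted to one-element arrays before any adapter sees them. A standalone sketch of those two promotion rules (free-standing functions here, class methods in the gem):

```ruby
# Standalone sketch of LlmGateway::Client's input normalization rules:
# a bare string becomes a single user message, and a string system
# prompt becomes a one-element system message array.
def normalize_messages(message)
  message.is_a?(String) ? [{ role: "user", content: message }] : message
end

def normalize_system(system)
  case system
  when nil then []
  when String then [{ role: "system", content: system }]
  when Array then system
  else raise ArgumentError, "System parameter must be a string or array, got #{system.class}"
  end
end

puts normalize_messages("Hi").inspect
```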
@@ -0,0 +1,30 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   module Errors
+     class BaseError < StandardError
+       attr_reader :code
+
+       def initialize(message = nil, code = nil)
+         @code = code
+         super(message)
+       end
+     end
+
+     class BadRequestError < BaseError; end
+     class AuthenticationError < BaseError; end
+     class PermissionDeniedError < BaseError; end
+     class NotFoundError < BaseError; end
+     class ConflictError < BaseError; end
+     class UnprocessableEntityError < BaseError; end
+     class RateLimitError < BaseError; end
+     class InternalServerError < BaseError; end
+     class APIStatusError < BaseError; end
+     class APITimeoutError < BaseError; end
+     class APIConnectionError < BaseError; end
+     class OverloadError < BaseError; end
+     class UnknownError < BaseError; end
+     class PromptTooLong < BadRequestError; end
+     class UnsupportedModel < BaseError; end
+   end
+ end
@@ -0,0 +1,144 @@
1
+ # frozen_string_literal: true
2
+
3
+ module LlmGateway
4
+ module FluentMapper
5
+ def self.extended(base)
6
+ base.instance_variable_set(:@mappers, {})
7
+ base.instance_variable_set(:@mappings, [])
8
+ end
9
+
10
+ def inherited(subclass)
11
+ super
12
+ # Copy parent's mappers and mappings to the subclass
13
+ subclass.instance_variable_set(:@mappers, @mappers.dup)
14
+         subclass.instance_variable_set(:@mappings, @mappings.dup)
+       end
+
+       def mapper(name, &block)
+         @mappers[name] = block
+       end
+
+       def map(field_or_data, options = {}, &block)
+         # If called with a single argument and no block, it's the class method usage
+         if block.nil? && options.empty? && !field_or_data.is_a?(Symbol) && !field_or_data.is_a?(String)
+           return new(field_or_data).call
+         end
+
+         # Otherwise it's the field mapping usage
+         @mappings << { field: field_or_data, options: options, block: block }
+       end
+
+       def new(data)
+         MapperInstance.new(data, @mappers, @mappings)
+       end
+
+       class MapperInstance
+         def initialize(data, mappers, mappings)
+           @data = data.respond_to?(:with_indifferent_access) ? data.with_indifferent_access : data
+           @mappers = mappers
+           @mappings = mappings
+           @mapper_definitions = {}
+
+           # Execute mapper definitions
+           mappers.each do |name, block|
+             @mapper_definitions[name] = MapperDefinition.new
+             @mapper_definitions[name].instance_eval(&block)
+           end
+         end
+
+         def call
+           result = {}
+
+           @mappings.each do |mapping|
+             field = mapping[:field]
+             options = mapping[:options]
+             block = mapping[:block]
+
+             from_path = options[:from] || field.to_s
+             default_value = options[:default]
+
+             value = get_nested_value(@data, from_path)
+             value = default_value if value.nil? && !default_value.nil?
+
+             value = instance_exec(field, value, &block) if block
+
+             result[field] = value
+           end
+
+           LlmGateway::Utils.deep_symbolize_keys(result)
+         end
+
+         def map_single(data, options = {})
+           mapper_name = options[:with]
+           default_value = options[:default]
+           return default_value if data.nil? && !default_value.nil?
+           return data unless mapper_name && @mapper_definitions[mapper_name]
+
+           # Apply with_indifferent_access to data
+           data = data.respond_to?(:with_indifferent_access) ? data.with_indifferent_access : data
+
+           mapper_def = @mapper_definitions[mapper_name]
+           result = {}
+
+           mapper_def.mappings.each do |mapping|
+             field = mapping[:field]
+             map_options = mapping[:options]
+             block = mapping[:block]
+
+             from_path = map_options[:from] || field.to_s
+             field_default_value = map_options[:default]
+
+             value = get_nested_value(data, from_path)
+             value = field_default_value if value.nil? && !field_default_value.nil?
+
+             value = instance_exec(field, value, &block) if block
+
+             result[field.to_s] = value
+           end
+
+           result
+         end
+
+         def map_collection(collection, options = {})
+           default_value = options[:default]
+           return default_value if collection.nil? && !default_value.nil?
+           return [] if collection.nil?
+
+           collection.map { |item| map_single(item, options) }
+         end
+
+         private
+
+         def get_nested_value(data, path)
+           return data[path] if data.respond_to?(:[]) && data.key?(path)
+           return data[path.to_sym] if data.respond_to?(:[]) && data.key?(path.to_sym)
+           return data[path.to_s] if data.respond_to?(:[]) && data.key?(path.to_s)
+
+           keys = path.split(".")
+           current = data
+
+           keys.each do |key|
+             return nil unless current.respond_to?(:[])
+
+             current = current[key] || current[key.to_sym] || current[key.to_s]
+
+             return nil if current.nil?
+           end
+
+           current
+         end
+       end
+
+       class MapperDefinition
+         attr_reader :mappings
+
+         def initialize
+           @mappings = []
+         end
+
+         def map(field, options = {}, &block)
+           @mappings << { field: field, options: options, block: block }
+         end
+       end
+     end
+   end
+ end
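The dotted-path lookup above (`get_nested_value`) tries each segment of a path like `"usage.input_tokens"` first as a string key, then as a symbol key, and bails out with `nil` on the first miss. A minimal standalone sketch of that traversal, independent of the gem (it uses `reduce` where the gem uses an explicit loop, and omits the top-level whole-path shortcut):

```ruby
# Hypothetical standalone re-implementation of the dotted-path lookup,
# for illustration only; not the gem's own code.
def get_nested_value(data, path)
  path.split(".").reduce(data) do |current, key|
    # Stop if the current node can't be indexed into.
    break nil unless current.respond_to?(:[])

    # Try the string key, then the symbol key.
    value = current[key] || current[key.to_sym]
    break nil if value.nil?
    value
  end
end

data = { "usage" => { input_tokens: 12 } }
get_nested_value(data, "usage.input_tokens") # => 12
get_nested_value(data, "usage.missing")      # => nil
```

Note that, like the gem's version, this `||` chain cannot distinguish a stored `false` from a missing key; `Hash#dig` with pre-normalized keys would avoid that.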
data/lib/llm_gateway/utils.rb ADDED
@@ -0,0 +1,39 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   module Utils
+     module_function
+
+     def deep_symbolize_keys(hash)
+       case hash
+       when Hash
+         hash.each_with_object({}) do |(key, value), result|
+           result[key.to_sym] = deep_symbolize_keys(value)
+         end
+       when Array
+         hash.map { |item| deep_symbolize_keys(item) }
+       else
+         hash
+       end
+     end
+
+     def present?(value)
+       !blank?(value)
+     end
+
+     def blank?(value)
+       case value
+       when nil
+         true
+       when String
+         value.strip.empty?
+       when Array, Hash
+         value.empty?
+       when Numeric
+         false
+       else
+         value.respond_to?(:empty?) ? value.empty? : false
+       end
+     end
+   end
+ end
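The `deep_symbolize_keys` helper above recursively converts string keys to symbols through arbitrarily nested hashes and arrays, leaving scalar values untouched. A standalone copy of the same logic (restated here so it runs without the gem; the sample response hash is invented):

```ruby
# Standalone restatement of the recursive key conversion, for illustration.
def deep_symbolize_keys(obj)
  case obj
  when Hash
    # Rebuild the hash with symbol keys, recursing into each value.
    obj.each_with_object({}) { |(k, v), out| out[k.to_sym] = deep_symbolize_keys(v) }
  when Array
    # Recurse into each element of the array.
    obj.map { |item| deep_symbolize_keys(item) }
  else
    obj
  end
end

response = { "id" => "msg_1", "content" => [{ "type" => "text", "text" => "hi" }] }
deep_symbolize_keys(response)
# => { id: "msg_1", content: [{ type: "text", text: "hi" }] }
```

This is what lets the mapper return a uniform symbol-keyed result regardless of whether a provider's JSON was parsed with string or symbol keys.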
data/lib/llm_gateway/version.rb ADDED
@@ -0,0 +1,5 @@
+ # frozen_string_literal: true
+
+ module LlmGateway
+   VERSION = "0.1.0"
+ end
data/lib/llm_gateway.rb ADDED
@@ -0,0 +1,23 @@
+ # frozen_string_literal: true
+
+ require_relative "llm_gateway/utils"
+ require_relative "llm_gateway/version"
+ require_relative "llm_gateway/errors"
+ require_relative "llm_gateway/fluent_mapper"
+ require_relative "llm_gateway/base_client"
+ require_relative "llm_gateway/client"
+
+ # Load adapters - order matters for inheritance
+ require_relative "llm_gateway/adapters/claude/client"
+ require_relative "llm_gateway/adapters/claude/input_mapper"
+ require_relative "llm_gateway/adapters/claude/output_mapper"
+ require_relative "llm_gateway/adapters/groq/client"
+ require_relative "llm_gateway/adapters/groq/input_mapper"
+ require_relative "llm_gateway/adapters/groq/output_mapper"
+ require_relative "llm_gateway/adapters/open_ai/client"
+ require_relative "llm_gateway/adapters/open_ai/input_mapper"
+ require_relative "llm_gateway/adapters/open_ai/output_mapper"
+
+ module LlmGateway
+   class Error < StandardError; end
+ end
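The gem-wide base error defined above (`class Error < StandardError`) follows a common Ruby convention: every error the library raises descends from one class, so callers can rescue all gem failures in a single clause. A minimal standalone illustration (the message string and the rescue site are invented for the example):

```ruby
# Standalone sketch of the base-error pattern; mirrors the declaration
# in lib/llm_gateway.rb but is not the gem itself.
module LlmGateway
  class Error < StandardError; end
end

message = begin
  raise LlmGateway::Error, "provider timeout"
rescue LlmGateway::Error => e
  # One rescue clause catches any error class the gem derives from Error.
  e.message
end
message # => "provider timeout"
```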
data/sig/llm_gateway.rbs ADDED
@@ -0,0 +1,4 @@
+ module LlmGateway
+   VERSION: String
+   # See the writing guide of rbs: https://github.com/ruby/rbs#guides
+ end
metadata ADDED
@@ -0,0 +1,72 @@
+ --- !ruby/object:Gem::Specification
+ name: llm_gateway
+ version: !ruby/object:Gem::Version
+   version: 0.1.0
+ platform: ruby
+ authors:
+ - billybonks
+ autorequire:
+ bindir: exe
+ cert_chain: []
+ date: 2025-08-04 00:00:00.000000000 Z
+ dependencies: []
+ description: LlmGateway provides a consistent Ruby interface for multiple LLM providers
+   including Claude, OpenAI, and Groq. Features include unified response formatting,
+   error handling, and fluent data mapping.
+ email:
+ - sebastienstettler@gmail.com
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - ".rubocop.yml"
+ - CHANGELOG.md
+ - CODE_OF_CONDUCT.md
+ - LICENSE.txt
+ - README.md
+ - Rakefile
+ - lib/llm_gateway.rb
+ - lib/llm_gateway/adapters/claude/client.rb
+ - lib/llm_gateway/adapters/claude/input_mapper.rb
+ - lib/llm_gateway/adapters/claude/output_mapper.rb
+ - lib/llm_gateway/adapters/groq/client.rb
+ - lib/llm_gateway/adapters/groq/input_mapper.rb
+ - lib/llm_gateway/adapters/groq/output_mapper.rb
+ - lib/llm_gateway/adapters/open_ai/client.rb
+ - lib/llm_gateway/adapters/open_ai/input_mapper.rb
+ - lib/llm_gateway/adapters/open_ai/output_mapper.rb
+ - lib/llm_gateway/base_client.rb
+ - lib/llm_gateway/client.rb
+ - lib/llm_gateway/errors.rb
+ - lib/llm_gateway/fluent_mapper.rb
+ - lib/llm_gateway/utils.rb
+ - lib/llm_gateway/version.rb
+ - sig/llm_gateway.rbs
+ homepage: https://github.com/Hyper-Unearthing/llm_gateway
+ licenses:
+ - MIT
+ metadata:
+   homepage_uri: https://github.com/Hyper-Unearthing/llm_gateway
+   source_code_uri: https://github.com/Hyper-Unearthing/llm_gateway
+   changelog_uri: https://github.com/Hyper-Unearthing/llm_gateway/blob/main/CHANGELOG.md
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: 3.1.0
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubygems_version: 3.5.22
+ signing_key:
+ specification_version: 4
+ summary: A Ruby gateway for LLM APIs with unified interface for Claude, OpenAI, and
+   Groq
+ test_files: []