ruby_llm-gitlab 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml ADDED
@@ -0,0 +1,7 @@
+ ---
+ SHA256:
+   metadata.gz: ba3f9f64c038b7c3242e43d2b9788bd45d3b962cca9f371088a4522e802cceef
+   data.tar.gz: a7e6e3a8a144b9a0ff59693b7b319c7beefb73d3ae670d720f238a44eb8df27b
+ SHA512:
+   metadata.gz: 11d9a84ea5e990121e3d1873352840fe703b3131e8b826e76eee60cc3ccdc36a4c7792d101a14e46e6a9899f805f95f4191e3249eca0493788ce53166a5ce65e
+   data.tar.gz: a42a3537f0bb9e305cb4b207053e127db0a61dd6e91904ee8fa773fed8afe675c36a10fff6517a4f0cf05257e752cfc2dab45dba88a260f65212208e72927b80
data/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2025 RubyLLM Contributors
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
data/README.md ADDED
@@ -0,0 +1,154 @@
+ # RubyLLM::GitLab
+
+ Access Claude and GPT models through **GitLab Duo** via the [RubyLLM](https://rubyllm.com) interface. One provider, automatic routing.
+
+ ## Requirements
+
+ - Ruby >= 3.1
+ - [RubyLLM](https://github.com/crmne/ruby_llm) >= 1.0
+ - GitLab Premium/Ultimate with Duo Enterprise
+ - Personal Access Token with `ai_features` scope
+
+ ## Installation
+
+ ```ruby
+ gem 'ruby_llm-gitlab'
+ ```
+
+ ## Configuration
+
+ ### gitlab.com
+
+ ```ruby
+ RubyLLM.configure do |config|
+   config.gitlab_api_key = ENV['GITLAB_TOKEN']
+ end
+ ```
+
+ ### Self-managed
+
+ ```ruby
+ RubyLLM.configure do |config|
+   config.gitlab_api_key = ENV['GITLAB_TOKEN']
+   config.gitlab_instance_url = 'https://gitlab.example.com'
+   config.gitlab_gateway_url = 'https://cloud.gitlab.com' # default
+ end
+ ```
+
+ ## Usage
+
+ ### Chat
+
+ ```ruby
+ chat = RubyLLM.chat(model: 'duo-chat-opus-4-6', provider: :gitlab)
+ chat.ask "What's the best way to learn Ruby?"
+ ```
+
+ ### Streaming
+
+ ```ruby
+ chat = RubyLLM.chat(model: 'duo-chat-sonnet-4-6', provider: :gitlab)
+ chat.ask "Tell me a story" do |chunk|
+   print chunk.content
+ end
+ ```
+
+ ### Tools
+
+ ```ruby
+ class Weather < RubyLLM::Tool
+   description "Get current weather"
+   param :city, type: :string
+
+   def execute(city:)
+     { temperature: 22, condition: "sunny" }
+   end
+ end
+
+ chat = RubyLLM.chat(model: 'duo-chat-opus-4-6', provider: :gitlab)
+ chat.with_tool(Weather).ask "What's the weather in Berlin?"
+ ```
+
+ ### Agents
+
+ ```ruby
+ class CodeReviewer < RubyLLM::Agent
+   model 'duo-chat-opus-4-6'
+   provider :gitlab
+   instructions "You are a senior Ruby developer. Review code for bugs and style."
+ end
+
+ CodeReviewer.new.ask "Review this: def foo(x) x+1 end"
+ ```
+
+ ### Rails
+
+ ```ruby
+ class Chat < ApplicationRecord
+   acts_as_chat model: 'duo-chat-sonnet-4-6', provider: :gitlab
+ end
+
+ chat = Chat.create!
+ chat.ask "Explain Active Record callbacks"
+ ```
+
+ ## Models
+
+ | GitLab Model ID | Routes To | Backend |
+ |---|---|---|
+ | `duo-chat-opus-4-6` | `claude-opus-4-6` | Anthropic |
+ | `duo-chat-sonnet-4-6` | `claude-sonnet-4-6` | Anthropic |
+ | `duo-chat-opus-4-5` | `claude-opus-4-5-20251101` | Anthropic |
+ | `duo-chat-sonnet-4-5` | `claude-sonnet-4-5-20250929` | Anthropic |
+ | `duo-chat-haiku-4-5` | `claude-haiku-4-5-20251001` | Anthropic |
+ | `duo-chat-gpt-5-1` | `gpt-5.1-2025-11-13` | OpenAI |
+ | `duo-chat-gpt-5-2` | `gpt-5.2-2025-12-11` | OpenAI |
+ | `duo-chat-gpt-5-mini` | `gpt-5-mini-2025-08-07` | OpenAI |
+
+ ## How It Works
+
+ ```
+ User → RubyLLM.chat(provider: :gitlab)
+
+ GitLab Provider (router)
+   ├── Claude model? → AnthropicDelegate → GitLab AI Gateway → Anthropic API
+   └── GPT model? → OpenAIDelegate → GitLab AI Gateway → OpenAI API
+ ```
+
+ 1. **Token exchange** — Your PAT is exchanged for a short-lived gateway token via GitLab's Direct Access API
+ 2. **Routing** — The provider inspects the model ID and delegates to the correct sub-provider
+ 3. **Proxy** — Requests go through GitLab's AI Gateway (`cloud.gitlab.com`) which proxies to the upstream API
+ 4. **Model mapping** — GitLab model IDs (e.g. `duo-chat-opus-4-6`) are transparently mapped to upstream IDs (e.g. `claude-opus-4-6`)
+
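Steps 2 and 4 can be sketched together as a plain lookup plus a prefix check (a simplified, self-contained illustration using two IDs from the table above; `backend_for` is a hypothetical helper, not the gem's internal API):

```ruby
# Hypothetical sketch of GitLab-ID -> upstream-ID mapping and backend routing.
# Only two entries from the model table are shown.
MODEL_MAPPINGS = {
  'duo-chat-opus-4-6'   => 'claude-opus-4-6',
  'duo-chat-gpt-5-mini' => 'gpt-5-mini-2025-08-07'
}.freeze

# Pick a backend by inspecting the upstream model ID's prefix.
def backend_for(gitlab_id)
  upstream = MODEL_MAPPINGS.fetch(gitlab_id)
  upstream.start_with?('claude-') ? :anthropic : :openai
end

backend_for('duo-chat-opus-4-6')   # => :anthropic
backend_for('duo-chat-gpt-5-mini') # => :openai
```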
+ ## Limitations
+
+ - Chat completions only (no embeddings, images, audio, moderation)
+ - No Codex/Responses API models
+ - Token cached for 25 minutes; auto-refreshes on expiry
+ - Requires GitLab Duo Enterprise license
+
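The 25-minute token cache amounts to a simple staleness check, as in this minimal sketch (it mirrors the `stale?` logic in the gem's `TokenManager`, rewritten here as a standalone function for illustration):

```ruby
# A cached token is stale if nothing is cached yet, or if more than
# 25 minutes have elapsed since it was fetched.
CACHE_DURATION = 25 * 60 # seconds

def stale?(cached_at, now: Time.now)
  cached_at.nil? || (now - cached_at) > CACHE_DURATION
end

t = Time.now
stale?(nil)                 # => true  (nothing cached yet)
stale?(t, now: t + 10 * 60) # => false (10 minutes old)
stale?(t, now: t + 26 * 60) # => true  (past the 25-minute window)
```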
+ ## Troubleshooting
+
+ ### `ConfigurationError: Missing configuration for GitLab: gitlab_api_key`
+
+ Set your token:
+
+ ```ruby
+ RubyLLM.configure { |c| c.gitlab_api_key = ENV['GITLAB_TOKEN'] }
+ ```
+
+ ### `ConfigurationError: Failed to obtain GitLab direct access token`
+
+ - Verify your PAT has the `ai_features` scope
+ - Verify that your GitLab instance URL is correct
+ - Ensure Duo Enterprise is enabled for your group/project
+
+ ### 401 errors during chat
+
+ The gem automatically retries once with a fresh token. If the errors persist:
+ - Your PAT may have expired
+ - The Duo license may have been revoked
+
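The retry-once behavior can be sketched as follows (hypothetical names for illustration; in the gem itself each delegate rescues `RubyLLM::UnauthorizedError`, invalidates the cached token, and re-issues the request):

```ruby
# Hypothetical error type standing in for a 401 response.
class Unauthorized < StandardError; end

# Run the block with the current token; on a 401, invalidate the cached
# token and retry exactly once with a freshly fetched one.
def with_fresh_token_retry(token_store)
  attempts = 0
  begin
    attempts += 1
    yield token_store.token
  rescue Unauthorized
    raise if attempts > 1 # second failure propagates
    token_store.invalidate!
    retry
  end
end
```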
+ ## License
+
+ MIT
data/lib/ruby_llm/providers/gitlab/anthropic_delegate.rb ADDED
@@ -0,0 +1,45 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module Providers
+     class GitLab
+       class AnthropicDelegate < RubyLLM::Providers::Anthropic
+         include RubyLLM::Providers::GitLab::Chat
+
+         def api_base
+           token_manager.anthropic_base_url
+         end
+
+         def headers
+           {
+             'Authorization' => "Bearer #{token_manager.token}",
+             'anthropic-version' => '2023-06-01'
+           }.merge(token_manager.gateway_headers)
+         end
+
+         def complete(...)
+           super
+         rescue RubyLLM::UnauthorizedError
+           token_manager.invalidate!
+           super
+         end
+
+         class << self
+           def configuration_requirements
+             %i[gitlab_api_key]
+           end
+
+           def configuration_options
+             %i[gitlab_api_key gitlab_instance_url gitlab_gateway_url]
+           end
+         end
+
+         private
+
+         def token_manager
+           @token_manager ||= TokenManager.new(@config)
+         end
+       end
+     end
+   end
+ end
data/lib/ruby_llm/providers/gitlab/capabilities.rb ADDED
@@ -0,0 +1,37 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module Providers
+     class GitLab
+       module Capabilities
+         MODEL_MAPPINGS = {
+           'duo-chat-opus-4-6' => 'claude-opus-4-6',
+           'duo-chat-sonnet-4-6' => 'claude-sonnet-4-6',
+           'duo-chat-opus-4-5' => 'claude-opus-4-5-20251101',
+           'duo-chat-sonnet-4-5' => 'claude-sonnet-4-5-20250929',
+           'duo-chat-haiku-4-5' => 'claude-haiku-4-5-20251001',
+           'duo-chat-gpt-5-1' => 'gpt-5.1-2025-11-13',
+           'duo-chat-gpt-5-2' => 'gpt-5.2-2025-12-11',
+           'duo-chat-gpt-5-mini' => 'gpt-5-mini-2025-08-07'
+         }.freeze
+
+         ANTHROPIC_MODELS = MODEL_MAPPINGS.select { |_, v| v.start_with?('claude-') }.keys.freeze
+         OPENAI_MODELS = MODEL_MAPPINGS.reject { |_, v| v.start_with?('claude-') }.keys.freeze
+
+         module_function
+
+         def actual_model(id)
+           MODEL_MAPPINGS[id]
+         end
+
+         def anthropic_model?(id)
+           ANTHROPIC_MODELS.include?(id)
+         end
+
+         def openai_model?(id)
+           OPENAI_MODELS.include?(id)
+         end
+       end
+     end
+   end
+ end
data/lib/ruby_llm/providers/gitlab/chat.rb ADDED
@@ -0,0 +1,23 @@
+ # frozen_string_literal: true
+
+ require 'delegate'
+
+ module RubyLLM
+   module Providers
+     class GitLab
+       module Chat
+         def render_payload(messages, tools:, temperature:, model:, stream: false, schema: nil,
+                            thinking: nil, tool_prefs: nil)
+           super(messages, tools: tools, temperature: temperature, model: ModelIdProxy.new(model),
+                 stream: stream, schema: schema, thinking: thinking, tool_prefs: tool_prefs)
+         end
+       end
+
+       class ModelIdProxy < SimpleDelegator
+         def id
+           Capabilities.actual_model(__getobj__.id) || __getobj__.id
+         end
+       end
+     end
+   end
+ end
data/lib/ruby_llm/providers/gitlab/openai_delegate.rb ADDED
@@ -0,0 +1,44 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module Providers
+     class GitLab
+       class OpenAIDelegate < RubyLLM::Providers::OpenAI
+         include RubyLLM::Providers::GitLab::Chat
+
+         def api_base
+           token_manager.openai_base_url
+         end
+
+         def headers
+           {
+             'Authorization' => "Bearer #{token_manager.token}"
+           }.merge(token_manager.gateway_headers)
+         end
+
+         def complete(...)
+           super
+         rescue RubyLLM::UnauthorizedError
+           token_manager.invalidate!
+           super
+         end
+
+         class << self
+           def configuration_requirements
+             %i[gitlab_api_key]
+           end
+
+           def configuration_options
+             %i[gitlab_api_key gitlab_instance_url gitlab_gateway_url]
+           end
+         end
+
+         private
+
+         def token_manager
+           @token_manager ||= TokenManager.new(@config)
+         end
+       end
+     end
+   end
+ end
data/lib/ruby_llm/providers/gitlab/token_manager.rb ADDED
@@ -0,0 +1,103 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module Providers
+     class GitLab
+       class TokenManager
+         CACHE_DURATION = 25 * 60 # 25 minutes
+         DEFAULT_INSTANCE_URL = 'https://gitlab.com'
+         DEFAULT_GATEWAY_URL = 'https://cloud.gitlab.com'
+         TOKEN_PATH = '/api/v4/ai/third_party_agents/direct_access'
+
+         def initialize(config)
+           @config = config
+           @mutex = Mutex.new
+           @cached_token = nil
+           @cached_headers = nil
+           @cached_at = nil
+         end
+
+         def token
+           refresh_if_needed
+           @cached_token
+         end
+
+         def gateway_headers
+           refresh_if_needed
+           @cached_headers
+         end
+
+         def invalidate!
+           @mutex.synchronize do
+             @cached_token = nil
+             @cached_headers = nil
+             @cached_at = nil
+           end
+         end
+
+         def anthropic_base_url
+           "#{gateway_url}/ai/v1/proxy/anthropic"
+         end
+
+         def openai_base_url
+           "#{gateway_url}/ai/v1/proxy/openai/v1"
+         end
+
+         private
+
+         def refresh_if_needed
+           @mutex.synchronize do
+             fetch_token if stale?
+           end
+         end
+
+         def stale?
+           @cached_at.nil? || (Time.now - @cached_at) > CACHE_DURATION
+         end
+
+         def fetch_token
+           response = request_direct_access
+           body = response.body
+
+           @cached_token = body['token']
+           # SECURITY: Filter out x-api-key — gateway auth uses Bearer token only
+           raw_headers = body.fetch('headers', {})
+           @cached_headers = raw_headers.reject { |k, _| k.downcase == 'x-api-key' }
+           @cached_at = Time.now
+         rescue Faraday::Error => e
+           raise RubyLLM::ConfigurationError,
+                 "Failed to obtain GitLab direct access token: #{e.message}"
+         end
+
+         def request_direct_access
+           connection.post(TOKEN_PATH) do |req|
+             req.headers['Authorization'] = "Bearer #{pat}"
+             req.headers['Content-Type'] = 'application/json'
+             req.body = JSON.generate(feature_flags: { DuoAgentPlatformNext: true })
+           end
+         end
+
+         def connection
+           @connection ||= RubyLLM::Connection.basic do |f|
+             f.url_prefix = instance_url
+             f.request :json
+             f.response :json
+             f.adapter :net_http
+           end
+         end
+
+         def pat
+           @config.gitlab_api_key
+         end
+
+         def instance_url
+           (@config.respond_to?(:gitlab_instance_url) && @config.gitlab_instance_url) || DEFAULT_INSTANCE_URL
+         end
+
+         def gateway_url
+           (@config.respond_to?(:gitlab_gateway_url) && @config.gitlab_gateway_url) || DEFAULT_GATEWAY_URL
+         end
+       end
+     end
+   end
+ end
data/lib/ruby_llm/providers/gitlab.rb ADDED
@@ -0,0 +1,72 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module Providers
+     class GitLab < Provider
+       require_relative 'gitlab/capabilities'
+       require_relative 'gitlab/token_manager'
+       require_relative 'gitlab/chat'
+       require_relative 'gitlab/anthropic_delegate'
+       require_relative 'gitlab/openai_delegate'
+
+       include GitLab::Chat
+
+       def initialize(config)
+         @config = config
+         ensure_configured!
+       end
+
+       def complete(messages, tools:, temperature:, model:, **kwargs, &block)
+         delegate_for(model).complete(messages, tools: tools, temperature: temperature, model: model, **kwargs, &block)
+       end
+
+       def list_models
+         GitLab::Capabilities::MODEL_MAPPINGS.map do |gitlab_id, _actual_id|
+           RubyLLM::Model::Info.new(
+             id: gitlab_id,
+             name: gitlab_id.tr('-', ' ').capitalize,
+             provider: 'gitlab',
+             capabilities: %w[function_calling streaming vision structured_output],
+             modalities: { input: %w[text image], output: %w[text] }
+           )
+         end
+       end
+
+       def test_connection
+         token_manager.token
+         true
+       rescue StandardError
+         false
+       end
+
+       class << self
+         def configuration_options = %i[gitlab_api_key gitlab_instance_url gitlab_gateway_url]
+         def configuration_requirements = %i[gitlab_api_key]
+         def assume_models_exist? = true
+       end
+
+       private
+
+       def delegate_for(model)
+         model_id = model.is_a?(String) ? model : model.id
+         if GitLab::Capabilities.anthropic_model?(model_id)
+           anthropic_delegate
+         else
+           openai_delegate
+         end
+       end
+
+       def anthropic_delegate
+         @anthropic_delegate ||= GitLab::AnthropicDelegate.new(@config)
+       end
+
+       def openai_delegate
+         @openai_delegate ||= GitLab::OpenAIDelegate.new(@config)
+       end
+
+       def token_manager
+         @token_manager ||= GitLab::TokenManager.new(@config)
+       end
+     end
+   end
+ end
data/lib/ruby_llm-gitlab.rb ADDED
@@ -0,0 +1,6 @@
+ # frozen_string_literal: true
+
+ require 'ruby_llm'
+ require_relative 'ruby_llm/providers/gitlab'
+
+ RubyLLM::Provider.register :gitlab, RubyLLM::Providers::GitLab
metadata ADDED
@@ -0,0 +1,118 @@
+ --- !ruby/object:Gem::Specification
+ name: ruby_llm-gitlab
+ version: !ruby/object:Gem::Version
+   version: 0.1.0
+ platform: ruby
+ authors:
+ - Compasify
+ autorequire:
+ bindir: bin
+ cert_chain: []
+ date: 2026-03-24 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   name: ruby_llm
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1.0'
+     - - "<"
+       - !ruby/object:Gem::Version
+         version: '3.0'
+   type: :runtime
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - ">="
+       - !ruby/object:Gem::Version
+         version: '1.0'
+     - - "<"
+       - !ruby/object:Gem::Version
+         version: '3.0'
+ - !ruby/object:Gem::Dependency
+   name: rspec
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '3.0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '3.0'
+ - !ruby/object:Gem::Dependency
+   name: rubocop
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.0'
+ - !ruby/object:Gem::Dependency
+   name: webmock
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '3.0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '3.0'
+ description: Access Claude and GPT models through GitLab Duo via the RubyLLM interface.
+   One provider, automatic routing to Anthropic or OpenAI backends.
+ email:
+ - timdapan.com@gmail.com
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - LICENSE
+ - README.md
+ - lib/ruby_llm-gitlab.rb
+ - lib/ruby_llm/providers/gitlab.rb
+ - lib/ruby_llm/providers/gitlab/anthropic_delegate.rb
+ - lib/ruby_llm/providers/gitlab/capabilities.rb
+ - lib/ruby_llm/providers/gitlab/chat.rb
+ - lib/ruby_llm/providers/gitlab/openai_delegate.rb
+ - lib/ruby_llm/providers/gitlab/token_manager.rb
+ homepage: https://github.com/compasify/ruby_llm-gitlab
+ licenses:
+ - MIT
+ metadata:
+   homepage_uri: https://github.com/compasify/ruby_llm-gitlab
+   source_code_uri: https://github.com/compasify/ruby_llm-gitlab
+   rubygems_mfa_required: 'true'
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '3.1'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubygems_version: 3.5.3
+ signing_key:
+ specification_version: 4
+ summary: GitLab AI Gateway provider for RubyLLM
+ test_files: []