llm_gateway 0.1.0 → 0.1.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 912c28c117bad46986f56b7571497ff276cb3ac5b9e39b442c7b29aeb50692e6
-  data.tar.gz: 2a087a8344992f50e0a4258d6aecbbeb7b5670cba62cfefed7c8fe054ed98c40
+  metadata.gz: 21b3998df57de474c78626d8267db572418f80426ddaf82730cf8738e181d96c
+  data.tar.gz: 9a37ff5a3907a8b0d48dbd0fc83be2e50cd5757a60cacd051e93ff1dc63d734e
 SHA512:
-  metadata.gz: 541fb18a405c58505913a7b5dc57fe109dacec72644224cdfb9746e5da290ca93cab2670731ae41f865cc79a20f28d6ee5dab3ea6690ab2161f82fe82b3c81f2
-  data.tar.gz: 2d6b3294fb5140e7e8b3754ae4c16bd983a133db5a3dcf61fd1adc0f07eed18003f4be724e2b90d98ca98f0aa78678d87648d434893836b52934e7da4fb7a979
+  metadata.gz: 1be591a45a6fbee0b89846c679e0a9709e3a5493757eac3196be8194f8e592615b177c15f20d6763cf6a444bf8dbf3d899cf6e9fd9b11de7e2f02845f53ab1ff
+  data.tar.gz: ed5a981d3e7ff311d26fe4924760e286e7370da598464e4b00ad004c940015432d873be73afae7067d5fa659adbe7af701b5b6f4163432c10cdf7f950532d690
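Both digest families change on every release because the archive bytes change. For context, the values recorded above are plain SHA-256/SHA-512 hexdigests of the gem's inner archives; a minimal sketch of how such values are shaped (the input string here is a stand-in, not the real archive bytes):

```ruby
require "digest"

# Stand-in bytes; the real entries are computed over metadata.gz / data.tar.gz.
blob = "stand-in for data.tar.gz"

sha256 = Digest::SHA256.hexdigest(blob)
sha512 = Digest::SHA512.hexdigest(blob)

puts sha256.length  # 64 hex chars, matching the SHA256 entries above
puts sha512.length  # 128 hex chars, matching the SHA512 entries above
```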
data/CHANGELOG.md CHANGED
@@ -1,5 +1,20 @@
-## [Unreleased]
+# Changelog
 
-## [0.1.0] - 2025-08-04
+## [Unreleased](https://github.com/Hyper-Unearthing/llm_gateway/tree/HEAD)
 
-- Initial release
+[Full Changelog](https://github.com/Hyper-Unearthing/llm_gateway/compare/v0.1.0...HEAD)
+
+**Merged pull requests:**
+
+- feat: add tool base class [\#4](https://github.com/Hyper-Unearthing/llm_gateway/pull/4) ([billybonks](https://github.com/billybonks))
+- feat: add prompt base class [\#3](https://github.com/Hyper-Unearthing/llm_gateway/pull/3) ([billybonks](https://github.com/billybonks))
+- lint files and add coverage [\#2](https://github.com/Hyper-Unearthing/llm_gateway/pull/2) ([billybonks](https://github.com/billybonks))
+- test: vcr lookup was not working when using different commands [\#1](https://github.com/Hyper-Unearthing/llm_gateway/pull/1) ([billybonks](https://github.com/billybonks))
+
+## [v0.1.0](https://github.com/Hyper-Unearthing/llm_gateway/tree/v0.1.0) (2025-08-04)
+
+[Full Changelog](https://github.com/Hyper-Unearthing/llm_gateway/compare/505c78116a2e778b23f319a380cd4bf6e300db89...v0.1.0)
+
+
+
+\* *This Changelog was automatically generated by [github_changelog_generator](https://github.com/github-changelog-generator/github-changelog-generator)*
data/README.md CHANGED
@@ -43,10 +43,117 @@ result = LlmGateway::Client.chat(
 )
 ```
 
+### Prompt Class
+
+You can also create reusable prompt classes by subclassing `LlmGateway::Prompt`:
+
+```ruby
+# Simple text completion with prompt class
+class GeographyQuestionPrompt < LlmGateway::Prompt
+  def initialize(model, question)
+    super(model)
+    @question = question
+  end
+
+  def prompt
+    @question
+  end
+end
+
+# Usage
+geography_prompt = GeographyQuestionPrompt.new('claude-sonnet-4-20250514', 'What is the capital of France?')
+result = geography_prompt.run
+
+# With system message
+class GeographyTeacherPrompt < LlmGateway::Prompt
+  def initialize(model, question)
+    super(model)
+    @question = question
+  end
+
+  def prompt
+    @question
+  end
+
+  def system_prompt
+    'You are a helpful geography teacher.'
+  end
+end
+
+# Usage
+teacher_prompt = GeographyTeacherPrompt.new('gpt-4', 'What is the capital of France?')
+result = teacher_prompt.run
+```
+
+### Using Prompt with Tools
+
+You can combine the Prompt class with tools for more complex interactions:
+
+```ruby
+# Define a tool class
+class GetWeatherTool < LlmGateway::Tool
+  name 'get_weather'
+  description 'Get current weather for a location'
+  input_schema({
+    type: 'object',
+    properties: {
+      location: { type: 'string', description: 'City name' }
+    },
+    required: ['location']
+  })
+
+  def execute(input, login = nil)
+    # Your weather API implementation here
+    "The weather in #{input['location']} is sunny and 25°C"
+  end
+end
+
+class WeatherAssistantPrompt < LlmGateway::Prompt
+  def initialize(model, location)
+    super(model)
+    @location = location
+  end
+
+  def prompt
+    "What's the weather like in #{@location}?"
+  end
+
+  def system_prompt
+    'You are a helpful weather assistant.'
+  end
+
+  def tools
+    [GetWeatherTool]
+  end
+end
+
+# Usage
+weather_prompt = WeatherAssistantPrompt.new('claude-sonnet-4-20250514', 'Singapore')
+result = weather_prompt.run
+```
+
 ### Tool Usage (Function Calling)
 
 ```ruby
-# Define a tool
+# Define a tool class
+class GetWeatherTool < LlmGateway::Tool
+  name 'get_weather'
+  description 'Get current weather for a location'
+  input_schema({
+    type: 'object',
+    properties: {
+      location: { type: 'string', description: 'City name' }
+    },
+    required: ['location']
+  })
+
+  def execute(input, login = nil)
+    # Your weather API implementation here
+    "The weather in #{input['location']} is sunny and 25°C"
+  end
+end
+
+# Use the tool
 weather_tool = {
   name: 'get_weather',
   description: 'Get current weather for a location',
@@ -59,13 +166,97 @@ weather_tool = {
   }
 }
 
-# Use the tool
 result = LlmGateway::Client.chat(
   'claude-sonnet-4-20250514',
   'What\'s the weather in Singapore?',
   tools: [weather_tool],
   system: 'You are a helpful weather assistant.'
 )
+
+# Note: Tools are not automatically executed. The LLM will indicate when a tool should be called,
+# but it's up to you to find the appropriate tool and execute it based on the response.
+
+# Example of handling tool execution with conversation transcript:
+class WeatherAssistant
+  def initialize
+    @transcript = []
+    @weather_tool = {
+      name: 'get_weather',
+      description: 'Get current weather for a location',
+      input_schema: {
+        type: 'object',
+        properties: {
+          location: { type: 'string', description: 'City name' }
+        },
+        required: ['location']
+      }
+    }
+  end
+
+  attr_reader :weather_tool
+
+  def process_message(content)
+    # Add user message to transcript
+    @transcript << { role: 'user', content: [{ type: 'text', text: content }] }
+
+    result = LlmGateway::Client.chat(
+      'claude-sonnet-4-20250514',
+      @transcript,
+      tools: [@weather_tool],
+      system: 'You are a helpful weather assistant.'
+    )
+
+    process_response(result[:choices][0][:content])
+  end
+
+  private
+
+  def process_response(response)
+    # Add assistant response to transcript
+    @transcript << { role: 'assistant', content: response }
+
+    response.each do |message|
+      if message[:type] == 'text'
+        puts message[:text]
+      elsif message[:type] == 'tool_use'
+        result = handle_tool_use(message)
+
+        # Add tool result to transcript
+        tool_result = {
+          type: 'tool_result',
+          tool_use_id: message[:id],
+          content: result
+        }
+        @transcript << { role: 'user', content: [tool_result] }
+
+        # Continue conversation with full transcript context
+        follow_up = LlmGateway::Client.chat(
+          'claude-sonnet-4-20250514',
+          @transcript,
+          tools: [@weather_tool],
+          system: 'You are a helpful weather assistant.'
+        )
+
+        process_response(follow_up[:choices][0][:content])
+      end
+    end
+  end
+
+  def handle_tool_use(message)
+    tool_class = WeatherAssistantPrompt.find_tool(message[:name])
+    raise "Unknown tool: #{message[:name]}" unless tool_class
+
+    # Execute the tool with the provided input
+    tool_instance = tool_class.new
+    tool_instance.execute(message[:input])
+  rescue StandardError => e
+    "Error executing tool: #{e.message}"
+  end
+end
+
+# Usage
+assistant = WeatherAssistant.new
+assistant.process_message("What's the weather in Singapore?")
 ```
 
 ### Response Format
data/Rakefile CHANGED
@@ -10,12 +10,12 @@ require "rubocop/rake_task"
 RuboCop::RakeTask.new
 
 begin
-  require "gem-release"
+  require "gem/release"
 
   desc "Release with changelog"
-  task :release do
+  task :gem_release do
     # Generate changelog first
-    sh "github_changelog_generator" # or your preferred changelog tool
+    sh "bundle exec github_changelog_generator -u Hyper-Unearthing -p llm_gateway"
     sh "git add CHANGELOG.md"
     sh "git commit -m 'Update changelog' || echo 'No changelog changes'"
 
data/lib/llm_gateway/errors.rb CHANGED
@@ -2,7 +2,9 @@
 
 module LlmGateway
   module Errors
-    class BaseError < StandardError
+    class BaseError < StandardError; end
+
+    class ClientError < BaseError
       attr_reader :code
 
       def initialize(message = nil, code = nil)
@@ -11,20 +13,26 @@ module LlmGateway
       end
     end
 
-    class BadRequestError < BaseError; end
-    class AuthenticationError < BaseError; end
-    class PermissionDeniedError < BaseError; end
-    class NotFoundError < BaseError; end
-    class ConflictError < BaseError; end
-    class UnprocessableEntityError < BaseError; end
-    class RateLimitError < BaseError; end
-    class InternalServerError < BaseError; end
-    class APIStatusError < BaseError; end
-    class APITimeoutError < BaseError; end
-    class APIConnectionError < BaseError; end
-    class OverloadError < BaseError; end
-    class UnknownError < BaseError; end
+    class BadRequestError < ClientError; end
+    class AuthenticationError < ClientError; end
+    class PermissionDeniedError < ClientError; end
+    class NotFoundError < ClientError; end
+    class ConflictError < ClientError; end
+    class UnprocessableEntityError < ClientError; end
+    class RateLimitError < ClientError; end
+    class InternalServerError < ClientError; end
+    class APIStatusError < ClientError; end
+    class APITimeoutError < ClientError; end
+    class APIConnectionError < ClientError; end
+    class OverloadError < ClientError; end
+    class UnknownError < ClientError; end
     class PromptTooLong < BadRequestError; end
-    class UnsupportedModel < BaseError; end
+    class UnsupportedModel < ClientError; end
+
+    class PromptError < BaseError; end
+
+    class HallucinationError < PromptError; end
+    class UnknownModel < PromptError; end
+    class InvalidResponseGrammar < PromptError; end
   end
 end
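The hierarchy split above (transport/API errors under `ClientError`, prompt-level errors under `PromptError`) lets callers rescue a whole branch at once. A standalone sketch, with class bodies trimmed to the minimum and the `ClientError#initialize` body assumed from its signature in the diff:

```ruby
module LlmGateway
  module Errors
    class BaseError < StandardError; end

    # API-side failures carry an optional status code.
    class ClientError < BaseError
      attr_reader :code

      def initialize(message = nil, code = nil)
        super(message)
        @code = code
      end
    end

    class RateLimitError < ClientError; end

    # Prompt-level failures live in a separate branch with no code.
    class PromptError < BaseError; end
    class InvalidResponseGrammar < PromptError; end
  end
end

begin
  raise LlmGateway::Errors::RateLimitError.new("slow down", 429)
rescue LlmGateway::Errors::ClientError => e
  puts "client error #{e.code}: #{e.message}"  # prints "client error 429: slow down"
end
```

Rescuing `Errors::PromptError` in the same way catches hallucination and grammar failures without swallowing transport errors.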
data/lib/llm_gateway/prompt.rb ADDED
@@ -0,0 +1,86 @@
+# frozen_string_literal: true
+
+module LlmGateway
+  class Prompt
+    attr_reader :model
+
+    def self.before_execute(*methods, &block)
+      before_execute_callbacks.concat(methods)
+      before_execute_callbacks << block if block_given?
+    end
+
+    def self.after_execute(*methods, &block)
+      after_execute_callbacks.concat(methods)
+      after_execute_callbacks << block if block_given?
+    end
+
+    def self.before_execute_callbacks
+      @before_execute_callbacks ||= []
+    end
+
+    def self.after_execute_callbacks
+      @after_execute_callbacks ||= []
+    end
+
+    def self.inherited(subclass)
+      super
+      subclass.instance_variable_set(:@before_execute_callbacks, before_execute_callbacks.dup)
+      subclass.instance_variable_set(:@after_execute_callbacks, after_execute_callbacks.dup)
+    end
+
+    def initialize(model)
+      @model = model
+    end
+
+    def run
+      run_callbacks(:before_execute, prompt)
+
+      response = post
+
+      response_content = if respond_to?(:extract_response)
+        extract_response(response)
+      else
+        response[:choices][0][:content]
+      end
+
+      result = if respond_to?(:parse_response)
+        parse_response(response_content)
+      else
+        response_content
+      end
+
+      run_callbacks(:after_execute, response, response_content)
+
+      result
+    end
+
+    def post
+      LlmGateway::Client.chat(model, prompt, tools: tools, system: system_prompt)
+    end
+
+    def tools
+      nil
+    end
+
+    def self.find_tool(tool_name)
+      tools.find { |tool| tool.tool_name == tool_name }
+    end
+
+    def system_prompt
+      nil
+    end
+
+    private
+
+    def run_callbacks(callback_type, *args)
+      callbacks = self.class.send("#{callback_type}_callbacks")
+      callbacks.each do |callback|
+        if callback.is_a?(Proc)
+          instance_exec(*args, &callback)
+        elsif callback.is_a?(Symbol) || callback.is_a?(String)
+          send(callback, *args)
+        end
+      end
+    end
+  end
+end
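The callback registry above keeps one array per class and copies it in `inherited`, so registrations on a subclass never leak back to the parent. A trimmed standalone sketch of that pattern (class names `Base`/`Audited` are illustrative, and the LLM call is replaced by `upcase`):

```ruby
class Base
  def self.before_execute(*methods, &block)
    before_execute_callbacks.concat(methods)
    before_execute_callbacks << block if block_given?
  end

  def self.before_execute_callbacks
    @before_execute_callbacks ||= []
  end

  # Subclasses start from a copy of the parent's list, so
  # callbacks added on a child don't mutate the parent's array.
  def self.inherited(subclass)
    super
    subclass.instance_variable_set(:@before_execute_callbacks, before_execute_callbacks.dup)
  end

  def run(prompt)
    self.class.before_execute_callbacks.each do |cb|
      cb.is_a?(Proc) ? instance_exec(prompt, &cb) : send(cb, prompt)
    end
    prompt.upcase # stand-in for the real chat call
  end
end

class Audited < Base
  before_execute :log

  def log(prompt)
    puts "about to send: #{prompt}"
  end
end

puts Audited.new.run("hello")  # logs, then prints HELLO
puts Base.new.run("hello")     # no logging: parent list is untouched
```

One quirk worth noting in the real class: `find_tool` is a class method but `tools` is an instance method, so `SomePrompt.find_tool(...)` only works if the subclass also defines `tools` at the class level.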
data/lib/llm_gateway/tool.rb ADDED
@@ -0,0 +1,46 @@
+# frozen_string_literal: true
+
+module LlmGateway
+  class Tool
+    def initialize(*args)
+      # Empty constructor to allow subclasses to call super
+    end
+
+    def self.name(value = nil)
+      @name = value if value
+      @name
+    end
+
+    def self.description(value = nil)
+      @description = value if value
+      @description
+    end
+
+    def self.input_schema(value = nil)
+      @input_schema = value if value
+      @input_schema
+    end
+
+    def self.cache(value = nil)
+      @cache = value if value
+      @cache
+    end
+
+    def self.definition
+      {
+        name: @name,
+        description: @description,
+        input_schema: @input_schema,
+        cache_control: @cache ? { type: "ephemeral" } : nil
+      }.compact
+    end
+
+    def self.tool_name
+      definition[:name]
+    end
+
+    def execute(input, login)
+      raise NotImplementedError, "Subclasses must implement execute"
+    end
+  end
+end
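The class-level accessors above form a small declaration DSL: calling `name 'x'` in a subclass body stores the value in a class-instance variable, and `definition` drops nil entries via `compact`. A standalone sketch of the same pattern outside the gem (the `EchoTool` subclass is a made-up example):

```ruby
class Tool
  # Each accessor doubles as setter (with argument) and getter (without).
  # Note this intentionally shadows Ruby's built-in Class#name.
  def self.name(value = nil)
    @name = value if value
    @name
  end

  def self.description(value = nil)
    @description = value if value
    @description
  end

  def self.input_schema(value = nil)
    @input_schema = value if value
    @input_schema
  end

  def self.cache(value = nil)
    @cache = value if value
    @cache
  end

  # nil entries (e.g. cache_control when caching is off) are removed.
  def self.definition
    {
      name: @name,
      description: @description,
      input_schema: @input_schema,
      cache_control: @cache ? { type: "ephemeral" } : nil
    }.compact
  end
end

class EchoTool < Tool
  name "echo"
  description "Echo the input back"
  input_schema({ type: "object", properties: { text: { type: "string" } } })

  def execute(input, login = nil)
    input["text"]
  end
end

p EchoTool.definition  # compact hash without :cache_control
```

Because `@name` is read from whichever class receives `definition`, each subclass keeps its own declarations without any registry.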
data/lib/llm_gateway/version.rb CHANGED
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module LlmGateway
-  VERSION = "0.1.0"
+  VERSION = "0.1.3"
 end
data/lib/llm_gateway.rb CHANGED
@@ -6,6 +6,8 @@ require_relative "llm_gateway/errors"
 require_relative "llm_gateway/fluent_mapper"
 require_relative "llm_gateway/base_client"
 require_relative "llm_gateway/client"
+require_relative "llm_gateway/prompt"
+require_relative "llm_gateway/tool"
 
 # Load adapters - order matters for inheritance
 require_relative "llm_gateway/adapters/claude/client"
metadata CHANGED
@@ -1,7 +1,7 @@
 --- !ruby/object:Gem::Specification
 name: llm_gateway
 version: !ruby/object:Gem::Version
-  version: 0.1.0
+  version: 0.1.3
 platform: ruby
 authors:
 - billybonks
@@ -39,6 +39,8 @@ files:
 - lib/llm_gateway/client.rb
 - lib/llm_gateway/errors.rb
 - lib/llm_gateway/fluent_mapper.rb
+- lib/llm_gateway/prompt.rb
+- lib/llm_gateway/tool.rb
 - lib/llm_gateway/utils.rb
 - lib/llm_gateway/version.rb
 - sig/llm_gateway.rbs