llm_gateway 0.1.6 → 0.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (29)
  1. checksums.yaml +4 -4
  2. data/.rubocop.yml +29 -0
  3. data/CHANGELOG.md +32 -0
  4. data/README.md +157 -4
  5. data/Rakefile +7 -1
  6. data/lib/llm_gateway/adapters/claude/bidirectional_message_mapper.rb +83 -0
  7. data/lib/llm_gateway/adapters/claude/client.rb +4 -0
  8. data/lib/llm_gateway/adapters/claude/input_mapper.rb +36 -13
  9. data/lib/llm_gateway/adapters/claude/output_mapper.rb +33 -7
  10. data/lib/llm_gateway/adapters/groq/bidirectional_message_mapper.rb +18 -0
  11. data/lib/llm_gateway/adapters/groq/input_mapper.rb +7 -67
  12. data/lib/llm_gateway/adapters/groq/output_mapper.rb +1 -37
  13. data/lib/llm_gateway/adapters/open_ai/chat_completions/bidirectional_message_mapper.rb +103 -0
  14. data/lib/llm_gateway/adapters/open_ai/chat_completions/input_mapper.rb +110 -0
  15. data/lib/llm_gateway/adapters/open_ai/chat_completions/output_mapper.rb +40 -0
  16. data/lib/llm_gateway/adapters/open_ai/client.rb +21 -0
  17. data/lib/llm_gateway/adapters/open_ai/file_output_mapper.rb +25 -0
  18. data/lib/llm_gateway/adapters/open_ai/responses/bidirectional_message_mapper.rb +72 -0
  19. data/lib/llm_gateway/adapters/open_ai/responses/input_mapper.rb +62 -0
  20. data/lib/llm_gateway/adapters/open_ai/responses/output_mapper.rb +47 -0
  21. data/lib/llm_gateway/base_client.rb +35 -0
  22. data/lib/llm_gateway/client.rb +116 -20
  23. data/lib/llm_gateway/errors.rb +2 -0
  24. data/lib/llm_gateway/version.rb +1 -1
  25. data/lib/llm_gateway.rb +11 -4
  26. metadata +11 -5
  27. data/lib/llm_gateway/adapters/open_ai/input_mapper.rb +0 -19
  28. data/lib/llm_gateway/adapters/open_ai/output_mapper.rb +0 -10
  29. data/lib/llm_gateway/fluent_mapper.rb +0 -144
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: 86c8c0937c6d8d78b1b4b3c2dfa1ed749fbd74eb40e52ccf084dbf0e3cccbb8e
-  data.tar.gz: 682021570b7d903ba44bfc606a9ca9e9fc45ce897bb8637a32d563bfd5497de4
+  metadata.gz: 972dac306c8d8f59b41c462f0133d0ce415e689515fbd84d5503541a9fee6f93
+  data.tar.gz: 6014254f858af1b83906b29950811e2ce84940f0ff49301a9b36dd19f592b7a8
 SHA512:
-  metadata.gz: b684e11152959b054bb30213982845e0978dfe91a2473de7e2a326ca37d2c9e6ec8411be1b5a729e3d99bfb0654bfc420bacf7b752219286295cf80d2c1245f1
-  data.tar.gz: 3da1cf5fcc649024b9e08859905430e1a89082333ec614c39b6fd22ff360143fe5be5f1efb9d7ddf81e8d9f08e62549c23ae1daadc8ef82e234fb6e433a4b899
+  metadata.gz: 388573aa9afe19a075c561f17d5e4a6b8e550de25b73c3e6d9583403096f1c26f13667055d5920408c2c06f9add888af81a0bf34434762b8ece4a58de32ee3c3
+  data.tar.gz: 42d0b37c296ee058109cb38daaeefc269ce128a39bfbb3b89745eb734653a2828ec8c54f8acd1a12242aa03d1706304507c627074a53e4e6cddc7eb27324d69d
data/.rubocop.yml CHANGED
@@ -3,3 +3,32 @@ inherit_gem:
 
 AllCops:
   TargetRubyVersion: 3.1
+
+Layout/EndAlignment:
+  EnforcedStyleAlignWith: start_of_line
+
+Layout/FirstHashElementLineBreak:
+  Enabled: true
+
+Layout/FirstHashElementIndentation:
+  Enabled: true
+  EnforcedStyle: consistent
+
+Layout/MultilineHashKeyLineBreaks:
+  Enabled: true
+
+Layout/HashAlignment:
+  Enabled: true
+  EnforcedHashRocketStyle: key
+  EnforcedColonStyle: key
+
+Layout/IndentationConsistency:
+  Enabled: true
+  EnforcedStyle: normal
+
+Layout/IndentationWidth:
+  Width: 2
+
+Layout/MultilineHashBraceLayout:
+  Enabled: true
+  EnforcedStyle: new_line
data/CHANGELOG.md CHANGED
@@ -1,5 +1,37 @@
 # Changelog
 
+## [v0.3.0](https://github.com/Hyper-Unearthing/llm_gateway/tree/v0.3.0) (2025-08-19)
+
+[Full Changelog](https://github.com/Hyper-Unearthing/llm_gateway/compare/v0.2.0...v0.3.0)
+
+**Merged pull requests:**
+
+- feat: create a method called responses to implement modern apis [\#30](https://github.com/Hyper-Unearthing/llm_gateway/pull/30) ([billybonks](https://github.com/billybonks))
+- test: Create a test that asserts the transcript format. [\#29](https://github.com/Hyper-Unearthing/llm_gateway/pull/29) ([billybonks](https://github.com/billybonks))
+- refactor: move open ai chat completions to its own folder [\#28](https://github.com/Hyper-Unearthing/llm_gateway/pull/28) ([billybonks](https://github.com/billybonks))
+- refactor: bundle all provider resources in a hash [\#27](https://github.com/Hyper-Unearthing/llm_gateway/pull/27) ([billybonks](https://github.com/billybonks))
+- refactor: message mapper to become bidirectional mapper [\#26](https://github.com/Hyper-Unearthing/llm_gateway/pull/26) ([billybonks](https://github.com/billybonks))
+- refactor: extract message mapper from input mapper [\#25](https://github.com/Hyper-Unearthing/llm_gateway/pull/25) ([billybonks](https://github.com/billybonks))
+- docs: add more information about how the library works [\#24](https://github.com/Hyper-Unearthing/llm_gateway/pull/24) ([billybonks](https://github.com/billybonks))
+- feat: enable uploading and downloading files from openai anthropic [\#23](https://github.com/Hyper-Unearthing/llm_gateway/pull/23) ([billybonks](https://github.com/billybonks))
+- ci: upload build version to github when we release [\#22](https://github.com/Hyper-Unearthing/llm_gateway/pull/22) ([billybonks](https://github.com/billybonks))
+
+## [v0.2.0](https://github.com/Hyper-Unearthing/llm_gateway/tree/v0.2.0) (2025-08-08)
+
+[Full Changelog](https://github.com/Hyper-Unearthing/llm_gateway/compare/v0.1.6...v0.2.0)
+
+**Merged pull requests:**
+
+- feat: improve read me [\#21](https://github.com/Hyper-Unearthing/llm_gateway/pull/21) ([billybonks](https://github.com/billybonks))
+- refactor: remove fluent mapper from the lib [\#20](https://github.com/Hyper-Unearthing/llm_gateway/pull/20) ([billybonks](https://github.com/billybonks))
+- test: dont fail vcr if key order changes [\#19](https://github.com/Hyper-Unearthing/llm_gateway/pull/19) ([billybonks](https://github.com/billybonks))
+- burn: fluent mapper from input mappers [\#18](https://github.com/Hyper-Unearthing/llm_gateway/pull/18) ([billybonks](https://github.com/billybonks))
+- test: open ai mapper [\#17](https://github.com/Hyper-Unearthing/llm_gateway/pull/17) ([billybonks](https://github.com/billybonks))
+- feat: handle basic text document handling [\#16](https://github.com/Hyper-Unearthing/llm_gateway/pull/16) ([billybonks](https://github.com/billybonks))
+- test: improve issues [\#15](https://github.com/Hyper-Unearthing/llm_gateway/pull/15) ([billybonks](https://github.com/billybonks))
+- style: format aligment of content automatically to my preference [\#14](https://github.com/Hyper-Unearthing/llm_gateway/pull/14) ([billybonks](https://github.com/billybonks))
+- fix: tool calling open ai [\#13](https://github.com/Hyper-Unearthing/llm_gateway/pull/13) ([billybonks](https://github.com/billybonks))
+
 ## [v0.1.6](https://github.com/Hyper-Unearthing/llm_gateway/tree/v0.1.6) (2025-08-05)
 
 [Full Changelog](https://github.com/Hyper-Unearthing/llm_gateway/compare/v0.1.5...v0.1.6)
data/README.md CHANGED
@@ -1,8 +1,65 @@
 # LlmGateway
 
-Provide nuts and bolts for LLM APIs. The goal is to provide a unified interface for multiple LLM provider API's; And Enable developers to have as much control as they want.
+Provide a unified translation interface for LLM Provider API's, While allowing developers to have as much control as possible, This does make it more complicated because we dont want developers to be blocked at using something that the provider supports. As time progress the library will mature and support more responses
 
-You can use the clients directly, Or you can use the gateway to have interop between clients.
+
+## Principles:
+1. Transcription integrity is most important
+2. Input messages must have bidirectional integrity
+3. Allow developers as much control as possible
+
+## Assumptions
+things that do not support unidirectional format, probably cant be sent between providers
+
+## Mechanics
+Messages either support unidirectional or bidirectional format. (unidirectional means we can format it as an output but should not be added as an input).
+
+The result from the llm is in the format that can be sent to the provider, but if you want to consolidate complex messages like code_execution, you must run a mapper we provide manually, but dont send that format back to the provider.
+
+### bidirectional Support
+Messages
+- Text
+- Tool Use
+- Tool Response
+
+Tools
+- Server Tools
+- Tools
+
+### Unidirectional Support
+- Server Tool Use Reponse
+
+### Example flow
+
+
+```mermaid
+sequenceDiagram
+  actor developer
+  participant llm_gateway
+  participant llm_provider
+
+  developer ->> llm_gateway: Send Text Message
+  llm_gateway ->> llm_gateway: transform to provider format
+  llm_gateway ->> llm_provider: Transformed Text Message
+  llm_provider ->> llm_gateway: Response <br />(transcript in provider format)
+  llm_gateway ->> developer: Response <br />(transcript in combination <br />of gatway and provider formats)
+  Note over llm_gateway,developer: llm_gateway will transform <br /> messages that support bi-direction
+  developer ->> developer: save the transcript
+  loop ProcessMessage
+    developer ->> llm_gateway: format message
+    llm_gateway ->> developer: return transformed message
+    Note over llm_gateway,developer: if the message: <br /> supports bidirection format returns as is <br /> otherwise will transform <br />into consolidated format
+    developer ->> developer: append earlier saved transcript
+    Note over developer, developer: for example tool use
+  end
+  developer -> llm_gateway: Transcript
+  llm_gateway ->> llm_gateway: transform to provider format
+  Note over llm_gateway,llm_gateway: non bidirectional messages are sent as is
+  llm_gateway ->> llm_provider: etc etc etc
+
+
+
+```
 
 ## Supported Providers
 Anthropic, OpenAi, Groq
@@ -30,17 +87,113 @@ gem install llm_gateway
 require 'llm_gateway'
 
 # Simple text completion
-result = LlmGateway::Client.chat(
+LlmGateway::Client.chat(
   'claude-sonnet-4-20250514',
   'What is the capital of France?'
 )
 
 # With system message
-result = LlmGateway::Client.chat(
+LlmGateway::Client.chat(
   'gpt-4',
   'What is the capital of France?',
   system: 'You are a helpful geography teacher.'
 )
+
+# With inline file
+LlmGateway::Client.chat(
+  "claude-sonnet-4-20250514",
+  [
+    {
+      role: "user", content: [
+        { type: "text", text: "return the content of the document exactly" },
+        { type: "file", data: "abc\n", media_type: "text/plain", name: "small.txt" }
+      ]
+    },
+  ]
+)
+
+# Transcript
+LlmGateway::Client.chat('llama-3.3-70b-versatile', [
+  { role: "user", content: "Tell Me a joke" },
+  { role: "assistant", content: "what kind of content" },
+  { role: "user", content: "About Sparkling water" },
+  ]
+)
+
+
+# Tool usage
+LlmGateway::Client.chat('gpt-5', [
+  { role: "user", content: "What's the weather in Singapore? reply in 10 words and no special characters" },
+  { role: "assistant",
+    content: [
+      { id: "call_gpXfy9l9QNmShNEbNI1FyuUZ", type: "tool_use", name: "get_weather", input: { location: "Singapore" } }
+    ]
+  },
+  { role: "developer",
+    content: [
+      { content: "-15 celcius", type: "tool_result", tool_use_id: "call_gpXfy9l9QNmShNEbNI1FyuUZ" }
+    ]
+  }
+  ],
+  tools: [ { name: "get_weather", description: "Get current weather for a location", input_schema: { type: "object", properties: { location: { type: "string", description: "City name" } }, required: [ "location" ] } } ]
+)
+```
+
+### Supported Roles
+
+- user
+- developer
+- assistant
+
+#### Examples
+```ruby
+# tool call
+{ role: "developer",
+  content: [
+    { content: "-15 celcius", type: "tool_result", tool_use_id: "call_gpXfy9l9QNmShNEbNI1FyuUZ" }
+  ]
+}
+# plain message
+{ role: "user", content: "What's the weather in Singapore? reply in 10 words and no special characters" }

+# plain response
+{ role: "assistant", content: "what kind of content" },
+
+# tool call response
+{ role: "assistant",
+  content: [
+    { id: "call_gpXfy9l9QNmShNEbNI1FyuUZ", type: "tool_use", name: "get_weather", input: { location: "Singapore" } }
+  ]
+},
+```
+
+developer is an open ai role, but i thought it was usefull for tracing if message sent from server or user so i added
+it to the list of roles, when it is not supported it will be mapped to user instead.
+
+you can assume developer and user to be interchangeable
+
+
+### Files
+
+Many providers offer the ability to upload files which can be referenced in conversations, or for other purposes like batching. Downloading files is also used for when llm generates something or batches complete.
+
+## Examples
+
+```ruby
+# Upload File
+result = LlmGateway::Client.upload_file("openai", filename: "test.txt", content: "Hello, world!", mime_type: "text/plain")
+result = LlmGateway::Client.download_file("openai", file_id: "file-Kb6X7f8YDffu7FG1NcaPVu")
+# Response Format
+{
+  id: "file-Kb6X7f8YDffu7FG1NcaPVu",
+  size_bytes: 13, # follows anthropic naming cause clearer
+  created_at: "2025-08-08T06:03:16.000000Z", # follow anthropic style cause easier to read as human
+  filename: "test.txt",
+  mime_type: nil,
+  downloadable: true, # anthropic returns this for other providers it is infered
+  expires_at: nil,
+  purpose: "user_data" # for anthropic this is always user_data
+}
 ```
 
 ### Sample Application
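The chat examples above all return the gateway's unified response shape, assembled by the per-provider output mappers elsewhere in this diff (`{ id:, model:, usage:, choices: [...] }`). A minimal sketch of extracting the reply text; the response hash here is illustrative, not real API output:

```ruby
# Illustrative response, assuming the unified shape the output mappers produce:
# { id:, model:, usage:, choices: [{ content: [...], finish_reason:, role: }] }
response = {
  id: "msg_123",
  model: "claude-sonnet-4-20250514",
  usage: { input_tokens: 12, output_tokens: 5 },
  choices: [
    {
      role: "assistant",
      finish_reason: "end_turn",
      content: [ { type: "text", text: "Paris" } ]
    }
  ]
}

# Collect every text part across all choices into one string.
text = response[:choices]
  .flat_map { |choice| choice[:content] }
  .select { |part| part[:type] == "text" }
  .map { |part| part[:text] }
  .join
# text == "Paris"
```

Tool-use parts (`type: "tool_use"`) sit in the same `content` array, so the `select` keeps this traversal safe when tools are involved.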
data/Rakefile CHANGED
@@ -62,12 +62,18 @@ begin
     sh "git add ."
     sh "git commit -m \"Bump llm_gateway to #{new_version}\""
 
+    # Build the gem first
+    gem_file = `gem build llm_gateway.gemspec | grep 'File:' | awk '{print $2}'`.strip
+
     # Tag and push
     sh "git tag v#{new_version}"
     sh "git push origin main --tags"
 
+    # Create GitHub release with gem file
+    sh "gh release create v#{new_version} #{gem_file} --title \"Release v#{new_version}\" --generate-notes"
+
     # Release the gem
-    sh "gem push $(gem build llm_gateway.gemspec | grep 'File:' | awk '{print $2}')"
+    sh "gem push #{gem_file}"
   end
 rescue LoadError
   # gem-release not available in this environment
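The release task now builds the gem once, captures the resulting filename, and reuses it for both the GitHub release and `gem push`. A sketch of what the `grep 'File:' | awk '{print $2}'` pipeline extracts, run against simulated `gem build` output (the real Rakefile shells out instead):

```ruby
# Simulated `gem build llm_gateway.gemspec` output; the Rakefile captures the
# real command's stdout via backticks.
build_output = <<~OUT
  Successfully built RubyGem
  Name: llm_gateway
  Version: 0.3.0
  File: llm_gateway-0.3.0.gem
OUT

# Ruby equivalent of `grep 'File:' | awk '{print $2}'`: keep the File: line,
# take its second whitespace-separated field.
gem_file = build_output.lines.grep(/File:/).first.split[1]
# gem_file == "llm_gateway-0.3.0.gem"
```

Capturing the filename once also avoids rebuilding the gem a second time for the push step, which the old one-liner did implicitly.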
data/lib/llm_gateway/adapters/claude/bidirectional_message_mapper.rb ADDED
@@ -0,0 +1,83 @@
+# frozen_string_literal: true
+
+module LlmGateway
+  module Adapters
+    module Claude
+      class BidirectionalMessageMapper
+        attr_reader :direction
+
+        def initialize(direction)
+          @direction = direction
+        end
+
+        def map_content(content)
+          # Convert string content to text format
+          content = { type: "text", text: content } unless content.is_a?(Hash)
+
+          case content[:type]
+          when "text"
+            map_text_content(content)
+          when "file"
+            map_file_content(content)
+          when "image"
+            map_image_content(content)
+          when "tool_use"
+            map_tool_use_content(content)
+          when "tool_result"
+            map_tool_result_content(content)
+          else
+            content
+          end
+        end
+
+        private
+
+        def map_text_content(content)
+          {
+            type: "text",
+            text: content[:text]
+          }
+        end
+
+        def map_file_content(content)
+          {
+            type: "document",
+            source: {
+              data: content[:data],
+              type: "text",
+              media_type: content[:media_type]
+            }
+          }
+        end
+
+        def map_image_content(content)
+          {
+            type: "image",
+            source: {
+              data: content[:data],
+              type: "base64",
+              media_type: content[:media_type]
+            }
+          }
+        end
+
+        def map_tool_use_content(content)
+          {
+            type: "tool_use",
+            id: content[:id],
+            name: content[:name],
+            input: content[:input]
+          }
+        end
+
+        def map_tool_result_content(content)
+          {
+            type: "tool_result",
+            tool_use_id: content[:tool_use_id],
+            content: content[:content]
+          }
+        end
+      end
+    end
+  end
+end
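The mapper normalises bare strings into text parts before dispatching on `:type`. A condensed, self-contained sketch of the two simplest branches (class name shortened, other branches omitted for illustration):

```ruby
# Condensed sketch of the BidirectionalMessageMapper dispatch above; only the
# text and file branches are reproduced, and the name is illustrative.
class ClaudeContentSketch
  def map_content(content)
    # Bare strings are first normalised into a text part.
    content = { type: "text", text: content } unless content.is_a?(Hash)

    case content[:type]
    when "text"
      { type: "text", text: content[:text] }
    when "file"
      # Claude represents inline files as "document" parts with a text source.
      {
        type: "document",
        source: { data: content[:data], type: "text", media_type: content[:media_type] }
      }
    else
      content
    end
  end
end

mapper = ClaudeContentSketch.new
mapper.map_content("hello")
# => { type: "text", text: "hello" }
```

Unknown types fall through unchanged, which is what lets provider-specific parts pass the mapper without being mangled.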
data/lib/llm_gateway/adapters/claude/client.rb CHANGED
@@ -28,6 +28,10 @@ module LlmGateway
         get("files/#{file_id}/content")
       end
 
+      def upload_file(filename, content, mime_type = "application/octet-stream")
+        post_file("files", content, filename, mime_type: mime_type)
+      end
+
       private
 
       def build_headers
data/lib/llm_gateway/adapters/claude/input_mapper.rb CHANGED
@@ -1,33 +1,56 @@
 # frozen_string_literal: true
 
+require_relative "bidirectional_message_mapper"
+
 module LlmGateway
   module Adapters
     module Claude
       class InputMapper
-        extend LlmGateway::FluentMapper
+        def self.map(data)
+          {
+            messages: map_messages(data[:messages]),
+            response_format: data[:response_format],
+            tools: data[:tools],
+            system: map_system(data[:system])
+          }
+        end
+
+        private
+
+        def self.map_messages(messages)
+          return messages unless messages
+
+          message_mapper = BidirectionalMessageMapper.new(LlmGateway::DIRECTION_IN)
 
-        map :messages do |_, value|
-          value.map do |msg|
+          messages.map do |msg|
             msg = msg.merge(role: "user") if msg[:role] == "developer"
-            msg.slice(:role, :content)
+
+            content = if msg[:content].is_a?(Array)
+              msg[:content].map do |content|
+                message_mapper.map_content(content)
+              end
+            else
+              [ message_mapper.map_content(msg[:content]) ]
+            end
+
+            {
+              role: msg[:role],
+              content: content
+            }
           end
         end
 
-        map :system do |_, value|
-          if value.empty?
+        def self.map_system(system)
+          if !system || system.empty?
             nil
-          elsif value.length == 1 && value.first[:role] == "system"
+          elsif system.length == 1 && system.first[:role] == "system"
             # If we have a single system message, convert to Claude format
-            [ { type: "text", text: value.first[:content] } ]
+            [ { type: "text", text: system.first[:content] } ]
           else
             # For multiple messages or non-standard format, pass through
-            value
+            system
          end
        end
-
-        map :tools do |_, value|
-          value
-        end
      end
    end
  end
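Two normalisations in the rewritten `map_messages` are worth calling out: the OpenAI-style `developer` role is rewritten to `user` (Claude has no such role), and string content is wrapped into a one-element content array. A standalone approximation of those steps; the real mapper additionally runs each part through `BidirectionalMessageMapper` rather than wrapping text inline:

```ruby
# Standalone approximation of the role and content normalisation above.
# The real InputMapper delegates each part to BidirectionalMessageMapper;
# here string content is wrapped directly for illustration.
def normalize_claude_message(msg)
  # Claude has no "developer" role, so it degrades to "user".
  msg = msg.merge(role: "user") if msg[:role] == "developer"

  content = if msg[:content].is_a?(Array)
    msg[:content]
  else
    [ { type: "text", text: msg[:content] } ]
  end

  { role: msg[:role], content: content }
end

normalize_claude_message({ role: "developer", content: "ping" })
# => { role: "user", content: [{ type: "text", text: "ping" }] }
```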
data/lib/llm_gateway/adapters/claude/output_mapper.rb CHANGED
@@ -3,18 +3,44 @@
 module LlmGateway
   module Adapters
     module Claude
+      class FileOutputMapper
+        def self.map(data)
+          data.delete(:type) # Didnt see much value in this only option is "file"
+          data.merge(
+            expires_at: nil, # came from open ai api
+            purpose: "user_data", # came from open ai api
+          )
+        end
+      end
+
       class OutputMapper
-        extend LlmGateway::FluentMapper
+        def self.map(data)
+          {
+            id: data[:id],
+            model: data[:model],
+            usage: data[:usage],
+            choices: map_choices(data)
+          }
+        end
+
+        private
+
+        def self.map_choices(data)
+          message_mapper = BidirectionalMessageMapper.new(LlmGateway::DIRECTION_OUT)
+
+          content = if data[:content].is_a?(Array)
+            data[:content].map do |content|
+              message_mapper.map_content(content)
+            end
+          else
+            data[:content] ? [ message_mapper.map_content(data[:content]) ] : []
+          end
 
-        map :id
-        map :model
-        map :usage
-        map :choices do |_, _|
           # Claude returns content directly at root level, not in a choices array
           # We need to construct the choices array from the full response data
           [ {
-            content: @data[:content] || [], # Use content directly from Claude response
-            finish_reason: @data[:stop_reason],
+            content: content, # Use content directly from Claude response
+            finish_reason: data[:stop_reason],
             role: "assistant"
           } ]
         end
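As the comment in the diff notes, Claude returns `content` at the response root rather than inside a `choices` array, so the mapper synthesises a single-element `choices`. A standalone sketch of that reshaping; content parts pass through unmapped here for brevity, whereas the real mapper routes each one through `BidirectionalMessageMapper`:

```ruby
# Standalone sketch of the choices synthesis above. Content parts are kept
# as-is for brevity; the real OutputMapper maps each part individually.
def map_claude_response(data)
  content = data[:content].is_a?(Array) ? data[:content] : Array(data[:content])
  {
    id: data[:id],
    model: data[:model],
    usage: data[:usage],
    # Claude has no choices array, so one is built from the root-level fields.
    choices: [
      { content: content, finish_reason: data[:stop_reason], role: "assistant" }
    ]
  }
end

raw = { id: "msg_1", model: "claude-sonnet-4-20250514", stop_reason: "end_turn",
        usage: { input_tokens: 3, output_tokens: 1 },
        content: [ { type: "text", text: "Hi" } ] }
map_claude_response(raw)[:choices].length
# => 1
```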
data/lib/llm_gateway/adapters/groq/bidirectional_message_mapper.rb ADDED
@@ -0,0 +1,18 @@
+# frozen_string_literal: true
+
+require_relative "../open_ai/chat_completions/bidirectional_message_mapper"
+
+module LlmGateway
+  module Adapters
+    module Groq
+      class BidirectionalMessageMapper < OpenAi::ChatCompletions::BidirectionalMessageMapper
+        private
+
+        def map_file_content(content)
+          # Groq doesn't support files, return as text
+          content[:text] || "[File: #{content[:name]}]"
+        end
+      end
+    end
+  end
+end
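Groq inherits the OpenAI chat-completions mapper wholesale and overrides only the file branch, since the Groq API has no file content type. The fallback can be sketched on its own:

```ruby
# Standalone sketch of the Groq override above: a file part degrades to its
# text, or to a bracketed filename placeholder when no text is present.
def groq_file_fallback(content)
  content[:text] || "[File: #{content[:name]}]"
end

groq_file_fallback({ name: "small.txt" })
# => "[File: small.txt]"
```

Subclassing one branch like this keeps text, tool_use, and tool_result handling identical across the two OpenAI-compatible providers.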
data/lib/llm_gateway/adapters/groq/input_mapper.rb CHANGED
@@ -1,76 +1,16 @@
 # frozen_string_literal: true
 
+require_relative "bidirectional_message_mapper"
+require_relative "../open_ai/chat_completions/input_mapper"
+
 module LlmGateway
   module Adapters
     module Groq
-      class InputMapper
-        extend LlmGateway::FluentMapper
-
-        map :system
-        map :response_format
-
-        mapper :tool_usage do
-          map :role, default: "assistant"
-          map :content do
-            nil
-          end
-          map :tool_calls, from: :content do |_, value|
-            value.map do |content|
-              {
-                'id': content[:id],
-                'type': "function",
-                'function': {
-                  'name': content[:name],
-                  'arguments': content[:input].to_json
-                }
-              }
-            end
-          end
-        end
-
-        mapper :tool_result_message do
-          map :role, default: "tool"
-          map :tool_call_id, from: "tool_use_id"
-          map :content
-        end
-
-        map :messages do |_, value|
-          value.map do |msg|
-            if msg[:role] == "user"
-              msg
-            elsif msg[:content].is_a?(Array)
-              results = []
-              # Handle tool_use messages
-              tool_uses = msg[:content].select { |c| c[:type] == "tool_use" }
-              results << map_single(msg, with: :tool_usage) if tool_uses.any?
-              # Handle tool_result messages
-              tool_results = msg[:content].select { |c| c[:type] == "tool_result" }
-              tool_results.each do |content|
-                results << map_single(content, with: :tool_result_message)
-              end
-
-              results
-            else
-              msg
-            end
-          end.flatten
-        end
+      class InputMapper < OpenAi::ChatCompletions::InputMapper
+        private
 
-        map :tools do |_, value|
-          if value
-            value.map do |tool|
-              {
-                type: "function",
-                function: {
-                  name: tool[:name],
-                  description: tool[:description],
-                  parameters: tool[:input_schema]
-                }
-              }
-            end
-          else
-            value
-          end
+        def self.map_system(system)
+          system
         end
       end
     end
data/lib/llm_gateway/adapters/groq/output_mapper.rb CHANGED
@@ -3,43 +3,7 @@
 module LlmGateway
   module Adapters
     module Groq
-      class OutputMapper
-        extend LlmGateway::FluentMapper
-
-        mapper :tool_call do
-          map :id
-          map :type do
-            "tool_use" # Always return 'tool_use' regardless of input
-          end
-          map :name, from: "function.name"
-          map :input, from: "function.arguments" do |_, value|
-            parsed = value.is_a?(String) ? JSON.parse(value) : value
-            parsed
-          end
-        end
-
-        mapper :content_item do
-          map :text, from: "content"
-          map :type, default: "text"
-        end
-
-        map :id
-        map :model
-        map :usage
-        map :choices, from: "choices" do |_, value|
-          value.map do |choice|
-            message = choice[:message] || {}
-            content_item = map_single(message, with: :content_item, default: {})
-            tool_calls = map_collection(message[:tool_calls], with: :tool_call, default: [])
-
-            # Only include content_item if it has actual text content
-            content_array = []
-            content_array << content_item if LlmGateway::Utils.present?(content_item["text"])
-            content_array += tool_calls
-
-            { content: content_array }
-          end
-        end
+      class OutputMapper < LlmGateway::Adapters::OpenAi::ChatCompletions::OutputMapper
       end
     end
   end
  end