llm_gateway 0.2.0 → 0.3.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/CHANGELOG.md +16 -0
- data/README.md +80 -2
- data/Rakefile +7 -1
- data/lib/llm_gateway/adapters/claude/bidirectional_message_mapper.rb +83 -0
- data/lib/llm_gateway/adapters/claude/client.rb +4 -0
- data/lib/llm_gateway/adapters/claude/input_mapper.rb +8 -7
- data/lib/llm_gateway/adapters/claude/output_mapper.rb +21 -1
- data/lib/llm_gateway/adapters/groq/bidirectional_message_mapper.rb +18 -0
- data/lib/llm_gateway/adapters/groq/input_mapper.rb +4 -91
- data/lib/llm_gateway/adapters/groq/output_mapper.rb +1 -53
- data/lib/llm_gateway/adapters/open_ai/chat_completions/bidirectional_message_mapper.rb +103 -0
- data/lib/llm_gateway/adapters/open_ai/chat_completions/input_mapper.rb +110 -0
- data/lib/llm_gateway/adapters/open_ai/chat_completions/output_mapper.rb +40 -0
- data/lib/llm_gateway/adapters/open_ai/client.rb +21 -0
- data/lib/llm_gateway/adapters/open_ai/file_output_mapper.rb +25 -0
- data/lib/llm_gateway/adapters/open_ai/responses/bidirectional_message_mapper.rb +72 -0
- data/lib/llm_gateway/adapters/open_ai/responses/input_mapper.rb +62 -0
- data/lib/llm_gateway/adapters/open_ai/responses/output_mapper.rb +47 -0
- data/lib/llm_gateway/base_client.rb +35 -0
- data/lib/llm_gateway/client.rb +111 -15
- data/lib/llm_gateway/errors.rb +2 -0
- data/lib/llm_gateway/version.rb +1 -1
- data/lib/llm_gateway.rb +11 -3
- metadata +11 -4
- data/lib/llm_gateway/adapters/open_ai/input_mapper.rb +0 -63
- data/lib/llm_gateway/adapters/open_ai/output_mapper.rb +0 -10
checksums.yaml
CHANGED
```diff
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 972dac306c8d8f59b41c462f0133d0ce415e689515fbd84d5503541a9fee6f93
+  data.tar.gz: 6014254f858af1b83906b29950811e2ce84940f0ff49301a9b36dd19f592b7a8
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 388573aa9afe19a075c561f17d5e4a6b8e550de25b73c3e6d9583403096f1c26f13667055d5920408c2c06f9add888af81a0bf34434762b8ece4a58de32ee3c3
+  data.tar.gz: 42d0b37c296ee058109cb38daaeefc269ce128a39bfbb3b89745eb734653a2828ec8c54f8acd1a12242aa03d1706304507c627074a53e4e6cddc7eb27324d69d
```
data/CHANGELOG.md
CHANGED
```diff
@@ -1,5 +1,21 @@
 # Changelog
 
+## [v0.3.0](https://github.com/Hyper-Unearthing/llm_gateway/tree/v0.3.0) (2025-08-19)
+
+[Full Changelog](https://github.com/Hyper-Unearthing/llm_gateway/compare/v0.2.0...v0.3.0)
+
+**Merged pull requests:**
+
+- feat: create a method called responses to implement modern apis [\#30](https://github.com/Hyper-Unearthing/llm_gateway/pull/30) ([billybonks](https://github.com/billybonks))
+- test: Create a test that asserts the transcript format. [\#29](https://github.com/Hyper-Unearthing/llm_gateway/pull/29) ([billybonks](https://github.com/billybonks))
+- refactor: move open ai chat completions to its own folder [\#28](https://github.com/Hyper-Unearthing/llm_gateway/pull/28) ([billybonks](https://github.com/billybonks))
+- refactor: bundle all provider resources in a hash [\#27](https://github.com/Hyper-Unearthing/llm_gateway/pull/27) ([billybonks](https://github.com/billybonks))
+- refactor: message mapper to become bidirectional mapper [\#26](https://github.com/Hyper-Unearthing/llm_gateway/pull/26) ([billybonks](https://github.com/billybonks))
+- refactor: extract message mapper from input mapper [\#25](https://github.com/Hyper-Unearthing/llm_gateway/pull/25) ([billybonks](https://github.com/billybonks))
+- docs: add more information about how the library works [\#24](https://github.com/Hyper-Unearthing/llm_gateway/pull/24) ([billybonks](https://github.com/billybonks))
+- feat: enable uploading and downloading files from openai anthropic [\#23](https://github.com/Hyper-Unearthing/llm_gateway/pull/23) ([billybonks](https://github.com/billybonks))
+- ci: upload build version to github when we release [\#22](https://github.com/Hyper-Unearthing/llm_gateway/pull/22) ([billybonks](https://github.com/billybonks))
+
 ## [v0.2.0](https://github.com/Hyper-Unearthing/llm_gateway/tree/v0.2.0) (2025-08-08)
 
 [Full Changelog](https://github.com/Hyper-Unearthing/llm_gateway/compare/v0.1.6...v0.2.0)
```
data/README.md
CHANGED
````diff
@@ -1,8 +1,65 @@
 # LlmGateway
 
-Provide
+Provide a unified translation interface for LLM Provider API's, While allowing developers to have as much control as possible, This does make it more complicated because we dont want developers to be blocked at using something that the provider supports. As time progress the library will mature and support more responses
 
-
+
+## Principles:
+1. Transcription integrity is most important
+2. Input messages must have bidirectional integrity
+3. Allow developers as much control as possible
+
+## Assumptions
+things that do not support unidirectional format, probably cant be sent between providers
+
+## Mechanics
+Messages either support unidirectional or bidirectional format. (unidirectional means we can format it as an output but should not be added as an input).
+
+The result from the llm is in the format that can be sent to the provider, but if you want to consolidate complex messages like code_execution, you must run a mapper we provide manually, but dont send that format back to the provider.
+
+### bidirectional Support
+Messages
+- Text
+- Tool Use
+- Tool Response
+
+Tools
+- Server Tools
+- Tools
+
+### Unidirectional Support
+- Server Tool Use Reponse
+
+### Example flow
+
+
+```mermaid
+sequenceDiagram
+    actor developer
+    participant llm_gateway
+    participant llm_provider
+
+    developer ->> llm_gateway: Send Text Message
+    llm_gateway ->> llm_gateway: transform to provider format
+    llm_gateway ->> llm_provider: Transformed Text Message
+    llm_provider ->> llm_gateway: Response <br />(transcript in provider format)
+    llm_gateway ->> developer: Response <br />(transcript in combination <br />of gatway and provider formats)
+    Note over llm_gateway,developer: llm_gateway will transform <br /> messages that support bi-direction
+    developer ->> developer: save the transcript
+    loop ProcessMessage
+        developer ->> llm_gateway: format message
+        llm_gateway ->> developer: return transformed message
+        Note over llm_gateway,developer: if the message: <br /> supports bidirection format returns as is <br /> otherwise will transform <br />into consolidated format
+        developer ->> developer: append earlier saved transcript
+        Note over developer, developer: for example tool use
+    end
+    developer -> llm_gateway: Transcript
+    llm_gateway ->> llm_gateway: transform to provider format
+    Note over llm_gateway,llm_gateway: non bidirectional messages are sent as is
+    llm_gateway ->> llm_provider: etc etc etc
+
+
+
+```
 
 ## Supported Providers
 Anthropic, OpenAi, Groq
@@ -116,7 +173,28 @@ it to the list of roles, when it is not supported it will be mapped to user inst
 you can assume developer and user to be interchangeable
 
+### Files
+
+Many providers offer the ability to upload files which can be referenced in conversations, or for other purposes like batching. Downloading files is also used for when llm generates something or batches complete.
 
+## Examples
+
+```ruby
+# Upload File
+result = LlmGateway::Client.upload_file("openai", filename: "test.txt", content: "Hello, world!", mime_type: "text/plain")
+result = LlmGateway::Client.download_file("openai", file_id: "file-Kb6X7f8YDffu7FG1NcaPVu")
+# Response Format
+{
+  id: "file-Kb6X7f8YDffu7FG1NcaPVu",
+  size_bytes: 13, # follows anthropic naming cause clearer
+  created_at: "2025-08-08T06:03:16.000000Z", # follow anthropic style cause easier to read as human
+  filename: "test.txt",
+  mime_type: nil,
+  downloadable: true, # anthropic returns this for other providers it is infered
+  expires_at: nil,
+  purpose: "user_data" # for anthropic this is always user_data
+}
+```
 
 ### Sample Application
 
````
data/Rakefile
CHANGED
```diff
@@ -62,12 +62,18 @@ begin
     sh "git add ."
     sh "git commit -m \"Bump llm_gateway to #{new_version}\""
 
+    # Build the gem first
+    gem_file = `gem build llm_gateway.gemspec | grep 'File:' | awk '{print $2}'`.strip
+
     # Tag and push
     sh "git tag v#{new_version}"
     sh "git push origin main --tags"
 
+    # Create GitHub release with gem file
+    sh "gh release create v#{new_version} #{gem_file} --title \"Release v#{new_version}\" --generate-notes"
+
     # Release the gem
-    sh "gem push
+    sh "gem push #{gem_file}"
   end
 rescue LoadError
   # gem-release not available in this environment
```
data/lib/llm_gateway/adapters/claude/bidirectional_message_mapper.rb
ADDED
```diff
@@ -0,0 +1,83 @@
+# frozen_string_literal: true
+
+module LlmGateway
+  module Adapters
+    module Claude
+      class BidirectionalMessageMapper
+        attr_reader :direction
+
+        def initialize(direction)
+          @direction = direction
+        end
+
+        def map_content(content)
+          # Convert string content to text format
+          content = { type: "text", text: content } unless content.is_a?(Hash)
+
+          case content[:type]
+          when "text"
+            map_text_content(content)
+          when "file"
+            map_file_content(content)
+          when "image"
+            map_image_content(content)
+          when "tool_use"
+            map_tool_use_content(content)
+          when "tool_result"
+            map_tool_result_content(content)
+          else
+            content
+          end
+        end
+
+        private
+
+        def map_text_content(content)
+          {
+            type: "text",
+            text: content[:text]
+          }
+        end
+
+        def map_file_content(content)
+          {
+            type: "document",
+            source: {
+              data: content[:data],
+              type: "text",
+              media_type: content[:media_type]
+            }
+          }
+        end
+
+        def map_image_content(content)
+          {
+            type: "image",
+            source: {
+              data: content[:data],
+              type: "base64",
+              media_type: content[:media_type]
+            }
+          }
+        end
+
+        def map_tool_use_content(content)
+          {
+            type: "tool_use",
+            id: content[:id],
+            name: content[:name],
+            input: content[:input]
+          }
+        end
+
+        def map_tool_result_content(content)
+          {
+            type: "tool_result",
+            tool_use_id: content[:tool_use_id],
+            content: content[:content]
+          }
+        end
+      end
+    end
+  end
+end
```
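For orientation, the new mapper's core behavior can be sketched as follows. This is a condensed, hypothetical stand-in for the `map_content` method added above (class name and the subset of branches are illustrative, not the gem's API): bare strings are promoted to text blocks, known block types are normalized, and anything else passes through unchanged.

```ruby
# Condensed stand-in for Claude::BidirectionalMessageMapper#map_content.
class ClaudeContentMapper
  def map_content(content)
    # Bare strings become text blocks, as in the diff above
    content = { type: "text", text: content } unless content.is_a?(Hash)
    case content[:type]
    when "text"
      { type: "text", text: content[:text] }
    when "tool_result"
      { type: "tool_result", tool_use_id: content[:tool_use_id], content: content[:content] }
    else
      content # unknown block types pass through unchanged
    end
  end
end

mapper = ClaudeContentMapper.new
mapper.map_content("hello")
# => { type: "text", text: "hello" }
```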
data/lib/llm_gateway/adapters/claude/input_mapper.rb
CHANGED
```diff
@@ -1,5 +1,7 @@
 # frozen_string_literal: true
 
+require_relative "bidirectional_message_mapper"
+
 module LlmGateway
   module Adapters
     module Claude
@@ -18,20 +20,19 @@ module LlmGateway
       def self.map_messages(messages)
         return messages unless messages
 
+        message_mapper = BidirectionalMessageMapper.new(LlmGateway::DIRECTION_IN)
+
         messages.map do |msg|
           msg = msg.merge(role: "user") if msg[:role] == "developer"
-
+
           content = if msg[:content].is_a?(Array)
             msg[:content].map do |content|
-
-              { type: "document", source: { data: content[:data], type: "text", media_type: content[:media_type] } }
-            else
-              content
-            end
+              message_mapper.map_content(content)
             end
           else
-            msg[:content]
+            [ message_mapper.map_content(msg[:content]) ]
           end
+
           {
             role: msg[:role],
             content: content
```
data/lib/llm_gateway/adapters/claude/output_mapper.rb
CHANGED
```diff
@@ -3,6 +3,16 @@
 module LlmGateway
   module Adapters
     module Claude
+      class FileOutputMapper
+        def self.map(data)
+          data.delete(:type) # Didnt see much value in this only option is "file"
+          data.merge(
+            expires_at: nil, # came from open ai api
+            purpose: "user_data", # came from open ai api
+          )
+        end
+      end
+
       class OutputMapper
         def self.map(data)
           {
@@ -16,10 +26,20 @@ module LlmGateway
         private
 
         def self.map_choices(data)
+          message_mapper = BidirectionalMessageMapper.new(LlmGateway::DIRECTION_OUT)
+
+          content = if data[:content].is_a?(Array)
+            data[:content].map do |content|
+              message_mapper.map_content(content)
+            end
+          else
+            data[:content] ? [ message_mapper.map_content(data[:content]) ] : []
+          end
+
           # Claude returns content directly at root level, not in a choices array
           # We need to construct the choices array from the full response data
           [ {
-            content:
+            content: content, # Use content directly from Claude response
             finish_reason: data[:stop_reason],
             role: "assistant"
           } ]
```
data/lib/llm_gateway/adapters/groq/bidirectional_message_mapper.rb
ADDED
```diff
@@ -0,0 +1,18 @@
+# frozen_string_literal: true
+
+require_relative "../open_ai/chat_completions/bidirectional_message_mapper"
+
+module LlmGateway
+  module Adapters
+    module Groq
+      class BidirectionalMessageMapper < OpenAi::ChatCompletions::BidirectionalMessageMapper
+        private
+
+        def map_file_content(content)
+          # Groq doesn't support files, return as text
+          content[:text] || "[File: #{content[:name]}]"
+        end
+      end
+    end
+  end
+end
```
data/lib/llm_gateway/adapters/groq/input_mapper.rb
CHANGED
```diff
@@ -1,104 +1,17 @@
 # frozen_string_literal: true
 
+require_relative "bidirectional_message_mapper"
+require_relative "../open_ai/chat_completions/input_mapper"
+
 module LlmGateway
   module Adapters
     module Groq
-      class InputMapper
-        def self.map(data)
-          {
-            messages: map_messages(data[:messages]),
-            response_format: map_response_format(data[:response_format]),
-            tools: map_tools(data[:tools]),
-            system: map_system(data[:system])
-          }
-        end
-
+      class InputMapper < OpenAi::ChatCompletions::InputMapper
         private
 
         def self.map_system(system)
           system
         end
-
-        def self.map_response_format(response_format)
-          response_format
-        end
-
-        def self.map_messages(messages)
-          return messages unless messages
-
-          messages.flat_map do |msg|
-            if msg[:content].is_a?(Array)
-              # Handle array content with tool calls and tool results
-              tool_calls = []
-              regular_content = []
-              tool_messages = []
-
-              msg[:content].each do |content|
-                case content[:type]
-                when "tool_result"
-                  tool_messages << map_tool_result_message(content)
-                when "tool_use"
-                  tool_calls << map_tool_usage(content)
-                else
-                  regular_content << content
-                end
-              end
-
-              result = []
-
-              # Add the main message with tool calls if any
-              if tool_calls.any? || regular_content.any?
-                main_msg = msg.dup
-                main_msg[:role] = "assistant" if !main_msg[:role]
-                main_msg[:tool_calls] = tool_calls if tool_calls.any?
-                main_msg[:content] = regular_content.any? ? regular_content : nil
-                result << main_msg
-              end
-
-              # Add separate tool result messages
-              result += tool_messages
-
-              result
-            else
-              # Regular message, return as-is
-              [ msg ]
-            end
-          end
-        end
-
-        def self.map_tools(tools)
-          return tools unless tools
-
-          tools.map do |tool|
-            {
-              type: "function",
-              function: {
-                name: tool[:name],
-                description: tool[:description],
-                parameters: tool[:input_schema]
-              }
-            }
-          end
-        end
-
-        def self.map_tool_usage(content)
-          {
-            'id': content[:id],
-            'type': "function",
-            'function': {
-              'name': content[:name],
-              'arguments': content[:input].to_json
-            }
-          }
-        end
-
-        def self.map_tool_result_message(content)
-          {
-            role: "tool",
-            tool_call_id: content[:tool_use_id],
-            content: content[:content]
-          }
-        end
       end
     end
   end
 end
```
data/lib/llm_gateway/adapters/groq/output_mapper.rb
CHANGED
```diff
@@ -3,59 +3,7 @@
 module LlmGateway
   module Adapters
     module Groq
-      class OutputMapper
-        def self.map(data)
-          {
-            id: data[:id],
-            model: data[:model],
-            usage: data[:usage],
-            choices: map_choices(data[:choices])
-          }
-        end
-
-        private
-
-        def self.map_choices(choices)
-          return [] unless choices
-
-          choices.map do |choice|
-            message = choice[:message] || {}
-            content_item = map_content_item(message)
-            tool_calls = map_tool_calls(message[:tool_calls])
-
-            # Only include content_item if it has actual text content
-            content_array = []
-            content_array << content_item if LlmGateway::Utils.present?(content_item[:text])
-            content_array += tool_calls
-
-            { content: content_array }
-          end
-        end
-
-        def self.map_content_item(message)
-          {
-            text: message[:content],
-            type: "text"
-          }
-        end
-
-        def self.map_tool_calls(tool_calls)
-          return [] unless tool_calls
-
-          tool_calls.map do |tool_call|
-            {
-              id: tool_call[:id],
-              type: "tool_use",
-              name: tool_call.dig(:function, :name),
-              input: parse_tool_arguments(tool_call.dig(:function, :arguments))
-            }
-          end
-        end
-
-        def self.parse_tool_arguments(arguments)
-          return arguments unless arguments.is_a?(String)
-          JSON.parse(arguments, symbolize_names: true)
-        end
-      end
+      class OutputMapper < LlmGateway::Adapters::OpenAi::ChatCompletions::OutputMapper
       end
     end
   end
```
data/lib/llm_gateway/adapters/open_ai/chat_completions/bidirectional_message_mapper.rb
ADDED
```diff
@@ -0,0 +1,103 @@
+# frozen_string_literal: true
+
+require "base64"
+
+module LlmGateway
+  module Adapters
+    module OpenAi
+      module ChatCompletions
+        class BidirectionalMessageMapper
+          attr_reader :direction
+
+          def initialize(direction)
+            @direction = direction
+          end
+
+          def map_content(content)
+            # Convert string content to text format
+            content = { type: "text", text: content } unless content.is_a?(Hash)
+            case content[:type]
+            when "text"
+              map_text_content(content)
+            when "file"
+              map_file_content(content)
+            when "image"
+              map_image_content(content)
+            when "tool_use"
+              map_tool_use_content(content)
+            when "function"
+              map_tool_use_content(content)
+            when "tool_result"
+              map_tool_result_content(content)
+            else
+              content
+            end
+          end
+
+          private
+
+          def parse_tool_arguments(arguments)
+            return arguments unless arguments.is_a?(String)
+            JSON.parse(arguments, symbolize_names: true)
+          end
+
+          def map_text_content(content)
+            {
+              type: "text",
+              text: content[:text]
+            }
+          end
+
+          def map_file_content(content)
+            # Map text/plain to application/pdf for OpenAI
+            media_type = content[:media_type] == "text/plain" ? "application/pdf" : content[:media_type]
+            {
+              type: "file",
+              file: {
+                filename: content[:name],
+                file_data: "data:#{media_type};base64,#{Base64.encode64(content[:data])}"
+              }
+            }
+          end
+
+          def map_image_content(content)
+            {
+              type: "image_url",
+              image_url: {
+                url: "data:#{content[:media_type]};base64,#{content[:data]}"
+              }
+            }
+          end
+
+          def map_tool_use_content(content)
+            if direction == LlmGateway::DIRECTION_IN
+              {
+                id: content[:id],
+                type: "function",
+                function: {
+                  name: content[:name],
+                  arguments: content[:input].to_json
+                }
+              }
+            else
+              {
+                id: content[:id],
+                type: "tool_use",
+                name: content[:function][:name],
+                input: parse_tool_arguments(content[:function][:arguments])
+              }
+            end
+          end
+
+          def map_tool_result_content(content)
+            {
+              role: "tool",
+              tool_call_id: content[:tool_use_id],
+              content: content[:content]
+            }
+          end
+        end
+      end
+    end
+  end
+end
```
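The direction-dependent tool-use mapping above is the heart of the "bidirectional" rename: inbound, a gateway `tool_use` block becomes an OpenAI `function` tool call with JSON-encoded arguments; outbound, the provider's `function` call is decoded back into the gateway block. A self-contained sketch of that round trip (`ToolUseMapper` and the `DIRECTION_*` symbols here are illustrative stand-ins for the gem's class and `LlmGateway::DIRECTION_IN`/`DIRECTION_OUT` constants):

```ruby
require "json"

DIRECTION_IN  = :in   # stand-in for LlmGateway::DIRECTION_IN
DIRECTION_OUT = :out  # stand-in for LlmGateway::DIRECTION_OUT

class ToolUseMapper
  def initialize(direction)
    @direction = direction
  end

  def map_tool_use(content)
    if @direction == DIRECTION_IN
      # gateway format -> OpenAI "function" tool call (arguments as JSON string)
      { id: content[:id], type: "function",
        function: { name: content[:name], arguments: content[:input].to_json } }
    else
      # OpenAI tool call -> gateway "tool_use" block (arguments parsed back)
      { id: content[:id], type: "tool_use",
        name: content[:function][:name],
        input: JSON.parse(content[:function][:arguments], symbolize_names: true) }
    end
  end
end

inbound = ToolUseMapper.new(DIRECTION_IN).map_tool_use(
  id: "call_1", name: "get_weather", input: { city: "Berlin" }
)
# Round-trip back to gateway format:
outbound = ToolUseMapper.new(DIRECTION_OUT).map_tool_use(inbound)
# => { id: "call_1", type: "tool_use", name: "get_weather", input: { city: "Berlin" } }
```

This round-trip property is what the README's "Input messages must have bidirectional integrity" principle demands: whatever the gateway hands back from a provider can be appended to the transcript and sent in again without loss.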