ruby_llm-mcp 0.1.0 → 0.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
-   metadata.gz: b6cf22dd6e681b51686e73f0149f91b0fd3a256ccbba1e3dd20461d0a92b370d
-   data.tar.gz: 8ed938f20b001c4879a409ab77944e6d8b5e53aa0062d6e499d3739355c5054d
+   metadata.gz: f54f8d20c1ebbaddd31672de6715ca3adc92db0187a98b743711f0aff06f373c
+   data.tar.gz: a41e69179703af28db4f7ea5ad27bc1f47c0f6c9eb189cacfb0392cf9d2adebb
  SHA512:
-   metadata.gz: 817fa242ef9d4cd526ea8905093d6c548f36ffeb228ee3b0acfbb0fd80303ec69557493e9b0ee0b2419c0d5adddf79c2bda02ba577e7e90a0503fd9482444bc5
-   data.tar.gz: '09326061dfcdc66d963ef55740ad4d0d82c260ed1da6021229d6441c9c8f1c17af744fd90d9e865e9ddc209589f663c8c5a94aea020ca7bc189c41b668d58433'
+   metadata.gz: 0a01c76e79c94eb3cf19a422182ae1427fc5e2ac39f32951c18d32ed7be68007c09155f95e23595c4ae864fb1b192fea963193d325abe9e9d5a4406736f61f82
+   data.tar.gz: 25e4f7611a0a0c7c3462c1114daeeed832fc4900a1abccda65253c5a0b459a6ebf7955f2d3ef3473045d6530f220a8ac69f285ec32acca0e921a12a0f4feeebf
data/README.md CHANGED
@@ -2,16 +2,19 @@

  Aiming to make using MCP with RubyLLM as easy as possible.

- This project is a Ruby client for the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/), designed to work seamlessly with [RubyLLM](https://github.com/patvice/ruby_llm). This gem enables Ruby applications to connect to MCP servers and use their tools as part of LLM conversations.
+ This project is a Ruby client for the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/), designed to work seamlessly with [RubyLLM](https://github.com/crmne/ruby_llm). This gem enables Ruby applications to connect to MCP servers and use their tools, resources and prompts as part of LLM conversations.

- **Note:** This project is still under development and the API is subject to change. Currently supports the connecting workflow, tool lists and tool execution.
+ **Note:** This project is still under development and the API is subject to change.

  ## Features

  - 🔌 **Multiple Transport Types**: Support for SSE (Server-Sent Events), Streamable HTTP, and stdio transports
  - 🛠️ **Tool Integration**: Automatically converts MCP tools into RubyLLM-compatible tools
+ - 📄 **Resource Management**: Access and include MCP resources (files, data) and resource templates in conversations
+ - 🎯 **Prompt Integration**: Use predefined MCP prompts with arguments for consistent interactions
  - 🔄 **Real-time Communication**: Efficient bidirectional communication with MCP servers
- - 🎯 **Simple API**: Easy-to-use interface that integrates seamlessly with RubyLLM
+ - 🎨 **Enhanced Chat Interface**: Extended RubyLLM chat methods for seamless MCP integration
+ - 📚 **Simple API**: Easy-to-use interface that integrates seamlessly with RubyLLM

  ## Installation

@@ -50,7 +53,7 @@ end
  # Connect to an MCP server via SSE
  client = RubyLLM::MCP.client(
    name: "my-mcp-server",
-   transport_type: "sse",
+   transport_type: :sse,
    config: {
      url: "http://localhost:9292/mcp/sse"
    }
@@ -59,7 +62,7 @@ client = RubyLLM::MCP.client(
  # Or connect via stdio
  client = RubyLLM::MCP.client(
    name: "my-mcp-server",
-   transport_type: "stdio",
+   transport_type: :stdio,
    config: {
      command: "node",
      args: ["path/to/mcp-server.js"],
@@ -70,7 +73,7 @@ client = RubyLLM::MCP.client(
  # Or connect via streamable HTTP
  client = RubyLLM::MCP.client(
    name: "my-mcp-server",
-   transport_type: :streamable",
+   transport_type: :streamable,
    config: {
      url: "http://localhost:8080/mcp",
      headers: { "Authorization" => "Bearer your-token" }
@@ -139,6 +142,142 @@ result = client.execute_tool(
  puts result
  ```

+ ### Working with Resources
+
+ MCP servers can provide access to resources - structured data that can be included in conversations. Resources come in two types: normal resources and resource templates.
+
+ #### Normal Resources
+
+ ```ruby
+ # Get available resources from the MCP server
+ resources = client.resources
+ puts "Available resources:"
+ resources.each do |name, resource|
+   puts "- #{name}: #{resource.description}"
+ end
+
+ # Access a specific resource
+ file_resource = resources["project_readme"]
+ content = file_resource.content
+ puts "Resource content: #{content}"
+
+ # Include a resource in a chat conversation for reference with an LLM
+ chat = RubyLLM.chat(model: "gpt-4")
+ chat.with_resource(file_resource)
+
+ # Or add a resource directly to the conversation
+ file_resource.include(chat)
+
+ response = chat.ask("Can you summarize this README file?")
+ puts response
+ ```
+
+ #### Resource Templates
+
+ Resource templates are parameterized resources that can be dynamically configured:
+
+ ```ruby
+ # Get available resource templates
+ templates = client.resource_templates
+ log_template = templates["application_logs"]
+
+ # Use a template with parameters
+ chat = RubyLLM.chat(model: "gpt-4")
+ chat.with_resource(log_template, arguments: {
+   date: "2024-01-15",
+   level: "error"
+ })
+
+ response = chat.ask("What errors occurred on this date?")
+ puts response
+
+ # You can also get templated content directly
+ content = log_template.content(arguments: {
+   date: "2024-01-15",
+   level: "error"
+ })
+ puts content
+ ```
+
+ #### Resource Argument Completion
+
+ For resource templates, you can get suggested values for arguments:
+
+ ```ruby
+ template = client.resource_templates["user_profile"]
+
+ # Search for possible values for a specific argument
+ suggestions = template.arguments_search("username", "john")
+ puts "Suggested usernames:"
+ suggestions.values.each do |value|
+   puts "- #{value}"
+ end
+ puts "Total matches: #{suggestions.total}"
+ puts "Has more: #{suggestions.has_more}"
+ ```
+
+ ### Working with Prompts
+
+ MCP servers can provide predefined prompts that can be used in conversations:
+
+ ```ruby
+ # Get available prompts from the MCP server
+ prompts = client.prompts
+ puts "Available prompts:"
+ prompts.each do |name, prompt|
+   puts "- #{name}: #{prompt.description}"
+   prompt.arguments.each do |arg|
+     puts " - #{arg.name}: #{arg.description} (required: #{arg.required})"
+   end
+ end
+
+ # Use a prompt in a conversation
+ greeting_prompt = prompts["daily_greeting"]
+ chat = RubyLLM.chat(model: "gpt-4")
+
+ # Method 1: Ask prompt directly
+ response = chat.ask_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
+ puts response
+
+ # Method 2: Add prompt to chat and then ask
+ chat.with_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
+ response = chat.ask("Continue with the greeting")
+ ```
+
+ ### Combining Resources, Prompts, and Tools
+
+ You can combine all MCP features for powerful conversations:
+
+ ```ruby
+ client = RubyLLM::MCP.client(
+   name: "development-assistant",
+   transport_type: :sse,
+   config: { url: "http://localhost:9292/mcp/sse" }
+ )
+
+ chat = RubyLLM.chat(model: "gpt-4")
+
+ # Add tools for capabilities
+ chat.with_tools(*client.tools)
+
+ # Add resources for context
+ chat.with_resource(client.resources["project_structure"])
+ chat.with_resource(
+   client.resource_templates["recent_commits"],
+   arguments: { days: 7 }
+ )
+
+ # Add prompts for guidance
+ chat.with_prompt(
+   client.prompts["code_review_checklist"],
+   arguments: { focus: "security" }
+ )
+
+ # Now ask for analysis
+ response = chat.ask("Please review the recent commits using the checklist and suggest improvements")
+ puts response
+ ```
+
  ## Transport Types

  ### SSE (Server-Sent Events)
@@ -0,0 +1,27 @@
+ # frozen_string_literal: true
+
+ # This is an override of the RubyLLM::Chat class to add convenient methods for easy MCP support
+ module RubyLLM
+   class Chat
+     def with_resources(*resources, **args)
+       resources.each do |resource|
+         resource.include(self, **args)
+       end
+       self
+     end
+
+     def with_resource(resource, **args)
+       resource.include(self, **args)
+       self
+     end
+
+     def with_prompt(prompt, arguments: {})
+       prompt.include(self, arguments: arguments)
+       self
+     end
+
+     def ask_prompt(prompt, ...)
+       prompt.ask(self, ...)
+     end
+   end
+ end
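These helpers each return `self`, so they can be chained. A minimal usage sketch, not part of the gem's files: it assumes a `client` connected as in the README's quick start, and the resource and prompt names below are made up for illustration.

```ruby
# Hedged sketch only: `client` is assumed to be an already-connected
# RubyLLM::MCP.client; "project_readme", "changelog" and
# "code_review_checklist" are illustrative names, not real server entries.
chat = RubyLLM.chat(model: "gpt-4")

chat.with_resources(client.resources["project_readme"], client.resources["changelog"])
    .with_prompt(client.prompts["code_review_checklist"], arguments: { focus: "security" })

puts chat.ask("Review the changelog against the checklist")
```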
@@ -0,0 +1,18 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module MCP
+     class Attachment < RubyLLM::Attachment
+       attr_reader :content, :mime_type
+
+       def initialize(content, mime_type) # rubocop:disable Lint/MissingSuper
+         @content = content
+         @mime_type = mime_type
+       end
+
+       def encoded
+         @content
+       end
+     end
+   end
+ end
@@ -55,13 +55,13 @@ module RubyLLM
    @resource_templates ||= fetch_and_create_resources(set_as_template: true)
  end

- def execute_tool(name:, parameters:)
-   response = execute_tool_request(name: name, parameters: parameters)
-   result = response["result"]
-   # TODO: handle tool error when "isError" is true in result
-   #
-   # TODO: Implement "type": "image" and "type": "resource"
-   result["content"].map { |content| content["text"] }.join("\n")
+ def prompts(refresh: false)
+   @prompts = nil if refresh
+   @prompts ||= fetch_and_create_prompts
+ end
+
+ def execute_tool(**args)
+   RubyLLM::MCP::Requests::ToolCall.new(self, **args).call
  end

  def resource_read_request(**args)
@@ -72,6 +72,10 @@ module RubyLLM
    RubyLLM::MCP::Requests::Completion.new(self, **args).call
  end

+ def execute_prompt(**args)
+   RubyLLM::MCP::Requests::PromptCall.new(self, **args).call
+ end
+
  private

  def initialize_request
@@ -87,10 +91,6 @@ module RubyLLM
    RubyLLM::MCP::Requests::ToolList.new(self).call
  end

- def execute_tool_request(**args)
-   RubyLLM::MCP::Requests::ToolCall.new(self, **args).call
- end
-
  def resources_list_request
    RubyLLM::MCP::Requests::ResourceList.new(self).call
  end
@@ -99,6 +99,10 @@ module RubyLLM
    RubyLLM::MCP::Requests::ResourceTemplateList.new(self).call
  end

+ def prompt_list_request
+   RubyLLM::MCP::Requests::PromptList.new(self).call
+ end
+
  def fetch_and_create_tools
    tools_response = tool_list_request
    tools_response = tools_response["result"]["tools"]
@@ -112,9 +116,30 @@ module RubyLLM
    resources_response = resources_list_request
    resources_response = resources_response["result"]["resources"]

-   @resources = resources_response.map do |resource|
-     RubyLLM::MCP::Resource.new(self, resource, template: set_as_template)
+   resources = {}
+   resources_response.each do |resource|
+     new_resource = RubyLLM::MCP::Resource.new(self, resource, template: set_as_template)
+     resources[new_resource.name] = new_resource
+   end
+
+   resources
+ end
+
+ def fetch_and_create_prompts
+   prompts_response = prompt_list_request
+   prompts_response = prompts_response["result"]["prompts"]
+
+   prompts = {}
+   prompts_response.each do |prompt|
+     new_prompt = RubyLLM::MCP::Prompt.new(self,
+                                           name: prompt["name"],
+                                           description: prompt["description"],
+                                           arguments: prompt["arguments"])
+
+     prompts[new_prompt.name] = new_prompt
    end
+
+   prompts
  end
  end
  end
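With this change, `resources`, `resource_templates`, and the new `prompts` collections are returned as hashes keyed by name rather than arrays, and `execute_tool` now returns the raw JSON-RPC response for the tool layer to interpret. A brief hedged sketch of the lookup pattern; the server name, URL, and entry names are invented for illustration:

```ruby
# Hedged sketch only; the server details and entry names below are made up.
client = RubyLLM::MCP.client(
  name: "docs-server",
  transport_type: :sse,
  config: { url: "http://localhost:9292/mcp/sse" }
)

# Collections are hashes keyed by name.
client.resources.each { |name, resource| puts "#{name}: #{resource.description}" }

greeting = client.prompts["daily_greeting"] # lookup by name; pass refresh: true to re-fetch
puts greeting.description
```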
@@ -0,0 +1,15 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module MCP
+     class Completion
+       attr_reader :values, :total, :has_more
+
+       def initialize(values:, total:, has_more:)
+         @values = values
+         @total = total
+         @has_more = has_more
+       end
+     end
+   end
+ end
@@ -0,0 +1,20 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module MCP
+     class Content < RubyLLM::Content
+       attr_reader :text, :attachments, :content
+
+       def initialize(text: nil, attachments: nil) # rubocop:disable Lint/MissingSuper
+         @text = text
+         @attachments = attachments || []
+       end
+
+       # This is a workaround to allow the content object to be passed as the tool call
+       # to return audio or image attachments.
+       def to_s
+         attachments.empty? ? text : self
+       end
+     end
+   end
+ end
@@ -17,6 +17,10 @@ module RubyLLM
  class SessionExpiredError < BaseError; end

  class TimeoutError < BaseError; end
+
+ class PromptArgumentError < BaseError; end
+
+ class CompletionNotAvailable < BaseError; end
  end
  end
  end
@@ -0,0 +1,95 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module MCP
+     class Prompt
+       class Argument
+         attr_reader :name, :description, :required
+
+         def initialize(name:, description:, required:)
+           @name = name
+           @description = description
+           @required = required
+         end
+       end
+
+       attr_reader :name, :description, :arguments, :mcp_client
+
+       def initialize(mcp_client, name:, description:, arguments:)
+         @mcp_client = mcp_client
+         @name = name
+         @description = description
+
+         @arguments = arguments.map do |arg|
+           Argument.new(name: arg["name"], description: arg["description"], required: arg["required"])
+         end
+       end
+
+       def include(chat, arguments: {})
+         validate_arguments!(arguments)
+         messages = fetch_prompt_messages(arguments)
+
+         messages.each { |message| chat.add_message(message) }
+         chat
+       end
+
+       def ask(chat, arguments: {}, &)
+         include(chat, arguments: arguments)
+
+         chat.complete(&)
+       end
+
+       alias say ask
+
+       def arguments_search(argument, value)
+         if @mcp_client.capabilities.completion?
+           response = @mcp_client.completion(type: :prompt, name: @name, argument: argument, value: value)
+           response = response.dig("result", "completion")
+
+           Completion.new(values: response["values"], total: response["total"], has_more: response["hasMore"])
+         else
+           raise Errors::CompletionNotAvailable, "Completion is not available for this MCP server"
+         end
+       end
+
+       private
+
+       def fetch_prompt_messages(arguments)
+         response = @mcp_client.execute_prompt(
+           name: @name,
+           arguments: arguments
+         )
+
+         response["result"]["messages"].map do |message|
+           content = create_content_for_message(message["content"])
+
+           RubyLLM::Message.new(
+             role: message["role"],
+             content: content
+           )
+         end
+       end
+
+       def validate_arguments!(incoming_arguments)
+         @arguments.each do |arg|
+           if arg.required && !incoming_arguments.key?(arg.name)
+             raise Errors::PromptArgumentError, "Argument #{arg.name} is required"
+           end
+         end
+       end
+
+       def create_content_for_message(content)
+         case content["type"]
+         when "text"
+           MCP::Content.new(text: content["text"])
+         when "image", "audio"
+           attachment = MCP::Attachment.new(content["content"], content["mime_type"])
+           MCP::Content.new(text: nil, attachments: [attachment])
+         when "resource"
+           resource = Resource.new(mcp_client, content["resource"])
+           resource.to_content
+         end
+       end
+     end
+   end
+ end
@@ -5,20 +5,36 @@ module RubyLLM
    module Providers
      module Anthropic
        module ComplexParameterSupport
+         module_function
+
          def clean_parameters(parameters)
            parameters.transform_values do |param|
-             format = {
-               type: param.type,
-               description: param.description
-             }.compact
+             build_properties(param).compact
+           end
+         end

-             if param.type == "array"
-               format[:items] = param.items
-             elsif param.type == "object"
-               format[:properties] = clean_parameters(param.properties)
-             end
+         def required_parameters(parameters)
+           parameters.select { |_, param| param.required }.keys
+         end

-             format
+         def build_properties(param)
+           case param.type
+           when :array
+             {
+               type: param.type,
+               items: { type: param.item_type }
+             }
+           when :object
+             {
+               type: param.type,
+               properties: clean_parameters(param.properties),
+               required: required_parameters(param.properties)
+             }
+           else
+             {
+               type: param.type,
+               description: param.description
+             }
          end
        end
      end
@@ -5,18 +5,29 @@ module RubyLLM
    module Providers
      module OpenAI
        module ComplexParameterSupport
+         module_function
+
          def param_schema(param)
-           format = {
-             type: param.type,
-             description: param.description
-           }.compact
+           properties = case param.type
+                        when :array
+                          {
+                            type: param.type,
+                            items: { type: param.item_type }
+                          }
+                        when :object
+                          {
+                            type: param.type,
+                            properties: param.properties.transform_values { |value| param_schema(value) },
+                            required: param.properties.select { |_, p| p.required }.keys
+                          }
+                        else
+                          {
+                            type: param.type,
+                            description: param.description
+                          }
+                        end

-           if param.type == "array"
-             format[:items] = param.items
-           elsif param.type == "object"
-             format[:properties] = param.properties.transform_values { |value| param_schema(value) }
-           end
-           format
+           properties.compact
          end
        end
      end
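Both provider patches now emit `items` for array parameters and a recursive `properties`/`required` pair for object parameters. Below is a minimal sketch of the resulting schema shape; it uses a stand-in Struct rather than RubyLLM's real parameter objects, so the `Param` struct and the field values are illustrative only.

```ruby
# Stand-in for RubyLLM's parameter objects, just to show the shape the
# new schema-building logic produces. Not gem code.
Param = Struct.new(:type, :description, :item_type, :properties, :required, keyword_init: true)

def param_schema(param)
  case param.type
  when :array
    { type: param.type, items: { type: param.item_type } }
  when :object
    {
      type: param.type,
      properties: param.properties.transform_values { |v| param_schema(v) },
      required: param.properties.select { |_, p| p.required }.keys
    }
  else
    { type: param.type, description: param.description }.compact
  end
end

address = Param.new(
  type: :object,
  properties: {
    "street" => Param.new(type: :string, description: "Street name", required: true),
    "tags"   => Param.new(type: :array, item_type: :string, required: false)
  }
)

pp param_schema(address)
# Prints a hash shaped like:
#   { type: :object,
#     properties: { "street" => { type: :string, description: "Street name" },
#                   "tags"   => { type: :array, items: { type: :string } } },
#     required: ["street"] }
```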
@@ -0,0 +1,32 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module MCP
+     module Requests
+       class PromptCall
+         def initialize(client, name:, arguments: {})
+           @client = client
+           @name = name
+           @arguments = arguments
+         end
+
+         def call
+           @client.request(request_body)
+         end
+
+         private
+
+         def request_body
+           {
+             jsonrpc: "2.0",
+             method: "prompts/get",
+             params: {
+               name: @name,
+               arguments: @arguments
+             }
+           }
+         end
+       end
+     end
+   end
+ end
@@ -0,0 +1,23 @@
+ # frozen_string_literal: true
+
+ module RubyLLM
+   module MCP
+     module Requests
+       class PromptList < Base
+         def call
+           client.request(request_body)
+         end
+
+         private
+
+         def request_body
+           {
+             jsonrpc: "2.0",
+             method: "prompts/list",
+             params: {}
+           }
+         end
+       end
+     end
+   end
+ end
@@ -11,20 +11,50 @@ module RubyLLM
    @name = resource["name"]
    @description = resource["description"]
    @mime_type = resource["mimeType"]
+   @content = resource["content"] || nil
    @template = template
  end

- def content(parsable_information: {})
+ def content(arguments: {})
+   return @content if @content && !template?
+
    response = if template?
-                templated_uri = apply_template(@uri, parsable_information)
+                templated_uri = apply_template(@uri, arguments)

                 read_response(uri: templated_uri)
               else
                 read_response
               end

-   response.dig("result", "contents", 0, "text") ||
-     response.dig("result", "contents", 0, "blob")
+   content = response.dig("result", "contents", 0)
+   @type = if content.key?("type")
+             content["type"]
+           else
+             "text"
+           end
+
+   @content = content["text"] || content["blob"]
+ end
+
+ def include(chat, **args)
+   message = Message.new(
+     role: "user",
+     content: to_content(**args)
+   )
+
+   chat.add_message(message)
+ end
+
+ def to_content(arguments: {})
+   content = content(arguments: arguments)
+
+   case @type
+   when "text"
+     MCP::Content.new(text: content)
+   when "blob"
+     attachment = MCP::Attachment.new(content, mime_type)
+     MCP::Content.new(text: "#{name}: #{description}", attachments: [attachment])
+   end
  end

  def arguments_search(argument, value)
@@ -32,10 +62,9 @@ module RubyLLM
      response = @mcp_client.completion(type: :resource, name: @name, argument: argument, value: value)
      response = response.dig("result", "completion")

-     Struct.new(:arg_values, :total, :has_more)
-       .new(response["values"], response["total"], response["hasMore"])
+     Completion.new(values: response["values"], total: response["total"], has_more: response["hasMore"])
    else
-     []
+     raise Errors::CompletionNotAvailable, "Completion is not available for this MCP server"
    end
  end

@@ -60,9 +89,9 @@ module RubyLLM
    { "result" => { "contents" => [{ "text" => response.body }] } }
  end

- def apply_template(uri, parsable_information)
+ def apply_template(uri, arguments)
    uri.gsub(/\{(\w+)\}/) do
-     parsable_information[::Regexp.last_match(1).to_sym] || "{#{::Regexp.last_match(1)}}"
+     arguments[::Regexp.last_match(1).to_sym] || "{#{::Regexp.last_match(1)}}"
    end
  end
  end
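The template substitution above is a plain `gsub` over `{placeholder}` segments, leaving unknown placeholders intact. A standalone sketch of the same pattern; the URI and arguments are made up:

```ruby
# Hedged illustration of the substitution used by apply_template; not gem code.
uri = "file:///logs/{date}/{level}.log"
arguments = { date: "2024-01-15", level: "error" }

templated = uri.gsub(/\{(\w+)\}/) do
  arguments[Regexp.last_match(1).to_sym] || "{#{Regexp.last_match(1)}}"
end

puts templated # => file:///logs/2024-01-15/error.log
```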
@@ -15,10 +15,17 @@ module RubyLLM
  end

  def execute(**params)
-   @mcp_client.execute_tool(
+   response = @mcp_client.execute_tool(
      name: @name,
      parameters: params
    )
+
+   text_values = response.dig("result", "content").map { |content| content["text"] }.compact.join("\n")
+   if text_values.empty?
+     create_content_for_message(response.dig("result", "content", 0))
+   else
+     create_content_for_message({ type: "text", text: text_values })
+   end
  end

  private
@@ -45,6 +52,19 @@ module RubyLLM

    params
  end
+
+ def create_content_for_message(content)
+   case content["type"]
+   when "text"
+     MCP::Content.new(text: content["text"])
+   when "image", "audio"
+     attachment = MCP::Attachment.new(content["content"], content["mime_type"])
+     MCP::Content.new(text: nil, attachments: [attachment])
+   when "resource"
+     resource = Resource.new(mcp_client, content["resource"])
+     resource.to_content
+   end
+ end
  end
  end
  end
@@ -2,6 +2,6 @@

  module RubyLLM
    module MCP
-     VERSION = "0.1.0"
+     VERSION = "0.2.0"
    end
  end
data/lib/ruby_llm/mcp.rb CHANGED
@@ -2,6 +2,7 @@

  require "ruby_llm"
  require "zeitwerk"
+ require_relative "chat"

  loader = Zeitwerk::Loader.for_gem_extension(RubyLLM)
  loader.inflector.inflect("mcp" => "MCP")
@@ -19,6 +20,7 @@ module RubyLLM
    def support_complex_parameters!
      require_relative "mcp/providers/open_ai/complex_parameter_support"
      require_relative "mcp/providers/anthropic/complex_parameter_support"
+     require_relative "mcp/providers/gemini/complex_parameter_support"
    end
  end
  end
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: ruby_llm-mcp
  version: !ruby/object:Gem::Version
-   version: 0.1.0
+   version: 0.2.0
  platform: ruby
  authors:
  - Patrick Vice
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2025-06-11 00:00:00.000000000 Z
+ date: 2025-06-17 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
    name: faraday
@@ -107,17 +107,24 @@ extra_rdoc_files: []
  files:
  - LICENSE
  - README.md
+ - lib/ruby_llm/chat.rb
  - lib/ruby_llm/mcp.rb
+ - lib/ruby_llm/mcp/attachment.rb
  - lib/ruby_llm/mcp/capabilities.rb
  - lib/ruby_llm/mcp/client.rb
+ - lib/ruby_llm/mcp/completion.rb
+ - lib/ruby_llm/mcp/content.rb
  - lib/ruby_llm/mcp/errors.rb
  - lib/ruby_llm/mcp/parameter.rb
+ - lib/ruby_llm/mcp/prompt.rb
  - lib/ruby_llm/mcp/providers/anthropic/complex_parameter_support.rb
  - lib/ruby_llm/mcp/providers/open_ai/complex_parameter_support.rb
  - lib/ruby_llm/mcp/requests/base.rb
  - lib/ruby_llm/mcp/requests/completion.rb
  - lib/ruby_llm/mcp/requests/initialization.rb
  - lib/ruby_llm/mcp/requests/notification.rb
+ - lib/ruby_llm/mcp/requests/prompt_call.rb
+ - lib/ruby_llm/mcp/requests/prompt_list.rb
  - lib/ruby_llm/mcp/requests/resource_list.rb
  - lib/ruby_llm/mcp/requests/resource_read.rb
  - lib/ruby_llm/mcp/requests/resource_template_list.rb
@@ -148,14 +155,14 @@ required_ruby_version: !ruby/object:Gem::Requirement
  requirements:
  - - ">="
    - !ruby/object:Gem::Version
-     version: 3.1.0
+     version: 3.1.3
  required_rubygems_version: !ruby/object:Gem::Requirement
    requirements:
    - - ">="
      - !ruby/object:Gem::Version
        version: '0'
  requirements: []
- rubygems_version: 3.3.7
+ rubygems_version: 3.5.11
  signing_key:
  specification_version: 4
  summary: A RubyLLM MCP Client