ruby_llm-mcp 0.5.0 → 0.6.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (34)
  1. checksums.yaml +4 -4
  2. data/README.md +20 -620
  3. data/lib/generators/ruby_llm/mcp/install_generator.rb +27 -0
  4. data/lib/generators/ruby_llm/mcp/templates/README.txt +32 -0
  5. data/lib/generators/ruby_llm/mcp/templates/initializer.rb +42 -0
  6. data/lib/generators/ruby_llm/mcp/templates/mcps.yml +9 -0
  7. data/lib/ruby_llm/mcp/client.rb +56 -2
  8. data/lib/ruby_llm/mcp/completion.rb +3 -2
  9. data/lib/ruby_llm/mcp/configuration.rb +30 -1
  10. data/lib/ruby_llm/mcp/coordinator.rb +30 -6
  11. data/lib/ruby_llm/mcp/elicitation.rb +46 -0
  12. data/lib/ruby_llm/mcp/errors.rb +2 -0
  13. data/lib/ruby_llm/mcp/prompt.rb +4 -3
  14. data/lib/ruby_llm/mcp/protocol.rb +34 -0
  15. data/lib/ruby_llm/mcp/requests/completion_prompt.rb +13 -3
  16. data/lib/ruby_llm/mcp/requests/completion_resource.rb +13 -3
  17. data/lib/ruby_llm/mcp/resource.rb +1 -2
  18. data/lib/ruby_llm/mcp/resource_template.rb +4 -3
  19. data/lib/ruby_llm/mcp/response_handler.rb +10 -1
  20. data/lib/ruby_llm/mcp/responses/elicitation.rb +33 -0
  21. data/lib/ruby_llm/mcp/result.rb +2 -1
  22. data/lib/ruby_llm/mcp/tool.rb +33 -5
  23. data/lib/ruby_llm/mcp/transports/sse.rb +69 -25
  24. data/lib/ruby_llm/mcp/transports/stdio.rb +2 -2
  25. data/lib/ruby_llm/mcp/transports/streamable_http.rb +87 -19
  26. data/lib/ruby_llm/mcp/transports/support/http_client.rb +28 -0
  27. data/lib/ruby_llm/mcp/transports/support/rate_limit.rb +47 -0
  28. data/lib/ruby_llm/mcp/transports/support/timeout.rb +34 -0
  29. data/lib/ruby_llm/mcp/version.rb +1 -1
  30. data/lib/ruby_llm/mcp.rb +21 -9
  31. data/lib/tasks/release.rake +23 -0
  32. metadata +28 -8
  33. data/lib/ruby_llm/mcp/transports/http_client.rb +0 -26
  34. data/lib/ruby_llm/mcp/transports/timeout.rb +0 -32
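Judging from the file list above, 0.6.0 adds Rails generator templates, elicitation support, and new transport support classes. To pick these up, an application would bump its dependency; a minimal Gemfile sketch (the pessimistic `~>` constraint is an illustrative choice, not taken from this diff):

```ruby
# Gemfile (sketch) — upgrade to the 0.6.0 release shown in this diff.
# The "~> 0.6.0" version constraint is an assumption, not from the package itself.
gem "ruby_llm-mcp", "~> 0.6.0"
```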
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 328e09780647e7ef9a35aac8a8fd8e0b1c84aded38239b4ad9b08803e1d638d3
- data.tar.gz: f0f1022e6917f56b95ecfd4540fb517245f6b5450f9369d4cee2cc34e2eb0934
+ metadata.gz: dd3795fdab2bd6822d9f11c5b747f291b91d46148fb1f6e9bbb3644b6c7e98a4
+ data.tar.gz: 75c7105d9c276c74c6eff97d497da5600af21e66add76e01c8c9a5c79e5f4477
  SHA512:
- metadata.gz: d52fcbfb4fe1c1ebae1fece801a0f9b20a9aa467f7a3333c65e0db4fcc1ee0038df571ad4d7cdae2d8863d4d4ddd29c019a3ff245de5970142dcdccf7228ae45
- data.tar.gz: 471a551100918f7a86c6fe22b0065f340ec7ed87a4b1d709f49ce8c782b564532e840b6f160d5c082ef5f08734d1bd3ad59fb514f119150ec37f42e626678cc4
+ metadata.gz: 57ba084ac16260e889e84a028ab10366093478996f5d6218e2b794c96b927b1e5dc2164711a634b7954d3886eb938668119651a0498b29efde6418c9777e29c3
+ data.tar.gz: ca54b9f680884e3f6e6c7e2251f6416da0c28346763de35d59ba90c56e3b6d1ba19ce76f170c11554353dae81087275b71d0a4a319517107a2b41fdaad65b3af
data/README.md CHANGED
@@ -1,18 +1,25 @@
- # RubyLLM::MCP
+ <img src="/docs/assets/images/rubyllm-mcp-logo-text.svg" alt="RubyLLM" height="120" width="250">

- Aiming to make using MCPs with RubyLLM as easy as possible.
+ **Aiming to make using MCPs with RubyLLM and Ruby as easy as possible.**

  This project is a Ruby client for the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/), designed to work seamlessly with [RubyLLM](https://github.com/crmne/ruby_llm). This gem enables Ruby applications to connect to MCP servers and use their tools, resources and prompts as part of LLM conversations.

- **Note:** This project is still under development and the API is subject to change.
+ For a more detailed guide, see the [RubyLLM::MCP docs](https://rubyllm-mcp.com/).

- ## Features
+ Currently provides full support for MCP protocol versions up to `2025-06-18`.
+
+ <div class="badge-container">
+ <a href="https://badge.fury.io/rb/ruby_llm-mcp"><img src="https://badge.fury.io/rb/ruby_llm-mcp.svg" alt="Gem Version" /></a>
+ <a href="https://rubygems.org/gems/ruby_llm-mcp"><img alt="Gem Downloads" src="https://img.shields.io/gem/dt/ruby_llm-mcp"></a>
+ </div>
+
+ ## RubyLLM::MCP Features

  - 🔌 **Multiple Transport Types**: Streamable HTTP, STDIO, and legacy SSE transports
  - 🛠️ **Tool Integration**: Automatically converts MCP tools into RubyLLM-compatible tools
  - 📄 **Resource Management**: Access and include MCP resources (files, data) and resource templates in conversations
  - 🎯 **Prompt Integration**: Use predefined MCP prompts with arguments for consistent interactions
- - 🎛️ **Client Features**: Support for sampling and roots
+ - 🎛️ **Client Features**: Support for sampling, roots, and elicitation
  - 🎨 **Enhanced Chat Interface**: Extended RubyLLM chat methods for seamless MCP integration
  - 🔄 **Multiple Client Management**: Create and manage multiple MCP clients simultaneously for different servers and purposes
  - 📚 **Simple API**: Easy-to-use interface that integrates seamlessly with RubyLLM
@@ -105,62 +112,16 @@ response = chat.ask("Can you help me search for recent files in my project?")
  puts response
  ```

- ### Human in the Loop
-
- You can use the `on_human_in_the_loop` callback to allow a human to intervene in the tool call. This is useful for tools that require human or programmatic verification before the tool is executed.
-
- For tool calls that perform important operations, there SHOULD always be a human in the loop with the ability to deny tool invocations.
-
- ```ruby
- client.on_human_in_the_loop do |name, params|
-   name == "add" && params[:a] == 1 && params[:b] == 2
- end
-
- tool = client.tool("add")
- result = tool.execute(a: 1, b: 2)
- puts result # 3
-
- # If the human in the loop returns false, the tool call will be cancelled
- result = tool.execute(a: 2, b: 2)
- puts result # Tool execution error: Tool call was cancelled by the client
-
- tool = client.tool("add")
- result = tool.execute(a: 1, b: 2)
- puts result
- ```
-
- ### Support Complex Parameters
-
- Supporting complex parameters, like an array of objects, currently requires a patch to RubyLLM itself. This is planned to be temporary until RubyLLM is updated.
-
- ```ruby
- RubyLLM::MCP.support_complex_parameters!
- ```
-
- ### Streaming Responses with Tool Calls
-
- ```ruby
- chat = RubyLLM.chat(model: "gpt-4")
- chat.with_tools(*client.tools)
-
- chat.ask("Analyze my project structure") do |chunk|
-   if chunk.tool_call?
-     chunk.tool_calls.each do |key, tool_call|
-       puts "\n🔧 Using tool: #{tool_call.name}"
-     end
-   else
-     print chunk.content
-   end
- end
- ```
-
  ### Manual Tool Execution

  You can also execute MCP tools directly:

  ```ruby
+ # Tools Execution
+ tool = client.tool("search_files")
+
  # Execute a specific tool
- result = client.execute_tool(
+ result = tool.execute(
    name: "search_files",
    parameters: {
      query: "*.rb",
@@ -256,577 +217,16 @@ chat.with_prompt(greeting_prompt, arguments: { name: "Alice", time: "morning" })
  response = chat.ask("Continue with the greeting")
  ```

- ### Combining Resources, Prompts, and Tools
-
- You can combine all MCP features for powerful conversations:
-
- ```ruby
- client = RubyLLM::MCP.client(
-   name: "development-assistant",
-   transport_type: :sse,
-   config: { url: "http://localhost:9292/mcp/sse" }
- )
-
- chat = RubyLLM.chat(model: "gpt-4")
-
- # Add tools for capabilities
- chat.with_tools(*client.tools)
-
- # Add resources for context
- chat.with_resource(client.resource("project_structure"))
- chat.with_resource(
-   client.resource_template("recent_commits"),
-   arguments: { days: 7 }
- )
-
- # Add prompts for guidance
- chat.with_prompt(
-   client.prompt("code_review_checklist"),
-   arguments: { focus: "security" }
- )
-
- # Now ask for analysis
- response = chat.ask("Please review the recent commits using the checklist and suggest improvements")
- puts response
- ```
-
- ### Argument Completion
-
- Some MCP servers support argument completion for prompts and resource templates:
-
- ```ruby
- # For prompts
- prompt = client.prompt("user_search")
- suggestions = prompt.complete("username", "jo")
- puts "Suggestions: #{suggestions.values}" # ["john", "joanna", "joseph"]
-
- # For resource templates
- template = client.resource_template("user_logs")
- suggestions = template.complete("user_id", "123")
- puts "Total matches: #{suggestions.total}"
- puts "Has more results: #{suggestions.has_more}"
- ```
-
- ### Pagination
-
- MCP servers can support pagination for their lists. The client automatically follows pagination cursors so that a list call returns every item you requested.
-
- Pagination is supported for tools, resources, prompts, and resource templates.
-
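The automatic pagination described above follows the MCP cursor convention (each page can return a `nextCursor` to request the following page). A minimal, self-contained sketch of such a cursor loop — an illustrative helper only, not the gem's internal implementation:

```ruby
# Collects every item by following next_cursor until the server stops returning one.
# Page fetching is supplied as a block (here faked with an in-memory hash of pages).
def fetch_all_pages
  items = []
  cursor = nil
  loop do
    page = yield(cursor)            # e.g. one tools/list request, passing the cursor
    items.concat(page[:items])
    cursor = page[:next_cursor]
    break if cursor.nil?
  end
  items
end

# Two fake pages to demonstrate the loop:
pages = {
  nil  => { items: [1, 2], next_cursor: "p2" },
  "p2" => { items: [3],    next_cursor: nil }
}
all = fetch_all_pages { |cursor| pages[cursor] }
puts all.inspect # => [1, 2, 3]
```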
- ### Additional Chat Methods
-
- The gem extends RubyLLM's chat interface with convenient methods for MCP integration:
-
- ```ruby
- chat = RubyLLM.chat(model: "gpt-4")
-
- # Add a single resource
- chat.with_resource(resource)
-
- # Add multiple resources
- chat.with_resources(resource1, resource2, resource3)
-
- # Add a resource template with arguments
- chat.with_resource_template(resource_template, arguments: { key: "value" })
-
- # Add a prompt with arguments
- chat.with_prompt(prompt, arguments: { name: "Alice" })
-
- # Ask using a prompt directly
- response = chat.ask_prompt(prompt, arguments: { name: "Alice" })
- ```
-
- ## Rails Integration
-
- RubyLLM MCP provides seamless Rails integration through a Railtie and generator system.
+ ## Development

- ### Setup
+ After checking out the repo, run `bundle` to install dependencies. Then, run `bundle exec rake` to run the tests. Tests currently use `bun` to run test MCP servers. You can also run `bin/console` for an interactive prompt that will allow you to experiment.

- Generate the configuration files:
+ There are also examples you can run to verify the gem is working as expected.

  ```bash
- rails generate ruby_llm:mcp:install
- ```
-
- This creates:
-
- - `config/initializers/ruby_llm_mcp.rb` - Main configuration
- - `config/mcps.yml` - MCP servers configuration
-
- ### MCP Server Configuration
-
- Configure your MCP servers in `config/mcps.yml`:
-
- ```yaml
- mcp_servers:
-   filesystem:
-     transport_type: stdio
-     command: npx
-     args:
-       - "@modelcontextprotocol/server-filesystem"
-       - "<%= Rails.root %>"
-     env: {}
-     with_prefix: true
-
-   api_server:
-     transport_type: sse
-     url: "https://api.example.com/mcp/sse"
-     headers:
-       Authorization: "Bearer <%= ENV['API_TOKEN'] %>"
- ```
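Since `config/mcps.yml` is rendered with ERB before being parsed (note the `<%= Rails.root %>` and `<%= ENV['API_TOKEN'] %>` tags above), a template like it can be previewed outside Rails by evaluating the ERB first and then parsing the YAML. A minimal standalone sketch for illustration — not the gem's actual loading code:

```ruby
require "erb"
require "yaml"

# Inline stand-in for config/mcps.yml, using Dir.pwd in place of Rails.root:
raw = <<~YAML
  mcp_servers:
    filesystem:
      transport_type: stdio
      command: npx
      args:
        - "@modelcontextprotocol/server-filesystem"
        - "<%= Dir.pwd %>"
YAML

# Render the ERB tags first, then parse the resulting YAML.
config = YAML.safe_load(ERB.new(raw).result)
servers = config.fetch("mcp_servers")

puts servers.keys.inspect               # => ["filesystem"]
puts servers["filesystem"]["args"].last # the current working directory
```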
-
- ### Automatic Client Management
-
- With `launch_control: :automatic`, Rails will:
-
- - Start all configured MCP clients when the application initializes
- - Gracefully shut down clients when the application exits
- - Handle client lifecycle automatically
-
- However, it is very common to make LLM calls in background jobs due to their performance characteristics.
-
- For this, we recommend using `launch_control: :manual` and using the `establish_connection` method to manage the client lifecycle manually inside your background jobs. It will provide you with active connections to the MCP servers and take care of closing them when the job is done.
-
- ```ruby
- RubyLLM::MCP.establish_connection do |clients|
-   chat = RubyLLM.chat(model: "gpt-4")
-   chat.with_tools(*clients.tools)
-
-   response = chat.ask("Hello, world!")
-   puts response
- end
- ```
-
- You can also skip this entirely and manually start and stop the clients yourself if you so choose.
-
- ## Client Lifecycle Management
-
- You can manage the MCP client connection lifecycle:
-
- ```ruby
- client = RubyLLM::MCP.client(name: "my-server", transport_type: :stdio, start: false, config: {...})
-
- # Manually start the connection
- client.start
-
- # Check if connection is alive
- puts client.alive?
-
- # Restart the connection
- client.restart!
-
- # Stop the connection
- client.stop
- ```
-
- ### Ping
-
- You can ping the MCP server to check if it is alive:
-
- ```ruby
- client.ping # => true or false
+ bundle exec ruby examples/tools/local_mcp.rb
  ```

- ## Refreshing Cached Data
-
- The client caches the tools, resources, prompts, and resource template list calls to reduce round trips back to the MCP server. You can refresh this cache:
-
- ```ruby
- # Refresh all cached tools
- tools = client.tools(refresh: true)
-
- # Refresh a specific tool
- tool = client.tool("search_files", refresh: true)
-
- # Same pattern works for resources, prompts, and resource templates
- resources = client.resources(refresh: true)
- prompts = client.prompts(refresh: true)
- templates = client.resource_templates(refresh: true)
-
- # Or refresh specific items
- resource = client.resource("project_readme", refresh: true)
- prompt = client.prompt("daily_greeting", refresh: true)
- template = client.resource_template("user_logs", refresh: true)
- ```
-
- ## Notifications
-
- MCP servers can produce notifications asynchronously, outside of the normal calls to the MCP server.
-
- ### Subscribing to a Resource Update
-
- By default, the client will listen for resource changes by subscribing to resource updates, and will refresh the resource content when it changes.
-
- ### Logging Notifications
-
- MCPs can produce logging notifications for long-running tool operations. Logging notifications allow tools to send real-time updates about their execution status.
-
- ```ruby
- client.on_logging do |logging|
-   puts "Logging: #{logging.level} - #{logging.message}"
- end
-
- # Execute a tool that supports logging notifications
- tool = client.tool("long_running_operation")
- result = tool.execute(operation: "data_processing")
-
- # Logging: info - Processing data...
- # Logging: info - Processing data...
- # Logging: warning - Something went wrong but not major...
- ```
-
- Different levels of logging are supported:
-
- ```ruby
- client.on_logging(RubyLLM::MCP::Logging::WARNING) do |logging|
-   puts "Logging: #{logging.level} - #{logging.message}"
- end
-
- # Execute a tool that supports logging notifications
- tool = client.tool("long_running_operation")
- result = tool.execute(operation: "data_processing")
-
- # Logging: warning - Something went wrong but not major...
- ```
-
- ### Progress Notifications
-
- MCPs can produce progress notifications for long-running tool operations. Progress notifications allow tools to send real-time updates about their execution status.
-
- **Note:** Progress notifications are currently only supported for tool calls.
-
- ```ruby
- # Set up progress tracking
- client.on_progress do |progress|
-   puts "Progress: #{progress.progress}% - #{progress.message}"
- end
-
- # Execute a tool that supports progress notifications
- tool = client.tool("long_running_operation")
- result = tool.execute(operation: "data_processing")
-
- # Progress 25% - Processing data...
- # Progress 50% - Processing data...
- # Progress 75% - Processing data...
- # Progress 100% - Processing data...
- puts result
-
- # Result: { status: "success", data: "Processed data" }
- ```
-
- ## Client Features
-
- The RubyLLM::MCP client provides client-side functionality that can be exposed to MCP servers. These features must be explicitly configured before creating client objects to ensure you are opting into this functionality.
-
- ### Roots
-
- Roots provide MCP servers with access to underlying file system information. The implementation starts with a lightweight approach due to the MCP specification's current limitations on root usage.
-
- When roots are configured, the client will:
-
- - Expose roots as a supported capability to MCP servers
- - Support dynamic addition and removal of roots during the client lifecycle
- - Fire `notifications/roots/list_changed` events when roots are modified
-
- #### Configuration
-
- ```ruby
- RubyLLM::MCP.configure do |config|
-   config.roots = ["to/a/path", Rails.root]
- end
-
- client = RubyLLM::MCP::Client.new(...)
- ```
-
- #### Usage
-
- ```ruby
- # Access current root paths
- client.roots.paths
- # => ["to/a/path", #<Pathname:/to/rails/root/path>]
-
- # Add a new root (fires list_changed notification)
- client.roots.add("new/path")
- client.roots.paths
- # => ["to/a/path", #<Pathname:/to/rails/root/path>, "new/path"]
-
- # Remove a root (fires list_changed notification)
- client.roots.remove("to/a/path")
- client.roots.paths
- # => [#<Pathname:/to/rails/root/path>, "new/path"]
- ```
-
- ### Sampling
-
- Sampling allows MCP servers to offload LLM requests to the MCP client rather than making them directly from the server. This enables MCP servers to optionally use LLM connections through the client.
-
- #### Configuration
-
- ```ruby
- RubyLLM::MCP.configure do |config|
-   config.sampling.enabled = true
-   config.sampling.preferred_model = "gpt-4.1"
-
-   # Optional: Use a block for dynamic model selection
-   config.sampling.preferred_model do |model_preferences|
-     model_preferences.hints.first
-   end
-
-   # Optional: Add guards to filter sampling requests
-   config.sampling.guard do |sample|
-     sample.message.include?("Hello")
-   end
- end
- ```
-
- #### How It Works
-
- With the above configuration:
-
- - Clients will respond to all incoming sample requests using the specified model (`gpt-4.1`)
- - Sample messages will only be approved if they contain the word "Hello" (when using the guard)
- - The `preferred_model` can be a string or a proc that provides dynamic model selection based on MCP server characteristics
-
- The `preferred_model` proc receives model preferences from the MCP server, allowing you to make intelligent model selection decisions based on the server's requirements.
-
- ## Transport Types
-
- ### SSE (Server-Sent Events)
-
- Best for web-based MCP servers or when you need HTTP-based communication:
-
- ```ruby
- client = RubyLLM::MCP.client(
-   name: "web-mcp-server",
-   transport_type: :sse,
-   config: {
-     url: "https://your-mcp-server.com/mcp/sse",
-     headers: { "Authorization" => "Bearer your-token" }
-   }
- )
- ```
-
- ### Streamable HTTP
-
- Best for HTTP-based MCP servers that support streaming responses:
-
- ```ruby
- client = RubyLLM::MCP.client(
-   name: "streaming-mcp-server",
-   transport_type: :streamable,
-   config: {
-     url: "https://your-mcp-server.com/mcp",
-     headers: { "Authorization" => "Bearer your-token" }
-   }
- )
- ```
-
- ### Stdio
-
- Best for local MCP servers or command-line tools:
-
- ```ruby
- client = RubyLLM::MCP.client(
-   name: "local-mcp-server",
-   transport_type: :stdio,
-   config: {
-     command: "python",
-     args: ["-m", "my_mcp_server"],
-     env: { "DEBUG" => "1" }
-   }
- )
- ```
-
- ## Creating Custom Transports
-
- The MCP specification allows custom transports for some MCP servers. Out of the box, RubyLLM::MCP supports Streamable HTTP, STDIO, and the legacy SSE transport.
-
- You can create custom transport implementations to support additional communication protocols or specialized connection methods.
-
- ### Transport Registration
-
- Register your custom transport with the transport factory:
-
- ```ruby
- # Define your custom transport class
- class MyCustomTransport
-   # Implementation details...
- end
-
- # Register it with the factory
- RubyLLM::MCP::Transport.register_transport(:my_custom, MyCustomTransport)
-
- # Now you can use it
- client = RubyLLM::MCP.client(
-   name: "custom-server",
-   transport_type: :my_custom,
-   config: {
-     # Your custom configuration
-   }
- )
- ```
-
- ### Required Interface
-
- All transport implementations must implement the following interface:
-
- ```ruby
- class MyCustomTransport
-   # Initialize the transport
-   def initialize(coordinator:, **config)
-     @coordinator = coordinator # Used for communication between the client and the MCP server
-     @config = config # Transport-specific configuration
-   end
-
-   # Send a request and optionally wait for response
-   # body: the request body
-   # add_id: true will add an id to the request
-   # wait_for_response: true will wait for a response from the MCP server
-   # Returns a RubyLLM::MCP::Result object
-   def request(body, add_id: true, wait_for_response: true)
-     # Implementation: send request and return result
-     data = some_method_to_send_request_and_get_result(body)
-     # Use Result object to make working with the protocol easier
-     result = RubyLLM::MCP::Result.new(data)
-
-     # Call the coordinator to process the result
-     @coordinator.process_result(result)
-     return if result.nil? # Some results are related to notifications and should not be returned to the client, but processed by the coordinator instead
-
-     # Return the result
-     result
-   end
-
-   # Check if transport is alive/connected
-   def alive?
-     # Implementation: return true if connected
-   end
-
-   # Start the transport connection
-   def start
-     # Implementation: establish connection
-   end
-
-   # Close the transport connection
-   def close
-     # Implementation: cleanup and close connection
-   end
-
-   # Set the MCP protocol version, used in some transports to identify the agreed upon protocol version
-   def set_protocol_version(version)
-     @protocol_version = version
-   end
- end
- ```
-
- ### The Result Object
-
- The `RubyLLM::MCP::Result` class wraps MCP responses and provides convenient methods:
-
- ```ruby
- result = transport.request(body)
-
- # Core properties
- result.id          # Request ID
- result.method      # Request method
- result.result      # Result data (hash)
- result.params      # Request parameters
- result.error       # Error data (hash)
- result.session_id  # Session ID (if applicable)
-
- # Type checking
- result.success?       # Has result data
- result.error?         # Has error data
- result.notification?  # Is a notification
- result.request?       # Is a request
- result.response?      # Is a response
-
- # Specialized methods
- result.tool_success?     # Successful tool execution
- result.execution_error?  # Tool execution failed
- result.matching_id?(id)  # Matches request ID
- result.next_cursor?      # Has pagination cursor
-
- # Error handling
- result.raise_error!  # Raise exception if error
- result.to_error      # Convert to Error object
-
- # Notifications
- result.notification  # Get notification object
- ```
-
- ### Error Handling
-
- Custom transports should handle errors appropriately. If a request fails, you should raise a `RubyLLM::MCP::Errors::TransportError` exception. If the request times out, you should raise a `RubyLLM::MCP::Errors::TimeoutError` exception. This ensures that a cancellation notification is sent to the MCP server correctly.
-
- ```ruby
- def request(body, add_id: true, wait_for_response: true)
-   begin
-     # Send request
-     send_request(body)
-   rescue SomeConnectionError => e
-     # Convert to MCP transport error
-     raise RubyLLM::MCP::Errors::TransportError.new(
-       message: "Connection failed: #{e.message}",
-       error: e
-     )
-   rescue Timeout::Error => e
-     # Convert to MCP timeout error
-     raise RubyLLM::MCP::Errors::TimeoutError.new(
-       message: "Request timeout after #{@request_timeout}ms",
-       request_id: body["id"]
-     )
-   end
- end
- ```
-
- ## RubyLLM::MCP and Client Configuration Options
-
- RubyLLM::MCP comes with some common configuration options that can be set globally.
-
- ```ruby
- RubyLLM::MCP.configure do |config|
-   # Enable support for complex tool parameters
-   config.support_complex_parameters!
-
-   # Set parameters on the built-in logger
-   config.log_file = $stdout
-   config.log_level = Logger::ERROR
-
-   # Or add a custom logger
-   config.logger = Logger.new(STDOUT)
- end
- ```
-
- ### MCP Client Options
-
- MCP client options are set on the client itself.
-
- - `name`: A unique identifier for your MCP client
- - `transport_type`: Either `:sse`, `:streamable`, or `:stdio`
- - `start`: Whether to automatically start the connection (default: true)
- - `request_timeout`: Timeout for requests in milliseconds (default: 8000)
- - `config`: Transport-specific configuration
-   - For SSE: `{ url: "http://...", headers: {...} }`
-   - For Streamable: `{ url: "http://...", headers: {...} }`
-   - For stdio: `{ command: "...", args: [...], env: {...} }`
-
- ## Development
-
- After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.
-
- To install this gem onto your local machine, run `bundle exec rake install`. Run `bundle exec rake` to test specs and run linters. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and the created tag, and push the `.gem` file to [rubygems.org](https://rubygems.org).
-
- ## Examples
-
- Check out the `examples/` directory for more detailed usage examples:
-
- - `examples/tools/local_mcp.rb` - Complete example with stdio transport
- - `examples/tools/sse_mcp_with_gpt.rb` - Example using SSE transport with GPT
- - `examples/resources/list_resources.rb` - Example of listing and using resources
- - `examples/prompts/streamable_prompt_call.rb` - Example of using prompts with streamable transport
-
  ## Contributing
  We welcome contributions! Bug reports and pull requests are welcome on GitHub at https://github.com/patvice/ruby_llm-mcp.